Mirror of https://github.com/Toumash/mlflow-docker (synced 2025-11-04 15:19:21 +01:00)

Update README.md

</details>

---

4. Open up http://localhost:5000 for MLflow and http://localhost:9000/minio/mlflow/ for the S3 bucket (your artifacts), with the credentials from the `.env` file.
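
If you prefer to verify the services from code rather than the browser, a quick sanity check along these lines should work. It is only a sketch: it assumes the `minio` and `requests` Python packages are installed, and the credentials shown are the sample values from this README, so substitute your own `.env` values.

```python
# Sanity check: is the MLflow UI up, and does the MinIO artifact bucket exist?
# Assumes `pip install minio requests`; credentials are the sample values from this README.
import requests
from minio import Minio

# The MLflow tracking server answers on port 5000.
print("MLflow UI status:", requests.get("http://localhost:5000").status_code)

# The MinIO S3 API answers on port 9000 over plain HTTP in this local setup.
s3_client = Minio(
    "localhost:9000",
    access_key="AKIAIOSFODNN7EXAMPLE",
    secret_key="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
    secure=False,
)
print("mlflow bucket exists:", s3_client.bucket_exists("mlflow"))
```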

5. Configure your client side

To run mlflow files you need the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables present on the client side.

Also, you will need to specify the address of your S3 server (MinIO) and of the mlflow tracking server. For that, run the following script:

```shell
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
export MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
export MLFLOW_TRACKING_URI=http://localhost:5000
```

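If you drive MLflow from a Python script or notebook instead of the shell, the same settings can be placed in the process environment before MLflow is used. This is only a sketch mirroring the exports above; the values are the same sample credentials from this README.

```python
# Python equivalent of the shell exports above; mlflow reads these at run time.
# Replace the sample values with the ones from your .env file.
import os

os.environ["AWS_ACCESS_KEY_ID"] = "AKIAIOSFODNN7EXAMPLE"
os.environ["AWS_SECRET_ACCESS_KEY"] = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://localhost:9000"
os.environ["MLFLOW_TRACKING_URI"] = "http://localhost:5000"
```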

You can also load them from the `.env` file. The file looks like this:

```
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
AWS_REGION=us-east-1
AWS_BUCKET_NAME=mlflow
MYSQL_DATABASE=mlflow
MYSQL_USER=mlflow_user
MYSQL_PASSWORD=mlflow_password
MYSQL_ROOT_PASSWORD=toor
MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
MLFLOW_TRACKING_URI=http://localhost:5000
```

Then run

```shell
source .env
```

or add them as `export X=Y` to the .bashrc file and then run

```shell
source ~/.bashrc
```

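Note that a plain `source .env` only defines shell variables in the current shell; they reach child processes such as `mlflow` only if the lines use `export` or you enable `set -a` before sourcing. From Python, one alternative is the `python-dotenv` package (not mentioned elsewhere in this repo, so treat it as an optional extra), as sketched below.

```python
# Load KEY=VALUE pairs from .env into this process's environment.
# Assumes `pip install python-dotenv`; this package is an optional suggestion,
# not something the repo itself requires.
import os
from dotenv import load_dotenv

load_dotenv(".env")

print(os.environ["MLFLOW_TRACKING_URI"])     # expected: http://localhost:5000
print(os.environ["MLFLOW_S3_ENDPOINT_URL"])  # expected: http://localhost:9000
```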

6. Test the pipeline with the command below, which uses conda. If you don't have conda installed, run it with `--no-conda`.

```shell
mlflow run git@github.com:databricks/mlflow-example.git -P alpha=0.5
# or
python ./quickstart/mlflow_tracking.py
```

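The quickstart script itself is not reproduced in this section; a minimal tracking script of that kind, logging a parameter, a metric, and an artifact against the stack above, could look roughly like the sketch below (the run name and values are illustrative, not taken from the repo).

```python
# Sketch of a minimal MLflow tracking script, assuming the environment
# variables from step 5 are already set. Names and values are illustrative.
import mlflow

with mlflow.start_run(run_name="quickstart-sketch"):
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.42)

    # Write a small file and push it to the MinIO-backed artifact store.
    with open("output.txt", "w") as f:
        f.write("hello from mlflow\n")
    mlflow.log_artifact("output.txt")
```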

7. *(Optional)* If you are constantly switching between environments, you can set the variables inline for a single command:

```shell
MLFLOW_S3_ENDPOINT_URL=http://localhost:9000 MLFLOW_TRACKING_URI=http://localhost:5000 mlflow run git@github.com:databricks/mlflow-example.git -P alpha=0.5
```
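
The same idea, scoping the variables to a single invocation, can also be done from Python with nothing but the standard library; the command mirrors the example run from step 6 and is only a sketch.

```python
# Run one mlflow command with the endpoint variables set only for that process.
# Standard library only; the command mirrors the example from step 6.
import os
import subprocess

env = dict(
    os.environ,
    MLFLOW_S3_ENDPOINT_URL="http://localhost:9000",
    MLFLOW_TRACKING_URI="http://localhost:5000",
)

subprocess.run(
    ["mlflow", "run", "git@github.com:databricks/mlflow-example.git", "-P", "alpha=0.5"],
    env=env,
    check=True,
)
```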