diff --git a/README.md b/README.md
index 4645a96..09ce27b 100644
--- a/README.md
+++ b/README.md
@@ -15,74 +15,17 @@ The only requirement is docker installed on your system and we are going to use
 - Ready to use bash scripts for python development!
 - Automatically-created s3 buckets
 
-## Simple setup guide
+# 🚀 Setup guide
 
 1. Configure the `.env` file to your liking. You can put anything you like there; it will be used to configure your services
+2. Run `docker compose up`
+3. Open up http://localhost:5000 for MLflow, and http://localhost:9001/ to browse your files in the S3 artifact store
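+
+If the `mlflow` bucket is not created automatically, you can create it yourself from Python. This is a minimal sketch using the minio client (assumes `pip install minio` with minio >= 7; the key placeholders must be replaced with the values from your `.env`):
+
+```python
+# Create the artifact bucket on the local minio, if it is missing.
+# access_key/secret_key are placeholders - copy them from the .env file.
+from minio import Minio
+
+client = Minio(
+    "localhost:9000",
+    access_key="<AWS_ACCESS_KEY_ID from .env>",
+    secret_key="<AWS_SECRET_ACCESS_KEY from .env>",
+    secure=False,
+)
+if not client.bucket_exists("mlflow"):
+    client.make_bucket("mlflow")
+```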
 
-2. Run the Infrastructure by this one line:
-```shell
-$ docker-compose up -d
-Creating network "mlflow-basis_A" with driver "bridge"
-Creating mlflow_db ... done
-Creating tracker_mlflow ... done
-Creating aws-s3 ... done
-```
+## How to use in ML development in Python
 
-3. Create mlflow bucket. You can use my bundled script.
+<details>
+  <summary>Click to show</summary>
 
-Just run
-```shell
-bash ./run_create_bucket.sh
-```
-
-You can also do it **either using AWS CLI or Python Api**.
-<details>
-  <summary>AWS CLI</summary>
-
-1. [Install AWS cli](https://aws.amazon.com/cli/) **Yes, i know that you dont have an Amazon Web Services Subscription - dont worry! It wont be needed!**
-2. Configure AWS CLI - enter the same credentials from the `.env` file
-
-```shell
-aws configure
-```
-> AWS Access Key ID [****************123]: AKIAIOSFODNN7EXAMPLE
-> AWS Secret Access Key [****************123]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
-> Default region name [us-west-2]: us-east-1
-> Default output format [json]:
-
-3. Run
-```shell
-aws --endpoint-url=http://localhost:9000 s3 mb s3://mlflow
-```
-
-</details>
-
-<details>
-  <summary>Python API</summary>
-
-1. Install Minio
-```shell
-pip install Minio
-```
-2. Run this to create a bucket
-```python
-from minio import Minio
-from minio.error import ResponseError
-
-s3Client = Minio(
-    'localhost:9000',
-    access_key='',  # copy from .env file
-    secret_key='',  # copy from .env file
-    secure=False
-)
-s3Client.make_bucket('mlflow')
-```
-
-</details>
-
----
-
-4. Open up http://localhost:5000 for MlFlow, and http://localhost:9000/minio/mlflow/ for S3 bucket (you artifacts) with credentials from `.env` file
-
-5. Configure your client-side
+1. Configure your client-side
 
 For running mlflow files, you need various environment variables set on the client side. To generate them, use the convenience script `./bashrc_install.sh`, which installs them on your system, or `./bashrc_generate.sh`, which just displays the config to copy & paste.
 
@@ -91,7 +34,7 @@
 The script installs these variables: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, MLFLOW_S3_ENDPOINT_URL, MLFLOW_TRACKING_URI. All of them are needed to use mlflow from the client-side.
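+
+If you prefer not to modify your shell profile, the same four variables can be set per process instead. This is an illustrative alternative, not a bundled script; the values are placeholders to copy from your `.env` file:
+
+```python
+# Per-process alternative to bashrc_install.sh: set the same four
+# variables before using mlflow. Placeholder values - copy the real
+# ones from your .env file.
+import os
+
+os.environ["AWS_ACCESS_KEY_ID"] = "<AWS_ACCESS_KEY_ID from .env>"
+os.environ["AWS_SECRET_ACCESS_KEY"] = "<AWS_SECRET_ACCESS_KEY from .env>"
+os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://localhost:9000"
+os.environ["MLFLOW_TRACKING_URI"] = "http://localhost:5000"
+```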
 
-6. Test the pipeline with below command with conda. If you dont have conda installed run with `--no-conda`
+2. Test the pipeline with the command below, using conda. If you don't have conda installed, run it with `--no-conda` (an illustrative sketch of such a tracking script follows step 3)
 
 ```shell
 mlflow run git@github.com:databricks/mlflow-example.git -P alpha=0.5
@@ -99,12 +42,15 @@
 # or
 python ./quickstart/mlflow_tracking.py
 ```
 
-7. *(Optional)* If you are constantly switching your environment you can use this environment variable syntax
+3. *(Optional)* If you are constantly switching environments, you can set the variables inline for a single run:
 
 ```shell
 MLFLOW_S3_ENDPOINT_URL=http://localhost:9000 MLFLOW_TRACKING_URI=http://localhost:5000 mlflow run git@github.com:databricks/mlflow-example.git -P alpha=0.5
 ```
+
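+For reference, a minimal tracking script in the spirit of `./quickstart/mlflow_tracking.py` might look like this. It is an illustrative sketch, not the bundled file; it assumes `pip install mlflow boto3` and the environment variables described above:
+
+```python
+# Minimal smoke test for the stack: logs a param, a metric and an artifact.
+# The artifact upload goes to the s3://mlflow bucket on the local minio
+# via MLFLOW_S3_ENDPOINT_URL (boto3 must be installed).
+import os
+import mlflow
+
+mlflow.set_tracking_uri(os.environ.get("MLFLOW_TRACKING_URI", "http://localhost:5000"))
+
+with mlflow.start_run():
+    mlflow.log_param("alpha", 0.5)
+    mlflow.log_metric("rmse", 0.78)
+    with open("output.txt", "w") as f:
+        f.write("hello from the mlflow stack")
+    mlflow.log_artifact("output.txt")
+```
+
+After running it, the run should appear at http://localhost:5000 and the artifact in the bucket browser.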
+
+</details>
 
 ## Licensing
 
 Copyright (c) 2021 Tomasz Dłuski