Deploying an MLFlow Remote Server with Docker, S3 and SQL

MLFlow is an open-source platform for managing your machine learning lifecycle. You can either run MLFlow locally on your system, or host an MLFlow Tracking Server, which allows multiple people to log models and store them remotely in a model repository for quick deployment/reuse.

In this article, I’ll show you how to deploy MLFlow on a remote server using Docker, an S3-compatible object store of your choice (Minio or Ceph), and a SQL backend (SQLite or MySQL).

Setting up the Server

  • Create a new folder for your MLFlow server
mkdir mlflow-server
cd mlflow-server
  • Open a new terminal on your server, go to your mlflow-server directory, and create a virtual environment for the MLFlow installation
python3 -m venv env
source env/bin/activate
  • Install necessary packages:
pip3 install mlflow

Setting up the Backend Store

Using SQLite3

  • Install SQLite3 on your server:
sudo apt install sqlite3
  • Set up an SQLite database for your MLFlow server’s backend-store-uri. Make sure that this file is in your mlflow-server folder.
sqlite3 store.db
# press Ctrl+D to exit sqlite
# nothing else needs to be done
  • Note that your backend-store-uri would now be sqlite:///store.db
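If you prefer not to use the sqlite3 CLI, the empty database file can also be created from Python’s standard library. A minimal sketch, assuming the filename store.db to match the URI above:

```python
import sqlite3

# Connecting creates store.db if it does not already exist;
# MLFlow's backend will create its own tables on first run.
conn = sqlite3.connect("store.db")
conn.close()
```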

Using MySQL

  • Install PyMySQL in the Python environment where you installed MLFlow.
pip3 install pymysql
  • Sign in to your MySQL shell and create a new database for MLFlow. Then create a new user (or use an existing one) and grant them access to the MLFlow database.
CREATE USER 'mlflow-user' IDENTIFIED BY 'password';
CREATE DATABASE mlflowruns;
GRANT ALL PRIVILEGES ON mlflowruns.* TO 'mlflow-user';
  • Note that your backend-store-uri would now be mysql+pymysql://mlflow-user:password@localhost:3306/mlflowruns
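One gotcha with the MySQL URI: if your password contains characters like @, :, or /, they must be percent-encoded or the URI will be parsed incorrectly. A small stdlib sketch (the credentials below are placeholders):

```python
from urllib.parse import quote_plus

user = "mlflow-user"
password = "p@ss/word"  # placeholder containing characters that need escaping

# Percent-encode the credentials before building the backend-store-uri
uri = f"mysql+pymysql://{quote_plus(user)}:{quote_plus(password)}@localhost:3306/mlflowruns"
print(uri)  # mysql+pymysql://mlflow-user:p%40ss%2Fword@localhost:3306/mlflowruns
```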

Setting up the Artifact Store

Using Minio

  • Pull the Minio Docker image:
sudo docker pull minio/minio
  • Run a Minio server bound to an exposed server port, so that a remote client can upload model artifacts to it. Minio will serve as your default artifact root. Run the following command, filling in an exposed port for Minio and an access/secret key pair of your choice. (Newer Minio images use MINIO_ROOT_USER/MINIO_ROOT_PASSWORD in place of MINIO_ACCESS_KEY/MINIO_SECRET_KEY.)
docker run -p <exposed_port_for_minio>:9000 --name minio1 \
-e MINIO_ACCESS_KEY=<your_access_key_id> \
-e MINIO_SECRET_KEY=<your_secret_key> \
-v /mnt/data:/data \
-v /mnt/config:/root/.minio \
minio/minio server /data
  • Install necessary packages in your created environment.
pip3 install minio
pip3 install boto3
  • Set up local environment variables:
export MLFLOW_S3_ENDPOINT_URL=http://localhost:<exposed_port_for_minio>
export AWS_ACCESS_KEY_ID=<your_access_key_id>
export AWS_SECRET_ACCESS_KEY=<your_secret_key>
  • Create a Minio bucket named mlflow. In your mlflow-server folder (with the environment variables above set), you can run the following one-liner, which uses the boto3 package installed earlier:
python3 -c "import boto3, os; boto3.client('s3', endpoint_url=os.environ['MLFLOW_S3_ENDPOINT_URL']).create_bucket(Bucket='mlflow')"
  • Note that your default-artifact-root should now be s3://mlflow/artifacts

Using Ceph

  • Create a Ceph nano cluster
./cn cluster start -d /tmp mlflow-cluster -f huge
  • Note down the Ceph endpoint URL, AWS access key ID, and AWS secret key that the cluster auto-generates and prints to the shell.
  • Create an S3 bucket
./cn s3 mb mlflow-cluster mlflow-buc
  • Set up local environment variables:
export MLFLOW_S3_ENDPOINT_URL=<ceph-endpoint-url>
export AWS_ACCESS_KEY_ID=<your_access_key_id>
export AWS_SECRET_ACCESS_KEY=<your_secret_key>
  • Note that your default-artifact-root should now be s3://mlflow-buc

Starting the Server

mlflow server -p <exposed_port_for_mlflow> \
--host 0.0.0.0 \
--backend-store-uri <enter_backend_store_uri> \
--default-artifact-root <enter_default_artifact_root>
  • And that’s it! Your server is up and running. Note that when you specify the endpoint URL, do not use localhost; stick to the full http:// address. Ceph is particular about the endpoint URL.
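The endpoint rule above can be sanity-checked with a few lines of standard-library Python before you start the server. A sketch; the example IP is a placeholder:

```python
from urllib.parse import urlparse

def check_s3_endpoint(url: str) -> None:
    """Catch the two common MLFLOW_S3_ENDPOINT_URL mistakes."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        raise ValueError("endpoint must include the http(s):// scheme")
    if parsed.hostname in ("localhost", "127.0.0.1"):
        raise ValueError("use the server's real IP, not localhost")

check_s3_endpoint("http://203.0.113.10:9000")  # placeholder IP: passes
```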

Setting up the Client

mkdir mlflow-work
cd mlflow-work
python3 -m venv env
source env/bin/activate
pip3 install mlflow
  • Set up local environment variables. You need to configure your environment to access MLFlow’s remote tracking server and to log artifacts to your remote Minio or Ceph artifact store.
export MLFLOW_TRACKING_URI=http://<remote_server_ip>:<exposed_port_for_mlflow>
export MLFLOW_S3_ENDPOINT_URL=http://<remote_server_ip>:<exposed_port_for_artifact_root>
export AWS_ACCESS_KEY_ID=<enter_aws_access_key_id>
export AWS_SECRET_ACCESS_KEY=<enter_aws_secret_access_key>
  • And you’re all set! You can download example code from the MLFlow examples and start executing it directly in your mlflow-work directory. For more information about MLFlow, check out the MLFlow documentation.
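The same variables can also be set from inside Python before MLFlow is imported, which is handy in notebooks. A sketch with placeholder values; substitute your own server address and keys:

```python
import os

# Placeholder address and credentials; replace with your server's values
os.environ["MLFLOW_TRACKING_URI"] = "http://203.0.113.10:5000"
os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://203.0.113.10:9000"
os.environ["AWS_ACCESS_KEY_ID"] = "minio-access-key"
os.environ["AWS_SECRET_ACCESS_KEY"] = "minio-secret-key"

# After this, `import mlflow` will pick up the tracking URI automatically;
# mlflow.set_tracking_uri(...) is an equivalent per-session alternative.
```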
