Deploy MLflow 3.2.0 with PostgreSQL + MinIO in Docker (2025 Edition)
Your One-Click MLOps Stack for Experiments, Artifacts, and Tracking - complete setup guide with Docker Compose
Difficulty: 🟡 Intermediate
Estimated Time: 45-60 minutes
Prerequisites: Basic Docker knowledge, Understanding of ML concepts, Command line experience
What You'll Learn
This tutorial covers essential MLOps deployment concepts and tools:
- MLflow Setup - Complete MLflow 3.2.0 deployment with Docker
- PostgreSQL Integration - Backend metadata store configuration
- MinIO Storage - S3-compatible artifact storage setup
- Docker Orchestration - Multi-service container management
- Production Deployment - Enterprise-ready MLOps infrastructure
- Monitoring and Maintenance - System health and performance tracking
- Security Configuration - Secure access and data protection
Prerequisites
- Basic Docker knowledge and container concepts
- Understanding of ML concepts and MLOps workflows
- Command line experience with Docker commands
- Basic understanding of database and storage concepts
Related Tutorials
- MLflow vs. Kubeflow - MLOps platform comparison
- GPU-Ready Docker Setup - GPU-enabled development environment
- Main Tutorials Hub - Step-by-step implementation guides
Introduction
If you're building ML pipelines in 2025, you're likely juggling multiple tools for tracking, storage, and versioning. What if you could deploy MLflow, PostgreSQL, and MinIO with a single docker-compose.yml?
This article delivers the exact stack you need to track models, store artifacts, and run MLflow in production mode, all from scratch.
The Stack Components
- MLflow 3.2.0 — For experiment tracking
- PostgreSQL 17.5 — Backend metadata store
- MinIO — S3-compatible storage for artifacts
- Docker + Compose v2 — Container orchestration (the legacy top-level version: key is obsolete in Compose v2, so the file below omits it)
Full docker-compose.yml
Here's the production-ready docker-compose.yml:
```yaml
services:
  postgres:
    image: postgres:17.5
    container_name: mlflow_postgres
    restart: unless-stopped
    environment:
      POSTGRES_USER: mlflow
      POSTGRES_PASSWORD: mlflow_pass
      POSTGRES_DB: mlflow_db
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  minio:
    image: minio/minio:RELEASE.2025-07-23T15-54-02Z
    container_name: mlflow_minio
    restart: unless-stopped
    environment:
      MINIO_ROOT_USER: minio
      MINIO_ROOT_PASSWORD: minio_pass
    command: server /data --console-address ":9001"
    ports:
      - "9000:9000"
      - "9001:9001"
    volumes:
      - minio_data:/data

  mlflow:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: mlflow_server
    restart: unless-stopped
    depends_on:
      - postgres
      - minio
    environment:
      AWS_ACCESS_KEY_ID: minio
      AWS_SECRET_ACCESS_KEY: minio_pass
      MLFLOW_S3_ENDPOINT_URL: http://minio:9000
    command: >
      mlflow server
      --backend-store-uri postgresql+psycopg://mlflow:mlflow_pass@postgres:5432/mlflow_db
      --default-artifact-root s3://mlflow/
      --host 0.0.0.0
      --port 5000
    ports:
      - "5000:5000"

volumes:
  postgres_data:
  minio_data:
```
Example Dockerfile for MLflow
Make sure this file exists in the same directory as your docker-compose.yml:

```dockerfile
# Dockerfile
FROM ghcr.io/mlflow/mlflow:v3.2.0

# psycopg v3 driver, required by the postgresql+psycopg:// SQLAlchemy URI
RUN pip install --no-cache-dir "psycopg[binary]>=3.1,<3.2"
```
How to Launch
1. Place both files (docker-compose.yml and Dockerfile) in your project root.
2. Run the following in your terminal:

```bash
docker-compose up --build -d
```

The MLflow UI will be available at http://localhost:5000, and the MinIO console at http://localhost:9001 (username: minio, password: minio_pass).
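Once the stack is up, a quick way to verify end-to-end tracking is to log a dummy run from the host. The following is a minimal sketch, assuming the mlflow Python package is installed locally (pip install mlflow boto3) and the default ports published above; the experiment name and values are placeholders:

```python
import os

def tracking_env(host="localhost"):
    """Client-side settings matching the compose file's published ports."""
    return {
        "MLFLOW_TRACKING_URI": f"http://{host}:5000",
        "MLFLOW_S3_ENDPOINT_URL": f"http://{host}:9000",
        "AWS_ACCESS_KEY_ID": "minio",
        "AWS_SECRET_ACCESS_KEY": "minio_pass",
    }

def log_smoke_test_run():
    """Log one dummy run; needs the stack running and mlflow installed."""
    os.environ.update(tracking_env())
    import mlflow  # imported after the env is set so the client picks it up

    mlflow.set_experiment("smoke-test")
    with mlflow.start_run():
        mlflow.log_param("lr", 0.01)
        mlflow.log_metric("accuracy", 0.93)

# log_smoke_test_run()  # uncomment once the containers are running
```

If the run appears in the UI and its artifacts land in the MinIO bucket, the whole chain (server, PostgreSQL, MinIO) is wired correctly.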
Optional: Set Up a MinIO Bucket
Before using S3 artifact storage, create a bucket called mlflow with the MinIO client (mc):

```bash
mc alias set local http://localhost:9000 minio minio_pass
mc mb local/mlflow
```
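If you'd rather not install mc, the same bucket can be created from Python. This is a hedged sketch using boto3 (not part of the stack above; pip install boto3), with the endpoint and credentials taken from the compose file:

```python
def minio_client_kwargs(endpoint="http://localhost:9000",
                        access_key="minio", secret_key="minio_pass"):
    """Connection settings matching the compose file's MinIO service."""
    return {
        "endpoint_url": endpoint,
        "aws_access_key_id": access_key,
        "aws_secret_access_key": secret_key,
    }

def ensure_bucket(s3, name="mlflow"):
    """Create the bucket only if it does not already exist."""
    existing = {b["Name"] for b in s3.list_buckets().get("Buckets", [])}
    if name not in existing:
        s3.create_bucket(Bucket=name)

def main():
    import boto3  # third-party; only needed when actually run
    ensure_bucket(boto3.client("s3", **minio_client_kwargs()))

# main()  # uncomment once MinIO is running
```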
Configuration and Customization
Environment Variables
Customize your deployment by modifying environment variables:
```yaml
environment:
  MLFLOW_TRACKING_URI: postgresql+psycopg://mlflow:mlflow_pass@postgres:5432/mlflow_db
  AWS_ACCESS_KEY_ID: minio
  AWS_SECRET_ACCESS_KEY: minio_pass
  MLFLOW_S3_ENDPOINT_URL: http://minio:9000
  MLFLOW_SERVE_ARTIFACTS: "true"  # quoted: Compose rejects bare booleans in environment
  MLFLOW_ARTIFACTS_DESTINATION: s3://mlflow/
```
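On the client side, a misspelled or missing variable tends to fail late with a cryptic S3 error. A small hedged helper (variable names mirror the compose file; adjust to your setup) that checks them up front:

```python
# Variables an MLflow client needs to reach this stack's server and MinIO.
REQUIRED = (
    "MLFLOW_TRACKING_URI",
    "MLFLOW_S3_ENDPOINT_URL",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
)

def missing_vars(env, required=REQUIRED):
    """Names from `required` that are absent or empty in the mapping `env`."""
    return [name for name in required if not env.get(name)]

# usage: missing_vars(dict(os.environ)) -> [] when the client is fully configured
```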
Volume Mounts
Persist your data across container restarts with bind mounts; each entry goes under its respective service:

```yaml
volumes:
  - ./mlflow_data:/mlflow
  - ./postgres_data:/var/lib/postgresql/data
  - ./minio_data:/data
```
Network Configuration
Customize network settings for production:
```yaml
networks:
  mlflow_network:
    driver: bridge
    ipam:
      config:
        - subnet: 172.20.0.0/16
```
Monitoring and Maintenance
Health Checks
Add health checks to your services; the MLflow server exposes a /health endpoint (curl must be available inside the image):

```yaml
healthcheck:
  test: ["CMD", "curl", "-f", "http://localhost:5000/health"]
  interval: 30s
  timeout: 10s
  retries: 3
```
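The same check can also run from the host, for example in a deploy script that waits for the server before seeding experiments. A stdlib-only sketch with a generic retry helper (the URL and retry counts are assumptions):

```python
import time
import urllib.request

def wait_for(check, retries=3, delay=1.0):
    """Call check() until it returns True; False if all retries fail."""
    for attempt in range(retries):
        if check():
            return True
        if attempt < retries - 1:
            time.sleep(delay)
    return False

def mlflow_healthy(url="http://localhost:5000/health", timeout=5):
    """True when the MLflow /health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, DNS failure, timeout, ...
        return False

# wait_for(mlflow_healthy, retries=10, delay=3)  # run after docker-compose up
```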
Logging
Configure centralized logging:
```yaml
logging:
  driver: "json-file"
  options:
    max-size: "10m"
    max-file: "3"
```
Backup Strategy
Implement regular backups for your data:
```bash
# Backup PostgreSQL
docker exec mlflow_postgres pg_dump -U mlflow mlflow_db > backup_$(date +%Y%m%d_%H%M%S).sql

# Backup MinIO data
docker cp mlflow_minio:/data ./minio_backup_$(date +%Y%m%d_%H%M%S)
```
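The commands above can be wrapped in a small cron-able script. A hedged Python sketch (container and database names come from the compose file; output paths are assumptions):

```python
import datetime
import subprocess

def backup_name(prefix, ext="", now=None):
    """Timestamped name, e.g. backup_20250101_120000.sql."""
    now = now or datetime.datetime.now()
    return f"{prefix}_{now:%Y%m%d_%H%M%S}{ext}"

def backup_postgres(out_path):
    """Dump mlflow_db from the running container into out_path."""
    with open(out_path, "w") as out:
        subprocess.run(
            ["docker", "exec", "mlflow_postgres",
             "pg_dump", "-U", "mlflow", "mlflow_db"],
            stdout=out, check=True,
        )

# backup_postgres(backup_name("backup", ext=".sql"))  # run with the stack up
```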
Security Considerations
Authentication
MLflow ships a basic-auth app that you enable server-side with mlflow server --app-name basic-auth; clients then pass credentials through environment variables:

```yaml
environment:
  MLFLOW_TRACKING_USERNAME: admin
  MLFLOW_TRACKING_PASSWORD: secure_password
```
Network Security
Restrict network access in production. Note that internal: true also blocks published ports, so front the stack with a reverse proxy attached to a second, non-internal network:

```yaml
networks:
  mlflow_network:
    driver: bridge
    internal: true  # no external access
```
SSL/TLS
Enable HTTPS for production deployments. MLflow itself serves plain HTTP, so simply publishing port 5000 on 443 is not enough; terminate TLS in a reverse proxy (e.g. nginx or Traefik) that forwards to the mlflow service, then point clients at the HTTPS URL:

```yaml
environment:
  MLFLOW_TRACKING_URI: https://your-domain.com
```
Troubleshooting
Common Issues
Issue: MLflow can't connect to PostgreSQL
```bash
# Check PostgreSQL logs
docker logs mlflow_postgres

# Verify connectivity from the MLflow container
docker exec mlflow_server ping postgres
```
Issue: MinIO connection fails
```bash
# Check MinIO status
docker logs mlflow_minio

# Verify the S3 endpoint answers from the MLflow container
docker exec mlflow_server curl -I http://minio:9000
```
Issue: Port conflicts
```bash
# Check port usage
netstat -tulpn | grep :5000
```

Then change the host-side port in docker-compose.yml:

```yaml
ports:
  - "5001:5000"  # publish on host port 5001 instead
```
Debug Commands
```bash
# Check all container statuses
docker-compose ps

# View logs for a specific service
docker-compose logs mlflow

# Enter a container for debugging
docker exec -it mlflow_server bash

# Check network connectivity
docker network ls
docker network inspect mlflow_network
```
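For the port-conflict case above, a portable alternative to netstat (which is absent on some hosts) is to probe the ports directly. A stdlib-only sketch; the port list matches the compose file:

```python
import socket

def port_in_use(port, host="127.0.0.1", timeout=0.5):
    """True if something is accepting TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

def report(ports=(("mlflow", 5000), ("minio", 9000), ("postgres", 5432))):
    """Print which of the stack's host ports are already taken."""
    for name, port in ports:
        state = "in use" if port_in_use(port) else "free"
        print(f"{name}: port {port} is {state}")

# report()  # check for conflicts before docker-compose up
```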
Scaling and Performance
Resource Limits
Set resource constraints for production:
```yaml
deploy:
  resources:
    limits:
      memory: 2G
      cpus: "1.0"
    reservations:
      memory: 1G
      cpus: "0.5"
```
Load Balancing
Scale MLflow instances (remove the fixed container_name first, since replicas need unique container names; update_config applies under Docker Swarm):

```yaml
mlflow:
  # ... existing configuration ...
  deploy:
    replicas: 3
    update_config:
      parallelism: 1
      delay: 10s
```
Monitoring
Add monitoring with Prometheus and Grafana:
```yaml
prometheus:
  image: prom/prometheus
  ports:
    - "9090:9090"
  volumes:
    - ./prometheus.yml:/etc/prometheus/prometheus.yml
```
Conclusion
This plug-and-play setup makes it easier than ever to manage your ML experiments, track results, and persist artifacts without touching the cloud. As AI and MLOps keep moving fast, this local stack gives you the freedom to iterate safely and efficiently.
Key Takeaways
- Complete MLOps Stack - MLflow, PostgreSQL, and MinIO in one deployment
- Production Ready - Enterprise-grade configuration and security
- Easy Management - Simple Docker Compose commands for all operations
- Scalable Architecture - Designed for growth and performance
- Local Control - Full control over your MLOps infrastructure
Next Steps
- Deploy the stack using the provided docker-compose.yml
- Configure authentication and security for production use
- Set up monitoring and alerting for system health
- Implement backup strategies for data protection
- Scale the infrastructure as your ML needs grow
Tags: #MLflow2025 #DockerCompose #MinIO #PostgreSQL #MLOps #ModelTracking #DevOpsTools #AIInfrastructure #MLflow3