(Taken from the notes made by LoHertel)
The extent to which MLOps is implemented in a team or organization can be expressed as its MLOps maturity. One framework for classifying different levels of MLOps maturity is shown below:
| Lvl | Name | Overview | Use Case |
| --- | --- | --- | --- |
| 0️⃣ | No MLOps | - ML process highly manual<br>- poor cooperation<br>- lack of standards; success depends on an individual's expertise | - proof of concept (PoC)<br>- academic project |
| 1️⃣ | DevOps but no MLOps | - ML training is most often manual<br>- software engineers might help with the deployment<br>- automated tests and releases | - bringing a PoC to production |
| 2️⃣ | Automated Training | - ML experiment results are centrally tracked<br>- training code and models are version controlled<br>- deployment is handled by software engineers | - maintaining 2-3+ ML models |
| 3️⃣ | Automated Model Deployment | - releases are managed by an automated CI/CD pipeline<br>- close cooperation between data and software engineers<br>- performance of the deployed model is monitored; A/B tests are used for model selection | - business-critical ML services |
| 4️⃣ | Full MLOps Automated Operations | - clearly defined metrics for model monitoring<br>- retraining is triggered automatically when a model metric crosses its threshold | - use only when a favorable trade-off between implementation cost and efficiency gains is likely<br>- retraining is needed often and is repetitive (has potential for automation) |
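The level-4 idea of automatic retraining on a metric threshold can be sketched in a few lines. This is a minimal illustration, not any particular tool's API; the names (`check_and_retrain`, `ACCURACY_THRESHOLD`, the `retrain` callback) are hypothetical and the threshold value is an assumed SLA.

```python
# Minimal sketch of metric-threshold-triggered retraining (maturity level 4).
# All names and the threshold value are illustrative assumptions.

ACCURACY_THRESHOLD = 0.90  # assumed SLA: retrain if accuracy drops below this


def evaluate_model(predictions, labels):
    """Compute accuracy of the deployed model on fresh labeled data."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)


def check_and_retrain(predictions, labels, retrain):
    """Monitor the model metric and trigger retraining when it crosses
    the threshold. Returns True if retraining was triggered."""
    accuracy = evaluate_model(predictions, labels)
    if accuracy < ACCURACY_THRESHOLD:
        retrain()  # e.g. kick off the automated training pipeline
        return True
    return False
```

In practice such a check would run on a schedule inside the monitoring system, with `retrain` launching the level-2/3 automated training and deployment pipeline rather than a local function.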
A high maturity level is not always needed because it comes with additional costs. The trade-off between automated model maintenance and the required effort to set up the automation should be considered. An ideal maturity level could be picked based on the use case / SLAs and the number of models deployed.
If you want to read more about maturity levels, see Microsoft's MLOps maturity model.