Image credit: Databricks
MLflow, a leading end-to-end MLOps platform with over 13 million monthly downloads, has unveiled its latest update, MLflow 2.3. The update brings enhanced support for managing and deploying large language models (LLMs), headlined by three new model flavors: HuggingFace Transformers, OpenAI, and LangChain. Additionally, the update significantly improves model download and upload speeds to and from cloud storage services.
The native integration of the HuggingFace Transformers library in MLflow 2.3 gives users easy access to over 170,000 machine learning models on the HuggingFace Hub. The transformers flavor performs automatic pipeline validation, infers model signature schemas, and fetches ModelCard data, streamlining the deployment process; it also supports passing pipeline-specific input formats.
MLflow 2.3 also integrates the OpenAI Python library, providing convenient access to the OpenAI API for Python applications. It includes automatic signature schema detection, parallelized API requests for faster inference, and automatic API request retries on transient errors. Users can now leverage pre-trained models hosted by OpenAI while taking advantage of MLflow's tracking and deployment capabilities.
The LangChain flavor in MLflow 2.3 simplifies the process of building and deploying LLM-based applications, such as question-answering systems and chatbots. Users can now benefit from LangChain's advanced capabilities combined with MLflow's streamlined development and deployment support.
To start using MLflow 2.3 and its new LLM-supporting features, install the Python MLflow library with the command pip install mlflow==2.3. For a complete list of new features and improvements in MLflow 2.3, refer to the release changelog; for more information on getting started with MLflow, consult the MLflow documentation.