The Unified Model Serving Framework
Updated Jul 9, 2022 - Python
This repository collects useful notes and references about deploying deep-learning models in production.
Machine Learning automation and tracking
Currently, deployment creation allows users to set an env var and pass it to the BentoServer container. However, in production scenarios, users may want to read env var values directly from existing ConfigMap or Secret resources in their Kubernetes cluster, e.g.:
env:
  # Define the environment variable
  - name: SPECIAL_LEVEL_KEY
    valueFrom:
      configMapKeyRef:
        name: special-config   # name of an existing ConfigMap (example value)
        key: special.how       # key within that ConfigMap (example value)
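For Secret resources the shape is analogous, using `secretKeyRef` in place of `configMapKeyRef`. A sketch following the standard Kubernetes pattern (the Secret name and key below are hypothetical):

```yaml
env:
  - name: DB_PASSWORD
    valueFrom:
      secretKeyRef:
        name: db-credentials   # hypothetical existing Secret
        key: password          # hypothetical key within that Secret
```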
Python + Inference: a model deployment library in Python. The simplest model inference server ever.
FastAPI Skeleton App to serve machine learning models production-ready.
Code samples for the Lightbend tutorial on writing microservices with Akka Streams, Kafka Streams, and Kafka
A multi-functional library for full-stack Deep Learning. Simplifies Model Building, API development, and Model Deployment.
Common library for serving TensorFlow, XGBoost and scikit-learn models in production.
BentoML Sample Projects Gallery
A scalable, high-performance serving system for federated learning models
flink-jpmml is a library for dynamic, real-time machine learning predictions, built on top of PMML standard models and the Apache Flink streaming engine.
So far, there is only one Worker base type with JSON serialization and Pickle IPC serialization.
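The split described above (JSON on the wire, pickle between processes) can be sketched as follows. This is a hypothetical illustration, not the library's actual API; class and method names are invented:

```python
import json
import pickle


class Worker:
    """Minimal sketch of a worker base type: requests and responses
    cross the wire as JSON, while objects handed between worker
    processes are pickled for IPC."""

    def handle(self, raw_request: bytes) -> bytes:
        # JSON wire format: decode the request, run it, encode the result.
        request = json.loads(raw_request)
        result = self.run(request)
        return json.dumps(result).encode()

    def run(self, request):
        raise NotImplementedError  # subclasses implement the actual work

    @staticmethod
    def to_ipc(obj) -> bytes:
        # Pickle for IPC: arbitrary Python objects between processes.
        return pickle.dumps(obj)

    @staticmethod
    def from_ipc(payload: bytes):
        return pickle.loads(payload)


class EchoWorker(Worker):
    def run(self, request):
        return {"echo": request}
```

Pickle is convenient for same-trust-domain IPC but should never be used to deserialize untrusted input, which is presumably why JSON handles the external request path.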
Serving PyTorch models with TorchServe
Code and presentation for Strata Model Serving tutorial
Deploy DL/ML inference pipelines with minimal extra code.
A collection of model deployment libraries and techniques.
An umbrella project for multiple implementations of model serving
Serving TensorFlow models with TensorFlow Serving
Location of incorrect documentation
(https://docs.monai.io/projects/monai-deploy-app-sdk/en/stable/getting_started/tutorials/03_segmentation_app.html)
Describe the problems or issues found in the documentation
Missing several install steps.
Steps taken to verify documentation is incorrect
I had to add the following pip installations: scikit
ClearML - Model-Serving Orchestration and Repository Solution
Kubeflow example of machine learning/model serving
A hands-on case study for demonstrating the stages involved in a machine learning project, from EDA to production.
A lightweight machine learning framework for Kubernetes
mlserve turns your Python models into a RESTful API and serves a web page with a form generated to match your input data.
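The "model behind a RESTful API" idea can be sketched with nothing but the standard library. mlserve's actual API differs; the model, endpoint path, and payload shape below are all hypothetical:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def predict(features):
    # Hypothetical stand-in for a trained model's predict() call;
    # a real server would wrap a scikit-learn/Keras model here.
    return {"prediction": sum(features)}


class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body and run the model on it.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps(predict(payload["features"])).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging


# To serve: HTTPServer(("0.0.0.0", 8000), PredictHandler).serve_forever()
```

A real serving library adds what this sketch omits: input validation against a schema (which is also what lets mlserve auto-generate an input form), batching, and error handling.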
Titus 2: Portable Format for Analytics (PFA) implementation for Python 3.4+
/kind feature
Describe the solution you'd like
Add a sample of MMS using a custom model server, similar to the sklearn and Triton samples, that will reside [in