Workflow Engine for Kubernetes
Updated Aug 12, 2025 - Go
Weaviate is an open-source vector database that stores both objects and vectors, allowing vector search to be combined with structured filtering, together with the fault tolerance and scalability of a cloud-native database.
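The combination of vector search with structured filtering can be sketched in a few lines. This is a toy, from-scratch illustration of the concept (brute-force cosine similarity over a hand-made corpus), not Weaviate's actual API or storage engine; all names here are hypothetical.

```python
import math

def cosine(a, b):
    # Cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy corpus: each object pairs a vector with structured metadata.
objects = [
    {"id": 1, "vector": [1.0, 0.0], "category": "news"},
    {"id": 2, "vector": [0.9, 0.1], "category": "blog"},
    {"id": 3, "vector": [0.0, 1.0], "category": "news"},
]

def search(query_vec, where=None, top_k=2):
    # Apply the structured filter first, then rank survivors by similarity.
    candidates = [
        o for o in objects
        if where is None or all(o.get(k) == v for k, v in where.items())
    ]
    candidates.sort(key=lambda o: cosine(query_vec, o["vector"]), reverse=True)
    return candidates[:top_k]

hits = search([1.0, 0.0], where={"category": "news"})
```

A real vector database replaces the brute-force scan with an approximate nearest-neighbor index, but the filter-then-rank shape is the same.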
Scalable, flexible workflow orchestration platform that seamlessly unifies data, ML, and analytics stacks.
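At their core, workflow orchestrators like the ones listed here execute a DAG of tasks in dependency order. A minimal sketch using only the standard library (the task names and bodies are invented for illustration, not any platform's API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "train": {"extract"},
    "evaluate": {"train"},
    "report": {"evaluate", "extract"},
}

# Stub task bodies; a real engine would run containers or pods here.
tasks = {
    "extract": lambda: "data",
    "train": lambda: "model",
    "evaluate": lambda: 0.93,
    "report": lambda: "done",
}

results = {}
for name in TopologicalSorter(dag).static_order():
    # static_order() yields each task only after all its dependencies.
    results[name] = tasks[name]()
```

Kubernetes-native engines add scheduling, retries, and artifact passing on top of exactly this ordering guarantee.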
An MLOps framework to package, deploy, monitor and manage thousands of production machine learning models
Determined is an open-source machine learning platform that simplifies distributed training, hyperparameter tuning, experiment tracking, and resource management. Works with PyTorch and TensorFlow.
🏕️ Reproducible development environment
The Virtual Feature Store. Turn your existing data infrastructure into a feature store.
An open source DevOps tool for packaging and versioning AI/ML models, datasets, code, and configuration into an OCI artifact.
The open source, end-to-end computer vision platform. Label, build, train, tune, deploy and automate in a unified platform that runs on any cloud and on-premises.
BharatMLStack is an open-source, end-to-end machine learning infrastructure stack built at Meesho to support real-time and batch ML workloads at Bharat scale.
Aqueduct is no longer being maintained. Aqueduct allows you to run LLM and ML workloads on any cloud infrastructure.
☁️ Terraform plugin for machine learning workloads: spot instance recovery & auto-termination | AWS, GCP, Azure, Kubernetes
Autoscale LLM inference (vLLM, SGLang, LMDeploy) on Kubernetes (and others)
A lightweight CLI tool for versioning data alongside source code and building data pipelines.
Kubernetes-friendly ML model management, deployment, and serving.
Finetune LLMs on K8s by using Runbooks
Transform your Pythonic research into an artifact that engineers can deploy easily.
A lightweight tool to get an AI infrastructure stack up in minutes, not days. K3ai will take care of setting up K8s for you, deploy the AI tool of your choice, and even run your code on it.
Experiment tracking server focused on speed and scalability
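The core data model behind an experiment tracker is small: runs, and per-run metric series keyed by step. A toy in-memory sketch (not any tracker's real API; class and method names are invented):

```python
import time

class Tracker:
    """Minimal in-memory experiment tracker: metric series per named run."""

    def __init__(self):
        self.runs = {}

    def start_run(self, name):
        self.runs[name] = {"start": time.time(), "metrics": {}}
        return name

    def log(self, run, key, value, step):
        # Append (step, value) to the metric's series for this run.
        self.runs[run]["metrics"].setdefault(key, []).append((step, value))

tr = Tracker()
run = tr.start_run("baseline")
for step, loss in enumerate([0.9, 0.5, 0.3]):
    tr.log(run, "loss", loss, step)
```

A production server adds persistence, concurrent writers, and a query API, which is where the speed and scalability claims are earned.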