Autoscaling volumes for Kubernetes (with the help of Prometheus)
Automatically scale virtual machine resources on Proxmox hosts
Automatically scale LXC container resources on Proxmox hosts
Automatically assign Elastic IPs to AWS EC2 Auto Scaling Group instances
Simple, elastic Kubernetes cluster autoscaler for AWS Auto Scaling Groups
Libraries, samples, and tools to help AWS customers onboard with custom resource auto scaling.
A framework that runs on AWS Lambda for autoscaling ECS clusters and services
Solution Accelerators for Serverless Spark on GCP, the industry's first auto-scaling and serverless Spark as a service
Autoscaling for Clouds
Automatically scale LXC container resources on Proxmox hosts with AI
Django web application deployed to AWS via Cloudformation templates, with an infrastructure as code approach following the AWS architecture best practices. The application is a sample blog-like website which can be used as a personal portfolio.
A Framework For Intelligence Farming
This service is an automatic scaler for Liara's cloud infrastructure and requires nothing else to run. It checks the app's scale every few seconds and automatically scales it up or down.
An auto-scaling solution for Amazon DocumentDB.
Spoptimize: Replace AWS AutoScaling instances with spot instances
A Python project to scale Redis resources on the Heroku platform
Some of the code used to set up an auto-scaling infrastructure on Amazon AWS
The autoscaling simulation toolbox provides tools for experimenting with policies for multilayered autoscaling (including VM clusters and applications). The Multiverse simulator is the core of the toolbox; with it, one can evaluate autoscaling policies under different conditions, including various applications, platforms, and workloads.
Kubernetes deployed with Kubespray on CentOS 7, using HAProxy to load-balance the Kubernetes API and the HTTP/HTTPS traffic ports via Traefik ingress.
Deployment of RAG + LLM model serving on multiple K8s cloud clusters
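
None of the entries above include code inline, but the technique in the page title, growing a Kubernetes PersistentVolumeClaim when Prometheus-reported usage crosses a threshold, can be sketched briefly. The following is a minimal, hypothetical sketch, not the implementation of any repository listed here; the Prometheus URL, the 80% threshold, and the 1.5x growth factor are illustrative assumptions.

import requests
from kubernetes import client, config

# Illustrative assumptions (not taken from any repo above): Prometheus is
# reachable at PROM_URL, kubelet volume metrics are being scraped, and each
# PVC's StorageClass has allowVolumeExpansion: true.
PROM_URL = "http://prometheus.monitoring.svc:9090"
USAGE_THRESHOLD = 0.8  # grow a volume once it is 80% full
GROWTH_FACTOR = 1.5    # new size ~= current capacity * 1.5

GIB = 1024 ** 3


def prom_query(expr):
    """Run an instant query against the Prometheus HTTP API."""
    resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": expr})
    resp.raise_for_status()
    return resp.json()["data"]["result"]


def autoscale_pvcs():
    config.load_kube_config()  # use load_incluster_config() inside a pod
    core = client.CoreV1Api()
    # Fraction of each PVC in use, as reported by the kubelet.
    for sample in prom_query(
        "kubelet_volume_stats_used_bytes / kubelet_volume_stats_capacity_bytes"
    ):
        pvc = sample["metric"].get("persistentvolumeclaim")
        namespace = sample["metric"].get("namespace", "default")
        usage = float(sample["value"][1])
        if not pvc or usage < USAGE_THRESHOLD:
            continue
        capacity = prom_query(
            f'kubelet_volume_stats_capacity_bytes'
            f'{{namespace="{namespace}",persistentvolumeclaim="{pvc}"}}'
        )
        if not capacity:
            continue
        # Round the grown size up to the next whole GiB.
        new_gib = int(float(capacity[0]["value"][1]) * GROWTH_FACTOR / GIB) + 1
        body = {"spec": {"resources": {"requests": {"storage": f"{new_gib}Gi"}}}}
        # Expanding a PVC is just a patch to spec.resources.requests.storage.
        core.patch_namespaced_persistent_volume_claim(pvc, namespace, body)
        print(f"Resized {namespace}/{pvc} to {new_gib}Gi (usage was {usage:.0%})")


if __name__ == "__main__":
    autoscale_pvcs()

The patch works because Kubernetes treats an increase to spec.resources.requests.storage as an expansion request, provided the volume's StorageClass allows expansion.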