High-Performance server for NATS.io, the cloud and edge native messaging system.
🚀 A real-time conversational digital human for mobile. Supports local deployment and multimodal interaction (voice, text, facial expressions) with response latency under 1.5 seconds, suited to livestreaming, education, customer service, finance, and government scenarios with strict privacy and real-time requirements. Works out of the box and is developer friendly.
FEDML - The unified and scalable ML library for large-scale distributed training, model serving, and federated learning. FEDML Launch, a cross-cloud scheduler, further enables running any AI jobs on any GPU cloud or on-premise cluster. Built on this library, TensorOpera AI (https://TensorOpera.ai) is your generative AI platform at scale.
[ICLR 2020] Once for All: Train One Network and Specialize it for Efficient Deployment
Deep learning gateway on Raspberry Pi and other edge devices
AI-in-a-Box leverages the expertise of Microsoft across the globe to develop and provide AI and ML solutions to the technical community. Our intent is to present a curated collection of solution accelerators that can help engineers establish their AI/ML environments and solutions rapidly and with minimal friction.
Real-time portrait segmentation for mobile devices
💜 The best free Telegram bot for ChatGPT, Microsoft Copilot (aka Bing AI / Sidney / EdgeGPT), Microsoft Copilot Designer (aka BingImageCreator), Gemini and Groq with stream writing, requests with images, multiple languages, admin control, data logging and more!
Jetson Nano-based smart camera system that measures crowd face mask usage in real-time.
Olares: An Open-Source Personal Cloud to Reclaim Your Data
A curated list of awesome edge computing, including Frameworks, Simulators, Tools, etc.
Model Compression Toolkit (MCT) is an open-source project for optimizing neural network models for efficient deployment on constrained hardware. It provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.
Neuralet is an open-source platform for edge deep learning models on Edge TPU, Jetson Nano, and more.
Speech-to-text benchmark framework
The Hailo Model Zoo includes pre-trained models and a full building and evaluation environment
High-performance, optimized pre-trained template AI application pipelines for systems using Hailo devices
On-Device Training Under 256KB Memory [NeurIPS'22]
A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.