- ✅ Seamless capability enhancement: Upgrade an existing LLM API to enterprise grade without modifying any code. Knowledge bases, real-time internet access, persistent memory, code execution, multimodal capabilities (vision/drawing/hearing/speech), automation (smart-home and browser control), deep-thinking control, deep research, and other modular functions plug straight into the existing model interface, forming a pluggable LLM enhancement platform.
- ✅ One-click deployment to all channels: Deploy the agent quickly to many kinds of endpoints; it works out of the box with classic chat interfaces, official WeChat/QQ bots, Bilibili live-stream interaction, and VRM virtual desktop pets.
- ✅ Ecosystem tool interconnection: Third-party agents and workflows can be plugged in freely as tool chains (ComfyUI, MCP, and A2A are already supported), aggregating capabilities across platforms through the agent-party architecture.
- ✅ Standardized open interfaces: OpenAI-API-compatible endpoints and MCP protocol support let developers connect external systems directly and quickly re-expose or build on the agent's capabilities.
- ✅ Cross-platform compatibility: Runs natively on Windows/macOS/Linux and supports Docker containerized deployment as well as web-based cloud services, covering the needs of multi-scenario technology stacks.
- VRM desktop companion: Upload a custom VRM model to create a personalized desktop companion.
- WeChat bot: Simulates user operations to take over your WeChat non-intrusively and avoid risk-control triggers.
- QQ bot: One-click deployment to the official QQ bot platform, so users can reach the agent anytime, anywhere.
- ComfyUI integration: Converts ComfyUI workflows into agent tools, with load balancing across multiple ComfyUI servers.
⭐ Note! During installation, choose to install only for the current user; otherwise administrator privileges will be required to start the app.
⭐ Note! After downloading, drag the app out of the dmg file into the /Applications directory, then open Terminal and run the following command, entering your administrator password when prompted, to remove the quarantine attribute that macOS attaches to files downloaded from the network:
sudo xattr -dr com.apple.quarantine /Applications/Super-Agent-Party.app
We provide two mainstream Linux installation package formats to suit different scenarios.
.AppImage is a Linux application format that requires no installation and can be run directly. It is suitable for most Linux distributions.
- Two commands to install this project:
docker pull ailm32442/super-agent-party:latest
docker run -d -p 3456:3456 -v ./super-agent-data:/app/data ailm32442/super-agent-party:latest
- ⭐ Note! ./super-agent-data can be replaced with any local folder. After Docker starts, all data is cached in that local folder and is never uploaded anywhere.
- Plug and play: access http://localhost:3456/ (a minimal reachability check is sketched below).
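If you want to confirm from a script that the container is actually serving before pointing clients at it, a minimal check such as the following is enough. It only assumes the default port mapping (3456) from the docker run command above.

```python
import urllib.request

# Quick liveness check against the default Super Agent Party port (3456).
# Adjust the URL if you mapped the container to a different host port.
try:
    with urllib.request.urlopen("http://localhost:3456/", timeout=5) as resp:
        print("Service is up, HTTP status:", resp.status)
except OSError as exc:
    print("Service not reachable yet:", exc)
```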
- Windows:
git clone https://github.com/heshengtao/super-agent-party.git
cd super-agent-party
uv sync
npm install
start_with_dev.bat
- Linux or Mac:
git clone https://github.com/heshengtao/super-agent-party.git
cd super-agent-party
uv sync
npm install
chmod +x start_with_dev.sh
./start_with_dev.sh
- Desktop: Click the desktop icon to use it immediately.
- Web or Docker: Access http://localhost:3456/ after startup.
- API call: Developer-friendly and fully compatible with the OpenAI format. It can stream output in real time without affecting the original API's response speed, and no changes to the calling code are needed (a streaming sketch follows the example below):
from openai import OpenAI

client = OpenAI(
    api_key="super-secret-key",
    base_url="http://localhost:3456/v1"
)

response = client.chat.completions.create(
    model="super-model",
    messages=[
        {"role": "user", "content": "What is Super Agent Party?"}
    ]
)

print(response.choices[0].message.content)
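Because the endpoint is OpenAI-compatible, streaming should work the same way as against the upstream API. The sketch below is the standard stream=True pattern, reusing the same placeholder key and model name as the example above.

```python
from openai import OpenAI

client = OpenAI(api_key="super-secret-key", base_url="http://localhost:3456/v1")

# Standard OpenAI-style streaming: print tokens as they arrive.
stream = client.chat.completions.create(
    model="super-model",
    messages=[{"role": "user", "content": "What is Super Agent Party?"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```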
- MCP call: After starting, you can invoke the local MCP service by adding the following to your MCP client's configuration file (a Python client sketch follows below):
{
  "mcpServers": {
    "super-agent-party": {
      "url": "http://127.0.0.1:3456/mcp"
    }
  }
}
Please refer to the following documentation for the main features:
This open-source project and its content (hereinafter referred to as the "project") are provided for reference only, without any express or implied warranty. The project contributors assume no responsibility for the completeness, accuracy, reliability, or applicability of the project. Any reliance on the project content is at the user's own risk. In no event shall the project contributors be liable for any indirect, special, or incidental loss or damage arising from the use of the project content.
This project uses a dual licensing model:
- By default, this project is licensed under the GNU Affero General Public License v3.0 (AGPLv3)
- If you need to use this project for closed-source commercial purposes, you must obtain a commercial license from the project administrator
Using this project for closed-source commercial purposes without written authorization is considered a violation of this agreement. The complete text of AGPLv3 can be found in the LICENSE file in the project root directory or at gnu.org/licenses.
⭐Your support is the driving force for us to move forward!
If you have any questions or issues with the project, you are welcome to join our community.
- QQ group: 931057213
- WeChat group: we_glm (add the assistant's WeChat and ask to be added to the group)
- Discord: Discord link