Build armies of digital workers: fast, secure, and extendable. Automate anything using Wasm tools, channels, agents, and flows.
Greentic.ai is now at version 0.2.0, offering a growing store with free flows, plugins, and tools to get you started. You can easily build your own flows, tools, and plugins, including those that connect to APIs without requiring authentication or API keys. Support for OAuth integrations is coming in v0.3.0, and full Cloud deployment will be available in v0.4.0.
Looking ahead, the vision for v1.0.0 is ambitious: imagine simply messaging via WhatsApp, Teams, Slack, or Telegram to request a digital worker—Greentic.ai will create it automatically based on your request, just like ChatGPT.
Discover how Greentic.ai enables revenue opportunities for partners and be part of the future of intelligent automation.
- Introduction
- What is a Digital Worker?
- Key Concepts
- Getting Started
- Quick Flow Example (YAML)
- Controlling Flows, Channels & Tools
- Coming Soon
- Need Custom Agentic Automation?
- Contributing
- License
Greentic.AI is an open-source platform designed to let you build, deploy, and manage digital workers at lightning speed.
- Fast runtime with zero cold starts for WebAssembly tools.
- Extendable architecture: plug in your own channels, tools, agents, and processes, all defined in easy-to-understand, text-based flows.
- Secure by design: tools are sandboxed in Wasm, so even untrusted third-party MCP tools run securely.
- Observability via OpenTelemetry integrations.
A Digital Worker is a flow that acts autonomously and intelligently to handle a complete task, from end to end.
It:
- Listens for messages (via Channels like Telegram or Slack)
- Extracts meaning or decisions (via Agents, powered by LLMs)
- Calls APIs or executes functions (via Tools written in Wasm)
- Handles control logic (via Processes like retries, conditionals, loops)
Flows link these components into one cohesive automation. Your digital workers are secure, modular, and language-agnostic.
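A minimal sketch of how these pieces fit together in a flow file (node names here are placeholders; the complete weather example further down shows the real, working syntax):

```yaml
# Illustrative skeleton only - see the full weather_bot flow below
id: example_worker
channels:
  - telegram
nodes:
  msg_in:                 # Channel node: receives incoming messages
    channel: telegram
    in: true
  call_tool:              # Tool node: runs a Wasm MCP tool action
    tool:
      name: weather_api
      action: forecast_weather
      parameters:
        q: "London"
        days: 3
  msg_out:                # Channel node: sends the reply back
    channel: telegram
    out: true
connections:              # Wire the nodes into a pipeline
  msg_in:
    - call_tool
  call_tool:
    - msg_out
```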
- MCP (Model Context Protocol) modules compile to WebAssembly.
- Each tool can define its own actions, inputs, outputs, and run logic securely.
- Tools live in `tools/` and are called by the flows.
👉 Learn how to build MCP Tools
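For example, a flow invokes a tool by naming the tool, the action, and the parameters to pass. This sketch mirrors the weather tool node from the example flow below; the tool name, action, and parameters are specific to that tool:

```yaml
forecast_weather:
  tool:
    name: weather_api          # which Wasm MCP tool to call
    action: forecast_weather   # which action the tool exposes
    parameters:                # inputs handed to the action
      q: "London"
      days: 3
```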
- Channels allow flows to send/receive messages to/from the outside world.
- Examples: Telegram, Slack, Email, HTTP Webhooks.
👉 How to build Channel Plugins
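In a flow, a channel node simply names the channel and whether it is inbound or outbound, as in the Telegram nodes from the example flow below:

```yaml
telegram_in:           # receives messages from Telegram users
  channel: telegram
  in: true
telegram_out:          # sends replies back to Telegram
  channel: telegram
  out: true
```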
- Processes are a collection of built-in process nodes, soon also extendable via Wasm.
- Debug: inspect the output of the previous flow nodes.
- Script: write Rhai scripts to program custom logic.
- Template: a Handlebars-based template processor for rendering string output (see the sketch after this list).
- QA: a dynamic, multi-question, form-like process with optional validation, LLM user assistance, and routing.
- Defined declaratively in YAML.
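As a sketch of a Template process (modelled on the weather template node in the example flow below; the payload fields come from that flow's weather tool), a template node renders the incoming payload with Handlebars:

```yaml
weather_summary:       # illustrative node name
  template: |
    🌤️ Forecast for {{payload.location.name}}:
    {{#each payload.forecast.forecastday}}
    {{this.date}}: {{this.day.condition.text}}, high {{this.day.maxtemp_c}}°C
    {{/each}}
```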
- Agents are LLM-powered nodes capable of autonomous decision-making.
- The first agent type is Ollama; more types are coming soon.
- Coming soon: agents will understand context, use memory, trigger tools, and follow goals.
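The example flow below uses an Ollama agent as a QA fallback; its configuration names the agent type, a locally available model, and a natural-language task:

```yaml
fallback_agent:
  type: ollama            # the first supported agent type
  model: gemma:instruct   # any model pulled into your local Ollama
  task: |
    The user wants a weather forecast. Work out which location they mean
    and assign it to a state value named `q`.
```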
To build and run this project, you need to have Rust installed.
If you don’t have Rust yet, the easiest way is via rustup:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
Install Greentic.AI via:
cargo install greentic
The first time you use Greentic, run:
greentic init
This will:
- Create the Greentic configuration directories
- Register your user and generate a `GREENTIC_TOKEN`
- Allow you to pull flows, channels, tools, etc. from greenticstore.com
- Warn that `TELEGRAM_TOKEN` and `WEATHERAPI_KEY` are not set yet. Read below how to create a Telegram bot, and get your free `WEATHERAPI_KEY` at https://www.weatherapi.com/
Pull your first flow (greentic init already does this for you):
greentic flow pull weather_bot_telegram.ygtc
Then:
- Create and configure a Telegram bot, and add your token:
greentic secret add TELEGRAM_TOKEN <your_token>
- Sign up to WeatherAPI and add your free API key:
greentic secret add WEATHERAPI_KEY <your_key>
- (Optional) To enable AI-powered queries like “What’s the weather in London tomorrow?”, pull the model:
ollama pull gemma:instruct
- Run Greentic:
greentic run
You should now have a fully working Telegram Weather Bot.
To deploy your own flows:
greentic flow deploy <file>.ygtc
To start a flow:
greentic flow start <flow_id>
id: weather_bot
title: Get your weather prediction
description: >
  This flow shows how you can combine a fixed question-and-answer process
  with an AI fallback if the user does not answer the questions correctly.
channels:
- telegram
nodes:
# 1) Messages come in via Telegram
telegram_in:
channel: telegram
in: true
# 2) QA node: ask for the city and fall back to the Ollama agent if the answer is more than 3 words
extract_city:
qa:
welcome_template: "Hi there! Let's get your weather forecast."
questions:
- id: q_location
prompt: "👉 What location would you like a forecast for?"
answer_type: text
state_key: q
max_words: 3
fallback_agent:
type: ollama
model: gemma:instruct
task: |
The user wants the weather forecast. Find out for which city or location they want the weather and
assign this to a state value named `q`. If they mention the days, assign the number to a state value named `days`,
otherwise use `3` for `days`.
If you are unsure about the place (`q`), ask the user to clarify where they want the weather forecast for.
routing:
- to: forecast_weather
# 3) “forecast_weather”: the Weather API tool, using the location extracted by extract_city.
forecast_weather:
tool:
name: weather_api
action: forecast_weather
parameters:
q: "{{extract_city.payload.city}}"
days: 3
# 4) “weather_out_template”: format the weather API’s JSON into a friendly sentence.
weather_out_template:
template: |
🌤️ Weather forecast for {{payload.location.name}}:
{{#each payload.forecast.forecastday}}
📅 Day {{@indexPlusOne}} ({{this.date}}):
• High: {{this.day.maxtemp_c}}°C
• Low: {{this.day.mintemp_c}}°C
• Condition: {{this.day.condition.text}}
• Rain? {{#if (eq this.day.daily_will_it_rain 1)}}Yes{{else}}No{{/if}}
{{/each}}
# 5) “telegram_out”: send the forecast back to Telegram.
telegram_out:
channel: telegram
out: true
connections:
telegram_in:
- extract_city
extract_city:
- forecast_weather
forecast_weather:
- weather_out_template
weather_out_template:
- telegram_out
# Validate a flow before deploying. Afterwards you can start/stop the flow
greentic flow validate <file>.ygtc
greentic flow deploy <file>.ygtc
greentic flow start <flow-id>
greentic flow stop <flow-id>
- v0.3.0 OAuth MCP Tools - connect to any SaaS
- v0.4.0 Serverless Cloud deployment of flows - `greentic deploy`
Roadmap:
- More Agentic: memory persistence, vector databases, A2A,...
- AI Flow Designer
- Flow, Tools, Channels & Processes marketplace
Have a specific use-case or need expert help?
Please fill out our form: Agentic Automation Inquiry
We are actively looking for contributors and welcome contributions of all kinds!
- Bug reports 🐞
- Feature requests 🎉
- Code & documentation PRs 📝
- Fork the repo
- Create a feature branch
- Open a PR against `main`
See CONTRIBUTING.md for full guidelines.
Distributed under the MIT License. See LICENSE for details.
Thank you for checking out Greentic.AI—let’s build the future of automation together! 🚀