A Gradle plugin that automatically manages Ollama LLM instances for your build tasks
GMAI (Gradle Managed AI) is a Gradle plugin that seamlessly integrates AI capabilities into your build process by automatically managing Ollama instances. It handles the entire lifecycle of AI services, from installation and startup through model management and cleanup, so you can focus on using AI in your tasks.
- Automatic Ollama Management: Installs, starts, and stops Ollama automatically based on your build needs
- Task Integration: Simple API to make any Gradle task depend on AI services with `useManagedAi()`
- Model Management: Automatically pulls and manages AI models defined in your configuration
- Lifecycle Management: Ensures AI services are available when needed and cleaned up afterward
- Cross-Platform: Works on macOS, Linux, and Windows with automatic platform detection
Tasks can declare dependencies on AI services, and GMAI handles everything automatically:
```kotlin
tasks.withType<Test> {
    useManagedAi() // AI services start before tests, stop after
}
```
GMAI finds existing Ollama installations or installs per-project for isolation:
- Uses existing installations when available
- Falls back to a project-local installation (`.ollama/bin/ollama`)
- Configurable installation strategies for different environments (sketched below)
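The exact DSL for choosing a strategy is not shown in this README, so the property and values in this sketch are assumptions, not the plugin's documented API:

```kotlin
// Hypothetical sketch: option names here are assumptions. The idea is to
// choose between reusing a system Ollama and installing per-project.
managedAi {
    installationStrategy = "prefer-existing" // assumed; e.g. vs. "project-local"
}
```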
Define models in your build script and GMAI handles the rest:
```kotlin
managedAi {
    models {
        "llama3" {
            version = "8b"
        }
        "codellama" {
            version = "7b"
        }
    }
}
```
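Each entry presumably maps to an Ollama model and tag (here `llama3:8b` and `codellama:7b`), and GMAI generates a pull task per configured model. Assuming the generated task name capitalizes the model name, it can be run directly:

```bash
# Assumed name of the auto-generated task for the "llama3" entry above
./gradlew pullModelLlama3
```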
```kotlin
// build.gradle.kts
plugins {
    id("se.premex.gmai") version "0.0.2"
}

managedAi {
    models {
        "llama3" {
            version = "8b"
        }
    }
}

// Use AI in your tasks
tasks.withType<Test> {
    useManagedAi()
    systemProperty("ollama.url", "http://localhost:11434")
}
```
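The system property exposes the managed endpoint to the test JVM. As a minimal sketch (assuming JUnit 5 on the test classpath and Ollama's standard REST API), a test could verify the service is up:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import org.junit.jupiter.api.Assertions.assertEquals
import org.junit.jupiter.api.Test

class OllamaSmokeTest {
    @Test
    fun ollamaIsReachable() {
        // Endpoint injected by the build script above
        val baseUrl = System.getProperty("ollama.url", "http://localhost:11434")
        val client = HttpClient.newHttpClient()
        // GET /api/tags lists the models this Ollama instance has available
        val request = HttpRequest.newBuilder(URI.create("$baseUrl/api/tags")).GET().build()
        val response = client.send(request, HttpResponse.BodyHandlers.ofString())
        assertEquals(200, response.statusCode())
    }
}
```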
- AI-Powered Testing: Use LLMs in your test suites for dynamic test generation or validation
- Code Generation: Generate code during build time using AI models (see the sketch after this list)
- Documentation: Generate or validate documentation with AI assistance
- CI/CD Integration: Run AI-powered tasks in continuous integration environments
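As a sketch of the code-generation use case above (the task name, prompt, and response handling are illustrative assumptions; `/api/generate` is Ollama's standard completion endpoint):

```kotlin
// Hypothetical sketch: a build-time task that asks the managed model for text.
tasks.register("generateProjectSummary") {
    useManagedAi() // Ollama is started before this task and stopped afterward
    doLast {
        val url = java.net.URI.create("http://localhost:11434/api/generate")
        val body = """{"model":"llama3:8b","prompt":"Write a one-line project summary.","stream":false}"""
        val response = java.net.http.HttpClient.newHttpClient().send(
            java.net.http.HttpRequest.newBuilder(url)
                .header("Content-Type", "application/json")
                .POST(java.net.http.HttpRequest.BodyPublishers.ofString(body))
                .build(),
            java.net.http.HttpResponse.BodyHandlers.ofString()
        )
        // The response JSON carries the generated text in its "response" field
        println(response.body())
    }
}
```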
- Zero Configuration: Works out of the box with sensible defaults
- Build Integration: Native Gradle task dependencies and lifecycle management
- Team Consistency: Same AI environment for all team members
- CI/CD Ready: Designed for continuous integration environments
- Isolation: Project-specific installations don't interfere with system setup
GMAI provides several built-in tasks for managing AI services:
- `setupManagedAi` - Start Ollama and ensure all models are available
- `teardownManagedAi` - Stop Ollama and clean up resources
- `startOllama` - Start the Ollama service
- `stopOllama` - Stop the Ollama service
- `ollamaStatus` - Check Ollama status and list available models
- `pullModel{ModelName}` - Pull a specific model (auto-generated for each configured model)
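For example, to manage the environment manually from the command line:

```bash
# Check Ollama status and list available models
./gradlew ollamaStatus

# Bring the AI environment up and tear it down explicitly
./gradlew setupManagedAi
./gradlew teardownManagedAi
```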
- `plugin/` - Main plugin source code
- `plugin/src/main/kotlin/` - Plugin implementation
- `plugin/src/test/kotlin/` - Unit tests
- `plugin/src/functionalTest/kotlin/` - Integration tests
To test the plugin locally:
```bash
# Run tests
./gradlew :plugin:check

# Validate plugin
./gradlew :plugin:validatePlugins

# Publish to local repository
./gradlew :plugin:publishToMavenLocal

# Build the plugin
./gradlew :plugin:build
```
- ⭐ Star this repository if you find GMAI useful
- 🍴 Fork the repository to contribute or customize
- 🐛 Report issues for bugs or feature requests
- 📖 Read the documentation for comprehensive guides
GMAI is developed and maintained by Premex, a company specializing in innovative software solutions.
This project is licensed under the MIT License - see the LICENSE file for details.