Askimo - A powerful cross-platform CLI (Windows, macOS, Linux) to chat with LLMs like OpenAI, Ollama, and more. It simplifies working across multiple models, gives you direct local file access, and lets you create reusable custom prompts for repetitive tasks.

Askimo - AI at your command line.

Askimo

Askimo is a command-line assistant that talks to LLMs - from hosted providers such as OpenAI, X AI, and Gemini to local models served by Ollama.

Askimo - AI for your workflows, with the freedom to choose any provider.

Why Askimo?

  • Switch providers anytime – Talk to OpenAI, Gemini, X AI, or Ollama with the same commands.

  • Automation-first – Pipe files, logs, or command output into Askimo and let AI handle the rest.

  • Your choice of interface – Use the CLI if you love the terminal, or the web UI if you prefer a browser.

  • No lock-in – Designed to stay provider-neutral so you can change models as the AI landscape evolves.
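The automation-first point above comes down to ordinary shell pipelines. The exact prompt syntax is an assumption here (check :help in your installed version); this sketch assumes Askimo reads piped input from stdin as context:

```shell
# Hypothetical pipelines - assumes Askimo accepts piped stdin as context:
cat build.log | askimo     # ask the model about a log file
git diff | askimo          # feed uncommitted changes in for review
```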

Demo

  • Piping commands & switching providers in Askimo

  • Interacting with the local file system in Askimo

💬 Simple Web Chat (Local Usage)

Askimo isn’t only for the terminal - you can also start a lightweight local web chat UI if you prefer a browser interface. This feature is designed for quick testing or personal use, not for production deployment.

  • Start Askimo web server
askimo --web

Then open your browser to the URL printed in the console (look for Web server running at http://127.0.0.1:8080). If port 8080 is busy, Askimo picks the next free port, so use the exact address shown in that log line. The web UI supports real-time streaming responses and Markdown rendering.

⚠️ Important: Before running askimo --web, you must finish setting up your AI provider (e.g., Ollama, OpenAI) and select a model using the Askimo CLI. The web version is currently a simple chat page - it does not support configuring providers, models, or AI parameters. All configuration must be done beforehand via the CLI.
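Putting that warning into a concrete sequence, a first-time setup might look like the following (the provider name is an example; the REPL commands are from the command reference in this README):

```
askimo                      # start the CLI REPL
  :setprovider ollama       # inside the REPL: pick a provider
  :models                   # list and choose a model
  :exit
askimo --web                # now start the web UI and open the printed URL
```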

Quickstart

macOS / Linux (Homebrew)

brew tap haiphucnguyen/askimo
brew install askimo
askimo

Windows (Scoop)

scoop bucket add askimo https://github.com/haiphucnguyen/scoop-askimo
scoop install askimo
askimo

Other ways to install → Installation Guide

👉 Once installed, you can connect Askimo to providers like Ollama, OpenAI, Gemini, or X AI and start chatting.

📖 See Getting started for tutorials on setting up Ollama, adding API keys (OpenAI, Gemini, X AI), switching providers, and running real workflow examples.

Available Commands

Command Description Example Usage
:help Show all available commands :help
:setparam Set a parameter for the current provider :setparam style creative
:params View current session parameters :params
:config Edit Askimo configuration file :config
:providers List all supported AI providers :providers
:setprovider Switch to a different AI provider :setprovider ollama
:models List available models for the current provider :models
:copy Copy the last response to the clipboard :copy
:clear Clear the chat history for the current session :clear
:exit Exit the Askimo REPL :exit
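A typical session chains these commands together. The transcript below is illustrative - the prompt formatting and the free-form question are assumptions, but every :command is taken from the reference above:

```
askimo
:providers                  # see what is supported
:setprovider ollama         # switch provider
:models                     # pick a model
:setparam style creative    # tune the session
Summarize the key points of this project.
:copy                       # copy the answer to the clipboard
:exit
```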

View the full command reference »
Includes detailed usage, options, and examples for each command.

💡 Note: Some providers (such as OpenAI and X AI) require an API key.
Create and configure the appropriate key in your provider’s account dashboard before using them.

Extending Askimo

Askimo is designed to be pluggable, so you can tailor it to your needs.

Contributing

  • Fork & clone the repo

  • Create a feature branch

  • Open a PR
