
Releases: getcellm/cellm

v0.3.0

12 Apr 14:40
c492df8

This release introduces new capabilities for connecting Cellm to external systems and improves underlying integrations with model providers.

What's New

  • Add support for Model Context Protocol (MCP): You can now connect Cellm to MCP-compatible servers, allowing your Excel sheets to interact with external data sources or trigger actions via Cellm functions. Effectively, this turns Excel into a low-code automation tool, enabling workflows orchestrated directly from your spreadsheet. We're still trying to wrap our heads around the possibilities this unlocks. They are freakin' endless.
  • Adopt Anthropic.SDK for Anthropic Calls: Previously, we rolled our own custom Anthropic client because of the lack of an official SDK. We've migrated our Anthropic integration to the community-driven Anthropic.SDK (github.com/tghamm/Anthropic.SDK). This decision was driven by Anthropic's adoption of this SDK for their official .NET MCP library. This move ensures better standardization, aligns Cellm with Microsoft.Extensions.AI patterns, and leverages ongoing community improvements.
  • Telemetry: To help us identify bugs and understand usage patterns, we now send anonymized crash reports and model interaction details to Sentry. We never capture any data from your spreadsheet. Still, you can opt out at any time by adding the following to your appsettings.Local.json:
{
    "SentryConfiguration": {
        "IsEnabled": false
    }
}

Bug fixes

  • Fixed a regression that inadvertently broke the use of tools with AI models.
  • Adjusted the prompt caching mechanism to correctly invalidate the cache when tools are added or removed.
  • Tuned the default settings for the rate limiter (Retry and Circuit Breaker policies) to work together more effectively during periods of high activity. These defaults prioritize stability and avoid tripping upstream provider rate limits. You can always raise the limits or remove them altogether, but aggressive settings may lead to more errors from the AI provider.


Full Changelog: v0.2.0...v0.3.0

v0.2.0

19 Mar 20:17
5bd4e30

This release addresses several issues when working with multiple prompts simultaneously and introduces comprehensive documentation!

What's New

  • Official Documentation: Our new documentation is now available!

Bug Fixes

  • UI Responsiveness: Fixed an issue where the interface would become unresponsive when sending multiple prompts
  • UI Configuration: Configuration changes now apply immediately without requiring application restart
  • Smarter Rate Limiting: Implemented centralized rate limiting that applies individually to each provider, creating a smoother experience when sending multiple prompts to different models simultaneously

These improvements should make your workflow more efficient and reliable when working with multiple prompts across different models. Enjoy!


Full Changelog: v0.1.1...v0.2.0

v0.1.1

06 Feb 08:32
30a288c

This is a minor release that mainly fixes a bug where changing API keys, either via the UI or appsettings.Local.json, would not get picked up until Excel was restarted.


Full Changelog: v0.1.0...v0.1.1

v0.1.0

30 Jan 22:18
e7856b5


Cellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas. It is designed to automate repetitive text-based tasks and comes with:

  • Local and hosted models: Defaults to free local inference (Gemma 2 2B via Ollama) while supporting commercial APIs.
  • Formula-driven workflow: =PROMPT() and =PROMPTWITH() functions for drag-and-fill operations across cell ranges.
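
For instance, assuming a form of =PROMPT() that accepts the prompt text as its argument (see the README for the exact signatures), a classification task can be filled down a column like any other Excel formula:

    A1: Great product, fast shipping!
    B1: =PROMPT("Classify the sentiment of this review as positive or negative: " & A1)

Dragging B1 down re-evaluates the formula for each row, sending one prompt per cell.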

Install

  1. Download Cellm-AddIn64-packed.xll and appsettings.json. Put them in the same folder.

  2. Double-click on Cellm-AddIn64-packed.xll. Excel will open and install Cellm.

  3. Download and install Ollama. Cellm uses Ollama and the Gemma 2 2B model by default. Gemma 2 2B will be downloaded automatically the first time you call =PROMPT(). To call other models, see the Models section in the README.

Uninstall

  1. In Excel, go to File > Options > Add-Ins.
  2. In the Manage drop-down menu, select Excel Add-ins and click Go....
  3. Uncheck Cellm-AddIn64-packed.xll and click OK.

Known Limitations

  1. Windows-only: No macOS/Linux support planned for initial versions
  2. Input constraints:
    • Formula arguments limited to 8,192 characters (Excel string limit)
    • No native support for multi-turn conversations
  3. Model variability: Output quality depends on selected LLM (validate critically)

Contribution & Feedback

Report issues or suggest improvements via GitHub Issues.

License: Fair Core License
Full Documentation: README

What's Changed

Full Changelog: https://github.com/getcellm/cellm/commits/v0.1.0