Releases: getcellm/cellm
v0.3.0
This release introduces new capabilities for connecting Cellm to external systems and improves underlying integrations with model providers.
What's New
- Add support for Model Context Protocol (MCP): You can now connect Cellm to MCP-compatible servers, allowing your Excel sheets to interact with external data sources or trigger actions via Cellm functions. Effectively, it turns Excel into a low-code automation tool, enabling workflows orchestrated directly from your spreadsheet. We're still trying to wrap our heads around the possibilities this unlocks. They are freakin' endless.
- Adopt Anthropic.SDK for Anthropic calls: Previously, we rolled our own custom Anthropic client because of the lack of an official SDK. We've migrated our Anthropic integration to the community-driven Anthropic.SDK (github.com/tghamm/Anthropic.SDK). This decision was driven by Anthropic's adoption of this SDK for their official .NET MCP library. The move ensures better standardization, aligns Cellm with Microsoft.Extensions.AI patterns, and leverages ongoing community improvements.
- Telemetry: To help us identify bugs and understand usage patterns, we now send anonymized crash reports and model interaction details to Sentry. We never capture any data from your spreadsheet. Still, you can opt out of this at any time by adding the following to your appsettings.Local.json:

```json
{
  "SentryConfiguration": {
    "IsEnabled": false
  }
}
```
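The new MCP support will presumably be configured alongside the other settings in appsettings.Local.json. Here is a minimal sketch of what registering an MCP server could look like; the `McpConfiguration` section and its key names are illustrative guesses, not Cellm's documented schema, so consult the docs for the real format:

```json
{
  "McpConfiguration": {
    "Servers": [
      {
        "Name": "filesystem",
        "Command": "npx",
        "Arguments": ["-y", "@modelcontextprotocol/server-filesystem", "C:\\Data"]
      }
    ]
  }
}
```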
Bug fixes
- Fixed a regression that inadvertently broke the use of tools with AI models.
- Adjusted the prompt caching mechanism to correctly invalidate the cache when tools are added or removed.
- Tuned the default settings for the rate limiter (Retry and Circuit Breaker policies) to work together more effectively during periods of high activity. These defaults prioritize stability and avoiding upstream provider rate limits. You can always crank up the limits or remove them altogether, but aggressive settings may lead to more errors from the AI provider.
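If you do want to loosen the default limits, the resilience settings live in the same appsettings.Local.json as the other configuration. A hedged sketch, with hypothetical section and property names (the actual keys may differ; check the docs before copying this):

```json
{
  "ResilienceConfiguration": {
    "RetryConfiguration": {
      "MaxRetries": 5
    },
    "RateLimiterConfiguration": {
      "ConcurrencyLimit": 10
    }
  }
}
```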
What's Changed
- Merge develop into main by @kaspermarstal in #129
- fix: Prompts involving tool calls would return the tool call message, not the assistant's reply to the tool call result by @kaspermarstal in #130
- build: Update Microsoft.Extensions.AI version to 9.4.0-preview.1.25207.5 by @MackinnonBuck in #133
- Merge dev by @kaspermarstal in #136
- build: Update dependencies by @kaspermarstal in #138
- fix: Retry on broken circuit and Anthropic's rate limits exceeded, guarantee first retry waits until opened circuit is closed, and reduce aggresiveness of circuit breaker by @kaspermarstal in #140
- fix: Adding/removing tools did not invalidate cache by @kaspermarstal in #139
- refactor: Sentry PipelineBehavior by @kaspermarstal in #137
- fix: Sentry transaction contexts by @kaspermarstal in #142
- build: Bump version by @kaspermarstal in #143
New Contributors
- @MackinnonBuck made their first contribution in #133
Full Changelog: v0.2.0...v0.3.0
v0.2.0
This release addresses several issues when working with multiple prompts simultaneously and introduces comprehensive documentation!
What's New
- Official Documentation: Our new documentation is now available!
Bug fixes
- UI Responsiveness: Fixed an issue where the interface would become unresponsive when sending multiple prompts
- UI Configuration: Configuration changes now apply immediately without requiring application restart
- Smarter Rate Limiting: Implemented centralized rate limiting that applies individually to each provider, creating a smoother experience when sending multiple prompts to different models simultaneously
These improvements should make your workflow more efficient and reliable when working with multiple prompts across different models. Enjoy!
What's Changed
- refactor: Move Cellm.Models to shared project by @kaspermarstal in #111
- docs: Improve README by @kaspermarstal in #112
- feat: Add retry mechanism to Ollama and OpenAiCompatible providers by @kaspermarstal in #114
- fix: Prompt not cached because cache key length sometimes too long by @kaspermarstal in #115
- refactor: Add IChatClient with transient lifetime for all providers by @kaspermarstal in #116
- docs: Add documentation by @kaspermarstal in #117
- docs: Tighten up README, link to docs by @kaspermarstal in #118
- docs: Update R2 domain by @zachasme in #119
- fix: Unblock UI thread when sending many prompts by @kaspermarstal in #120
- refactor: Apply rate limit across providers by @kaspermarstal in #121
- fix: Increase default rate limit by @kaspermarstal in #125
New Contributors
- @zachasme made their first contribution in #119
Full Changelog: v0.1.1...v0.2.0
v0.1.1
This is a minor release that mainly fixes a bug where changing API keys, either via the UI or appsettings.Local.json, would not be picked up until Excel was restarted.
What's Changed
- docs: Add CLA by @kaspermarstal in #102
- docs: Fix README typos by @kaspermarstal in #103
- build: Add global.json to specify .NET SDK version 9.X.X by @johnnyoshika in #101
- fix: Changing API keys while app is running by @kaspermarstal in #109
Full Changelog: v0.1.0...v0.1.1
v0.1.0
Cellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas. It is designed for automation of repetitive text-based tasks and comes with:
- Local and hosted models: Defaults to free local inference (Gemma 2 2B via Ollama) while supporting commercial APIs
- Formula-driven workflow: =PROMPT() and =PROMPTWITH() functions for drag-and-fill operations across cell ranges.
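For example, with a customer review in cell A1, a drag-and-fill workflow might look like the following. The argument order here is a sketch based on the function names above, and the model identifier is one mentioned elsewhere in this changelog; see the README for the exact signatures:

```excel
' Classify a review with the default model:
=PROMPT(A1, "Classify the sentiment of this review as Positive or Negative")

' Same task, pinned to a specific provider and model:
=PROMPTWITH("anthropic/claude-3-5-sonnet-20241022", A1, "Classify the sentiment of this review as Positive or Negative")
```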
Install
1. Download Cellm-AddIn64-packed.xll and appsettings.json. Put them in the same folder.
2. Double-click on Cellm-AddIn64-packed.xll. Excel will open and install Cellm.
3. Download and install Ollama. Cellm uses Ollama and the Gemma 2 2B model by default. Gemma 2 2B will be downloaded automatically the first time you call =PROMPT(). To call other models, see the Models section in the README.
Uninstall
1. In Excel, go to File > Options > Add-Ins.
2. In the Manage drop-down menu, select Excel Add-ins and click Go....
3. Uncheck Cellm-AddIn64-packed.xll and click OK.
Known Limitations
- Windows-only: No macOS/Linux support planned for initial versions
- Input constraints:
  - Formula arguments limited to 8,192 characters (Excel string limit)
  - No native support for multi-turn conversations
- Model variability: Output quality depends on selected LLM (validate critically)
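There is no native multi-turn support, but since =PROMPT() also accepts a single string argument, you can approximate a conversation by chaining cells and splicing earlier answers into later prompts. A sketch, with a hypothetical cell layout:

```excel
' B1: first turn
=PROMPT(A1, "Summarize this text in one sentence")

' C1: second turn, reusing the answer from B1
=PROMPT("Given this summary: " & B1 & ", suggest three follow-up questions")
```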
Contribution & Feedback
Report issues or suggest improvements via GitHub Issues.
License: Fair Core License
Full Documentation: README
What's Changed
- feat: Add LlamafileClient by @kaspermarstal in #1
- bug: Fix AddSystemMessage by @kaspermarstal in #4
- bug: Fix llamafile health uri by @kaspermarstal in #3
- models: Add qwen-0.5b by @kaspermarstal in #2
- docs: Tighten up README by @kaspermarstal in #6
- feat: Manually dispose of ServiceLocator by @kaspermarstal in #7
- bug: By default assign telemetry to default model of provider, not default model of Cellm; refactor: Rename GoogleClient to GoogleAiClient by @kaspermarstal in #8
- docs: Add support for Mistral by @kaspermarstal in #9
- bug: Disable sentry by default until fix for missing immutable arrays is identified by @kaspermarstal in #11
- feat: Add concurrency rate limiting by @kaspermarstal in #10
- feat: Add support for running multiple Llamafiles simultaneously by @kaspermarstal in #12
- build: Enforce code style in build by @kaspermarstal in #13
- git: Add Excel files to .gitignore by @kaspermarstal in #14
- docs: Improve README by @kaspermarstal in #15
- Prompt: Further optimize system prompt for small models with limited instruction-following capability. Larger models will understand anyway by @kaspermarstal in #16
- docs: Proof-read README.md by @kaspermarstal in #17
- feat: Add support for OpenAI tools by @kaspermarstal in #18
- feat: Upgrade default Anthropic model to claude-3-5-sonnet-20241022 by @kaspermarstal in #20
- refactor: Add provider enum by @kaspermarstal in #19
- refactor: Rename CellmFunctions to Functions and CellPrompts to SystemMessages by @kaspermarstal in #21
- ci: Add conventional commits lint by @kaspermarstal in #22
- build: Make internals visible to Cellm.Tests by @kaspermarstal in #23
- refactor: Pull out XllPath into settable property by @kaspermarstal in #24
- feat: Add SentryBehavior and CachingBehavior to model request pipeline by @kaspermarstal in #25
- refactor: Splot Tools into ToolRunner and ToolFactory by @kaspermarstal in #26
- feat: Upgrade Claude 3.5 Sonnet by @kaspermarstal in #27
- refactor: Remove superfluous interfaces by @kaspermarstal in #28
- feat: Add FileReader tool by @kaspermarstal in #29
- ci: Add dependabot by @kaspermarstal in #36
- build(deps): bump Microsoft.Extensions.Caching.Memory from 8.0.0 to 8.0.1 in /src/Cellm by @dependabot in #38
- ci: Disable commitlint for dependabot by @kaspermarstal in #43
- build(deps): bump Sentry.Extensions.Logging from 4.10.2 to 4.12.1 by @dependabot in #39
- ci: Disable conventional commits lint by @kaspermarstal in #44
- build(deps): bump Sentry.Profiling from 4.10.2 to 4.12.1 by @dependabot in #41
- build(deps): bump Microsoft.Extensions.Configuration.Json from 8.0.0 to 8.0.1 by @dependabot in #42
- fix: Parse tool description attributes by @kaspermarstal in #45
- feat: Add support for prompt as single string argument by @kaspermarstal in #50
- fix: CacheBehavior caches value even when model request is mutated downstream by @kaspermarstal in #56
- ci: Run dotnet format with restore by @kaspermarstal in #58
- fix: Remove Sentry metrics which were deprecated by @kaspermarstal in #59
- build(deps): bump Microsoft.Extensions.Configuration, Microsoft.Extensions.DependencyInjection, Microsoft.Extensions.Logging.Console, Microsoft.Extensions.Options and Microsoft.Extensions.Options.ConfigurationExtensions by @dependabot in #51
- fix: Increase timeouts for local LLMs by @kaspermarstal in #57
- build(deps): bump Microsoft.Extensions.Caching.Memory and Microsoft.Extensions.Options by @dependabot in #54
- build: Target .NET 8.0 by @kaspermarstal in #62
- feat: Replace GoogleAI provider with Google's OpenAI compatible endpoint by @kaspermarstal in #63
- feat: Update Llamafile version, add larger Llamafile models by @kaspermarstal in #64
- refactor: Clean up src/Cellm/AddIn by @kaspermarstal in #65
- feat: Use Microsoft.Extensions.AI by @kaspermarstal in #66
- feat: Add Ollama provider by @kaspermarstal in #67
- fix: Ollama provider by @kaspermarstal in #69
- build: Remove json schema deps no longer needed by @kaspermarstal in #72
- feat: Add HybridCache, remove MemoryCache by @kaspermarstal in #73
- feat: Add OpenAiCompatible chat client by @kaspermarstal in #74
- refactor: Models by @kaspermarstal in #75
- build: Target .NET 9 and update deps by @kaspermarstal in #83
- refactor: Use ExcelAsyncUtil to run task by @kaspermarstal in #84
- feat: Remove support for embedded Ollama and Llamafile servers by @kaspermarstal in #85
- build: Copy appsettings.Local.json to bin dir only if exists by @kaspermarstal in #86
- docs: Fix appsettings.Local.*.json examples and update readme to match by @kaspermarstal in #87
- bug: Fix OpenAiCompatible API key by @kaspermarstal in #94
- feat: Add Mistral and DeepSeek providers by @kaspermarstal in #95
- feat: Add Ribbon UI by @kaspermarstal in #96
- feat: Auto-download Ollama models by @kaspermarstal in #98
- docs: Update README.md with installations instructions via Release page by @kaspermarstal in #100
Full Changelog: https://github.com/getcellm/cellm/commits/v0.1.0