Describe the bug
I am using Ollama on a remote server as the model. When launching a CodeAgent with this model and passing stream_outputs=True to the CodeAgent, I get an error because it tries to access the URL http://localhost:11434 instead of the api_base I gave in the LiteLLMModel definition.
Code to reproduce the error
"""
from smolagents import CodeAgent, LiteLLMModel
model = LiteLLMModel(model_id="ollama/qwen3:32b", api_base="http://1.1.1.1:11434", num_ctx=8192)
agent = CodeAgent(
name="test_agent",
description="A test agent.",
tools=[],
managed_agents=[],
model=model,
stream_outputs=True,
)
agent.run("Hello")
"""
This code tries to reach the model at localhost:11434 instead of the 1.1.1.1 endpoint provided. When the stream_outputs=True flag is removed, or set to False, the code reaches 1.1.1.1:11434 as it should.
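As a sanity check, calling LiteLLM directly with stream=True and the same api_base seems like a quick way to isolate whether the problem is in LiteLLM or in smolagents' streaming path (a minimal sketch, with 1.1.1.1 as the placeholder host from above):

# If this streams correctly, LiteLLM itself honors api_base when streaming,
# which would point at smolagents' streaming code path.
import litellm

response = litellm.completion(
    model="ollama/qwen3:32b",
    messages=[{"role": "user", "content": "Hello"}],
    api_base="http://1.1.1.1:11434",
    stream=True,
)
for chunk in response:
    # delta.content can be None on some chunks, hence the fallback to ""
    print(chunk.choices[0].delta.content or "", end="")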
Error logs (if any)
smolagents.utils.AgentGenerationError: Error in generating model output:
litellm.APIConnectionError: OllamaException - {"error":"model 'qwen3:32b' not found"}
Expected behavior
With or without stream_outputs, the agent should call the Ollama endpoint given via api_base.
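A possible interim workaround (untested on my side, and assuming LiteLLM falls back to the OLLAMA_API_BASE environment variable for the ollama provider) would be to set the endpoint via the environment instead of api_base:

# Assumption: LiteLLM reads OLLAMA_API_BASE when no api_base is passed,
# so the streaming path that currently ignores api_base may still pick it up.
import os
os.environ["OLLAMA_API_BASE"] = "http://1.1.1.1:11434"

from smolagents import CodeAgent, LiteLLMModel

model = LiteLLMModel(model_id="ollama/qwen3:32b", num_ctx=8192)
agent = CodeAgent(
    name="test_agent",
    description="A test agent.",
    tools=[],
    managed_agents=[],
    model=model,
    stream_outputs=True,
)
agent.run("Hello")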
Package versions:
smolagents==1.16.1