[BUG] Can't stream and plan with Ollama for a CodeAgent. #1347

@FlorianVal

Description

Describe the bug
In a CodeAgent, when trying to stream and set a planning interval, the stream does not work.

Code to reproduce the error

```python
from smolagents import CodeAgent, LiteLLMModel, WebSearchTool

model = LiteLLMModel(model_id="ollama/qwen3:8b", api_base="http://localhost:11434", num_ctx=128000)
# Can be any model, I also tried with: model = InferenceClientModel()

agent = CodeAgent(
    name="test_agent",
    description="A test agent.",
    tools=[WebSearchTool()],  # WebSearchTool is a tool (not an agent), so it goes in `tools`, instantiated
    model=model,
    stream_outputs=True,
    planning_interval=2,
)
agent.run(
    "Given the average speed of a chicken, how much time does a chicken take to run the Pont des Arts in Paris?"
)
```
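As a possible workaround until `stream_outputs` works together with `planning_interval`, the run generator (`agent.run(..., stream=True)`) can be consumed directly, so each planning and action step is printed as soon as it is yielded. The `render_step` helper below is a hypothetical name, and the attribute fallbacks (`plan` on planning steps, `model_output` on action steps) are assumptions about the step objects on main:

```python
def render_step(step) -> str:
    # Yielded step objects carry their text in different attributes depending
    # on the step type (assumed: PlanningStep.plan, ActionStep.model_output);
    # fall back to str() for anything else.
    for attr in ("plan", "model_output"):
        text = getattr(step, attr, None)
        if text:
            return text
    return str(step)

# Hypothetical usage against the reproduction above (needs a running Ollama
# server, so it is commented out here):
# for step in agent.run("How much time does a chicken take to run the Pont des Arts?", stream=True):
#     print(render_step(step))
```

This only streams step-by-step rather than token-by-token, but it at least surfaces the planning output as it is produced.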

Error logs (if any)
No error is raised; the output is simply not streamed.

Expected behavior
A stream of the planning step and the LLM output.

Packages version:
On main branch

Labels

bug (Something isn't working)