
Conversation

@ochafik (Contributor) commented on Jan 22, 2025

Fixes #28

  • Parse `{% generation %}` blocks (no-op for inference)
  • Accommodate models that require typed content (`"messages": [{"role": "user", "content": [{"type": "text", "text": "Hey"}]}]`); see the sketch after this list
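As a rough illustration of both items, here is a hypothetical template sketch; it is not the actual MiniMax-Text-01 chat template or minja test code. The `{% generation %}` block is the marker chat templates use to delimit assistant tokens (e.g. for training-time loss masks); per this PR, minja parses it but simply renders its body unchanged at inference time, and message `content` may be a list of typed parts rather than a plain string.

```jinja
{#- Hypothetical template for illustration only (not the real MiniMax-Text-01 template). -#}
{%- for message in messages -%}
<{{ message.role }}>
{%- if message.role == 'assistant' -%}
    {#- Assistant text wrapped in a generation block: used to mark assistant tokens
        for training, but rendered as a plain pass-through (no-op) for inference. -#}
    {% generation %}{{ message.content }}{% endgeneration %}
{%- else -%}
    {#- Typed content: "content" is a list of {"type": "text", "text": "..."} parts. -#}
    {%- for part in message.content -%}
        {%- if part.type == 'text' -%}{{ part.text }}{%- endif -%}
    {%- endfor -%}
{%- endif -%}
</{{ message.role }}>
{%- endfor -%}
```

Treating the generation block as a no-op keeps inference output identical to a template without the tags, while still letting training pipelines recover the assistant token spans.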

@ochafik changed the title from "Add generation support for inference (#28)" to "Support MiniMaxAI/MiniMax-Text-01" on Jan 22, 2025
@ochafik marked this pull request as ready for review on January 22, 2025 at 14:27
@ochafik merged commit 0f5f7f2 into main on Jan 22, 2025
1 check passed
@ochafik deleted the generation branch on January 22, 2025 at 14:29
ochafik pushed commits to ochafik/llama.cpp that referenced this pull request on Jan 22, 2025

Successfully merging this pull request may close these issues.

Support MiniMaxAI/MiniMax-Text-01 ({% generation %} blocks)