
How to use Gemini with txtai #843

@igorlima

I've been exploring the possibility of using Google Gemini with txtai, but I haven't found any references to Gemini in the documentation yet.

Is there a way to embed text in txtai using Gemini? The documentation covers other LLMs, but Gemini seems to be missing.

Here's a snippet of what I've attempted using the litellm method, though I haven't had any success so far:

import os

# API key read by litellm for Gemini requests
os.environ["GEMINI_API_KEY"] = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

# Attempt 1: text generation through the LLM pipeline with the litellm method
from txtai import LLM

llm = LLM("gemini/gemini-pro", method="litellm")
llm("Where is one place you'd go in Washington, DC?")

# Attempt 2: indexing with an Embeddings instance pointed at Gemini
from txtai import Embeddings

embeddings = Embeddings(path="litellm/gemini/gemini-pro")
data = [
  "US tops 5 million confirmed virus cases",
  "Canada's last fully intact ice shelf has suddenly collapsed, forming a Manhattan-sized iceberg",
]
embeddings.index(data)
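
For what it's worth, here's the variation I'm planning to try next. It's based on my reading of the litellm docs rather than anything in the txtai documentation, so treat it as a guess: the embedding model id (gemini/text-embedding-004) and passing method="litellm" to Embeddings are both my assumptions, on the theory that gemini-pro is a generation model and can't produce embeddings.

# Untested sketch - reuses the GEMINI_API_KEY and the data list from above
from txtai import Embeddings

# "gemini/text-embedding-004" is my guess at a litellm id for a Gemini embedding model;
# method="litellm" should tell txtai to vectorize through litellm instead of transformers
embeddings = Embeddings(path="gemini/text-embedding-004", method="litellm")
embeddings.index(data)

# Quick check: should return the (id, score) of the closest indexed entry
print(embeddings.search("climate", 1))

If that's even the right direction, I'd still like to confirm which litellm embedding model ids txtai expects in path.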

I'd appreciate any guidance if anyone has insights or knows of documentation or examples on using Gemini with txtai.
