Description
OSError: ollama/llama3.1 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
It took me a while to figure out why I couldn't use llama3.1 here, given that it worked in another virtual environment. I realised that the pipeline-llm extra is needed.
Perhaps it would be helpful to point out this option at install time, or in the installation instructions?
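For reference, a minimal sketch of the workaround I used; the distribution name below is a placeholder for whatever this repo publishes on PyPI:

```console
# Hypothetical distribution name; substitute the actual package for this repo.
# Without the pipeline-llm extra, a model string like "ollama/llama3.1" appears
# to fall through to the Hugging Face loader, which raises the OSError above.
pip install "<package-name>[pipeline-llm]"
```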