Following up on our conversation in #635, I tried downloading a model (both manually and automatically), then passed its path to the LLM class and got the error below:
>>> llm = LLM('/data/models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf', method='llama.cpp')
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/transformers/configuration_utils.py", line 729, in _get_config_dict
    config_dict = cls._dict_from_json_file(resolved_config_file)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/configuration_utils.py", line 827, in _dict_from_json_file
    text = reader.read()
           ^^^^^^^^^^^^^
  File "<frozen codecs>", line 322, in decode
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc9 in position 8: invalid continuation byte

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.11/site-packages/txtai/pipeline/text/llm.py", line 18, in __init__
    super().__init__(self.task(path, task, **kwargs), path if path else "google/flan-t5-base", quantize, gpu, model, **kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/txtai/pipeline/text/llm.py", line 103, in task
    task = Models.task(path, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/txtai/models/models.py", line 239, in task
    config = AutoConfig.from_pretrained(path, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1082, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/configuration_utils.py", line 644, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/configuration_utils.py", line 732, in _get_config_dict
    raise EnvironmentError(
OSError: It looks like the config file at '/data/models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf' is not a valid JSON file.
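The file itself looks intact to me: GGUF is a binary format whose files start with the 4-byte magic b"GGUF", so it can never parse as a JSON config, which is presumably why the UTF-8 decode inside AutoConfig.from_pretrained() fails. Here is the quick sanity check I ran on the file (my own snippet, not txtai code):

# Sanity check: GGUF files begin with the magic bytes b"GGUF",
# i.e. the file is binary and is not a transformers JSON config.
def is_gguf(path):
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

print(is_gguf("/data/models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf"))  # expected: True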
Code that was run
>>> from txtai.pipeline import LLM
>>> llm = LLM('/data/models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf', method='llama.cpp')
Directory with models
root@efe095bdf9d9:/app# ls -la /data/models
total 14103508
drwxr-xr-x 7 1000 1000 4096 Jan 13 00:51 .
drwxr-xr-x 3 root root 4096 Jan 13 00:51 ..
-rw-r--r-- 1 1000 1000 667814368 Jan 12 18:47 tinyllama-1.1b-1t-openorca.Q4_K_M.gguf
-rw-r--r-- 1 1000 1000 668788096 Jan 12 23:40 tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf
I don't understand the error: why is the path being interpreted as a JSON config file? From the traceback it looks like LLM always routes the path through Models.task(), which calls AutoConfig.from_pretrained() and expects a transformers model, even when method='llama.cpp' is passed. Do you have any thoughts on this?
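For comparison, something along these lines should load the GGUF directly with llama-cpp-python and bypass the transformers config lookup entirely (a rough sketch; it assumes llama-cpp-python is installed, and n_ctx / max_tokens are just placeholder values I picked):

# Workaround sketch: load the GGUF with llama-cpp-python directly,
# skipping txtai/transformers. n_ctx and max_tokens are guesses.
from llama_cpp import Llama

llm = Llama(model_path="/data/models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf", n_ctx=2048)
output = llm("Q: What is the capital of France? A:", max_tokens=32)
print(output["choices"][0]["text"])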