runtime error

```
The token has not been saved to the git credentials helper. Pass `add_to_git_credential=True` in this function directly or `--add-to-git-credential` if using via `huggingface-cli` if you want to set the git credential as well.
Token is valid (permission: fineGrained).
Your token has been saved to /home/user/.cache/huggingface/token
Login successful
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 982, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 684, in __getitem__
    raise KeyError(key)
KeyError: 'llava_llama'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 63, in <module>
    ai_assistant = MultimodalAI()
  File "/home/user/app/multimodal_ai.py", line 28, in __init__
    self._load_model_and_tokenizer()
  File "/home/user/app/multimodal_ai.py", line 32, in _load_model_and_tokenizer
    self.model = AutoModelForCausalLM.from_pretrained(self.model_name,
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 524, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 984, in from_pretrained
    raise ValueError(
ValueError: The checkpoint you are trying to load has model type `llava_llama` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```
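
For context, the failure happens because `llava_llama` is not a model type that the installed Transformers release knows about, exactly as the final `ValueError` says. Below is a minimal sketch of the loading step from `multimodal_ai.py`, assuming a hypothetical placeholder for `self.model_name` (the real value is not shown in the log). It only illustrates the two directions the error message itself points at: checking the Transformers version, and passing `trust_remote_code=True`, which can only help if the checkpoint repo actually ships custom modeling code; otherwise the LLaVA code base that registers `llava_llama` has to be installed.

```python
# Sketch only, not the app's actual code. The model name is a hypothetical
# placeholder standing in for self.model_name.
import transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

# First thing to check per the error message: an out-of-date Transformers install.
print("transformers version:", transformers.__version__)

model_name = "liuhaotian/llava-v1.5-7b"  # hypothetical placeholder

# trust_remote_code=True only resolves `llava_llama` if the checkpoint's
# config.json maps it to custom code in the repo; if it does not, the LLaVA
# package itself must be installed so the architecture gets registered.
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
```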
