vLLM says it is not a safetensors repo
#5
by
gnoale
- opened
Hello, while loading the model with vLLM (v0.12.0), it surfaces an error coming from the huggingface_hub library:
(APIServer pid=1) ERROR 12-24 13:41:28 [transformers_utils/repo_utils.py:65] Error retrieving safetensors: 'mistralai/Mistral-Large-3-675B-Instruct-2512-NVFP4' is not a safetensors repo. Couldn't find 'model.safetensors.index.json' or 'model.safetensors' files., retrying 1 of 2
(APIServer pid=1) ERROR 12-24 13:41:30 [transformers_utils/repo_utils.py:63] Error retrieving safetensors: 'mistralai/Mistral-Large-3-675B-Instruct-2512-NVFP4' is not a safetensors repo. Couldn't find 'model.safetensors.index.json' or 'model.safetensors' files.
The library only checks for [model.safetensors.index.json](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/constants.py#L48), and this repository has only a consolidated.safetensors.index.json.
What do you think?
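For reference, here is a minimal sketch of the filename check as I understand it. The real logic lives in huggingface_hub; the helper below is hypothetical and just illustrates why a repo that ships only consolidated.safetensors.index.json fails the check:

```python
# Conventional filenames that huggingface_hub looks for when deciding
# whether a repo is a "safetensors repo" (per its constants module).
SAFETENSORS_SINGLE_FILE = "model.safetensors"
SAFETENSORS_INDEX_FILE = "model.safetensors.index.json"

def is_safetensors_repo(filenames):
    """Hypothetical re-implementation of the check: only the conventional
    names count, so 'consolidated.safetensors.index.json' is ignored."""
    return SAFETENSORS_SINGLE_FILE in filenames or SAFETENSORS_INDEX_FILE in filenames

# A repo that ships only consolidated files fails the check:
print(is_safetensors_repo(["consolidated.safetensors.index.json",
                           "consolidated-00001-of-00002.safetensors"]))  # False
print(is_safetensors_repo(["model.safetensors.index.json"]))  # True
```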
gnoale
changed discussion title from
vLLM cannot find consolidated.safetensors.index.json
to vLLM says it is not a safetensors repo
Yes, I would suggest the model authors use the conventional filenames if possible!
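In the meantime, one possible local workaround (an untested sketch, assuming the index's weight_map already points at the consolidated-*.safetensors shards, so only the index filename needs to change): download the repo, copy the index file to the conventional name, and point vLLM at the local path.

```python
from pathlib import Path
import shutil

def add_conventional_index(repo_dir: str) -> Path:
    """Copy consolidated.safetensors.index.json to the conventional
    model.safetensors.index.json so loaders that only look for the
    conventional name can find it. The shard filenames referenced in
    the index's weight_map are left untouched."""
    src = Path(repo_dir) / "consolidated.safetensors.index.json"
    dst = Path(repo_dir) / "model.safetensors.index.json"
    if not dst.exists():
        shutil.copyfile(src, dst)
    return dst
```

Whether vLLM then loads the consolidated shards correctly would still need to be verified against the actual checkpoint layout.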