Static 6-bit (Q6_K_L) GGUF quantization of mistralai/Mistral-Large-Instruct-2411.
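As a usage sketch (not part of the original card), a GGUF quant like this one can be loaded with llama-cpp-python's Llama.from_pretrained. The filename below is a placeholder and must be replaced with the actual file (or first shard, if the quant is split) listed in the repository; the context size and GPU offload settings are assumptions to adjust for your hardware.

from llama_cpp import Llama

# Minimal sketch, assuming llama-cpp-python is installed with GPU support.
llm = Llama.from_pretrained(
    repo_id="Valeciela/Mistral-Large-Instruct-2411-Q6_K_L-GGUF",
    # Hypothetical filename: check the repo's file listing; if the quant is
    # split into shards, point at the first shard (…-00001-of-0000N.gguf).
    filename="Mistral-Large-Instruct-2411-Q6_K_L.gguf",
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows
    n_ctx=8192,        # context window; lower it if you run out of memory
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])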
Model tree for Valeciela/Mistral-Large-Instruct-2411-Q6_K_L-GGUF
Base model: mistralai/Mistral-Large-Instruct-2411