TinyLlama-1.1B-Chat-v1.0-llamafile
llamafile lets you distribute and run LLMs with a single file. See the announcement blog post for details.
Downloads
- tinyllama-1.1b-chat-v1.0.Q3_K_M-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q5_0-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q5_K_M-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q8_0-server.llamafile
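The `-server` llamafiles bundle the model weights with a local web server that exposes an OpenAI-compatible chat completions API. Below is a minimal sketch of querying it from Python, assuming the llamafile has been made executable and started, and that the server is listening on its default address `http://localhost:8080`; the prompt and the `model` field are illustrative only.

```python
# Minimal sketch: query a running -server llamafile through its
# OpenAI-compatible /v1/chat/completions endpoint.
# Assumes the server is listening on the default http://localhost:8080.
import json
import urllib.request

payload = {
    # Informational for a single-model server; adjust if your setup requires it.
    "model": "tinyllama-1.1b-chat-v1.0",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what a llamafile is in one sentence."},
    ],
    "temperature": 0.7,
}

request = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Send the request and print the assistant's reply.
with urllib.request.urlopen(request) as response:
    reply = json.load(response)

print(reply["choices"][0]["message"]["content"])
```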
This repository was created using llamafile-builder.
Model tree for rabil/TinyLlama-1.1B-Chat-v1.0-llamafile
- Base model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
- Quantized from: TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF