Ollama quantized variants

#8 opened by mswinds

Hi, could you please upload an fp16 version of the ajindal/llama3.1-storm 8b model on Ollama? The default tag is Q4_0, and the other tags are also quantized.
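
For reference, here is a minimal sketch of how an fp16 tag could be published, assuming the weights are first exported to an F16 GGUF (for example with llama.cpp's convert_hf_to_gguf.py). The file name, local paths, and the `8b-fp16` tag name are illustrative assumptions, not the maintainer's actual workflow:

```shell
# Assumption: an F16 GGUF was produced from the HF checkpoint beforehand, e.g.
#   python convert_hf_to_gguf.py <path-to-hf-model> --outtype f16 --outfile llama3.1-storm-8b-f16.gguf

# Write a Modelfile that points at the unquantized GGUF (path is illustrative)
cat > Modelfile <<'EOF'
FROM ./llama3.1-storm-8b-f16.gguf
EOF

# Create the model locally under a new tag, then push it to ollama.com
ollama create ajindal/llama3.1-storm:8b-fp16 -f Modelfile
ollama push ajindal/llama3.1-storm:8b-fp16
```

Once such a tag exists, users could pull it with `ollama run ajindal/llama3.1-storm:8b-fp16` instead of the quantized default.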
