nomic-embed-text running slowly

#32
by xtreme786 - opened

When I run embeddings on a text file using this command:

embedding = OllamaEmbeddings(model="nomic-embed-text", show_progress=True)

the output is really slow:

OllamaEmbeddings: 56%|█████████████████████████████████████████████▊ | 152/272 [05:23<04:10, 2.09s/it]

PS: I have a local 4090 GPU, on which I am able to run llama3.1.
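
(For context, a minimal sketch of the kind of LangChain pipeline that produces a progress bar like the one above; the loader, splitter, and the input.txt file name are assumptions, not the poster's actual script:)

# Sketch: load a text file, split it into chunks, and embed each chunk via a local Ollama server.
from langchain_community.document_loaders import TextLoader
from langchain_community.embeddings import OllamaEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

docs = TextLoader("input.txt").load()  # hypothetical input file
chunks = RecursiveCharacterTextSplitter(chunk_size=1000).split_documents(docs)

embedding = OllamaEmbeddings(model="nomic-embed-text", show_progress=True)
vectors = embedding.embed_documents([c.page_content for c in chunks])  # one Ollama request per chunk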

Nomic AI org

Without more context, this might be better suited for the Ollama GitHub. However, we just added SDPA support for Nomic Embed, which should make inference faster. I'm not sure how Ollama implements Nomic Embed.
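
(For anyone bypassing Ollama and loading the checkpoint directly with transformers, a minimal sketch; the attn_implementation="sdpa" and fp16 settings are assumptions and may need adjusting for your transformers version and the model's remote code:)

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nomic-ai/nomic-embed-text-v1.5")
model = AutoModel.from_pretrained(
    "nomic-ai/nomic-embed-text-v1.5",
    trust_remote_code=True,
    attn_implementation="sdpa",  # assumption: drop this kwarg if the remote code rejects it
    torch_dtype=torch.float16,   # assumption: fp16 on a CUDA GPU such as the 4090 above
).to("cuda").eval()

texts = ["search_document: example passage to embed"]  # placeholder text with a Nomic task prefix
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt").to("cuda")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state
mask = batch["attention_mask"].unsqueeze(-1)
embeddings = F.normalize((hidden * mask).sum(1) / mask.sum(1), p=2, dim=1)  # mean pooling + L2 norm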

zpn changed discussion status to closed
Nomic AI org

You might get better performance with HuggingFaceEmbeddings. I think the code is:

from langchain_huggingface import HuggingFaceEmbeddings

embedding = HuggingFaceEmbeddings(model_name="nomic-ai/nomic-embed-text-v1.5", model_kwargs={"trust_remote_code": True})
  • Tom Aarsen
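
A quick usage sketch on top of that (the "search_document:" / "search_query:" task prefixes are the ones Nomic documents for this model; the texts are placeholders):

doc_vectors = embedding.embed_documents(["search_document: example passage"])
query_vector = embedding.embed_query("search_query: example question")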
