Hardware requirement

#1
by Dtree07

Does anyone know how much VRAM I need to run this model? Thx.

LLaMA-MoE org

Hi there. The model is built on LLaMA-2 7B, so its VRAM footprint is (almost) the same as the 7B model: ~14GB for the weights alone in fp16, plus some extra space for the KV cache, which grows with the context length.
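
If it helps, here is a back-of-the-envelope estimate of where the ~14GB comes from and how the KV cache scales. This is a rough sketch assuming fp16 weights and the standard LLaMA-2 7B configuration (32 layers, 32 heads, head dim 128); actual usage will vary with framework overhead and activation memory.

```python
# Rough VRAM estimate: weights + KV cache (fp16 assumed throughout).

n_params = 7e9          # ~7B parameters, matching the LLaMA-2 7B base
bytes_per_param = 2     # fp16/bf16: 2 bytes per parameter

weight_mem_gib = n_params * bytes_per_param / 1024**3
print(f"weights: ~{weight_mem_gib:.1f} GiB")   # ~13.0 GiB, i.e. the "~14GB"

# KV cache per token: 2 (K and V) * layers * heads * head_dim * bytes.
# Assumed LLaMA-2 7B shape; check the actual model config to be sure.
n_layers, n_heads, head_dim = 32, 32, 128
seq_len = 4096

kv_cache_gib = (2 * n_layers * n_heads * head_dim
                * bytes_per_param * seq_len) / 1024**3
print(f"KV cache @ {seq_len} tokens: ~{kv_cache_gib:.1f} GiB")  # ~2.0 GiB
```

So a 16GB card is comfortable at moderate context lengths, and quantized inference (e.g. 8-bit or 4-bit) would lower the weight memory further.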

Thanks for your reply. ♥
