7B is cold on serverless Inference even though it's only 7B!

#11 opened by Syndicate604

For some reason we cannot run this on serverless inference even though it's only 5-6 GB on my GPU; HF only shows the 70B model when you try to run it!
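For reference, a minimal sketch of how one might query the serverless Inference API via huggingface_hub; the repo id below is a placeholder (the thread doesn't name the exact model), so substitute the actual 7B repo.

```python
# Minimal sketch: query the serverless Inference API with huggingface_hub.
# The model id is a placeholder, not the actual repo from this discussion.
from huggingface_hub import InferenceClient

client = InferenceClient(model="org/model-7b")  # hypothetical repo id

# text_generation calls the serverless endpoint; a cold model may return
# a 503 while it loads, or an error if serverless isn't enabled for it.
output = client.text_generation(
    "Hello, how are you?",
    max_new_tokens=50,
)
print(output)
```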
