
How to run on Colab on CPU?

#63
by deepakkaura26 - opened

Can someone share an example of how to run the "mosaicml/mpt-7b-instruct" model on CPU in Colab?
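One possible approach, sketched below under stated assumptions: load the model with `transformers` using `trust_remote_code=True` (MPT ships custom modeling code) and `torch_dtype=torch.bfloat16` to halve memory versus float32. Be aware that a 7B model still needs roughly 13 GB of RAM in bfloat16, which exceeds the free Colab tier's memory, so this shows the API rather than guaranteeing a fit; the prompt and generation settings are illustrative only.

```python
# Sketch: running mosaicml/mpt-7b-instruct on CPU via transformers.
# Assumes enough RAM (~13 GB in bfloat16); free Colab may not have it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,   # halves memory vs. float32
    trust_remote_code=True,       # MPT uses custom modeling code
)
model.eval()

# Illustrative prompt; CPU generation for a 7B model is slow
# (expect seconds per token, not tokens per second).
prompt = "Explain what a transformer model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If memory is the blocker, a smaller instruction-tuned model or a quantized build of MPT-7B is usually the more realistic path on free Colab.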
