Using in Colab with T4 GPU
#1
by Manu9000k - opened
Can you please explain how to use it with free Colab? I see the server has a --load_in_8bit option, but it does not run on free Colab:
!python3 server.py --model "meetkai/functionary-7b-v1.1" --load_in_8bit True --device "cuda:0"
Hi, unfortunately we have just removed server.py and moved to a vLLM-based server, which is likely why that command no longer works. We may add it back in the future, but it takes resources/time to maintain two server files.
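For anyone following along, a minimal sketch of launching the model with the vLLM OpenAI-compatible server is below. This is an illustration, not the project's documented setup: the exact flags and supported values depend on your vLLM version, and the memory-related settings (--dtype, --max-model-len, --gpu-memory-utilization) are assumptions chosen to fit a 7B model on a 16 GB T4; vLLM does not expose a --load_in_8bit flag, so 8-bit loading as in the old server.py is not available this way.

```shell
# Install vLLM (pulls in CUDA-enabled PyTorch wheels).
pip install vllm

# Launch the OpenAI-compatible API server with the model.
# --dtype half            : use fp16 weights (T4 has no bf16 support)
# --max-model-len 2048    : cap context length to reduce KV-cache memory (assumed value)
# --gpu-memory-utilization: fraction of GPU memory vLLM may reserve (assumed value)
python -m vllm.entrypoints.openai.api_server \
    --model "meetkai/functionary-7b-v1.1" \
    --dtype half \
    --max-model-len 2048 \
    --gpu-memory-utilization 0.90
```

Whether this fits on a free-tier T4 at all depends on the model's fp16 footprint (~14 GB for 7B parameters), so it may still be too tight without quantization.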