# Phi-3-mini-instruct-graph / requirements.txt
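# core app: .env loading, Gradio UI, Transformers + Accelerate inference, fast JSON parsing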
python-dotenv  # provides the load_dotenv() helper
gradio
transformers
accelerate
python-rapidjson
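# Hugging Face Spaces SDK (ZeroGPU @spaces.GPU decorator)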
spaces
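# graph construction/visualization (networkx, pyvis) and NLP (spacy)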
pyvis
networkx
spacy
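# pinned prebuilt flash-attn 2.5.9.post1 wheel (CUDA 11.8, Torch 1.12 ABI, CPython 3.10, Linux x86_64), so the Space skips a source build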
https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl