What are the installation requirements for using triton?
I'm using triton:
self.config.attn_config['attn_impl'] = 'triton'
The error I'm getting is:
Exception in thread Thread-13 (generate_and_signal_complete):
Traceback (most recent call last):
  File "/root/.cache/huggingface/modules/transformers_modules/mosaicml/mpt-7b-instruct/925e0d80e50e77aaddaf9c3ced41ca4ea23a1025/attention.py", line 109, in triton_flash_attn_fn
    from .flash_attn_triton import flash_attn_func
  File "/root/.cache/huggingface/modules/transformers_modules/mosaicml/mpt-7b-instruct/925e0d80e50e77aaddaf9c3ced41ca4ea23a1025/flash_attn_triton.py", line 46, in <module>
    import triton_pre_mlir as triton
ModuleNotFoundError: No module named 'triton_pre_mlir'
Currently I'm installing:
!pip install --upgrade transformers einops datasets accelerate
Please see llm-foundry's setup.py for our dependencies: https://github.com/mosaicml/llm-foundry/blob/2ba9224f6a841e157cdc5069c1e0a6fa830557dc/setup.py#L66
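For anyone hitting the same error: the triton attention path imports `triton_pre_mlir`, a pre-MLIR fork of Triton that is not published on PyPI under that name, so the usual pip packages are not enough. A sketch of an install based on the git pin in the linked llm-foundry setup.py (the pin may change, so check setup.py if it fails):

```shell
# Usual deps (as in the question)
pip install --upgrade transformers einops datasets accelerate

# Pre-MLIR Triton fork that mpt-7b's flash_attn_triton.py
# imports as `triton_pre_mlir`; pin taken from llm-foundry's setup.py
pip install "triton-pre-mlir@git+https://github.com/vchiley/triton.git@triton_pre_mlir#subdirectory=python"
```

After this, `import triton_pre_mlir` should succeed, which is what `flash_attn_triton.py` needs before `attn_impl = 'triton'` can load.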