alpindale committed on
Commit
13a5f02
1 Parent(s): ec0d9b2

Create axolotl_rocm_setup.sh

axolotl_config/axolotl_rocm_setup.sh ADDED
@@ -0,0 +1,30 @@
+ #!/bin/bash
+
+ # Setup Axolotl with FA2 and BnB ROCm - doctorshotgun Aug 6, 2024
+ # Runpod image: RunPod Pytorch 2.1.2 ROCm 6.1 runpod/pytorch:2.1.2-py3.10-rocm6.1-ubuntu22.04
+
+ # Install torch and flash-attn
+ pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/rocm6.1
+ pip install https://github.com/DocShotgun/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+rocm6.1+torch2.4.0-cp310-cp310-linux_x86_64.whl
+
+ # For some reason we need to manually install amdsmi for torch 2.4.0 with ROCm 6.1
+ cd /opt/rocm/share/amd_smi && pip install .
+
+ # Install Axolotl
+ cd /workspace/
+ git clone https://github.com/axolotl-ai-cloud/axolotl && cd axolotl
+ git checkout 70978467a088da3abf3fe45d92d90f6529f19ea9
+ pip install -e '.[deepspeed]'
+
+ # Install Bitsandbytes (multi-backend-refactor branch)
+ cd /workspace/
+ git clone https://github.com/TimDettmers/bitsandbytes.git && cd bitsandbytes/
+ git checkout 6d9b69b626bf93a9ec22b068d1d4107f70979e34
+ pip install -r requirements-dev.txt
+ cmake -DCOMPUTE_BACKEND=hip -S .
+ make
+ pip install -e .
+
+ # To begin training, run:
+ # accelerate launch -m axolotl.cli.train <your_config.yml>
+ # If you encounter an error related to xformers, you can try editing /src/axolotl/monkeypatch/llama_attn_hijack_flash.py (for llama-type models) to comment out the xformers import and force is_xformers_swiglu_available to return False
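
After the torch and flash-attn installs at the top of the script, a quick sanity check can confirm that the ROCm builds were actually picked up. This is a minimal sketch, not part of the committed script, and assumes the same Python environment as the steps above:

# Not in the original script: verify the ROCm torch build and the flash-attn wheel import cleanly.
python -c "import torch; print(torch.__version__, torch.version.hip, torch.cuda.is_available())"
python -c "import flash_attn; print(flash_attn.__version__)"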
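
Similarly, after building bitsandbytes with the HIP compute backend, its bundled diagnostic entry point can be used to check which backend was compiled in. Again a sketch outside the committed script, assuming the editable install above succeeded:

# Not in the original script: print bitsandbytes' own environment and backend diagnostics.
python -m bitsandbytes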
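
The training command at the end expects an Axolotl YAML config. The heredoc below sketches a minimal, illustrative QLoRA config; the base model, dataset, output directory, and hyperparameters are placeholders rather than values taken from this commit, so replace them with your own before launching:

# Illustrative only: every value below is a placeholder, not from this repo.
cat > /workspace/axolotl/example_qlora.yml <<'EOF'
base_model: mistralai/Mistral-7B-v0.1
load_in_4bit: true
adapter: qlora
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true
datasets:
  - path: teknium/GPT4-LLM-Cleaned
    type: alpaca
sequence_len: 4096
sample_packing: true
micro_batch_size: 2
gradient_accumulation_steps: 4
num_epochs: 1
learning_rate: 0.0002
optimizer: adamw_torch
lr_scheduler: cosine
bf16: true
flash_attention: true
output_dir: ./outputs/example-qlora
EOF
cd /workspace/axolotl && accelerate launch -m axolotl.cli.train example_qlora.yml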