Update README.md
README.md CHANGED
@@ -50,9 +50,7 @@ Our models are not specifically designed or evaluated for all downstream purpose
 ## Usage
 
 ### Requirements
-Phi-3.5-MoE-instruct
-* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.
-
+Phi-3.5-MoE-instruct is integrated in the official version of `transformers` starting from **4.46.0**.
 The current `transformers` version can be verified with: `pip list | grep transformers`.
 
 Examples of required packages:
@@ -60,7 +58,7 @@ Examples of required packages:
 flash_attn==2.5.8
 torch==2.3.1
 accelerate==0.31.0
-transformers==4.
+transformers==4.46.0
 ```
 
 Phi-3.5-MoE-instruct is also available in [Azure AI Studio](https://aka.ms/try-phi3.5moe)
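The change pins native Phi-3.5-MoE-instruct support to `transformers` 4.46.0 or newer. A minimal stdlib-only sketch of that minimum-version check — the `meets_minimum` helper is hypothetical, written here for illustration; real code should use `packaging.version.Version` instead of naive field-by-field comparison:

```python
# Sketch of the version gate implied by the change above:
# native Phi-3.5-MoE-instruct support requires transformers >= 4.46.0.
# `meets_minimum` is a hypothetical helper, not part of any library.

def meets_minimum(installed: str, required: str = "4.46.0") -> bool:
    """Compare dotted release strings numerically, field by field."""
    def parts(version: str) -> list[int]:
        return [int(p) for p in version.split(".")]
    return parts(installed) >= parts(required)

print(meets_minimum("4.46.0"))  # True: exactly the minimum
print(meets_minimum("4.45.2"))  # False: predates native support
```

In practice, `pip list | grep transformers` (as the README suggests) is enough to eyeball the installed version; the sketch only shows the comparison the release notes imply.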