---
language:
- en
base_model: louisbrulenaudet/Pearl-3x7B
library_name: mlx
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- dvilasuero/DistilabelBeagle14-7B
- beowolx/CodeNinja-1.0-OpenChat-7B
- WizardLM/WizardMath-7B-V1.1
- Maths
- Code
- Python
pipeline_tag: text-generation
license: apache-2.0
---

<center><img src='https://i.imgur.com/0xFTuAX.png' width='450px'></center>

# mlx-community/Pearl-3x7B

This model was converted to MLX format from [`louisbrulenaudet/Pearl-3x7B`](https://huggingface.co/louisbrulenaudet/Pearl-3x7B) using mlx-lm version **0.16.1**.
Refer to the [original model card](https://huggingface.co/louisbrulenaudet/Pearl-3x7B) for more details on the model.

## Use with mlx

```bash
pip install -U mlx-lm
python -m mlx_lm.generate --model mlx-community/Pearl-3x7B --prompt "hello" --max-tokens 100 --temp 0.0
```

```python
from mlx_lm import load, generate

# Download the weights from the Hub (if needed) and load the model and tokenizer.
model, tokenizer = load("mlx-community/Pearl-3x7B")

# Generate a completion for the prompt; verbose=True streams the output as it is produced.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
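
For chat-style prompting, you can optionally render the input through the tokenizer's chat template before generating. The sketch below is a minimal example, assuming the converted tokenizer ships a chat template; the `messages` content is illustrative:

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Pearl-3x7B")

# Illustrative chat-style input; adjust to your use case.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]

# If a chat template is available, render the messages into the model's
# expected prompt format; otherwise fall back to the raw user text.
if tokenizer.chat_template is not None:
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, tokenize=False
    )
else:
    prompt = messages[0]["content"]

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```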

## Citing & Authors

If you use this code in your research, please use the following BibTeX entry.

```BibTeX
@misc{louisbrulenaudet2024,
  author       = {Louis Brulé Naudet},
  title        = {Pearl-3x7B, an xtraordinary Mixture of Experts (MoE) for data science},
  year         = {2024},
  howpublished = {\url{https://huggingface.co/mlx-community/Pearl-3x7B}},
}
```

## Feedback

If you have any feedback, please reach out at [[email protected]](mailto:[email protected]).