musicaudiopretrain committed on
Commit 9c29caf
•
1 Parent(s): c40b01f

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -10,7 +10,7 @@ pinned: false
  We are a group of people working towards Music AGI (Artificial General Intelligence)~ We pre-train large music models (LMMs)~🔥

  The development log of our Music Audio Pre-training (m-a-p) model family:
- - 02/06/2023: officially release the [MERt pre-print paper](https://arxiv.org/abs/2306.00107) and training [codes](https://github.com/yizhilll/MERT).
+ - 02/06/2023: officially release the [MERT pre-print paper](https://arxiv.org/abs/2306.00107) and training [codes](https://github.com/yizhilll/MERT).
  - 17/03/2023: we release two advanced music understanding models, [MERT-v1-95M](https://huggingface.co/m-a-p/MERT-v1-95M) and [MERT-v1-330M](https://huggingface.co/m-a-p/MERT-v1-330M), trained with a new paradigm and dataset. They outperform the previous models and can better generalize to more tasks.
  - 14/03/2023: we retrained the MERT-v0 model with the open-source-only music dataset [MERT-v0-public](https://huggingface.co/m-a-p/MERT-v0-public).
  - 29/12/2022: a music understanding model [MERT-v0](https://huggingface.co/m-a-p/MERT-v0) trained with the **MLM** paradigm, which performs better at downstream tasks.