vijaye12 committed on
Commit d36e37a
1 Parent(s): 672fe08

Update README.md

Files changed (1)
  1. README.md (+1 -1)
README.md CHANGED
@@ -21,7 +21,7 @@ TinyTimeMixers (TTMs) are compact pre-trained models for Multivariate Time-Serie
 
 
  TTM-R1 comprises TTM variants pre-trained on 250M public training samples. We have another set of TTM models released under TTM-R2 trained on a much larger pretraining
- dataset (~700M samples) which can be accessed from [here](https://huggingface.co/ibm-granite/granite-timeseries-ttm-r2) In general, TTM-R2 models perform better than
+ dataset (~700M samples) which can be accessed from [here](https://huggingface.co/ibm-granite/granite-timeseries-ttm-r2). In general, TTM-R2 models perform better than
  TTM-R1 models as they are trained on larger pretraining dataset. However, the choice of R1 vs R2 depends on your target data distribution. Hence requesting users to
  try both R1 and R2 variants and pick the best for your data.
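For context on the "try both R1 and R2 variants" advice in the README text above, here is a minimal sketch of loading both checkpoints for comparison. It is not part of this commit, and it assumes the `tsfm_public` package from IBM's granite-tsfm repository is installed and that `TinyTimeMixerForPrediction.from_pretrained` accepts these Hugging Face Hub model IDs.

```python
# Illustrative sketch only; assumes tsfm_public (ibm-granite/granite-tsfm) is installed.
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

# TTM-R1: variants pre-trained on ~250M public training samples.
ttm_r1 = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r1"
)

# TTM-R2: variants pre-trained on a larger corpus (~700M samples).
ttm_r2 = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r2"
)

# Forecast with each model on a held-out slice of your target data and keep
# whichever variant gives the lower error, as the README recommends.
```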