Update README.md
README.md
@@ -107,7 +107,7 @@ This repo contains models, code and pointers to datasets from our paper: [TwHIN-
 ### Overview
 TwHIN-BERT is a new multi-lingual Tweet language model that is trained on 7 billion Tweets from over 100 distinct languages. TwHIN-BERT differs from prior pre-trained language models as it is trained with not only text-based self-supervision (e.g., MLM), but also with a social objective based on the rich social engagements within a Twitter Heterogeneous Information Network (TwHIN).

-TwHIN-BERT can be used as a drop-in replacement for BERT in a variety of NLP and recommendation tasks. It not only outperforms similar models semantic understanding tasks such text classification), but also **social recommendation
+TwHIN-BERT can be used as a drop-in replacement for BERT in a variety of NLP and recommendation tasks. It not only outperforms similar models on semantic understanding tasks (such as text classification), but also on **social recommendation** tasks such as predicting user-to-Tweet engagement.

 ## 1. Pretrained Models

@@ -142,7 +142,7 @@ TBD -->


 ## Citation
-If you use TwHIN-BERT or out datasets in your work, please cite
+If you use TwHIN-BERT or our datasets in your work, please cite the following:
 ```bib
 @article{zhang2022twhin,
   title={TwHIN-BERT: A Socially-Enriched Pre-trained Language Model for Multilingual Tweet Representations},
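For context, the updated paragraph positions TwHIN-BERT as a drop-in replacement for BERT. Below is a minimal sketch of that usage with the Hugging Face `transformers` API; the Hub identifier `Twitter/twhin-bert-base` and the mean-pooled user embedding are illustrative assumptions, not details confirmed by this diff.

```python
# Minimal sketch: TwHIN-BERT as a drop-in BERT replacement via Hugging Face
# transformers. The checkpoint name "Twitter/twhin-bert-base" is an
# assumption; substitute the identifier from the repo's pretrained-model table.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Twitter/twhin-bert-base")
model = AutoModel.from_pretrained("Twitter/twhin-bert-base")

tweets = ["What a beautiful day!", "Qué día tan bonito!"]
batch = tokenizer(tweets, padding=True, return_tensors="pt")

with torch.no_grad():
    # Mean-pool the final hidden states into one embedding per Tweet,
    # masking out padding positions.
    hidden = model(**batch).last_hidden_state         # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)      # (batch, seq_len, 1)
    tweet_emb = (hidden * mask).sum(1) / mask.sum(1)  # (batch, dim)

# Illustrative social-recommendation use: score user-to-Tweet engagement as
# cosine similarity against a user embedding. Building the user embedding as
# the mean of previously engaged Tweets is our simplification, not the
# paper's method.
user_emb = tweet_emb.mean(dim=0, keepdim=True)
scores = torch.nn.functional.cosine_similarity(user_emb, tweet_emb)
print(scores)
```

Since the checkpoint follows the standard BERT interface, it should also load with task heads such as `AutoModelForSequenceClassification` for fine-tuning on text classification.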