---
license: apache-2.0
language:
- en
tags:
- text-generation
- text2text-generation
- summarization
- conversational
pipeline_tag: text2text-generation
widget:
- text: "Summarize: You may want to stick it to your boss and leave your job, but don't do it if these are your reasons."
  example_title: "Summarization"
- text: "Given the dialog: do you like dance? [SEP] Yes I do. Did you know Bruce Lee was a cha cha dancer?"
  example_title: "Dialog"
- text: "Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man"
  example_title: "Data-to-text"
- text: "Given the story title: I think all public schools should have a uniform dress code."
  example_title: "Story Generation"
- text: "Answer the following question: From which country did Angola achieve independence in 1975?"
  example_title: "Question Answering"
- text: "Generate the question based on the answer: boxing [X_SEP] A bolo punch is a punch used in martial arts . A hook is a punch in boxing ."
  example_title: "Question Generation"
---

# MVP-multi-task
The MVP-multi-task model was proposed in [**MVP: Multi-task Supervised Pre-training for Natural Language Generation**](https://github.com/RUCAIBox/MVP/blob/main/paper.pdf) by Tianyi Tang, Junyi Li, Wayne Xin Zhao, and Ji-Rong Wen.

Detailed information and instructions can be found at [https://github.com/RUCAIBox/MVP](https://github.com/RUCAIBox/MVP).

## Model Description
MVP-multi-task is a prompt-based model in which MVP is further equipped with prompts pre-trained on a mixture of labeled datasets. It is a variant (MVP+M) of our main [MVP](https://huggingface.co/RUCAIBox/mvp) model. It follows a Transformer encoder-decoder architecture with layer-wise prompts.

MVP is specially designed for natural language generation and can be adapted to a wide range of generation tasks, including but not limited to summarization, data-to-text generation, open-ended dialogue system, story generation, question answering, question generation, task-oriented dialogue system, commonsense generation, paraphrase generation, text style transfer, and text simplification. Our model can also be adapted to natural language understanding tasks such as sequence classification and (extractive) question answering.

## Example
For summarization:
```python
>>> from transformers import MvpTokenizer, MvpForConditionalGeneration

>>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp")

>>> inputs = tokenizer(
...     "Summarize: You may want to stick it to your boss and leave your job, but don't do it if these are your reasons.",
...     return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
["Why You Shouldn't Quit Your Job"]
```

For data-to-text generation:
```python
>>> from transformers import MvpTokenizerFast, MvpForConditionalGeneration

>>> tokenizer = MvpTokenizerFast.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp")

>>> inputs = tokenizer(
...     "Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man",
...     return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Iron Man is a fictional superhero appearing in American comic books published by Marvel Comics.']
```
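
The examples above (and the widget examples in the metadata) all follow the same convention: a short natural-language instruction is prepended to the raw input, and multiple input pieces are joined with `[SEP]`. As a minimal sketch, such inputs can be assembled programmatically; the helper functions below are hypothetical and not part of the MVP codebase:

```python
# Hypothetical helpers for assembling MVP-style task inputs; the instruction
# prefixes are taken from the widget examples in this card.
def build_input(task: str, text: str) -> str:
    """Prepend the natural-language task instruction to the raw input."""
    prefixes = {
        "summarization": "Summarize: ",
        "dialog": "Given the dialog: ",
        "data-to-text": "Describe the following data: ",
        "story": "Given the story title: ",
        "question-answering": "Answer the following question: ",
        "question-generation": "Generate the question based on the answer: ",
    }
    return prefixes[task] + text

def join_triples(triples) -> str:
    """Flatten (head, relation, tail) triples as 'head | relation | tail',
    joined with the [SEP] separator used in the data-to-text examples."""
    return " [SEP] ".join(" | ".join(t) for t in triples)

print(build_input("data-to-text", join_triples([
    ("Iron Man", "instance of", "Superhero"),
    ("Stan Lee", "creator", "Iron Man"),
])))
# Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man
```

The resulting string is then passed to the tokenizer exactly as in the examples above.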

## Citation