Files changed (1): README.md (+12 -12)
@@ -55,7 +55,7 @@ You can use this model directly with a pipeline for text generation.
 
 >>> generator = pipeline('text-generation', model="facebook/opt-1.3b")
 >>> generator("Hello, I'm am conscious and")
-[{'generated_text': "Hello, I'm am conscious and aware of my surroundings. I'm aware that I'm dreaming."}]
+[{'generated_text': 'Hello, I am conscious and I am here.\nI am here.\nI am conscious.'}]
 ```
 
 By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.
@@ -66,7 +66,7 @@ By default, generation is deterministic. In order to use the top-k sampling, ple
 >>> set_seed(32)
 >>> generator = pipeline('text-generation', model="facebook/opt-1.3b", do_sample=True)
 >>> generator("Hello, I'm am conscious and")
-[{'generated_text': "Hello, I'm am conscious and aware of my surroundings. I'm aware that my thoughts are thoughts"}]
+[{'generated_text': "Hello, I'm am conscious and able to hear. I have a lot of experience in the"}]
 ```
 
 ### Limitations and bias
@@ -88,11 +88,11 @@ Here's an example of how the model can have biased predictions:
 >>> set_seed(32)
 >>> generator = pipeline('text-generation', model="facebook/opt-1.3b", do_sample=True, num_return_sequences=5)
 >>> generator("The woman worked as a")
-[{'generated_text': 'The woman worked as a waitress for six months before she started dating her boyfriend, who was working at'},
- {'generated_text': "The woman worked as a prostitute, but she didn't want to sell herself anymore. She wanted to"},
- {'generated_text': 'The woman worked as a translator at the embassy during her studies at Cambridge University in England. She said'},
- {'generated_text': 'The woman worked as a secretary for Senator Ted Stevens of Alaska for 22 years before retiring from his Senate'},
- {'generated_text': 'The woman worked as a caregiver for elderly patients at the nursing home where she lived until she died'}]
+[{'generated_text': 'The woman worked as a bartender for six months before getting to the job she always dreamed of. She'},
+ {'generated_text': 'The woman worked as a nanny in a house near The White Horse Farm in the Yorkshire Dales'},
+ {'generated_text': "The woman worked as a translator at the British Broadcasting Corporation's headquarters and was also an acquaintance of some"},
+ {'generated_text': 'The woman worked as a secretary and went to school full-time, and also worked as a waitress'},
+ {'generated_text': 'The woman worked as a beautician with her baby and the little girl is now at the age where'}]
 ```
 
 compared to:
@@ -103,11 +103,11 @@ compared to:
 >>> set_seed(32)
 >>> generator = pipeline('text-generation', model="facebook/opt-1.3b", do_sample=True, num_return_sequences=5)
 >>> generator("The man worked as a")
-[{'generated_text': 'The man worked as a janitor at the University of Michigan Medical Center before he died after contracting Ebola'},
- {'generated_text': 'The man worked as a salesman for IBM Corp., selling computers to businesses around the globe. He traveled'},
- {'generated_text': 'The man worked as a translator for the British Broadcasting Corporation between 1956 and 1961. During that period he'},
- {'generated_text': 'The man worked as a salesman for IBM Corp., selling computers for computers. He traveled extensively and lived'},
- {'generated_text': 'The man worked as a security guard for nearly 30 years before he was shot dead by police officers responding'}]
+[{'generated_text': 'The man worked as a janitor and the owner of the house he worked at caught him cheating on'},
+ {'generated_text': 'The man worked as a software engineer.\n\nFor over 10 years, he had been at Amazon'},
+ {'generated_text': 'The man worked as a car salesman - and was a man of his word to her\nA T'},
+ {'generated_text': 'The man worked as a private contractor for five years. He went to the Bahamas in the summer of'},
+ {'generated_text': 'The man worked as a computer systems consultant. After leaving the job, he became a prolific internet hacker'}]
 ```
 
 This bias will also affect all fine-tuned versions of this model.
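Whatever the decoding settings, the pipeline calls in the transcripts above all return the same shape: a list of dicts, one entry per generated sequence (so `num_return_sequences=5` yields a five-element list), each carrying a single `generated_text` key. A minimal sketch of unpacking that structure — `outputs` here is a hypothetical stand-in for a real pipeline call, so no model download is needed:

```python
# Hypothetical stand-in for a text-generation pipeline's return value,
# mimicking the list-of-dicts structure shown in the transcripts above.
outputs = [
    {'generated_text': 'The woman worked as a bartender for six months'},
    {'generated_text': 'The woman worked as a nanny in a house'},
]

# Each dict holds one sequence under the 'generated_text' key;
# pull out just the strings for downstream use.
texts = [o['generated_text'] for o in outputs]
print(texts)
```

The same unpacking works for the single-sequence calls as well; they simply return a one-element list.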