seanbenhur committed
Commit 0d35883
1 Parent(s): ba7d82e

Update README.md

Files changed (1)
  1. README.md +7 -3
README.md CHANGED

@@ -12,8 +12,8 @@ datasets:
 metrics:
 - F1-Score
 widget:
-- text: "Shk ka d nupi cgi huithu ga mnle"
-- text: "Nupi do hatok khro mahik mapini"
+- text: "but who in the holy hell says to relate with it,or inspired by it😂😂,i'm a 23 yr old student,and i say it's wrong,watch for entertainment purpose,and those who get inspired by such movies,its their mental problem.and all the praise that shahid's getting is for dark charachter that he portrays.and those sittis she's talking abt,don't we hear those when a villian arrives on [screen.my](http://screen.my/) point is bash sexism,whether it's by a man or a group of woman.and as far as i remember,those girls were not shown as dark characters,as kabir singh is🙂"
+- text: "सही है, बोलने के अधिकार पर गाली दो, parotest के अधिकार पर पुलिश का सर फोड़ो ,मादरचोदो अधिकारो का कब सही इस्तेमाल करोगें🐷🐷🐷😠😠😠🖕"
 ---
 # Automatic Identification of Gender Bias in Hindi,Bengali,Meitei Codemixed Texts
 
@@ -44,4 +44,8 @@ if __name__ == "__main__":
     target = predict_pipe(text)
     print(target)
 
-```
+```
+
+
+### Some concerns
+- Note that the model is trained on relatively few samples (about 12k) covering Hindi, Bengali, Meitei, and English, in both native-script and code-mixed text, so it may perform poorly on many text samples and may not generalize well. We are continuously improving the model.
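For reference, the `target = predict_pipe(text)` / `print(target)` fragment touched by the second hunk presumably sits at the end of a usage example in the README. Below is a minimal sketch of what that script might look like, assuming a Hugging Face `text-classification` pipeline; the `MODEL_ID` placeholder and the pipeline construction are assumptions, since only the last few lines of the snippet appear in this diff.

```python
# Minimal sketch of the kind of script the diffed README snippet belongs to.
# Only `target = predict_pipe(text)` and `print(target)` appear in the diff;
# the pipeline construction and MODEL_ID below are assumptions.
from transformers import pipeline

# Hypothetical placeholder -- replace with the actual Hub repository name.
MODEL_ID = "seanbenhur/<model-repo>"

# A text-classification pipeline returns [{"label": ..., "score": ...}] per input.
predict_pipe = pipeline("text-classification", model=MODEL_ID, tokenizer=MODEL_ID)

if __name__ == "__main__":
    # One of the (removed) widget examples, reused here as a sample input.
    text = "Nupi do hatok khro mahik mapini"
    target = predict_pipe(text)
    print(target)
```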