When the sentences are longer and more practical, these ambiguities abound and affect all 3 analytics.

<details open>

<summary>Punctuation Report</summary>

```text
label                     precision    recall       f1           support
<NULL> (label_id: 0)      98.83        98.49        98.66        446496
<ACRONYM> (label_id: 1)   74.15        94.26        83.01        697
. (label_id: 2)           90.64        92.99        91.80        30002
, (label_id: 3)           77.19        79.13        78.15        23321
? (label_id: 4)           76.58        74.56        75.56        1022
-------------------
micro avg                 97.21        97.21        97.21        501538
macro avg                 83.48        87.89        85.44        501538
weighted avg              97.25        97.21        97.23        501538
```

</details>
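
The aggregate rows in these reports follow the usual conventions: the macro average is the unweighted mean of the per-label scores, while the weighted average weights each label's score by its support. As an illustrative sketch (not part of the model's code), the snippet below reproduces the macro and weighted rows from the precision column of the punctuation report above:

```python
# Per-label (precision, support) pairs, copied from the punctuation report.
labels = {
    "<NULL>":    (98.83, 446496),
    "<ACRONYM>": (74.15, 697),
    ".":         (90.64, 30002),
    ",":         (77.19, 23321),
    "?":         (76.58, 1022),
}

precisions = [p for p, _ in labels.values()]
supports = [s for _, s in labels.values()]

# Macro average: unweighted mean over labels, so rare labels like
# <ACRONYM> count as much as the dominant <NULL> class.
macro_p = sum(precisions) / len(precisions)

# Weighted average: each label's score is weighted by its support,
# so frequent labels dominate the result.
weighted_p = sum(p * s for p, s in labels.values()) / sum(supports)

print(f"macro precision:    {macro_p:.2f}")     # 83.48, matching the report
print(f"weighted precision: {weighted_p:.2f}")  # 97.25, matching the report
```

The gap between the two averages (83.48 vs. 97.25) reflects the heavy class imbalance: `<NULL>` accounts for about 89% of the support.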

<details open>

<summary>True-casing Report</summary>

```text
label                  precision    recall       f1           support
LOWER (label_id: 0)    99.83        99.81        99.82        2020678
UPPER (label_id: 1)    95.51        95.90        95.71        83873
-------------------
micro avg              99.66        99.66        99.66        2104551
macro avg              97.67        97.86        97.76        2104551
weighted avg           99.66        99.66        99.66        2104551
```

</details>

<details open>

<summary>Sentence Boundary Detection Report</summary>

```text
label                     precision    recall       f1           support
NOSTOP (label_id: 0)      100.00       99.97        99.98        471608
FULLSTOP (label_id: 1)    99.63        99.93        99.78        32923
-------------------
micro avg                 99.97        99.97        99.97        504531
macro avg                 99.81        99.95        99.88        504531
weighted avg              99.97        99.97        99.97        504531
```

</details>

## Test Data and Example Generation

Each test example was generated using the following procedure: