FacebookAI/xlm-roberta-large-finetuned-conll03-german
Tags: Token Classification · Transformers · PyTorch · Rust · ONNX · 94 languages · xlm-roberta · Inference Endpoints
Papers: arxiv:1911.02116 · arxiv:1910.09700
Branch: main · 6 contributors · History: 12 commits
Latest commit: 1fbcc7a (verified), "Adds the tokenizer configuration file (#4)" by lysandre (HF staff), 9 months ago
| File | Size | Last commit | Last updated |
|---|---|---|---|
| onnx/ | | Adding ONNX file of this model (#3) | 11 months ago |
| .gitattributes | 523 Bytes | Adding ONNX file of this model (#3) | 11 months ago |
| README.md | 6.47 kB | Add model card (#1) | over 2 years ago |
| config.json | 886 Bytes | Update config.json | over 4 years ago |
| pytorch_model.bin (LFS) | 2.24 GB | Update pytorch_model.bin | almost 5 years ago |
| rust_model.ot (LFS) | 2.24 GB | Update rust_model.ot | over 4 years ago |
| sentencepiece.bpe.model | 5.07 MB | Update sentencepiece.bpe.model | almost 5 years ago |
| tokenizer.json | 9.1 MB | Update tokenizer.json | about 4 years ago |
| tokenizer_config.json | 25 Bytes | Adds the tokenizer configuration file (#4) | 9 months ago |

Note: pytorch_model.bin is a pickle file; detected pickle imports (3): `torch.FloatStorage`, `collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`.