This model is a character-split version of ernie1.0: every non-whitespace character is treated as its own token, and all tokens longer than one character have been removed from both the tokenizer and the model.
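The pre-tokenization rule above can be sketched in plain Python. This is a minimal illustration of per-character splitting (whitespace dropped, every other character kept as a single-character token), not the model's actual tokenizer code; the function name `char_split` is hypothetical.

```python
def char_split(text: str) -> list[str]:
    # Split the input into single-character tokens,
    # skipping whitespace characters entirely.
    return [ch for ch in text if not ch.isspace()]

# Both ASCII letters and CJK characters become one token each.
print(char_split("ERNIE 模型"))
```

With a vocabulary restricted this way, no multi-character subwords (e.g. WordPiece pieces like `##ing`) can ever be produced, which is why tokens of length greater than 1 were pruned from the tokenizer and the corresponding embedding rows from the model.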