Please use the BERT-related classes (such as BertTokenizer and BertModel) to load this model!
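For example, with the transformers library (a minimal sketch; the checkpoint ID below is an assumed example, substitute the ID of the MiniRBT checkpoint you actually use):

```python
from transformers import BertTokenizer, BertModel

# MiniRBT uses the standard BERT architecture, so the Bert* classes apply.
# "hfl/minirbt-h256" is an assumed example checkpoint ID.
tokenizer = BertTokenizer.from_pretrained("hfl/minirbt-h256")
model = BertModel.from_pretrained("hfl/minirbt-h256")

inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```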

MiniRBT: A Small Chinese Pre-trained Model

To further advance research and development in Chinese information processing, we release MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation toolkit TextBrewer, combining Whole Word Masking with knowledge distillation.
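As a rough illustration of how a TextBrewer distillation loop is wired up (a minimal sketch under assumed settings: the teacher checkpoint, the student dimensions, and the toy data are placeholders, not the actual MiniRBT training recipe, which is documented in the repository linked below):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertConfig, BertForMaskedLM
from textbrewer import GeneralDistiller, TrainingConfig, DistillationConfig

# Teacher: a full-size Chinese pre-trained model (assumed checkpoint).
teacher = BertForMaskedLM.from_pretrained("hfl/chinese-roberta-wwm-ext")

# Student: a narrow, shallow BERT in the spirit of MiniRBT (assumed sizes).
student = BertForMaskedLM(BertConfig(
    vocab_size=teacher.config.vocab_size, hidden_size=256,
    num_hidden_layers=6, num_attention_heads=8, intermediate_size=1024))

# Toy random token IDs so the sketch runs end to end.
input_ids = torch.randint(0, teacher.config.vocab_size, (8, 32))
dataloader = DataLoader(TensorDataset(input_ids), batch_size=4)

def adaptor(batch, model_outputs):
    # Expose the MLM logits to TextBrewer's distillation loss.
    return {"logits": model_outputs.logits}

distiller = GeneralDistiller(
    train_config=TrainingConfig(device="cpu"),
    distill_config=DistillationConfig(temperature=4),
    model_T=teacher, model_S=student,
    adaptor_T=adaptor, adaptor_S=adaptor)

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)
with distiller:
    # batch_postprocessor turns each (input_ids,) tuple into model kwargs.
    distiller.train(optimizer, dataloader, num_epochs=1,
                    batch_postprocessor=lambda batch: {"input_ids": batch[0]})
```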

This repository is based on: https://github.com/iflytek/MiniRBT

You may also be interested in:

More resources by HFL: https://github.com/iflytek/HFL-Anthology
