Normal1919/mbart-large-50-one-to-many-lil-fine-tune

  • base model: mbart-large-50
  • pretrained_ckpt: facebook/mbart-large-50-one-to-many-mmt
  • This model was fine-tuned for translating Ren'Py (.rpy) game dialogue (e.g. for use with the dl-translate library)

Model description

  • source group: English
  • target group: Chinese
  • model: transformer
  • source language(s): eng
  • target language(s): cjy_Hans cjy_Hant cmn cmn_Hans cmn_Hant gan lzh lzh_Hans nan wuu yue yue_Hans yue_Hant
  • fine_tune: starting from the facebook/mbart-large-50-one-to-many-mmt checkpoint, English source text containing Ren'Py markup (including but not limited to {i} [text] {/i}) was trained against Chinese targets that preserve the same tags; the model was additionally trained to keep English character names untranslated for LIL (see the illustrative pair below)
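
For illustration, the kind of tag-preserving pair this fine-tune targets looks like the following. Both strings are hypothetical examples, not taken from the training data:

>>> # Hypothetical source/target pair: the {i}…{/i} italics tag and the
>>> # [player_name] interpolation placeholder must survive translation unchanged
>>> src = 'You {i}promised{/i} me, [player_name]!'
>>> tgt = '你{i}答应过{/i}我，[player_name]！'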

How to use

>>> from transformers import MBartForConditionalGeneration, MBart50TokenizerFast, pipeline
>>> model_name = 'Normal1919/mbart-large-50-one-to-many-lil-fine-tune'
>>> model = MBartForConditionalGeneration.from_pretrained(model_name)
>>> tokenizer = MBart50TokenizerFast.from_pretrained(model_name, src_lang="en_XX", tgt_lang="zh_CN")
>>> translation = pipeline("translation", model=model, tokenizer=tokenizer, src_lang="en_XX", tgt_lang="zh_CN")
>>> translation('I {i} should {/i} say that I feel a little relieved to find out that {i}this {/i} is why you’ve been hanging out with Kaori lately, though. She’s really pretty and I got jealous and...I’m sorry', max_length=400)
    [{'translation_text': '我{i}应该{/i}说,我有点松了一口气,发现{i}这个{/i}是你最近和Kaori一起出去玩的原因。她真的很漂亮,我嫉妒了,而且......对不起。'}]
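
If you prefer to call the model directly instead of going through the pipeline wrapper, a minimal sketch follows. It relies on the standard mBART-50 one-to-many convention of forcing the target-language token at the start of generation; the input string and max_length value are illustrative:

>>> from transformers import MBartForConditionalGeneration, MBart50TokenizerFast
>>> model = MBartForConditionalGeneration.from_pretrained(model_name)
>>> tokenizer = MBart50TokenizerFast.from_pretrained(model_name, src_lang="en_XX")
>>> inputs = tokenizer('You {i}promised{/i} me, didn’t you?', return_tensors='pt')
>>> # Force the decoder to start with the Chinese language token, as mBART-50
>>> # one-to-many models expect when selecting the target language
>>> generated = model.generate(**inputs, forced_bos_token_id=tokenizer.lang_code_to_id["zh_CN"], max_length=400)
>>> tokenizer.batch_decode(generated, skip_special_tokens=True)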

Contact

[email protected] or [email protected]
