O->ConBART document simplification system
This is a pretrained version of the document simplification model presented in the Findings of ACL 2023 paper "Context-Aware Document Simplification".
It is based on a modified BART architecture that operates on individual sentences and is intended to be guided by a document-level simplification planner.
Target reading levels (1-4) are specified by prepending a control token to each input sequence ("<RL_1>", "<RL_2>", "<RL_3>", "<RL_4>"). When using the terminal interface, this is handled automatically.
How to use
It is recommended to use the plan_simp library to interface with the model.
Here is how to use this model in PyTorch:
# load the simplification model and its tokenizer
from plan_simp.models.bart import load_simplifier
simplifier, tokenizer, hparams = load_simplifier("liamcripwell/o-conbart")

# dynamic plan-guided generation; `params` must be defined with the remaining
# keyword arguments (cf. the CLI flags shown below)
from plan_simp.scripts.generate import Launcher
launcher = Launcher()
launcher.dynamic(model_ckpt="liamcripwell/o-conbart", clf_model_ckpt="liamcripwell/pgdyn-plan", **params)
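For illustration, the snippet below shows the reading-level control-token format described above being applied to a single input sentence before encoding. The sentence and target level are made up, and in practice the plan-guided interfaces in this section construct the model inputs (including document context) for you.
# illustration only: prepend a reading-level control token to an input sentence
target_level = 2
src = "The committee deliberated for several hours before reaching a verdict."
model_input = f"<RL_{target_level}> {src}"  # -> "<RL_2> The committee deliberated ..."
inputs = tokenizer(model_input, return_tensors="pt")  # `tokenizer` as loaded above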
Generation and evaluation can also be run from the terminal:
python plan_simp/scripts/generate.py dynamic \
  --clf_model_ckpt=liamcripwell/pgdyn-plan \
  --model_ckpt=liamcripwell/o-conbart \
  --test_file=<test_data> \
  --doc_id_col=pair_id \
  --context_dir=<context_dir> \
  --reading_lvl=s_level \
  --context_doc_id=c_id \
  --out_file=<output_csv>
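The <test_data> file is assumed here to be a CSV whose column names match the flags above; its exact schema is not documented in this card, so the following is only a hypothetical sketch of how such a file might be put together.
# hypothetical test file; column names follow the CLI flags above
import pandas as pd

test_df = pd.DataFrame({
    "pair_id": ["doc1", "doc1"],  # document identifier (--doc_id_col)
    "complex_str": [
        "The committee deliberated for several hours.",
        "A verdict was reached late in the evening.",
    ],  # complex input sentences
    "s_level": [3, 3],  # target reading level, 1-4 (--reading_lvl)
    "c_id": ["doc1", "doc1"],  # links each row to its context document (--context_doc_id)
})
test_df.to_csv("test_data.csv", index=False)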
python plan_simp/scripts/eval_simp.py \
  --input_data=newselaauto_docs_test.csv \
  --output_data=test_out_oconbart.csv \
  --x_col=complex_str \
  --r_col=simple_str \
  --y_col=pred \
  --doc_id_col=pair_id \
  --prepro=True \
  --sent_level=True