---
license: mit
---
Port of the original `lilt-only-base` model weights from the Language-Independent Layout Transformer (LiLT).

The weights in this repository are not useful on their own; they should instead be used in combination with RoBERTa-like models, as outlined HERE.
This repository aims to make it easier to combine LiLT with a RoBERTa-like model of your choice. Please refer to the following script on how to fuse XLM-RoBERTa with LiLT for multi-modal training/fine-tuning: HERE
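As a rough illustration of what such a fusion involves, the sketch below merges the text-stream weights of a RoBERTa-like checkpoint into a LiLT layout-stream state dict under a shared key prefix. This is a minimal, hypothetical sketch: the key names, the `lilt.` prefix, and the dummy tensors are assumptions for illustration only, and do not reflect the exact layout used by the linked script. Real checkpoints would come from `torch.load` or `from_pretrained`.

```python
import torch

# Dummy stand-ins for the two checkpoints (real ones are loaded from disk).
# Hypothetical key names chosen for illustration.
lilt_state = {"lilt.layout_embeddings.weight": torch.zeros(4, 2)}
roberta_state = {"embeddings.word_embeddings.weight": torch.ones(4, 2)}

# Start from the layout-only LiLT weights.
fused = dict(lilt_state)

# Re-prefix the text-model keys so both streams live in one state dict.
for key, tensor in roberta_state.items():
    fused[f"lilt.{key}"] = tensor

print(sorted(fused))
```

The fused state dict can then be loaded into a combined model, so the layout stream comes from LiLT and the text stream from the RoBERTa-like model.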