aLLMA models

Part of a collection of aLLMA small, base, and large models (3 items).
This model is a version of the google-bert/bert-base-uncased architecture trained on the allmalab/DOLLMA dataset.
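The card does not state the published checkpoint ID, so the trained weights cannot be loaded here; as a minimal sketch, the referenced architecture can be instantiated locally with the `transformers` library, whose default `BertConfig` matches bert-base-uncased (12 layers, hidden size 768, 12 attention heads):

```python
from transformers import BertConfig, BertForMaskedLM

# The default BertConfig matches the bert-base-uncased architecture:
# 12 layers, hidden size 768, 12 attention heads, 30522-token vocabulary.
config = BertConfig()

# Randomly initialized model of that architecture -- NOT the trained aLLMA
# weights; substitute the actual checkpoint ID in from_pretrained() once known.
model = BertForMaskedLM(config)

n_params = sum(p.numel() for p in model.parameters())
print(f"layers={config.num_hidden_layers}, hidden={config.hidden_size}, params={n_params:,}")
```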
More information needed
The following hyperparameters were used during training: