---
library_name: sklearn
tags:
- sklearn
- skops
- text-classification
---
# Model description
This is a logistic regression model trained on `facebook/bart-base` embeddings of the IMDB dataset.
The notebook used to generate this model is available in this repository and at this [Kaggle link](https://www.kaggle.com/code/unofficialmerve/scikit-learn-with-transformers-with-skops/notebook).
## Intended uses & limitations
This model is trained for educational purposes.
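To try the model out, it can be pulled from the Hugging Face Hub with `skops` and used like any fitted scikit-learn pipeline. The snippet below is a minimal sketch: the repository id and the serialized file name (`model.pkl`) are placeholders, since neither is stated in this card; check the repository files for the actual values.

```python
# Minimal sketch: download the repository contents and load the fitted pipeline.
# Placeholders: "<user>/<repo-name>" and "model.pkl" must be replaced with the
# actual repository id and file name used by this repository.
from pathlib import Path

import joblib
from skops import hub_utils

dst = Path("downloaded_model")
hub_utils.download(repo_id="<user>/<repo-name>", dst=dst)

pipeline = joblib.load(dst / "model.pkl")
print(pipeline.predict(["A surprisingly touching film with a great cast."]))
```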
## Training Procedure
### Hyperparameters
The model is trained with the hyperparameters below.
| Hyperparameter | Value |
|-------------------------------|-------------------------------------------------------------------------------------------------------------------|
| memory | |
| steps | [('embedding', HFTransformersLanguage(model_name_or_path='facebook/bart-base')), ('model', LogisticRegression())] |
| verbose | False |
| embedding | HFTransformersLanguage(model_name_or_path='facebook/bart-base') |
| model | LogisticRegression() |
| embedding__model_name_or_path | facebook/bart-base |
| model__C | 1.0 |
| model__class_weight | |
| model__dual | False |
| model__fit_intercept | True |
| model__intercept_scaling | 1 |
| model__l1_ratio | |
| model__max_iter | 100 |
| model__multi_class | auto |
| model__n_jobs | |
| model__penalty | l2 |
| model__random_state | |
| model__solver | lbfgs |
| model__tol | 0.0001 |
| model__verbose | 0 |
| model__warm_start | False |
### Model Plot

The fitted model is `Pipeline(steps=[('embedding', HFTransformersLanguage(model_name_or_path='facebook/bart-base')), ('model', LogisticRegression())])`.
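For reference, a roughly equivalent pipeline can be built as shown below. This is a sketch rather than the original training code: it assumes `HFTransformersLanguage` comes from the `whatlies` library (the import path is not stated in this card), and it leaves all `LogisticRegression` parameters at the defaults listed in the hyperparameter table.

```python
# Sketch of the pipeline described above.
# Assumption: HFTransformersLanguage is provided by the `whatlies` package;
# adjust the import if the original notebook used a different source.
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from whatlies.language import HFTransformersLanguage

pipeline = Pipeline(
    steps=[
        ("embedding", HFTransformersLanguage(model_name_or_path="facebook/bart-base")),
        # All LogisticRegression values in the table are scikit-learn defaults
        # (C=1.0, penalty="l2", solver="lbfgs", max_iter=100, tol=0.0001).
        ("model", LogisticRegression()),
    ]
)

# Training would then look like, e.g.:
# pipeline.fit(train_texts, train_labels)
```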