---
base_model: sentence-transformers/all-MiniLM-L6-v2
datasets:
- momo22/eng2nep
language:
- en
- ne
library_name: sentence-transformers
metrics:
- negative_mse
- src2trg_accuracy
- trg2src_accuracy
- mean_accuracy
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:8000
- loss:MSELoss
widget:
- source_sentence: 'The aggressive semi-employed religion workshop of Razzak, (EFP).
'
sentences:
- 'मा ग्रिटर भेट्टाउन सकेन वा GDM प्रयोगकर्ताले कार्यान्वयन गर्न सकेन
'
- 'रज्जाकको आक्रामक अर्द्धशतक धर्मशाला, (एएफपी)।
'
- 'त्यसैले मेरो विजयपछि म त्यस्तो अवस्था आउन दिनेछैन।
'
- source_sentence: 'The authority is being a constitutional body, it was also empowered
by passing the bill from Parliament.
'
sentences:
- 'अख्तियार संवैधानिक निकाय त हुँदै हो, त्यसमा पनि संसदबाटै विधेयक पास गरेर अख्तियारलाई
अधिकारसम्पन्न पनि गराइएको थियो।
'
- 'म यहूदाका राजा सिदकियाहलाई र उसका मानिसहरूलाई तिनीहरूका शत्रुहरूकहाँ सुम्पिनेछु
जसले तिनीहरूलाई मार्न चाहन्छन्। ती सेनाहरू यरूशलेमबाट गइसकेका भएता पनि म तिनीहरूलाई
बाबेलका राजाको सेनाहरूकहाँ सुम्पिनेछु।
'
- '– संकटकालको असर न्यायिक क्षेत्रमा मात्रै पर्दैन, समग्र मुलुकमै पर्छ।
'
- source_sentence: 'The two-day conference will participate in investors from China,
India, Japan, the US, European countries, Britain and other countries, the Federation
said.
'
sentences:
- 'उनीहरूको जनजीविकाको आधार प्राकृतिक स्रोत रहेको छ।
'
- 'दुई दिनसम्म हुने सम्मेलनमा चीन, भारत, जापान, अमेरिका, युरोपियन देशहरू, बेलायत
लगायत देशबाट लगानीकर्ताको सहभागिता गराउने महासंघले जानकारी दिएको छ
'
- 'नयाँ स्न्यापसट लिनका लागि यो बटन क्लिक गर्नुहोस् ।
'
- source_sentence: 'Mr Sankey issued a "confession" through his solicitor after Shields
had been convicted but then withdrew it.
'
sentences:
- 'श्री सान्कीले ढालहरू दोषी भएपछि आफ्नो समाधानकर्तामार्फत "स्वीकृति" जारी गर्नुभयो
तर त्यसपछि यसलाई फिर्ता लिनुभयो।
'
- 'कृत्रिम रुपमा पेट्रोलियम पदार्थको मूल्य स्थिर राख्न अनुदान दिदै जाने हो भने नेपाली
अर्थतन्त्र एकदिन धराशायी हुनेछ।
'
- 'ओली सरकारले "राष्ट्रियता-राष्ट्रवाद र" आर्थिक सम्ब्रिद्धि "-आर्थिक विकासलाई यसको
प्राथमिकताको रूपमा घोषणा गरेको छ।
'
- source_sentence: 'We want to use this time to appeal to the American government
to see if they can finally close this chapter.
'
sentences:
- 'धेरैले घाउ पाए र ओछ्यानमा थिए।
'
- 'नाम यसको अन्तरराष्ट्रिय हलको अद्वितिय डिजाइनबाट स्पष्ट रूपमा प्राप्त हुन्छ, जुन
शीर्षकनियम स्पेसबाट बनेको छ, जुन ठूलो गहिराइमा उच्च दबाब बुझ्न सक्षम छ।
'
- 'हामी अमेरिकी सरकारलाई अपील गर्न यसपटक प्रयोग गर्न चाहन्छौं कि उनीहरूले अन्त्यमा
यो अध्याय बन्द गर्न सक्छन्।
'
model-index:
- name: SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
results:
- task:
type: knowledge-distillation
name: Knowledge Distillation
dataset:
name: Unknown
type: unknown
metrics:
- type: negative_mse
value: -0.37439612206071615
name: Negative Mse
- task:
type: translation
name: Translation
dataset:
name: Unknown
type: unknown
metrics:
- type: src2trg_accuracy
value: 0.0186
name: Src2Trg Accuracy
- type: trg2src_accuracy
value: 0.00835
name: Trg2Src Accuracy
- type: mean_accuracy
value: 0.013474999999999999
name: Mean Accuracy
---
# SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) on the [momo22/eng2nep](https://huggingface.co/datasets/momo22/eng2nep) dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2)
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- [momo22/eng2nep](https://huggingface.co/datasets/momo22/eng2nep)
- **Languages:** en, ne
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("jangedoo/all-MiniLM-L6-v2-nepali")
# Run inference
sentences = [
'We want to use this time to appeal to the American government to see if they can finally close this chapter.\n',
'हामी अमेरिकी सरकारलाई अपील गर्न यसपटक प्रयोग गर्न चाहन्छौं कि उनीहरूले अन्त्यमा यो अध्याय बन्द गर्न सक्छन्।\n',
'नाम यसको अन्तरराष्ट्रिय हलको अद्वितिय डिजाइनबाट स्पष्ट रूपमा प्राप्त हुन्छ, जुन शीर्षकनियम स्पेसबाट बनेको छ, जुन ठूलो गहिराइमा उच्च दबाब बुझ्न सक्षम छ।\n',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
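Because the model was distilled to place English and Nepali sentences in a shared vector space, it can also be used for cross-lingual matching. The snippet below is a minimal sketch; the query and corpus sentences are purely illustrative and not taken from the training data.
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jangedoo/all-MiniLM-L6-v2-nepali")

# Hypothetical English query and a small Nepali corpus (illustrative sentences only)
query = ["Where is the nearest hospital?"]
corpus = [
    "नजिकको अस्पताल कहाँ छ?",   # "Where is the nearest hospital?"
    "आज मौसम राम्रो छ।",        # "The weather is nice today."
]

query_embedding = model.encode(query)     # shape (1, 384)
corpus_embeddings = model.encode(corpus)  # shape (2, 384)

# Cosine similarity between the English query and every Nepali sentence
scores = model.similarity(query_embedding, corpus_embeddings)
print(scores)  # shape [1, 2]; ideally the translated sentence scores highest
```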
## Evaluation
### Metrics
#### Knowledge Distillation
* Evaluated with [MSEEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.MSEEvaluator)
| Metric | Value |
|:-----------------|:------------|
| **negative_mse** | **-0.3744** |
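The score above compares the student's Nepali embeddings against a teacher's English embeddings. The sketch below shows how such an evaluation could be run; it assumes the base `all-MiniLM-L6-v2` as the teacher (not stated in this card) and uses two parallel pairs taken from the samples shown further down, whereas the reported value was computed on the full 500-pair evaluation set.
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import MSEEvaluator

# Assumed teacher: the base all-MiniLM-L6-v2 model
teacher = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
student = SentenceTransformer("jangedoo/all-MiniLM-L6-v2-nepali")

# Tiny parallel lists reusing sample pairs from this card
eval_english = [
    "A Pandit was found on the way to Pokhara from Baglung.",
    "Chapter 3",
]
eval_nepali = [
    "वाग्लुङ्गबाट पोखरा आउँदा बाटोमा एकजना पण्डित भेटिए।",
    "परिच्छेद–३",
]

mse_evaluator = MSEEvaluator(
    source_sentences=eval_english,
    target_sentences=eval_nepali,
    teacher_model=teacher,
    name="eng2nep-dev",
)
print(mse_evaluator(student))  # negative MSE; values closer to 0 are better
```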
#### Translation
* Evaluated with [TranslationEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TranslationEvaluator)
| Metric | Value |
|:------------------|:-----------|
| src2trg_accuracy | 0.0186 |
| trg2src_accuracy | 0.0083 |
| **mean_accuracy** | **0.0135** |
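The translation accuracies measure how often the nearest neighbour of an English embedding among all Nepali embeddings is its actual translation, and vice versa. A minimal sketch of this evaluation, again using two parallel pairs from this card in place of the full evaluation set:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TranslationEvaluator

model = SentenceTransformer("jangedoo/all-MiniLM-L6-v2-nepali")

# Toy parallel lists; the reported numbers come from a larger evaluation set
eval_english = [
    "A Pandit was found on the way to Pokhara from Baglung.",
    "Chapter 3",
]
eval_nepali = [
    "वाग्लुङ्गबाट पोखरा आउँदा बाटोमा एकजना पण्डित भेटिए।",
    "परिच्छेद–३",
]

translation_evaluator = TranslationEvaluator(
    source_sentences=eval_english,
    target_sentences=eval_nepali,
    name="eng2nep-dev",
)
print(translation_evaluator(model))  # src2trg, trg2src and mean accuracy
```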
## Training Details
### Training Dataset
#### momo22/eng2nep
* Dataset: [momo22/eng2nep](https://huggingface.co/datasets/momo22/eng2nep) at [57da8d4](https://huggingface.co/datasets/momo22/eng2nep/tree/57da8d44266896e334c1d8f2528cbbf666fbd0ca)
* Size: 8,000 training samples
* Columns: `English`, `Nepali`, and `label`
* Approximate statistics based on the first 1000 samples:
| | English | Nepali | label |
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-------------------------------------|
| type | string | string | list |
* Samples:
| English | Nepali | label |
|:--------|:-------|:------|
| But with the origin of feudal practices in the Middle Ages, the practice of untouchability began, as well as discrimination against women. | तर मध्ययुगमा सामन्ती प्रथाको उद्भव भएसँगै जसरी छुवाछुत प्रथाको शुरुवात भयो, त्यसैगरी नारी प्रति पनि विभेद गरिन थालियो | [-0.05432726442813873, 0.029996933415532112, -0.008532932959496975, -0.035200122743844986, 0.008856767788529396, ...] |
| A Pandit was found on the way to Pokhara from Baglung. | वाग्लुङ्गबाट पोखरा आउँदा बाटोमा एकजना पण्डित भेटिए। | [-0.023763148114085197, 0.0959007516503334, -0.11197677254676819, 0.10978179425001144, -0.028137238696217537, ...] |
| He went on: "She ate a perfectly normal and healthy diet. | उनी गए: "उनले पूर्ण सामान्य र स्वस्थ आहार खाइन्। | [0.028130479156970978, 0.030386686325073242, -0.012276170775294304, 0.1316223442554474, -0.01928202621638775, ...] |
* Loss: [MSELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#mseloss)
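The `label` column holds precomputed teacher embeddings, and `MSELoss` trains the student so that both its English and its Nepali embeddings regress onto that target. Below is a rough sketch of this setup; the choice of the base `all-MiniLM-L6-v2` as teacher, the dataset split, and the raw column names are assumptions not stated in this card.
```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MSELoss

# Assumed teacher: the base all-MiniLM-L6-v2 (this card does not name the teacher)
teacher = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
student = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# 8,000 samples as reported above; split and column names are assumptions
train_dataset = load_dataset("momo22/eng2nep", split="train").select(range(8000))

def add_label(batch):
    # The teacher's English embeddings become the regression target for both languages
    batch["label"] = teacher.encode(batch["English"]).tolist()
    return batch

train_dataset = train_dataset.map(add_label, batched=True)

trainer = SentenceTransformerTrainer(
    model=student,
    train_dataset=train_dataset,
    loss=MSELoss(model=student),
)
trainer.train()
```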
### Evaluation Dataset
#### momo22/eng2nep
* Dataset: [momo22/eng2nep](https://huggingface.co/datasets/momo22/eng2nep) at [57da8d4](https://huggingface.co/datasets/momo22/eng2nep/tree/57da8d44266896e334c1d8f2528cbbf666fbd0ca)
* Size: 500 evaluation samples
* Columns: `English`, `Nepali`, and `label`
* Approximate statistics based on the first 1000 samples:
| | English | Nepali | label |
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-------------------------------------|
| type | string | string | list |
* Samples:
| English | Nepali | label |
|:--------|:-------|:------|
| Chapter 3 | परिच्छेद–३ | [-0.049459926784038544, 0.048675183206796646, 0.016583453863859177, 0.04876156523823738, -0.020754676312208176, ...] |
| The capability of MOF would be strengthened to enable it to efficiently play the lead role in donor coordination, and to secure support from all stakeholders in aid coordination activities. | दाताहरूको समन्वयमा नेतृत्वदायीको भूमिका निर्वाह प्रभावकारी ढंगले गर्न अर्थ मन्त्रालयको क्षमता सुदृढ गरिनेछ यसको लागि सबै सरोकारवालाबाट समर्थन प्राप्त गरिनेछ । | [-0.06200315058231354, -0.016507938504219055, -0.029924314469099045, -0.052509162575006485, 0.07746178656816483, ...] |
| Polimatrix, Inc. is a system integrator and total solutions provider delivering radiation and nuclear protection and detection. | पोलिमाट्रिक्स, इन्कर्पोरेटिड प्रणाली इन्टिजर र कुल समाधान प्रदायक रेडियो र आणविक संरक्षण र पत्ता लगाउने प्रणाली इन्टिजर र कुल समाधान प्रदायक हो। | [-0.0446796678006649, 0.026428330689668655, -0.09837698936462402, -0.07765442878007889, -0.020364686846733093, ...] |
* Loss: [MSELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#mseloss)
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `bf16`: True
- `push_to_hub`: True
- `hub_model_id`: jangedoo/all-MiniLM-L6-v2-nepali
- `push_to_hub_model_id`: all-MiniLM-L6-v2-nepali
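As a minimal sketch, these non-default values correspond to a `SentenceTransformerTrainingArguments` configuration along the following lines (the `output_dir` is a placeholder, not taken from this card):
```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v2-nepali",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    bf16=True,
    push_to_hub=True,
    hub_model_id="jangedoo/all-MiniLM-L6-v2-nepali",
)
```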
#### All Hyperparameters