---
library_name: peft
base_model: mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
---
|
Description: Coding tasks in multiple languages\
Original dataset: https://huggingface.co/datasets/ise-uiuc/Magicoder-OSS-Instruct-75K \
---\
Try querying this adapter for free in Lora Land at https://predibase.com/lora-land! \
The adapter_category is STEM and the name is Code Generation (magicoder)\
---\
|
Sample input: Below is a programming problem, paired with a language in which the solution should be written. Write a solution in the provided language that appropriately solves the programming problem.\n\n### Problem:\ndef strlen(string: str) -> int:\n    """ Return length of given string\n    >>> strlen('')\n    0\n    >>> strlen('abc')\n    3\n    """\n\n### Language: python\n\n### Solution: \
|
---\ |
|
Sample output:
```python
def strlen(string: str) -> int:
    return len(string)
```
|
---\ |
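The sample input above follows a fixed template: an instruction sentence, a `### Problem:` section, a `### Language:` section, and a trailing `### Solution: ` marker. As a minimal sketch, a hypothetical helper such as the one below could assemble prompts in this format for other problems; the function name and exact whitespace are illustrative assumptions based on the sample, not part of the adapter itself.

```python
def build_prompt(problem: str, language: str) -> str:
    """Assemble a prompt in the template shown in the sample input above.

    `problem` is the programming problem statement (e.g. a function stub)
    and `language` is the target language (e.g. "python").
    """
    instruction = (
        "Below is a programming problem, paired with a language in which "
        "the solution should be written. Write a solution in the provided "
        "language that appropriately solves the programming problem."
    )
    return (
        f"{instruction}\n\n"
        f"### Problem:\n{problem}\n\n"
        f"### Language: {language}\n\n"
        "### Solution: "
    )
```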
|
Try using this adapter yourself! |
|
|
|
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
peft_model_id = "predibase/magicoder"

# Load the base model and its tokenizer, then attach the LoRA adapter
# (loading the adapter requires the `peft` library to be installed).
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model.load_adapter(peft_model_id)
```
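
As a minimal sketch of running the sample prompt end to end, reusing the hypothetical `build_prompt` helper sketched above; the generation settings (e.g. `max_new_tokens`) are illustrative assumptions, not values from this card.

```python
# Build the sample prompt (the strlen problem shown above) and generate a solution.
prompt = build_prompt(
    problem=(
        "def strlen(string: str) -> int:\n"
        '    """ Return length of given string\n'
        "    >>> strlen('')\n"
        "    0\n"
        "    >>> strlen('abc')\n"
        "    3\n"
        '    """'
    ),
    language="python",
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens (the model's proposed solution).
completion = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(completion)
```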