
More details will be added later, but here is how to generate text with the model. You will need to install PEFT (the script below also imports transformers, and device_map="auto" requires accelerate):

pip install peft transformers accelerate

Then run the following in Python:

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftConfig, PeftModelForCausalLM

peft_model_id = 'GrantC/alpaca-opt-1.3b-lora'
BASE_MODEL = 'facebook/opt-1.3b'

# Load the LoRA adapter config, the base OPT model, and its tokenizer
config = PeftConfig.from_pretrained(peft_model_id)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)

# Attach the LoRA adapter weights to the base model
model = PeftModelForCausalLM.from_pretrained(model, peft_model_id, device_map="auto")

prompt = "Write a blog post about shaving cream:"
print(prompt)

# Tokenize the prompt and move it to the same device as the model
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)

# Sample up to 256 new tokens
output = model.generate(input_ids=inputs["input_ids"], do_sample=True, penalty_alpha=0.6, top_k=4, max_new_tokens=256)

# Decode the generated tokens back into text
outputs = tokenizer.decode(output[0], skip_special_tokens=True, clean_up_tokenization_spaces=False)
print(outputs)
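
The adapter name suggests it was fine-tuned on Alpaca-style instruction data, so wrapping the request in the common Alpaca instruction template may give better results. This is only a sketch under that assumption (the exact training template is not stated in this card), and it continues from the script above, reusing model and tokenizer:

def build_alpaca_prompt(instruction):
    # Common Alpaca template for instructions without extra input;
    # assumed here, not confirmed by this card
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )

prompt = build_alpaca_prompt("Write a blog post about shaving cream.")
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
output = model.generate(input_ids=inputs["input_ids"], do_sample=True, penalty_alpha=0.6, top_k=4, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True, clean_up_tokenization_spaces=False))

If generations look off with the template, the plain prompt used in the first script works as well.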