
StableLM-Tuned-Alpha 7b: sharded checkpoint

Open In Colab

This is a sharded checkpoint (~4 GB shards) of the model. Refer to the original model card for all details.

  • sharding enables low-RAM loading, e.g. in Colab :)

Basic Usage

Install transformers, accelerate, and bitsandbytes:

pip install -U -q transformers bitsandbytes accelerate

Load the model in 8-bit precision, then run inference:

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "ethzanalytics/stablelm-tuned-alpha-7b-sharded"
tokenizer = AutoTokenizer.from_pretrained(model_name)

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,  # 8-bit quantization via bitsandbytes
    device_map="auto",  # automatic device placement via accelerate
)
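Once loaded, the tuned model expects its chat-style prompt format. A minimal inference sketch is below, assuming the `<|SYSTEM|>`/`<|USER|>`/`<|ASSISTANT|>` special tokens and system prompt described in the original StableLM-Tuned-Alpha model card; the sampling parameters (`temperature`, `top_p`) are illustrative defaults, not prescribed values.

```python
# Sketch of chat-format inference, assuming the prompt conventions from the
# original StableLM-Tuned-Alpha card (verify against that card before use).

SYSTEM_PROMPT = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM will refuse to participate in anything that could harm a human.
"""


def build_prompt(user_message: str) -> str:
    """Wrap a user message in the chat template the tuned model expects."""
    return f"{SYSTEM_PROMPT}<|USER|>{user_message}<|ASSISTANT|>"


def generate_reply(model, tokenizer, user_message: str, max_new_tokens: int = 128) -> str:
    """Generate a reply from the 8-bit model loaded above (illustrative settings)."""
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

For example, `generate_reply(model, tokenizer, "Write a haiku about sharded checkpoints.")` returns the model's continuation after the `<|ASSISTANT|>` token.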