This is a GGML-quantized version of Replit-v2-CodeInstruct-3B, quantized to 4-bit (q4_1). To run inference, you can use ggml directly or the ctransformers library.
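A minimal sketch of loading this model with ctransformers is shown below. The repo id and `model_type="replit"` follow the ctransformers conventions for Replit-family GGML checkpoints; the prompt and generation parameters are illustrative, not prescribed by this card.

```python
# Sketch: run this GGML checkpoint via ctransformers.
# Assumes `pip install ctransformers`; weights are fetched from the Hub on first use.

MODEL_REPO = "abacaj/Replit-v2-CodeInstruct-3B-ggml"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Import lazily so the sketch can be read/imported without ctransformers installed.
    from ctransformers import AutoModelForCausalLM

    # model_type tells ctransformers which GGML architecture to use.
    llm = AutoModelForCausalLM.from_pretrained(MODEL_REPO, model_type="replit")
    return llm(prompt, max_new_tokens=max_new_tokens)


if __name__ == "__main__":
    print(generate("def fibonacci(n):"))
```

First invocation downloads the q4_1 weights; subsequent calls reuse the local cache.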
