---
base_model: WAI-ANI-NSFW-PONYXL
license: apache-2.0
model_creator: WAI0731
model_name: waiANINSFWPONYXL_v90
quantized_by: Second State Inc.
---

# waiANINSFWPONYXL_v90-GGUF

## Original Model

[WAI-ANI-NSFW-PONYXL](https://civitai.com/models/404154/wai-ani-nsfw-ponyxl)

## Run with LlamaEdge

- LlamaEdge version: coming soon

- Run as LlamaEdge service

  ```bash
  wasmedge --dir .:. \
    sd-api-server.wasm \
    --model-name wai \
    --model waiANINSFWPONYXL_v90-Q4_0.gguf \
    --task text2image
  ```

## Quantized GGUF Models

| Name | Quant method | Bits | Size | Use case |
| ---- | ---- | ---- | ---- | ----- |
| [waiANINSFWPONYXL_v90-Q2_K.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q2_K.gguf) | Q2_K | 2 | 1.99 GB | |
| [waiANINSFWPONYXL_v90-Q3_K.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q3_K.gguf) | Q3_K | 3 | 2.29 GB | |
| [waiANINSFWPONYXL_v90-Q4_0.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q4_0.gguf) | Q4_0 | 4 | 2.60 GB | |
| [waiANINSFWPONYXL_v90-Q4_1.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q4_1.gguf) | Q4_1 | 4 | 2.79 GB | |
| [waiANINSFWPONYXL_v90-Q4_K.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q4_K.gguf) | Q4_K | 4 | 2.68 GB | |
| [waiANINSFWPONYXL_v90-Q5_0.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q5_0.gguf) | Q5_0 | 5 | 2.98 GB | |
| [waiANINSFWPONYXL_v90-Q5_1.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q5_1.gguf) | Q5_1 | 5 | 3.16 GB | |
| [waiANINSFWPONYXL_v90-Q8_0.gguf](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-Q8_0.gguf) | Q8_0 | 8 | 4.11 GB | |
| [waiANINSFWPONYXL_v90-f16.safetensors](https://huggingface.co/second-state/waiANINSFWPONYXL_v90-GGUF/blob/main/waiANINSFWPONYXL_v90-f16.safetensors) | f16 | 16 | 6.94 GB | |

*Quantized with stable-diffusion.cpp master-14206fd*
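
Once the service above is running, an image can be requested over its HTTP API. The sketch below is a hypothetical example: the host, port (`8080`), and the OpenAI-style `/v1/images/generations` endpoint path are assumptions, so verify them against the LlamaEdge sd-api-server documentation for the version you run.

```bash
# Hypothetical request -- host, port, and endpoint path are assumptions;
# check the LlamaEdge sd-api-server docs for the exact API surface.
curl -X POST http://localhost:8080/v1/images/generations \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "wai",
        "prompt": "a watercolor painting of a mountain lake at sunrise"
      }'
```

The `model` field should match the `--model-name` passed when starting the service (`wai` in the command above).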