<html>
<head>
<title>Wizardlm-13b-v1.2.Q4_0.gguf</title>
</head>
<body>
<h1>Wizardlm-13b-v1.2.Q4_0.gguf</h1>
<p>
Using the
<a href="https://github.com/abetlen/llama-cpp-python">llama-cpp-python</a>
package, this Hugging Face Docker Space hosts the GGUF model behind an
OpenAI-compatible API. The Space includes full API documentation to make
integration straightforward.
</p>
<ul>
<li>
API endpoint:
<a href="https://afischer1985-wizardlm-13b-v1-2-q4-0-gguf.hf.space/v1">https://afischer1985-wizardlm-13b-v1-2-q4-0-gguf.hf.space/v1</a>
</li>
<li>
API documentation:
<a href="https://afischer1985-wizardlm-13b-v1-2-q4-0-gguf.hf.space/docs">https://afischer1985-wizardlm-13b-v1-2-q4-0-gguf.hf.space/docs</a>
</li>
</ul>
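<p>
Since llama-cpp-python's server follows the OpenAI chat-completions schema, a
minimal Python sketch of querying the endpoint could look like this; the
<code>/chat/completions</code> route and the request parameters shown are the
standard OpenAI ones, and the example prompt is illustrative:
</p>

```python
import json
import urllib.request

# Base URL of the Space's OpenAI-compatible API (from the list above).
BASE_URL = "https://afischer1985-wizardlm-13b-v1-2-q4-0-gguf.hf.space/v1"

def build_chat_request(prompt, max_tokens=128):
    """Build a request body in the OpenAI chat-completions format."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def chat(prompt):
    """POST the prompt to /chat/completions and return the reply text."""
    data = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Example (requires the Space to be awake):
# print(chat("Who are you?"))
```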
<p>
If you find this resource valuable, please consider starring the Space. Stars
support the application for a community GPU grant, which would ultimately
enhance the capabilities and accessibility of this Space.
</p>
</body>
</html>