
Quantization made by Richard Erkhov.

Github

Discord

Request more models

typhoon-7b-instruct-02-19-2024 - GGUF

Original model description:

license: apache-2.0
pipeline_tag: text-generation

Typhoon-0219: Thai Large Language Model (Instruct)

Typhoon-0219 is an instruct Thai 🇹🇭 large language model with 7 billion parameters, based on Typhoon 7B. It is the second-generation instruct model serving opentyphoon.ai. It was trained on a diverse instruction-tuning dataset of more than 1 million rows, similar to OpenHermes, and supports system prompts.

Model Description

  • Model type: A 7B instruct decoder-only model based on the Mistral architecture.
  • Requirement: transformers 4.38.0 or newer.
  • Primary Language(s): Thai 🇹🇭 and English 🇬🇧
  • License: Apache-2.0

Intended Uses & Limitations

This model is an instruct model; however, it is still under development. It incorporates some level of guardrails, but it may still produce answers that are inaccurate, biased, or otherwise objectionable in response to user prompts. We recommend that developers assess these risks in the context of their use case.

Production Deployment

We suggest using the OpenAI-compatible API server from the vLLM project.

python -m vllm.entrypoints.openai.api_server --port 8080 --model scb10x/typhoon-7b-instruct-02-19-2024 --max-num-batched-tokens 8192 --max-model-len 8192 --served-model-name typhoon-instruct
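Once the server is up, it can be queried through the standard OpenAI-compatible chat-completions endpoint. A minimal sketch using only the Python standard library (the payload shape and the /v1/chat/completions path follow the OpenAI API that vLLM mirrors; the request itself is left commented out since it requires a running server):

```python
import json
import urllib.request

def build_chat_request(messages, model="typhoon-instruct", max_tokens=256):
    """Build the JSON body for an OpenAI-style /v1/chat/completions request."""
    return {"model": model, "messages": messages, "max_tokens": max_tokens}

# "typhoon-instruct" matches the --served-model-name flag in the command above.
payload = build_chat_request(
    [{"role": "user", "content": "สวัสดีครับ"}]  # "Hello" in Thai
)

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# resp = json.load(urllib.request.urlopen(req))  # uncomment with a running server
# print(resp["choices"][0]["message"]["content"])
```

The official `openai` Python client works the same way if pointed at `base_url="http://localhost:8080/v1"`.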

Chat Template

We use the ChatML chat template.

{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content']}}{% if (loop.last and add_generation_prompt) or not loop.last %}{{ '<|im_end|>' + '\n'}}{% endif %}{% endfor %}
{% if add_generation_prompt and messages[-1]['role'] != 'assistant' %}{{ '<|im_start|>assistant\n' }}{% endif %}
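To make the template's output concrete, here is a plain-Python equivalent of the Jinja template above (a sketch; in practice the tokenizer applies the Jinja template directly via `tokenizer.apply_chat_template`, and the helper name `apply_chatml` here is our own):

```python
def apply_chatml(messages, add_generation_prompt=True):
    """Format a list of {"role", "content"} messages into a ChatML prompt."""
    parts = []
    for i, message in enumerate(messages):
        parts.append("<|im_start|>" + message["role"] + "\n" + message["content"])
        last = i == len(messages) - 1
        # The template closes every turn, except that the final turn is left
        # open when no generation prompt is requested.
        if (last and add_generation_prompt) or not last:
            parts.append("<|im_end|>\n")
    if add_generation_prompt and messages[-1]["role"] != "assistant":
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = apply_chatml(
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello"},
    ]
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` cues the model to generate the assistant's reply.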

Follow us

https://twitter.com/opentyphoon

Support

https://discord.gg/CqyBscMFpg

SCB10X AI Team

  • Kunat Pipatanakul, Potsawee Manakul, Sittipong Sripaisarnmongkol, Pathomporn Chokchainant, Kasima Tharnpipitchai
  • If you find Typhoon useful for your work, please cite it using:
@article{pipatanakul2023typhoon,
    title={Typhoon: Thai Large Language Models}, 
    author={Kunat Pipatanakul and Phatrasek Jirabovonvisut and Potsawee Manakul and Sittipong Sripaisarnmongkol and Ruangsak Patomwong and Pathomporn Chokchainant and Kasima Tharnpipitchai},
    year={2023},
    journal={arXiv preprint arXiv:2312.13951},
    url={https://arxiv.org/abs/2312.13951}
}

