
GECKO: Generative Language Model for English, Code and Korean

Paper | GitHub

GECKO-7B

GECKO is a 7B parameter decoder-only transformer pretrained on Korean, English, and code. It is trained on 200 billion tokens drawn from terabytes of Korean corpus. GECKO is an open-source model released under the Apache 2.0 License. For more details about the model, please read our technical report.

Model Details

GECKO is a generative language model that uses the Llama architecture, so it integrates easily with other frameworks that support Llama (see the sketch below the table).

      | Training Data                           | Params | Content Length | GQA | Tokens | LR
GECKO | A mix of publicly available online data | 7B     | 8k             | X   | 200B   | 3.0 x 10^-4
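
Since GECKO follows the standard Llama architecture, it should load through any tooling that understands Llama checkpoints. A minimal sketch using the generic transformers text-generation pipeline (the prompt and generation settings below are illustrative and not taken from the technical report):

from transformers import pipeline

# Any Llama-compatible loader works; here the generic transformers pipeline
# loads GECKO-7B like any other Llama checkpoint.
generator = pipeline("text-generation", model="kifai/GECKO-7B", torch_dtype="auto", device_map="auto")

# The prompt "ํ•œ๊ตญ์˜ ์ˆ˜๋„๋Š”" means "The capital of Korea is"
print(generator("ํ•œ๊ตญ์˜ ์ˆ˜๋„๋Š”", max_new_tokens=32)[0]["generated_text"])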

Usage

Loading the model in half precision (float16 or bfloat16) requires a minimum of roughly 14GB of memory.
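
The ~14GB figure follows roughly from the parameter count. A back-of-envelope sketch (assuming about 6.7 billion parameters at 2 bytes each in half precision; actual usage is higher once activations and the KV cache are included):

params = 6.7e9          # approximate parameter count of GECKO-7B
bytes_per_param = 2     # float16 / bfloat16 store 2 bytes per weight
print(f"~{params * bytes_per_param / 1e9:.1f} GB")  # ~13.4 GB for the weights alone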

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = 'kifai/GECKO-7B'

# Load the tokenizer and the model in bfloat16, letting transformers place it on available devices
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

# Korean prompt asking the model to explain the HTML snippet and to answer in English
text = """์ด HTML ์ฝ”๋“œ๊ฐ€ ์–ด๋–ค ๊ธฐ๋Šฅ์„ ํ•˜๋Š”์ง€ ์„ค๋ช…ํ•˜๊ณ , ๊ทธ ์„ค๋ช…์„ ์˜์–ด๋กœ ์ œ๊ณตํ•ด์ฃผ์„ธ์š”.
\```html
<button onclick="alert('Welcome!')">Click Me</button>
\```
"""

# Tokenize the prompt, move it to the GPU, and generate up to 512 new tokens
inputs = tokenizer(text, return_tensors='pt')['input_ids'].to('cuda')
output = model.generate(inputs, max_new_tokens=512, repetition_penalty=1.2)
print(tokenizer.decode(output[0], skip_special_tokens=True))
# ์ด HTML ์ฝ”๋“œ๊ฐ€ ์–ด๋–ค ๊ธฐ๋Šฅ์„ ํ•˜๋Š”์ง€ ์„ค๋ช…ํ•˜๊ณ , ๊ทธ ์„ค๋ช…์„ ์˜์–ด๋กœ ์ œ๊ณตํ•ด์ฃผ์„ธ์š”.
# \```html
# <button onclick="alert('Welcome!')">Click Me</button>
# \```
# 
# ## Description
# 
# This is a button that will display the message "Welcome!" when clicked.
# 
# ## Expected Output
# 
# The expected output should be:
# 
# \```text
# Welcome!
# \```

Limitation

GECKO is a generative language model and comes with some risks. Its testing has mainly been conducted in Korean and has not covered all possible scenarios. As with all large language models, GECKO's outputs cannot be predicted in advance and may sometimes be inaccurate, biased, or otherwise problematic. Therefore, developers should conduct safety testing and fine-tune the model for their intended uses before deploying it.

License

GECKO is released under the Apache 2.0 license.

Citation

@misc{oh2024gecko,
      title={GECKO: Generative Language Model for English, Code and Korean}, 
      author={Sungwoo Oh and Donggyu Kim},
      year={2024},
      eprint={2405.15640},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

Acknowledgement

The training was supported by the TPU Research Cloud program.

Contact

We look forward to hearing from you and collaborating with you.
