
Model Card for Saxo/Linkbricks-Horizon-AI-Korean-Advanced-56B

AI ์™€ ๋น…๋ฐ์ดํ„ฐ ๋ถ„์„ ์ „๋ฌธ ๊ธฐ์—…์ธ Linkbricks์˜ ๋ฐ์ดํ„ฐ์‚ฌ์ด์–ธํ‹ฐ์ŠคํŠธ์ธ ์ง€์œค์„ฑ(Saxo) ์ด์‚ฌ๊ฐ€
Nous-Hermes-2-Mixtral-8x7B-DPO ๋ฒ ์ด์Šค๋ชจ๋ธ์„ ์‚ฌ์šฉํ•ด์„œ H100-80G 8๊ฐœ๋ฅผ ํ†ตํ•ด ์•ฝ 26%์ •๋„์˜ ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ํ•œ๊ตญ์–ด CPT(Continued-Pretraining)->SFT->DPO ํ•œ ํ•œ๊ธ€ ์–ธ์–ด ๋ชจ๋ธ
์ฒœ๋งŒ๊ฑด์˜ ํ•œ๊ธ€ ๋‰ด์Šค ์ฝ”ํผ์Šค๋ฅผ ๊ธฐ์ค€์œผ๋กœ ๋‹ค์–‘ํ•œ ํ…Œ์Šคํฌ๋ณ„ ํ•œ๊ตญ์–ด-์ค‘๊ตญ์–ด-์˜์–ด-์ผ๋ณธ์–ด ๊ต์ฐจ ํ•™์Šต ๋ฐ์ดํ„ฐ์™€ ์ˆ˜ํ•™ ๋ฐ ๋…ผ๋ฆฌํŒ๋‹จ ๋ฐ์ดํ„ฐ๋ฅผ ํ†ตํ•˜์—ฌ ํ•œ์ค‘์ผ์˜ ์–ธ์–ด ๊ต์ฐจ ์ฆ๊ฐ• ์ฒ˜๋ฆฌ์™€ ๋ณต์žกํ•œ ๋…ผ๋ฆฌ ๋ฌธ์ œ ์—ญ์‹œ ๋Œ€์‘ ๊ฐ€๋Šฅํ•˜๋„๋ก ํ›ˆ๋ จํ•œ ๋ชจ๋ธ์ด๋‹ค.
-ํ† ํฌ๋‚˜์ด์ €๋Š” ๋‹จ์–ด ํ™•์žฅ ์—†์ด ๋ฒ ์ด์Šค ๋ชจ๋ธ ๊ทธ๋Œ€๋กœ ์‚ฌ์šฉ
-๊ณ ๊ฐ ๋ฆฌ๋ทฐ๋‚˜ ์†Œ์…œ ํฌ์ŠคํŒ… ๊ณ ์ฐจ์› ๋ถ„์„ ๋ฐ ์ฝ”๋”ฉ๊ณผ ์ž‘๋ฌธ, ์ˆ˜ํ•™, ๋…ผ๋ฆฌํŒ๋‹จ ๋“ฑ์ด ๊ฐ•ํ™”๋œ ๋ชจ๋ธ
-128k-Context Window
-Deepspeed Stage=3, rslora ๋ฐ BAdam Layer Mode ์‚ฌ์šฉ
-ollama run benedict/linkbricks-hermes2-mixtral-8x7b-korean-advanced-q4
-ollama run benedict/linkbricks-hermes2-mixtral-8x7b-korean-advanced-q6
-ollama run benedict/linkbricks-hermes2-mixtral-8x7b-korean-advanced-q8
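The card names DeepSpeed Stage 3, rsLoRA, and BAdam layer mode but does not show the training configuration. A minimal sketch of how those three settings might be combined, assuming a LLaMA-Factory-style training YAML (the file paths and all values other than the three named techniques are illustrative assumptions, not taken from the card):

```yaml
# Hypothetical fragment of a LLaMA-Factory-style config; only the
# rsLoRA / BAdam / DeepSpeed Stage 3 switches reflect the model card.
model_name_or_path: NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
finetuning_type: lora
use_rslora: true          # rank-stabilized LoRA scaling
use_badam: true           # block-coordinate Adam
badam_mode: layer         # update one layer block at a time
deepspeed: ds_z3_config.json  # DeepSpeed ZeRO Stage 3 config file (path is an assumption)
```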

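The base model, Nous-Hermes-2-Mixtral-8x7B-DPO, uses the ChatML prompt format, and since the tokenizer is kept as-is the same format should apply to this model. A minimal sketch of assembling such a prompt by hand (the helper name and example messages are illustrative, not part of this card):

```python
def format_chatml(messages):
    """Render a list of {"role", "content"} dicts into a ChatML prompt string."""
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # Leave the assistant turn open so the model continues from here.
    prompt += "<|im_start|>assistant\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful Korean-language assistant."},
    {"role": "user", "content": "Hello"},
]
prompt = format_chatml(messages)
print(prompt)
```

In practice the prompt string would be tokenized with the base tokenizer and passed to the model; recent versions of transformers can usually produce the same layout via `tokenizer.apply_chat_template`.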


www.linkbricks.com, www.linkbricks.vc

Model size: 46.7B params (Safetensors, BF16)

Model tree for Saxo/Linkbricks-Horizon-AI-Korean-Advanced-56B
