Can you release a chat version soon?
Or can I directly fine-tune the Yi-34B model on a chat dataset?
The Yi chat models will be released within weeks~
Of course you can~ We're looking forward to your chat model~
I just did a quick fine-tune on the guanaco dataset, and I highly suspect there will be an official chat model release soon. The base model is uncensored, and aligning it in Chinese will be fun :p
Please share your fine-tuning experience and results.
Pretty much any existing Llama 2 fine-tuning script will work, as long as you load the model with trust_remote_code enabled. In my experience, the result feels better in Chinese than in English.
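For anyone who wants to try the same thing, here is a minimal QLoRA sketch along those lines. The thread only says "any Llama 2 fine-tune script with trust remote code enabled", so the specific dataset id (timdettmers/openassistant-guanaco), the Llama-style target module names, and all hyperparameters below are assumptions from common Guanaco recipes, not details confirmed by the posters:

```python
# Minimal QLoRA fine-tune sketch for Yi-34B (assumptions noted in comments).
# trust_remote_code=True is required because Yi-34B ships custom model code.
import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from trl import SFTTrainer

model_id = "01-ai/Yi-34B"

# 4-bit NF4 quantization so the 34B model fits on a single large GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)

# Guanaco data: assumed to be the commonly used mirror with a "text" column.
dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")

# LoRA on the attention projections; module names assume Yi keeps the
# Llama-style naming (q_proj/k_proj/v_proj/o_proj).
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    peft_config=peft_config,
    args=TrainingArguments(
        output_dir="yi-34b-guanaco-qlora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        learning_rate=2e-4,
        num_train_epochs=1,
        logging_steps=10,
        bf16=True,
    ),
)
trainer.train()
```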
Cool, thanks. I'm going to test this pretrained model as-is and see how well it can follow instructions. If your fine-tuning produced a good chat model, please publish it.
It's an excellent model. I made a QLoRA fine-tune based on a llamafied version:
https://huggingface.co/KnutJaegersberg/Deacon-34B-qlora
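For reference, a minimal sketch of loading that adapter with peft. The post doesn't say which llamafied Yi-34B conversion it was trained against, so the base model path below is a placeholder:

```python
# Sketch: attach the Deacon-34B QLoRA adapter to a llamafied Yi-34B base.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: substitute whichever llamafied Yi-34B conversion you use;
# the thread does not name the exact base repo.
BASE_MODEL = "path/to/llamafied-Yi-34B"

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL, torch_dtype=torch.bfloat16, device_map="auto"
)
# Load the published LoRA weights on top of the base model.
model = PeftModel.from_pretrained(base, "KnutJaegersberg/Deacon-34B-qlora")
```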
This is one of the best responses I've gotten from an open model so far on the question of how to build AGI (though I played a bit with the settings).
Will your fine-tuning approach inherit the 200K context length capability?
Not yet, but once the job I'm currently running finishes, I'll redo it with the 200K model.
This one is done with the 200K context window version, @tjtanaa.
Here is one with chat ability: https://huggingface.co/TheBloke/Nous-Capybara-34B-GGUF. I tested it out a little here: https://huggingface.co/01-ai/Yi-34B/discussions/22#654fb707380ee26b49b3b180
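Since that repo ships GGUF quantizations, a minimal sketch of chatting with it via llama-cpp-python follows. The exact filename and the USER/ASSISTANT prompt template are assumptions on my part; check the model card for the recommended format:

```python
# Sketch: run a quantized Nous-Capybara-34B GGUF file with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="nous-capybara-34b.Q4_K_M.gguf",  # example quant filename; verify against the repo
    n_ctx=4096,
    n_gpu_layers=-1,  # offload all layers to GPU if available
)

# Prompt template assumed from similar chat fine-tunes; see the model card.
prompt = "USER: What is the Yi-34B model?\nASSISTANT:"
out = llm(prompt, max_tokens=256, stop=["USER:"])
print(out["choices"][0]["text"])
```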