Will you make a 3B model as well?
#7 by zokica - opened
Hello,
Thanks for the hard work in making this model.
- Will you make a 3B model as well?
- Do you think finetuning with a LoRA adapter (https://github.com/tloen/alpaca-lora) would work for this 7B model?
- It is our company policy not to talk about our future plans until we launch them. We prefer to speak through our work rather than our words. We are working on many exciting projects right now, and we hope you will find them valuable when they launch.
- We're not sure, but we'd be very curious to hear how it goes for you. We'd be excited to accept pull requests that support it in https://www.github.com/mosaicml/llm-foundry.
jfrankle changed discussion status to closed
Ok, I will update once I try PEFT.
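For anyone following along, here is the core idea LoRA relies on, as a minimal NumPy sketch (illustrative only, not the `peft` API or MosaicML's code): the pretrained weight matrix stays frozen, and only a low-rank update `B @ A` is trained, so the number of trainable parameters drops sharply.

```python
import numpy as np

# Minimal sketch of the LoRA idea (hypothetical example, not library code):
# keep the pretrained weight W frozen and train only a rank-r update B @ A,
# which needs r * (d_in + d_out) parameters instead of d_in * d_out.

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 64, 4

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, init to zero
alpha = 16
scale = alpha / r

def lora_forward(x):
    # base (frozen) path plus the scaled low-rank adapter path
    return x @ W.T + (x @ A.T) @ B.T * scale

x = rng.standard_normal((2, d_in))

# With B initialized to zero the adapter contributes nothing yet,
# so the adapted model starts out identical to the base model.
assert np.allclose(lora_forward(x), x @ W.T)

full_params = W.size
lora_params = A.size + B.size
print(f"trainable fraction: {lora_params / full_params:.3%}")
# → trainable fraction: 12.500%
```

In practice the `peft` library wraps this same trick around the attention projection layers of the transformer; at rank 4 on a 64x64 layer only 12.5% of the weights are trainable, and the fraction shrinks further as layer sizes grow.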
zokica changed discussion status to open
Awesome - thank you! Please keep us posted!
Let's move the PEFT/LoRA discussion here https://github.com/mosaicml/llm-foundry/issues/64 to keep it all in one place, so everyone can share their progress :)
daking changed discussion status to closed