Good morning. You have killer calves. Lookin' great. Merge and GGUF, please?

#1
by snombler - opened

Thanks. Your penis is sturdy and fearsome.

I don't really understand your request.
I can't make an entire model out of a LoRA (yet), and I already merged this into my latest versions of MLewd: https://huggingface.co/models?search=MLewd

Please give more context, kek

You can't merge the LoRA with the base model via something like https://github.com/tloen/alpaca-lora/blob/main/export_hf_checkpoint.py?

That was the original way of merging LoRAs. Does Llama 2 present some problem? I would expect GQA might cause issues for the 34B or 70B models, but I thought Llama 2 was otherwise pretty normal.
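For reference, that script essentially folds the LoRA deltas back into the base weights with peft; a minimal sketch of that kind of merge looks roughly like this (the model and adapter paths below are placeholders, not the actual MLewd repos):

```python
# Minimal sketch of merging a LoRA adapter into its base model with peft,
# roughly what alpaca-lora's export_hf_checkpoint.py does.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE = "meta-llama/Llama-2-13b-hf"   # placeholder base model
ADAPTER = "path/to/lora-adapter"     # placeholder LoRA adapter
OUT = "merged-model"                 # output directory for the full checkpoint

# Load the base model in fp16, attach the adapter, then merge it in.
base = AutoModelForCausalLM.from_pretrained(
    BASE, torch_dtype=torch.float16, low_cpu_mem_usage=True
)
model = PeftModel.from_pretrained(base, ADAPTER)
merged = model.merge_and_unload()    # folds LoRA deltas into the base weights

# Save a plain HF checkpoint plus tokenizer.
merged.save_pretrained(OUT, safe_serialization=True)
AutoTokenizer.from_pretrained(BASE).save_pretrained(OUT)
```

The output folder is a normal HF checkpoint, so llama.cpp's convert script should then be able to turn it into a GGUF.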

No worries if that's the case.

snombler changed discussion status to closed
