Llama-3.1-8B-Pruned-4-Layers / mergekit_config.yml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 23]
    model: meta-llama/Meta-Llama-3.1-8B
- sources:
  - layer_range: [28, 32]
    model: meta-llama/Meta-Llama-3.1-8B
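As a sanity check on the slice arithmetic, the retained and dropped layer indices can be computed directly. This sketch assumes mergekit's usual Python-slice convention for `layer_range` (start inclusive, end exclusive) and the 32 decoder layers of Meta-Llama-3.1-8B:

```python
# Layer ranges copied from the config above; assuming end-exclusive
# semantics, [0, 23] keeps layers 0..22 and [28, 32] keeps 28..31.
ranges = [(0, 23), (28, 32)]
total_layers = 32  # decoder layers in Meta-Llama-3.1-8B

kept = [i for start, end in ranges for i in range(start, end)]
dropped = [i for i in range(total_layers) if i not in kept]

print(len(kept))  # layers remaining in the pruned model → 27
print(dropped)    # layers removed by the merge → [23, 24, 25, 26, 27]
```

Under this reading the passthrough merge stitches the two slices back-to-back, so the pruned model has 27 layers and layers 23 through 27 of the base model are omitted.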