
RPMerge

See the main model card: https://huggingface.co/brucethemoose/Yi-34B-200K-RPMerge

Quantized with the default exl2 settings; the benefits and drawbacks of quantizing at long context (32K) are still being investigated.
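For reference, an exl2 quant like this one can be loaded with the exllamav2 Python API, with the context window set as far toward the model's 200K training length as VRAM allows. Below is a minimal sketch; the local weight directory, sampler values, and prompt are illustrative assumptions, not part of this repo:

import sys

from exllamav2 import (
    ExLlamaV2,
    ExLlamaV2Cache,
    ExLlamaV2Config,
    ExLlamaV2Tokenizer,
)
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./Yi-34B-200K-RPMerge-exl2-31bpw"  # assumed local download path
config.prepare()
config.max_seq_len = 32768  # raise or lower to trade context length for VRAM

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # cache is allocated as layers load
model.load_autosplit(cache)               # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # example value; tune to taste

prompt = "SYSTEM: You are a creative writing assistant.\nUSER: Hello!\nASSISTANT:"
print(generator.generate_simple(prompt, settings, num_tokens=200))

See the main model card linked above for the recommended prompt format and sampler settings.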

Merge Details

Merge Method

This model was merged using the DARE TIES merge method, with /home/alpha/Models/Raw/chargoddard_Yi-34B-200K-Llama as the base.

Models Merged

The following models were included in the merge:

  • /home/alpha/Models/Raw/migtissera_Tess-34B-v1.5b
  • /home/alpha/Models/Raw/migtissera_Tess-M-Creative-v1.0
  • /home/alpha/Models/Raw/cgato_Thespis-34b-DPO-v0.7
  • /home/alpha/Models/Raw/Nous-Capybara-34B
  • /home/alpha/Models/Raw/admo_limarp
  • /home/alpha/Models/Raw/DrNicefellow_ChatAllInOne-Yi-34B-200K-V1

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: /home/alpha/Models/Raw/chargoddard_Yi-34B-200K-Llama
    # No parameters necessary for base model
  - model: /home/alpha/Models/Raw/migtissera_Tess-34B-v1.5b
    # Emphasize the beginning of Vicuna-format models
    parameters:
      weight: 0.19
      density: 0.59
  - model: /home/alpha/Models/Raw/Nous-Capybara-34B
    parameters:
      weight: 0.19
      density: 0.55
  # Vicuna format
  - model: /home/alpha/Models/Raw/migtissera_Tess-M-Creative-v1.0
    parameters:
      weight: 0.05
      density: 0.55
  - model: /home/alpha/Models/Raw/DrNicefellow_ChatAllInOne-Yi-34B-200K-V1
    parameters:
      weight: 0.19
      density: 0.55
  - model: /home/alpha/Models/Raw/admo_limarp
    parameters:
      weight: 0.19
      density: 0.48
  - model: /home/alpha/Models/Raw/cgato_Thespis-34b-DPO-v0.7
    parameters:
      weight: 0.19
      density: 0.59


merge_method: dare_ties
tokenizer_source: union
base_model: /home/alpha/Models/Raw/chargoddard_Yi-34B-200K-Llama
parameters:
  int8_mask: true
dtype: bfloat16
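
The config is written for mergekit, with the source models staged at local paths under /home/alpha/Models/Raw; to rerun it, those paths would need to point at local copies of the listed models. A minimal reproduction sketch, assuming the YAML above is saved as rpmerge.yml (the filename and output path are assumptions), following the Python entry point from mergekit's README:

import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML config above into mergekit's config model
with open("rpmerge.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Yi-34B-200K-RPMerge",    # assumed output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is present
        lazy_unpickle=True,              # reduce peak RAM while reading shards
    ),
)

The same config can also be run through the mergekit-yaml command-line entry point instead of the Python API.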