# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the della_linear merge method, with anthracite-org/magnum-v2.5-12b-kto as the base model.
### Models Merged

The following models were included in the merge:

* nbeerbower/mistral-nemo-gutenberg-12B-v3
* TheDrummer/Rocinante-12B-v1
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: nbeerbower/mistral-nemo-gutenberg-12B-v3
    parameters:
      weight: 0.30
      density: 0.90
  - model: anthracite-org/magnum-v2.5-12b-kto
    parameters:
      weight: 0.40
      density: 0.95
  - model: TheDrummer/Rocinante-12B-v1
    parameters:
      weight: 0.20
      density: 0.45
merge_method: della_linear
base_model: anthracite-org/magnum-v2.5-12b-kto
parameters:
  int8_mask: true
  epsilon: 0.05
  lambda: 1
dtype: bfloat16
```
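To illustrate what the `weight` and `density` parameters above do, here is a minimal NumPy sketch of the idea behind a della_linear-style merge: each model's delta from the base is pruned to keep only a `density` fraction of its entries (rescaled to compensate), and the pruned deltas are combined with the listed `weight`s. This is a simplified, deterministic magnitude-based stand-in for DELLA's stochastic, magnitude-proportional dropping, not mergekit's actual implementation; all function names here are illustrative.

```python
import numpy as np

def prune_and_rescale(delta, density):
    """Keep the top `density` fraction of entries by magnitude and
    rescale survivors by 1/density (deterministic stand-in for
    DELLA's stochastic drop-and-rescale)."""
    flat = np.abs(delta).ravel()
    k = max(1, int(round(density * flat.size)))
    threshold = np.sort(flat)[-k]  # magnitude of the k-th largest entry
    mask = np.abs(delta) >= threshold
    return np.where(mask, delta / density, 0.0)

def della_linear_sketch(base, models, weights, densities):
    """Add the weighted sum of pruned task vectors back onto the base."""
    merged = base.copy()
    for m, w, d in zip(models, weights, densities):
        merged += w * prune_and_rescale(m - base, d)
    return merged

# Toy tensors using the weights/densities from the config above.
rng = np.random.default_rng(0)
base = rng.normal(size=(4, 4))
models = [base + rng.normal(scale=0.1, size=(4, 4)) for _ in range(3)]
merged = della_linear_sketch(base, models,
                             weights=[0.30, 0.40, 0.20],
                             densities=[0.90, 0.95, 0.45])
```

In the real merge this runs per-tensor over the full checkpoints; the `epsilon` and `lambda` parameters further shape the drop probabilities and the scaling of the combined deltas.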