A large LLM created by merging two fine-tuned Llama 70B models, Xwin and Euryale, into a single 120B model.
Credits to:
- @chargoddard for developing mergekit, the framework used to merge the models.
- @Undi95 for helping with the merge ratios.
#merge
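A merge like this is typically described by a mergekit configuration file. The sketch below assumes a passthrough (layer-interleaving) merge; the model repository names and layer ranges are illustrative assumptions, not the exact recipe used for this model.

```yaml
# Hypothetical mergekit config: interleave layer slices from two
# fine-tuned Llama 70B models into one deeper merged model.
# Model names and layer ranges are illustrative, not the actual recipe.
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [0, 40]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [20, 60]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [40, 80]
merge_method: passthrough   # concatenate slices rather than average weights
dtype: float16
```

With mergekit installed, a config like this is run via its `mergekit-yaml` entry point, e.g. `mergekit-yaml config.yml ./merged-model`.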
| Context | Input Price | Cache Read Price | Cache Write Price | Output Price |
|---------|-------------|------------------|-------------------|--------------|
| 6k      | 3.75        | -                | -                 | 7.50         |