Goliath 120B

A large LLM served via OpenRouter, created by combining two fine-tuned Llama 2 70B models (Xwin and Euryale) into a single 120B model.

Credits:
- @chargoddard for developing mergekit, the framework used to merge the models
- @Undi95 for helping with the merge ratios
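The merge described above was produced with mergekit. A passthrough (layer-interleaving) configuration in the style below is one way such a frankenmerge can be expressed; the model repo names and layer ranges here are illustrative assumptions, not Goliath's actual recipe.

```yaml
# Illustrative mergekit passthrough config: stack alternating layer slices
# from the two fine-tuned 70B source models into one deeper 120B model.
# Layer ranges and repo names are examples only, not the real merge ratios.
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [0, 40]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [20, 60]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```

Passthrough merging copies weights directly rather than averaging them, which is how two 70B models can yield a parameter count larger than either source.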

Capabilities

Context Window: 6k tokens
Max Output: 1k tokens

Pricing (per 1M tokens)

Input: $3.75
Output: $7.50
Cache Read: -
Cache Write: -
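To make the per-million-token pricing concrete, a small sketch below estimates the dollar cost of a single request from the listed rates. The helper function is hypothetical, not part of any SDK.

```python
# Prices from the table above, in USD per 1M tokens.
INPUT_PRICE_PER_M = 3.75
OUTPUT_PRICE_PER_M = 7.50

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# A request that fills the 6k context and produces the 1k max output:
print(f"${estimate_cost(6_000, 1_000):.4f}")  # → $0.0300
```

At these rates, even a maximally long request costs a few cents; output tokens are billed at twice the input rate.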

Supported Parameters

`frequency_penalty`, `logit_bias`, `logprobs`, `max_tokens`, `min_p`, `presence_penalty`, `repetition_penalty`, `response_format`, `seed`, `stop`, `temperature`, `top_a`, `top_k`, `top_logprobs`, `top_p`
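A sketch of how several of these parameters might appear in a chat-completions request body. The request shape follows OpenRouter's OpenAI-compatible API; the model slug and all parameter values here are assumptions chosen for illustration.

```python
import json

# Example request body using a subset of the supported sampling parameters.
# Values are illustrative; tune them for your use case.
payload = {
    "model": "alpindale/goliath-120b",
    "messages": [{"role": "user", "content": "Write a short story."}],
    "max_tokens": 512,          # stay under the 1k output cap
    "temperature": 0.9,         # higher = more varied prose
    "top_p": 0.95,              # nucleus sampling cutoff
    "min_p": 0.05,              # drop tokens below 5% of the top probability
    "repetition_penalty": 1.1,  # discourage verbatim loops
    "seed": 42,                 # best-effort reproducibility
    "stop": ["###"],            # cut generation at this marker
}

print(json.dumps(payload, indent=2))
```

This body would be POSTed to the chat-completions endpoint with an API key in the `Authorization` header; unsupported parameters are generally ignored rather than rejected.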