Z.ai: GLM-5.1 Chat
GLM-5.1 delivers a major leap in coding capability, with particularly significant gains in handling long-horizon tasks. Unlike previous models built around minute-level interactions, GLM-5.1 can work independently and continuously on...
Capabilities
Context Window 202k tokens
Max Output 0 tokens
Pricing (per 1M tokens)
Input $1.26
Output $3.96
Cache Read -
Cache Write -
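The listed rates can be turned into a quick per-request cost estimate. This is a minimal sketch using only the input and output prices above (cache pricing is not listed, so it is ignored); the helper name is illustrative, not part of any official SDK.

```python
# Sketch: estimate request cost from GLM-5.1's listed per-million-token rates.
INPUT_RATE = 1.26   # USD per 1M input tokens (from the pricing table above)
OUTPUT_RATE = 3.96  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the approximate USD cost of one request."""
    return (input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE) / 1_000_000

# e.g. a 100k-token prompt with a 10k-token completion
print(round(estimate_cost(100_000, 10_000), 4))  # → 0.1656
```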
Supported Parameters
frequency_penalty, include_reasoning, logit_bias, logprobs, max_tokens, min_p, presence_penalty, reasoning, repetition_penalty, response_format, seed, stop, structured_outputs, temperature, tool_choice, tools, top_k, top_logprobs, top_p
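A request body using a subset of these parameters might look like the following. This is a hedged sketch: the model slug "z-ai/glm-5.1" and the OpenAI-compatible chat-completions request shape are assumptions, not confirmed by this page; only the parameter names themselves come from the list above.

```python
import json

# Assumed OpenAI-compatible payload; "z-ai/glm-5.1" is a hypothetical slug.
payload = {
    "model": "z-ai/glm-5.1",
    "messages": [{"role": "user", "content": "Summarize this diff."}],
    # Sampling and control parameters drawn from the supported list above:
    "max_tokens": 1024,
    "temperature": 0.7,
    "top_p": 0.95,
    "frequency_penalty": 0.1,
    "seed": 42,
    "stop": ["</answer>"],
    "response_format": {"type": "json_object"},
}

print(json.dumps(payload, indent=2))
```

Parameters not set here (e.g. `tools`, `tool_choice`, `logit_bias`) follow the same flat key structure.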