Llama 4 Scout

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion parameters out of a total of 109B. It supports native multimodal input...

Capabilities

Context Window: 327k tokens
Max Output: 16k tokens
Inputs: Text, Image
Outputs: Text

Pricing (per 1M tokens)

Input: $0.08
Output: $0.30
Cache Read: n/a
Cache Write: n/a
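As a sketch of how the per-1M-token rates above translate to the cost of a single request (the token counts are illustrative, not from the source):

```python
# Per-token rates from the pricing table above (USD per 1M tokens).
INPUT_RATE = 0.08
OUTPUT_RATE = 0.30

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_RATE \
         + (output_tokens / 1_000_000) * OUTPUT_RATE

# e.g. a 100k-token prompt with a 10k-token completion
print(f"${request_cost(100_000, 10_000):.4f}")  # → $0.0110
```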

Supported Parameters

frequency_penalty, logit_bias, max_tokens, min_p, presence_penalty, repetition_penalty, response_format, seed, stop, structured_outputs, temperature, tool_choice, tools, top_k, top_p
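A minimal sketch of how several of these parameters might appear in an OpenAI-compatible chat request body, as used by OpenRouter's chat completions endpoint. The model slug and all parameter values here are illustrative assumptions; check the OpenRouter catalog for the exact model ID.

```python
import json

# Assumed model slug for Llama 4 Scout on OpenRouter; verify before use.
MODEL = "meta-llama/llama-4-scout"

def build_payload(prompt: str) -> dict:
    """Build an OpenAI-compatible chat payload using a few of the
    supported parameters listed above (values are illustrative)."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,          # cap the completion length
        "temperature": 0.7,         # sampling temperature
        "top_p": 0.9,               # nucleus sampling cutoff
        "frequency_penalty": 0.1,   # discourage verbatim repetition
        "seed": 42,                 # best-effort reproducibility
    }

payload = build_payload("Summarize MoE routing in two sentences.")
print(json.dumps(payload, indent=2))
```

The payload would be POSTed to `https://openrouter.ai/api/v1/chat/completions` with an `Authorization: Bearer <API key>` header; the request/response shape follows the OpenAI chat completions convention.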