LFM2-2.6B
LFM2 is a new generation of hybrid models developed by Liquid AI, specifically designed for edge AI and on-device deployment. It sets a new standard for quality, speed, and memory efficiency.

Capabilities

Context Window: 32K tokens
Max Output: —
Inputs: Text
Outputs: Text

Pricing (per 1M tokens)

Input: $0.01
Output: $0.02
Cache Read: —
Cache Write: —
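Since pricing is quoted per 1M tokens, the cost of a single request is straightforward to estimate. A minimal sketch, using the input and output rates from the table above:

```python
# Estimate the cost of one request at LFM2-2.6B's listed rates.
# Prices are USD per 1M tokens, taken from the pricing table above.
INPUT_PRICE_PER_M = 0.01
OUTPUT_PRICE_PER_M = 0.02

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for one request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: 8,000 prompt tokens and 1,000 completion tokens.
cost = request_cost(8_000, 1_000)
print(f"${cost:.6f}")  # $0.000100
```

At these rates, even long-context requests near the 32K window cost a fraction of a cent.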

Supported Parameters

frequency_penalty, max_tokens, min_p, presence_penalty, repetition_penalty, seed, stop, temperature, top_k, top_p
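These parameters are passed in the body of a chat-completions request. A minimal sketch of such a payload, assuming an OpenAI-compatible endpoint; the model ID `liquid/lfm2-2.6b` is an assumption, so check the provider's model list for the exact identifier:

```python
import json

# Hypothetical chat-completions payload exercising the supported
# sampling parameters listed above. Only the payload is built here;
# sending it requires an API key and an HTTP POST to the provider's
# chat completions endpoint.
payload = {
    "model": "liquid/lfm2-2.6b",  # assumed identifier
    "messages": [
        {"role": "user", "content": "Summarize edge AI in one sentence."}
    ],
    "max_tokens": 256,          # cap on completion length
    "temperature": 0.7,         # sampling randomness
    "top_p": 0.9,               # nucleus sampling cutoff
    "top_k": 40,                # sample only from the 40 most likely tokens
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
    "frequency_penalty": 0.0,   # penalize tokens by how often they appear
    "presence_penalty": 0.0,    # penalize tokens that have appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeats
    "seed": 42,                 # best-effort reproducible sampling
    "stop": ["\n\n"],           # stop sequences
}

print(json.dumps(payload, indent=2))
```

Any parameter can be omitted to fall back to the provider's defaults; unsupported parameters outside this list are typically ignored or rejected.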