roddsrod/llama-4-maverick-instruct-17bx128e
Published on 4/29/2025
Llama 4 Maverick Instruct (free)

Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward pass (400B total). Optimized for vision-language tasks, Maverick is instruction-tuned for assistant-like behavior, image reasoning, and general-purpose multimodal interaction.
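The "17 billion active parameters per forward pass (400B total)" figure follows from the MoE design: a router activates only a small subset of the 128 experts for each token, so most of the model's weights sit idle on any given pass. A minimal illustrative sketch of top-1 expert routing (toy sizes and function names are assumptions for illustration, not Meta's implementation):

```python
# Toy sketch of mixture-of-experts (MoE) top-1 routing: the router
# scores all experts for a token, and only the winning expert runs,
# which is why active parameters are far fewer than total parameters.
# All names and numbers here are illustrative.

def route_top1(router_logits):
    """Return the index of the single highest-scoring expert."""
    return max(range(len(router_logits)), key=lambda i: router_logits[i])

def moe_layer(token, router_logits, experts):
    """Run the token through only the selected expert (sparse compute)."""
    chosen = route_top1(router_logits)
    return experts[chosen](token), chosen

# Demo with 4 experts instead of 128; each expert just scales its input.
experts = [lambda x, k=k: x * (k + 1) for k in range(4)]
logits = [0.1, 2.5, -1.0, 0.3]   # router scores for one token
out, chosen = moe_layer(10, logits, experts)
print(chosen, out)               # expert 1 wins → 10 * 2 = 20
```

With 128 experts and top-k routing, the same pattern means each token touches only a 17B-parameter slice of the 400B-parameter model.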

Models
Llama 4 Maverick Instruct (free) (openrouter)