novita/dolphin-mixtral-8x22b
public
Published on 3/5/2025
dolphin-mixtral-8x22b

Dolphin-Mixtral-8x22B is a fine-tune of Mixtral's 8x22B mixture-of-experts (MoE) model with a 64k context length. It improves instruction following, conversational ability, and coding, but it ships uncensored, so deployments should add their own alignment layer (for example, a system prompt) before exposing it to users.
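
Because the model carries no built-in alignment, the system prompt is typically where a deployment defines its own guardrails. Below is a minimal sketch, assuming an OpenAI-compatible chat endpoint; the base URL and the model ID `cognitivecomputations/dolphin-mixtral-8x22b` are assumptions, so verify both against the provider's documentation:

```python
# Minimal sketch: querying dolphin-mixtral-8x22b through an
# OpenAI-compatible endpoint. The base URL and model ID below are
# assumptions; check the provider's docs before use.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.novita.ai/v3/openai",  # assumed endpoint
    api_key="<YOUR_API_KEY>",
)

response = client.chat.completions.create(
    model="cognitivecomputations/dolphin-mixtral-8x22b",  # assumed model ID
    messages=[
        # The model has no built-in alignment, so the system prompt
        # is where you state your own policy and guardrails.
        {
            "role": "system",
            "content": "You are a helpful assistant. Refuse unsafe requests.",
        },
        {
            "role": "user",
            "content": "Write a Python function that reverses a string.",
        },
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```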

Models

dolphin-mixtral-8x22b