Dolphin-Mixtral-8x22B is a fine-tune of the Mixtral 8x22B mixture-of-experts (MoE) model with a 64k context length. It strengthens instruction following, conversation, and coding, but it is released without built-in alignment, so deployers should add their own alignment layer (for example, a system prompt) before exposing it as a service.
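A minimal sketch of one way to serve the model with a deployer-supplied alignment layer, assuming the Hugging Face repo id `cognitivecomputations/dolphin-2.9-mixtral-8x22b`, the `transformers` library, and a chat template shipped with the tokenizer; the system prompt here is a hypothetical stand-in for whatever alignment policy the deployer chooses.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id for the Dolphin fine-tune of Mixtral 8x22B.
model_id = "cognitivecomputations/dolphin-2.9-mixtral-8x22b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick a suitable dtype for the available hardware
    device_map="auto",    # requires `accelerate`; shards the MoE across devices
)

# The system message acts as the "external alignment" layer the model card
# expects deployers to provide, since the fine-tune itself is uncensored.
messages = [
    {"role": "system", "content": "You are a helpful assistant. Refuse unsafe requests."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```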