DeepSeek-MoE vs Codestral 22B

DeepSeek's DeepSeek-MoE vs Mistral's Codestral 22B: side-by-side specs.

Overview

                     DeepSeek-MoE    Codestral 22B
Company              DeepSeek        Mistral
Release date         Jan 9, 2024     May 29, 2024
Model type           —               —
Open source          Yes             No

Specifications

Parameters           —               22B
Context window       —               —

Benchmarks

Science reasoning (GPQA Diamond)            —               —
Software engineering (SWE-Bench Verified)   —               —
Multimodal understanding (MMMU)             —               —

(— = no published value)

Timeline

Release gap: DeepSeek-MoE shipped 141 days before Codestral 22B

Which is better: DeepSeek-MoE or Codestral 22B?

DeepSeek-MoE and Codestral 22B cannot be compared head-to-head on GPQA Diamond, SWE-Bench Verified, or MMMU: at least one of the two models has no published scores on those benchmarks. DeepSeek-MoE also shipped 141 days before Codestral 22B, so any comparison should account for the progress made in those intervening months.
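
The 141-day gap follows directly from the two release dates listed above; a quick sanity check with Python's standard datetime module:

```python
from datetime import date

deepseek_moe_release = date(2024, 1, 9)   # DeepSeek-MoE release date
codestral_release = date(2024, 5, 29)     # Codestral 22B release date

# Subtracting two dates yields a timedelta; .days gives the gap.
gap = codestral_release - deepseek_moe_release
print(gap.days)  # 141
```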

DeepSeek-MoE is an open-source / open-weight model; Codestral 22B is proprietary.
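
In practice, "open weight" means the checkpoint can be downloaded and run locally. Below is a minimal sketch using the Hugging Face transformers library; the repository id is an assumption for illustration and does not appear on this page:

```python
# Minimal sketch of running an open-weight model locally.
# NOTE: the repo id below is an assumption, not taken from this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "deepseek-ai/deepseek-moe-16b-base"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

No equivalent open-weight workflow applies to Codestral 22B, which this page lists as proprietary.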


Frequently asked questions

When was DeepSeek-MoE released?
DeepSeek-MoE was released by DeepSeek on Jan 9, 2024.
When was Codestral 22B released?
Codestral 22B was released by Mistral on May 29, 2024.
Is DeepSeek-MoE or Codestral 22B open source?
DeepSeek-MoE is an open-source / open-weight model released by DeepSeek. Codestral 22B is a proprietary model released by Mistral.
