Meta

🇺🇸 United States · Website
Best model: LLaMA-65B (average score 61.9)
Models tracked on BenchGecko: 24
Open source: 24 (100% of models)
Cheapest model: $0.02 per 1M input tokens
Average benchmark score: 40.4 across 18 evaluated models

Model categories
LLM: 20 · Multimodal: 4

Price range ($/1M input tokens)
$0.02 · $0.02 · $0.03 · $0.03 · $0.05 · $0.05 · $0.08 · $0.10 · $0.15 · $0.18 · $0.40 · $0.51
Low: $0.02 · Median: $0.08 · High: $0.51
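The low/median/high summary can be reproduced from the listed prices. Note that with an even number of prices a conventional interpolated median would give $0.065, so the page appears to report the upper median; a minimal sketch under that assumption:

```python
import statistics

# Prices ($/1M input tokens) as shown in the price-range widget above.
prices = [0.02, 0.02, 0.03, 0.03, 0.05, 0.05,
          0.08, 0.10, 0.15, 0.18, 0.40, 0.51]

low = min(prices)
high = max(prices)
# Assumption: the site reports the upper median (statistics.median_high),
# which matches the displayed $0.08; plain statistics.median would give 0.065.
median = statistics.median_high(prices)

print(f"Low: ${low:.2f} · Median: ${median:.2f} · High: ${high:.2f}")
```

With these twelve prices the script prints the same Low/Median/High figures shown on the page.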

Open-source ratio
100% (24 open source, 0 proprietary)

All Meta models (24 total)

| # | Model | License | Avg | Tested benchmark scores | $/1M in | Context | Released |
|---|-------|---------|-----|-------------------------|---------|---------|----------|
| 1 | LLaMA-65B | Open | 61.9 | ARC AI2 59.3 · BBH 44.5 · GSM8K 54.4 · HellaSwag 78.9 · LAMBADA 77.7 · MMLU 51.2 · Winogrande 54.0 · TriviaQA 86.0 · PIQA 65.6 · OpenBookQA 46.9 | - | - | Jan 24 |
| 2 | LLaMA-33B | Open | 57.9 | ARC AI2 56.7 · BBH 33.3 · GSM8K 44.1 · HellaSwag 77.1 · LAMBADA 77.2 · MMLU 44.9 · Winogrande 52.0 · TriviaQA 83.8 · PIQA 64.6 · OpenBookQA 44.8 | - | - | Jan 24 |
| 3 | Llama 2-70B | Open | 56.6 | ARC AI2 71.1 · BBH 53.2 · GSM8K 69.6 · HellaSwag 80.4 · LAMBADA 78.9 · MMLU 59.9 · GPQA diamond 1.8 · MATH level 5 3.3 · Winogrande 60.4 · TriviaQA 87.6 · PIQA 65.6 · OpenBookQA 46.9 | - | - | Jan 24 |
| 4 | Llama 2-13B | Open | 53.7 | ARC AI2 47.1 · BBH 44.3 · GSM8K 36.9 · HellaSwag 74.3 · LAMBADA 76.5 · MMLU 40.8 · Winogrande 45.6 · TriviaQA 79.6 · ScienceQA 41.0 · PIQA 61.6 · OpenBookQA 42.7 | - | - | Jan 24 |
| 5 | Llama 2-34B | Open | 50.4 | ARC AI2 39.3 · BBH 25.5 · GSM8K 42.2 · MMLU 50.1 · Winogrande 53.4 · TriviaQA 84.6 · PIQA 63.8 · OpenBookQA 44.3 | - | - | Jan 24 |
| 6 | Llama 3.1-405B | Open | 49.3 | ARC AI2 93.7 · BBH 77.2 · HellaSwag 85.6 · MMLU 79.3 · GPQA diamond 34.5 · MATH level 5 49.8 · otis mock 9.6 · WeirdML 21.4 · Winogrande 78.4 · SimpleBench 7.6 · TriviaQA 82.7 · PIQA 71.8 · OpenBookQA 32.3 · Cybench 7.5 · the agent 7.4 | - | - | Jan 24 |
| 7 | LLaMA-13B | Open | 45.7 | ARC AI2 36.9 · BBH 17.2 · GSM8K 20.6 · HellaSwag 72.3 · LAMBADA 75.2 · MMLU 30.3 · Winogrande 46.0 · TriviaQA 77.9 · ScienceQA 24.4 · PIQA 60.2 · OpenBookQA 41.9 | - | - | Jan 24 |
| 8 | Llama 2-7B | Open | 43.0 | ARC AI2 27.9 · BBH 18.9 · GSM8K 16.7 · HellaSwag 69.6 · LAMBADA 73.3 · MMLU 27.7 · Winogrande 38.4 · TriviaQA 73.7 · ScienceQA 24.1 · PIQA 57.6 · OpenBookQA 44.8 | - | - | Jan 24 |
| 9 | Llama 3 8B Instruct | Open | 41.7 | ARC AI2 77.1 · MMLU 58.4 · GPQA diamond 1.4 · MATH level 5 6.1 · otis mock 0.7 · Winogrande 51.4 · TriviaQA 67.7 · OpenBookQA 76.8 · ANLI 36.0 | $0.03 | 8K | Apr 24 |
| 10 | LLaMA-7B | Open | 39.7 | ARC AI2 30.1 · BBH 11.3 · GSM8K 11.0 · HellaSwag 68.3 · LAMBADA 73.3 · MMLU 14.1 · Winogrande 40.2 · TriviaQA 71.0 · ScienceQA 14.9 · PIQA 59.6 · OpenBookQA 42.9 | - | - | Jan 24 |
| 11 | Llama 3.2 90B | Open | 36.1 | MMLU 73.7 · GPQA diamond 21.4 · MATH level 5 39.4 · otis mock 2.5 · Balrog 27.3 · GeoBench 52.0 | - | - | Jan 24 |
| 12 | Llama 3 70B Instruct | Open | 31.7 | MMLU 72.4 · GPQA diamond 20.8 · MATH level 5 22.6 · otis mock 4.2 · Winogrande 67.0 · OpenBookQA 30.1 · Cybench 5.0 | $0.51 | 8K | Apr 24 |
| 13 | Llama 3.3 70B Instruct (free) | Open | 29.1 | MMLU 81.7 · GPQA diamond 29.9 · MATH level 5 41.6 · otis mock 5.0 · WeirdML 14.4 · SimpleBench 3.9 · fiction li 33.3 · Balrog 23.0 | $0.00 | 66K | Dec 24 |
| 14 | Llama 3.3 70B Instruct | Open | 29.1 | MMLU 81.7 · GPQA diamond 29.9 · MATH level 5 41.6 · otis mock 5.0 · WeirdML 14.4 · SimpleBench 3.9 · fiction li 33.3 · Balrog 23.0 | $0.10 | 131K | Dec 24 |
| 15 | Llama 3.1 8B Instruct | Open | 28.7 | GSM8K 82.4 · MMLU 41.5 · GPQA diamond 1.3 · MATH level 5 22.9 · otis mock 2.4 · WeirdML 1.7 · PIQA 62.4 · Balrog 15.1 | $0.02 | 16K | Jul 24 |
| 16 | Llama 4 Maverick | Open | 28.0 | GPQA diamond 56.0 · MATH level 5 73.0 · otis mock 20.5 · WeirdML 24.5 · SimpleBench 13.2 · aider poly 15.6 · lech mazur 63.7 · fiction li 46.2 · swe bench 21.0 · frontierma 0.7 · ARC-AGI-2 0.1 · HLE 0.9 · GeoBench 52.0 · ARC-AGI 4.4 | $0.15 | 1.0M | Apr 25 |
| 17 | Llama 3.1 70B Instruct | Open | 26.1 | MMLU 73.5 · GPQA diamond 25.6 · MATH level 5 36.7 · otis mock 3.5 · WeirdML 9.0 · Balrog 27.9 · the agent 6.9 | $0.40 | 131K | Jul 24 |
| 18 | Llama 4 Scout | Open | 18.9 | GPQA diamond 35.8 · MATH level 5 62.3 · otis mock 7.7 · fiction li 36.0 · swe bench 9.1 · frontierma 0.1 · ARC-AGI-2 0.1 · ARC-AGI 0.5 | $0.08 | 328K | Apr 25 |
Scores in % unless noted. Avg = unweighted mean across tested benchmarks.