EVALITA-LLM Leaderboard

Open Italian LLM Leaderboard

Evalita-LLM is a benchmark designed to evaluate Large Language Models (LLMs) on Italian tasks. The distinguishing features of Evalita-LLM are the following: (i) all tasks are native Italian, avoiding translation issues and potential cultural biases; (ii) the benchmark includes generative tasks, enabling more natural interaction with LLMs; (iii) all tasks are evaluated against multiple prompts, this way mitigating the model sensitivity to specific prompts and allowing a fairer evaluation.

Multiple-choice tasks: ๐Ÿ“ŠTE (Textual Entailment), ๐Ÿ˜ƒSA (Sentiment Analysis), โš ๏ธHS (Hate Speech Detection), ๐ŸฅAT (Admission Test), ๐Ÿ”คWIC (Word in Context), โ“FAQ (Frequently Asked Questions)
Generative tasks: ๐Ÿ”„LS (Lexical Substitution), ๐Ÿ“SU (Summarization), ๐Ÿท๏ธNER (Named Entity Recognition), ๐Ÿ”—REL (Relation Extraction)

{
  • "headers": [
    • "Rank",
    • "Size",
    • "FS",
    • "IS_FS",
    • "Model",
    • "Avg. Comb. Perf. โฌ†๏ธ",
    • "TE",
    • "TE Prompt Average",
    • "TE Prompt Std",
    • "TE Best Prompt",
    • "TE Best Prompt Id",
    • "SA",
    • "SA Prompt Average",
    • "SA STD Accuracy",
    • "SA Best Prompt",
    • "SA Best Prompt Id",
    • "HS",
    • "HS Prompt Average",
    • "HS Prompt Std",
    • "HS Best Prompt",
    • "HS Best Prompt Id",
    • "AT",
    • "AT Prompt Average",
    • "AT Prompt Std",
    • "AT Best Prompt",
    • "AT Best Prompt Id",
    • "WIC",
    • "WIC Prompt Average",
    • "WIC Prompt Std",
    • "WIC Best Prompt",
    • "WIC Best Prompt Id",
    • "FAQ",
    • "FAQ Prompt Average",
    • "FAQ Prompt Std",
    • "FAQ Best Prompt",
    • "FAQ Best Prompt Id",
    • "LS",
    • "LS Prompt Average",
    • "LS Prompt Std",
    • "LS Best Prompt",
    • "LS Best Prompt Id",
    • "SU",
    • "SU Prompt Average",
    • "SU Prompt Std",
    • "SU Best Prompt",
    • "SU Best Prompt Id",
    • "NER",
    • "NER Prompt Average",
    • "NER Prompt Std",
    • "NER Best Prompt",
    • "NER Best Prompt Id",
    • "REL",
    • "REL Prompt Average",
    • "REL Prompt Std",
    • "REL Best Prompt",
    • "REL Best Prompt Id",
    • "Architecture",
    • "Hub License",
    • "#Params (B)",
    • "Hub โค๏ธ",
    • "Available on the hub",
    • "Model sha"
    ],
  • "data": [
    • [
      • 1,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Large-Instruct-2411" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Large-Instruct-2411</a> ๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต๐Ÿ†",
      • 62.28,
      • 80.96,
      • 80.5,
      • 1.87,
      • 83.25,
      • 6,
      • 80.94,
      • 80.71,
      • 0.98,
      • 81.98,
      • 3,
      • 77.03,
      • 76.61,
      • 2.71,
      • 78.58,
      • 2,
      • 76.28,
      • 75.27,
      • 15.06,
      • 94.8,
      • 3,
      • 75.55,
      • 74.43,
      • 6.32,
      • 80.07,
      • 4,
      • 54.42,
      • 54.2,
      • 35.09,
      • 99.5,
      • 3,
      • 38.56,
      • 38.24,
      • 0.74,
      • 38.76,
      • 2,
      • 33.41,
      • 30.91,
      • 5.42,
      • 34.74,
      • 2,
      • 38.79,
      • 38.75,
      • 0.08,
      • 38.81,
      • 1,
      • 66.82,
      • 66.76,
      • 0.24,
      • 66.93,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 123,
      • 0,
      • true,
      • ""
      ],
    • [
      • 2,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-Tulu-3-70B</a>",
      • 60.16,
      • 82.42,
      • 82.12,
      • 1.71,
      • 84,
      • 2,
      • 80.41,
      • 80.07,
      • 1.66,
      • 81.94,
      • 3,
      • 75.51,
      • 74.85,
      • 2.89,
      • 77.85,
      • 2,
      • 72.44,
      • 70.8,
      • 16.68,
      • 92.4,
      • 3,
      • 65.44,
      • 62.35,
      • 16.2,
      • 74.46,
      • 2,
      • 54.52,
      • 54.41,
      • 35.12,
      • 99.75,
      • 3,
      • 40.89,
      • 39.89,
      • 2.43,
      • 41.6,
      • 1,
      • 35.46,
      • 34.9,
      • 1.23,
      • 35.77,
      • 2,
      • 39.12,
      • 38.44,
      • 1.61,
      • 39.57,
      • 1,
      • 55.38,
      • 52.58,
      • 9.77,
      • 59.49,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 3,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.3-70B-Instruct</a>",
      • 57.93,
      • 80.79,
      • 80.25,
      • 2.38,
      • 83.5,
      • 3,
      • 79.55,
      • 78.98,
      • 2.57,
      • 82.19,
      • 4,
      • 77.82,
      • 77.4,
      • 1.44,
      • 79.45,
      • 2,
      • 70.72,
      • 68.83,
      • 17.56,
      • 91.8,
      • 3,
      • 67.15,
      • 65.01,
      • 9.36,
      • 72.89,
      • 2,
      • 54.11,
      • 53.66,
      • 35.03,
      • 99,
      • 3,
      • 46.8,
      • 45.77,
      • 2.79,
      • 47.74,
      • 1,
      • 21.24,
      • 20.94,
      • 0.54,
      • 21.32,
      • 2,
      • 44.58,
      • 44.53,
      • 0.13,
      • 44.62,
      • 1,
      • 36.52,
      • 36.51,
      • 0.03,
      • 36.53,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 4,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Small-24B-Instruct-2501</a> ๐Ÿ”ต๐Ÿ”ต๐Ÿ†",
      • 57.67,
      • 82,
      • 81.71,
      • 1.61,
      • 83.5,
      • 6,
      • 75.95,
      • 75.1,
      • 3.31,
      • 79.21,
      • 3,
      • 72.9,
      • 72.1,
      • 3.35,
      • 75.36,
      • 3,
      • 69.77,
      • 67.73,
      • 18.33,
      • 91.4,
      • 3,
      • 71.65,
      • 70.86,
      • 3.21,
      • 73.89,
      • 4,
      • 53.79,
      • 53.45,
      • 35.38,
      • 99.25,
      • 3,
      • 41.89,
      • 41.15,
      • 1.81,
      • 42.43,
      • 1,
      • 29.7,
      • 28.42,
      • 2.58,
      • 30.25,
      • 2,
      • 38.16,
      • 38.09,
      • 0.17,
      • 38.21,
      • 1,
      • 40.9,
      • 40.18,
      • 1.73,
      • 41.41,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 24,
      • 0,
      • true,
      • ""
      ],
    • [
      • 5,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-27b-it</a>",
      • 57.42,
      • 81.13,
      • 80.75,
      • 1.96,
      • 83,
      • 4,
      • 80.3,
      • 80.18,
      • 0.47,
      • 80.8,
      • 2,
      • 75.87,
      • 75.33,
      • 2.32,
      • 77.77,
      • 4,
      • 73.91,
      • 72.1,
      • 13.63,
      • 89.8,
      • 3,
      • 66.49,
      • 64.51,
      • 8.29,
      • 71.46,
      • 4,
      • 53.13,
      • 52.33,
      • 35.57,
      • 98.25,
      • 3,
      • 36.08,
      • 35.64,
      • 0.98,
      • 36.33,
      • 2,
      • 20.02,
      • 19.72,
      • 0.52,
      • 20.09,
      • 1,
      • 38.76,
      • 38.72,
      • 0.11,
      • 38.79,
      • 2,
      • 48.48,
      • 46.34,
      • 6.14,
      • 50.68,
      • 2,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 6,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-72B-Instruct</a>",
      • 57.36,
      • 85.07,
      • 85,
      • 0.45,
      • 85.5,
      • 6,
      • 77.68,
      • 77.6,
      • 0.42,
      • 77.97,
      • 6,
      • 76.38,
      • 75.89,
      • 1.67,
      • 78.11,
      • 3,
      • 69.3,
      • 67.9,
      • 20.69,
      • 94.8,
      • 3,
      • 72.06,
      • 71.27,
      • 2.46,
      • 74.34,
      • 1,
      • 54,
      • 53.66,
      • 35.32,
      • 99.25,
      • 3,
      • 38.03,
      • 37.94,
      • 0.2,
      • 38.08,
      • 2,
      • 24.49,
      • 24.15,
      • 0.64,
      • 24.6,
      • 1,
      • 38.91,
      • 38.76,
      • 0.34,
      • 39,
      • 2,
      • 37.65,
      • 37.51,
      • 0.31,
      • 37.73,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 73,
      • 0,
      • true,
      • ""
      ],
    • [
      • 7,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-14B-Instruct-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-14B-Instruct-1M</a>",
      • 54.72,
      • 85.02,
      • 84.79,
      • 1.8,
      • 86.5,
      • 4,
      • 73.17,
      • 71.75,
      • 5.41,
      • 78.26,
      • 3,
      • 72.8,
      • 71.8,
      • 5.46,
      • 75.96,
      • 4,
      • 59.75,
      • 55.4,
      • 23.32,
      • 85.6,
      • 3,
      • 63.59,
      • 61.7,
      • 4.86,
      • 67.52,
      • 6,
      • 52.61,
      • 52.37,
      • 36.41,
      • 99.5,
      • 4,
      • 35.03,
      • 34.65,
      • 0.84,
      • 35.24,
      • 1,
      • 25.9,
      • 25.51,
      • 0.75,
      • 26.04,
      • 2,
      • 35.1,
      • 35.03,
      • 0.17,
      • 35.14,
      • 1,
      • 44.23,
      • 42.63,
      • 4.14,
      • 45.56,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 8,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-12b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-12b-it</a>",
      • 53.92,
      • 79.52,
      • 79.12,
      • 1.4,
      • 81.25,
      • 1,
      • 78.34,
      • 78.17,
      • 0.64,
      • 78.98,
      • 6,
      • 71.4,
      • 70.14,
      • 4.61,
      • 75.24,
      • 4,
      • 65.9,
      • 62.5,
      • 16.97,
      • 84.6,
      • 4,
      • 66.5,
      • 64.94,
      • 6.14,
      • 70.16,
      • 3,
      • 52.35,
      • 52,
      • 36.41,
      • 99.25,
      • 3,
      • 26.77,
      • 26.64,
      • 0.23,
      • 26.81,
      • 2,
      • 18.93,
      • 18.74,
      • 0.34,
      • 18.98,
      • 2,
      • 37.86,
      • 37.82,
      • 0.1,
      • 37.89,
      • 1,
      • 41.59,
      • 41.46,
      • 0.33,
      • 41.69,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 13,
      • 0,
      • true,
      • ""
      ],
    • [
      • 9,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-27b-it</a>",
      • 53.86,
      • 78.72,
      • 78.46,
      • 0.98,
      • 79.75,
      • 5,
      • 74.26,
      • 73.84,
      • 1.05,
      • 75.56,
      • 1,
      • 74.14,
      • 73.42,
      • 3.33,
      • 76.47,
      • 4,
      • 68.47,
      • 65.63,
      • 15.47,
      • 86.2,
      • 3,
      • 68.56,
      • 66.82,
      • 6.31,
      • 73.31,
      • 4,
      • 52.37,
      • 51.33,
      • 35.96,
      • 97.76,
      • 3,
      • 28.52,
      • 28.5,
      • 0.04,
      • 28.53,
      • 2,
      • 18.66,
      • 18.51,
      • 0.26,
      • 18.69,
      • 2,
      • 38.33,
      • 38.23,
      • 0.24,
      • 38.4,
      • 2,
      • 36.56,
      • 36.56,
      • 0,
      • 36.56,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 10,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-9b-it</a> ๐Ÿ”ต๐Ÿ†",
      • 53.64,
      • 79.97,
      • 79.62,
      • 2.35,
      • 81.5,
      • 2,
      • 72.85,
      • 72.49,
      • 1.06,
      • 73.89,
      • 1,
      • 71.99,
      • 70.71,
      • 4.79,
      • 76.03,
      • 3,
      • 59.96,
      • 54.9,
      • 16.19,
      • 75.8,
      • 3,
      • 57.79,
      • 55.64,
      • 5.34,
      • 61.17,
      • 3,
      • 51.76,
      • 51.16,
      • 36.67,
      • 98.75,
      • 3,
      • 22.36,
      • 21.97,
      • 0.71,
      • 22.47,
      • 2,
      • 30.43,
      • 29.12,
      • 2.68,
      • 31.02,
      • 1,
      • 38.03,
      • 37.92,
      • 0.24,
      • 38.09,
      • 1,
      • 51.26,
      • 50.98,
      • 0.81,
      • 51.56,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 10,
      • 0,
      • true,
      • ""
      ],
    • [
      • 11,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-7B-Instruct</a>",
      • 52.62,
      • 83.65,
      • 83.42,
      • 0.82,
      • 85,
      • 2,
      • 74.16,
      • 73.77,
      • 1.12,
      • 75.35,
      • 4,
      • 68.51,
      • 67.14,
      • 5.8,
      • 72.04,
      • 4,
      • 53.92,
      • 46.93,
      • 21.82,
      • 75.8,
      • 4,
      • 62.7,
      • 60.38,
      • 5.38,
      • 67.5,
      • 6,
      • 52.46,
      • 51.87,
      • 36.22,
      • 98.75,
      • 4,
      • 28.17,
      • 27.85,
      • 0.62,
      • 28.29,
      • 2,
      • 30.79,
      • 30.36,
      • 0.88,
      • 30.98,
      • 2,
      • 35.18,
      • 34.45,
      • 1.6,
      • 35.58,
      • 1,
      • 36.69,
      • 36.64,
      • 0.11,
      • 36.72,
      • 2,
      • "Qwen2ForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 12,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3n-E4B-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3n-E4B-it</a>",
      • 52.4,
      • 73.38,
      • 72.08,
      • 4.01,
      • 78,
      • 2,
      • 76.34,
      • 76.02,
      • 1.37,
      • 77.47,
      • 6,
      • 68.27,
      • 66.55,
      • 6.05,
      • 72.92,
      • 4,
      • 58.24,
      • 53.1,
      • 13.18,
      • 70.6,
      • 4,
      • 62.63,
      • 60.55,
      • 9.3,
      • 66.81,
      • 4,
      • 52.1,
      • 47.55,
      • 31.75,
      • 89.03,
      • 3,
      • 25.27,
      • 25.18,
      • 0.16,
      • 25.3,
      • 1,
      • 25.8,
      • 25.35,
      • 0.84,
      • 25.95,
      • 1,
      • 32.35,
      • 31.9,
      • 0.93,
      • 32.56,
      • 2,
      • 49.65,
      • 49.29,
      • 1.03,
      • 50.02,
      • 1,
      • "Gemma3nForConditionalGeneration",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 13,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-72B-Instruct</a> ๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต๐ŸŽ–๏ธ",
      • 52.38,
      • 75.16,
      • 73.46,
      • 13.98,
      • 84.25,
      • 3,
      • 57.41,
      • 51.56,
      • 20.98,
      • 77.2,
      • 4,
      • 71.56,
      • 69.86,
      • 6.22,
      • 77.42,
      • 4,
      • 59.36,
      • 56.77,
      • 28.65,
      • 92.8,
      • 3,
      • 56.32,
      • 51.14,
      • 17.02,
      • 66.67,
      • 5,
      • 53.42,
      • 53.08,
      • 35.39,
      • 99.25,
      • 4,
      • 50.24,
      • 50.1,
      • 0.37,
      • 50.37,
      • 1,
      • 25.84,
      • 25.12,
      • 1.39,
      • 26.1,
      • 1,
      • 32.87,
      • 31.83,
      • 2.21,
      • 33.4,
      • 2,
      • 41.6,
      • 39.77,
      • 4.54,
      • 42.98,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 73,
      • 0,
      • true,
      • ""
      ],
    • [
      • 14,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/microsoft/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">phi-4</a>",
      • 51.84,
      • 76.84,
      • 76.21,
      • 3.56,
      • 79.25,
      • 3,
      • 75.21,
      • 74.44,
      • 2.76,
      • 77.95,
      • 4,
      • 69.2,
      • 67.08,
      • 6.48,
      • 75.89,
      • 4,
      • 61.91,
      • 57.87,
      • 21.1,
      • 85.2,
      • 4,
      • 59.38,
      • 57.17,
      • 4.21,
      • 63.16,
      • 4,
      • 54.3,
      • 53.41,
      • 34.45,
      • 98,
      • 4,
      • 35.1,
      • 35.1,
      • 0.02,
      • 35.11,
      • 2,
      • 21.24,
      • 21.06,
      • 0.33,
      • 21.29,
      • 1,
      • 28.05,
      • 27.93,
      • 0.25,
      • 28.1,
      • 2,
      • 37.15,
      • 36.83,
      • 0.72,
      • 37.34,
      • 1,
      • "Phi3ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 15,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-SuperNova-Lite</a>",
      • 51.71,
      • 74.19,
      • 73.62,
      • 2.64,
      • 76,
      • 3,
      • 75.24,
      • 74.23,
      • 3.64,
      • 79.03,
      • 4,
      • 69.24,
      • 67.58,
      • 5.12,
      • 73.93,
      • 4,
      • 54.83,
      • 48.43,
      • 15.87,
      • 69.2,
      • 4,
      • 62.58,
      • 61.35,
      • 2.04,
      • 64.86,
      • 6,
      • 54.43,
      • 51,
      • 31.2,
      • 91.52,
      • 3,
      • 29.5,
      • 29.28,
      • 0.43,
      • 29.59,
      • 2,
      • 21.6,
      • 21.38,
      • 0.39,
      • 21.66,
      • 2,
      • 39.49,
      • 38.24,
      • 2.96,
      • 40.34,
      • 2,
      • 36.02,
      • 35.67,
      • 0.76,
      • 36.21,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 16,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.1-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">granite-3.1-8b-instruct</a>",
      • 51.7,
      • 72.23,
      • 70.92,
      • 5.97,
      • 76.5,
      • 4,
      • 74.76,
      • 74.47,
      • 0.84,
      • 75.67,
      • 1,
      • 66.88,
      • 64.05,
      • 8.78,
      • 75.68,
      • 4,
      • 54.23,
      • 48.17,
      • 13.29,
      • 66,
      • 4,
      • 65.03,
      • 63.71,
      • 5.26,
      • 67.83,
      • 4,
      • 53.49,
      • 52.58,
      • 35.09,
      • 98,
      • 3,
      • 25.74,
      • 25.6,
      • 0.26,
      • 25.79,
      • 1,
      • 32.88,
      • 32.36,
      • 1.1,
      • 33.14,
      • 1,
      • 31.8,
      • 30.59,
      • 2.53,
      • 32.38,
      • 2,
      • 39.91,
      • 39.7,
      • 0.52,
      • 40.06,
      • 1,
      • "GraniteForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 17,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-4-Scout-17B-16E-Instruct</a>",
      • 51.65,
      • 71.24,
      • 68.92,
      • 12.62,
      • 81.5,
      • 4,
      • 69.34,
      • 67,
      • 9.58,
      • 77.35,
      • 4,
      • 68.16,
      • 65.82,
      • 6.9,
      • 75.27,
      • 4,
      • 68.17,
      • 66.1,
      • 20.14,
      • 92,
      • 3,
      • 62.82,
      • 60.82,
      • 7.79,
      • 66.85,
      • 5,
      • 53.97,
      • 53.41,
      • 34.93,
      • 98.75,
      • 3,
      • 39.29,
      • 38.46,
      • 1.95,
      • 39.84,
      • 1,
      • 28.24,
      • 28.16,
      • 0.15,
      • 28.27,
      • 2,
      • 21.59,
      • 16.62,
      • 9.15,
      • 23.08,
      • 1,
      • 33.69,
      • 31.78,
      • 4.12,
      • 34.7,
      • 1,
      • "Llama4ForConditionalGeneration",
      • "?",
      • 109,
      • 0,
      • true,
      • ""
      ],
    • [
      • 18,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-Tulu-3-70B</a>",
      • 51.6,
      • 71.71,
      • 69.67,
      • 11.26,
      • 79.75,
      • 2,
      • 59.89,
      • 54.84,
      • 16.28,
      • 75.25,
      • 4,
      • 64.86,
      • 61.08,
      • 27.3,
      • 81.43,
      • 3,
      • 65.16,
      • 62.6,
      • 21.94,
      • 91,
      • 3,
      • 58.64,
      • 54.24,
      • 21.7,
      • 67.96,
      • 4,
      • 54.46,
      • 53.57,
      • 34.22,
      • 98,
      • 4,
      • 46.45,
      • 46.33,
      • 0.33,
      • 46.56,
      • 2,
      • 29.25,
      • 29.14,
      • 0.23,
      • 29.3,
      • 1,
      • 30.14,
      • 28.84,
      • 2.64,
      • 30.71,
      • 2,
      • 35.47,
      • 35.05,
      • 0.93,
      • 35.71,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 19,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Small-24B-Instruct-2501</a> ๐Ÿ”ต๐Ÿ”ต๐ŸŽ–๏ธ",
      • 51.3,
      • 69.06,
      • 66.21,
      • 13.7,
      • 82.25,
      • 4,
      • 62.91,
      • 58.64,
      • 20.07,
      • 78.32,
      • 3,
      • 67.33,
      • 64.95,
      • 5.31,
      • 74.19,
      • 4,
      • 61.59,
      • 58.53,
      • 24.89,
      • 90.4,
      • 3,
      • 60.31,
      • 56.56,
      • 16.96,
      • 68.43,
      • 2,
      • 54.55,
      • 53.45,
      • 33.74,
      • 97.51,
      • 4,
      • 52.67,
      • 51.95,
      • 2.18,
      • 53.49,
      • 2,
      • 31.86,
      • 31.83,
      • 0.06,
      • 31.87,
      • 1,
      • 18.61,
      • 17.66,
      • 1.65,
      • 18.83,
      • 2,
      • 34.14,
      • 33.6,
      • 1.17,
      • 34.43,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 24,
      • 0,
      • true,
      • ""
      ],
    • [
      • 20,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3-medium-4k-instruct</a>",
      • 51.22,
      • 76.27,
      • 75.25,
      • 6.22,
      • 80.5,
      • 4,
      • 70.48,
      • 69.85,
      • 2.03,
      • 72.1,
      • 3,
      • 62.83,
      • 58.75,
      • 22.01,
      • 75.2,
      • 3,
      • 61.47,
      • 56.9,
      • 18.75,
      • 81.2,
      • 4,
      • 68.92,
      • 67.56,
      • 5.96,
      • 72.47,
      • 2,
      • 52.17,
      • 45.47,
      • 25.69,
      • 81.3,
      • 4,
      • 35.87,
      • 35.46,
      • 0.93,
      • 36.11,
      • 1,
      • 21.59,
      • 21.35,
      • 0.42,
      • 21.65,
      • 2,
      • 25.67,
      • 25.55,
      • 0.23,
      • 25.71,
      • 2,
      • 36.95,
      • 36.78,
      • 0.36,
      • 37.04,
      • 2,
      • "Phi3ForCausalLM",
      • "?",
      • 14,
      • 0,
      • true,
      • ""
      ],
    • [
      • 21,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Large-Instruct-2411" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Large-Instruct-2411</a>",
      • 50.86,
      • 68.53,
      • 65.58,
      • 13.21,
      • 81.75,
      • 4,
      • 65.91,
      • 62.37,
      • 17.69,
      • 80.46,
      • 4,
      • 69.06,
      • 66.95,
      • 6.29,
      • 75.58,
      • 4,
      • 67.64,
      • 65.73,
      • 21.69,
      • 93,
      • 3,
      • 69.63,
      • 67.68,
      • 5.89,
      • 75.73,
      • 2,
      • 54.69,
      • 54.24,
      • 34.39,
      • 99,
      • 3,
      • 21.8,
      • 18.56,
      • 5.93,
      • 22.75,
      • 1,
      • 30.77,
      • 30.58,
      • 0.38,
      • 30.85,
      • 2,
      • 16.06,
      • 14.17,
      • 3.21,
      • 16.44,
      • 1,
      • 44.55,
      • 44.07,
      • 1.23,
      • 44.94,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 123,
      • 0,
      • true,
      • ""
      ],
    • [
      • 22,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-27b-it</a>",
      • 50.6,
      • 68.52,
      • 66.25,
      • 8.97,
      • 75.5,
      • 4,
      • 67.05,
      • 65.61,
      • 6.64,
      • 70.48,
      • 3,
      • 61.78,
      • 59.68,
      • 5.88,
      • 65.82,
      • 4,
      • 62.01,
      • 57.67,
      • 19.69,
      • 82.6,
      • 4,
      • 55.52,
      • 49.62,
      • 24,
      • 68.14,
      • 6,
      • 52.52,
      • 51.71,
      • 35.19,
      • 98.25,
      • 4,
      • 36.04,
      • 34.61,
      • 3.22,
      • 36.88,
      • 1,
      • 29.98,
      • 29.43,
      • 1.12,
      • 30.22,
      • 2,
      • 36.61,
      • 36.47,
      • 0.32,
      • 36.69,
      • 1,
      • 36.02,
      • 35.78,
      • 0.52,
      • 36.15,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 23,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Meta-Llama-3.1-8B-Instruct</a>",
      • 50.37,
      • 70.53,
      • 69.04,
      • 7.9,
      • 75,
      • 4,
      • 73.77,
      • 72.72,
      • 3.67,
      • 77.35,
      • 4,
      • 65.72,
      • 63.22,
      • 8.24,
      • 72.22,
      • 4,
      • 53.89,
      • 46.97,
      • 17.77,
      • 70.2,
      • 4,
      • 59.03,
      • 57.65,
      • 3.2,
      • 61.2,
      • 2,
      • 53.73,
      • 51.16,
      • 33,
      • 94.01,
      • 3,
      • 30.32,
      • 29.76,
      • 1.15,
      • 30.57,
      • 2,
      • 20.44,
      • 20.2,
      • 0.43,
      • 20.5,
      • 2,
      • 40.3,
      • 39.7,
      • 1.44,
      • 40.72,
      • 2,
      • 35.99,
      • 35.89,
      • 0.22,
      • 36.05,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 24,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3.5-mini-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3.5-mini-instruct</a>",
      • 50.06,
      • 77.35,
      • 76.92,
      • 2.32,
      • 79,
      • 2,
      • 72.81,
      • 72.21,
      • 2.05,
      • 74.57,
      • 5,
      • 67.05,
      • 65.31,
      • 6.51,
      • 71.39,
      • 4,
      • 53.19,
      • 46.5,
      • 14.96,
      • 66.4,
      • 3,
      • 65.33,
      • 63.97,
      • 3.54,
      • 68.25,
      • 6,
      • 52.72,
      • 47.92,
      • 29.83,
      • 88.03,
      • 4,
      • 24.07,
      • 23.86,
      • 0.4,
      • 24.14,
      • 2,
      • 19.85,
      • 19.71,
      • 0.25,
      • 19.89,
      • 1,
      • 31.98,
      • 31.5,
      • 1.01,
      • 32.21,
      • 1,
      • 36.25,
      • 36.15,
      • 0.21,
      • 36.3,
      • 2,
      • "Phi3ForCausalLM",
      • "?",
      • 4,
      • 0,
      • true,
      • ""
      ],
    • [
      • 25,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-27b-it</a>",
      • 49.89,
      • 68.77,
      • 65.96,
      • 15.57,
      • 80,
      • 4,
      • 67.84,
      • 65.76,
      • 9.22,
      • 73.63,
      • 4,
      • 66.66,
      • 65.32,
      • 3.51,
      • 69.74,
      • 4,
      • 64.85,
      • 62.13,
      • 23.01,
      • 90.4,
      • 3,
      • 40.61,
      • 27.58,
      • 30.52,
      • 66.58,
      • 5,
      • 53.27,
      • 52.58,
      • 35.48,
      • 98.5,
      • 3,
      • 42.41,
      • 38.66,
      • 9.74,
      • 45.55,
      • 2,
      • 29.77,
      • 29.56,
      • 0.43,
      • 29.86,
      • 2,
      • 32.51,
      • 32.32,
      • 0.39,
      • 32.6,
      • 2,
      • 32.25,
      • 30.52,
      • 3.64,
      • 33.1,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 26,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3-8b-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8b-Ita</a>",
      • 49.41,
      • 72.61,
      • 71.54,
      • 3.77,
      • 76,
      • 1,
      • 71.9,
      • 70.46,
      • 4.33,
      • 76.59,
      • 4,
      • 68.95,
      • 67.49,
      • 3.78,
      • 72.86,
      • 4,
      • 52.96,
      • 46.33,
      • 14.93,
      • 65.6,
      • 3,
      • 57.04,
      • 54.37,
      • 5.33,
      • 61.26,
      • 1,
      • 53.68,
      • 51.41,
      • 33.29,
      • 94.76,
      • 4,
      • 25.08,
      • 24.86,
      • 0.42,
      • 25.16,
      • 1,
      • 24.3,
      • 24.26,
      • 0.07,
      • 24.31,
      • 2,
      • 30.98,
      • 30.27,
      • 1.47,
      • 31.31,
      • 2,
      • 36.56,
      • 36.56,
      • 0,
      • 36.56,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 27,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-3-ANITA-8B-Inst-DPO-ITA</a>",
      • 49.39,
      • 72.91,
      • 71.75,
      • 3.34,
      • 76.75,
      • 1,
      • 72.07,
      • 70.82,
      • 3.81,
      • 76.05,
      • 4,
      • 70.99,
      • 69.88,
      • 2.93,
      • 74.19,
      • 4,
      • 52.81,
      • 46.4,
      • 13.88,
      • 64.4,
      • 4,
      • 56.82,
      • 52.88,
      • 10.47,
      • 63.74,
      • 1,
      • 56.55,
      • 52.41,
      • 27.9,
      • 88.53,
      • 3,
      • 23.26,
      • 22.92,
      • 0.64,
      • 23.37,
      • 1,
      • 29.1,
      • 28.92,
      • 0.37,
      • 29.18,
      • 2,
      • 17.2,
      • 15.77,
      • 2.46,
      • 17.51,
      • 2,
      • 42.14,
      • 41.31,
      • 2.05,
      • 42.76,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 28,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mii-llm/maestrale-chat-v0.4-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maestrale-chat-v0.4-beta</a>",
      • 49.37,
      • 73.89,
      • 72.62,
      • 6.4,
      • 78.5,
      • 4,
      • 72.23,
      • 71.75,
      • 1.77,
      • 73.57,
      • 2,
      • 66.86,
      • 64.99,
      • 8.41,
      • 71.56,
      • 3,
      • 55.62,
      • 50.7,
      • 10.77,
      • 64.6,
      • 3,
      • 60.35,
      • 59.48,
      • 2.2,
      • 61.76,
      • 6,
      • 51.66,
      • 43.81,
      • 22.67,
      • 74.06,
      • 3,
      • 22.54,
      • 22.36,
      • 0.32,
      • 22.59,
      • 1,
      • 21.18,
      • 21.18,
      • 0.01,
      • 21.18,
      • 2,
      • 33.53,
      • 33.24,
      • 0.62,
      • 33.68,
      • 1,
      • 35.81,
      • 35.36,
      • 0.98,
      • 36.06,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 29,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aya-expanse-8b</a>",
      • 49.3,
      • 72.91,
      • 71.88,
      • 4.42,
      • 76.25,
      • 2,
      • 72.28,
      • 71.9,
      • 1.11,
      • 73.3,
      • 5,
      • 67.71,
      • 65.83,
      • 5.06,
      • 72.71,
      • 4,
      • 52.54,
      • 46.27,
      • 13.05,
      • 63.4,
      • 4,
      • 62.02,
      • 59.58,
      • 4.03,
      • 66.98,
      • 3,
      • 54.39,
      • 51.87,
      • 31.88,
      • 94.01,
      • 4,
      • 20.55,
      • 20.37,
      • 0.33,
      • 20.6,
      • 2,
      • 18.32,
      • 18.27,
      • 0.08,
      • 18.33,
      • 1,
      • 35.73,
      • 35.67,
      • 0.13,
      • 35.77,
      • 2,
      • 36.56,
      • 36.56,
      • 0,
      • 36.56,
      • 1,
      • "CohereForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 30,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.3-70B-Instruct</a>",
      • 47.69,
      • 68.2,
      • 65.29,
      • 10.42,
      • 79.5,
      • 4,
      • 63.44,
      • 59.3,
      • 18.97,
      • 80.31,
      • 4,
      • 70.17,
      • 67.84,
      • 8.28,
      • 78.84,
      • 3,
      • 61.73,
      • 58.93,
      • 25.42,
      • 91.4,
      • 3,
      • 57.84,
      • 52.8,
      • 19.32,
      • 69.15,
      • 2,
      • 54.3,
      • 53.74,
      • 34.58,
      • 98.75,
      • 3,
      • 28.41,
      • 23.67,
      • 9.62,
      • 30.48,
      • 2,
      • 24.44,
      • 24.12,
      • 0.6,
      • 24.54,
      • 2,
      • 20.81,
      • 18.96,
      • 3.32,
      • 21.31,
      • 1,
      • 27.56,
      • 27.26,
      • 0.59,
      • 27.67,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 31,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-9b-it</a> ๐Ÿ”ต๐ŸŽ–๏ธ",
      • 47.54,
      • 71.99,
      • 70.12,
      • 8.85,
      • 79,
      • 2,
      • 64.35,
      • 61.74,
      • 11.1,
      • 70.63,
      • 4,
      • 64.22,
      • 61.92,
      • 7.4,
      • 69.42,
      • 4,
      • 57.94,
      • 52.27,
      • 19.76,
      • 77.8,
      • 4,
      • 42.48,
      • 30.36,
      • 28.96,
      • 64.67,
      • 5,
      • 51.53,
      • 50.58,
      • 34.19,
      • 98,
      • 4,
      • 25.45,
      • 25.29,
      • 0.3,
      • 25.5,
      • 2,
      • 30.48,
      • 30.1,
      • 0.78,
      • 30.65,
      • 2,
      • 32.01,
      • 31.63,
      • 0.8,
      • 32.2,
      • 1,
      • 34.97,
      • 34.56,
      • 0.9,
      • 35.19,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 10,
      • 0,
      • true,
      • ""
      ],
    • [
      • 32,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-12b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-12b-it</a>",
      • 47.35,
      • 71.3,
      • 69.25,
      • 13.64,
      • 79,
      • 4,
      • 61.91,
      • 57.59,
      • 16.98,
      • 74.6,
      • 4,
      • 67.19,
      • 65.51,
      • 4.28,
      • 71.4,
      • 3,
      • 58.57,
      • 53.63,
      • 23.4,
      • 83.4,
      • 4,
      • 46.06,
      • 35.76,
      • 22.92,
      • 66.76,
      • 5,
      • 52.91,
      • 52.33,
      • 35.77,
      • 98.75,
      • 3,
      • 21.85,
      • 20.17,
      • 3.05,
      • 22.33,
      • 2,
      • 29.93,
      • 29.46,
      • 0.95,
      • 30.13,
      • 2,
      • 31.5,
      • 29.49,
      • 4.22,
      • 32.47,
      • 1,
      • 32.23,
      • 32.16,
      • 0.16,
      • 32.27,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 13,
      • 0,
      • true,
      • ""
      ],
    • [
      • 33,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-7B-Instruct-v0.3</a>",
      • 47.31,
      • 74.4,
      • 73.5,
      • 3.09,
      • 77.5,
      • 2,
      • 69.31,
      • 67.96,
      • 5.46,
      • 72.97,
      • 6,
      • 64.44,
      • 63.62,
      • 1.73,
      • 66.03,
      • 4,
      • 49.33,
      • 43.33,
      • 10.84,
      • 57.4,
      • 3,
      • 53.87,
      • 47.58,
      • 15.97,
      • 66.16,
      • 4,
      • 52.92,
      • 51.46,
      • 34.9,
      • 96.76,
      • 3,
      • 18.21,
      • 18.16,
      • 0.08,
      • 18.22,
      • 1,
      • 20.59,
      • 20.3,
      • 0.51,
      • 20.66,
      • 2,
      • 33.18,
      • 30.65,
      • 5.48,
      • 34.52,
      • 2,
      • 36.89,
      • 36.85,
      • 0.1,
      • 36.92,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 34,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/DeepMount00/Lexora-Medium-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lexora-Medium-7B</a>",
      • 46.93,
      • 71.85,
      • 69.54,
      • 11.89,
      • 83.75,
      • 4,
      • 57.37,
      • 51.92,
      • 16.82,
      • 70.24,
      • 4,
      • 64.32,
      • 62.94,
      • 3.07,
      • 67.16,
      • 3,
      • 50.59,
      • 42.37,
      • 23.91,
      • 74,
      • 3,
      • 62.42,
      • 60.14,
      • 7.94,
      • 67.04,
      • 2,
      • 53.55,
      • 52.54,
      • 34.93,
      • 97.76,
      • 3,
      • 40.88,
      • 40.46,
      • 1,
      • 41.17,
      • 2,
      • 28.36,
      • 27.76,
      • 1.18,
      • 28.6,
      • 2,
      • 27.52,
      • 27.28,
      • 0.46,
      • 27.61,
      • 2,
      • 12.46,
      • 10.19,
      • 3.68,
      • 12.79,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 35,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-4b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-4b-it</a>",
      • 46.57,
      • 68.28,
      • 66.46,
      • 5.92,
      • 73.25,
      • 2,
      • 75.3,
      • 74.96,
      • 1.38,
      • 76.37,
      • 4,
      • 65.02,
      • 62.43,
      • 7.17,
      • 71.53,
      • 4,
      • 51.92,
      • 46.53,
      • 10.36,
      • 60,
      • 4,
      • 49.02,
      • 41.31,
      • 16.77,
      • 61.17,
      • 6,
      • 51.66,
      • 50.04,
      • 35.71,
      • 96.51,
      • 3,
      • 14.55,
      • 14.06,
      • 0.83,
      • 14.64,
      • 1,
      • 17.96,
      • 17.95,
      • 0.01,
      • 17.96,
      • 1,
      • 36,
      • 35.79,
      • 0.47,
      • 36.12,
      • 1,
      • 36.02,
      • 35.98,
      • 0.08,
      • 36.04,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 36,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-7B-Instruct</a>",
      • 45.5,
      • 75.21,
      • 73.67,
      • 13.22,
      • 82.5,
      • 4,
      • 57.18,
      • 51.62,
      • 13.4,
      • 70.43,
      • 4,
      • 64.56,
      • 61.54,
      • 7.85,
      • 72.54,
      • 4,
      • 49.73,
      • 41.03,
      • 22.65,
      • 70.4,
      • 3,
      • 45.89,
      • 35.48,
      • 24.69,
      • 67.11,
      • 5,
      • 52.83,
      • 51.58,
      • 33.77,
      • 97.26,
      • 4,
      • 32.96,
      • 32.42,
      • 1.12,
      • 33.22,
      • 2,
      • 25.39,
      • 24.9,
      • 0.93,
      • 25.56,
      • 2,
      • 14.97,
      • 10.24,
      • 7.94,
      • 15.86,
      • 2,
      • 36.26,
      • 36.11,
      • 0.35,
      • 36.35,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 37,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/FairMind/Llama-3-8B-4bit-UltraChat-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8B-4bit-UltraChat-Ita</a>",
      • 44.93,
      • 65.05,
      • 62.62,
      • 7.72,
      • 71,
      • 1,
      • 69.44,
      • 68.97,
      • 1.12,
      • 70.56,
      • 4,
      • 61.51,
      • 60.03,
      • 3.48,
      • 64.16,
      • 4,
      • 47.01,
      • 39.57,
      • 12.67,
      • 56.8,
      • 3,
      • 52.79,
      • 47.59,
      • 10.75,
      • 60.88,
      • 3,
      • 52.88,
      • 48.26,
      • 29.59,
      • 88.53,
      • 4,
      • 22.47,
      • 21.8,
      • 1.22,
      • 22.67,
      • 2,
      • 21.41,
      • 21.25,
      • 0.28,
      • 21.45,
      • 1,
      • 29.77,
      • 29.17,
      • 1.21,
      • 30.02,
      • 2,
      • 27,
      • 26.27,
      • 1.41,
      • 27.27,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 38,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-4b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-4b-it</a>",
      • 44.59,
      • 66.22,
      • 64.88,
      • 5.21,
      • 69.25,
      • 3,
      • 60.41,
      • 57.66,
      • 8.12,
      • 65.65,
      • 5,
      • 59.8,
      • 57.32,
      • 10.14,
      • 64.25,
      • 5,
      • 46.01,
      • 38.87,
      • 12.57,
      • 54.6,
      • 3,
      • 50.8,
      • 44.13,
      • 21.85,
      • 61.44,
      • 5,
      • 52.55,
      • 48.67,
      • 32.61,
      • 90.77,
      • 3,
      • 32.37,
      • 30.72,
      • 3.47,
      • 33.18,
      • 1,
      • 29.24,
      • 28.93,
      • 0.62,
      • 29.37,
      • 2,
      • 29.76,
      • 28.92,
      • 1.71,
      • 30.13,
      • 2,
      • 18.7,
      • 18.34,
      • 0.62,
      • 18.78,
      • 2,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 39,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3.5-mini-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3.5-mini-instruct</a>",
      • 44.4,
      • 72.16,
      • 70.04,
      • 11.65,
      • 81.5,
      • 4,
      • 51.63,
      • 43.72,
      • 20.43,
      • 70.59,
      • 3,
      • 65.93,
      • 64.49,
      • 3.24,
      • 69.17,
      • 4,
      • 48.9,
      • 40.67,
      • 17.65,
      • 62.8,
      • 3,
      • 60.37,
      • 56.97,
      • 13.07,
      • 67.41,
      • 5,
      • 51.52,
      • 44.22,
      • 25.29,
      • 79.05,
      • 3,
      • 20.38,
      • 18.28,
      • 3.75,
      • 20.94,
      • 2,
      • 23.24,
      • 22.7,
      • 1,
      • 23.4,
      • 2,
      • 30.58,
      • 30.35,
      • 0.47,
      • 30.68,
      • 2,
      • 19.33,
      • 14.4,
      • 8.78,
      • 20.61,
      • 1,
      • "Phi3ForCausalLM",
      • "?",
      • 4,
      • 0,
      • true,
      • ""
      ],
    • [
      • 40,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-14B-Instruct-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-14B-Instruct-1M</a>",
      • 44.36,
      • 69.51,
      • 66.88,
      • 14.54,
      • 86.75,
      • 4,
      • 54.07,
      • 47.09,
      • 22.28,
      • 74.98,
      • 3,
      • 59.97,
      • 54.95,
      • 20.19,
      • 75.11,
      • 4,
      • 53.29,
      • 47.3,
      • 27.9,
      • 83.4,
      • 3,
      • 48.97,
      • 40.07,
      • 28.87,
      • 67.2,
      • 6,
      • 54.41,
      • 53.74,
      • 34.68,
      • 98.5,
      • 3,
      • 36.76,
      • 35.98,
      • 1.77,
      • 37.23,
      • 2,
      • 25.17,
      • 25.1,
      • 0.13,
      • 25.19,
      • 2,
      • 8.15,
      • 6.63,
      • 2.34,
      • 8.28,
      • 2,
      • 33.32,
      • 33.06,
      • 0.54,
      • 33.45,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 41,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zephyr-7b-beta</a>",
      • 43.84,
      • 70.69,
      • 69.46,
      • 3.73,
      • 74.25,
      • 2,
      • 69.9,
      • 69.11,
      • 1.52,
      • 71.92,
      • 6,
      • 60.37,
      • 57.29,
      • 7.15,
      • 66.49,
      • 4,
      • 49.31,
      • 43.3,
      • 10.2,
      • 57.4,
      • 3,
      • 40.25,
      • 30.87,
      • 11.92,
      • 49.42,
      • 4,
      • 42.79,
      • 34.37,
      • 9.46,
      • 51.87,
      • 4,
      • 11.45,
      • 11.42,
      • 0.04,
      • 11.45,
      • 2,
      • 24.26,
      • 23.94,
      • 0.59,
      • 24.36,
      • 2,
      • 32.82,
      • 32.72,
      • 0.2,
      • 32.87,
      • 1,
      • 36.53,
      • 36.47,
      • 0.13,
      • 36.56,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 42,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/MoxoffSpA/Volare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Volare</a>",
      • 43.73,
      • 66.86,
      • 65.17,
      • 5.77,
      • 71,
      • 2,
      • 68.39,
      • 66.88,
      • 4.66,
      • 72.33,
      • 4,
      • 49.59,
      • 45.34,
      • 9.24,
      • 54.73,
      • 4,
      • 48.92,
      • 41.83,
      • 13.46,
      • 59.2,
      • 3,
      • 44.42,
      • 36.76,
      • 15.28,
      • 53.1,
      • 1,
      • 49.76,
      • 41.36,
      • 21.25,
      • 75.06,
      • 4,
      • 28,
      • 27.25,
      • 1.47,
      • 28.29,
      • 1,
      • 17.48,
      • 17.37,
      • 0.19,
      • 17.5,
      • 2,
      • 31.55,
      • 31.49,
      • 0.12,
      • 31.57,
      • 2,
      • 32.34,
      • 31.3,
      • 2.18,
      • 32.85,
      • 1,
      • "GemmaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 43,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/occiglot/occiglot-7b-it-en-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">occiglot-7b-it-en-instruct</a>",
      • 43.69,
      • 64.34,
      • 61.83,
      • 6.5,
      • 70.25,
      • 2,
      • 69.04,
      • 68.43,
      • 1.37,
      • 70.48,
      • 4,
      • 61.77,
      • 58.3,
      • 11.61,
      • 69.78,
      • 4,
      • 54.16,
      • 49.43,
      • 9.4,
      • 61.8,
      • 4,
      • 55.72,
      • 51.2,
      • 11.26,
      • 63.65,
      • 3,
      • 51.6,
      • 43.68,
      • 22,
      • 73.07,
      • 4,
      • 5.6,
      • 5.4,
      • 0.3,
      • 5.61,
      • 2,
      • 21.25,
      • 20.55,
      • 1.26,
      • 21.44,
      • 2,
      • 25.15,
      • 24.93,
      • 0.41,
      • 25.22,
      • 2,
      • 28.28,
      • 27.2,
      • 2.14,
      • 28.71,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 44,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Almawave/Velvet-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Velvet-14B</a>",
      • 42.69,
      • 74.99,
      • 74.46,
      • 2.19,
      • 76.75,
      • 2,
      • 68.28,
      • 67.99,
      • 0.98,
      • 68.93,
      • 1,
      • 67.17,
      • 66.35,
      • 2.31,
      • 68.97,
      • 4,
      • 54.18,
      • 49.73,
      • 8.58,
      • 61.2,
      • 4,
      • 29.29,
      • 15.41,
      • 14.11,
      • 37.69,
      • 6,
      • 49.28,
      • 41.73,
      • 21.95,
      • 80.55,
      • 4,
      • 9.43,
      • 9.21,
      • 0.34,
      • 9.45,
      • 1,
      • 34.66,
      • 34.25,
      • 0.88,
      • 34.88,
      • 2,
      • 24.83,
      • 24.48,
      • 0.66,
      • 24.94,
      • 2,
      • 14.79,
      • 13.08,
      • 2.86,
      • 15.1,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 45,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-SuperNova-Lite</a>",
      • 42.66,
      • 70.06,
      • 68.54,
      • 6.62,
      • 74.5,
      • 4,
      • 66.85,
      • 63.8,
      • 12.28,
      • 77.12,
      • 4,
      • 56.41,
      • 52.21,
      • 12.37,
      • 63.8,
      • 5,
      • 51.47,
      • 43.63,
      • 19.37,
      • 68.6,
      • 3,
      • 50.8,
      • 42.88,
      • 24.72,
      • 66.58,
      • 6,
      • 54.89,
      • 51.16,
      • 30.2,
      • 90.52,
      • 4,
      • 23.74,
      • 22.04,
      • 3.17,
      • 24.29,
      • 2,
      • 22.77,
      • 22.74,
      • 0.05,
      • 22.78,
      • 2,
      • 9.98,
      • 8.86,
      • 1.76,
      • 10.11,
      • 2,
      • 19.61,
      • 17.82,
      • 3.17,
      • 20.06,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 46,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Fastweb/FastwebMIIA-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FastwebMIIA-7B</a>",
      • 42.56,
      • 59.24,
      • 56.79,
      • 4.22,
      • 63.5,
      • 2,
      • 58.62,
      • 56.18,
      • 5.14,
      • 62.72,
      • 3,
      • 54.45,
      • 49.7,
      • 10.75,
      • 62.31,
      • 3,
      • 49.48,
      • 44.47,
      • 8.35,
      • 55.8,
      • 3,
      • 56.57,
      • 52.09,
      • 11.01,
      • 64.81,
      • 3,
      • 48.87,
      • 40.07,
      • 19.11,
      • 65.84,
      • 3,
      • 11.89,
      • 11.49,
      • 0.64,
      • 11.94,
      • 1,
      • 28.1,
      • 27.83,
      • 0.52,
      • 28.2,
      • 2,
      • 29,
      • 28.43,
      • 1.14,
      • 29.24,
      • 1,
      • 29.4,
      • 28.16,
      • 2.51,
      • 29.93,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 47,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3-medium-4k-instruct</a>",
      • 42.09,
      • 66.01,
      • 62.58,
      • 11.76,
      • 78.5,
      • 3,
      • 55.31,
      • 48.78,
      • 19.52,
      • 72.58,
      • 3,
      • 65.19,
      • 62.35,
      • 8.99,
      • 72.75,
      • 4,
      • 54.11,
      • 47.67,
      • 25.17,
      • 80.2,
      • 3,
      • 53.5,
      • 46.92,
      • 30.89,
      • 66.67,
      • 1,
      • 53.24,
      • 48.67,
      • 30.21,
      • 88.53,
      • 4,
      • 0.34,
      • 0.31,
      • 0.04,
      • 0.34,
      • 2,
      • 27.64,
      • 26.52,
      • 2.21,
      • 28.08,
      • 2,
      • 18.6,
      • 17.48,
      • 1.97,
      • 18.87,
      • 1,
      • 27.01,
      • 26.99,
      • 0.02,
      • 27.01,
      • 2,
      • "Phi3ForCausalLM",
      • "?",
      • 14,
      • 0,
      • true,
      • ""
      ],
    • [
      • 48,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3n-E4B-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3n-E4B-it</a>",
      • 42.05,
      • 68.01,
      • 65.58,
      • 9.41,
      • 75.5,
      • 3,
      • 64.7,
      • 61.45,
      • 12.94,
      • 73.95,
      • 3,
      • 66.37,
      • 64.2,
      • 6.07,
      • 71.93,
      • 4,
      • 53.34,
      • 46.07,
      • 20.99,
      • 73.2,
      • 3,
      • 39.71,
      • 26.32,
      • 26.78,
      • 67.4,
      • 6,
      • 53.1,
      • 49.92,
      • 32.9,
      • 92.52,
      • 4,
      • 5.94,
      • 3.44,
      • 3.75,
      • 6.1,
      • 2,
      • 29.76,
      • 29.4,
      • 0.74,
      • 29.92,
      • 2,
      • 9.86,
      • 8.13,
      • 2.72,
      • 10.05,
      • 2,
      • 29.73,
      • 29.07,
      • 1.32,
      • 30.01,
      • 1,
      • "Gemma3nForConditionalGeneration",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 49,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-3-ANITA-8B-Inst-DPO-ITA</a>",
      • 41.74,
      • 62.14,
      • 58.92,
      • 8.59,
      • 69.5,
      • 2,
      • 64.05,
      • 60.96,
      • 12.61,
      • 72.04,
      • 4,
      • 48.59,
      • 39.59,
      • 30.9,
      • 66.32,
      • 6,
      • 48.85,
      • 40.73,
      • 16.57,
      • 62.2,
      • 4,
      • 57.27,
      • 52.6,
      • 20.2,
      • 66.57,
      • 6,
      • 51.01,
      • 42.85,
      • 18.14,
      • 71.82,
      • 4,
      • 19.37,
      • 19.35,
      • 0.03,
      • 19.37,
      • 1,
      • 22.63,
      • 22.37,
      • 0.48,
      • 22.71,
      • 2,
      • 22.65,
      • 19.02,
      • 6.75,
      • 23.79,
      • 1,
      • 20.79,
      • 17.81,
      • 5.37,
      • 21.61,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 50,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-7B-Instruct-v0.3</a>",
      • 41.56,
      • 63.83,
      • 60.83,
      • 6.86,
      • 71.25,
      • 1,
      • 61.23,
      • 58.71,
      • 7.98,
      • 66.14,
      • 6,
      • 59.16,
      • 55.86,
      • 14.5,
      • 65.39,
      • 5,
      • 46.03,
      • 39.23,
      • 11.54,
      • 54,
      • 3,
      • 62.89,
      • 61.08,
      • 6.85,
      • 66.49,
      • 2,
      • 53.28,
      • 47.63,
      • 27.27,
      • 84.79,
      • 4,
      • 9.87,
      • 6.47,
      • 5.36,
      • 10.26,
      • 2,
      • 28.1,
      • 27.28,
      • 1.63,
      • 28.43,
      • 2,
      • 9.72,
      • 9.64,
      • 0.13,
      • 9.73,
      • 1,
      • 21.49,
      • 20.29,
      • 2.18,
      • 21.83,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 51,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mii-llm/maestrale-chat-v0.4-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maestrale-chat-v0.4-beta</a>",
      • 41.04,
      • 60.19,
      • 55.42,
      • 10.3,
      • 73.25,
      • 4,
      • 63.23,
      • 61.22,
      • 8.35,
      • 67.4,
      • 4,
      • 61.53,
      • 60.62,
      • 3.45,
      • 63.07,
      • 6,
      • 51.91,
      • 45.17,
      • 15.17,
      • 63.8,
      • 3,
      • 46.35,
      • 36.19,
      • 27.34,
      • 66.84,
      • 6,
      • 52.36,
      • 46.09,
      • 24.45,
      • 83.04,
      • 4,
      • 20.38,
      • 19.22,
      • 2.07,
      • 20.68,
      • 2,
      • 25.69,
      • 25.14,
      • 1.07,
      • 25.89,
      • 2,
      • 5.02,
      • 4.62,
      • 0.6,
      • 5.05,
      • 1,
      • 23.76,
      • 21.98,
      • 3.32,
      • 24.33,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 52,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3-8b-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8b-Ita</a>",
      • 41.02,
      • 58.59,
      • 56,
      • 6.41,
      • 63,
      • 2,
      • 62.59,
      • 60.33,
      • 8.3,
      • 67.21,
      • 5,
      • 49.12,
      • 40.78,
      • 31.64,
      • 63.86,
      • 6,
      • 49.04,
      • 40.83,
      • 17.06,
      • 63,
      • 4,
      • 55.44,
      • 49.72,
      • 25.38,
      • 67.12,
      • 6,
      • 51.22,
      • 46.22,
      • 25.61,
      • 88.03,
      • 4,
      • 22.59,
      • 21.46,
      • 2.07,
      • 22.93,
      • 2,
      • 22.76,
      • 22.51,
      • 0.47,
      • 22.84,
      • 1,
      • 25.08,
      • 20.84,
      • 8.18,
      • 26.62,
      • 1,
      • 13.78,
      • 12.08,
      • 2.79,
      • 14.06,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 53,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Meta-Llama-3.1-8B-Instruct</a>",
      • 40.23,
      • 62.73,
      • 60.62,
      • 4.93,
      • 67,
      • 1,
      • 63.07,
      • 59.8,
      • 11.31,
      • 71.12,
      • 5,
      • 51.39,
      • 44.49,
      • 18.77,
      • 63.26,
      • 5,
      • 50.55,
      • 42.27,
      • 20.77,
      • 69,
      • 4,
      • 55.86,
      • 50.4,
      • 22.1,
      • 66.93,
      • 6,
      • 53.12,
      • 48.29,
      • 27.52,
      • 87.78,
      • 4,
      • 19.94,
      • 15.75,
      • 7.51,
      • 21.06,
      • 2,
      • 22.33,
      • 22.3,
      • 0.06,
      • 22.34,
      • 1,
      • 7.93,
      • 7.57,
      • 0.55,
      • 7.96,
      • 1,
      • 15.42,
      • 15.08,
      • 0.57,
      • 15.48,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 54,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aya-expanse-8b</a>",
      • 39.85,
      • 64,
      • 60.33,
      • 10.96,
      • 75,
      • 1,
      • 63.23,
      • 60.16,
      • 11.1,
      • 70.62,
      • 4,
      • 61.4,
      • 60.48,
      • 3.13,
      • 62.97,
      • 4,
      • 46.66,
      • 38.53,
      • 14.81,
      • 57.8,
      • 4,
      • 52.79,
      • 45.95,
      • 30.08,
      • 66.19,
      • 3,
      • 47.56,
      • 38.03,
      • 14.59,
      • 66.33,
      • 4,
      • 15.24,
      • 11.62,
      • 6.07,
      • 15.92,
      • 2,
      • 19.14,
      • 18.84,
      • 0.52,
      • 19.21,
      • 2,
      • 19.08,
      • 17.36,
      • 3.03,
      • 19.5,
      • 2,
      • 9.37,
      • 7.52,
      • 2.9,
      • 9.57,
      • 2,
      • "CohereForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 55,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Almawave/Velvet-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Velvet-14B</a>",
      • 39.48,
      • 69.96,
      • 67.62,
      • 7.72,
      • 78.5,
      • 1,
      • 62.53,
      • 60.49,
      • 6.86,
      • 66.59,
      • 5,
      • 59.27,
      • 55.93,
      • 12.94,
      • 65.64,
      • 3,
      • 48.79,
      • 42.4,
      • 11.14,
      • 57.4,
      • 4,
      • 47.33,
      • 37.91,
      • 30.77,
      • 64.31,
      • 5,
      • 50.08,
      • 45.47,
      • 26.63,
      • 89.53,
      • 4,
      • 0.13,
      • 0.06,
      • 0.09,
      • 0.13,
      • 1,
      • 31.1,
      • 31.09,
      • 0.03,
      • 31.11,
      • 2,
      • 15.89,
      • 15.75,
      • 0.24,
      • 15.91,
      • 2,
      • 9.68,
      • 6.96,
      • 4.28,
      • 9.98,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 56,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/microsoft/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">phi-4</a>",
      • 38.37,
      • 63.85,
      • 60.42,
      • 7.82,
      • 73.25,
      • 4,
      • 50.54,
      • 42.31,
      • 23.16,
      • 74.26,
      • 4,
      • 61.42,
      • 60.41,
      • 3.23,
      • 63.15,
      • 4,
      • 55.69,
      • 50.13,
      • 25.25,
      • 83.2,
      • 3,
      • 54.67,
      • 48.23,
      • 27.02,
      • 69.01,
      • 3,
      • 53.53,
      • 51.45,
      • 31.22,
      • 95.26,
      • 4,
      • 0,
      • 0,
      • 0,
      • 0,
      • 1,
      • 22.5,
      • 13.28,
      • 17.55,
      • 25.69,
      • 1,
      • 0.36,
      • 0.18,
      • 0.26,
      • 0.36,
      • 1,
      • 21.17,
      • 19.38,
      • 3.22,
      • 21.66,
      • 1,
      • "Phi3ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 57,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/occiglot/occiglot-7b-it-en-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">occiglot-7b-it-en-instruct</a>",
      • 38,
      • 52.59,
      • 49.92,
      • 4.48,
      • 56,
      • 4,
      • 55.46,
      • 51.87,
      • 9.82,
      • 61.09,
      • 6,
      • 53.79,
      • 48.48,
      • 20.94,
      • 62.72,
      • 1,
      • 47.95,
      • 42.8,
      • 8.92,
      • 54,
      • 3,
      • 50.78,
      • 42.86,
      • 28.9,
      • 66.49,
      • 5,
      • 49.64,
      • 42.89,
      • 21.86,
      • 83.29,
      • 4,
      • 2.84,
      • 1.56,
      • 1.87,
      • 2.88,
      • 2,
      • 27.68,
      • 25.5,
      • 4.33,
      • 28.56,
      • 2,
      • 6.96,
      • 5.26,
      • 2.58,
      • 7.09,
      • 1,
      • 32.31,
      • 32.03,
      • 0.6,
      • 32.45,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 58,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-2-7b-hf-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-2-7b-hf-ITA</a>",
      • 37.86,
      • 54.01,
      • 53.21,
      • 1.55,
      • 55,
      • 3,
      • 56.57,
      • 52.85,
      • 8.85,
      • 62.87,
      • 5,
      • 52.28,
      • 46.57,
      • 12.64,
      • 61.35,
      • 3,
      • 31.71,
      • 29.47,
      • 2.29,
      • 32.8,
      • 4,
      • 51.54,
      • 45.13,
      • 13.92,
      • 62,
      • 3,
      • 33.97,
      • 30.51,
      • 2.74,
      • 35.91,
      • 4,
      • 19.74,
      • 19.32,
      • 0.74,
      • 19.84,
      • 2,
      • 20.68,
      • 20.62,
      • 0.12,
      • 20.7,
      • 2,
      • 29.28,
      • 29.08,
      • 0.4,
      • 29.36,
      • 2,
      • 28.86,
      • 28.76,
      • 0.19,
      • 28.9,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 59,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.1-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">granite-3.1-8b-instruct</a>",
      • 37.26,
      • 56.34,
      • 51.08,
      • 8.65,
      • 67,
      • 2,
      • 54.43,
      • 47.82,
      • 20.82,
      • 69.45,
      • 4,
      • 48.15,
      • 39.51,
      • 30.59,
      • 62.64,
      • 1,
      • 49.04,
      • 41.97,
      • 12.63,
      • 59.4,
      • 3,
      • 50.24,
      • 42,
      • 32.2,
      • 66.85,
      • 5,
      • 54.7,
      • 51.29,
      • 31.07,
      • 91.52,
      • 4,
      • 0.17,
      • 0.08,
      • 0.12,
      • 0.17,
      • 2,
      • 30.28,
      • 30.04,
      • 0.49,
      • 30.39,
      • 1,
      • 18.09,
      • 16.03,
      • 3.59,
      • 18.57,
      • 2,
      • 11.21,
      • 9.17,
      • 3.25,
      • 11.47,
      • 2,
      • "GraniteForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 60,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zephyr-7b-beta</a>",
      • 37.26,
      • 60.71,
      • 58.62,
      • 5.3,
      • 64.5,
      • 1,
      • 51.72,
      • 45.08,
      • 16.59,
      • 63.08,
      • 6,
      • 55.33,
      • 50.36,
      • 24.41,
      • 64.26,
      • 5,
      • 43.64,
      • 36.7,
      • 11.19,
      • 50.8,
      • 3,
      • 57.88,
      • 53.45,
      • 16.85,
      • 66.76,
      • 5,
      • 44.43,
      • 34.5,
      • 11.74,
      • 58.35,
      • 4,
      • 2.84,
      • 2.62,
      • 0.33,
      • 2.85,
      • 2,
      • 27.16,
      • 27.06,
      • 0.2,
      • 27.2,
      • 2,
      • 18.43,
      • 18.15,
      • 0.48,
      • 18.49,
      • 2,
      • 10.42,
      • 8.61,
      • 2.87,
      • 10.64,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 61,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/openai/gpt-oss-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt-oss-20b</a>",
      • 37.2,
      • 57.08,
      • 53.71,
      • 4.72,
      • 62.75,
      • 1,
      • 58.44,
      • 56.51,
      • 3.11,
      • 61.54,
      • 1,
      • 57.07,
      • 53.93,
      • 11.27,
      • 62.26,
      • 6,
      • 44.63,
      • 34.17,
      • 17.74,
      • 61,
      • 3,
      • 59.38,
      • 57.25,
      • 6.71,
      • 62.99,
      • 1,
      • 50.68,
      • 48.79,
      • 35.9,
      • 96.01,
      • 3,
      • 1.16,
      • 0.61,
      • 0.79,
      • 1.17,
      • 2,
      • 8.85,
      • 5.06,
      • 5.89,
      • 9.23,
      • 1,
      • 1.89,
      • 1.62,
      • 0.4,
      • 1.9,
      • 1,
      • 32.84,
      • 31.88,
      • 2.04,
      • 33.32,
      • 1,
      • "GptOssForCausalLM",
      • "?",
      • 2,
      • 0,
      • true,
      • ""
      ],
    • [
      • 62,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-4-Scout-17B-16E-Instruct</a>",
      • 36.52,
      • 56.31,
      • 54.75,
      • 2.78,
      • 58.5,
      • 5,
      • 52.56,
      • 48.54,
      • 7.57,
      • 58.15,
      • 4,
      • 45.82,
      • 42.59,
      • 7.88,
      • 48.91,
      • 4,
      • 23.58,
      • 19.33,
      • 3.17,
      • 25,
      • 4,
      • 56.46,
      • 55.02,
      • 2.22,
      • 58.49,
      • 3,
      • 40.12,
      • 31.67,
      • 8.69,
      • 47.88,
      • 4,
      • 5.7,
      • 5.32,
      • 0.57,
      • 5.72,
      • 2,
      • 21.46,
      • 20.9,
      • 1.03,
      • 21.62,
      • 2,
      • 26.95,
      • 26.7,
      • 0.48,
      • 27.05,
      • 2,
      • 36.21,
      • 35.75,
      • 1.03,
      • 36.47,
      • 2,
      • "Llama4ForConditionalGeneration",
      • "?",
      • 109,
      • 0,
      • true,
      • ""
      ],
    • [
      • 63,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/FairMind/Llama-3-8B-4bit-UltraChat-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8B-4bit-UltraChat-Ita</a>",
      • 36.28,
      • 60.29,
      • 58.08,
      • 4.16,
      • 64.25,
      • 6,
      • 52.97,
      • 46.01,
      • 15.04,
      • 67.3,
      • 5,
      • 54.09,
      • 48.72,
      • 15.55,
      • 63.38,
      • 5,
      • 41.41,
      • 33.37,
      • 11.69,
      • 49.2,
      • 3,
      • 66.22,
      • 66.04,
      • 0.62,
      • 66.58,
      • 4,
      • 43.89,
      • 34.54,
      • 10.55,
      • 55.61,
      • 4,
      • 0,
      • 0,
      • 0,
      • 0,
      • 1,
      • 24.17,
      • 23.6,
      • 1.06,
      • 24.35,
      • 2,
      • 15.59,
      • 13.72,
      • 3.15,
      • 15.95,
      • 2,
      • 4.2,
      • 2.62,
      • 2.33,
      • 4.27,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 64,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-instruct-v1.0</a>",
      • 35.7,
      • 67.25,
      • 65.33,
      • 8.14,
      • 72.25,
      • 3,
      • 57.68,
      • 54.91,
      • 9.45,
      • 62.25,
      • 6,
      • 41.59,
      • 30.89,
      • 20.05,
      • 54.29,
      • 5,
      • 37.84,
      • 37.27,
      • 1.12,
      • 38.2,
      • 2,
      • 45.48,
      • 36.89,
      • 19.24,
      • 56.73,
      • 5,
      • 29.88,
      • 28.64,
      • 1.69,
      • 30.42,
      • 4,
      • 5,
      • 4.88,
      • 0.18,
      • 5.01,
      • 2,
      • 15.36,
      • 15.12,
      • 0.4,
      • 15.4,
      • 2,
      • 27.24,
      • 27.04,
      • 0.39,
      • 27.32,
      • 2,
      • 29.69,
      • 29.36,
      • 0.67,
      • 29.83,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 65,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-base-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-base-v1.0</a>",
      • 35.06,
      • 68.71,
      • 66.79,
      • 8.95,
      • 74.25,
      • 4,
      • 53.85,
      • 53.14,
      • 1.48,
      • 54.7,
      • 5,
      • 39.49,
      • 34.45,
      • 10.03,
      • 43.34,
      • 5,
      • 38.04,
      • 36.83,
      • 1.62,
      • 38.8,
      • 6,
      • 43.99,
      • 34.55,
      • 19.47,
      • 56,
      • 5,
      • 28.45,
      • 27.27,
      • 2.3,
      • 28.93,
      • 1,
      • 9.53,
      • 9.51,
      • 0.03,
      • 9.53,
      • 2,
      • 15.94,
      • 15.92,
      • 0.05,
      • 15.95,
      • 1,
      • 24.37,
      • 24.29,
      • 0.15,
      • 24.4,
      • 1,
      • 28.2,
      • 26.64,
      • 3.1,
      • 28.83,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 66,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/openai/gpt-oss-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt-oss-20b</a>",
      • 34.95,
      • 52.49,
      • 50.08,
      • 4.71,
      • 55.5,
      • 1,
      • 58.06,
      • 53.39,
      • 11.42,
      • 68.01,
      • 3,
      • 49.59,
      • 41.82,
      • 22.51,
      • 62.59,
      • 2,
      • 48.07,
      • 38.87,
      • 26.09,
      • 73.2,
      • 3,
      • 49.02,
      • 40.39,
      • 29.4,
      • 65.14,
      • 2,
      • 39.17,
      • 31.26,
      • 7.77,
      • 45.89,
      • 3,
      • 0,
      • 0,
      • 0,
      • 0,
      • 1,
      • 18.31,
      • 17.78,
      • 0.92,
      • 18.43,
      • 1,
      • 1.69,
      • 1.5,
      • 0.27,
      • 1.7,
      • 1,
      • 33.06,
      • 32.45,
      • 1.3,
      • 33.37,
      • 1,
      • "GptOssForCausalLM",
      • "?",
      • 2,
      • 0,
      • true,
      • ""
      ],
    • [
      • 67,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Fastweb/FastwebMIIA-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FastwebMIIA-7B</a>",
      • 34.71,
      • 61.08,
      • 57.75,
      • 5.47,
      • 68.25,
      • 4,
      • 53.02,
      • 47.2,
      • 11.99,
      • 62.9,
      • 3,
      • 52.07,
      • 46.36,
      • 22.64,
      • 60.99,
      • 1,
      • 39.8,
      • 33.07,
      • 8.59,
      • 45.4,
      • 4,
      • 58.66,
      • 54.69,
      • 14.35,
      • 66.58,
      • 2,
      • 39.79,
      • 32.38,
      • 7.26,
      • 46.13,
      • 4,
      • 0,
      • 0,
      • 0,
      • 0,
      • 1,
      • 29.18,
      • 26.83,
      • 4.77,
      • 30.2,
      • 2,
      • 9.34,
      • 9.32,
      • 0.03,
      • 9.34,
      • 1,
      • 4.12,
      • 2.41,
      • 2.53,
      • 4.2,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 68,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-instruct-v1.0</a>",
      • 32.5,
      • 55.16,
      • 54.12,
      • 3.76,
      • 56.5,
      • 4,
      • 50.57,
      • 44.51,
      • 11.58,
      • 59.46,
      • 6,
      • 47.62,
      • 39.22,
      • 22.8,
      • 60.48,
      • 2,
      • 32.25,
      • 28.87,
      • 4.14,
      • 34,
      • 6,
      • 57.7,
      • 53.42,
      • 14.91,
      • 66.04,
      • 5,
      • 35.17,
      • 31.05,
      • 3.36,
      • 37.66,
      • 4,
      • 0,
      • 0,
      • 0,
      • 0,
      • 1,
      • 16.34,
      • 16.22,
      • 0.2,
      • 16.36,
      • 2,
      • 9.61,
      • 9.16,
      • 0.71,
      • 9.66,
      • 1,
      • 20.6,
      • 17.75,
      • 5.13,
      • 21.38,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 69,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-base-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-base-v1.0</a>",
      • 32.36,
      • 56.93,
      • 54.46,
      • 4.94,
      • 60.75,
      • 1,
      • 56.56,
      • 50.62,
      • 13.19,
      • 71.33,
      • 6,
      • 53.3,
      • 48.04,
      • 13.54,
      • 61.82,
      • 2,
      • 31.37,
      • 27.7,
      • 4.01,
      • 33.2,
      • 5,
      • 61.49,
      • 59.12,
      • 9.19,
      • 66.13,
      • 5,
      • 29.12,
      • 27.81,
      • 1.8,
      • 29.68,
      • 5,
      • 0.01,
      • 0,
      • 0.01,
      • 0.01,
      • 2,
      • 16.27,
      • 16.04,
      • 0.38,
      • 16.31,
      • 1,
      • 9.64,
      • 9.62,
      • 0.04,
      • 9.64,
      • 2,
      • 8.91,
      • 8.09,
      • 1.27,
      • 8.99,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 70,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-2-7b-hf-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-2-7b-hf-ITA</a>",
      • 32.07,
      • 54.5,
      • 53.5,
      • 4.05,
      • 55.75,
      • 6,
      • 51.32,
      • 45.96,
      • 8.92,
      • 59.06,
      • 6,
      • 57,
      • 53.57,
      • 12.58,
      • 62.77,
      • 5,
      • 24.59,
      • 23.37,
      • 1.56,
      • 25,
      • 3,
      • 55.94,
      • 50.67,
      • 26.48,
      • 66.31,
      • 5,
      • 29.23,
      • 28.76,
      • 0.75,
      • 29.43,
      • 4,
      • 0.06,
      • 0.03,
      • 0.04,
      • 0.06,
      • 2,
      • 26.7,
      • 25.86,
      • 1.63,
      • 27.01,
      • 2,
      • 13.03,
      • 12.55,
      • 0.78,
      • 13.1,
      • 2,
      • 8.32,
      • 7.39,
      • 1.43,
      • 8.4,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 71,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/MoxoffSpA/Volare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Volare</a>",
      • 30.23,
      • 53.14,
      • 51.25,
      • 4.56,
      • 55.5,
      • 6,
      • 50.1,
      • 44.15,
      • 10.09,
      • 58.49,
      • 6,
      • 50.7,
      • 43.57,
      • 19.46,
      • 62.7,
      • 2,
      • 26.13,
      • 22.77,
      • 3.36,
      • 27.4,
      • 3,
      • 40.77,
      • 27.8,
      • 29.8,
      • 66.4,
      • 5,
      • 27.95,
      • 27.35,
      • 0.86,
      • 28.18,
      • 1,
      • 0.04,
      • 0.02,
      • 0.03,
      • 0.04,
      • 1,
      • 23.28,
      • 23.15,
      • 0.24,
      • 23.32,
      • 1,
      • 10.77,
      • 10.63,
      • 0.22,
      • 10.78,
      • 2,
      • 19.47,
      • 15.95,
      • 6.24,
      • 20.37,
      • 1,
      • "GemmaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 72,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-270m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-270m</a>",
      • 30.14,
      • 52.16,
      • 49.83,
      • 4.72,
      • 55,
      • 4,
      • 43.71,
      • 38.96,
      • 8.72,
      • 48.12,
      • 3,
      • 59.62,
      • 57.79,
      • 5.08,
      • 62.7,
      • 5,
      • 23.2,
      • 19.5,
      • 2.6,
      • 24.4,
      • 4,
      • 54.29,
      • 48.11,
      • 28.37,
      • 66.67,
      • 4,
      • 27.26,
      • 26.81,
      • 0.56,
      • 27.43,
      • 5,
      • 0,
      • 0,
      • 0,
      • 0,
      • 1,
      • 18.35,
      • 17.93,
      • 0.73,
      • 18.45,
      • 2,
      • 9.68,
      • 9.58,
      • 0.16,
      • 9.69,
      • 2,
      • 13.08,
      • 7.5,
      • 9.18,
      • 13.99,
      • 1,
      • "Gemma3ForCausalLM",
      • "?",
      • 1,
      • 0,
      • true,
      • ""
      ]
    ],
  • "metadata": null
}
Theoretical performance of a model that scores the highest on every individual task: 65.17

This project has benefited from the following support:

  • ๐Ÿง  Codebase: Based on and extended from the Open Italian LLM Leaderboard, developed by Alessandro Ercolani and Samuele Colombo. We warmly thank them for their invaluable support and guidance in implementing this leaderboard.

  • ๐Ÿ’ถ Funding: Partially supported by the PNRR project FAIR - Future AI Research (PE00000013), under the NRRP MUR program funded by NextGenerationEU.

  • ๐Ÿ–ฅ๏ธ Computation: We gratefully acknowledge CINECA for granting access to the LEONARDO supercomputer.