EVALITA-LLM Leaderboard

Open Italian LLM Leaderboard

Evalita-LLM is a benchmark designed to evaluate Large Language Models (LLMs) on Italian tasks. Its distinguishing features are: (i) all tasks are native Italian, avoiding translation issues and potential cultural biases; (ii) the benchmark includes generative tasks, enabling more natural interaction with LLMs; (iii) every task is evaluated against multiple prompts, which mitigates model sensitivity to specific prompt wordings and allows a fairer evaluation.

Multiple-choice tasks: 📊 TE (Textual Entailment), 😃 SA (Sentiment Analysis), ⚠️ HS (Hate Speech Detection), 🏥 AT (Admission Test), 🔤 WIC (Word in Context), ❓ FAQ (Frequently Asked Questions)
Generative tasks: 🔄 LS (Lexical Substitution), 📝 SU (Summarization), 🏷️ NER (Named Entity Recognition), 🔗 REL (Relation Extraction)
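Each task column in the leaderboard below is a combined performance score: the best prompt's score, discounted by the gap between the best prompt and the prompt average, so a model must be both accurate and stable across prompt phrasings to rank well. The sketch below is an illustrative Python reading of how the per-task "Prompt Average" and "Best Prompt" columns relate to the task score (the function names are ours, not the leaderboard's code). For example, Mistral-Large-Instruct-2411 has a TE prompt average of 80.5 and a TE best prompt of 83.25, and (1 − (83.25 − 80.5)/100) × 83.25 ≈ 81.0, the TE value shown in its row.

```python
# Illustrative sketch (not the leaderboard's own code): how each task's
# combined score relates to the prompt-level columns in the table below.

def combined_performance(prompt_scores: list[float]) -> float:
    """Combine per-prompt scores (0-100) into one task score: the best
    prompt's score, discounted by its gap from the prompt average."""
    avg = sum(prompt_scores) / len(prompt_scores)
    best = max(prompt_scores)
    return (1 - (best - avg) / 100) * best

def avg_combined_performance(task_scores: list[float]) -> float:
    """'Avg. Comb. Perf.' is the mean of the ten per-task combined scores."""
    return sum(task_scores) / len(task_scores)
```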

| Rank | Size | FS | IS_FS | Model | Avg. Comb. Perf. ⬆️ | TE | TE Prompt Average | TE Prompt Std | TE Best Prompt | TE Best Prompt Id | SA | SA Prompt Average | SA Prompt Std | SA Best Prompt | SA Best Prompt Id | HS | HS Prompt Average | HS Prompt Std | HS Best Prompt | HS Best Prompt Id | AT | AT Prompt Average | AT Prompt Std | AT Best Prompt | AT Best Prompt Id | WIC | WIC Prompt Average | WIC Prompt Std | WIC Best Prompt | WIC Best Prompt Id | FAQ | FAQ Prompt Average | FAQ Prompt Std | FAQ Best Prompt | FAQ Best Prompt Id | LS | LS Prompt Average | LS Prompt Std | LS Best Prompt | LS Best Prompt Id | SU | SU Prompt Average | SU Prompt Std | SU Best Prompt | SU Best Prompt Id | NER | NER Prompt Average | NER Prompt Std | NER Best Prompt | NER Best Prompt Id | REL | REL Prompt Average | REL Prompt Std | REL Best Prompt | REL Best Prompt Id | Architecture | Hub License | #Params (B) | Hub ❤️ | Available on the hub | Model sha |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
    • [
      • 1,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Large-Instruct-2411" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Large-Instruct-2411</a> ๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต๐Ÿ†",
      • 62.28,
      • "81.0",
      • 80.5,
      • 1.87,
      • 83.25,
      • 6,
      • "80.9๐Ÿ”บ",
      • 80.71,
      • 0.98,
      • 81.98,
      • 3,
      • "77.0",
      • 76.61,
      • 2.71,
      • 78.58,
      • 2,
      • "76.3๐Ÿ”บ",
      • 75.27,
      • 15.06,
      • 94.8,
      • 3,
      • "75.5๐Ÿ”บ",
      • 74.43,
      • 6.32,
      • 80.07,
      • 4,
      • "54.4",
      • 54.2,
      • 35.09,
      • 99.5,
      • 3,
      • "38.6",
      • 38.24,
      • 0.74,
      • 38.76,
      • 2,
      • "33.4",
      • 30.91,
      • 5.42,
      • 34.74,
      • 2,
      • "38.8",
      • 38.75,
      • 0.08,
      • 38.81,
      • 1,
      • "66.8๐Ÿ”บ",
      • 66.76,
      • 0.24,
      • 66.93,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 123,
      • 0,
      • true,
      • ""
      ],
    • [
      • 2,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-Tulu-3-70B</a>",
      • 60.16,
      • "82.4",
      • 82.12,
      • 1.71,
      • 84,
      • 2,
      • "80.4",
      • 80.07,
      • 1.66,
      • 81.94,
      • 3,
      • "75.5",
      • 74.85,
      • 2.89,
      • 77.85,
      • 2,
      • "72.4",
      • 70.8,
      • 16.68,
      • 92.4,
      • 3,
      • "65.4",
      • 62.35,
      • 16.2,
      • 74.46,
      • 2,
      • "54.5",
      • 54.41,
      • 35.12,
      • 99.75,
      • 3,
      • "40.9",
      • 39.89,
      • 2.43,
      • 41.6,
      • 1,
      • "35.5๐Ÿ”บ",
      • 34.9,
      • 1.23,
      • 35.77,
      • 2,
      • "39.1",
      • 38.44,
      • 1.61,
      • 39.57,
      • 1,
      • "55.4",
      • 52.58,
      • 9.77,
      • 59.49,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 3,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.3-70B-Instruct</a>",
      • 57.93,
      • "80.8",
      • 80.25,
      • 2.38,
      • 83.5,
      • 3,
      • "79.5",
      • 78.98,
      • 2.57,
      • 82.19,
      • 4,
      • "77.8๐Ÿ”บ",
      • 77.4,
      • 1.44,
      • 79.45,
      • 2,
      • "70.7",
      • 68.83,
      • 17.56,
      • 91.8,
      • 3,
      • "67.2",
      • 65.01,
      • 9.36,
      • 72.89,
      • 2,
      • "54.1",
      • 53.66,
      • 35.03,
      • 99,
      • 3,
      • "46.8",
      • 45.77,
      • 2.79,
      • 47.74,
      • 1,
      • "21.2",
      • 20.94,
      • 0.54,
      • 21.32,
      • 2,
      • "44.6๐Ÿ”บ",
      • 44.53,
      • 0.13,
      • 44.62,
      • 1,
      • "36.5",
      • 36.51,
      • 0.03,
      • 36.53,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 4,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Small-24B-Instruct-2501</a> ๐Ÿ”ต๐Ÿ”ต๐Ÿ†",
      • 57.67,
      • "82.0",
      • 81.71,
      • 1.61,
      • 83.5,
      • 6,
      • "76.0",
      • 75.1,
      • 3.31,
      • 79.21,
      • 3,
      • "72.9",
      • 72.1,
      • 3.35,
      • 75.36,
      • 3,
      • "69.8",
      • 67.73,
      • 18.33,
      • 91.4,
      • 3,
      • "71.7",
      • 70.86,
      • 3.21,
      • 73.89,
      • 4,
      • "53.8",
      • 53.45,
      • 35.38,
      • 99.25,
      • 3,
      • "41.9",
      • 41.15,
      • 1.81,
      • 42.43,
      • 1,
      • "29.7",
      • 28.42,
      • 2.58,
      • 30.25,
      • 2,
      • "38.2",
      • 38.09,
      • 0.17,
      • 38.21,
      • 1,
      • "40.9",
      • 40.18,
      • 1.73,
      • 41.41,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 24,
      • 0,
      • true,
      • ""
      ],
    • [
      • 5,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-27b-it</a>",
      • 57.42,
      • "81.1",
      • 80.75,
      • 1.96,
      • 83,
      • 4,
      • "80.3",
      • 80.18,
      • 0.47,
      • 80.8,
      • 2,
      • "75.9",
      • 75.33,
      • 2.32,
      • 77.77,
      • 4,
      • "73.9",
      • 72.1,
      • 13.63,
      • 89.8,
      • 3,
      • "66.5",
      • 64.51,
      • 8.29,
      • 71.46,
      • 4,
      • "53.1",
      • 52.33,
      • 35.57,
      • 98.25,
      • 3,
      • "36.1",
      • 35.64,
      • 0.98,
      • 36.33,
      • 2,
      • "20.0",
      • 19.72,
      • 0.52,
      • 20.09,
      • 1,
      • "38.8",
      • 38.72,
      • 0.11,
      • 38.79,
      • 2,
      • "48.5",
      • 46.34,
      • 6.14,
      • 50.68,
      • 2,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 6,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-72B-Instruct</a>",
      • 57.36,
      • "85.1๐Ÿ”บ",
      • 85,
      • 0.45,
      • 85.5,
      • 6,
      • "77.7",
      • 77.6,
      • 0.42,
      • 77.97,
      • 6,
      • "76.4",
      • 75.89,
      • 1.67,
      • 78.11,
      • 3,
      • "69.3",
      • 67.9,
      • 20.69,
      • 94.8,
      • 3,
      • "72.1",
      • 71.27,
      • 2.46,
      • 74.34,
      • 1,
      • "54.0",
      • 53.66,
      • 35.32,
      • 99.25,
      • 3,
      • "38.0",
      • 37.94,
      • 0.2,
      • 38.08,
      • 2,
      • "24.5",
      • 24.15,
      • 0.64,
      • 24.6,
      • 1,
      • "38.9",
      • 38.76,
      • 0.34,
      • 39,
      • 2,
      • "37.6",
      • 37.51,
      • 0.31,
      • 37.73,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 73,
      • 0,
      • true,
      • ""
      ],
    • [
      • 7,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-14B-Instruct-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-14B-Instruct-1M</a>",
      • 54.72,
      • "85.0",
      • 84.79,
      • 1.8,
      • 86.5,
      • 4,
      • "73.2",
      • 71.75,
      • 5.41,
      • 78.26,
      • 3,
      • "72.8",
      • 71.8,
      • 5.46,
      • 75.96,
      • 4,
      • "59.8",
      • 55.4,
      • 23.32,
      • 85.6,
      • 3,
      • "63.6",
      • 61.7,
      • 4.86,
      • 67.52,
      • 6,
      • "52.6",
      • 52.37,
      • 36.41,
      • 99.5,
      • 4,
      • "35.0",
      • 34.65,
      • 0.84,
      • 35.24,
      • 1,
      • "25.9",
      • 25.51,
      • 0.75,
      • 26.04,
      • 2,
      • "35.1",
      • 35.03,
      • 0.17,
      • 35.14,
      • 1,
      • "44.2",
      • 42.63,
      • 4.14,
      • 45.56,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 8,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-12b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-12b-it</a>",
      • 53.92,
      • "79.5",
      • 79.12,
      • 1.4,
      • 81.25,
      • 1,
      • "78.3",
      • 78.17,
      • 0.64,
      • 78.98,
      • 6,
      • "71.4",
      • 70.14,
      • 4.61,
      • 75.24,
      • 4,
      • "65.9",
      • 62.5,
      • 16.97,
      • 84.6,
      • 4,
      • "66.5",
      • 64.94,
      • 6.14,
      • 70.16,
      • 3,
      • "52.4",
      • 52,
      • 36.41,
      • 99.25,
      • 3,
      • "26.8",
      • 26.64,
      • 0.23,
      • 26.81,
      • 2,
      • "18.9",
      • 18.74,
      • 0.34,
      • 18.98,
      • 2,
      • "37.9",
      • 37.82,
      • 0.1,
      • 37.89,
      • 1,
      • "41.6",
      • 41.46,
      • 0.33,
      • 41.69,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 13,
      • 0,
      • true,
      • ""
      ],
    • [
      • 9,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-27b-it</a>",
      • 53.86,
      • "78.7",
      • 78.46,
      • 0.98,
      • 79.75,
      • 5,
      • "74.3",
      • 73.84,
      • 1.05,
      • 75.56,
      • 1,
      • "74.1",
      • 73.42,
      • 3.33,
      • 76.47,
      • 4,
      • "68.5",
      • 65.63,
      • 15.47,
      • 86.2,
      • 3,
      • "68.6",
      • 66.82,
      • 6.31,
      • 73.31,
      • 4,
      • "52.4",
      • 51.33,
      • 35.96,
      • 97.76,
      • 3,
      • "28.5",
      • 28.5,
      • 0.04,
      • 28.53,
      • 2,
      • "18.7",
      • 18.51,
      • 0.26,
      • 18.69,
      • 2,
      • "38.3",
      • 38.23,
      • 0.24,
      • 38.4,
      • 2,
      • "36.6",
      • 36.56,
      • 0,
      • 36.56,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 10,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-9b-it</a> ๐Ÿ”ต๐Ÿ†",
      • 53.64,
      • "80.0",
      • 79.62,
      • 2.35,
      • 81.5,
      • 2,
      • "72.8",
      • 72.49,
      • 1.06,
      • 73.89,
      • 1,
      • "72.0",
      • 70.71,
      • 4.79,
      • 76.03,
      • 3,
      • "60.0",
      • 54.9,
      • 16.19,
      • 75.8,
      • 3,
      • "57.8",
      • 55.64,
      • 5.34,
      • 61.17,
      • 3,
      • "51.8",
      • 51.16,
      • 36.67,
      • 98.75,
      • 3,
      • "22.4",
      • 21.97,
      • 0.71,
      • 22.47,
      • 2,
      • "30.4",
      • 29.12,
      • 2.68,
      • 31.02,
      • 1,
      • "38.0",
      • 37.92,
      • 0.24,
      • 38.09,
      • 1,
      • "51.3",
      • 50.98,
      • 0.81,
      • 51.56,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 10,
      • 0,
      • true,
      • ""
      ],
    • [
      • 11,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-7B-Instruct</a>",
      • 52.62,
      • "83.7",
      • 83.42,
      • 0.82,
      • 85,
      • 2,
      • "74.2",
      • 73.77,
      • 1.12,
      • 75.35,
      • 4,
      • "68.5",
      • 67.14,
      • 5.8,
      • 72.04,
      • 4,
      • "53.9",
      • 46.93,
      • 21.82,
      • 75.8,
      • 4,
      • "62.7",
      • 60.38,
      • 5.38,
      • 67.5,
      • 6,
      • "52.5",
      • 51.87,
      • 36.22,
      • 98.75,
      • 4,
      • "28.2",
      • 27.85,
      • 0.62,
      • 28.29,
      • 2,
      • "30.8",
      • 30.36,
      • 0.88,
      • 30.98,
      • 2,
      • "35.2",
      • 34.45,
      • 1.6,
      • 35.58,
      • 1,
      • "36.7",
      • 36.64,
      • 0.11,
      • 36.72,
      • 2,
      • "Qwen2ForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 12,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3n-E4B-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3n-E4B-it</a>",
      • 52.4,
      • "73.4",
      • 72.08,
      • 4.01,
      • 78,
      • 2,
      • "76.3",
      • 76.02,
      • 1.37,
      • 77.47,
      • 6,
      • "68.3",
      • 66.55,
      • 6.05,
      • 72.92,
      • 4,
      • "58.2",
      • 53.1,
      • 13.18,
      • 70.6,
      • 4,
      • "62.6",
      • 60.55,
      • 9.3,
      • 66.81,
      • 4,
      • "52.1",
      • 47.55,
      • 31.75,
      • 89.03,
      • 3,
      • "25.3",
      • 25.18,
      • 0.16,
      • 25.3,
      • 1,
      • "25.8",
      • 25.35,
      • 0.84,
      • 25.95,
      • 1,
      • "32.4",
      • 31.9,
      • 0.93,
      • 32.56,
      • 2,
      • "49.6",
      • 49.29,
      • 1.03,
      • 50.02,
      • 1,
      • "Gemma3nForConditionalGeneration",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 13,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-72B-Instruct</a> ๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต๐ŸŽ–๏ธ",
      • 52.38,
      • "75.2",
      • 73.46,
      • 13.98,
      • 84.25,
      • 3,
      • "57.4",
      • 51.56,
      • 20.98,
      • 77.2,
      • 4,
      • "71.6",
      • 69.86,
      • 6.22,
      • 77.42,
      • 4,
      • "59.4",
      • 56.77,
      • 28.65,
      • 92.8,
      • 3,
      • "56.3",
      • 51.14,
      • 17.02,
      • 66.67,
      • 5,
      • "53.4",
      • 53.08,
      • 35.39,
      • 99.25,
      • 4,
      • "50.2",
      • 50.1,
      • 0.37,
      • 50.37,
      • 1,
      • "25.8",
      • 25.12,
      • 1.39,
      • 26.1,
      • 1,
      • "32.9",
      • 31.83,
      • 2.21,
      • 33.4,
      • 2,
      • "41.6",
      • 39.77,
      • 4.54,
      • 42.98,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 73,
      • 0,
      • true,
      • ""
      ],
    • [
      • 14,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/microsoft/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">phi-4</a>",
      • 51.84,
      • "76.8",
      • 76.21,
      • 3.56,
      • 79.25,
      • 3,
      • "75.2",
      • 74.44,
      • 2.76,
      • 77.95,
      • 4,
      • "69.2",
      • 67.08,
      • 6.48,
      • 75.89,
      • 4,
      • "61.9",
      • 57.87,
      • 21.1,
      • 85.2,
      • 4,
      • "59.4",
      • 57.17,
      • 4.21,
      • 63.16,
      • 4,
      • "54.3",
      • 53.41,
      • 34.45,
      • 98,
      • 4,
      • "35.1",
      • 35.1,
      • 0.02,
      • 35.11,
      • 2,
      • "21.2",
      • 21.06,
      • 0.33,
      • 21.29,
      • 1,
      • "28.1",
      • 27.93,
      • 0.25,
      • 28.1,
      • 2,
      • "37.1",
      • 36.83,
      • 0.72,
      • 37.34,
      • 1,
      • "Phi3ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 15,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-SuperNova-Lite</a>",
      • 51.71,
      • "74.2",
      • 73.62,
      • 2.64,
      • 76,
      • 3,
      • "75.2",
      • 74.23,
      • 3.64,
      • 79.03,
      • 4,
      • "69.2",
      • 67.58,
      • 5.12,
      • 73.93,
      • 4,
      • "54.8",
      • 48.43,
      • 15.87,
      • 69.2,
      • 4,
      • "62.6",
      • 61.35,
      • 2.04,
      • 64.86,
      • 6,
      • "54.4",
      • 51,
      • 31.2,
      • 91.52,
      • 3,
      • "29.5",
      • 29.28,
      • 0.43,
      • 29.59,
      • 2,
      • "21.6",
      • 21.38,
      • 0.39,
      • 21.66,
      • 2,
      • "39.5",
      • 38.24,
      • 2.96,
      • 40.34,
      • 2,
      • "36.0",
      • 35.67,
      • 0.76,
      • 36.21,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 16,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.1-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">granite-3.1-8b-instruct</a>",
      • 51.7,
      • "72.2",
      • 70.92,
      • 5.97,
      • 76.5,
      • 4,
      • "74.8",
      • 74.47,
      • 0.84,
      • 75.67,
      • 1,
      • "66.9",
      • 64.05,
      • 8.78,
      • 75.68,
      • 4,
      • "54.2",
      • 48.17,
      • 13.29,
      • 66,
      • 4,
      • "65.0",
      • 63.71,
      • 5.26,
      • 67.83,
      • 4,
      • "53.5",
      • 52.58,
      • 35.09,
      • 98,
      • 3,
      • "25.7",
      • 25.6,
      • 0.26,
      • 25.79,
      • 1,
      • "32.9",
      • 32.36,
      • 1.1,
      • 33.14,
      • 1,
      • "31.8",
      • 30.59,
      • 2.53,
      • 32.38,
      • 2,
      • "39.9",
      • 39.7,
      • 0.52,
      • 40.06,
      • 1,
      • "GraniteForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 17,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-4-Scout-17B-16E-Instruct</a>",
      • 51.65,
      • "71.2",
      • 68.92,
      • 12.62,
      • 81.5,
      • 4,
      • "69.3",
      • 67,
      • 9.58,
      • 77.35,
      • 4,
      • "68.2",
      • 65.82,
      • 6.9,
      • 75.27,
      • 4,
      • "68.2",
      • 66.1,
      • 20.14,
      • 92,
      • 3,
      • "62.8",
      • 60.82,
      • 7.79,
      • 66.85,
      • 5,
      • "54.0",
      • 53.41,
      • 34.93,
      • 98.75,
      • 3,
      • "39.3",
      • 38.46,
      • 1.95,
      • 39.84,
      • 1,
      • "28.2",
      • 28.16,
      • 0.15,
      • 28.27,
      • 2,
      • "21.6",
      • 16.62,
      • 9.15,
      • 23.08,
      • 1,
      • "33.7",
      • 31.78,
      • 4.12,
      • 34.7,
      • 1,
      • "Llama4ForConditionalGeneration",
      • "?",
      • 109,
      • 0,
      • true,
      • ""
      ],
    • [
      • 18,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-Tulu-3-70B</a>",
      • 51.6,
      • "71.7",
      • 69.67,
      • 11.26,
      • 79.75,
      • 2,
      • "59.9",
      • 54.84,
      • 16.28,
      • 75.25,
      • 4,
      • "64.9",
      • 61.08,
      • 27.3,
      • 81.43,
      • 3,
      • "65.2",
      • 62.6,
      • 21.94,
      • 91,
      • 3,
      • "58.6",
      • 54.24,
      • 21.7,
      • 67.96,
      • 4,
      • "54.5",
      • 53.57,
      • 34.22,
      • 98,
      • 4,
      • "46.5",
      • 46.33,
      • 0.33,
      • 46.56,
      • 2,
      • "29.2",
      • 29.14,
      • 0.23,
      • 29.3,
      • 1,
      • "30.1",
      • 28.84,
      • 2.64,
      • 30.71,
      • 2,
      • "35.5",
      • 35.05,
      • 0.93,
      • 35.71,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 19,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Small-24B-Instruct-2501</a> ๐Ÿ”ต๐Ÿ”ต๐ŸŽ–๏ธ",
      • 51.3,
      • "69.1",
      • 66.21,
      • 13.7,
      • 82.25,
      • 4,
      • "62.9",
      • 58.64,
      • 20.07,
      • 78.32,
      • 3,
      • "67.3",
      • 64.95,
      • 5.31,
      • 74.19,
      • 4,
      • "61.6",
      • 58.53,
      • 24.89,
      • 90.4,
      • 3,
      • "60.3",
      • 56.56,
      • 16.96,
      • 68.43,
      • 2,
      • "54.5",
      • 53.45,
      • 33.74,
      • 97.51,
      • 4,
      • "52.7๐Ÿ”บ",
      • 51.95,
      • 2.18,
      • 53.49,
      • 2,
      • "31.9",
      • 31.83,
      • 0.06,
      • 31.87,
      • 1,
      • "18.6",
      • 17.66,
      • 1.65,
      • 18.83,
      • 2,
      • "34.1",
      • 33.6,
      • 1.17,
      • 34.43,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 24,
      • 0,
      • true,
      • ""
      ],
    • [
      • 20,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3-medium-4k-instruct</a>",
      • 51.22,
      • "76.3",
      • 75.25,
      • 6.22,
      • 80.5,
      • 4,
      • "70.5",
      • 69.85,
      • 2.03,
      • 72.1,
      • 3,
      • "62.8",
      • 58.75,
      • 22.01,
      • 75.2,
      • 3,
      • "61.5",
      • 56.9,
      • 18.75,
      • 81.2,
      • 4,
      • "68.9",
      • 67.56,
      • 5.96,
      • 72.47,
      • 2,
      • "52.2",
      • 45.47,
      • 25.69,
      • 81.3,
      • 4,
      • "35.9",
      • 35.46,
      • 0.93,
      • 36.11,
      • 1,
      • "21.6",
      • 21.35,
      • 0.42,
      • 21.65,
      • 2,
      • "25.7",
      • 25.55,
      • 0.23,
      • 25.71,
      • 2,
      • "37.0",
      • 36.78,
      • 0.36,
      • 37.04,
      • 2,
      • "Phi3ForCausalLM",
      • "?",
      • 14,
      • 0,
      • true,
      • ""
      ],
    • [
      • 21,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Large-Instruct-2411" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-Large-Instruct-2411</a>",
      • 50.86,
      • "68.5",
      • 65.58,
      • 13.21,
      • 81.75,
      • 4,
      • "65.9",
      • 62.37,
      • 17.69,
      • 80.46,
      • 4,
      • "69.1",
      • 66.95,
      • 6.29,
      • 75.58,
      • 4,
      • "67.6",
      • 65.73,
      • 21.69,
      • 93,
      • 3,
      • "69.6",
      • 67.68,
      • 5.89,
      • 75.73,
      • 2,
      • "54.7",
      • 54.24,
      • 34.39,
      • 99,
      • 3,
      • "21.8",
      • 18.56,
      • 5.93,
      • 22.75,
      • 1,
      • "30.8",
      • 30.58,
      • 0.38,
      • 30.85,
      • 2,
      • "16.1",
      • 14.17,
      • 3.21,
      • 16.44,
      • 1,
      • "44.5",
      • 44.07,
      • 1.23,
      • 44.94,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 123,
      • 0,
      • true,
      • ""
      ],
    • [
      • 22,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-27b-it</a>",
      • 50.6,
      • "68.5",
      • 66.25,
      • 8.97,
      • 75.5,
      • 4,
      • "67.0",
      • 65.61,
      • 6.64,
      • 70.48,
      • 3,
      • "61.8",
      • 59.68,
      • 5.88,
      • 65.82,
      • 4,
      • "62.0",
      • 57.67,
      • 19.69,
      • 82.6,
      • 4,
      • "55.5",
      • 49.62,
      • 24,
      • 68.14,
      • 6,
      • "52.5",
      • 51.71,
      • 35.19,
      • 98.25,
      • 4,
      • "36.0",
      • 34.61,
      • 3.22,
      • 36.88,
      • 1,
      • "30.0",
      • 29.43,
      • 1.12,
      • 30.22,
      • 2,
      • "36.6",
      • 36.47,
      • 0.32,
      • 36.69,
      • 1,
      • "36.0",
      • 35.78,
      • 0.52,
      • 36.15,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 23,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Meta-Llama-3.1-8B-Instruct</a>",
      • 50.37,
      • "70.5",
      • 69.04,
      • 7.9,
      • 75,
      • 4,
      • "73.8",
      • 72.72,
      • 3.67,
      • 77.35,
      • 4,
      • "65.7",
      • 63.22,
      • 8.24,
      • 72.22,
      • 4,
      • "53.9",
      • 46.97,
      • 17.77,
      • 70.2,
      • 4,
      • "59.0",
      • 57.65,
      • 3.2,
      • 61.2,
      • 2,
      • "53.7",
      • 51.16,
      • 33,
      • 94.01,
      • 3,
      • "30.3",
      • 29.76,
      • 1.15,
      • 30.57,
      • 2,
      • "20.4",
      • 20.2,
      • 0.43,
      • 20.5,
      • 2,
      • "40.3",
      • 39.7,
      • 1.44,
      • 40.72,
      • 2,
      • "36.0",
      • 35.89,
      • 0.22,
      • 36.05,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 24,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3.5-mini-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3.5-mini-instruct</a>",
      • 50.06,
      • "77.3",
      • 76.92,
      • 2.32,
      • 79,
      • 2,
      • "72.8",
      • 72.21,
      • 2.05,
      • 74.57,
      • 5,
      • "67.0",
      • 65.31,
      • 6.51,
      • 71.39,
      • 4,
      • "53.2",
      • 46.5,
      • 14.96,
      • 66.4,
      • 3,
      • "65.3",
      • 63.97,
      • 3.54,
      • 68.25,
      • 6,
      • "52.7",
      • 47.92,
      • 29.83,
      • 88.03,
      • 4,
      • "24.1",
      • 23.86,
      • 0.4,
      • 24.14,
      • 2,
      • "19.9",
      • 19.71,
      • 0.25,
      • 19.89,
      • 1,
      • "32.0",
      • 31.5,
      • 1.01,
      • 32.21,
      • 1,
      • "36.2",
      • 36.15,
      • 0.21,
      • 36.3,
      • 2,
      • "Phi3ForCausalLM",
      • "?",
      • 4,
      • 0,
      • true,
      • ""
      ],
    • [
      • 25,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-27b-it</a>",
      • 49.89,
      • "68.8",
      • 65.96,
      • 15.57,
      • 80,
      • 4,
      • "67.8",
      • 65.76,
      • 9.22,
      • 73.63,
      • 4,
      • "66.7",
      • 65.32,
      • 3.51,
      • 69.74,
      • 4,
      • "64.8",
      • 62.13,
      • 23.01,
      • 90.4,
      • 3,
      • "40.6",
      • 27.58,
      • 30.52,
      • 66.58,
      • 5,
      • "53.3",
      • 52.58,
      • 35.48,
      • 98.5,
      • 3,
      • "42.4",
      • 38.66,
      • 9.74,
      • 45.55,
      • 2,
      • "29.8",
      • 29.56,
      • 0.43,
      • 29.86,
      • 2,
      • "32.5",
      • 32.32,
      • 0.39,
      • 32.6,
      • 2,
      • "32.2",
      • 30.52,
      • 3.64,
      • 33.1,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 28,
      • 0,
      • true,
      • ""
      ],
    • [
      • 26,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3-8b-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8b-Ita</a>",
      • 49.41,
      • "72.6",
      • 71.54,
      • 3.77,
      • 76,
      • 1,
      • "71.9",
      • 70.46,
      • 4.33,
      • 76.59,
      • 4,
      • "69.0",
      • 67.49,
      • 3.78,
      • 72.86,
      • 4,
      • "53.0",
      • 46.33,
      • 14.93,
      • 65.6,
      • 3,
      • "57.0",
      • 54.37,
      • 5.33,
      • 61.26,
      • 1,
      • "53.7",
      • 51.41,
      • 33.29,
      • 94.76,
      • 4,
      • "25.1",
      • 24.86,
      • 0.42,
      • 25.16,
      • 1,
      • "24.3",
      • 24.26,
      • 0.07,
      • 24.31,
      • 2,
      • "31.0",
      • 30.27,
      • 1.47,
      • 31.31,
      • 2,
      • "36.6",
      • 36.56,
      • 0,
      • 36.56,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 27,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-3-ANITA-8B-Inst-DPO-ITA</a>",
      • 49.39,
      • "72.9",
      • 71.75,
      • 3.34,
      • 76.75,
      • 1,
      • "72.1",
      • 70.82,
      • 3.81,
      • 76.05,
      • 4,
      • "71.0",
      • 69.88,
      • 2.93,
      • 74.19,
      • 4,
      • "52.8",
      • 46.4,
      • 13.88,
      • 64.4,
      • 4,
      • "56.8",
      • 52.88,
      • 10.47,
      • 63.74,
      • 1,
      • "56.5๐Ÿ”บ",
      • 52.41,
      • 27.9,
      • 88.53,
      • 3,
      • "23.3",
      • 22.92,
      • 0.64,
      • 23.37,
      • 1,
      • "29.1",
      • 28.92,
      • 0.37,
      • 29.18,
      • 2,
      • "17.2",
      • 15.77,
      • 2.46,
      • 17.51,
      • 2,
      • "42.1",
      • 41.31,
      • 2.05,
      • 42.76,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 28,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mii-llm/maestrale-chat-v0.4-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maestrale-chat-v0.4-beta</a>",
      • 49.37,
      • "73.9",
      • 72.62,
      • 6.4,
      • 78.5,
      • 4,
      • "72.2",
      • 71.75,
      • 1.77,
      • 73.57,
      • 2,
      • "66.9",
      • 64.99,
      • 8.41,
      • 71.56,
      • 3,
      • "55.6",
      • 50.7,
      • 10.77,
      • 64.6,
      • 3,
      • "60.4",
      • 59.48,
      • 2.2,
      • 61.76,
      • 6,
      • "51.7",
      • 43.81,
      • 22.67,
      • 74.06,
      • 3,
      • "22.5",
      • 22.36,
      • 0.32,
      • 22.59,
      • 1,
      • "21.2",
      • 21.18,
      • 0.01,
      • 21.18,
      • 2,
      • "33.5",
      • 33.24,
      • 0.62,
      • 33.68,
      • 1,
      • "35.8",
      • 35.36,
      • 0.98,
      • 36.06,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 29,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aya-expanse-8b</a>",
      • 49.3,
      • "72.9",
      • 71.88,
      • 4.42,
      • 76.25,
      • 2,
      • "72.3",
      • 71.9,
      • 1.11,
      • 73.3,
      • 5,
      • "67.7",
      • 65.83,
      • 5.06,
      • 72.71,
      • 4,
      • "52.5",
      • 46.27,
      • 13.05,
      • 63.4,
      • 4,
      • "62.0",
      • 59.58,
      • 4.03,
      • 66.98,
      • 3,
      • "54.4",
      • 51.87,
      • 31.88,
      • 94.01,
      • 4,
      • "20.6",
      • 20.37,
      • 0.33,
      • 20.6,
      • 2,
      • "18.3",
      • 18.27,
      • 0.08,
      • 18.33,
      • 1,
      • "35.7",
      • 35.67,
      • 0.13,
      • 35.77,
      • 2,
      • "36.6",
      • 36.56,
      • 0,
      • 36.56,
      • 1,
      • "CohereForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 30,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.3-70B-Instruct</a>",
      • 47.69,
      • "68.2",
      • 65.29,
      • 10.42,
      • 79.5,
      • 4,
      • "63.4",
      • 59.3,
      • 18.97,
      • 80.31,
      • 4,
      • "70.2",
      • 67.84,
      • 8.28,
      • 78.84,
      • 3,
      • "61.7",
      • 58.93,
      • 25.42,
      • 91.4,
      • 3,
      • "57.8",
      • 52.8,
      • 19.32,
      • 69.15,
      • 2,
      • "54.3",
      • 53.74,
      • 34.58,
      • 98.75,
      • 3,
      • "28.4",
      • 23.67,
      • 9.62,
      • 30.48,
      • 2,
      • "24.4",
      • 24.12,
      • 0.6,
      • 24.54,
      • 2,
      • "20.8",
      • 18.96,
      • 3.32,
      • 21.31,
      • 1,
      • "27.6",
      • 27.26,
      • 0.59,
      • 27.67,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 71,
      • 0,
      • true,
      • ""
      ],
    • [
      • 31,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-2-9b-it</a> ๐Ÿ”ต๐ŸŽ–๏ธ",
      • 47.54,
      • "72.0",
      • 70.12,
      • 8.85,
      • 79,
      • 2,
      • "64.3",
      • 61.74,
      • 11.1,
      • 70.63,
      • 4,
      • "64.2",
      • 61.92,
      • 7.4,
      • 69.42,
      • 4,
      • "57.9",
      • 52.27,
      • 19.76,
      • 77.8,
      • 4,
      • "42.5",
      • 30.36,
      • 28.96,
      • 64.67,
      • 5,
      • "51.5",
      • 50.58,
      • 34.19,
      • 98,
      • 4,
      • "25.4",
      • 25.29,
      • 0.3,
      • 25.5,
      • 2,
      • "30.5",
      • 30.1,
      • 0.78,
      • 30.65,
      • 2,
      • "32.0",
      • 31.63,
      • 0.8,
      • 32.2,
      • 1,
      • "35.0",
      • 34.56,
      • 0.9,
      • 35.19,
      • 1,
      • "Gemma2ForCausalLM",
      • "?",
      • 10,
      • 0,
      • true,
      • ""
      ],
    • [
      • 32,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-12b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-12b-it</a>",
      • 47.35,
      • "71.3",
      • 69.25,
      • 13.64,
      • 79,
      • 4,
      • "61.9",
      • 57.59,
      • 16.98,
      • 74.6,
      • 4,
      • "67.2",
      • 65.51,
      • 4.28,
      • 71.4,
      • 3,
      • "58.6",
      • 53.63,
      • 23.4,
      • 83.4,
      • 4,
      • "46.1",
      • 35.76,
      • 22.92,
      • 66.76,
      • 5,
      • "52.9",
      • 52.33,
      • 35.77,
      • 98.75,
      • 3,
      • "21.9",
      • 20.17,
      • 3.05,
      • 22.33,
      • 2,
      • "29.9",
      • 29.46,
      • 0.95,
      • 30.13,
      • 2,
      • "31.5",
      • 29.49,
      • 4.22,
      • 32.47,
      • 1,
      • "32.2",
      • 32.16,
      • 0.16,
      • 32.27,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 13,
      • 0,
      • true,
      • ""
      ],
    • [
      • 33,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-7B-Instruct-v0.3</a>",
      • 47.31,
      • "74.4",
      • 73.5,
      • 3.09,
      • 77.5,
      • 2,
      • "69.3",
      • 67.96,
      • 5.46,
      • 72.97,
      • 6,
      • "64.4",
      • 63.62,
      • 1.73,
      • 66.03,
      • 4,
      • "49.3",
      • 43.33,
      • 10.84,
      • 57.4,
      • 3,
      • "53.9",
      • 47.58,
      • 15.97,
      • 66.16,
      • 4,
      • "52.9",
      • 51.46,
      • 34.9,
      • 96.76,
      • 3,
      • "18.2",
      • 18.16,
      • 0.08,
      • 18.22,
      • 1,
      • "20.6",
      • 20.3,
      • 0.51,
      • 20.66,
      • 2,
      • "33.2",
      • 30.65,
      • 5.48,
      • 34.52,
      • 2,
      • "36.9",
      • 36.85,
      • 0.1,
      • 36.92,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 34,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/DeepMount00/Lexora-Medium-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lexora-Medium-7B</a>",
      • 46.93,
      • "71.8",
      • 69.54,
      • 11.89,
      • 83.75,
      • 4,
      • "57.4",
      • 51.92,
      • 16.82,
      • 70.24,
      • 4,
      • "64.3",
      • 62.94,
      • 3.07,
      • 67.16,
      • 3,
      • "50.6",
      • 42.37,
      • 23.91,
      • 74,
      • 3,
      • "62.4",
      • 60.14,
      • 7.94,
      • 67.04,
      • 2,
      • "53.5",
      • 52.54,
      • 34.93,
      • 97.76,
      • 3,
      • "40.9",
      • 40.46,
      • 1,
      • 41.17,
      • 2,
      • "28.4",
      • 27.76,
      • 1.18,
      • 28.6,
      • 2,
      • "27.5",
      • 27.28,
      • 0.46,
      • 27.61,
      • 2,
      • "12.5",
      • 10.19,
      • 3.68,
      • 12.79,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 35,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-4b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-4b-it</a>",
      • 46.57,
      • "68.3",
      • 66.46,
      • 5.92,
      • 73.25,
      • 2,
      • "75.3",
      • 74.96,
      • 1.38,
      • 76.37,
      • 4,
      • "65.0",
      • 62.43,
      • 7.17,
      • 71.53,
      • 4,
      • "51.9",
      • 46.53,
      • 10.36,
      • 60,
      • 4,
      • "49.0",
      • 41.31,
      • 16.77,
      • 61.17,
      • 6,
      • "51.7",
      • 50.04,
      • 35.71,
      • 96.51,
      • 3,
      • "14.6",
      • 14.06,
      • 0.83,
      • 14.64,
      • 1,
      • "18.0",
      • 17.95,
      • 0.01,
      • 17.96,
      • 1,
      • "36.0",
      • 35.79,
      • 0.47,
      • 36.12,
      • 1,
      • "36.0",
      • 35.98,
      • 0.08,
      • 36.04,
      • 1,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 36,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-7B-Instruct</a>",
      • 45.5,
      • "75.2",
      • 73.67,
      • 13.22,
      • 82.5,
      • 4,
      • "57.2",
      • 51.62,
      • 13.4,
      • 70.43,
      • 4,
      • "64.6",
      • 61.54,
      • 7.85,
      • 72.54,
      • 4,
      • "49.7",
      • 41.03,
      • 22.65,
      • 70.4,
      • 3,
      • "45.9",
      • 35.48,
      • 24.69,
      • 67.11,
      • 5,
      • "52.8",
      • 51.58,
      • 33.77,
      • 97.26,
      • 4,
      • "33.0",
      • 32.42,
      • 1.12,
      • 33.22,
      • 2,
      • "25.4",
      • 24.9,
      • 0.93,
      • 25.56,
      • 2,
      • "15.0",
      • 10.24,
      • 7.94,
      • 15.86,
      • 2,
      • "36.3",
      • 36.11,
      • 0.35,
      • 36.35,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 37,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/FairMind/Llama-3-8B-4bit-UltraChat-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8B-4bit-UltraChat-Ita</a>",
      • 44.93,
      • "65.0",
      • 62.62,
      • 7.72,
      • 71,
      • 1,
      • "69.4",
      • 68.97,
      • 1.12,
      • 70.56,
      • 4,
      • "61.5",
      • 60.03,
      • 3.48,
      • 64.16,
      • 4,
      • "47.0",
      • 39.57,
      • 12.67,
      • 56.8,
      • 3,
      • "52.8",
      • 47.59,
      • 10.75,
      • 60.88,
      • 3,
      • "52.9",
      • 48.26,
      • 29.59,
      • 88.53,
      • 4,
      • "22.5",
      • 21.8,
      • 1.22,
      • 22.67,
      • 2,
      • "21.4",
      • 21.25,
      • 0.28,
      • 21.45,
      • 1,
      • "29.8",
      • 29.17,
      • 1.21,
      • 30.02,
      • 2,
      • "27.0",
      • 26.27,
      • 1.41,
      • 27.27,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 38,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-4b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-4b-it</a>",
      • 44.59,
      • "66.2",
      • 64.88,
      • 5.21,
      • 69.25,
      • 3,
      • "60.4",
      • 57.66,
      • 8.12,
      • 65.65,
      • 5,
      • "59.8",
      • 57.32,
      • 10.14,
      • 64.25,
      • 5,
      • "46.0",
      • 38.87,
      • 12.57,
      • 54.6,
      • 3,
      • "50.8",
      • 44.13,
      • 21.85,
      • 61.44,
      • 5,
      • "52.5",
      • 48.67,
      • 32.61,
      • 90.77,
      • 3,
      • "32.4",
      • 30.72,
      • 3.47,
      • 33.18,
      • 1,
      • "29.2",
      • 28.93,
      • 0.62,
      • 29.37,
      • 2,
      • "29.8",
      • 28.92,
      • 1.71,
      • 30.13,
      • 2,
      • "18.7",
      • 18.34,
      • 0.62,
      • 18.78,
      • 2,
      • "Gemma3ForConditionalGeneration",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 39,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3.5-mini-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3.5-mini-instruct</a>",
      • 44.4,
      • "72.2",
      • 70.04,
      • 11.65,
      • 81.5,
      • 4,
      • "51.6",
      • 43.72,
      • 20.43,
      • 70.59,
      • 3,
      • "65.9",
      • 64.49,
      • 3.24,
      • 69.17,
      • 4,
      • "48.9",
      • 40.67,
      • 17.65,
      • 62.8,
      • 3,
      • "60.4",
      • 56.97,
      • 13.07,
      • 67.41,
      • 5,
      • "51.5",
      • 44.22,
      • 25.29,
      • 79.05,
      • 3,
      • "20.4",
      • 18.28,
      • 3.75,
      • 20.94,
      • 2,
      • "23.2",
      • 22.7,
      • 1,
      • 23.4,
      • 2,
      • "30.6",
      • 30.35,
      • 0.47,
      • 30.68,
      • 2,
      • "19.3",
      • 14.4,
      • 8.78,
      • 20.61,
      • 1,
      • "Phi3ForCausalLM",
      • "?",
      • 4,
      • 0,
      • true,
      • ""
      ],
    • [
      • 40,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-14B-Instruct-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen2.5-14B-Instruct-1M</a>",
      • 44.36,
      • "69.5",
      • 66.88,
      • 14.54,
      • 86.75,
      • 4,
      • "54.1",
      • 47.09,
      • 22.28,
      • 74.98,
      • 3,
      • "60.0",
      • 54.95,
      • 20.19,
      • 75.11,
      • 4,
      • "53.3",
      • 47.3,
      • 27.9,
      • 83.4,
      • 3,
      • "49.0",
      • 40.07,
      • 28.87,
      • 67.2,
      • 6,
      • "54.4",
      • 53.74,
      • 34.68,
      • 98.5,
      • 3,
      • "36.8",
      • 35.98,
      • 1.77,
      • 37.23,
      • 2,
      • "25.2",
      • 25.1,
      • 0.13,
      • 25.19,
      • 2,
      • "8.2",
      • 6.63,
      • 2.34,
      • 8.28,
      • 2,
      • "33.3",
      • 33.06,
      • 0.54,
      • 33.45,
      • 1,
      • "Qwen2ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 41,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zephyr-7b-beta</a>",
      • 43.84,
      • "70.7",
      • 69.46,
      • 3.73,
      • 74.25,
      • 2,
      • "69.9",
      • 69.11,
      • 1.52,
      • 71.92,
      • 6,
      • "60.4",
      • 57.29,
      • 7.15,
      • 66.49,
      • 4,
      • "49.3",
      • 43.3,
      • 10.2,
      • 57.4,
      • 3,
      • "40.2",
      • 30.87,
      • 11.92,
      • 49.42,
      • 4,
      • "42.8",
      • 34.37,
      • 9.46,
      • 51.87,
      • 4,
      • "11.4",
      • 11.42,
      • 0.04,
      • 11.45,
      • 2,
      • "24.3",
      • 23.94,
      • 0.59,
      • 24.36,
      • 2,
      • "32.8",
      • 32.72,
      • 0.2,
      • 32.87,
      • 1,
      • "36.5",
      • 36.47,
      • 0.13,
      • 36.56,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 42,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/MoxoffSpA/Volare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Volare</a>",
      • 43.73,
      • "66.9",
      • 65.17,
      • 5.77,
      • 71,
      • 2,
      • "68.4",
      • 66.88,
      • 4.66,
      • 72.33,
      • 4,
      • "49.6",
      • 45.34,
      • 9.24,
      • 54.73,
      • 4,
      • "48.9",
      • 41.83,
      • 13.46,
      • 59.2,
      • 3,
      • "44.4",
      • 36.76,
      • 15.28,
      • 53.1,
      • 1,
      • "49.8",
      • 41.36,
      • 21.25,
      • 75.06,
      • 4,
      • "28.0",
      • 27.25,
      • 1.47,
      • 28.29,
      • 1,
      • "17.5",
      • 17.37,
      • 0.19,
      • 17.5,
      • 2,
      • "31.6",
      • 31.49,
      • 0.12,
      • 31.57,
      • 2,
      • "32.3",
      • 31.3,
      • 2.18,
      • 32.85,
      • 1,
      • "GemmaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 43,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/occiglot/occiglot-7b-it-en-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">occiglot-7b-it-en-instruct</a>",
      • 43.69,
      • "64.3",
      • 61.83,
      • 6.5,
      • 70.25,
      • 2,
      • "69.0",
      • 68.43,
      • 1.37,
      • 70.48,
      • 4,
      • "61.8",
      • 58.3,
      • 11.61,
      • 69.78,
      • 4,
      • "54.2",
      • 49.43,
      • 9.4,
      • 61.8,
      • 4,
      • "55.7",
      • 51.2,
      • 11.26,
      • 63.65,
      • 3,
      • "51.6",
      • 43.68,
      • 22,
      • 73.07,
      • 4,
      • "5.6",
      • 5.4,
      • 0.3,
      • 5.61,
      • 2,
      • "21.2",
      • 20.55,
      • 1.26,
      • 21.44,
      • 2,
      • "25.1",
      • 24.93,
      • 0.41,
      • 25.22,
      • 2,
      • "28.3",
      • 27.2,
      • 2.14,
      • 28.71,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 44,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Almawave/Velvet-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Velvet-14B</a>",
      • 42.69,
      • "75.0",
      • 74.46,
      • 2.19,
      • 76.75,
      • 2,
      • "68.3",
      • 67.99,
      • 0.98,
      • 68.93,
      • 1,
      • "67.2",
      • 66.35,
      • 2.31,
      • 68.97,
      • 4,
      • "54.2",
      • 49.73,
      • 8.58,
      • 61.2,
      • 4,
      • "29.3",
      • 15.41,
      • 14.11,
      • 37.69,
      • 6,
      • "49.3",
      • 41.73,
      • 21.95,
      • 80.55,
      • 4,
      • "9.4",
      • 9.21,
      • 0.34,
      • 9.45,
      • 1,
      • "34.7",
      • 34.25,
      • 0.88,
      • 34.88,
      • 2,
      • "24.8",
      • 24.48,
      • 0.66,
      • 24.94,
      • 2,
      • "14.8",
      • 13.08,
      • 2.86,
      • 15.1,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 45,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3.1-SuperNova-Lite</a>",
      • 42.66,
      • "70.1",
      • 68.54,
      • 6.62,
      • 74.5,
      • 4,
      • "66.8",
      • 63.8,
      • 12.28,
      • 77.12,
      • 4,
      • "56.4",
      • 52.21,
      • 12.37,
      • 63.8,
      • 5,
      • "51.5",
      • 43.63,
      • 19.37,
      • 68.6,
      • 3,
      • "50.8",
      • 42.88,
      • 24.72,
      • 66.58,
      • 6,
      • "54.9",
      • 51.16,
      • 30.2,
      • 90.52,
      • 4,
      • "23.7",
      • 22.04,
      • 3.17,
      • 24.29,
      • 2,
      • "22.8",
      • 22.74,
      • 0.05,
      • 22.78,
      • 2,
      • "10.0",
      • 8.86,
      • 1.76,
      • 10.11,
      • 2,
      • "19.6",
      • 17.82,
      • 3.17,
      • 20.06,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 46,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/Fastweb/FastwebMIIA-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FastwebMIIA-7B</a>",
      • 42.56,
      • "59.2",
      • 56.79,
      • 4.22,
      • 63.5,
      • 2,
      • "58.6",
      • 56.18,
      • 5.14,
      • 62.72,
      • 3,
      • "54.5",
      • 49.7,
      • 10.75,
      • 62.31,
      • 3,
      • "49.5",
      • 44.47,
      • 8.35,
      • 55.8,
      • 3,
      • "56.6",
      • 52.09,
      • 11.01,
      • 64.81,
      • 3,
      • "48.9",
      • 40.07,
      • 19.11,
      • 65.84,
      • 3,
      • "11.9",
      • 11.49,
      • 0.64,
      • 11.94,
      • 1,
      • "28.1",
      • 27.83,
      • 0.52,
      • 28.2,
      • 2,
      • "29.0",
      • 28.43,
      • 1.14,
      • 29.24,
      • 1,
      • "29.4",
      • 28.16,
      • 2.51,
      • 29.93,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 47,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Phi-3-medium-4k-instruct</a>",
      • 42.09,
      • "66.0",
      • 62.58,
      • 11.76,
      • 78.5,
      • 3,
      • "55.3",
      • 48.78,
      • 19.52,
      • 72.58,
      • 3,
      • "65.2",
      • 62.35,
      • 8.99,
      • 72.75,
      • 4,
      • "54.1",
      • 47.67,
      • 25.17,
      • 80.2,
      • 3,
      • "53.5",
      • 46.92,
      • 30.89,
      • 66.67,
      • 1,
      • "53.2",
      • 48.67,
      • 30.21,
      • 88.53,
      • 4,
      • "0.3",
      • 0.31,
      • 0.04,
      • 0.34,
      • 2,
      • "27.6",
      • 26.52,
      • 2.21,
      • 28.08,
      • 2,
      • "18.6",
      • 17.48,
      • 1.97,
      • 18.87,
      • 1,
      • "27.0",
      • 26.99,
      • 0.02,
      • 27.01,
      • 2,
      • "Phi3ForCausalLM",
      • "?",
      • 14,
      • 0,
      • true,
      • ""
      ],
    • [
      • 48,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3n-E4B-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3n-E4B-it</a>",
      • 42.05,
      • "68.0",
      • 65.58,
      • 9.41,
      • 75.5,
      • 3,
      • "64.7",
      • 61.45,
      • 12.94,
      • 73.95,
      • 3,
      • "66.4",
      • 64.2,
      • 6.07,
      • 71.93,
      • 4,
      • "53.3",
      • 46.07,
      • 20.99,
      • 73.2,
      • 3,
      • "39.7",
      • 26.32,
      • 26.78,
      • 67.4,
      • 6,
      • "53.1",
      • 49.92,
      • 32.9,
      • 92.52,
      • 4,
      • "5.9",
      • 3.44,
      • 3.75,
      • 6.1,
      • 2,
      • "29.8",
      • 29.4,
      • 0.74,
      • 29.92,
      • 2,
      • "9.9",
      • 8.13,
      • 2.72,
      • 10.05,
      • 2,
      • "29.7",
      • 29.07,
      • 1.32,
      • 30.01,
      • 1,
      • "Gemma3nForConditionalGeneration",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 49,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-3-ANITA-8B-Inst-DPO-ITA</a>",
      • 41.74,
      • "62.1",
      • 58.92,
      • 8.59,
      • 69.5,
      • 2,
      • "64.0",
      • 60.96,
      • 12.61,
      • 72.04,
      • 4,
      • "48.6",
      • 39.59,
      • 30.9,
      • 66.32,
      • 6,
      • "48.9",
      • 40.73,
      • 16.57,
      • 62.2,
      • 4,
      • "57.3",
      • 52.6,
      • 20.2,
      • 66.57,
      • 6,
      • "51.0",
      • 42.85,
      • 18.14,
      • 71.82,
      • 4,
      • "19.4",
      • 19.35,
      • 0.03,
      • 19.37,
      • 1,
      • "22.6",
      • 22.37,
      • 0.48,
      • 22.71,
      • 2,
      • "22.6",
      • 19.02,
      • 6.75,
      • 23.79,
      • 1,
      • "20.8",
      • 17.81,
      • 5.37,
      • 21.61,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 50,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mistral-7B-Instruct-v0.3</a>",
      • 41.56,
      • "63.8",
      • 60.83,
      • 6.86,
      • 71.25,
      • 1,
      • "61.2",
      • 58.71,
      • 7.98,
      • 66.14,
      • 6,
      • "59.2",
      • 55.86,
      • 14.5,
      • 65.39,
      • 5,
      • "46.0",
      • 39.23,
      • 11.54,
      • 54,
      • 3,
      • "62.9",
      • 61.08,
      • 6.85,
      • 66.49,
      • 2,
      • "53.3",
      • 47.63,
      • 27.27,
      • 84.79,
      • 4,
      • "9.9",
      • 6.47,
      • 5.36,
      • 10.26,
      • 2,
      • "28.1",
      • 27.28,
      • 1.63,
      • 28.43,
      • 2,
      • "9.7",
      • 9.64,
      • 0.13,
      • 9.73,
      • 1,
      • "21.5",
      • 20.29,
      • 2.18,
      • 21.83,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 51,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/mii-llm/maestrale-chat-v0.4-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maestrale-chat-v0.4-beta</a>",
      • 41.04,
      • "60.2",
      • 55.42,
      • 10.3,
      • 73.25,
      • 4,
      • "63.2",
      • 61.22,
      • 8.35,
      • 67.4,
      • 4,
      • "61.5",
      • 60.62,
      • 3.45,
      • 63.07,
      • 6,
      • "51.9",
      • 45.17,
      • 15.17,
      • 63.8,
      • 3,
      • "46.4",
      • 36.19,
      • 27.34,
      • 66.84,
      • 6,
      • "52.4",
      • 46.09,
      • 24.45,
      • 83.04,
      • 4,
      • "20.4",
      • 19.22,
      • 2.07,
      • 20.68,
      • 2,
      • "25.7",
      • 25.14,
      • 1.07,
      • 25.89,
      • 2,
      • "5.0",
      • 4.62,
      • 0.6,
      • 5.05,
      • 1,
      • "23.8",
      • 21.98,
      • 3.32,
      • 24.33,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 52,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3-8b-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8b-Ita</a>",
      • 41.02,
      • "58.6",
      • 56,
      • 6.41,
      • 63,
      • 2,
      • "62.6",
      • 60.33,
      • 8.3,
      • 67.21,
      • 5,
      • "49.1",
      • 40.78,
      • 31.64,
      • 63.86,
      • 6,
      • "49.0",
      • 40.83,
      • 17.06,
      • 63,
      • 4,
      • "55.4",
      • 49.72,
      • 25.38,
      • 67.12,
      • 6,
      • "51.2",
      • 46.22,
      • 25.61,
      • 88.03,
      • 4,
      • "22.6",
      • 21.46,
      • 2.07,
      • 22.93,
      • 2,
      • "22.8",
      • 22.51,
      • 0.47,
      • 22.84,
      • 1,
      • "25.1",
      • 20.84,
      • 8.18,
      • 26.62,
      • 1,
      • "13.8",
      • 12.08,
      • 2.79,
      • 14.06,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 53,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Meta-Llama-3.1-8B-Instruct</a>",
      • 40.23,
      • "62.7",
      • 60.62,
      • 4.93,
      • 67,
      • 1,
      • "63.1",
      • 59.8,
      • 11.31,
      • 71.12,
      • 5,
      • "51.4",
      • 44.49,
      • 18.77,
      • 63.26,
      • 5,
      • "50.5",
      • 42.27,
      • 20.77,
      • 69,
      • 4,
      • "55.9",
      • 50.4,
      • 22.1,
      • 66.93,
      • 6,
      • "53.1",
      • 48.29,
      • 27.52,
      • 87.78,
      • 4,
      • "19.9",
      • 15.75,
      • 7.51,
      • 21.06,
      • 2,
      • "22.3",
      • 22.3,
      • 0.06,
      • 22.34,
      • 1,
      • "7.9",
      • 7.57,
      • 0.55,
      • 7.96,
      • 1,
      • "15.4",
      • 15.08,
      • 0.57,
      • 15.48,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 54,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aya-expanse-8b</a>",
      • 39.85,
      • "64.0",
      • 60.33,
      • 10.96,
      • 75,
      • 1,
      • "63.2",
      • 60.16,
      • 11.1,
      • 70.62,
      • 4,
      • "61.4",
      • 60.48,
      • 3.13,
      • 62.97,
      • 4,
      • "46.7",
      • 38.53,
      • 14.81,
      • 57.8,
      • 4,
      • "52.8",
      • 45.95,
      • 30.08,
      • 66.19,
      • 3,
      • "47.6",
      • 38.03,
      • 14.59,
      • 66.33,
      • 4,
      • "15.2",
      • 11.62,
      • 6.07,
      • 15.92,
      • 2,
      • "19.1",
      • 18.84,
      • 0.52,
      • 19.21,
      • 2,
      • "19.1",
      • 17.36,
      • 3.03,
      • 19.5,
      • 2,
      • "9.4",
      • 7.52,
      • 2.9,
      • 9.57,
      • 2,
      • "CohereForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 55,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Almawave/Velvet-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Velvet-14B</a>",
      • 39.48,
      • "70.0",
      • 67.62,
      • 7.72,
      • 78.5,
      • 1,
      • "62.5",
      • 60.49,
      • 6.86,
      • 66.59,
      • 5,
      • "59.3",
      • 55.93,
      • 12.94,
      • 65.64,
      • 3,
      • "48.8",
      • 42.4,
      • 11.14,
      • 57.4,
      • 4,
      • "47.3",
      • 37.91,
      • 30.77,
      • 64.31,
      • 5,
      • "50.1",
      • 45.47,
      • 26.63,
      • 89.53,
      • 4,
      • "0.1",
      • 0.06,
      • 0.09,
      • 0.13,
      • 1,
      • "31.1",
      • 31.09,
      • 0.03,
      • 31.11,
      • 2,
      • "15.9",
      • 15.75,
      • 0.24,
      • 15.91,
      • 2,
      • "9.7",
      • 6.96,
      • 4.28,
      • 9.98,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 56,
      • "๐Ÿ”ต๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/microsoft/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">phi-4</a>",
      • 38.37,
      • "63.9",
      • 60.42,
      • 7.82,
      • 73.25,
      • 4,
      • "50.5",
      • 42.31,
      • 23.16,
      • 74.26,
      • 4,
      • "61.4",
      • 60.41,
      • 3.23,
      • 63.15,
      • 4,
      • "55.7",
      • 50.13,
      • 25.25,
      • 83.2,
      • 3,
      • "54.7",
      • 48.23,
      • 27.02,
      • 69.01,
      • 3,
      • "53.5",
      • 51.45,
      • 31.22,
      • 95.26,
      • 4,
      • "0.0",
      • 0,
      • 0,
      • 0,
      • 1,
      • "22.5",
      • 13.28,
      • 17.55,
      • 25.69,
      • 1,
      • "0.4",
      • 0.18,
      • 0.26,
      • 0.36,
      • 1,
      • "21.2",
      • 19.38,
      • 3.22,
      • 21.66,
      • 1,
      • "Phi3ForCausalLM",
      • "?",
      • 15,
      • 0,
      • true,
      • ""
      ],
    • [
      • 57,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/occiglot/occiglot-7b-it-en-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">occiglot-7b-it-en-instruct</a>",
      • 38,
      • "52.6",
      • 49.92,
      • 4.48,
      • 56,
      • 4,
      • "55.5",
      • 51.87,
      • 9.82,
      • 61.09,
      • 6,
      • "53.8",
      • 48.48,
      • 20.94,
      • 62.72,
      • 1,
      • "48.0",
      • 42.8,
      • 8.92,
      • 54,
      • 3,
      • "50.8",
      • 42.86,
      • 28.9,
      • 66.49,
      • 5,
      • "49.6",
      • 42.89,
      • 21.86,
      • 83.29,
      • 4,
      • "2.8",
      • 1.56,
      • 1.87,
      • 2.88,
      • 2,
      • "27.7",
      • 25.5,
      • 4.33,
      • 28.56,
      • 2,
      • "7.0",
      • 5.26,
      • 2.58,
      • 7.09,
      • 1,
      • "32.3",
      • 32.03,
      • 0.6,
      • 32.45,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 58,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-2-7b-hf-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-2-7b-hf-ITA</a>",
      • 37.86,
      • "54.0",
      • 53.21,
      • 1.55,
      • 55,
      • 3,
      • "56.6",
      • 52.85,
      • 8.85,
      • 62.87,
      • 5,
      • "52.3",
      • 46.57,
      • 12.64,
      • 61.35,
      • 3,
      • "31.7",
      • 29.47,
      • 2.29,
      • 32.8,
      • 4,
      • "51.5",
      • 45.13,
      • 13.92,
      • 62,
      • 3,
      • "34.0",
      • 30.51,
      • 2.74,
      • 35.91,
      • 4,
      • "19.7",
      • 19.32,
      • 0.74,
      • 19.84,
      • 2,
      • "20.7",
      • 20.62,
      • 0.12,
      • 20.7,
      • 2,
      • "29.3",
      • 29.08,
      • 0.4,
      • 29.36,
      • 2,
      • "28.9",
      • 28.76,
      • 0.19,
      • 28.9,
      • 1,
      • "LlamaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 59,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.1-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">granite-3.1-8b-instruct</a>",
      • 37.26,
      • "56.3",
      • 51.08,
      • 8.65,
      • 67,
      • 2,
      • "54.4",
      • 47.82,
      • 20.82,
      • 69.45,
      • 4,
      • "48.1",
      • 39.51,
      • 30.59,
      • 62.64,
      • 1,
      • "49.0",
      • 41.97,
      • 12.63,
      • 59.4,
      • 3,
      • "50.2",
      • 42,
      • 32.2,
      • 66.85,
      • 5,
      • "54.7",
      • 51.29,
      • 31.07,
      • 91.52,
      • 4,
      • "0.2",
      • 0.08,
      • 0.12,
      • 0.17,
      • 2,
      • "30.3",
      • 30.04,
      • 0.49,
      • 30.39,
      • 1,
      • "18.1",
      • 16.03,
      • 3.59,
      • 18.57,
      • 2,
      • "11.2",
      • 9.17,
      • 3.25,
      • 11.47,
      • 2,
      • "GraniteForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 60,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zephyr-7b-beta</a>",
      • 37.26,
      • "60.7",
      • 58.62,
      • 5.3,
      • 64.5,
      • 1,
      • "51.7",
      • 45.08,
      • 16.59,
      • 63.08,
      • 6,
      • "55.3",
      • 50.36,
      • 24.41,
      • 64.26,
      • 5,
      • "43.6",
      • 36.7,
      • 11.19,
      • 50.8,
      • 3,
      • "57.9",
      • 53.45,
      • 16.85,
      • 66.76,
      • 5,
      • "44.4",
      • 34.5,
      • 11.74,
      • 58.35,
      • 4,
      • "2.8",
      • 2.62,
      • 0.33,
      • 2.85,
      • 2,
      • "27.2",
      • 27.06,
      • 0.2,
      • 27.2,
      • 2,
      • "18.4",
      • 18.15,
      • 0.48,
      • 18.49,
      • 2,
      • "10.4",
      • 8.61,
      • 2.87,
      • 10.64,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 61,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/openai/gpt-oss-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt-oss-20b</a>",
      • 37.2,
      • "57.1",
      • 53.71,
      • 4.72,
      • 62.75,
      • 1,
      • "58.4",
      • 56.51,
      • 3.11,
      • 61.54,
      • 1,
      • "57.1",
      • 53.93,
      • 11.27,
      • 62.26,
      • 6,
      • "44.6",
      • 34.17,
      • 17.74,
      • 61,
      • 3,
      • "59.4",
      • 57.25,
      • 6.71,
      • 62.99,
      • 1,
      • "50.7",
      • 48.79,
      • 35.9,
      • 96.01,
      • 3,
      • "1.2",
      • 0.61,
      • 0.79,
      • 1.17,
      • 2,
      • "8.8",
      • 5.06,
      • 5.89,
      • 9.23,
      • 1,
      • "1.9",
      • 1.62,
      • 0.4,
      • 1.9,
      • 1,
      • "32.8",
      • 31.88,
      • 2.04,
      • 33.32,
      • 1,
      • "GptOssForCausalLM",
      • "?",
      • 2,
      • 0,
      • true,
      • ""
      ],
    • [
      • 62,
      • "๐Ÿ”ต๐Ÿ”ต๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-4-Scout-17B-16E-Instruct</a>",
      • 36.52,
      • "56.3",
      • 54.75,
      • 2.78,
      • 58.5,
      • 5,
      • "52.6",
      • 48.54,
      • 7.57,
      • 58.15,
      • 4,
      • "45.8",
      • 42.59,
      • 7.88,
      • 48.91,
      • 4,
      • "23.6",
      • 19.33,
      • 3.17,
      • 25,
      • 4,
      • "56.5",
      • 55.02,
      • 2.22,
      • 58.49,
      • 3,
      • "40.1",
      • 31.67,
      • 8.69,
      • 47.88,
      • 4,
      • "5.7",
      • 5.32,
      • 0.57,
      • 5.72,
      • 2,
      • "21.5",
      • 20.9,
      • 1.03,
      • 21.62,
      • 2,
      • "26.9",
      • 26.7,
      • 0.48,
      • 27.05,
      • 2,
      • "36.2",
      • 35.75,
      • 1.03,
      • 36.47,
      • 2,
      • "Llama4ForConditionalGeneration",
      • "?",
      • 109,
      • 0,
      • true,
      • ""
      ],
    • [
      • 63,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/FairMind/Llama-3-8B-4bit-UltraChat-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Llama-3-8B-4bit-UltraChat-Ita</a>",
      • 36.28,
      • "60.3",
      • 58.08,
      • 4.16,
      • 64.25,
      • 6,
      • "53.0",
      • 46.01,
      • 15.04,
      • 67.3,
      • 5,
      • "54.1",
      • 48.72,
      • 15.55,
      • 63.38,
      • 5,
      • "41.4",
      • 33.37,
      • 11.69,
      • 49.2,
      • 3,
      • "66.2",
      • 66.04,
      • 0.62,
      • 66.58,
      • 4,
      • "43.9",
      • 34.54,
      • 10.55,
      • 55.61,
      • 4,
      • "0.0",
      • 0,
      • 0,
      • 0,
      • 1,
      • "24.2",
      • 23.6,
      • 1.06,
      • 24.35,
      • 2,
      • "15.6",
      • 13.72,
      • 3.15,
      • 15.95,
      • 2,
      • "4.2",
      • 2.62,
      • 2.33,
      • 4.27,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 9,
      • 0,
      • true,
      • ""
      ],
    • [
      • 64,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-instruct-v1.0</a>",
      • 35.7,
      • "67.2",
      • 65.33,
      • 8.14,
      • 72.25,
      • 3,
      • "57.7",
      • 54.91,
      • 9.45,
      • 62.25,
      • 6,
      • "41.6",
      • 30.89,
      • 20.05,
      • 54.29,
      • 5,
      • "37.8",
      • 37.27,
      • 1.12,
      • 38.2,
      • 2,
      • "45.5",
      • 36.89,
      • 19.24,
      • 56.73,
      • 5,
      • "29.9",
      • 28.64,
      • 1.69,
      • 30.42,
      • 4,
      • "5.0",
      • 4.88,
      • 0.18,
      • 5.01,
      • 2,
      • "15.4",
      • 15.12,
      • 0.4,
      • 15.4,
      • 2,
      • "27.2",
      • 27.04,
      • 0.39,
      • 27.32,
      • 2,
      • "29.7",
      • 29.36,
      • 0.67,
      • 29.83,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 65,
      • "๐Ÿ”ต",
      • "5๏ธโƒฃ",
      • true,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-base-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-base-v1.0</a>",
      • 35.06,
      • "68.7",
      • 66.79,
      • 8.95,
      • 74.25,
      • 4,
      • "53.9",
      • 53.14,
      • 1.48,
      • 54.7,
      • 5,
      • "39.5",
      • 34.45,
      • 10.03,
      • 43.34,
      • 5,
      • "38.0",
      • 36.83,
      • 1.62,
      • 38.8,
      • 6,
      • "44.0",
      • 34.55,
      • 19.47,
      • 56,
      • 5,
      • "28.4",
      • 27.27,
      • 2.3,
      • 28.93,
      • 1,
      • "9.5",
      • 9.51,
      • 0.03,
      • 9.53,
      • 2,
      • "15.9",
      • 15.92,
      • 0.05,
      • 15.95,
      • 1,
      • "24.4",
      • 24.29,
      • 0.15,
      • 24.4,
      • 1,
      • "28.2",
      • 26.64,
      • 3.1,
      • 28.83,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 66,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/openai/gpt-oss-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt-oss-20b</a>",
      • 34.95,
      • "52.5",
      • 50.08,
      • 4.71,
      • 55.5,
      • 1,
      • "58.1",
      • 53.39,
      • 11.42,
      • 68.01,
      • 3,
      • "49.6",
      • 41.82,
      • 22.51,
      • 62.59,
      • 2,
      • "48.1",
      • 38.87,
      • 26.09,
      • 73.2,
      • 3,
      • "49.0",
      • 40.39,
      • 29.4,
      • 65.14,
      • 2,
      • "39.2",
      • 31.26,
      • 7.77,
      • 45.89,
      • 3,
      • "0.0",
      • 0,
      • 0,
      • 0,
      • 1,
      • "18.3",
      • 17.78,
      • 0.92,
      • 18.43,
      • 1,
      • "1.7",
      • 1.5,
      • 0.27,
      • 1.7,
      • 1,
      • "33.1",
      • 32.45,
      • 1.3,
      • 33.37,
      • 1,
      • "GptOssForCausalLM",
      • "?",
      • 2,
      • 0,
      • true,
      • ""
      ],
    • [
      • 67,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/Fastweb/FastwebMIIA-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FastwebMIIA-7B</a>",
      • 34.71,
      • "61.1",
      • 57.75,
      • 5.47,
      • 68.25,
      • 4,
      • "53.0",
      • 47.2,
      • 11.99,
      • 62.9,
      • 3,
      • "52.1",
      • 46.36,
      • 22.64,
      • 60.99,
      • 1,
      • "39.8",
      • 33.07,
      • 8.59,
      • 45.4,
      • 4,
      • "58.7",
      • 54.69,
      • 14.35,
      • 66.58,
      • 2,
      • "39.8",
      • 32.38,
      • 7.26,
      • 46.13,
      • 4,
      • "0.0",
      • 0,
      • 0,
      • 0,
      • 1,
      • "29.2",
      • 26.83,
      • 4.77,
      • 30.2,
      • 2,
      • "9.3",
      • 9.32,
      • 0.03,
      • 9.34,
      • 1,
      • "4.1",
      • 2.41,
      • 2.53,
      • 4.2,
      • 1,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 68,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-instruct-v1.0</a>",
      • 32.5,
      • "55.2",
      • 54.12,
      • 3.76,
      • 56.5,
      • 4,
      • "50.6",
      • 44.51,
      • 11.58,
      • 59.46,
      • 6,
      • "47.6",
      • 39.22,
      • 22.8,
      • 60.48,
      • 2,
      • "32.2",
      • 28.87,
      • 4.14,
      • 34,
      • 6,
      • "57.7",
      • 53.42,
      • 14.91,
      • 66.04,
      • 5,
      • "35.2",
      • 31.05,
      • 3.36,
      • 37.66,
      • 4,
      • "0.0",
      • 0,
      • 0,
      • 0,
      • 1,
      • "16.3",
      • 16.22,
      • 0.2,
      • 16.36,
      • 2,
      • "9.6",
      • 9.16,
      • 0.71,
      • 9.66,
      • 1,
      • "20.6",
      • 17.75,
      • 5.13,
      • 21.38,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 8,
      • 0,
      • true,
      • ""
      ],
    • [
      • 69,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/sapienzanlp/Minerva-7B-base-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minerva-7B-base-v1.0</a>",
      • 32.36,
      • "56.9",
      • 54.46,
      • 4.94,
      • 60.75,
      • 1,
      • "56.6",
      • 50.62,
      • 13.19,
      • 71.33,
      • 6,
      • "53.3",
      • 48.04,
      • 13.54,
      • 61.82,
      • 2,
      • "31.4",
      • 27.7,
      • 4.01,
      • 33.2,
      • 5,
      • "61.5",
      • 59.12,
      • 9.19,
      • 66.13,
      • 5,
      • "29.1",
      • 27.81,
      • 1.8,
      • 29.68,
      • 5,
      • "0.0",
      • 0,
      • 0.01,
      • 0.01,
      • 2,
      • "16.3",
      • 16.04,
      • 0.38,
      • 16.31,
      • 1,
      • "9.6",
      • 9.62,
      • 0.04,
      • 9.64,
      • 2,
      • "8.9",
      • 8.09,
      • 1.27,
      • 8.99,
      • 2,
      • "MistralForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 70,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-2-7b-hf-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLaMAntino-2-7b-hf-ITA</a>",
      • 32.07,
      • "54.5",
      • 53.5,
      • 4.05,
      • 55.75,
      • 6,
      • "51.3",
      • 45.96,
      • 8.92,
      • 59.06,
      • 6,
      • "57.0",
      • 53.57,
      • 12.58,
      • 62.77,
      • 5,
      • "24.6",
      • 23.37,
      • 1.56,
      • 25,
      • 3,
      • "55.9",
      • 50.67,
      • 26.48,
      • 66.31,
      • 5,
      • "29.2",
      • 28.76,
      • 0.75,
      • 29.43,
      • 4,
      • "0.1",
      • 0.03,
      • 0.04,
      • 0.06,
      • 2,
      • "26.7",
      • 25.86,
      • 1.63,
      • 27.01,
      • 2,
      • "13.0",
      • 12.55,
      • 0.78,
      • 13.1,
      • 2,
      • "8.3",
      • 7.39,
      • 1.43,
      • 8.4,
      • 2,
      • "LlamaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 71,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/MoxoffSpA/Volare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Volare</a>",
      • 30.23,
      • "53.1",
      • 51.25,
      • 4.56,
      • 55.5,
      • 6,
      • "50.1",
      • 44.15,
      • 10.09,
      • 58.49,
      • 6,
      • "50.7",
      • 43.57,
      • 19.46,
      • 62.7,
      • 2,
      • "26.1",
      • 22.77,
      • 3.36,
      • 27.4,
      • 3,
      • "40.8",
      • 27.8,
      • 29.8,
      • 66.4,
      • 5,
      • "27.9",
      • 27.35,
      • 0.86,
      • 28.18,
      • 1,
      • "0.0",
      • 0.02,
      • 0.03,
      • 0.04,
      • 1,
      • "23.3",
      • 23.15,
      • 0.24,
      • 23.32,
      • 1,
      • "10.8",
      • 10.63,
      • 0.22,
      • 10.78,
      • 2,
      • "19.5",
      • 15.95,
      • 6.24,
      • 20.37,
      • 1,
      • "GemmaForCausalLM",
      • "?",
      • 0,
      • 0,
      • true,
      • ""
      ],
    • [
      • 72,
      • "๐Ÿ”ต",
      • "๐Ÿ…พ๏ธ",
      • false,
      • "<a target="_blank" href="https://huggingface.co/google/gemma-3-270m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gemma-3-270m</a>",
      • 30.14,
      • "52.2",
      • 49.83,
      • 4.72,
      • 55,
      • 4,
      • "43.7",
      • 38.96,
      • 8.72,
      • 48.12,
      • 3,
      • "59.6",
      • 57.79,
      • 5.08,
      • 62.7,
      • 5,
      • "23.2",
      • 19.5,
      • 2.6,
      • 24.4,
      • 4,
      • "54.3",
      • 48.11,
      • 28.37,
      • 66.67,
      • 4,
      • "27.3",
      • 26.81,
      • 0.56,
      • 27.43,
      • 5,
      • "0.0",
      • 0,
      • 0,
      • 0,
      • 1,
      • "18.4",
      • 17.93,
      • 0.73,
      • 18.45,
      • 2,
      • "9.7",
      • 9.58,
      • 0.16,
      • 9.69,
      • 2,
      • "13.1",
      • 7.5,
      • 9.18,
      • 13.99,
      • 1,
      • "Gemma3ForCausalLM",
      • "?",
      • 1,
      • 0,
      • true,
      • ""
      ]
    ],
  • "metadata": null
}
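For readers who want to work with this payload programmatically, the sketch below shows one way to turn the flat rows into records keyed by the header names and rank models by combined performance. It is a minimal sketch, assuming the payload has been exported as well-formed JSON to a local file (`leaderboard.json` is a placeholder name; the rendered view above escapes the embedded HTML differently); the `headers`/`data` layout mirrors the structure shown above.

```python
import json

# Minimal sketch: load the leaderboard payload shown above, assuming it
# has been saved as well-formed JSON ("leaderboard.json" is a placeholder).
with open("leaderboard.json", encoding="utf-8") as f:
    payload = json.load(f)

headers = payload["headers"]

# Each entry in "data" is a flat list aligned position-by-position with
# "headers", so zipping the two yields one record per model.
records = [dict(zip(headers, row)) for row in payload["data"]]

# Rank by the combined-performance column (its header includes an emoji).
records.sort(key=lambda r: r["Avg. Comb. Perf. ⬆️"], reverse=True)
for r in records[:3]:
    print(r["Rank"], r["Avg. Comb. Perf. ⬆️"], r["Architecture"])
```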
Theoretical performance of a hypothetical model that achieved the best observed score on every individual task: 75.00
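This ceiling is presumably obtained by taking, for each of the ten tasks, the best combined score reached by any model and averaging those ten maxima. The following sketch reconstructs that computation under that assumption, reusing `records` from the snippet above; note that the per-task cells are strings that may carry a marker emoji (e.g. "80.9🔺").

```python
# Hypothetical reconstruction of the ceiling: best observed score per task,
# averaged over all ten tasks. Reuses `records` from the previous sketch.
TASKS = ["TE", "SA", "HS", "AT", "WIC", "FAQ", "LS", "SU", "NER", "REL"]

def to_score(cell):
    # Task cells are strings such as "80.9" or "80.9🔺";
    # keep only digits and the decimal point before converting.
    return float("".join(ch for ch in cell if ch.isdigit() or ch == "."))

best_per_task = [max(to_score(r[t]) for r in records) for t in TASKS]
ceiling = sum(best_per_task) / len(TASKS)
print(f"Theoretical ceiling: {ceiling:.2f}")
```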

This project has benefited from the following support:

  • 🧠 Codebase: Based on and extended from the Open Italian LLM Leaderboard, developed by Alessandro Ercolani and Samuele Colombo. We warmly thank them for their invaluable support and guidance in implementing this leaderboard.

  • 💶 Funding: Partially supported by the PNRR project FAIR - Future AI Research (PE00000013), under the NRRP MUR program funded by NextGenerationEU.

  • ๐Ÿ–ฅ๏ธ Computation: We gratefully acknowledge CINECA for granting access to the LEONARDO supercomputer.