| sha (string, 40-40) | text (string, 1-13.4M) | id (string, 2-117) | tags (sequence, 1-7.91k) | created_at (string, 25-25) | metadata (string, 2-875k) | last_modified (string, 25-25) | arxiv (sequence, 0-25) | languages (sequence, 0-7.91k) | tags_str (string, 17-159k) | text_str (string, 1-447k) | text_lists (sequence, 0-352) | processed_texts (sequence, 1-353) | tokens_length (sequence, 1-353) | input_texts (sequence, 1-40) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
93b675daa2bd4cc110a3c18ecf722bc2e7868c6a | # Task Name
- **FLAN-2021 -> 70 tasks**
```json
{
"ag_news_subset": null,
"ai2_arc_ARC-Challenge": null,
"ai2_arc_ARC-Easy": null,
"aeslc": null,
"anli_r1": null,
"anli_r2": null,
"anli_r3": null,
"bool_q": null,
"cnn_dailymail": null,
"coqa": null,
"cosmos_qa": null,
"definite_pronoun_resolution": null,
"drop": null,
"fix_punct": null,
"gem_common_gen": null,
"gem_dart": null,
"gem_e2e_nlg": null,
"gem_web_nlg_en": null,
"gem_wiki_lingua_english_en": null,
"gigaword": null,
"glue_cola": null,
"glue_mnli": null,
"glue_mrpc": null,
"glue_qnli": null,
"glue_qqp": null,
"glue_sst2": null,
"glue_stsb": null,
"glue_wnli": null,
"hellaswag": null,
"huggingface_xsum": null,
"imdb_reviews_plain_text": null,
"lambada": null,
"math_dataset_algebra__linear_1d": null,
"multi_news": null,
"natural_questions_open": null,
"newsroom": null,
"openbookqa": null,
"opinion_abstracts_idebate": null,
"opinion_abstracts_rotten_tomatoes": null,
"para_crawl_enes": null,
"paws_wiki": null,
"piqa": null,
"quac": null,
"samsum": null,
"sentiment140": null,
"snli": null,
"squad_v1_1": null,
"squad_v2_0": null,
"story_cloze_2016": null,
"super_glue_cb": null,
"super_glue_copa": null,
"super_glue_multirc": null,
"super_glue_record": null,
"super_glue_rte": null,
"super_glue_wic": null,
"super_glue_wsc_fixed": null,
"trec": null,
"trivia_qa_rc": null,
"true_case": null,
"unified_qa_science_inst": null,
"winogrande": null,
"word_segment": null,
"wmt14_translate_fr-en": null,
"wmt16_translate_cs-en": null,
"wmt16_translate_de-en": null,
"wmt16_translate_fi-en": null,
"wmt16_translate_ro-en": null,
"wmt16_translate_ru-en": null,
"wmt16_translate_tr-en": null,
"yelp_polarity_reviews": null
}
``` | aslawliet/flan_zeroshot | [
"task_categories:text-generation",
"task_categories:text-classification",
"task_categories:token-classification",
"task_categories:question-answering",
"task_categories:zero-shot-classification",
"task_categories:translation",
"task_categories:summarization",
"size_categories:10M<n<100M",
"language:en",
"license:cc-by-4.0",
"region:us"
] | 2024-02-16T14:17:08+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["10M<n<100M"], "task_categories": ["text-generation", "text-classification", "token-classification", "question-answering", "zero-shot-classification", "translation", "summarization"]} | 2024-02-16T15:39:32+00:00 | [] | [
"en"
] |
208390616700865bf6cfc22b0af2f515dc496ccc |
# Dataset Card for OpenHermes-2.5-1k-longest
<!-- Provide a quick summary of the dataset. -->
OpenHermes-2.5-1k-longest is a dataset of 1,000 samples derived from [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) using the [Long is More for Alignment](https://huggingface.co/papers/2402.04833) protocol. This protocol consists of selecting the 1,000 longest responses and provides a strong baseline to measure performance against. For example, fine-tuning [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on this dataset using similar hyperparameters to those given in the paper produces a chat model that achieves a score of ~7.0 on MT-Bench, which is comparable in performance to training over the full ~1 million examples.
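As a rough sketch of the selection protocol (not the dataset's actual creation script), the 1,000 longest responses can be picked out as follows. This assumes OpenHermes-2.5's ShareGPT-style schema, where each example carries a `conversations` list of turns with `from` and `value` fields:

```python
from datasets import load_dataset

ds = load_dataset("teknium/OpenHermes-2.5", split="train")

def response_length(example):
    # Total character count of the assistant ("gpt") turns in one conversation
    return {
        "response_length": sum(
            len(turn["value"])
            for turn in example["conversations"]
            if turn["from"] == "gpt"
        )
    }

# Score every example, then keep the 1,000 with the longest responses
ds = ds.map(response_length)
longest_1k = ds.sort("response_length", reverse=True).select(range(1000))
```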
We found that stratified sampling across the subsets of [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) produced slightly better models than uniform sampling. The result is a dataset with the following proportions across each `source`:
| source | proportion |
|:----------------------|-------------:|
| glaive-code-assist | 0.362 |
| CamelAI | 0.155 |
| metamath | 0.112 |
| EvolInstruct_70k | 0.103 |
| cot_alpaca_gpt4 | 0.083 |
| airoboros2.2 | 0.07 |
| platypus | 0.044 |
| GPT-4 Comparison Data | 0.03 |
| UnnaturalInstructions | 0.017 |
| CogStackMed | 0.009 |
| LMSys Chatbot Arena | 0.006 |
| caseus_custom | 0.005 |
| lmsys1m | 0.003 |
| Econ_domain_expert | 0.001 |
See the [`create_dataset.py`](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest/blob/main/create_dataset.py) script for details on how the dataset was constructed.
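A compressed illustration of the stratified variant is sketched below; the authoritative logic lives in `create_dataset.py`, and the proportions dict here is abridged to three sources for brevity:

```python
import math
from datasets import concatenate_datasets, load_dataset

ds = load_dataset("teknium/OpenHermes-2.5", split="train")

# Character count of assistant turns, as in the earlier sketch
ds = ds.map(lambda ex: {"response_length": sum(
    len(t["value"]) for t in ex["conversations"] if t["from"] == "gpt")})

# Abridged per-source quotas taken from the proportions table above
proportions = {"glaive-code-assist": 0.362, "CamelAI": 0.155, "metamath": 0.112}

strata = []
for source, frac in proportions.items():
    subset = ds.filter(lambda ex, s=source: ex["source"] == s)
    quota = min(math.ceil(1000 * frac), len(subset))
    # Within each stratum, keep the longest responses
    strata.append(subset.sort("response_length", reverse=True).select(range(quota)))

stratified_1k = concatenate_datasets(strata)
```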
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
OpenHermes-2.5-1k-longest is suitable for training chat models via supervised fine-tuning (SFT). To load the dataset run:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
ds = load_dataset("HuggingFaceH4/OpenHermes-2.5-1k-longest")
# Load a tokenizer and apply chat template
tokenizer = AutoTokenizer.from_pretrained("teknium/OpenHermes-2.5-Mistral-7B")
example = ds["train_sft"][0]
formatted_example = tokenizer.apply_chat_template(example["messages"], tokenize=False)
print(formatted_example)
```
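From here, a supervised fine-tuning run can be configured with a trainer that understands the `messages` format. The sketch below is illustrative rather than the recipe from the paper; it assumes a recent version of `trl`, which applies the tokenizer's chat template to conversational datasets automatically, and the output directory is a placeholder:

```python
from trl import SFTConfig, SFTTrainer

# Training hyperparameters are omitted for brevity; see the paper for the
# settings used to reach ~7.0 on MT-Bench.
trainer = SFTTrainer(
    model="mistralai/Mistral-7B-v0.1",
    train_dataset=ds["train_sft"],
    args=SFTConfig(output_dir="mistral-7b-openhermes-1k-longest"),
)
trainer.train()
```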
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each example has the following structure:
```python
{
"source": "glaive-code-assist",
"category": None,
"messages": [
{
"content": "How can I copy code to StationA and StationB and set the hostnames for each station in the code?",
"role": "user",
},
{
"content": 'To copy code to StationA and StationB and set the hostnames, you can use the following code:\n\n```python\nfrom __future__ import with_statement, print_function\nfrom fabric.api import local, settings, abort, run, sudo, cd, hosts, env, execute\nfrom fabric.contrib.console import confirm\nfrom fabric.operations import put, get\nfrom fabric.contrib.project import rsync_project\n\nimport re\nimport subprocess as sp \nimport os.path \nfrom StringIO import StringIO\n\n\ncurrent_dir = os.path.dirname(os.path.realpath(__file__))\n\n\nVNC_LICENSE = [\n "xxxxx-xxxxx-xxxxx-xxxxx-xxxxx"\n]\n\n\n# TODO: Put a proper deployment mechanism here.\nenv.key_filename = \'/home/alcides/.ssh/zunzun_ec2_keypair_0.pem\'\n\nStationA_H = \'[email protected]\'\nStationB_H = \'[email protected]\'\nBeefy_H = \'[email protected]\'\n# TODO: Make the IP number below deployment-specific...\nBeefy_InternalIP = \'192.168.112.131\'\nStationA_InternalIP = \'192.168.112.129\'\nStationB_InternalIP = \'192.168.112.130\'\nHomeDir_Name = "ubuntu"\n\n\n@hosts(StationA_H)\ndef StationA():\n """\n Copies code to StationA \n """\n rsync_project(\n local_dir = "scripts/StationA",\n remote_dir = ("/home/{HomeDir_Name}/".format(HomeDir_Name=HomeDir_Name))\n )\n run("ln -sf /home/{HomeDir_Name}/StationA/onstartup.py /home/{HomeDir_Name}/onstartup.py".format(HomeDir_Name=HomeDir_Name))\n\n\n@hosts(StationB_H)\ndef setup_dns_masq():\n sudo("apt-get install -y dnsmasq")\n put(\n local_path = StringIO("addn-hosts=/home/{HomeDir_Name}/dnsmasq_more.conf\\n".format(HomeDir_Name=HomeDir_Name)),\n remote_path = "/etc/dnsmasq.conf",\n use_sudo=True)\n\n\n@hosts(StationB_H)\ndef StationB():\n """\n Copies both the chrome plugin and the DNSMasq watcher \n """\n rsync_project(\n local_dir = "scripts/StationB",\n remote_dir = ("/home/{HomeDir_Name}/".format(HomeDir_Name=HomeDir_Name))\n )\n rsync_project(\n local_dir = "scripts/StationA/chrome_captures_hars",\n remote_dir = (("/home/{HomeDir_Name}/StationB/".format(HomeDir_Name=HomeDir_Name)).format(HomeDir_Name=HomeDir_Name))\n )\n run("ln -sf /home/{HomeDir_Name}/StationB/onstartup.py /home/{HomeDir_Name}/onstartup.py".format(HomeDir_Name=HomeDir_Name))\n\n\n@hosts(StationB_H)\ndef install_updatednsmasq_service():\n with settings(warn_only=True):\n sudo("service updatednsmasq stop")\n put(\n local_path = "scripts/StationB/configure_dnsmasq.py",\n remote_path = "/home/{HomeDir_Name}/StationB/configure_dnsmasq.py".format(HomeDir_Name=HomeDir_Name) ,\n use_sudo = True\n )\n put(\n local_path = StringIO("""\ndescription "Update dnsmasq"\n\nstart on runlevel [2345]\nstop on runlevel [!2345]\n\numask 022\n\nconsole log\n\nenv PATH=/opt/openssl-1.0.2/bin/:/usr/bin:/usr/local/bin:/usr/sbin:/bin \nexport PATH\nenv LD_LIBRARY_PATH=/opt/openssl-1.0.2/lib\nexport LD_LIBRARY_PATH\nenv USER={HomeDir_Name}\nexport USER\n\nscript \n exec /usr/bin/python /home/{HomeDir_Name}/StationB/configure_dnsmasq.py\nend script\n\n""".format(HomeDir_Name=HomeDir_Name)),\n remote_path = "/etc/init/updatednsmasq.conf",\n use_sudo=True )\n sudo("service updatednsmasq start")\n\n\n@hosts(Beefy_H)\ndef Beefy():\n sudo("apt-get update")\n sudo("apt-get -y install libgmp-dev")\n\n\n@hosts(Beefy_H)\ndef BeefyRehMimic():\n with settings(warn_only=True):\n sudo("service mimic stop")\n put(\n local_path = "dist/build/reh-mimic/reh-mimic",\n remote_path = "/home/{HomeDir_Name}/reh-mimic".format(HomeDir_Name=HomeDir_Name)\n )\n run("chmod ugo+x /home/{HomeDir_Name}/reh-mimic".format(HomeDir_Name=HomeDir_Name))\n sudo("rm 
/home/{HomeDir_Name}/mimic -rf".format(HomeDir_Name=HomeDir_Name) )\n rsync_project(\n local_dir = "mimic",\n remote_dir = "/home/{HomeDir_Name}/".format(HomeDir_Name=HomeDir_Name),\n )\n put(\n local_path = "scripts/mimic.conf",\n remote_path = "/etc/init/mimic.conf",\n use_sudo = True\n )\n sudo("touch /root/.rnd")\n sudo("service mimic start")\n\n\n@hosts(Beefy_H, StationA_H, StationB_H )\ndef configure_logging():\n if env.host_string == StationA_H:\n put(\n local_path = StringIO("""$template Logentries,"199fb2e1-8227-4f73-9150-70a34a5d5e0c %HOSTNAME% %syslogtag%%msg%\\\\n"\n*.* @@api.logentries.com:10000;Logentries"""),\n remote_path = "/etc/rsyslog.d/70-logentries.conf",\n use_sudo = True )\n elif env.host_string == StationB_H:\n put(\n local_path = StringIO("""$template Logentries,"3d2fd756-407a-4764-b130-1dd6f22a1b62 %HOSTNAME% %syslogtag%%msg%\\\\n"\n*.* @@api.logentries.com:10000;Logentries"""),\n remote_path = "/etc/rsyslog.d/70-logentries.conf",\n use_sudo = True )\n else:\n put(\n local_path = StringIO("""$template Logentries,"7551d4e0-fa76-466f-8547-8c9a347a9363 %HOSTNAME% %syslogtag%%msg%\\\\n"\n*.* @@api.logentries.com:10000;Logentries"""),\n remote_path = "/etc/rsyslog.d/70-logentries.conf",\n use_sudo = True )\n \n sudo("service rsyslog restart")\n # Check logging works...\n sudo("logger -t test Hello there Logentries")\n\n\n@hosts (StationA_H, StationB_H)\ndef deploy_specific():\n if env.host_string == StationA_H:\n print("StationA deploy")\n StationA()\n elif env.host_string == StationB_H:\n print("StationB deploy")\n StationB()\n else: \n print("Beefy station deploy")\n Beefy()\n\n\n@hosts(StationA_H, StationB_H)\ndef apt_stations():\n sudo("apt-get update")\n sudo("apt-get install -y xutils xbase-clients xfonts-base xfonts-75dpi xfonts-100dpi")\n sudo("apt-get install -y python-pip")\n sudo("apt-get install -y xdotool")\n sudo("apt-get install -y xfwm4") \n\n\n@hosts(StationA_H, StationB_H)\ndef pythonlibs():\n sudo("pip install python-daemon>=2.0")\n sudo("pip install raven")\n\n\n@hosts(Beefy_H, StationA_H, StationB_H)\ndef ssl():\n """\n Copies Openssl and curl to the target hosts...\n """\n sudo("mkdir -p /opt/openssl-1.0.2/")\n sudo(("chown {HomeDir_Name} /opt/openssl-1.0.2/".format(HomeDir_Name=HomeDir_Name)))\n rsync_project(\n local_dir = "/opt/openssl-1.0.2",\n remote_dir = "/opt/",\n extra_opts="-avz"\n )\n put(\n local_path = "scripts/ca-certificates.crt",\n remote_path = "/etc/ssl/certs/ca-certificates.crt",\n use_sudo = True\n )\n\n\n@hosts(Beefy_H, StationA_H, StationB_H)\ndef ca():\n """\n Copies the ca certificate to the home...\n """\n put(\n local_path = "mimic-here/config/ca/cacert.pem",\n remote_path = ("/home/{HomeDir_Name}/cacert.pem".format(HomeDir_Name=HomeDir_Name)),\n use_sudo = True\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef install_vnc():\n """\n \n """\n # run("curl -L -o VNC.tar.gz https://www.realvnc.com/download/binary/1720/")\n # run("tar xvf VNC-5.2.3-Linux-x64-ANY.tar.gz")\n\n\n put(\n local_path = "scripts/VNC-5.2.3-Linux-x64-ANY.tar.gz",\n remote_path = ("/home/{HomeDir_Name}/VNC-5.2.3-Linux-x64-ANY.tar.gz".format(HomeDir_Name=HomeDir_Name)),\n use_sudo = True\n )\n run(("tar -xzf /home/{HomeDir_Name}/VNC-5.2.3-Linux-x64-ANY.tar.gz".format(HomeDir_Name=HomeDir_Name)))\n # Get a handier name.... 
\n run("rm -rf vnc")\n run(("mv /home/{HomeDir_Name}/VNC-5.2.3-Linux-x64 /home/{HomeDir_Name}/vnc".format(HomeDir_Name=HomeDir_Name)))\n sudo(("/home/{HomeDir_Name}/vnc/vnclicense -add {VncLicense}".format(\n HomeDir_Name= HomeDir_Name,\n VncLicense = VNC_LICENSE[0]\n )))\n # Will demand some for of interactive input...\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/".format(HomeDir_Name=HomeDir_Name)))\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/config.d/".format(HomeDir_Name=HomeDir_Name)))\n sudo(("/home/{HomeDir_Name}/vnc/vncpasswd /home/{HomeDir_Name}/.vnc/config.d/Xvnc".format(HomeDir_Name=HomeDir_Name)))\n vnc_fix_permissions()\n\n@hosts(StationA_H, StationB_H)\ndef vnc_fix_permissions():\n sudo(("chown {HomeDir_Name} /home/{HomeDir_Name}/.vnc -R").format(HomeDir_Name=HomeDir_Name))\n\n@hosts(StationA_H, StationB_H)\ndef install_vnc_xstartup():\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/".format(HomeDir_Name=HomeDir_Name)))\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/config.d/".format(HomeDir_Name=HomeDir_Name)))\n put(\n local_path = "scripts/vnc-xstartup",\n remote_path = ("/home/{HomeDir_Name}/.vnc/xstartup".format(HomeDir_Name=HomeDir_Name))\n )\n run("chmod ugo+x /home/{HomeDir_Name}/.vnc/xstartup".format(HomeDir_Name=HomeDir_Name))\n put(\n local_path = "scripts/xvncfontpath",\n remote_path = ("/home/{HomeDir_Name}/.vnc/config.d/xvncfontpath".format(HomeDir_Name=HomeDir_Name))\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef setup_google_chrome():\n put(\n local_path = "scripts/google-chrome-stable_current_amd64.deb",\n remote_path = ("/home/{HomeDir_Name}/google-chrome-stable_current_amd64.deb".format(HomeDir_Name=HomeDir_Name)),\n use_sudo = True\n )\n really_setup_google_chrome()\n\n\n@hosts(Beefy_H, StationA_H, StationB_H)\ndef ensure_local_hosts():\n # Get the contents of /etc/hosts\n local_file = StringIO()\n get(\n local_path = local_file,\n remote_path = "/etc/hosts",\n use_sudo = True \n )\n hosts_file = local_file.getvalue()\n snippet = """# DO NOT EDIT BELOW BY HAND\n{Beefy_InternalIP} instr.httpdos.com\n192.168.112.129 ip-192-168-112-129\n192.168.112.130 ip-192-168-112-130\n192.168.112.131 ip-192-168-112-131\n# END DO NOT EDIT BELOW""".format(\n StationA_InternalIP = StationA_InternalIP,\n Beefy_InternalIP = Beefy_InternalIP\n )\n mo = re.search(r"# DO NOT EDIT BELOW BY HAND\\n(.*?)\\n# END DO NOT EDIT BELOW", hosts_file, re.MULTILINE)\n if mo:\n part_before = hosts_file[:mo.start(0)]\n part_after = hosts_file[mo.end(0):]\n hosts_file = part_before + snippet + part_after\n else:\n hosts_file += "\\n" + snippet\n\n put(\n local_path = StringIO(hosts_file),\n remote_path = "/etc/hosts",\n use_sudo = True\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef really_setup_google_chrome():\n sudo("apt-get update")\n sudo(("apt-get -f install -y".format(HomeDir_Name=HomeDir_Name)))\n sudo("apt-get install -y --fix-missing xdg-utils")\n sudo(("dpkg -i --force-depends /home/{HomeDir_Name}/google-chrome-stable_current_amd64.deb".format(HomeDir_Name=HomeDir_Name)))\n sudo(("apt-get -f install -y".format(HomeDir_Name=HomeDir_Name)))\n\n\n@hosts(StationA_H, StationB_H)\ndef setup_vnc_service():\n put(\n local_path = "scripts/vncserv-{HomeDir_Name}.conf".format(HomeDir_Name=HomeDir_Name),\n remote_path = "/etc/init/vncserv.conf",\n use_sudo = True\n )\n put(\n local_path = "scripts/undaemon.py",\n remote_path = "/home/{HomeDir_Name}/undaemon.py".format(HomeDir_Name=HomeDir_Name)\n )\n run("chmod ugo+x /home/{HomeDir_Name}/undaemon.py".format(HomeDir_Name=HomeDir_Name))\n with 
settings(warn_only=True):\n sudo(\n "service vncserv start"\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef disable_lightdm():\n contents = StringIO("manual")\n put(\n local_path = contents, \n remote_path = "/etc/init/lightdm.override",\n use_sudo=True\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef touch_xauthority():\n run("touch $HOME/.Xauthority")\n\n\n@hosts(StationA_H, StationB_H)\ndef deploy():\n execute(apt_stations)\n execute(setup_dns_masq)\n execute(setup_google_chrome)\n execute(deploy_specific)\n execute(touch_xauthority)\n execute(disable_lightdm)\n execute(StationA)\n execute(StationB)\n execute(Beefy)\n execute(ca)\n execute(ssl)\n execute(install_vnc)\n execute(install_vnc_xstartup)\n execute(ensure_local_hosts)\n execute(setup_vnc_service)\n execute(pythonlibs)\n execute(BeefyRehMimic)\n execute(install_updatednsmasq_service) \n```\n\nThis code uses Fabric to copy the code to StationA and StationB. The `StationA` function copies the code to StationA, while the `StationB` function copies the code to StationB. The `setup_dns_masq` function installs and sets up dnsmasq on StationB. The `deploy_specific` function deploys the code to the appropriate station based on the host string. The `apt_stations` function installs necessary packages on both StationA and StationB. The `pythonlibs` function installs required python libraries. The `ssl` function copies the OpenSSL and curl files to the target hosts. The `ca` function copies the ca certificate to the home directory. The `install_vnc` function installs VNC on both StationA and StationB. The `install_vnc_xstartup` function sets up the VNC xstartup file. The `setup_google_chrome` function installs Google Chrome on both StationA and StationB. The `ensure_local_hosts` function updates the /etc/hosts file with the appropriate IP addresses. The `really_setup_google_chrome` function completes the setup of Google Chrome. The `setup_vnc_service` function sets up the VNC service. The `disable_lightdm` function disables lightdm. The `touch_xauthority` function creates the .Xauthority file. The `deploy` function executes all the necessary steps to deploy the code to both StationA and StationB.',
"role": "assistant",
},
],
"average_response_length": 13327.0,
}
```
Here, `source` and `category` refer to metadata present in the original OpenHermes-2.5 dataset, while `messages` contains the multi-turn conversation, which can be wrapped in a chat template such as ChatML for training. The `average_response_length` field gives the average length, in characters, of the assistant's responses in the conversation.
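For reference, a minimal sketch of how this field could be recomputed from `messages`, assuming it is the mean character count over assistant turns (and reusing `ds` from the loading snippet above):

```python
def average_response_length(messages):
    # Mean character count over the assistant turns of one conversation
    lengths = [len(m["content"]) for m in messages if m["role"] == "assistant"]
    return sum(lengths) / len(lengths)

example = ds["train_sft"][0]
print(average_response_length(example["messages"]))  # 13327.0 for the example above
```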
## Dataset Creation
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
See the [`create_dataset.py`](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest/blob/main/create_dataset.py) script for details on how the dataset was constructed.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
This dataset was derived from Teknium's high-quality [OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) dataset, which mostly comprises GPT-4 instructions and demonstrations. | HuggingFaceH4/OpenHermes-2.5-1k-longest | [
"task_categories:text-generation",
"license:other",
"sft",
"synthetic",
"arxiv:2402.04833",
"region:us"
] | 2024-02-16T14:18:39+00:00 | {"license": "other", "task_categories": ["text-generation"], "pretty_name": "OpenHermes-2.5-1k-longest", "dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "average_response_length", "dtype": "float64"}], "splits": [{"name": "train_sft", "num_bytes": 6190190, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 8730167, "num_examples": 1000}], "download_size": 5949801, "dataset_size": 14920357}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "test_sft", "path": "data/test_sft-*"}]}], "tags": ["sft", "synthetic"]} | 2024-02-16T16:32:00+00:00 | [
"2402.04833"
] | [] |
dc9f8d00ec3fffaa8af578225e2ffdca23d37a2a |
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0](https://huggingface.co/RaduGabriel/SirUkrainian2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
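As a quick sanity check, the available configurations can be enumerated with the `datasets` helper below (a minimal sketch; the ordering of the returned names is not guaranteed):

```python
from datasets import get_dataset_config_names

# Enumerate the 63 per-task configurations plus the aggregated "results" one.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0"
)
print(len(configs), configs[:5])
```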
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0",
"harness_winogrande_5",
split="train")
```
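
The aggregated metrics can be loaded in the same way from the "results" configuration; a minimal sketch, relying on the "latest" split alias described above:

```python
from datasets import load_dataset

# The "latest" split always points to the most recent run
# (here 2024-02-16T14:47:08.297350).
results = load_dataset(
    "open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0",
    "results",
    split="latest",
)
```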
## Latest results
These are the [latest results from run 2024-02-16T14:47:08.297350](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0/blob/main/results_2024-02-16T14-47-08.297350.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its task-specific configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6122899617068881,
"acc_stderr": 0.03314256377542638,
"acc_norm": 0.6163125160011517,
"acc_norm_stderr": 0.033822587397925895,
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104,
"mc2": 0.6423733209082649,
"mc2_stderr": 0.01507454376325255
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719865,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.650866361282613,
"acc_stderr": 0.004757220449283699,
"acc_norm": 0.832603067118104,
"acc_norm_stderr": 0.0037256689970413094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.02486499515976775,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.02486499515976775
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.016706617522176132,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.016706617522176132
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948855,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948855
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001855,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001855
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235403,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104,
"mc2": 0.6423733209082649,
"mc2_stderr": 0.01507454376325255
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.41015921152388174,
"acc_stderr": 0.013548335117860338
}
}
```
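
For quick post-processing, the raw file shown above can be reduced to headline numbers with a few lines of standard-library Python. A minimal sketch (it assumes the file has been downloaded locally; some dumps nest the score table under a top-level "results" key, which the `get` call below accounts for):

```python
import json

# Load the results file referenced above.
with open("results_2024-02-16T14-47-08.297350.json") as f:
    raw = json.load(f)

# Fall back to the top level if the score table is not nested.
scores = raw.get("results", raw)

# Average the 5-shot accuracies over the MMLU ("hendrycksTest") subtasks.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```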
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0 | [
"region:us"
] | 2024-02-16T14:49:26+00:00 | {"pretty_name": "Evaluation run of RaduGabriel/SirUkrainian2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0](https://huggingface.co/RaduGabriel/SirUkrainian2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T14:47:08.297350](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0/blob/main/results_2024-02-16T14-47-08.297350.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6122899617068881,\n \"acc_stderr\": 0.03314256377542638,\n \"acc_norm\": 0.6163125160011517,\n \"acc_norm_stderr\": 0.033822587397925895,\n \"mc1\": 0.4749082007343941,\n \"mc1_stderr\": 0.017481446804104,\n \"mc2\": 0.6423733209082649,\n \"mc2_stderr\": 0.01507454376325255\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719865,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.650866361282613,\n \"acc_stderr\": 0.004757220449283699,\n \"acc_norm\": 0.832603067118104,\n \"acc_norm_stderr\": 0.0037256689970413094\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693257,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693257\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n \"acc_stderr\": 0.014743125394823297,\n 
\"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n \"acc_stderr\": 0.016706617522176132,\n \"acc_norm\": 0.4782122905027933,\n \"acc_norm_stderr\": 0.016706617522176132\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.02685882587948855,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.02685882587948855\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001855,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001855\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235403,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235403\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4749082007343941,\n \"mc1_stderr\": 0.017481446804104,\n \"mc2\": 0.6423733209082649,\n \"mc2_stderr\": 0.01507454376325255\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41015921152388174,\n \"acc_stderr\": 0.013548335117860338\n }\n}\n```", "repo_url": 
"https://huggingface.co/RaduGabriel/SirUkrainian2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|winogrande|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["results_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T14-47-08.297350.parquet"]}]}]} | 2024-02-16T14:49:48+00:00 | [] | [] | TAGS
#region-us
ea23d11aa86606eadae7f8b72fa8eae3863e2545 |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-dolphin-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-dolphin-sft](https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details of one evaluated task (here: Winogrande, 5-shot).
# The "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
	"harness_winogrande_5",
	split="train")
```
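The aggregated run-level metrics live in the `results` configuration. As a minimal sketch (assuming this configuration exposes a `latest` split alongside the timestamped one, as the per-task configurations do), you can load them directly:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# points to the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
	"results",
	split="latest")
```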
## Latest results
These are the [latest results from run 2024-02-16T14:55:12.739347](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft/blob/main/results_2024-02-16T14-55-12.739347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6226391506011404,
"acc_stderr": 0.032752871244970075,
"acc_norm": 0.6284357429187831,
"acc_norm_stderr": 0.033420600664784014,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.4891471279958395,
"mc2_stderr": 0.014787543186222349
},
"harness|arc:challenge|25": {
"acc": 0.5486348122866894,
"acc_stderr": 0.014542104569955269,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650647
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.0048329345291207955,
"acc_norm": 0.8301135232025493,
"acc_norm_stderr": 0.0037476555337545158
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.01690927688493609,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.01690927688493609
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594202,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594202
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3139664804469274,
"acc_stderr": 0.01552192393352364,
"acc_norm": 0.3139664804469274,
"acc_norm_stderr": 0.01552192393352364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935722,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935722
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.4891471279958395,
"mc2_stderr": 0.014787543186222349
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.35784685367702807,
"acc_stderr": 0.013204142536119939
}
}
```
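To work with these numbers programmatically, here is a minimal sketch (assuming the JSON above has been saved locally as `results.json`, a hypothetical filename):

```python
import json

# Load the results file downloaded from this repository (hypothetical local path).
with open("results.json") as f:
    results = json.load(f)

# Headline aggregates computed by the harness across tasks.
print("mean acc_norm:", results["all"]["acc_norm"])

# Unweighted mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print("MMLU mean acc:", sum(mmlu) / len(mmlu))
```

Note that the unweighted mean over subtasks is only an approximation; the leaderboard may aggregate the MMLU subtasks differently.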
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft | [
"region:us"
] | 2024-02-16T14:57:34+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-dolphin-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-dolphin-sft](https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T14:55:12.739347](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft/blob/main/results_2024-02-16T14-55-12.739347.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6226391506011404,\n \"acc_stderr\": 0.032752871244970075,\n \"acc_norm\": 0.6284357429187831,\n \"acc_norm_stderr\": 0.033420600664784014,\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.4891471279958395,\n \"mc2_stderr\": 0.014787543186222349\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5486348122866894,\n \"acc_stderr\": 0.014542104569955269,\n \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650647\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n \"acc_stderr\": 0.0048329345291207955,\n \"acc_norm\": 0.8301135232025493,\n \"acc_norm_stderr\": 0.0037476555337545158\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.01690927688493609,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.01690927688493609\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7994891443167306,\n \"acc_stderr\": 0.014317653708594202,\n \"acc_norm\": 0.7994891443167306,\n \"acc_norm_stderr\": 0.014317653708594202\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n \"acc_stderr\": 0.01552192393352364,\n \"acc_norm\": 0.3139664804469274,\n \"acc_norm_stderr\": 0.01552192393352364\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n \"acc_stderr\": 0.012669813464935722,\n \"acc_norm\": 0.43741851368970014,\n \"acc_norm_stderr\": 0.012669813464935722\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.4891471279958395,\n \"mc2_stderr\": 0.014787543186222349\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35784685367702807,\n \"acc_stderr\": 0.013204142536119939\n 
}\n}\n```", "repo_url": "https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|winogrande|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T14_55_12.739347", "path": ["results_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T14-55-12.739347.parquet"]}]}]} | 2024-02-16T14:57:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-dolphin-sft
Dataset automatically created during the evaluation run of model CorticalStack/mistral-7b-dolphin-sft on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
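A minimal sketch (the repository id is inferred from the card title and the parquet paths in the metadata above, so treat it as an assumption):

```python
from datasets import load_dataset

# Load one task's details split for this model; "harness_winogrande_5" is one
# of the 63 task configurations listed in the metadata above.
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
                    "harness_winogrande_5",
                    split="train")
```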
## Latest results
These are the latest results from run 2024-02-16T14:55:12.739347 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-dolphin-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-dolphin-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T14:55:12.739347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-dolphin-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-dolphin-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T14:55:12.739347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CorticalStack/mistral-7b-dolphin-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-dolphin-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T14:55:12.739347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
79cce594c3b54d6014543854a3c5c8f0d6f6488b |
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian2.0DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0DPO](https://huggingface.co/RaduGabriel/SirUkrainian2.0DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO",
"harness_winogrande_5",
split="train")
```
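To pull the per-example details for another task, swap the configuration name; the per-row schema is not documented on this card, so it is worth inspecting first (a hedged sketch, using the "harness_gsm8k_5" configuration listed in the metadata):

```python
from datasets import load_dataset

# Same pattern as above with a different task configuration; "latest" always
# points at the most recent evaluation run.
gsm8k = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO",
                     "harness_gsm8k_5",
                     split="latest")

# The column layout is not documented here, so inspect it before use.
print(gsm8k.column_names)
print(gsm8k[0])
```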
## Latest results
These are the [latest results from run 2024-02-16T15:05:17.320355](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO/blob/main/results_2024-02-16T15-05-17.320355.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6121147220670339,
"acc_stderr": 0.03309664409492137,
"acc_norm": 0.61597978296096,
"acc_norm_stderr": 0.03377396962261278,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6508219588046045,
"mc2_stderr": 0.015061453784907832
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946716,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6541525592511452,
"acc_stderr": 0.004746716805735753,
"acc_norm": 0.8351921927902808,
"acc_norm_stderr": 0.003702487662126949
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072386,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072386
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.0247843169421564,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.0247843169421564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501602,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501602
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.01471168438613996,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.01471168438613996
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879695,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879695
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877662,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877662
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697046,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697046
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281515,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6508219588046045,
"mc2_stderr": 0.015061453784907832
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.41925701288855194,
"acc_stderr": 0.01359172095904211
}
}
```
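As a worked example of reading these numbers, here is a minimal sketch that averages the MMLU (hendrycksTest) sub-task accuracies from the dict above; it assumes the JSON block has been saved locally as `results.json` (a hypothetical file name):

```python
import json

# Hypothetical local copy of the JSON block shown above.
with open("results.json") as f:
    results = json.load(f)

# Mean accuracy over the MMLU (hendrycksTest) sub-tasks; the "all",
# truthfulqa, winogrande, and gsm8k entries are excluded by the prefix check.
mmlu_acc = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_acc)} MMLU sub-tasks, mean acc = {sum(mmlu_acc) / len(mmlu_acc):.4f}")
```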
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO | [
"region:us"
] | 2024-02-16T15:07:36+00:00 | {"pretty_name": "Evaluation run of RaduGabriel/SirUkrainian2.0DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0DPO](https://huggingface.co/RaduGabriel/SirUkrainian2.0DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T15:05:17.320355](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO/blob/main/results_2024-02-16T15-05-17.320355.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6121147220670339,\n \"acc_stderr\": 0.03309664409492137,\n \"acc_norm\": 0.61597978296096,\n \"acc_norm_stderr\": 0.03377396962261278,\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6508219588046045,\n \"mc2_stderr\": 0.015061453784907832\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946716,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6541525592511452,\n \"acc_stderr\": 0.004746716805735753,\n \"acc_norm\": 0.8351921927902808,\n \"acc_norm_stderr\": 0.003702487662126949\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 
0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072386,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072386\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 
0.0247843169421564,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501602,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501602\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693257,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693257\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n \"acc_stderr\": 0.01471168438613996,\n \"acc_norm\": 0.7841634738186463,\n 
\"acc_norm_stderr\": 0.01471168438613996\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879695,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879695\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48044692737430167,\n \"acc_stderr\": 0.016709709877662,\n \"acc_norm\": 0.48044692737430167,\n \"acc_norm_stderr\": 0.016709709877662\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697046,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697046\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159696,\n \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159696\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281515,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6508219588046045,\n \"mc2_stderr\": 0.015061453784907832\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41925701288855194,\n \"acc_stderr\": 0.01359172095904211\n }\n}\n```", "repo_url": "https://huggingface.co/RaduGabriel/SirUkrainian2.0DPO", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|winogrande|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["results_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T15-05-17.320355.parquet"]}]}]} | 2024-02-16T15:07:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian2.0DPO
Dataset automatically created during the evaluation run of model RaduGabriel/SirUkrainian2.0DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
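For instance (a minimal sketch: the repository name below assumes the leaderboard's usual `details_<org>__<model>` naming for this model, and `harness_winogrande_5` is one of the per-task configurations):

```python
from datasets import load_dataset

# Load one evaluation configuration; the "train" split always points to
# the latest results for that task.
data = load_dataset(
    "open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO",
    "harness_winogrande_5",
    split="train",
)
```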
## Latest results
These are the latest results from run 2024-02-16T15:05:17.320355 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
85bd8f0cc8326de958f445c4f059fafe66d23044 | # Dataset Card for "finance_dataset_subset_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jan-hq/finance_dataset_subset_binarized | [
"region:us"
] | 2024-02-16T15:10:16+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 56318865.41867357, "num_examples": 55352}, {"name": "test", "num_bytes": 6258443.076858308, "num_examples": 6151}], "download_size": 46540189, "dataset_size": 62577308.49553187}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T15:26:09+00:00 | [] | [] | TAGS
#region-us
ece087aca234ab8de35883f9648ac5f04176082c |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-slimorca-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-slimorca-sft](https://huggingface.co/CorticalStack/mistral-7b-slimorca-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft",
"harness_winogrande_5",
split="train")
```
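The same call works for any of the per-task configurations; for example, to read the most recent GSM8K details (a sketch: the configuration name assumes the `harness_<task>_<n-shot>` pattern used by these detail repositories, and the "latest" split tracks the newest run):

```python
from datasets import load_dataset

# Pull the newest GSM8K details for this model.
gsm8k = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k)
```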
## Latest results
These are the [latest results from run 2024-02-16T15:13:14.245418](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft/blob/main/results_2024-02-16T15-13-14.245418.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6048142926851883,
"acc_stderr": 0.032906396487846115,
"acc_norm": 0.6105167673880798,
"acc_norm_stderr": 0.03358845031709559,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5018293862426123,
"mc2_stderr": 0.014695173813842227
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526843,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6249751045608445,
"acc_stderr": 0.004831399218500236,
"acc_norm": 0.8316072495518821,
"acc_norm_stderr": 0.003734498979207306
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102288,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102288
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139746,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.024838811988033165,
"acc_norm": 0.6,
"acc_norm_stderr": 0.024838811988033165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712997,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712997
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.04284467968052194,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.04284467968052194
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406943,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406943
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.01479650262256256,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.01479650262256256
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630453,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630453
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567655,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567655
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866356,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5018293862426123,
"mc2_stderr": 0.014695173813842227
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710683
},
"harness|gsm8k|5": {
"acc": 0.32297194844579225,
"acc_stderr": 0.012880360794851825
}
}
```
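To work with these aggregated numbers programmatically rather than reading the JSON above, the "results" configuration mentioned earlier can be loaded the same way (a minimal sketch; the exact record schema is not documented here, so inspect the loaded dataset before relying on specific field names):

```python
from datasets import load_dataset

# Load the aggregated metrics; "latest" mirrors the newest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft",
    "results",
    split="latest",
)
print(results)  # inspect the columns before relying on specific fields
```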
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft | [
"region:us"
] | 2024-02-16T15:15:31+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-slimorca-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-slimorca-sft](https://huggingface.co/CorticalStack/mistral-7b-slimorca-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T15:13:14.245418](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft/blob/main/results_2024-02-16T15-13-14.245418.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6048142926851883,\n \"acc_stderr\": 0.032906396487846115,\n \"acc_norm\": 0.6105167673880798,\n \"acc_norm_stderr\": 0.03358845031709559,\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5018293862426123,\n \"mc2_stderr\": 0.014695173813842227\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6249751045608445,\n \"acc_stderr\": 0.004831399218500236,\n \"acc_norm\": 0.8316072495518821,\n \"acc_norm_stderr\": 0.003734498979207306\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n 
\"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.02556060472102288,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.02556060472102288\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139746,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139746\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712997,\n \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712997\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406943,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406943\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 
0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n \"acc_stderr\": 0.01479650262256256,\n \"acc_norm\": 0.2670391061452514,\n \"acc_norm_stderr\": 0.01479650262256256\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630453,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630453\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n \"acc_stderr\": 0.012671902782567655,\n \"acc_norm\": 0.4380704041720991,\n \"acc_norm_stderr\": 0.012671902782567655\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866356,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5018293862426123,\n \"mc2_stderr\": 0.014695173813842227\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710683\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32297194844579225,\n \"acc_stderr\": 0.012880360794851825\n }\n}\n```", "repo_url": 
"https://huggingface.co/CorticalStack/mistral-7b-slimorca-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|winogrande|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["results_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T15-13-14.245418.parquet"]}]}]} | 2024-02-16T15:15:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-slimorca-sft
Dataset automatically created during the evaluation run of model CorticalStack/mistral-7b-slimorca-sft on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-16T15:13:14.245418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-slimorca-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-slimorca-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T15:13:14.245418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-slimorca-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-slimorca-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T15:13:14.245418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CorticalStack/mistral-7b-slimorca-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-slimorca-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T15:13:14.245418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
## INTERS: Unlocking the Power of Large Language Models in Search with Instruction Tuning
**Authors**: Yutao Zhu, Peitian Zhang, Chenghao Zhang, Yifei Chen, Binyu Xie, Zhicheng Dou, Zheng Liu, and Ji-Rong Wen
⭐ Other data files and the fine-tuned models are being uploaded. Due to network latency, this will take several days!
<p>
📃 <a href="https://arxiv.org/abs/2401.06532">ArXiv Paper</a>
•
💡 <a href="https://github.com/DaoD/INTERS">GitHub</a>
</p>
## Introduction
Large language models (LLMs) have demonstrated impressive capabilities in various natural language processing tasks. Despite this, their application to information retrieval (IR) tasks is still challenging due to the infrequent occurrence of many IR-specific concepts in natural language. While prompt-based methods can provide task descriptions to LLMs, they often fall short in facilitating a comprehensive understanding and execution of IR tasks, thereby limiting LLMs' applicability. To address this gap, in this work, we explore the potential of instruction tuning to enhance LLMs' proficiency in IR tasks. We introduce a novel instruction tuning dataset, INTERS, encompassing 20 tasks across three fundamental IR categories: query understanding, document understanding, and query-document relationship understanding. The data are derived from 43 distinct datasets with manually written templates. Our empirical results reveal that INTERS significantly boosts the performance of various publicly available LLMs, such as LLaMA, Mistral, and Phi, in IR tasks. Furthermore, we conduct extensive experiments to analyze the effects of instruction design, template diversity, few-shot demonstrations, and the volume of instructions on performance.
## File List
- train.jsonl: The training set for the in-domain evaluation scenario.
- dev-qu-du-zero-shot/*: The dev set (query understanding tasks and document understanding tasks) for the zero-shot evaluation scenario.
- test-qu-du-zero-shot/*: The test set (query understanding tasks and document understanding tasks) for the zero-shot evaluation scenario.
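These files use the JSON Lines format, so they can be inspected without any special tooling. Below is a minimal sketch of iterating over the training set; the local path is an assumption (adjust it to wherever the file was downloaded), and the three fields are documented under File Format below.

```python
import json

# Minimal sketch: stream the INTERS training examples from a local copy.
# "train.jsonl" is an assumed local path -- adjust as needed.
with open("train.jsonl", encoding="utf-8") as f:
    for line in f:
        example = json.loads(line)
        prompt = example["prompt"]          # input given to the LLM
        completion = example["completion"]  # expected output
        source = example["source"]          # originating dataset
```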
## File Format
Each line of each file is a JSON object with the following structure:
```json
{
    "prompt": "<the input for the LLM>",
    "completion": "<the expected output from the LLM>",
    "source": "<the name of the source dataset>"
}
``` | yutaozhu94/INTERS | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-sa-4.0",
"arxiv:2401.06532",
"region:us"
] | 2024-02-16T15:16:51+00:00 | {"language": ["en"], "license": "cc-by-sa-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"]} | 2024-02-17T04:08:10+00:00 | [
"2401.06532"
] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-cc-by-sa-4.0 #arxiv-2401.06532 #region-us
5e362c1416a7a7186bc0cff6758876b0ff90e2cd |
🔧 This is still under construction 🔧
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | LucasWeber/tinyMMLU | [
"region:us"
] | 2024-02-16T15:28:43+00:00 | {"dataset_info": [{"config_name": "all", "features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 161000625, "num_examples": 99842}, {"name": "test", "num_bytes": 49618.6654322746, "num_examples": 100}, {"name": "validation", "num_bytes": 763484, "num_examples": 1531}, {"name": "dev", "num_bytes": 125353, "num_examples": 285}], "download_size": 48030609, "dataset_size": 161939080.66543227}, {"config_name": "default", "features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 161000625, "num_examples": 99842}, {"name": "test", "num_bytes": 49618.6654322746, "num_examples": 100}, {"name": "validation", "num_bytes": 763484, "num_examples": 1531}, {"name": "dev", "num_bytes": 125353, "num_examples": 285}], "download_size": 48030609, "dataset_size": 161939080.66543227}], "configs": [{"config_name": "all", "data_files": [{"split": "auxiliary_train", "path": "all/auxiliary_train-*"}, {"split": "test", "path": "all/test-*"}, {"split": "validation", "path": "all/validation-*"}, {"split": "dev", "path": "all/dev-*"}]}, {"config_name": "default", "data_files": [{"split": "auxiliary_train", "path": "data/auxiliary_train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "dev", "path": "data/dev-*"}]}]} | 2024-02-16T18:47:35+00:00 | [] | [] | TAGS
#region-us
bff8ba4d9a86891cf648d15cbff6255322eccef2 |
# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Open_Maid_Samantha_Hermes_Orca](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca",
"harness_winogrande_5",
split="train")
```
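
To enumerate the 63 task configurations instead of hard-coding one, you can use a standard `datasets` helper; this is a generic sketch rather than anything specific to this repo:

```python
from datasets import get_dataset_config_names

# List every configuration in the repo: one per evaluated task,
# plus the aggregated "results" configuration described above.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca"
)
print(len(configs), "configurations available")
```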
## Latest results
These are the [latest results from run 2024-02-16T15:26:57.957451](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca/blob/main/results_2024-02-16T15-26-57.957451.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in its timestamped split and in the "latest" split):
```python
{
"all": {
"acc": 0.6479153990110535,
"acc_stderr": 0.03205375637884698,
"acc_norm": 0.6497903447396889,
"acc_norm_stderr": 0.03269663569413109,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5391206181068439,
"mc2_stderr": 0.015128579519405813
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893449,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880533
},
"harness|hellaswag|10": {
"acc": 0.6685919139613623,
"acc_stderr": 0.004697573962169426,
"acc_norm": 0.8582951603266281,
"acc_norm_stderr": 0.003480344142139517
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371544,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079064,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5391206181068439,
"mc2_stderr": 0.015128579519405813
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569572
},
"harness|gsm8k|5": {
"acc": 0.6141015921152388,
"acc_stderr": 0.013409077471319175
}
}
```
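
For quick comparisons it can be convenient to pull the aggregate numbers out of this dictionary programmatically. A minimal sketch, assuming the JSON above has already been parsed into a Python dict named `results`:

```python
# `results` is assumed to hold the parsed JSON shown above.
aggregate = results["all"]
print(f"acc      : {aggregate['acc']:.4f} +/- {aggregate['acc_stderr']:.4f}")
print(f"acc_norm : {aggregate['acc_norm']:.4f} +/- {aggregate['acc_norm_stderr']:.4f}")

# Per-subject MMLU accuracies live under keys prefixed with "harness|hendrycksTest-".
mmlu = {
    key.split("-", 1)[1]: task["acc"]
    for key, task in results.items()
    if key.startswith("harness|hendrycksTest-")
}
print(f"macro-average MMLU acc: {sum(mmlu.values()) / len(mmlu):.4f}")
```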
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca | [
"region:us"
] | 2024-02-16T15:29:16+00:00 | {"pretty_name": "Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca", "dataset_summary": "Dataset automatically created during the evaluation run of model [giraffe176/Open_Maid_Samantha_Hermes_Orca](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T15:26:57.957451](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca/blob/main/results_2024-02-16T15-26-57.957451.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6479153990110535,\n \"acc_stderr\": 0.03205375637884698,\n \"acc_norm\": 0.6497903447396889,\n \"acc_norm_stderr\": 0.03269663569413109,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5391206181068439,\n \"mc2_stderr\": 0.015128579519405813\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893449,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880533\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6685919139613623,\n \"acc_stderr\": 0.004697573962169426,\n \"acc_norm\": 0.8582951603266281,\n \"acc_norm_stderr\": 0.003480344142139517\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n 
\"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371544,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079064,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079064\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5391206181068439,\n \"mc2_stderr\": 0.015128579519405813\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569572\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6141015921152388,\n \"acc_stderr\": 0.013409077471319175\n }\n}\n```", "repo_url": 
"https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|winogrande|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T15_26_57.957451", "path": ["results_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T15-26-57.957451.parquet"]}]}]} | 2024-02-16T15:29:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca
Dataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
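A minimal sketch, assuming the details repository follows the leaderboard's standard `open-llm-leaderboard/details_<org>__<model>` naming (the repo id below is inferred from that convention; the "harness_winogrande_5" config is listed in this dataset's metadata):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention for details datasets
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca",
	"harness_winogrande_5",
	split="train")
```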
## Latest results
These are the latest results from run 2024-02-16T15:26:57.957451 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T15:26:57.957451(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T15:26:57.957451(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T15:26:57.957451(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
419db2e512cb2bd4c88afa18ec4a45ec1e6fa834 | # Dataset Card for "human_ref_dna"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | samchain/human_ref_dna | [
"region:us"
] | 2024-02-16T15:42:39+00:00 | {"dataset_info": {"features": [{"name": "sequence", "dtype": "string"}, {"name": "start_pos", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4086227516, "num_examples": 249221}, {"name": "test", "num_bytes": 69420664, "num_examples": 4234}], "download_size": 1534074499, "dataset_size": 4155648180}} | 2024-02-16T15:44:47+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "human_ref_dna"
More Information needed | [
"# Dataset Card for \"human_ref_dna\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"human_ref_dna\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"human_ref_dna\"\n\nMore Information needed"
] |
b544e809263882f38e9aaf4c6ae875678985a35f |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-metamathqa-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-metamathqa-sft](https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
"harness_winogrande_5",
split="train")
```
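For example, to pull the aggregated metrics from the "results" configuration described above (its splits follow the same timestamp/"latest" scheme as the per-task configurations):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest"
# split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
	"results",
	split="latest")
```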
## Latest results
These are the [latest results from run 2024-02-16T16:13:27.633644](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft/blob/main/results_2024-02-16T16-13-27.633644.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6109833546650115,
"acc_stderr": 0.033243551615710396,
"acc_norm": 0.6155806742206059,
"acc_norm_stderr": 0.03392554504142213,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.44733823807616424,
"mc2_stderr": 0.014684832855657028
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344083,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.01440136664121639
},
"harness|hellaswag|10": {
"acc": 0.6065524795857399,
"acc_stderr": 0.0048751626991216535,
"acc_norm": 0.8044214299940251,
"acc_norm_stderr": 0.0039583479345203345
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022884,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03053289223393202,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03053289223393202
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608302,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399324,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399324
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459752,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459752
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371165,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371165
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520425,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.01261820406658839,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.01261820406658839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354022,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.44733823807616424,
"mc2_stderr": 0.014684832855657028
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205191
},
"harness|gsm8k|5": {
"acc": 0.40181956027293403,
"acc_stderr": 0.013504357787494035
}
}
```
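The same numbers can also be read straight from the raw results file linked above, without going through `datasets`. A sketch: the on-disk JSON may wrap the dictionary shown here under a top-level "results" key next to run metadata, which the snippet below tolerates.

```python
from huggingface_hub import hf_hub_download
import json

# Download the raw results file linked above from the dataset repo
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
    filename="results_2024-02-16T16-13-27.633644.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Fall back to the top level if the file is exactly the dict shown above
scores = raw.get("results", raw)
print(scores["all"]["acc"], scores["all"]["acc_norm"])
```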
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft | [
"region:us"
] | 2024-02-16T16:15:47+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-metamathqa-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-metamathqa-sft](https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T16:13:27.633644](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft/blob/main/results_2024-02-16T16-13-27.633644.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6109833546650115,\n \"acc_stderr\": 0.033243551615710396,\n \"acc_norm\": 0.6155806742206059,\n \"acc_norm_stderr\": 0.03392554504142213,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.44733823807616424,\n \"mc2_stderr\": 0.014684832855657028\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344083,\n \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.01440136664121639\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6065524795857399,\n \"acc_stderr\": 0.0048751626991216535,\n \"acc_norm\": 0.8044214299940251,\n \"acc_norm_stderr\": 0.0039583479345203345\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.025560604721022884,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.025560604721022884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608302,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608302\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399324,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399324\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352167,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352167\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459752,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459752\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7803320561941252,\n \"acc_stderr\": 0.014805384478371165,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371165\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n \"acc_stderr\": 0.015201032512520425,\n \"acc_norm\": 0.2916201117318436,\n \"acc_norm_stderr\": 0.015201032512520425\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354022,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354022\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.44733823807616424,\n \"mc2_stderr\": 0.014684832855657028\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205191\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \"acc_stderr\": 
0.013504357787494035\n }\n}\n```", "repo_url": "https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|arc:challenge|25_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|gsm8k|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hellaswag|10_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|winogrande|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T16_13_27.633644", "path": ["results_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T16-13-27.633644.parquet"]}]}]} | 2024-02-16T16:16:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-metamathqa-sft
Dataset automatically created during the evaluation run of model CorticalStack/mistral-7b-metamathqa-sft on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
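```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
    "harness_winogrande_5",  # any of the 63 task configurations can be used here
    split="train",           # "train" always points to the latest results
)
```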
## Latest results
These are the latest results from run 2024-02-16T16:13:27.633644 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval).
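The aggregated metrics can be loaded in the same way; a minimal sketch using the "results" configuration and "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# Run-level aggregated metrics; "latest" mirrors the most recent
# timestamped split (2024_02_16T16_13_27.633644 for this run).
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
    "results",
    split="latest",
)
```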
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-metamathqa-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-metamathqa-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T16:13:27.633644(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-metamathqa-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-metamathqa-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T16:13:27.633644(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CorticalStack/mistral-7b-metamathqa-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-metamathqa-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T16:13:27.633644(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
97beb2fa4a07c13458f014771a813fb3fdbd4d3a |
# PIPPA - Personal Interaction Pairs between People and AI in ShareGPT format
**⚠️ CAUTION: PIPPA contains conversations, themes and scenarios which can be considered "not safe for work" (NSFW) and/or heavily disturbing in nature. Models trained purely with PIPPA may have the tendency to generate X-rated output. You have been warned.**
## Dataset Summary
- `system`: Contains whatever was typed in the **Description** field in the character creator on the website. It usually consists of a few sentences which give a brief overview of the character and any important details about them.
- `conversation`: The conversation between the user and the model. This is represented as a list of dictionaries, each dictionary representing a single utterance and containing two key-value pairs: `message`, referring to the utterance itself, and `is_human`, which designates whether the dialogue was generated by the user or the LLM.
For further information about PIPPA, please refer to the [published paper](https://arxiv.org/abs/2308.05884) or contact the original authors.
## Files
I have used the original file:
- **pippa_deduped.jsonl**: The 'cleaned' version of PIPPA, with duplicate conversations as well as any conversation with fewer than three turns removed from the dataset. **We recommend using this file.**
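A minimal loading sketch (field names follow the summary above; since this repository is a ShareGPT-style conversion, verify the actual keys against a sample record first):

```python
import json

# Each line of pippa_deduped.jsonl is one JSON record.
with open("pippa_deduped.jsonl", encoding="utf-8") as f:
    first = json.loads(next(f))

print(sorted(first.keys()))  # confirm the schema before relying on it

# Per the summary above: a character description plus a list of
# utterances flagged as human- or model-generated.
for turn in first.get("conversation", []):
    speaker = "user" if turn["is_human"] else "model"
    print(f"{speaker}: {turn['message'][:80]}")
```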
## Citation
Thanks to the https://huggingface.co/PygmalionAI community; all credit goes to them.
___
Any relationship between the name of this dataset and any public personas is entirely and totally coincidental. | WasamiKirua/PygmalionPIPPA-ShareGpt | [
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"not-for-all-audiences",
"conversational",
"roleplay",
"custom-format",
"a.",
"arxiv:2308.05884",
"region:us"
] | 2024-02-16T16:32:19+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational"], "pretty_name": "PIPPA - Personal Interaction Pairs Between People and AI", "tags": ["not-for-all-audiences", "conversational", "roleplay", "custom-format", "a."], "viewer": true} | 2024-02-16T16:50:13+00:00 | [
"2308.05884"
] | [
"en"
] | TAGS
#task_categories-conversational #size_categories-10K<n<100K #language-English #license-apache-2.0 #not-for-all-audiences #conversational #roleplay #custom-format #a. #arxiv-2308.05884 #region-us
|
# PIPPA - Personal Interaction Pairs between People and AI in ShareGPT format
⚠️ CAUTION: PIPPA contains conversations, themes and scenarios which can be considered "not safe for work" (NSFW) and/or heavily disturbing in nature. Models trained purely with PIPPA may have the tendency to generate X-rated output. You have been warned.
## Dataset Summary
- 'system': Contains whatever was typed in the Description field in the character creator on the website. It usually consists of a few sentences which give a brief overview of the character and any important details about them.
- 'conversation': The conversation between the user and the model. This is represented as a list of dictionaries, each dictionary representing a single utterance and containing two key-value pairs: 'message', referring to the utterance itself, and 'is_human', which designates whether the dialogue was generated by the user or the LLM.
For further information about PIPPA, please refer to the published paper or contact the original authors.
## Files
I have used the original file:
- pippa_deduped.jsonl: The 'cleaned' version of PIPPA, with duplicate conversations as well as any conversation with fewer than three turns removed from the dataset. We recommend using this file.
Thanks to the URL community; all credit goes to them.
___
Any relationship between the name of this dataset and any public personas is entirely and totally coincidental. | [
"# PIPPA - Personal Interaction Pairs between People and AI in ShareGpt format\n\n\n️ CAUTION: PIPPA contains conversations, themes and scenarios which can be considered \"not safe for work\" (NSFW) and/or heavily disturbing in nature. Models trained purely with PIPPA may have the tendency to generate X-rated output. You have been warned.",
"## Dataset Summary\n\n- 'system': Contains whatever was typed in the Description field in the character creator on the website. It usually consists of a few sentences which gives a brief overview of the character and any important details about them.\n- 'conversation': The conversation between the user and the model. This is represented as a list of dictionaries, each dictionary representing a single utterance and containing two key-value pairs: 'message', referring to the utterance itself and 'is_human', which designates whether the dialogue was generated by the user or the LLM.\n\nFor further information about PIPPA, please refer to our published paper or contact us at the emails listed above.",
"## Files\n\nI have used the original: \n\n- pippa_deduped.jsonl: The 'cleaned' version of PIPPA, with duplicate conversations as well as any conversation with less than three turns removed from the dataset. We recommend using this file.\n\n\nThanks to the URL Community, all the merits to them\n\n___\nAny relationship between the name of this dataset and any public personas is entirely and totally coincidential."
] | [
"TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-English #license-apache-2.0 #not-for-all-audiences #conversational #roleplay #custom-format #a. #arxiv-2308.05884 #region-us \n",
"# PIPPA - Personal Interaction Pairs between People and AI in ShareGpt format\n\n\n️ CAUTION: PIPPA contains conversations, themes and scenarios which can be considered \"not safe for work\" (NSFW) and/or heavily disturbing in nature. Models trained purely with PIPPA may have the tendency to generate X-rated output. You have been warned.",
"## Dataset Summary\n\n- 'system': Contains whatever was typed in the Description field in the character creator on the website. It usually consists of a few sentences which gives a brief overview of the character and any important details about them.\n- 'conversation': The conversation between the user and the model. This is represented as a list of dictionaries, each dictionary representing a single utterance and containing two key-value pairs: 'message', referring to the utterance itself and 'is_human', which designates whether the dialogue was generated by the user or the LLM.\n\nFor further information about PIPPA, please refer to our published paper or contact us at the emails listed above.",
"## Files\n\nI have used the original: \n\n- pippa_deduped.jsonl: The 'cleaned' version of PIPPA, with duplicate conversations as well as any conversation with less than three turns removed from the dataset. We recommend using this file.\n\n\nThanks to the URL Community, all the merits to them\n\n___\nAny relationship between the name of this dataset and any public personas is entirely and totally coincidential."
] | [
73,
89,
164,
94
] | [
"passage: TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-English #license-apache-2.0 #not-for-all-audiences #conversational #roleplay #custom-format #a. #arxiv-2308.05884 #region-us \n# PIPPA - Personal Interaction Pairs between People and AI in ShareGpt format\n\n\n️ CAUTION: PIPPA contains conversations, themes and scenarios which can be considered \"not safe for work\" (NSFW) and/or heavily disturbing in nature. Models trained purely with PIPPA may have the tendency to generate X-rated output. You have been warned.## Dataset Summary\n\n- 'system': Contains whatever was typed in the Description field in the character creator on the website. It usually consists of a few sentences which gives a brief overview of the character and any important details about them.\n- 'conversation': The conversation between the user and the model. This is represented as a list of dictionaries, each dictionary representing a single utterance and containing two key-value pairs: 'message', referring to the utterance itself and 'is_human', which designates whether the dialogue was generated by the user or the LLM.\n\nFor further information about PIPPA, please refer to our published paper or contact us at the emails listed above.## Files\n\nI have used the original: \n\n- pippa_deduped.jsonl: The 'cleaned' version of PIPPA, with duplicate conversations as well as any conversation with less than three turns removed from the dataset. We recommend using this file.\n\n\nThanks to the URL Community, all the merits to them\n\n___\nAny relationship between the name of this dataset and any public personas is entirely and totally coincidential."
] |
a1bc634145c6c2fbc6b167266e4181bacd991156 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | vedastro-org/all-planet-data-london | [
"license:mit",
"region:us"
] | 2024-02-16T16:54:07+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "Name", "dtype": "string"}, {"name": "Time", "dtype": "string"}, {"name": "Location", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 99320, "num_examples": 1826}], "download_size": 23439, "dataset_size": 99320}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T17:28:41+00:00 | [] | [] | TAGS
#license-mit #region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-mit #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
11,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#license-mit #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
935306ba8c08e4cc46c14b6546d5271f18749e92 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | suguroglu/crowdsourced-calculator-demo | [
"region:us"
] | 2024-02-16T17:35:42+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data.csv"}]}]} | 2024-02-16T17:38:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
8,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
e87f2fc32886c1de2bb48c665961916a5ed9c5df |
# Dataset Card for Evaluation run of sethuiyer/Aika-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sethuiyer/Aika-7B](https://huggingface.co/sethuiyer/Aika-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sethuiyer__Aika-7B",
"harness_winogrande_5",
split="train")
```
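
Once loaded, the split behaves like any other `datasets.Dataset`. Below is a minimal sketch for inspecting it; nothing here is specific to this repository, and the column names are discovered at runtime rather than hard-coded, because the exact detail schema depends on the harness version:

```python
from datasets import load_dataset

# Same call as above: per-example details for the 5-shot Winogrande run.
data = load_dataset("open-llm-leaderboard/details_sethuiyer__Aika-7B",
                    "harness_winogrande_5",
                    split="train")

# The detail schema varies between harness versions, so list the columns
# instead of assuming particular names.
print(data.column_names)
print(f"{len(data)} evaluated examples")
print(data[0])  # first example, with its prompt and prediction details
```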
## Latest results
These are the [latest results from run 2024-02-16T17:42:59.851366](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Aika-7B/blob/main/results_2024-02-16T17-42-59.851366.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5403534855873607,
"acc_stderr": 0.03412368335032635,
"acc_norm": 0.5457531977609761,
"acc_norm_stderr": 0.034853423231234165,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.512198702575838,
"mc2_stderr": 0.015478157824850066
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.01434686906022932,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063235
},
"harness|hellaswag|10": {
"acc": 0.6006771559450309,
"acc_stderr": 0.004887583074180845,
"acc_norm": 0.8148775144393547,
"acc_norm_stderr": 0.0038760312505449843
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.028009138125400387,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.028009138125400387
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.49696969696969695,
"acc_stderr": 0.03904272341431857,
"acc_norm": 0.49696969696969695,
"acc_norm_stderr": 0.03904272341431857
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.02869787397186067,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.02869787397186067
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547825,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547825
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.034888454513049734,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.034888454513049734
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.026511261369409244,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.026511261369409244
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.015995644947299235,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.015995644947299235
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824096,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824096
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.02798268045975957,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.02798268045975957
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940978,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940978
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3767926988265971,
"acc_stderr": 0.012376459593894402,
"acc_norm": 0.3767926988265971,
"acc_norm_stderr": 0.012376459593894402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.030273325077345762,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.030273325077345762
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.020062874243539128,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.020062874243539128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.03119223072679566,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.03119223072679566
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5671641791044776,
"acc_stderr": 0.03503490923673282,
"acc_norm": 0.5671641791044776,
"acc_norm_stderr": 0.03503490923673282
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.512198702575838,
"mc2_stderr": 0.015478157824850066
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.01169093380971267
},
"harness|gsm8k|5": {
"acc": 0.2577710386656558,
"acc_stderr": 0.012048370213576598
}
}
```
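
To work with these aggregated numbers programmatically rather than reading them off the card, the linked results file can be downloaded and parsed. A minimal sketch follows; it assumes the file's top level matches the dictionary shown above, the `.get("results", ...)` fallback is a guess to also cover dumps that nest the scores under a `"results"` key, and the filename is taken from the link above, so it will change with newer runs:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON referenced in this card.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_sethuiyer__Aika-7B",
    filename="results_2024-02-16T17-42-59.851366.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

scores = raw.get("results", raw)  # tolerate both flat and nested layouts

# Print the normalized accuracy for every task that reports one.
for task, metrics in sorted(scores.items()):
    if isinstance(metrics, dict) and "acc_norm" in metrics:
        print(f"{task}: {metrics['acc_norm']:.4f}")
```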
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sethuiyer__Aika-7B | [
"region:us"
] | 2024-02-16T17:45:18+00:00 | {"pretty_name": "Evaluation run of sethuiyer/Aika-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [sethuiyer/Aika-7B](https://huggingface.co/sethuiyer/Aika-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sethuiyer__Aika-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T17:42:59.851366](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Aika-7B/blob/main/results_2024-02-16T17-42-59.851366.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5403534855873607,\n \"acc_stderr\": 0.03412368335032635,\n \"acc_norm\": 0.5457531977609761,\n \"acc_norm_stderr\": 0.034853423231234165,\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.512198702575838,\n \"mc2_stderr\": 0.015478157824850066\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.01434686906022932,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063235\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6006771559450309,\n \"acc_stderr\": 0.004887583074180845,\n \"acc_norm\": 0.8148775144393547,\n \"acc_norm_stderr\": 0.0038760312505449843\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n 
\"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.032662042990646796,\n \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.032662042990646796\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5870967741935483,\n \"acc_stderr\": 0.028009138125400387,\n \"acc_norm\": 0.5870967741935483,\n \"acc_norm_stderr\": 0.028009138125400387\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.49696969696969695,\n \"acc_stderr\": 0.03904272341431857,\n \"acc_norm\": 0.49696969696969695,\n \"acc_norm_stderr\": 0.03904272341431857\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186067,\n \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186067\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n 
\"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547825,\n \"acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547825\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647206,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647206\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.553921568627451,\n \"acc_stderr\": 0.034888454513049734,\n \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.034888454513049734\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562429,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562429\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n \"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.7381864623243933,\n \"acc_norm_stderr\": 
0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.026511261369409244,\n \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.026511261369409244\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n \"acc_stderr\": 0.015995644947299235,\n \"acc_norm\": 0.3541899441340782,\n \"acc_norm_stderr\": 0.015995644947299235\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824096,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824096\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n \"acc_stderr\": 0.02798268045975957,\n \"acc_norm\": 0.5852090032154341,\n \"acc_norm_stderr\": 0.02798268045975957\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940978,\n \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940978\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3767926988265971,\n \"acc_stderr\": 0.012376459593894402,\n \"acc_norm\": 0.3767926988265971,\n \"acc_norm_stderr\": 0.012376459593894402\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.030273325077345762,\n \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.030273325077345762\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5637254901960784,\n \"acc_stderr\": 0.020062874243539128,\n \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.020062874243539128\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.03119223072679566,\n \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.03119223072679566\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5671641791044776,\n \"acc_stderr\": 0.03503490923673282,\n \"acc_norm\": 0.5671641791044776,\n \"acc_norm_stderr\": 0.03503490923673282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.512198702575838,\n \"mc2_stderr\": 0.015478157824850066\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.01169093380971267\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2577710386656558,\n \"acc_stderr\": 0.012048370213576598\n }\n}\n```", "repo_url": "https://huggingface.co/sethuiyer/Aika-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|arc:challenge|25_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|gsm8k|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hellaswag|10_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T17-42-59.851366.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T17-42-59.851366.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T17-42-59.851366.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T17-42-59.851366.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T17-42-59.851366.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T17-42-59.851366.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["**/details_harness|winogrande|5_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T17-42-59.851366.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T17_42_59.851366", "path": ["results_2024-02-16T17-42-59.851366.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T17-42-59.851366.parquet"]}]}]} | 2024-02-16T17:45:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sethuiyer/Aika-7B
Dataset automatically created during the evaluation run of model sethuiyer/Aika-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
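A minimal sketch is shown below; the repository id assumes the leaderboard's standard `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the per-task configs listed in this repo's file metadata:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's standard naming convention for detail datasets.
# "harness_winogrande_5" is one example config; each of the 63 evaluated tasks has its own.
data = load_dataset("open-llm-leaderboard/details_sethuiyer__Aika-7B",
                    "harness_winogrande_5",
                    split="train")
```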
## Latest results
These are the latest results from run 2024-02-16T17:42:59.851366 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
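The raw per-task JSON is not reproduced here; the aggregated numbers can instead be read from the "results" config, whose "latest" split always resolves to this run (a sketch, under the same repo-naming assumption as above; both the config and the split are listed in this repo's metadata):

```python
from datasets import load_dataset

# The "results" config aggregates all evaluated tasks; the "latest" split
# points to the most recent run's parquet file.
results = load_dataset("open-llm-leaderboard/details_sethuiyer__Aika-7B",
                       "results",
                       split="latest")
```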
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sethuiyer/Aika-7B\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/Aika-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T17:42:59.851366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sethuiyer/Aika-7B\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/Aika-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T17:42:59.851366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of sethuiyer/Aika-7B\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/Aika-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T17:42:59.851366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
1d856855b9d01ece801946c6b3824c6f09aff1fd | # Dataset Card for "distilabel-neurology-preferences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | macadeliccc/distilabel-neurology-preferences | [
"region:us"
] | 2024-02-16T17:55:14+00:00 | {"language": ["en"], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "sequence": "string"}, {"name": "raw_generation_responses", "sequence": "string"}, {"name": "generations", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 3076288, "num_examples": 500}], "download_size": 1363349, "dataset_size": 3076288}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T17:55:16+00:00 | [] | [
"en"
] | TAGS
#region-us
| # Dataset Card for "distilabel-neurology-preferences"
More Information needed | [
"# Dataset Card for \"distilabel-neurology-preferences\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"distilabel-neurology-preferences\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"distilabel-neurology-preferences\"\n\nMore Information needed"
] |
aaeb901d844fc3e3ebdd2a0d936b86c861622016 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | AlexAmin/test | [
"region:us"
] | 2024-02-16T18:02:59+00:00 | {} | 2024-02-16T18:04:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
b2a71d0b027d4af3ac3b05aa9a4fd43acc441b08 |
# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2](https://huggingface.co/Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
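# Each evaluated task has its own config; "harness_winogrande_5" is one example,
# and the "train" split always points to the latest run's results.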
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-748M-Reason-With-Cinder-Test-2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-16T18:21:24.569209](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-748M-Reason-With-Cinder-Test-2/blob/main/results_2024-02-16T18-21-24.569209.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2521207309170715,
"acc_stderr": 0.030556259826906736,
"acc_norm": 0.2529609814071766,
"acc_norm_stderr": 0.03131972311648323,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283352,
"mc2": 0.42762316543412854,
"mc2_stderr": 0.015330016474026912
},
"harness|arc:challenge|25": {
"acc": 0.22781569965870307,
"acc_stderr": 0.012256708602326912,
"acc_norm": 0.24658703071672355,
"acc_norm_stderr": 0.012595726268790134
},
"harness|hellaswag|10": {
"acc": 0.304919338777136,
"acc_stderr": 0.004594323838650341,
"acc_norm": 0.34495120493925513,
"acc_norm_stderr": 0.004743808792037872
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.039725528847851375,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.039725528847851375
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.03110318238312337,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.03110318238312337
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827842,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827842
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080342,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080342
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.027851252973889778,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.027851252973889778
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.02167921966369314,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.02167921966369314
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735703,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735703
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654845,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654845
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02606715922227579,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02606715922227579
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.03068473711513537,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.03068473711513537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.01850814360254782,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.01850814360254782
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.027584066602208263,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.027584066602208263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755807,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755807
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20085470085470086,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.20085470085470086,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.01574549716904906,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.01574549716904906
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808871,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808871
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705474,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2196870925684485,
"acc_stderr": 0.010574639934167518,
"acc_norm": 0.2196870925684485,
"acc_norm_stderr": 0.010574639934167518
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.016906615927288145,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.016906615927288145
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3551020408163265,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.3551020408163265,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283352,
"mc2": 0.42762316543412854,
"mc2_stderr": 0.015330016474026912
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.014051745961790516
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860505
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Josephgflowers__TinyLlama-748M-Reason-With-Cinder-Test-2 | [
"region:us"
] | 2024-02-16T18:23:16+00:00 | {"pretty_name": "Evaluation run of Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2](https://huggingface.co/Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__TinyLlama-748M-Reason-With-Cinder-Test-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T18:21:24.569209](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-748M-Reason-With-Cinder-Test-2/blob/main/results_2024-02-16T18-21-24.569209.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2521207309170715,\n \"acc_stderr\": 0.030556259826906736,\n \"acc_norm\": 0.2529609814071766,\n \"acc_norm_stderr\": 0.03131972311648323,\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283352,\n \"mc2\": 0.42762316543412854,\n \"mc2_stderr\": 0.015330016474026912\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22781569965870307,\n \"acc_stderr\": 0.012256708602326912,\n \"acc_norm\": 0.24658703071672355,\n \"acc_norm_stderr\": 0.012595726268790134\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.304919338777136,\n \"acc_stderr\": 0.004594323838650341,\n \"acc_norm\": 0.34495120493925513,\n \"acc_norm_stderr\": 0.004743808792037872\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.03110318238312337,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.03110318238312337\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827842,\n \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827842\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 
0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.027851252973889778,\n \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.027851252973889778\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.02167921966369314,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.02167921966369314\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27979274611398963,\n 
\"acc_stderr\": 0.03239637046735703,\n \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735703\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654845,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654845\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02606715922227579,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02606715922227579\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.03068473711513537,\n \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.03068473711513537\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24770642201834864,\n \"acc_stderr\": 0.01850814360254782,\n \"acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.01850814360254782\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159263,\n \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159263\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n \"acc_stderr\": 0.027584066602208263,\n \"acc_norm\": 0.21524663677130046,\n \"acc_norm_stderr\": 0.027584066602208263\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755807,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755807\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20085470085470086,\n \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.20085470085470086,\n \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 
0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n \"acc_stderr\": 0.01574549716904906,\n \"acc_norm\": 0.26309067688378035,\n \"acc_norm_stderr\": 0.01574549716904906\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808871,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808871\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879912,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879912\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705474,\n \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705474\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2196870925684485,\n \"acc_stderr\": 0.010574639934167518,\n \"acc_norm\": 0.2196870925684485,\n \"acc_norm_stderr\": 0.010574639934167518\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.016906615927288145,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.016906615927288145\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3551020408163265,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.3551020408163265,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283352,\n \"mc2\": 0.42762316543412854,\n \"mc2_stderr\": 0.015330016474026912\n },\n \"harness|winogrande|5\": {\n 
\"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790516\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \"acc_stderr\": 0.002504942226860505\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|arc:challenge|25_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|gsm8k|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hellaswag|10_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-21-24.569209.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-21-24.569209.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-21-24.569209.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T18-21-24.569209.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-21-24.569209.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["**/details_harness|winogrande|5_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T18-21-24.569209.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T18_21_24.569209", "path": ["results_2024-02-16T18-21-24.569209.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T18-21-24.569209.parquet"]}]}]} | 2024-02-16T18:23:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2
Dataset automatically created during the evaluation run of model Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
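For example, with the `datasets` library (the repository id below is taken from this card's metadata, and `harness_winogrande_5` stands in for any of the 63 task configurations):

```python
from datasets import load_dataset

# Repository id from this card's metadata; swap "harness_winogrande_5" for any
# other configuration to load a different task's details.
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-748M-Reason-With-Cinder-Test-2",
	"harness_winogrande_5",
	split="train")
```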
## Latest results
These are the latest results from run 2024-02-16T18:21:24.569209 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
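An abridged excerpt of that run's aggregate scores, reproduced from this card's metadata:

```python
{
    "all": {
        "acc": 0.2521207309170715,
        "acc_stderr": 0.030556259826906736,
        "acc_norm": 0.2529609814071766,
        "acc_norm_stderr": 0.03131972311648323,
        "mc1": 0.2558139534883721,
        "mc1_stderr": 0.015274176219283352,
        "mc2": 0.42762316543412854,
        "mc2_stderr": 0.015330016474026912
    }
}
```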
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T18:21:24.569209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T18:21:24.569209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
209,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/TinyLlama-748M-Reason-With-Cinder-Test-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T18:21:24.569209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
0bdcc50a89fb1697fba9a40d5ee628c8515d53b1 |
# Dataset Card for Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp](https://huggingface.co/shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp",
"harness_winogrande_5",
split="train")
```
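The aggregated scores live in the "results" configuration; a minimal sketch for loading them (split names follow the pattern in these cards' metadata, where "latest" always points at the most recent run and timestamped splits address individual runs):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" resolves to the most recent
# run, while a timestamped split name selects a specific one.
results = load_dataset("open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp",
	"results",
	split="latest")
```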
## Latest results
These are the [latest results from run 2024-02-16T18:45:36.411445](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp/blob/main/results_2024-02-16T18-45-36.411445.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6499656745636075,
"acc_stderr": 0.03190074035438905,
"acc_norm": 0.6509134828280464,
"acc_norm_stderr": 0.032545406816765036,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.49557311062145515,
"mc2_stderr": 0.015305674753451043
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042198,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.013839039762820169
},
"harness|hellaswag|10": {
"acc": 0.6680940051782513,
"acc_stderr": 0.0046993506536956225,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.0035454991695580535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493875,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.02261640942074202,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.02261640942074202
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291943,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291943
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098822,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098822
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903338,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4044692737430168,
"acc_stderr": 0.016414440917293147,
"acc_norm": 0.4044692737430168,
"acc_norm_stderr": 0.016414440917293147
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.49557311062145515,
"mc2_stderr": 0.015305674753451043
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936648
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299099
}
}
```
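For reference, the aggregated `all` block above is a roll-up of the per-task entries. Below is a minimal sketch of recomputing the MMLU average from the raw results file; it assumes the per-task metrics sit under a top-level `results` key (as in lm-evaluation-harness output) and takes the filename from the link above.

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for the latest run (filename from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp",
    filename="results_2024-02-16T18-45-36.411445.json",
    repo_type="dataset",
)
with open(path) as f:
    # Assumption: per-task metrics live under a top-level "results" key,
    # keyed like "harness|hendrycksTest-<subject>|5" as shown in this card.
    results = json.load(f)["results"]

# Average acc_norm over the MMLU subtasks (the hendrycksTest-* entries).
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest-" in k]
print(f"MMLU acc_norm averaged over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```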
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp | [
"region:us"
] | 2024-02-16T18:36:34+00:00 | {"pretty_name": "Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp](https://huggingface.co/shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T18:45:36.411445](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp/blob/main/results_2024-02-16T18-45-36.411445.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499656745636075,\n \"acc_stderr\": 0.03190074035438905,\n \"acc_norm\": 0.6509134828280464,\n \"acc_norm_stderr\": 0.032545406816765036,\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.49557311062145515,\n \"mc2_stderr\": 0.015305674753451043\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042198,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.013839039762820169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6680940051782513,\n \"acc_stderr\": 0.0046993506536956225,\n \"acc_norm\": 0.851822346146186,\n \"acc_norm_stderr\": 0.0035454991695580535\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493875,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.02261640942074202,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.02261640942074202\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 
0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291943,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291943\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 
0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903338,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903338\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4044692737430168,\n \"acc_stderr\": 0.016414440917293147,\n \"acc_norm\": 0.4044692737430168,\n \"acc_norm_stderr\": 0.016414440917293147\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.49557311062145515,\n \"mc2_stderr\": 0.015305674753451043\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936648\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \"acc_stderr\": 0.013052097103299099\n }\n}\n```", "repo_url": "https://huggingface.co/shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|arc:challenge|25_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|arc:challenge|25_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|gsm8k|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|gsm8k|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hellaswag|10_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hellaswag|10_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-34-16.912824.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T18-34-16.912824.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": 
"2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-34-16.912824.parquet"]}, 
{"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["**/details_harness|winogrande|5_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": ["**/details_harness|winogrande|5_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T18-45-36.411445.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T18_34_16.912824", "path": ["results_2024-02-16T18-34-16.912824.parquet"]}, {"split": "2024_02_16T18_45_36.411445", "path": 
["results_2024-02-16T18-45-36.411445.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T18-45-36.411445.parquet"]}]}]} | 2024-02-16T18:47:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp
Dataset automatically created during the evaluation run of model shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-16T18:45:36.411445 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T18:45:36.411445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T18:45:36.411445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T18:45:36.411445(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
5661c3f6035e3ef65e1d7680807563b0917d0bf0 |
# Dataset Card for Evaluation run of shahzebnaveed/NeuralHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shahzebnaveed/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/shahzebnaveed/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-example details for a single task (here: Winogrande, 5-shot).
# The "train" split always points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_shahzebnaveed__NeuralHermes-2.5-Mistral-7B",
	"harness_winogrande_5",
	split="train")
```
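
Continuing from the snippet above, the schema of the per-example details varies from task to task, so it is easiest to inspect the loaded split directly. The calls below only use the standard `datasets.Dataset` API and make no assumption about the column names:

```python
# List the columns of the details split, then print one evaluated example.
print(data.column_names)
print(data[0])
```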
## Latest results
These are the [latest results from run 2024-02-16T18:38:39.552249](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-02-16T18-38-39.552249.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6389813616730637,
"acc_stderr": 0.032241943281319616,
"acc_norm": 0.6417845405458356,
"acc_norm_stderr": 0.03288190466152383,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667716,
"mc2": 0.5229369986337806,
"mc2_stderr": 0.01525073227946668
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909869,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600931
},
"harness|hellaswag|10": {
"acc": 0.6519617606054571,
"acc_stderr": 0.004753746951620152,
"acc_norm": 0.8428599880501892,
"acc_norm_stderr": 0.003631889496122536
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371542,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371542
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507215,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667716,
"mc2": 0.5229369986337806,
"mc2_stderr": 0.01525073227946668
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089688
},
"harness|gsm8k|5": {
"acc": 0.5595147839272175,
"acc_stderr": 0.01367457213169389
}
}
```
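
Beyond the per-task details, the aggregated numbers above can also be reloaded programmatically from the "results" configuration. A minimal sketch, assuming the repository layout described earlier (a `results` configuration whose `latest` split points to the most recent run):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_shahzebnaveed__NeuralHermes-2.5-Mistral-7B",
	"results",
	split="latest")
print(results[0])  # one row per run, containing the aggregated scores
```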
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_shahzebnaveed__NeuralHermes-2.5-Mistral-7B | [
"region:us"
] | 2024-02-16T18:40:58+00:00 | {"pretty_name": "Evaluation run of shahzebnaveed/NeuralHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [shahzebnaveed/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/shahzebnaveed/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shahzebnaveed__NeuralHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T18:38:39.552249](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-02-16T18-38-39.552249.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6389813616730637,\n \"acc_stderr\": 0.032241943281319616,\n \"acc_norm\": 0.6417845405458356,\n \"acc_norm_stderr\": 0.03288190466152383,\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.01680186046667716,\n \"mc2\": 0.5229369986337806,\n \"mc2_stderr\": 0.01525073227946668\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909869,\n \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600931\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6519617606054571,\n \"acc_stderr\": 0.004753746951620152,\n \"acc_norm\": 0.8428599880501892,\n \"acc_norm_stderr\": 0.003631889496122536\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n 
\"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371542,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371542\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507215,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507215\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.01680186046667716,\n \"mc2\": 0.5229369986337806,\n \"mc2_stderr\": 0.01525073227946668\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089688\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5595147839272175,\n \"acc_stderr\": 0.01367457213169389\n }\n}\n```", "repo_url": "https://huggingface.co/shahzebnaveed/NeuralHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|arc:challenge|25_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|gsm8k|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hellaswag|10_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-38-39.552249.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-38-39.552249.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-38-39.552249.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T18-38-39.552249.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-38-39.552249.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T18_38_39.552249", "path": ["**/details_harness|winogrande|5_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T18-38-39.552249.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T18_38_39.552249", "path": ["results_2024-02-16T18-38-39.552249.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T18-38-39.552249.parquet"]}]}]} | 2024-02-16T18:41:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of shahzebnaveed/NeuralHermes-2.5-Mistral-7B
Dataset automatically created during the evaluation run of model shahzebnaveed/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
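The card's loading snippet is not reproduced in this dump; the sketch below assumes the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` repository naming and uses one of the 63 configurations listed in this card:

```python
from datasets import load_dataset

# The repository id is an assumption based on the Open LLM Leaderboard's
# naming convention; "harness_winogrande_5" is one of the configurations
# enumerated in this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_shahzebnaveed__NeuralHermes-2.5-Mistral-7B",
    "harness_winogrande_5",
    split="latest",
)
```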
## Latest results
These are the latest results from run 2024-02-16T18:38:39.552249 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of shahzebnaveed/NeuralHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T18:38:39.552249(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of shahzebnaveed/NeuralHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T18:38:39.552249(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shahzebnaveed/NeuralHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T18:38:39.552249(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
fbfe4434dc0363a87a37384feee5fd5b9d263cf6 | Dataset for ShawGPT, a fine-tuned data science YouTube comment responder.
Video link: *coming soon!* <br>
Blog link: *coming soon!* | shawhin/shawgpt-youtube-comments | [
"size_categories:n<1K",
"license:mit",
"region:us"
] | 2024-02-16T19:14:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "dataset_info": {"features": [{"name": "example", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 41700, "num_examples": 50}, {"name": "test", "num_bytes": 7489, "num_examples": 9}], "download_size": 27338, "dataset_size": 49189}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T20:14:38+00:00 | [] | [] | TAGS
#size_categories-n<1K #license-mit #region-us
| Dataset for ShawGPT, a fine-tuned data science YouTube comment responder.
Video link: *coming soon!* <br>
Blog link: *coming soon!* | [] | [
"TAGS\n#size_categories-n<1K #license-mit #region-us \n"
] | [
21
] | [
"passage: TAGS\n#size_categories-n<1K #license-mit #region-us \n"
] |
16c22d326504ac71adab3a6f198a278858b81b33 | # Dataset Card for Academic Essay Prompt-Completion Pairs
## Dataset Description
This dataset is designed to distinguish between essays authored by students and those generated by Large Language Models (LLMs), offering an essential resource for researchers and practitioners in natural language processing, educational technology, and academic integrity. Hosted on Huggingface, it supports the development and evaluation of models aimed at identifying the origin of textual content, which is crucial for a variety of applications including enhancing automated grading systems and detecting AI-generated text in academic submissions.
### Dataset Overview
The dataset consists of prompt-completion pairs, carefully crafted to simulate real-world academic writing scenarios. Each entry follows a uniform structure, encapsulated by `<s>` and `</s>` tags, ensuring a standardized format. Within these tags, the prompt is specified between `[INST]` and `[/INST]` tags, comprising the Source Text, Essay Instructions, and the Essay. The completion, positioned outside the `[INST]` and `[/INST]` tags but still within the `<s>` and `</s>` encapsulation, categorically states the essay's origin: either "This essay was written by an actual student." or "This essay was generated by a Large Language Model." This arrangement provides a nuanced classification task, focusing on discerning student-written essays from machine-generated ones.
### Structure
Upon utilizing the `load_dataset` command on Huggingface to access the dataset, users will encounter two primary splits:
- **Train:** Accounts for approximately 70% of the dataset, tailored for the training of machine learning models.
- **Test:** Comprises the remaining 30%, designated for the assessment of the models' performance.
#### Fields
Each entry in the dataset is meticulously structured to include:
- **Prompt:** Located within `[INST]` and `[/INST]` tags and encapsulated by `<s>` and `</s>` tags, the prompt includes the Source Text, Essay Instructions, and the Essay.
- **Completion:** Situated outside the instructional tags yet within the `<s>` and `</s>` encapsulation, the completion provides a definitive statement regarding the essay's authorship, indicating it was either "student-written" or "machine-generated."
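For illustration, a single entry following this layout might look like the sketch below; the angle-bracketed field contents are invented placeholders, and only the tag arrangement reflects the description above:

```python
# Illustrative sketch only: the field contents are hypothetical placeholders;
# the tag layout follows the structure described in this card.
entry = (
    "<s>[INST] Source Text: <passage the essay responds to> "
    "Essay Instructions: <the writing prompt given to the author> "
    "Essay: <the full essay text> [/INST] "
    "This essay was written by an actual student.</s>"
)
print(entry)
```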
### Use Cases
This dataset is exceptionally suited for:
- Crafting algorithms that can autonomously distinguish between human-authored and AI-generated text.
- Reinforcing academic integrity tools by identifying submissions that may be AI-generated.
- Enhancing the capabilities of automated essay scoring systems by introducing them to a wide variety of textual origins.
- Conducting in-depth research in natural language understanding, particularly in exploring the stylistic and content-based differences between human and AI authors.
### Accessing the Dataset
To access and load the dataset into Python environments, use the following command through Huggingface's `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("knarasi1/student_and_llm_essays")
```
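Once loaded, the two splits described above can be accessed individually; a minimal usage sketch, assuming the default `train`/`test` split names:

```python
train_set = dataset["train"]  # roughly 70% of the prompt-completion pairs
test_set = dataset["test"]    # remaining ~30%, reserved for evaluation
print(train_set[0])           # inspect a single encapsulated entry
```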
### Acknowledgments
This dataset represents a collective endeavor to foster innovation and uphold integrity in academic writing and research. It underscores the community's dedication to improving interactions between humans and AI within educational frameworks.
### Disclaimer
Dataset users are urged to employ this resource ethically and responsibly, especially in light of its potential impact on educational and research settings. The creators of the dataset and Huggingface explicitly discourage the misuse of AI-generated text for academic dishonesty or any form of deception. | knarasi1/student_and_llm_essays | [
"region:us"
] | 2024-02-16T19:19:16+00:00 | {} | 2024-02-16T19:42:26+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for Academic Essay Prompt-Completion Pairs
## Dataset Description
This dataset is designed to distinguish between essays authored by students and those generated by Large Language Models (LLMs), offering an essential resource for researchers and practitioners in natural language processing, educational technology, and academic integrity. Hosted on Huggingface, it supports the development and evaluation of models aimed at identifying the origin of textual content, which is crucial for a variety of applications including enhancing automated grading systems and detecting AI-generated text in academic submissions.
### Dataset Overview
The dataset consists of prompt-completion pairs, carefully crafted to simulate real-world academic writing scenarios. Each entry within the dataset is uniquely structured, encapsulated by '<s>' and '</s>' tags, ensuring a standardized format. Within these tags, the prompt is specified between '[INST]' and '[/INST]' tags, comprising the Source Text, Essay Instructions, and the Essay. The completion, positioned outside the '[INST]' and '[/INST]' tags but still within the '<s>' and '</s>' encapsulation, categorically states the essay's origin—either as "This essay was written by an actual student." or "This essay was generated by a Large Language Model." This arrangement provides a nuanced classification task, focusing on discerning student-written essays from machine-generated ones.
### Structure
Upon utilizing the 'load_dataset' command on Huggingface to access the dataset, users will encounter two primary splits:
- Train: Accounts for approximately 70% of the dataset, tailored for the training of machine learning models.
- Test: Comprises the remaining 30%, designated for the assessment of the models' performance.
#### Fields
Each entry in the dataset is meticulously structured to include:
- Prompt: Located within '[INST]' and '[/INST]' tags and encapsulated by '<s>' and '</s>' tags, the prompt includes the Source Text, Essay Instructions, and the Essay.
- Completion: Situated outside the instructional tags yet within the '<s>' and '</s>' encapsulation, the completion provides a definitive statement regarding the essay's authorship, indicating it was either "student-written" or "machine-generated."
### Use Cases
This dataset is exceptionally suited for:
- Crafting algorithms that can autonomously distinguish between human-authored and AI-generated text.
- Reinforcing academic integrity tools by identifying submissions that may be AI-generated.
- Enhancing the capabilities of automated essay scoring systems by introducing them to a wide variety of textual origins.
- Conducting in-depth research in natural language understanding, particularly in exploring the stylistic and content-based differences between human and AI authors.
### Accessing the Dataset
To access and load the dataset into Python environments, use the following command through Huggingface's 'datasets' library:
### Acknowledgments
This dataset represents a collective endeavor to foster innovation and uphold integrity in academic writing and research. It underscores the community's dedication to improving interactions between humans and AI within educational frameworks.
### Disclaimer
Dataset users are urged to employ this resource ethically and responsibly, especially in light of its potential impact on educational and research settings. The creators of the dataset and Huggingface explicitly discourage the misuse of AI-generated text for academic dishonesty or any form of deception. | [
"# Dataset Card for Academic Essay Prompt-Completion Pairs",
"## Dataset Description\n\nThis dataset is designed to distinguish between essays authored by students and those generated by Large Language Models (LLMs), offering an essential resource for researchers and practitioners in natural language processing, educational technology, and academic integrity. Hosted on Huggingface, it supports the development and evaluation of models aimed at identifying the origin of textual content, which is crucial for a variety of applications including enhancing automated grading systems and detecting AI-generated text in academic submissions.",
"### Dataset Overview\n\nThe dataset consists of prompt-completion pairs, carefully crafted to simulate real-world academic writing scenarios. Each entry within the dataset is uniquely structured, encapsulated by '<s>' and '</s>' tags, ensuring a standardized format. Within these tags, the prompt is specified between '[INST]' and '[/INST]' tags, comprising the Source Text, Essay Instructions, and the Essay. The completion, positioned outside the '[INST]' and '[/INST]' tags but still within the '<s>' and '</s>' encapsulation, categorically states the essay's origin—either as \"This essay was written by an actual student.\" or \"This essay was generated by a Large Language Model.\" This arrangement provides a nuanced classification task, focusing on discerning student-written essays from machine-generated ones.",
"### Structure\n\nUpon utilizing the 'load_dataset' command on Huggingface to access the dataset, users will encounter two primary splits:\n\n- Train: Accounts for approximately 70% of the dataset, tailored for the training of machine learning models.\n- Test: Comprises the remaining 30%, designated for the assessment of the models' performance.",
"#### Fields\n\nEach entry in the dataset is meticulously structured to include:\n\n- Prompt: Located within '[INST]' and '[/INST]' tags and encapsulated by '<s>' and '</s>' tags, the prompt includes the Source Text, Essay Instructions, and the Essay.\n- Completion: Situated outside the instructional tags yet within the '<s>' and '</s>' encapsulation, the completion provides a definitive statement regarding the essay's authorship, indicating it was either \"student-written\" or \"machine-generated.\"",
"### Use Cases\n\nThis dataset is exceptionally suited for:\n\n- Crafting algorithms that can autonomously distinguish between human-authored and AI-generated text.\n- Reinforcing academic integrity tools by identifying submissions that may be AI-generated.\n- Enhancing the capabilities of automated essay scoring systems by introducing them to a wide variety of textual origins.\n- Conducting in-depth research in natural language understanding, particularly in exploring the stylistic and content-based differences between human and AI authors.",
"### Accessing the Dataset\n\nTo access and load the dataset into Python environments, use the following command through Huggingface's 'datasets' library:",
"### Acknowledgments\n\nThis dataset represents a collective endeavor to foster innovation and uphold integrity in academic writing and research. It underscores the community's dedication to improving interactions between humans and AI within educational frameworks.",
"### Disclaimer\n\nDataset users are urged to employ this resource ethically and responsibly, especially in light of its potential impact on educational and research settings. The creators of the dataset and Huggingface explicitly discourage the misuse of AI-generated text for academic dishonesty or any form of deception."
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Academic Essay Prompt-Completion Pairs",
"## Dataset Description\n\nThis dataset is designed to distinguish between essays authored by students and those generated by Large Language Models (LLMs), offering an essential resource for researchers and practitioners in natural language processing, educational technology, and academic integrity. Hosted on Huggingface, it supports the development and evaluation of models aimed at identifying the origin of textual content, which is crucial for a variety of applications including enhancing automated grading systems and detecting AI-generated text in academic submissions.",
"### Dataset Overview\n\nThe dataset consists of prompt-completion pairs, carefully crafted to simulate real-world academic writing scenarios. Each entry within the dataset is uniquely structured, encapsulated by '<s>' and '</s>' tags, ensuring a standardized format. Within these tags, the prompt is specified between '[INST]' and '[/INST]' tags, comprising the Source Text, Essay Instructions, and the Essay. The completion, positioned outside the '[INST]' and '[/INST]' tags but still within the '<s>' and '</s>' encapsulation, categorically states the essay's origin—either as \"This essay was written by an actual student.\" or \"This essay was generated by a Large Language Model.\" This arrangement provides a nuanced classification task, focusing on discerning student-written essays from machine-generated ones.",
"### Structure\n\nUpon utilizing the 'load_dataset' command on Huggingface to access the dataset, users will encounter two primary splits:\n\n- Train: Accounts for approximately 70% of the dataset, tailored for the training of machine learning models.\n- Test: Comprises the remaining 30%, designated for the assessment of the models' performance.",
"#### Fields\n\nEach entry in the dataset is meticulously structured to include:\n\n- Prompt: Located within '[INST]' and '[/INST]' tags and encapsulated by '<s>' and '</s>' tags, the prompt includes the Source Text, Essay Instructions, and the Essay.\n- Completion: Situated outside the instructional tags yet within the '<s>' and '</s>' encapsulation, the completion provides a definitive statement regarding the essay's authorship, indicating it was either \"student-written\" or \"machine-generated.\"",
"### Use Cases\n\nThis dataset is exceptionally suited for:\n\n- Crafting algorithms that can autonomously distinguish between human-authored and AI-generated text.\n- Reinforcing academic integrity tools by identifying submissions that may be AI-generated.\n- Enhancing the capabilities of automated essay scoring systems by introducing them to a wide variety of textual origins.\n- Conducting in-depth research in natural language understanding, particularly in exploring the stylistic and content-based differences between human and AI authors.",
"### Accessing the Dataset\n\nTo access and load the dataset into Python environments, use the following command through Huggingface's 'datasets' library:",
"### Acknowledgments\n\nThis dataset represents a collective endeavor to foster innovation and uphold integrity in academic writing and research. It underscores the community's dedication to improving interactions between humans and AI within educational frameworks.",
"### Disclaimer\n\nDataset users are urged to employ this resource ethically and responsibly, especially in light of its potential impact on educational and research settings. The creators of the dataset and Huggingface explicitly discourage the misuse of AI-generated text for academic dishonesty or any form of deception."
] | [
6,
16,
117,
212,
81,
134,
122,
37,
55,
73
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Academic Essay Prompt-Completion Pairs## Dataset Description\n\nThis dataset is designed to distinguish between essays authored by students and those generated by Large Language Models (LLMs), offering an essential resource for researchers and practitioners in natural language processing, educational technology, and academic integrity. Hosted on Huggingface, it supports the development and evaluation of models aimed at identifying the origin of textual content, which is crucial for a variety of applications including enhancing automated grading systems and detecting AI-generated text in academic submissions.### Dataset Overview\n\nThe dataset consists of prompt-completion pairs, carefully crafted to simulate real-world academic writing scenarios. Each entry within the dataset is uniquely structured, encapsulated by '<s>' and '</s>' tags, ensuring a standardized format. Within these tags, the prompt is specified between '[INST]' and '[/INST]' tags, comprising the Source Text, Essay Instructions, and the Essay. The completion, positioned outside the '[INST]' and '[/INST]' tags but still within the '<s>' and '</s>' encapsulation, categorically states the essay's origin—either as \"This essay was written by an actual student.\" or \"This essay was generated by a Large Language Model.\" This arrangement provides a nuanced classification task, focusing on discerning student-written essays from machine-generated ones.### Structure\n\nUpon utilizing the 'load_dataset' command on Huggingface to access the dataset, users will encounter two primary splits:\n\n- Train: Accounts for approximately 70% of the dataset, tailored for the training of machine learning models.\n- Test: Comprises the remaining 30%, designated for the assessment of the models' performance."
] |
964984dfe8f2614d2a831a7ee5b5018fb76abcf7 | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
These datasets have been used in the TooT-PLM-ionCT tool, which is a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs.
- **Curated by:** Hamed Ghazikhani
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** UniProt/SwissProt
- **Paper [optional]:** https://www.biorxiv.org/content/10.1101/2023.07.11.548644v1.abstract
- **Demo [optional]:** https://tootsuite.encs.concordia.ca/toot_plm_ionct
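### Loading the Dataset

As a minimal sketch of how these datasets can be pulled into Python (the repository id is taken from this card; configuration and split names are not documented here, so they are assumptions to verify by inspecting the returned object):

```python
from datasets import load_dataset

# Repository id as published on the Hugging Face Hub.
ds = load_dataset("ghazikhanihamed/TooT-PLM-ionCT_DB")

print(ds)  # inspect the available splits and columns before training a classifier
```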
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{ghazikhani_exploiting_2023,
title = {Exploiting protein language models for the precise classification of ion channels and ion transporters},
copyright = {© 2023, Posted by Cold Spring Harbor Laboratory. This pre-print is available under a Creative Commons License (Attribution 4.0 International), CC BY 4.0, as described at http://creativecommons.org/licenses/by/4.0/},
url = {https://www.biorxiv.org/content/10.1101/2023.07.11.548644v1},
doi = {10.1101/2023.07.11.548644},
abstract = {This study presents TooT-PLM-ionCT, a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs. These systems exploit the capabilities of six diverse Protein Language Models (PLMs) - ProtBERT, ProtBERT-BFD, ESM-1b, ESM-2 (650M parameters), and ESM-2 (15B parameters). As these proteins play a pivotal role in the regulation of ion movement across cellular membranes, they are integral to numerous biological processes and overall cellular vitality. To circumvent the costly and time-consuming nature of wet lab experiments, we harness the predictive prowess of PLMs, drawing parallels with techniques in natural language processing. Our strategy engages six classifiers, embracing both conventional methodologies and a deep learning model, for each of our defined tasks. Furthermore, we delve into critical factors influencing our tasks, including the implications of dataset balancing, the effect of frozen versus fine-tuned PLM representations, and the potential variance between half and full precision floating-point computations. Our empirical results showcase superior performance in distinguishing ITs from other membrane proteins and differentiating ICs from ITs, while the task of discriminating ICs from other membrane proteins exhibits results commensurate with the current state-of-the-art.},
language = {en},
urldate = {2023-07-31},
publisher = {bioRxiv},
author = {Ghazikhani, Hamed and Butler, Gregory},
month = jul,
year = {2023},
note = {Pages: 2023.07.11.548644
Section: New Results},
file = {Full Text PDF:/Users/hamedghazikhani/Zotero/storage/NVPQKEMJ/Ghazikhani and Butler - 2023 - Exploiting protein language models for the precise.pdf:application/pdf},
}
| ghazikhanihamed/TooT-PLM-ionCT_DB | [
"task_categories:text-classification",
"license:mit",
"region:us"
] | 2024-02-16T19:40:46+00:00 | {"license": "mit", "task_categories": ["text-classification"]} | 2024-02-16T19:53:27+00:00 | [] | [] | TAGS
#task_categories-text-classification #license-mit #region-us
| # Dataset Card for Dataset Name
These datasets have been used in the TooT-PLM-ionCT tool, which is a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs.
- Curated by: Hamed Ghazikhani
### Dataset Sources [optional]
- Repository: UniProt/SwissProt
- Paper [optional]: URL
- Demo [optional]: URL
[optional]
BibTeX:
@misc{ghazikhani_exploiting_2023,
title = {Exploiting protein language models for the precise classification of ion channels and ion transporters},
copyright = {© 2023, Posted by Cold Spring Harbor Laboratory. This pre-print is available under a Creative Commons License (Attribution 4.0 International), CC BY 4.0, as described at URL
url = {URL
doi = {10.1101/2023.07.11.548644},
abstract = {This study presents TooT-PLM-ionCT, a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs. These systems exploit the capabilities of six diverse Protein Language Models (PLMs) - ProtBERT, ProtBERT-BFD, ESM-1b, ESM-2 (650M parameters), and ESM-2 (15B parameters). As these proteins play a pivotal role in the regulation of ion movement across cellular membranes, they are integral to numerous biological processes and overall cellular vitality. To circumvent the costly and time-consuming nature of wet lab experiments, we harness the predictive prowess of PLMs, drawing parallels with techniques in natural language processing. Our strategy engages six classifiers, embracing both conventional methodologies and a deep learning model, for each of our defined tasks. Furthermore, we delve into critical factors influencing our tasks, including the implications of dataset balancing, the effect of frozen versus fine-tuned PLM representations, and the potential variance between half and full precision floating-point computations. Our empirical results showcase superior performance in distinguishing ITs from other membrane proteins and differentiating ICs from ITs, while the task of discriminating ICs from other membrane proteins exhibits results commensurate with the current state-of-the-art.},
language = {en},
urldate = {2023-07-31},
publisher = {bioRxiv},
author = {Ghazikhani, Hamed and Butler, Gregory},
month = jul,
year = {2023},
note = {Pages: 2023.07.11.548644
Section: New Results},
file = {Full Text PDF:/Users/hamedghazikhani/Zotero/storage/NVPQKEMJ/Ghazikhani and Butler - 2023 - Exploiting protein language models for the URL:application/pdf},
}
| [
"# Dataset Card for Dataset Name\n\n\n\nThese datasets has been used in TooT-PLM-ionCT tool which is a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs.\n\n\n\n- Curated by: Hamed Ghazikhani",
"### Dataset Sources [optional]\n\n\n\n- Repository: UniProt/SwissProt\n- Paper [optional]: URL\n- Demo [optional]: URL\n\n[optional]\n\n\n\nBibTeX:\n\n\n\n@misc{ghazikhani_exploiting_2023,\n\ttitle = {Exploiting protein language models for the precise classification of ion channels and ion transporters},\n\tcopyright = {© 2023, Posted by Cold Spring Harbor Laboratory. This pre-print is available under a Creative Commons License (Attribution 4.0 International), CC BY 4.0, as described at URL\n\turl = {URL\n\tdoi = {10.1101/2023.07.11.548644},\n\tabstract = {This study presents TooT-PLM-ionCT, a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs. These systems exploit the capabilities of six diverse Protein Language Models (PLMs) - ProtBERT, ProtBERT-BFD, ESM-1b, ESM-2 (650M parameters), and ESM-2 (15B parameters). As these proteins play a pivotal role in the regulation of ion movement across cellular membranes, they are integral to numerous biological processes and overall cellular vitality. To circumvent the costly and time-consuming nature of wet lab experiments, we harness the predictive prowess of PLMs, drawing parallels with techniques in natural language processing. Our strategy engages six classifiers, embracing both conventional methodologies and a deep learning model, for each of our defined tasks. Furthermore, we delve into critical factors influencing our tasks, including the implications of dataset balancing, the effect of frozen versus fine-tuned PLM representations, and the potential variance between half and full precision floating-point computations. Our empirical results showcase superior performance in distinguishing ITs from other membrane proteins and differentiating ICs from ITs, while the task of discriminating ICs from other membrane proteins exhibits results commensurate with the current state-of-the-art.},\n\tlanguage = {en},\n\turldate = {2023-07-31},\n\tpublisher = {bioRxiv},\n\tauthor = {Ghazikhani, Hamed and Butler, Gregory},\n\tmonth = jul,\n\tyear = {2023},\n\tnote = {Pages: 2023.07.11.548644\nSection: New Results},\n\tfile = {Full Text PDF:/Users/hamedghazikhani/Zotero/storage/NVPQKEMJ/Ghazikhani and Butler - 2023 - Exploiting protein language models for the URL:application/pdf},\n}"
] | [
"TAGS\n#task_categories-text-classification #license-mit #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThese datasets has been used in TooT-PLM-ionCT tool which is a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs.\n\n\n\n- Curated by: Hamed Ghazikhani",
"### Dataset Sources [optional]\n\n\n\n- Repository: UniProt/SwissProt\n- Paper [optional]: URL\n- Demo [optional]: URL\n\n[optional]\n\n\n\nBibTeX:\n\n\n\n@misc{ghazikhani_exploiting_2023,\n\ttitle = {Exploiting protein language models for the precise classification of ion channels and ion transporters},\n\tcopyright = {© 2023, Posted by Cold Spring Harbor Laboratory. This pre-print is available under a Creative Commons License (Attribution 4.0 International), CC BY 4.0, as described at URL\n\turl = {URL\n\tdoi = {10.1101/2023.07.11.548644},\n\tabstract = {This study presents TooT-PLM-ionCT, a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs. These systems exploit the capabilities of six diverse Protein Language Models (PLMs) - ProtBERT, ProtBERT-BFD, ESM-1b, ESM-2 (650M parameters), and ESM-2 (15B parameters). As these proteins play a pivotal role in the regulation of ion movement across cellular membranes, they are integral to numerous biological processes and overall cellular vitality. To circumvent the costly and time-consuming nature of wet lab experiments, we harness the predictive prowess of PLMs, drawing parallels with techniques in natural language processing. Our strategy engages six classifiers, embracing both conventional methodologies and a deep learning model, for each of our defined tasks. Furthermore, we delve into critical factors influencing our tasks, including the implications of dataset balancing, the effect of frozen versus fine-tuned PLM representations, and the potential variance between half and full precision floating-point computations. Our empirical results showcase superior performance in distinguishing ITs from other membrane proteins and differentiating ICs from ITs, while the task of discriminating ICs from other membrane proteins exhibits results commensurate with the current state-of-the-art.},\n\tlanguage = {en},\n\turldate = {2023-07-31},\n\tpublisher = {bioRxiv},\n\tauthor = {Ghazikhani, Hamed and Butler, Gregory},\n\tmonth = jul,\n\tyear = {2023},\n\tnote = {Pages: 2023.07.11.548644\nSection: New Results},\n\tfile = {Full Text PDF:/Users/hamedghazikhani/Zotero/storage/NVPQKEMJ/Ghazikhani and Butler - 2023 - Exploiting protein language models for the URL:application/pdf},\n}"
] | [
22,
114,
668
] | [
"passage: TAGS\n#task_categories-text-classification #license-mit #region-us \n# Dataset Card for Dataset Name\n\n\n\nThese datasets has been used in TooT-PLM-ionCT tool which is a composite framework consisting of three distinct systems, each with different architectures and trained on unique datasets. Each system within TooT-PLM-ionCT is dedicated to a specific task: segregating ion channels (ICs) and ion transporters (ITs) from other membrane proteins and differentiating ICs from ITs.\n\n\n\n- Curated by: Hamed Ghazikhani"
] |
9e312099a8e5031f2a22df917a93b589cae578b6 |
# Dataset Card for Evaluation run of DatPySci/pythia-1b-dpo-full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-dpo-full](https://huggingface.co/DatPySci/pythia-1b-dpo-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-dpo-full",
"harness_winogrande_5",
split="train")
```
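The aggregated metrics live in the "results" configuration mentioned above; as a minimal sketch (the "latest" split name is an assumption carried over from the per-task configurations listed in this card):

```python
from datasets import load_dataset

# "results" is the aggregated configuration described above; "latest" as the
# split name is assumed from the naming used by the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_DatPySci__pythia-1b-dpo-full",
    "results",
    split="latest",
)
print(results[0])
```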
## Latest results
These are the [latest results from run 2024-02-16T19:40:16.291334](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-dpo-full/blob/main/results_2024-02-16T19-40-16.291334.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24547450110873706,
"acc_stderr": 0.030321295510317755,
"acc_norm": 0.24643856591402344,
"acc_norm_stderr": 0.03105260906412087,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023496,
"mc2": 0.3726747548681276,
"mc2_stderr": 0.014362441702987668
},
"harness|arc:challenge|25": {
"acc": 0.2781569965870307,
"acc_stderr": 0.013094469919538814,
"acc_norm": 0.29436860068259385,
"acc_norm_stderr": 0.013318528460539424
},
"harness|hellaswag|10": {
"acc": 0.38657637920732923,
"acc_stderr": 0.004859699562451461,
"acc_norm": 0.4903405696076479,
"acc_norm_stderr": 0.004988850185477489
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073465,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073465
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.02737770662467071,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.02737770662467071
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.027678452578212394,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.027678452578212394
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281337,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281337
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.031478307902595745,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.031478307902595745
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.028501378167893946,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.028501378167893946
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.18787878787878787,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.18787878787878787,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752947,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026924,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026924
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823019,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823019
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29724770642201837,
"acc_stderr": 0.01959570722464353,
"acc_norm": 0.29724770642201837,
"acc_norm_stderr": 0.01959570722464353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891162,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891162
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.016073127851221235,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.016073127851221235
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881665,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729903,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729903
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981648,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.027146271936625162,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.027146271936625162
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366835,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366835
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721378,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355554,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.0332939411907353,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.0332939411907353
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.1871345029239766,
"acc_stderr": 0.02991312723236805,
"acc_norm": 0.1871345029239766,
"acc_norm_stderr": 0.02991312723236805
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023496,
"mc2": 0.3726747548681276,
"mc2_stderr": 0.014362441702987668
},
"harness|winogrande|5": {
"acc": 0.5343330702446725,
"acc_stderr": 0.014019317531542565
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787357082
}
}
```
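To relate the per-task entries to the aggregate numbers, the JSON above can be parsed directly; a minimal sketch, assuming the block has been saved locally as `results.json` (a hypothetical file name):

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Rank the MMLU (hendrycksTest) subtasks by normalized accuracy.
mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc_norm:.3f}")
```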
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DatPySci__pythia-1b-dpo-full | [
"region:us"
] | 2024-02-16T19:41:59+00:00 | {"pretty_name": "Evaluation run of DatPySci/pythia-1b-dpo-full", "dataset_summary": "Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-dpo-full](https://huggingface.co/DatPySci/pythia-1b-dpo-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DatPySci__pythia-1b-dpo-full\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T19:40:16.291334](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-dpo-full/blob/main/results_2024-02-16T19-40-16.291334.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24547450110873706,\n \"acc_stderr\": 0.030321295510317755,\n \"acc_norm\": 0.24643856591402344,\n \"acc_norm_stderr\": 0.03105260906412087,\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023496,\n \"mc2\": 0.3726747548681276,\n \"mc2_stderr\": 0.014362441702987668\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2781569965870307,\n \"acc_stderr\": 0.013094469919538814,\n \"acc_norm\": 0.29436860068259385,\n \"acc_norm_stderr\": 0.013318528460539424\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38657637920732923,\n \"acc_stderr\": 0.004859699562451461,\n \"acc_norm\": 0.4903405696076479,\n \"acc_norm_stderr\": 0.004988850185477489\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073465,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073465\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.02737770662467071,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.02737770662467071\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 
0.039427724440366255,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.027678452578212394,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.027678452578212394\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281337,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281337\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.031478307902595745,\n \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.031478307902595745\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.23548387096774193,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.028501378167893946,\n \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.028501378167893946\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.18787878787878787,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.18787878787878787,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752947,\n \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752947\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026924,\n \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026924\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823019,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823019\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29724770642201837,\n \"acc_stderr\": 0.01959570722464353,\n \"acc_norm\": 0.29724770642201837,\n \"acc_norm_stderr\": 0.01959570722464353\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.029872577708891162,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.029872577708891162\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.280970625798212,\n \"acc_stderr\": 0.016073127851221235,\n \"acc_norm\": 0.280970625798212,\n \"acc_norm_stderr\": 0.016073127851221235\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n \"acc_stderr\": 0.02512263760881665,\n \"acc_norm\": 0.26688102893890675,\n \"acc_norm_stderr\": 0.02512263760881665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729903,\n \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729903\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n \"acc_stderr\": 0.010906282617981648,\n \"acc_norm\": 0.23989569752281617,\n \"acc_norm_stderr\": 0.010906282617981648\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.027146271936625162,\n \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.027146271936625162\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366835,\n \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366835\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.04069306319721378,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.04069306319721378\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.031157150869355554,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.031157150869355554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n \"acc_stderr\": 0.0332939411907353,\n \"acc_norm\": 0.24096385542168675,\n \"acc_norm_stderr\": 0.0332939411907353\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.1871345029239766,\n \"acc_stderr\": 0.02991312723236805,\n \"acc_norm\": 0.1871345029239766,\n \"acc_norm_stderr\": 0.02991312723236805\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023496,\n \"mc2\": 0.3726747548681276,\n \"mc2_stderr\": 0.014362441702987668\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5343330702446725,\n \"acc_stderr\": 0.014019317531542565\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \"acc_stderr\": 
0.0038289829787357082\n }\n}\n```", "repo_url": "https://huggingface.co/DatPySci/pythia-1b-dpo-full", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|arc:challenge|25_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|gsm8k|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hellaswag|10_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T19-40-16.291334.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T19-40-16.291334.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T19-40-16.291334.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T19-40-16.291334.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T19-40-16.291334.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T19_40_16.291334", "path": ["**/details_harness|winogrande|5_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T19-40-16.291334.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T19_40_16.291334", "path": ["results_2024-02-16T19-40-16.291334.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T19-40-16.291334.parquet"]}]}]} | 2024-02-16T19:42:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DatPySci/pythia-1b-dpo-full
Dataset automatically created during the evaluation run of model DatPySci/pythia-1b-dpo-full on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
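A minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repository naming and the `harness_winogrande_5` task config listed in this dataset's metadata (the repo id is reconstructed, not verified):

```python
from datasets import load_dataset

# Repo id follows the leaderboard naming pattern above (an assumption);
# any of the 63 task configs can be substituted for the one shown here.
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-dpo-full",
                    "harness_winogrande_5",
                    split="train")  # "train" tracks the latest run
```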
## Latest results
These are the latest results from run 2024-02-16T19:40:16.291334 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of DatPySci/pythia-1b-dpo-full\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-dpo-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T19:40:16.291334(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DatPySci/pythia-1b-dpo-full\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-dpo-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T19:40:16.291334(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DatPySci/pythia-1b-dpo-full\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-dpo-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T19:40:16.291334(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
570aec4096bbf95de5b7d85f28a0faf591554e09 | WIP
| manestay/borderlines | [
"region:us"
] | 2024-02-16T19:46:26+00:00 | {"configs": [{"config_name": "territories", "data_files": "disputed_territories.csv"}, {"config_name": "countries", "data_files": "countries_info.csv"}, {"config_name": "queries", "data_files": [{"split": "iw", "path": "queries/iw.csv"}, {"split": "sq", "path": "queries/sq.csv"}, {"split": "ar", "path": "queries/ar.csv"}, {"split": "az", "path": "queries/az.csv"}, {"split": "bn", "path": "queries/bn.csv"}, {"split": "bs", "path": "queries/bs.csv"}, {"split": "da", "path": "queries/da.csv"}, {"split": "el", "path": "queries/el.csv"}, {"split": "en", "path": "queries/en.csv"}, {"split": "es", "path": "queries/es.csv"}, {"split": "fa", "path": "queries/fa.csv"}, {"split": "fr", "path": "queries/fr.csv"}, {"split": "hi", "path": "queries/hi.csv"}, {"split": "hr", "path": "queries/hr.csv"}, {"split": "ht", "path": "queries/ht.csv"}, {"split": "hy", "path": "queries/hy.csv"}, {"split": "id", "path": "queries/id.csv"}, {"split": "is", "path": "queries/is.csv"}, {"split": "it", "path": "queries/it.csv"}, {"split": "ja", "path": "queries/ja.csv"}, {"split": "ka", "path": "queries/ka.csv"}, {"split": "km", "path": "queries/km.csv"}, {"split": "ko", "path": "queries/ko.csv"}, {"split": "ky", "path": "queries/ky.csv"}, {"split": "lo", "path": "queries/lo.csv"}, {"split": "mg", "path": "queries/mg.csv"}, {"split": "mn", "path": "queries/mn.csv"}, {"split": "ms", "path": "queries/ms.csv"}, {"split": "my", "path": "queries/my.csv"}, {"split": "ne", "path": "queries/ne.csv"}, {"split": "nl", "path": "queries/nl.csv"}, {"split": "pt", "path": "queries/pt.csv"}, {"split": "ru", "path": "queries/ru.csv"}, {"split": "sl", "path": "queries/sl.csv"}, {"split": "sn", "path": "queries/sn.csv"}, {"split": "so", "path": "queries/so.csv"}, {"split": "sr", "path": "queries/sr.csv"}, {"split": "sw", "path": "queries/sw.csv"}, {"split": "tg", "path": "queries/tg.csv"}, {"split": "th", "path": "queries/th.csv"}, {"split": "ti", "path": "queries/ti.csv"}, {"split": "tl", "path": "queries/tl.csv"}, {"split": "tr", "path": "queries/tr.csv"}, {"split": "uk", "path": "queries/uk.csv"}, {"split": "ur", "path": "queries/ur.csv"}, {"split": "uz", "path": "queries/uz.csv"}, {"split": "vi", "path": "queries/vi.csv"}, {"split": "zht", "path": "queries/zh-TW.csv"}, {"split": "zhs", "path": "queries/zh-CN.csv"}]}]} | 2024-02-17T00:21:20+00:00 | [] | [] | TAGS
#region-us
| WIP
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
901e017600b5599f1ef26be5e49cc9c71a40443b | # Dataset Card for "lipid_droplets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jhaberbe/lipid_droplets | [
"region:us"
] | 2024-02-16T20:04:21+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 17759568.0, "num_examples": 16}, {"name": "validation", "num_bytes": 16724901.0, "num_examples": 15}], "download_size": 0, "dataset_size": 34484469.0}} | 2024-02-16T20:15:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "lipid_droplets"
More Information needed | [
"# Dataset Card for \"lipid_droplets\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"lipid_droplets\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"lipid_droplets\"\n\nMore Information needed"
] |
d314e9a00e210848c69bb8c647b14f0154678296 |
# Dataset Card for Evaluation run of chasedreaminf/Dream-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chasedreaminf/Dream-7B-slerp](https://huggingface.co/chasedreaminf/Dream-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Each config corresponds to one evaluated task; each split is a
# timestamped run, and "train" points to the latest results.
data = load_dataset("open-llm-leaderboard/details_chasedreaminf__Dream-7B-slerp",
	"harness_winogrande_5",
	split="train")
```
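
For a quick look at the per-sample records, the loaded split can be converted to a pandas DataFrame (a small usage sketch, assuming pandas is installed):

```python
df = data.to_pandas()  # datasets.Dataset -> pandas.DataFrame
print(df.head())       # peek at the first few evaluation records
```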
## Latest results
These are the [latest results from run 2024-02-16T20:05:27.586603](https://huggingface.co/datasets/open-llm-leaderboard/details_chasedreaminf__Dream-7B-slerp/blob/main/results_2024-02-16T20-05-27.586603.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6507213038118388,
"acc_stderr": 0.03214268781825063,
"acc_norm": 0.6503644948036131,
"acc_norm_stderr": 0.03280953874341137,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6184735043628918,
"mc2_stderr": 0.015107906651203224
},
"harness|arc:challenge|25": {
"acc": 0.6663822525597269,
"acc_stderr": 0.013778687054176538,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.013572657703084948
},
"harness|hellaswag|10": {
"acc": 0.6787492531368253,
"acc_stderr": 0.004660025270817022,
"acc_norm": 0.8634734116709819,
"acc_norm_stderr": 0.0034264517445078474
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040697,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990334,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990334
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.016553287863116037,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.016553287863116037
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6184735043628918,
"mc2_stderr": 0.015107906651203224
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569562
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.01233344758104754
}
}
```
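
To consume these aggregated numbers programmatically rather than copying them from the JSON above, one option is to load the "results" configuration mentioned earlier (a sketch based on the config and split names in this card; the exact column layout is not specified here, so inspect the schema first):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points to the most
# recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_chasedreaminf__Dream-7B-slerp",
                       "results", split="latest")
print(results)  # check available columns before relying on them
```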
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chasedreaminf__Dream-7B-slerp | [
"region:us"
] | 2024-02-16T20:07:47+00:00 | {"pretty_name": "Evaluation run of chasedreaminf/Dream-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [chasedreaminf/Dream-7B-slerp](https://huggingface.co/chasedreaminf/Dream-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chasedreaminf__Dream-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T20:05:27.586603](https://huggingface.co/datasets/open-llm-leaderboard/details_chasedreaminf__Dream-7B-slerp/blob/main/results_2024-02-16T20-05-27.586603.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6507213038118388,\n \"acc_stderr\": 0.03214268781825063,\n \"acc_norm\": 0.6503644948036131,\n \"acc_norm_stderr\": 0.03280953874341137,\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6184735043628918,\n \"mc2_stderr\": 0.015107906651203224\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6663822525597269,\n \"acc_stderr\": 0.013778687054176538,\n \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6787492531368253,\n \"acc_stderr\": 0.004660025270817022,\n \"acc_norm\": 0.8634734116709819,\n \"acc_norm_stderr\": 0.0034264517445078474\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.01358661921990334,\n 
\"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.01358661921990334\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n \"acc_stderr\": 0.016553287863116037,\n \"acc_norm\": 0.42905027932960893,\n \"acc_norm_stderr\": 0.016553287863116037\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6184735043628918,\n \"mc2_stderr\": 0.015107906651203224\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569562\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 0.01233344758104754\n }\n}\n```", "repo_url": 
"https://huggingface.co/chasedreaminf/Dream-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|arc:challenge|25_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|gsm8k|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hellaswag|10_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T20-05-27.586603.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T20-05-27.586603.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T20-05-27.586603.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T20-05-27.586603.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T20-05-27.586603.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T20-05-27.586603.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["**/details_harness|winogrande|5_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T20-05-27.586603.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T20_05_27.586603", "path": ["results_2024-02-16T20-05-27.586603.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T20-05-27.586603.parquet"]}]}]} | 2024-02-16T20:08:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chasedreaminf/Dream-7B-slerp
Dataset automatically created during the evaluation run of model chasedreaminf/Dream-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
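The snippet itself is not reproduced in this copy of the card; below is a minimal sketch modeled on the equivalent snippet in the other evaluation-run cards in this document. The `details_chasedreaminf__Dream-7B-slerp` repo name and the `harness_winogrande_5` config are assumptions based on the leaderboard's usual naming pattern:

```python
from datasets import load_dataset

# Repo and config names follow the leaderboard's naming convention (assumed here)
data = load_dataset("open-llm-leaderboard/details_chasedreaminf__Dream-7B-slerp",
                    "harness_winogrande_5",
                    split="train")
```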
## Latest results
These are the latest results from run 2024-02-16T20:05:27.586603 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
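For quick reference, a short excerpt of the aggregate scores from this run, copied from the results metadata earlier in this entry (stderr values omitted; the full per-task breakdown lives in the "results" configuration):

```python
{
    "harness|truthfulqa:mc|0": {"mc1": 0.4724602203182375, "mc2": 0.6184735043628918},
    "harness|winogrande|5": {"acc": 0.8034727703235991},
    "harness|gsm8k|5": {"acc": 0.7225170583775588}
}
```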
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of chasedreaminf/Dream-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model chasedreaminf/Dream-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T20:05:27.586603 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chasedreaminf/Dream-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model chasedreaminf/Dream-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T20:05:27.586603 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chasedreaminf/Dream-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model chasedreaminf/Dream-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T20:05:27.586603 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
85dcc431554f48343a29bbb35fd6617a1bd0fdb6 | # Dataset Card for "test_el_talar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ganader-ai-developers/test_el_talar | [
"region:us"
] | 2024-02-16T20:09:09+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "file_name", "dtype": "string"}, {"name": "cow_id", "dtype": "int64"}, {"name": "weight", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "breed", "dtype": "string"}, {"name": "sex", "dtype": "string"}, {"name": "orientation", "dtype": "string"}, {"name": "internal_cow_id", "dtype": "string"}, {"name": "vertical_distance_meters", "dtype": "float64"}, {"name": "horizontal_distance_meters", "dtype": "float64"}, {"name": "picture_quality", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 595981, "num_examples": 2999}], "download_size": 41295, "dataset_size": 595981}} | 2024-02-16T20:09:12+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "test_el_talar"
More Information needed | [
"# Dataset Card for \"test_el_talar\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"test_el_talar\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"test_el_talar\"\n\nMore Information needed"
] |
0994b0ceeb5f421883b0836717e2a61d9dca2175 | # Dataset Card for "safety-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | DavideG/safety-data | [
"region:us"
] | 2024-02-16T20:12:08+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 1461769623, "num_examples": 458194}, {"name": "test", "num_bytes": 129475839, "num_examples": 41596}], "download_size": 312584599, "dataset_size": 1591245462}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-17T00:44:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "safety-data"
More Information needed | [
"# Dataset Card for \"safety-data\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"safety-data\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"safety-data\"\n\nMore Information needed"
] |
42e63c53aa060086b9371c9f3e1fd6cfe67724db | # Dataset Card for "MeltpoolsLabeled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | MottsCoding/MeltpoolsLabeled | [
"region:us"
] | 2024-02-16T20:16:54+00:00 | {"dataset_info": {"features": [{"name": "images", "dtype": "image"}, {"name": "labels", "sequence": {"sequence": "int32"}}], "splits": [{"name": "train", "num_bytes": 51624539.0, "num_examples": 12}], "download_size": 15662464, "dataset_size": 51624539.0}} | 2024-02-16T20:30:45+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "MeltpoolsLabeled"
More Information needed | [
"# Dataset Card for \"MeltpoolsLabeled\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"MeltpoolsLabeled\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"MeltpoolsLabeled\"\n\nMore Information needed"
] |
9bf956e118f917bf1908f051946e03b753d30b99 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset is [alexandrainst/nst-da](https://huggingface.co/datasets/alexandrainst/nst-da) with pseudo labels (sentiment) added by [alexandrainst/da-sentiment-base](https://huggingface.co/alexandrainst/da-sentiment-base), with a text-length filter of 24
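The labeling script is not included in this card; the sketch below shows one plausible way to reproduce the pseudo-labeling described above. The `text` column name, the batching settings, and the reading of the length filter as a minimum are all assumptions, not part of the original card:

```python
from datasets import load_dataset
from transformers import pipeline

# Load the source dataset and the Danish sentiment model named above
ds = load_dataset("alexandrainst/nst-da", split="train")
clf = pipeline("text-classification", model="alexandrainst/da-sentiment-base")

# Assumed reading of "a text-length filter of 24": keep texts of length >= 24
ds = ds.filter(lambda ex: len(ex["text"]) >= 24)

def add_sentiment(batch):
    # Attach the classifier's predicted label as a pseudo sentiment label
    preds = clf(batch["text"])
    batch["sentiment"] = [p["label"] for p in preds]
    return batch

ds = ds.map(add_sentiment, batched=True, batch_size=32)
```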
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
Be aware that the sentiment distribution is imbalanced.
Random sample of 0.1% of the dataset:
![image/png](https://cdn-uploads.huggingface.co/production/uploads/63ea8326a332618465d2ab69/PxK-m7T61pR-OHgwrLV5q.png)
All audio is sampled at 16000 Hz
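Since the clips are already at 16 kHz, no resampling should be needed; casting the audio column simply makes the expected rate explicit when loading. A sketch, where the `audio` column and `train` split names are assumptions:

```python
from datasets import Audio, load_dataset

ds = load_dataset("overflowwwww/nst-da-sentiment-unbalanced", split="train")
# A no-op if clips are already 16 kHz, but documents the assumption in code
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
```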
- **Curated by:** [overflowwwww](https://huggingface.co/datasets/overflowwwww)
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | overflowwwww/nst-da-sentiment-unbalanced | [
"license:cc0-1.0",
"region:us"
] | 2024-02-16T20:33:10+00:00 | {"license": "cc0-1.0"} | 2024-02-17T08:49:59+00:00 | [] | [] | TAGS
#license-cc0-1.0 #region-us
|
# Dataset Card for Dataset Name
This dataset is alexandrainst/nst-da with pseudo labels (sentiment) added by alexandrainst/da-sentiment-base, with a text-length filter of 24
## Dataset Details
### Dataset Description
Be aware that the sentiment distribution is imbalanced.
Random sample of 0.1% of the dataset:
!image/png
All audio is sampled at 16000 Hz
- Curated by: overflowwwww
### Dataset Sources [optional]
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset is alexandrainst/nst-da with pseudo labels (sentiment) added by alexandrainst/da-sentiment-base, with a text-length filter of 24",
"## Dataset Details",
"### Dataset Description\n\n\nBe aware that the sentiment distribution is imbalanced.\nRandom sample of 0.1% of the dataset:\n!image/png\n\nAll audio is sampled at 16000 Hz\n\n- Curated by: overflowwwww",
"### Dataset Sources [optional]",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-cc0-1.0 #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset is alexandrainst/nst-da with pseudo labels (sentiment) added by alexandrainst/da-sentiment-base, with a text-length filter of 24",
"## Dataset Details",
"### Dataset Description\n\n\nBe aware that the sentiment distribution is imbalanced.\nRandom sample of 0.1% of the dataset:\n!image/png\n\nAll audio is sampled at 16000 Hz\n\n- Curated by: overflowwwww",
"### Dataset Sources [optional]",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
14,
51,
4,
46,
10,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#license-cc0-1.0 #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset is alexandrainst/nst-da with pseudo labels (sentiment) added by alexandrainst/da-sentiment-base, with a text-length filter of 24## Dataset Details### Dataset Description\n\n\nBe aware that the sentiment distribution is imbalanced.\nRandom sample of 0.1% of the dataset:\n!image/png\n\nAll audio is sampled at 16000 Hz\n\n- Curated by: overflowwwww### Dataset Sources [optional]## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
dd819082e978d2462d51cffa42f7dbb6b45de8dd |
# Dataset Card for Evaluation run of liminerity/ultra0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/ultra0](https://huggingface.co/liminerity/ultra0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande task;
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_liminerity__ultra0",
	"harness_winogrande_5",
	split="train")
```
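
If you only need the aggregated scores rather than the per-sample details, the "results" configuration can be loaded the same way. A minimal sketch (the config and split names follow the description above; the exact column layout of this auto-generated dataset is an assumption here, so inspect the schema before indexing into it):

```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split points to the newest run.
results = load_dataset("open-llm-leaderboard/details_liminerity__ultra0",
	"results",
	split="latest")
print(results.column_names)  # check the available fields before indexing into them
```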
## Latest results
These are the [latest results from run 2024-02-16T21:00:48.786495](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__ultra0/blob/main/results_2024-02-16T21-00-48.786495.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3398058115621321,
"acc_stderr": 0.033239808655417924,
"acc_norm": 0.3409105453426057,
"acc_norm_stderr": 0.033965935374477625,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015027,
"mc2": 0.41485075799478666,
"mc2_stderr": 0.014670252998442896
},
"harness|arc:challenge|25": {
"acc": 0.3856655290102389,
"acc_stderr": 0.01422425097325718,
"acc_norm": 0.41467576791808874,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.5078669587731528,
"acc_stderr": 0.004989163747650774,
"acc_norm": 0.6802429794861581,
"acc_norm_stderr": 0.004654291661255925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3584905660377358,
"acc_stderr": 0.02951470358398176,
"acc_norm": 0.3584905660377358,
"acc_norm_stderr": 0.02951470358398176
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.10784313725490197,
"acc_stderr": 0.03086428212206013,
"acc_norm": 0.10784313725490197,
"acc_norm_stderr": 0.03086428212206013
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162466,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162466
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138621,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138621
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3741935483870968,
"acc_stderr": 0.027528904299845787,
"acc_norm": 0.3741935483870968,
"acc_norm_stderr": 0.027528904299845787
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.03878372113711274,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.03878372113711274
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03547601494006936,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03547601494006936
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204426,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204426
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136084,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136084
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3357798165137615,
"acc_stderr": 0.02024808139675293,
"acc_norm": 0.3357798165137615,
"acc_norm_stderr": 0.02024808139675293
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.45569620253164556,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.45569620253164556,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199986,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199986
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.031660988918880785,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.031660988918880785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.45977011494252873,
"acc_stderr": 0.017821994096933535,
"acc_norm": 0.45977011494252873,
"acc_norm_stderr": 0.017821994096933535
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.02475241196091722,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.02475241196091722
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.02768418188330289,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.02768418188330289
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.02791705074848462,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.02791705074848462
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2757496740547588,
"acc_stderr": 0.011413813609160986,
"acc_norm": 0.2757496740547588,
"acc_norm_stderr": 0.011413813609160986
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.33169934640522875,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.33169934640522875,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252088
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982073,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982073
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3283582089552239,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.3283582089552239,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4093567251461988,
"acc_stderr": 0.03771283107626544,
"acc_norm": 0.4093567251461988,
"acc_norm_stderr": 0.03771283107626544
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015027,
"mc2": 0.41485075799478666,
"mc2_stderr": 0.014670252998442896
},
"harness|winogrande|5": {
"acc": 0.6550907655880032,
"acc_stderr": 0.01335937980503369
},
"harness|gsm8k|5": {
"acc": 0.1607278241091736,
"acc_stderr": 0.010116708586037183
}
}
```
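
To rank the individual tasks by accuracy, the dictionary above can be flattened into a small table. A sketch assuming the JSON has been saved locally as `results.json` (the filename is illustrative; entries without an `acc` field, such as `harness|truthfulqa:mc|0`, are simply skipped):

```python
import json

# Load the "Latest results" JSON shown above.
with open("results.json") as f:
    scores = json.load(f)

# Keep only per-task accuracy entries and print the five strongest tasks.
tasks = {name: v["acc"] for name, v in scores.items() if "acc" in v and name != "all"}
for name, acc in sorted(tasks.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{acc:.3f}  {name}")
```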
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_liminerity__ultra0 | [
"region:us"
] | 2024-02-16T21:03:10+00:00 | {"pretty_name": "Evaluation run of liminerity/ultra0", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/ultra0](https://huggingface.co/liminerity/ultra0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__ultra0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T21:00:48.786495](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__ultra0/blob/main/results_2024-02-16T21-00-48.786495.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3398058115621321,\n \"acc_stderr\": 0.033239808655417924,\n \"acc_norm\": 0.3409105453426057,\n \"acc_norm_stderr\": 0.033965935374477625,\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015027,\n \"mc2\": 0.41485075799478666,\n \"mc2_stderr\": 0.014670252998442896\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.01422425097325718,\n \"acc_norm\": 0.41467576791808874,\n \"acc_norm_stderr\": 0.014397070564409174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5078669587731528,\n \"acc_stderr\": 0.004989163747650774,\n \"acc_norm\": 0.6802429794861581,\n \"acc_norm_stderr\": 0.004654291661255925\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3584905660377358,\n \"acc_stderr\": 0.02951470358398176,\n \"acc_norm\": 0.3584905660377358,\n \"acc_norm_stderr\": 0.02951470358398176\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n 
\"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.10784313725490197,\n \"acc_stderr\": 0.03086428212206013,\n \"acc_norm\": 0.10784313725490197,\n \"acc_norm_stderr\": 0.03086428212206013\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162466,\n \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162466\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138621,\n \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138621\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3741935483870968,\n \"acc_stderr\": 0.027528904299845787,\n \"acc_norm\": 0.3741935483870968,\n \"acc_norm_stderr\": 0.027528904299845787\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.03878372113711274,\n \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.03878372113711274\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.03547601494006936,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03547601494006936\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048574,\n \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048574\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.26153846153846155,\n 
\"acc_stderr\": 0.022282141204204426,\n \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204426\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136084,\n \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136084\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3357798165137615,\n \"acc_stderr\": 0.02024808139675293,\n \"acc_norm\": 0.3357798165137615,\n \"acc_norm_stderr\": 0.02024808139675293\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4068627450980392,\n \"acc_stderr\": 0.03447891136353382,\n \"acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.03447891136353382\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.45569620253164556,\n \"acc_stderr\": 0.03241920684693334,\n \"acc_norm\": 0.45569620253164556,\n \"acc_norm_stderr\": 0.03241920684693334\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.3542600896860987,\n \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.04243869242230524,\n \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.04243869242230524\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.031660988918880785,\n \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.031660988918880785\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.45977011494252873,\n \"acc_stderr\": 0.017821994096933535,\n \"acc_norm\": 
0.45977011494252873,\n \"acc_norm_stderr\": 0.017821994096933535\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091722,\n \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091722\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.02768418188330289,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.02768418188330289\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n \"acc_stderr\": 0.02791705074848462,\n \"acc_norm\": 0.40836012861736337,\n \"acc_norm_stderr\": 0.02791705074848462\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2757496740547588,\n \"acc_stderr\": 0.011413813609160986,\n \"acc_norm\": 0.2757496740547588,\n \"acc_norm_stderr\": 0.011413813609160986\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.33169934640522875,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.33169934640522875,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982073,\n \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982073\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3283582089552239,\n \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.3283582089552239,\n \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4093567251461988,\n \"acc_stderr\": 0.03771283107626544,\n \"acc_norm\": 0.4093567251461988,\n \"acc_norm_stderr\": 0.03771283107626544\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015027,\n \"mc2\": 0.41485075799478666,\n \"mc2_stderr\": 0.014670252998442896\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6550907655880032,\n \"acc_stderr\": 0.01335937980503369\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1607278241091736,\n \"acc_stderr\": 0.010116708586037183\n }\n}\n```", "repo_url": 
"https://huggingface.co/liminerity/ultra0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|arc:challenge|25_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|gsm8k|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hellaswag|10_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-00-48.786495.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["**/details_harness|winogrande|5_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T21-00-48.786495.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T21_00_48.786495", "path": ["results_2024-02-16T21-00-48.786495.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T21-00-48.786495.parquet"]}]}]} | 2024-02-16T21:03:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of liminerity/ultra0
Dataset automatically created during the evaluation run of model liminerity/ultra0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
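For instance (the repository id below is an assumption, inferred from the standard `details_<org>__<model>` naming these leaderboard detail repos use):

```python
from datasets import load_dataset

# Repository id inferred from the usual details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_liminerity__ultra0",
                    "harness_winogrande_5",
                    split="train")
```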
## Latest results
These are the latest results from run 2024-02-16T21:00:48.786495 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of liminerity/ultra0\n\n\n\nDataset automatically created during the evaluation run of model liminerity/ultra0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T21:00:48.786495(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of liminerity/ultra0\n\n\n\nDataset automatically created during the evaluation run of model liminerity/ultra0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T21:00:48.786495(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of liminerity/ultra0\n\n\n\nDataset automatically created during the evaluation run of model liminerity/ultra0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T21:00:48.786495(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
79a6996a6433d614c234b3d374d574cb91ffd0f4 | <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Data Card</title>
<link href="https://fonts.googleapis.com/css2?family=Quicksand:wght@400;500;600&display=swap" rel="stylesheet">
<style>
body {
font-family: 'Quicksand', sans-serif;
background-color: #1A202C;
color: #D8DEE9;
margin: 0;
padding: 0; /* Remove default padding */
font-size: 16px;
background: linear-gradient(135deg, #2E3440 0%, #1A202C 100%);
}
p {
padding-left: 10px
}
.container {
width: 100%;
margin: auto;
background-color: rgb(255 255 255 / 1%);
padding: 20px 30px 40px; /* Add padding below the image only */
padding-right: 32px;
border-radius: 12px;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
border: 1px solid rgba(255, 255, 255, 0.05);
}
.header {
display: flex;
align-items: center;
justify-content: space-between;
gap: 20px;
}
img {
border-radius: 10px 10px 0 0!important;
padding-left: 0px !important;
}
.header h1 {
font-size: 28px;
color: #ECEFF4;
margin: 0;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
.info {
background-color: rgba(255, 255, 255, 0.05);
color: #AEBAC7;
border-radius: 12px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.2);
font-size: 14px;
line-height: 1.6;
margin-left: 5px;
overflow-x: auto;
margin-top: 20px; /* Adjusted margin */
border: 1px solid rgba(255, 255, 255, 0.05);
transition: background-color 0.6s ease; /* Smooth transition over 0.5 seconds */
}
.info:hover {
}
.info img {
width: 100%;
border-radius: 10px 10px 0 0;
margin-top: -20px; /* Negative margin to overlap container margin */
}
a {
color: #88C0D0;
text-decoration: none;
transition: color 0.3s ease;
position: relative;
}
a:hover {
color: #A3BE8C;
text-decoration: none;
}
a::before {
content: '';
position: absolute;
width: 100%;
height: 2px;
bottom: 0;
left: 0;
background-color: #A3BE8C;
visibility: hidden;
transform: scaleX(0);
transition: all 0.3s ease-in-out;
}
a:hover::before {
visibility: visible;
transform: scaleX(1);
}
.button {
display: inline-block;
background-color: #5E81AC;
color: #E5E9F0;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
text-decoration: none;
transition: background-color 0.3s ease;
}
.button:hover {
background-color: #81A1C1;
}
</style>
</head>
<body>
<div class="container">
<div class="header">
<h1>Aether Dataset</h1>
</div>
<div class="info">
<img src="https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/N4UWofDAapZ_kCraMuQDJ.webp" style="border-radius: 10px;">
<p><strong>Creator:</strong> <a href="https://huggingface.co/Steelskull" target="_blank">SteelSkull</a></p>
<p><strong>Community Organization:</strong> <a href="https://huggingface.co/ConvexAI" target="_blank">ConvexAI</a></p>
<p><strong>Discord:</strong> <a href="https://discord.gg/yYqmNmg7Wj" target="_blank">Join us on Discord</a></p>
<div>
<div>
            <p><strong>About Aether:</strong> The Aether dataset is a merged, filtered, and deduplicated collection of conversational datasets.</p>
            <p>Rebuilt script; new dataset.</p>
            <p>From v1.2.2 to v1.5: the source datasets changed, and two were added.</p>
            <p>Version v1.5 reworks the human -> gpt conversations and adds system and tool columns.</p>
<p><strong>Source Datasets:</strong></p>
<ul>
<li>grimulkan/bluemoon_Karen_cleaned</li>
<li>Doctor-Shotgun/no-robots-sharegpt</li>
<li>Locutusque/hercules-v2.5</li>
<li>jondurbin/airoboros-3.2</li>
<li>openerotica/freedom-rp</li>
<li>teknium/OpenHermes-2.5</li>
<li>Doctor-Shotgun/capybara-sharegpt</li>
<li>KaraKaraWitch/PIPPA-ShareGPT-formatted</li>
<li>Locutusque/bagel-clean-v0.3-shuffled</li>
</ul>
<p><strong>Phrases and Data Removed:</strong></p>
            <p>To enhance the dataset's coherence and relevance across varied contexts, certain phrases have been selectively omitted. Each dataset is run against a "keyed" list of phrases; a sketch of this pass follows the filter list below.</p>
            <p><strong>Filtering Stats:</strong></p>
            <p>Total objects removed: 72114</p>
            <p><strong>Deduplication:</strong></p>
            <p>Initial row count: 3296307</p>
            <p>Final row count: 2728791</p>
            <p>Rows removed: 567516</p>
            <p><strong>Filter:</strong></p>
<ul>
<li>Couldn't help but</li>
<li>Can't resist</li>
<li>I'm sorry, but</li>
<li>As an AI</li>
<li>However, it is important to</li>
<li>Cannot provide</li>
<li>And others</li>
</ul>
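            <p>A minimal sketch of this kind of pass (illustrative only; the input file name and the short phrase list here are assumptions, not the actual cleaning script):</p>
            <pre><code>import json

# Assumed subset of the "keyed" phrase list; matching is case-insensitive.
KEYED_PHRASES = ["couldn't help but", "can't resist", "i'm sorry, but", "as an ai"]

def is_clean(row):
    # Drop any object whose turns contain a keyed phrase.
    return not any(
        phrase in turn["value"].lower()
        for turn in row["conversations"]
        for phrase in KEYED_PHRASES
    )

with open("aether_raw.jsonl") as f:  # hypothetical input file
    rows = [json.loads(line) for line in f]

rows = [r for r in rows if is_clean(r)]  # phrase filtering

seen, deduped = set(), []  # exact-duplicate removal
for row in rows:
    key = json.dumps(row["conversations"], sort_keys=True)
    if key not in seen:
        seen.add(key)
        deduped.append(row)
</code></pre>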
</div>
</div>
</div>
</div>
</body>
</html> | Steelskull/Aether-v1.5 | [
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"not-for-all-audiences",
"region:us"
] | 2024-02-16T21:22:50+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "system", "dtype": "string"}, {"name": "tools", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4655376981, "num_examples": 2712289}], "download_size": 2446047146, "dataset_size": 4655376981}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["not-for-all-audiences"]} | 2024-02-17T16:42:58+00:00 | [] | [
"en"
] | TAGS
#size_categories-1M<n<10M #language-English #license-apache-2.0 #not-for-all-audiences #region-us
| <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Data Card</title>
<link href="URL rel="stylesheet">
<style>
body {
font-family: 'Quicksand', sans-serif;
background-color: #1A202C;
color: #D8DEE9;
margin: 0;
padding: 0; /* Remove default padding */
font-size: 16px;
background: linear-gradient(135deg, #2E3440 0%, #1A202C 100%);
}
p {
padding-left: 10px
}
.container {
width: 100%;
margin: auto;
background-color: rgb(255 255 255 / 1%);
padding: 20px 30px 40px; /* Add padding below the image only */
padding-right: 32px;
border-radius: 12px;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
border: 1px solid rgba(255, 255, 255, 0.05);
}
.header {
display: flex;
align-items: center;
justify-content: space-between;
gap: 20px;
}
img {
border-radius: 10px 10px 0 0!important;
padding-left: 0px !important;
}
.header h1 {
font-size: 28px;
color: #ECEFF4;
margin: 0;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
.info {
background-color: rgba(255, 255, 255, 0.05);
color: #AEBAC7;
border-radius: 12px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.2);
font-size: 14px;
line-height: 1.6;
margin-left: 5px;
overflow-x: auto;
margin-top: 20px; /* Adjusted margin */
border: 1px solid rgba(255, 255, 255, 0.05);
transition: background-color 0.6s ease; /* Smooth transition over 0.5 seconds */
}
.info:hover {
}
.info img {
width: 100%;
border-radius: 10px 10px 0 0;
margin-top: -20px; /* Negative margin to overlap container margin */
}
a {
color: #88C0D0;
text-decoration: none;
transition: color 0.3s ease;
position: relative;
}
a:hover {
color: #A3BE8C;
text-decoration: none;
}
a::before {
content: '';
position: absolute;
width: 100%;
height: 2px;
bottom: 0;
left: 0;
background-color: #A3BE8C;
visibility: hidden;
transform: scaleX(0);
transition: all 0.3s ease-in-out;
}
a:hover::before {
visibility: visible;
transform: scaleX(1);
}
.button {
display: inline-block;
background-color: #5E81AC;
color: #E5E9F0;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
text-decoration: none;
transition: background-color 0.3s ease;
}
.button:hover {
background-color: #81A1C1;
}
</style>
</head>
<body>
<div class="container">
<div class="header">
<h1>Aether Dataset</h1>
</div>
<div class="info">
<img src="URL style="border-radius: 10px;">
<p><strong>Creator:</strong> <a href="URL target="_blank">SteelSkull</a></p>
<p><strong>Community Organization:</strong> <a href="URL target="_blank">ConvexAI</a></p>
<p><strong>Discord:</strong> <a href="URL target="_blank">Join us on Discord</a></p>
<div>
<div>
            <p><strong>About Aether:</strong> The Aether dataset is a merged, filtered, and deduplicated collection of conversational datasets.</p>
            <p>Rebuilt script; new dataset.</p>
            <p>From v1.2.2 to v1.5: the source datasets changed, and two were added.</p>
            <p>Version v1.5 reworks the human -> gpt conversations and adds system and tool columns.</p>
<p><strong>Source Datasets:</strong></p>
<ul>
<li>grimulkan/bluemoon_Karen_cleaned</li>
<li>Doctor-Shotgun/no-robots-sharegpt</li>
<li>Locutusque/hercules-v2.5</li>
<li>jondurbin/airoboros-3.2</li>
<li>openerotica/freedom-rp</li>
<li>teknium/OpenHermes-2.5</li>
<li>Doctor-Shotgun/capybara-sharegpt</li>
<li>KaraKaraWitch/PIPPA-ShareGPT-formatted</li>
<li>Locutusque/bagel-clean-v0.3-shuffled</li>
</ul>
<p><strong>Phrases and Data Removed:</strong></p>
            <p>To enhance the dataset's coherence and relevance across varied contexts, certain phrases have been selectively omitted. Each dataset is run against a "keyed" list of phrases.</p>
            <p><strong>Filtering Stats:</strong></p>
            <p>Total objects removed: 72114</p>
            <p><strong>Deduplication:</strong></p>
            <p>Initial row count: 3296307</p>
            <p>Final row count: 2728791</p>
            <p>Rows removed: 567516</p>
            <p><strong>Filter:</strong></p>
<ul>
<li>Couldn't help but</li>
<li>Can't resist</li>
<li>I'm sorry, but</li>
<li>As an AI</li>
<li>However, it is important to</li>
<li>Cannot provide</li>
<li>And others</li>
</ul>
</div>
</div>
</div>
</div>
</body>
</html> | [
"TAGS\n#size_categories-1M<n<10M #language-English #license-apache-2.0 #not-for-all-audiences #region-us \n"
] | [
39
] | [
"passage: TAGS\n#size_categories-1M<n<10M #language-English #license-apache-2.0 #not-for-all-audiences #region-us \n"
] |
c991795be509290ddc42dcdb71a59d5f09dd9d4d |
# Dataset Card for Evaluation run of Kquant03/Samlagast-7B-laser-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Samlagast-7B-laser-bf16](https://huggingface.co/Kquant03/Samlagast-7B-laser-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Samlagast-7B-laser-bf16",
"harness_winogrande_5",
split="train")
```
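Each evaluated task is stored as its own configuration (plus the aggregated "results" configuration); to enumerate everything available in the repo, the standard `get_dataset_config_names` helper from the `datasets` library can be used:

```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_Kquant03__Samlagast-7B-laser-bf16")
print(len(configs), configs[:3])
```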
## Latest results
These are the [latest results from run 2024-02-16T21:27:55.439399](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Samlagast-7B-laser-bf16/blob/main/results_2024-02-16T21-27-55.439399.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.649387957418536,
"acc_stderr": 0.03220384800343332,
"acc_norm": 0.6490741317673534,
"acc_norm_stderr": 0.03287747775346528,
"mc1": 0.594859241126071,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.7316299369099223,
"mc2_stderr": 0.01462879068661053
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.01328452529240351,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7167894841665007,
"acc_stderr": 0.004496369742132102,
"acc_norm": 0.8895638319059949,
"acc_norm_stderr": 0.003127920738394109
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.040723148118768364,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.040723148118768364
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066306,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.01661139368726858,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.01661139368726858
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.594859241126071,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.7316299369099223,
"mc2_stderr": 0.01462879068661053
},
"harness|winogrande|5": {
"acc": 0.8539857932123125,
"acc_stderr": 0.009924440374585244
},
"harness|gsm8k|5": {
"acc": 0.6565579984836998,
"acc_stderr": 0.013079933811800308
}
}
```
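To skim a results blob like the one above, a small script can print one score per task (a sketch: it assumes the JSON object above has been saved locally, and it prefers `acc_norm`, falling back to `acc` or `mc2` where a task reports only those):

```python
import json

# Assumes the JSON object above was saved to results.json.
with open("results.json") as f:
    results = json.load(f)

for task, metrics in results.items():
    if task == "all":
        continue  # skip the aggregate entry
    score = metrics.get("acc_norm", metrics.get("acc", metrics.get("mc2")))
    print(f"{task:55s} {score:.4f}")
```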
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Samlagast-7B-laser-bf16 | [
"region:us"
] | 2024-02-16T21:30:17+00:00 | {"pretty_name": "Evaluation run of Kquant03/Samlagast-7B-laser-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Samlagast-7B-laser-bf16](https://huggingface.co/Kquant03/Samlagast-7B-laser-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Samlagast-7B-laser-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T21:27:55.439399](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Samlagast-7B-laser-bf16/blob/main/results_2024-02-16T21-27-55.439399.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.649387957418536,\n \"acc_stderr\": 0.03220384800343332,\n \"acc_norm\": 0.6490741317673534,\n \"acc_norm_stderr\": 0.03287747775346528,\n \"mc1\": 0.594859241126071,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.7316299369099223,\n \"mc2_stderr\": 0.01462879068661053\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.01328452529240351,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7167894841665007,\n \"acc_stderr\": 0.004496369742132102,\n \"acc_norm\": 0.8895638319059949,\n \"acc_norm_stderr\": 0.003127920738394109\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.040723148118768364,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.040723148118768364\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 
0.013468201614066306,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066306\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n \"acc_stderr\": 0.01661139368726858,\n \"acc_norm\": 0.4424581005586592,\n \"acc_norm_stderr\": 0.01661139368726858\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.594859241126071,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.7316299369099223,\n \"mc2_stderr\": 0.01462879068661053\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8539857932123125,\n \"acc_stderr\": 0.009924440374585244\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6565579984836998,\n \"acc_stderr\": 0.013079933811800308\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Samlagast-7B-laser-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|arc:challenge|25_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|gsm8k|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hellaswag|10_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-27-55.439399.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-27-55.439399.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-27-55.439399.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T21-27-55.439399.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-27-55.439399.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T21-27-55.439399.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["**/details_harness|winogrande|5_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T21-27-55.439399.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T21_27_55.439399", "path": ["results_2024-02-16T21-27-55.439399.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T21-27-55.439399.parquet"]}]}]} | 2024-02-16T21:30:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Samlagast-7B-laser-bf16
Dataset automatically created during the evaluation run of model Kquant03/Samlagast-7B-laser-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
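For instance, with the `datasets` library (a minimal sketch; the repository id below is assumed from the leaderboard's usual `details_<org>__<model>` naming pattern, while `harness_winogrande_5` is one of the configurations declared in this card's metadata):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's details naming convention.
data = load_dataset("open-llm-leaderboard/details_Kquant03__Samlagast-7B-laser-bf16",
	"harness_winogrande_5",
	split="train")
```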
## Latest results
These are the latest results from run 2024-02-16T21:27:55.439399 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Samlagast-7B-laser-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Samlagast-7B-laser-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T21:27:55.439399(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Samlagast-7B-laser-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Samlagast-7B-laser-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T21:27:55.439399(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Kquant03/Samlagast-7B-laser-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Samlagast-7B-laser-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T21:27:55.439399(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
02074b8c1fb607db460c6a7150967d6172d073f2 | # Dataset Card for "el_talar_febrero24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ganader-ai-developers/el_talar_febrero24 | [
"region:us"
] | 2024-02-16T21:30:31+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "cow_id", "dtype": "int64"}, {"name": "weight", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "breed", "dtype": "string"}, {"name": "sex", "dtype": "string"}, {"name": "orientation", "dtype": "string"}, {"name": "internal_cow_id", "dtype": "string"}, {"name": "vertical_distance_meters", "dtype": "float64"}, {"name": "horizontal_distance_meters", "dtype": "float64"}, {"name": "picture_quality", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12063030422.09, "num_examples": 2999}], "download_size": 11177066135, "dataset_size": 12063030422.09}} | 2024-02-16T22:10:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "el_talar_febrero24"
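The card body is otherwise a stub, but the dataset_info in this card's metadata declares the available columns (image, cow_id, weight, breed, sex, distances, etc.); a minimal loading sketch based on those declared features (the access pattern itself is standard `datasets` usage):

```python
from datasets import load_dataset

# Single "train" split, per the dataset_info in the card metadata.
ds = load_dataset("ganader-ai-developers/el_talar_febrero24", split="train")

# Feature names below are taken from the declared dataset_info.
example = ds[0]
print(example["cow_id"], example["weight"], example["breed"], example["sex"])
example["image"]  # decoded as a PIL image, per the "image" feature dtype
```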
More Information needed | [
"# Dataset Card for \"el_talar_febrero24\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"el_talar_febrero24\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"el_talar_febrero24\"\n\nMore Information needed"
] |
84bd2706f53f4480bdebf851cfe654fe010370b5 | # Dataset Card for "legal_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | worldboss/legal_chat | [
"region:us"
] | 2024-02-16T21:31:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7228218, "num_examples": 4394}], "download_size": 2605909, "dataset_size": 7228218}} | 2024-02-16T21:31:16+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "legal_chat"
More Information needed | [
"# Dataset Card for \"legal_chat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"legal_chat\"\n\nMore Information needed"
] | [
6,
13
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"legal_chat\"\n\nMore Information needed"
] |
4bdd96f433a22309d59d0dab4fa2ac813cc8919d | # Dataset Card for "nia_faq_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | worldboss/nia_faq_chat | [
"region:us"
] | 2024-02-16T21:32:41+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36318, "num_examples": 66}], "download_size": 20689, "dataset_size": 36318}} | 2024-02-16T21:32:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "nia_faq_chat"
More Information needed | [
"# Dataset Card for \"nia_faq_chat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"nia_faq_chat\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"nia_faq_chat\"\n\nMore Information needed"
] |
cce54d5379de2928bea79f54eeb2a682fa9f3006 | # Dataset Card for "qa_nia_faq_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | worldboss/qa_nia_faq_chat | [
"region:us"
] | 2024-02-16T21:33:45+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36450, "num_examples": 66}], "download_size": 20652, "dataset_size": 36450}} | 2024-02-16T21:49:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "qa_nia_faq_chat"
More Information needed | [
"# Dataset Card for \"qa_nia_faq_chat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"qa_nia_faq_chat\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"qa_nia_faq_chat\"\n\nMore Information needed"
] |
ff1f2be954d5968402ef7ed6929c71f073a7e58a |
# Dataset Card for Evaluation run of eren23/OGNO-7b-dpo-truthful
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/OGNO-7b-dpo-truthful](https://huggingface.co/eren23/OGNO-7b-dpo-truthful) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
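The aggregated metrics can be loaded directly from that configuration (a sketch; the `results` name comes from the paragraph above, and `latest` is the split name these details repositories conventionally use for the newest results file):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; "latest" is assumed to
# point at the newest results file, following the convention of details repos.
results = load_dataset("open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful",
	"results",
	split="latest")
```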
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful",
"harness_winogrande_5",
split="train")
```
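
Since the per-example schema of these detail files can vary between harness versions, it can help to inspect the columns before aggregating anything (a minimal sketch):

```python
# Inspect the loaded split: column names and one example row.
print(data.column_names)
print(data[0])
```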
## Latest results
These are the [latest results from run 2024-02-16T22:05:00.872209](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful/blob/main/results_2024-02-16T22-05-00.872209.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652348491009575,
"acc_stderr": 0.03197996047742033,
"acc_norm": 0.6516420684516023,
"acc_norm_stderr": 0.03264932973384798,
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7660822976380632,
"mc2_stderr": 0.013995111777693896
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7150965943039235,
"acc_stderr": 0.004504459553909766,
"acc_norm": 0.890161322445728,
"acc_norm_stderr": 0.0031204952388275576
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7660822976380632,
"mc2_stderr": 0.013995111777693896
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272955
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.012740305717376268
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful | [
"region:us"
] | 2024-02-16T22:07:21+00:00 | {"pretty_name": "Evaluation run of eren23/OGNO-7b-dpo-truthful", "dataset_summary": "Dataset automatically created during the evaluation run of model [eren23/OGNO-7b-dpo-truthful](https://huggingface.co/eren23/OGNO-7b-dpo-truthful) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T22:05:00.872209](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful/blob/main/results_2024-02-16T22-05-00.872209.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652348491009575,\n \"acc_stderr\": 0.03197996047742033,\n \"acc_norm\": 0.6516420684516023,\n \"acc_norm_stderr\": 0.03264932973384798,\n \"mc1\": 0.6242350061199511,\n \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7660822976380632,\n \"mc2_stderr\": 0.013995111777693896\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7150965943039235,\n \"acc_stderr\": 0.004504459553909766,\n \"acc_norm\": 0.890161322445728,\n \"acc_norm_stderr\": 0.0031204952388275576\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7660822976380632,\n \"mc2_stderr\": 0.013995111777693896\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272955\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \"acc_stderr\": 
0.012740305717376268\n }\n}\n```", "repo_url": "https://huggingface.co/eren23/OGNO-7b-dpo-truthful", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|arc:challenge|25_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|gsm8k|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hellaswag|10_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T22_05_00.872209", "path": ["**/details_harness|winogrande|5_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T22-05-00.872209.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T22_05_00.872209", "path": ["results_2024-02-16T22-05-00.872209.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T22-05-00.872209.parquet"]}]}]} | 2024-02-16T22:07:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of eren23/OGNO-7b-dpo-truthful
Dataset automatically created during the evaluation run of model eren23/OGNO-7b-dpo-truthful on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
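```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful",
	"harness_winogrande_5",
	split="train")
```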
## Latest results
These are the latest results from run 2024-02-16T22:05:00.872209 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
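The aggregate scores for this run, excerpted from the full results JSON (the per-task numbers are available in the repository), were:

```python
{
    "all": {
        "acc": 0.652348491009575,
        "acc_stderr": 0.03197996047742033,
        "acc_norm": 0.6516420684516023,
        "acc_norm_stderr": 0.03264932973384798,
        "mc1": 0.6242350061199511,
        "mc1_stderr": 0.01695458406021429,
        "mc2": 0.7660822976380632,
        "mc2_stderr": 0.013995111777693896
    }
}
```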
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of eren23/OGNO-7b-dpo-truthful\n\n\n\nDataset automatically created during the evaluation run of model eren23/OGNO-7b-dpo-truthful on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T22:05:00.872209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of eren23/OGNO-7b-dpo-truthful\n\n\n\nDataset automatically created during the evaluation run of model eren23/OGNO-7b-dpo-truthful on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T22:05:00.872209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of eren23/OGNO-7b-dpo-truthful\n\n\n\nDataset automatically created during the evaluation run of model eren23/OGNO-7b-dpo-truthful on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T22:05:00.872209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
a6e61cdc7f550396aea15c2c8e429a00757c048f | # Dataset Card for "java_renaming_patch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | thiomajid/java_renaming_patch | [
"region:us"
] | 2024-02-16T22:29:03+00:00 | {"dataset_info": {"features": [{"name": "commit_sha", "dtype": "string"}, {"name": "new_methods", "list": [{"name": "arguments", "sequence": "string"}, {"name": "filename", "dtype": "string"}, {"name": "implementation", "dtype": "string"}, {"name": "signature", "dtype": "string"}]}, {"name": "old_methods", "list": [{"name": "arguments", "sequence": "string"}, {"name": "filename", "dtype": "string"}, {"name": "implementation", "dtype": "string"}, {"name": "signature", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 794271, "num_examples": 74}], "download_size": 271079, "dataset_size": 794271}} | 2024-02-16T23:33:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "java_renaming_patch"
More Information needed | [
"# Dataset Card for \"java_renaming_patch\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"java_renaming_patch\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"java_renaming_patch\"\n\nMore Information needed"
] |
8adc28cb398bc930e769f73f10c718af81671547 |
# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged_s
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/mistralv1_gsm8k_merged_s](https://huggingface.co/fzzhang/mistralv1_gsm8k_merged_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s",
"harness_winogrande_5",
split="train")
```
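Each per-task configuration also exposes a "latest" split alongside the timestamped one, so a single eval can be loaded directly. A minimal sketch, assuming the `harness_gsm8k_5` config exists for this run (the config name follows the leaderboard's usual naming pattern):

```python
from datasets import load_dataset

# Load only the GSM8K details at their most recent evaluation timestamp
gsm8k_details = load_dataset("open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s",
	"harness_gsm8k_5",
	split="latest")
```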
## Latest results
These are the [latest results from run 2024-02-16T22:54:23.549032](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s/blob/main/results_2024-02-16T22-54-23.549032.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6168865618276048,
"acc_stderr": 0.03272942181052946,
"acc_norm": 0.620454228546075,
"acc_norm_stderr": 0.03338469630654732,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006526,
"mc2": 0.42426145726798214,
"mc2_stderr": 0.014425554324623623
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403084,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6467835092611034,
"acc_stderr": 0.004769924131304649,
"acc_norm": 0.8394742083250348,
"acc_norm_stderr": 0.0036634275361781595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315524,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315524
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296413,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296413
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.01657899743549671,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.01657899743549671
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.01266556813545533,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.01266556813545533
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.02971932942241748,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.02971932942241748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006526,
"mc2": 0.42426145726798214,
"mc2_stderr": 0.014425554324623623
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205193
},
"harness|gsm8k|5": {
"acc": 0.47687642153146326,
"acc_stderr": 0.013757748544245317
}
}
```
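For a quick programmatic check, the aggregated numbers above can be read back from the "results" configuration of this dataset. The snippet below is a minimal sketch; the exact field names inside the loaded rows should be verified against the stored parquet:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics for the latest run
```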
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s | [
"region:us"
] | 2024-02-16T22:56:43+00:00 | {"pretty_name": "Evaluation run of fzzhang/mistralv1_gsm8k_merged_s", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/mistralv1_gsm8k_merged_s](https://huggingface.co/fzzhang/mistralv1_gsm8k_merged_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T22:54:23.549032](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s/blob/main/results_2024-02-16T22-54-23.549032.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6168865618276048,\n \"acc_stderr\": 0.03272942181052946,\n \"acc_norm\": 0.620454228546075,\n \"acc_norm_stderr\": 0.03338469630654732,\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006526,\n \"mc2\": 0.42426145726798214,\n \"mc2_stderr\": 0.014425554324623623\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403084,\n \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6467835092611034,\n \"acc_stderr\": 0.004769924131304649,\n \"acc_norm\": 0.8394742083250348,\n \"acc_norm_stderr\": 0.0036634275361781595\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315524,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315524\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n 
\"acc_stderr\": 0.014283378044296413,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.014283378044296413\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n \"acc_stderr\": 0.01657899743549671,\n \"acc_norm\": 0.4346368715083799,\n \"acc_norm_stderr\": 0.01657899743549671\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n \"acc_stderr\": 0.01266556813545533,\n \"acc_norm\": 0.4361147327249022,\n \"acc_norm_stderr\": 0.01266556813545533\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241748,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241748\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006526,\n \"mc2\": 0.42426145726798214,\n \"mc2_stderr\": 0.014425554324623623\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205193\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47687642153146326,\n \"acc_stderr\": 0.013757748544245317\n }\n}\n```", "repo_url": 
"https://huggingface.co/fzzhang/mistralv1_gsm8k_merged_s", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|arc:challenge|25_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|gsm8k|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hellaswag|10_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-54-23.549032.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-54-23.549032.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-54-23.549032.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T22-54-23.549032.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-54-23.549032.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-54-23.549032.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["**/details_harness|winogrande|5_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T22-54-23.549032.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T22_54_23.549032", "path": ["results_2024-02-16T22-54-23.549032.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T22-54-23.549032.parquet"]}]}]} | 2024-02-16T22:57:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged_s
Dataset automatically created during the evaluation run of model fzzhang/mistralv1_gsm8k_merged_s on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
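For instance (a minimal sketch; the repository name assumes the leaderboard's usual `details_<org>__<model>` naming):

```python
from datasets import load_dataset

# Repo name assumed from the leaderboard's "details_<org>__<model>" convention.
data = load_dataset("open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s",
	"harness_winogrande_5",  # one of the 63 per-task configurations
	split="train")           # "train" points to the latest results
```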
## Latest results
These are the latest results from run 2024-02-16T22:54:23.549032 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/mistralv1_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T22:54:23.549032(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/mistralv1_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T22:54:23.549032(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/mistralv1_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T22:54:23.549032(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
63859932e52e82d63cc35d96fa0a39dde229c714 |
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s",
"harness_winogrande_5",
split="train")
```
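
To look only at the aggregated scores, the "results" configuration mentioned above can be loaded the same way; a sketch, assuming the "latest" split defined in this dataset's config metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; per the config metadata, the
# "latest" split always tracks the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated accuracies and stderrs
```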
## Latest results
These are the [latest results from run 2024-02-16T23:09:18.709191](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s/blob/main/results_2024-02-16T23-09-18.709191.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6322971394538486,
"acc_stderr": 0.03234149565305396,
"acc_norm": 0.63180337036129,
"acc_norm_stderr": 0.03300828288156676,
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6329203738044532,
"mc2_stderr": 0.01541374646266871
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620444,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537302
},
"harness|hellaswag|10": {
"acc": 0.6754630551682932,
"acc_stderr": 0.004672447046820005,
"acc_norm": 0.8568014339772954,
"acc_norm_stderr": 0.003495593662520757
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640763,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990925,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066302,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.016165847583563295,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.016165847583563295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379774,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379774
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018515,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018515
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6329203738044532,
"mc2_stderr": 0.01541374646266871
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597207
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
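
If you only need this snapshot rather than the per-sample details, the raw JSON above can also be fetched directly from the repository (a minimal sketch using `huggingface_hub`; the filename comes from the link at the top of this section):

```python
import json

from huggingface_hub import hf_hub_download

# Filename taken verbatim from the "latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s",
    filename="results_2024-02-16T23-09-18.709191.json",
    repo_type="dataset",
)
with open(path) as f:
    snapshot = json.load(f)

print(snapshot["all"]["acc"])  # 0.6322971394538486 in the dump above
```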
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s | [
"region:us"
] | 2024-02-16T23:11:41+00:00 | {"pretty_name": "Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T23:09:18.709191](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s/blob/main/results_2024-02-16T23-09-18.709191.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6322971394538486,\n \"acc_stderr\": 0.03234149565305396,\n \"acc_norm\": 0.63180337036129,\n \"acc_norm_stderr\": 0.03300828288156676,\n \"mc1\": 0.4675642594859241,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6329203738044532,\n \"mc2_stderr\": 0.01541374646266871\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620444,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537302\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6754630551682932,\n \"acc_stderr\": 0.004672447046820005,\n \"acc_norm\": 0.8568014339772954,\n \"acc_norm_stderr\": 0.003495593662520757\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544074,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544074\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 
0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 
0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640763,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640763\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066302,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n \"acc_stderr\": 0.016165847583563295,\n \"acc_norm\": 0.37206703910614525,\n \"acc_norm_stderr\": 0.016165847583563295\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379774,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379774\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4675642594859241,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6329203738044532,\n \"mc2_stderr\": 0.01541374646266871\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6982562547384382,\n \"acc_stderr\": 0.012643544762873358\n }\n}\n```", "repo_url": "https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|arc:challenge|25_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|gsm8k|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hellaswag|10_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T23_09_18.709191", "path": ["**/details_harness|winogrande|5_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T23-09-18.709191.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T23_09_18.709191", "path": ["results_2024-02-16T23-09-18.709191.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T23-09-18.709191.parquet"]}]}]} | 2024-02-16T23:12:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s
Dataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
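A minimal reconstruction of that call, assuming the repo follows the leaderboard's usual `details_<org>__<model>` naming pattern and using one of the configurations listed for this repo:
```python
from datasets import load_dataset

# Any configuration name listed for this repo can be requested here.
data = load_dataset("open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s",
	"harness_winogrande_5",
	split="train")
```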
## Latest results
These are the latest results from run 2024-02-16T23:09:18.709191 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T23:09:18.709191(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T23:09:18.709191(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
215,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-16T23:09:18.709191(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:"
] |
e30a1c8e035abdbb5dd766350bba13bfbb692bf6 |
# Test
This dataset is a test. | dwancin/test | [
"license:mit",
"region:us"
] | 2024-02-16T23:34:14+00:00 | {"license": "mit"} | 2024-02-16T23:36:02+00:00 | [] | [] | TAGS
#license-mit #region-us
|
# Test
This dataset is a test. | [
"# Test\n\nThis dataset is a test."
] | [
"TAGS\n#license-mit #region-us \n",
"# Test\n\nThis dataset is a test."
] | [
11,
9
] | [
"passage: TAGS\n#license-mit #region-us \n# Test\n\nThis dataset is a test."
] |
af457ab1d2ecfe1affba047e985a28896c6f6d0c |
# ms_marco_japanese
- Japanese translation data of [ms_marco](https://huggingface.co/datasets/ms_marco).
- Translation was done with [google/madlad400-3b-mt](https://huggingface.co/google/madlad400-3b-mt).
- The data is stored in the same structure as the ms_marco dataset published on HuggingFace.
- The translation quality is not particularly high; some records contain traditional Chinese characters and similar artifacts. [mMARCO](https://github.com/unicamp-dl/mMARCO), a multilingual ms_marco dataset translated with the Google Translate API, is of higher quality, so we recommend comparing this dataset against other translated datasets before relying on it.
- The wellFormedAnswers column has not been translated.
- Translation took about 8 days on an RTX 3090 for the roughly 10 million target sentences, using [santhosh/madlad400-3b-ct2](https://huggingface.co/santhosh/madlad400-3b-ct2) for speed; a sketch of such a setup is shown below.
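As an illustration, the sketch below shows one plausible way to run madlad400 through CTranslate2; the model directory, device, and helper function are assumptions for illustration, not the actual script used to build this dataset.
```
import ctranslate2
import transformers

# CTranslate2 conversion of madlad400 (the local directory path is an assumption).
translator = ctranslate2.Translator("madlad400-3b-ct2", device="cuda")
# Tokenizer of the original model; madlad400 is T5-based.
tokenizer = transformers.AutoTokenizer.from_pretrained("google/madlad400-3b-mt")

def to_japanese(text: str) -> str:
    # madlad400 selects the target language via a "<2xx>" prefix token.
    source = tokenizer.convert_ids_to_tokens(tokenizer.encode(f"<2ja> {text}"))
    result = translator.translate_batch([source])
    target_ids = tokenizer.convert_tokens_to_ids(result[0].hypotheses[0])
    return tokenizer.decode(target_ids, skip_special_tokens=True)

print(to_japanese("What was the immediate impact of the success of the Manhattan Project?"))
```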
## Usage
```
from datasets import load_dataset
train_ds = load_dataset("hotchpotch/ms_marco_japanese", "v2.1-madlad400-3b", split="train")
validation_ds = load_dataset("hotchpotch/ms_marco_japanese", "v2.1-madlad400-3b", split="validation")
test_ds = load_dataset("hotchpotch/ms_marco_japanese", "v2.1-madlad400-3b", split="test")
```
```
print(train_ds[0])
{'answers': ['マンハッタン計画の成功が直接的にもたらした影響は、原子力研究者や技術員達による素晴しい業績を覆い隠す唯一な雲であった。その成果と真実であるもの:何十万という無辜なる命々があきれていたことだろうか?'], 'passages': {'is_selected': [1, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'passage_text': ['科学者の間でコミュニケーションが行われることは、マンハッタン計画を成功させるために重要であった。原子力研究家や技術員たちによって達成された素晴らしい業績には雲だけがあふれているものだろうか?その実際的な意味と言えば何十万という無辜なる人々へ生命も犠牲になっていることですね!', 'マンハッタン計画とその原子爆弾は第二次世界大戦の終結に寄与し、平和的な目標をもって核エネルギーが利用されたことで歴史や科学界には影響力があった。', 'マンハッタン計画は原子爆弾の製造が可能かどうかなんて見るために始められた。このプロジェクトを成功させれば、世界には永遠な変化がありそこまで強力で人工的であることも知らしむことになっただろいますからね.', 'マンハッタン計画(Manhattan Project)は、第二次世界大戦中にアメリカ合衆国で行われた原子爆弾開発プロジェクトの名称。特別には1942年から翌日までレスリー・R. グローブズ将軍が指揮する米陸军工兵隊によって実施されたものをいうことが多かったのである 。', 'また、各巻のバージョンと補完的なウェブサイトもある。最初に作られたのは『マンハッタン計画: インタラクティヴ・ヒストリー』であり([http://www.cfo-doe/me70_history)歴史遺産資源局および国家核安全保障庁によるものだったが現在では全て廃止されています(https//en](http://www.cfo-doe/me70_history)%E6%AD%B4%E5%8F%B2%E9%81%BA%E7%94%A3%E8%B3%87%E6%BA%90%E5%B1%80%E3%81%8A%E3%82%88%E3%81%B3%E5%9B%BD%E5%AE%B6%E6%A0%B8%E5%AE%89%E5%85%A8%E4%BF%9D%E9%9A%9C%E5%BA%81%E3%81%AB%E3%82%88%E3%82%8B%E3%82%82%E3%81%AE%E3%81%A0%E3%81%A3%E3%81%9F%E3%81%8C%E7%8F%BE%E5%9C%A8%E3%81%A7%E3%81%AF%E5%85%A8%E3%81%A6%E5%BB%83%E6%AD%A2%E3%81%95%E3%82%8C%E3%81%A6%E3%81%84%E3%81%BE%E3%81%99(https//en))', '原子爆弾は、1945年7月にニューメキシコ州の砂漠で初めて実験的な核兵器として使用された。その後も多くが開発され続けたものだったのである(マンハッタン計画)。', 'また、原爆や第二次世界大戦の終結に関する非常によく豊富な文献を置き換える試みもない。本コレクションはマンハッタン計画について起源と発展が記録されることには努めていませんのである 。', 'マンハッタン計画(Manhattan Project)は、第二次世界大戦中に最初の核兵器を生産した研究開発事業である。イギリスとカナダによる支援下アメリカ合衆国が主導していたものだった 。1942年から1946年代までこのプロジェクトには米陸軍工廠少将レスリー・グローブス (Leslie Groves) (英語版 )(en:Lesley G.Grove, US Army Corp of Engineer), ロサンゼル斯原子力実験場所長ロバート·オペンハーマーらも参加しており,その間爆弾設計者として活躍していることでも知られていたのであり ,また彼等自身について言及する必要性があると考えている人物であることなどよりこれ以上詳細な情報ではないかという意見がありました', '1942年6月、アメリカ陸軍工兵隊はマンハッタン計画を開始した。原子爆弾の秘密名称であるが.', 'マンハッタン計画のB炉がハンフォードに建設される理由は、北アメリカ沿岸から太平洋へ流れ込む最大級河川であるコロンビア湖と近いことだった。'], 'url': ['[http://www.pitt.edu/~sdb14/atombomb.html](http://www.pitt.edu/~sdb14/atombomb.html)', '[http://www.osti.gov/accomplishments/manhattan_story.html](http://www.osti.gov/accomplishments/manhattan_story.html)', '[http://www.123helpme.com/impact-of-the-manhattan-project-preview.asp?id=177337](http://www.123helpme.com/impact-of-the-manhattan-project-preview.asp?id=177337)', '[http://www.answers.com/Q/How_did_the_Manhattan_Project_impact_on_society](http://www.answers.com/Q/How_did_the_Manhattan_Project_impact_on_society)', '[https://www.osti.gov/manhattan-project-history/publications/Manhattan_Project_2010.pdf](https://www.osti.gov/manhattan-project-history/publications/Manhattan_Project_2010.pdf)', '[http://www.ushistory.org/us/51f.asp](http://www.ushistory.org/us/51f.asp)', '[http://nsarchive.gwu.edu/NSAEBB/NSAEBB162](http://nsarchive.gwu.edu/NSAEBB/NSAEBB162)', '[https://en.wikipedia.org/wiki/Manhattan_Project](https://en.wikipedia.org/wiki/Manhattan_Project)', '[https://quizlet.com/41456230/a-bomb-flash-cards/](https://quizlet.com/41456230/a-bomb-flash-cards/)', '[https://www.atomicheritage.org/history/environmental-consequences](https://www.atomicheritage.org/history/environmental-consequences)']}, 'query': '(マンハッタン計画の成功が直接的にもたらした影響は何でしょうか。', 'query_id': 1185869, 'query_type': 'DESCRIPTION', 'wellFormedAnswers': []}
```
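Each record stores its passages as parallel lists, as in the printed example above; a minimal sketch for recovering the answer-bearing passage (field names taken from that output):
```
example = train_ds[0]
# is_selected flags which passage_text entries support the answer.
selected = [
    text
    for flag, text in zip(example["passages"]["is_selected"],
                          example["passages"]["passage_text"])
    if flag == 1
]
print(example["query"])
print(selected[0])
```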
## License
- Same as ms_marco. | hotchpotch/ms_marco_japanese | [
"license:other",
"region:us"
] | 2024-02-16T23:37:22+00:00 | {"license": "other", "license_name": "same-ms-marco", "license_link": "https://huggingface.co/datasets/ms_marco", "dataset_info": {"config_name": "v2.1-madlad400-3b", "features": [{"name": "answers", "sequence": "string"}, {"name": "passages", "sequence": [{"name": "is_selected", "dtype": "int32"}, {"name": "passage_text", "dtype": "string"}, {"name": "url", "dtype": "string"}]}, {"name": "query", "dtype": "string"}, {"name": "query_id", "dtype": "int32"}, {"name": "query_type", "dtype": "string"}, {"name": "wellFormedAnswers", "sequence": "string"}], "splits": [{"name": "validation", "num_bytes": 440690468, "num_examples": 101093}, {"name": "train", "num_bytes": 3590508080, "num_examples": 808731}, {"name": "test", "num_bytes": 430765349, "num_examples": 101092}], "download_size": 2491144245, "dataset_size": 4461963897}, "configs": [{"config_name": "v2.1-madlad400-3b", "data_files": [{"split": "validation", "path": "v2.1-madlad400-3b/validation-*"}, {"split": "train", "path": "v2.1-madlad400-3b/train-*"}, {"split": "test", "path": "v2.1-madlad400-3b/test-*"}]}]} | 2024-02-17T01:00:50+00:00 | [] | [] | TAGS
#license-other #region-us
|
# ms_marco_japanese
- Japanese translation data of ms_marco.
- Translation was done with google/madlad400-3b-mt.
- The data is stored in the same structure as the ms_marco dataset published on HuggingFace.
- The translation quality is not particularly high; some records contain traditional Chinese characters and similar artifacts. mMARCO, a multilingual ms_marco dataset translated with the Google Translate API, is of higher quality, so we recommend comparing this dataset against other translated datasets before relying on it.
- The wellFormedAnswers column has not been translated.
- Translation took about 8 days on an RTX 3090 for the roughly 10 million target sentences, using santhosh/madlad400-3b-ct2 for speed.
## Usage
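For example:
```
from datasets import load_dataset

train_ds = load_dataset("hotchpotch/ms_marco_japanese", "v2.1-madlad400-3b", split="train")
```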
## License
- Same as ms_marco. | [
"# ms_marco_japanese\n\n- ms_marco の日本語翻訳データです。\n- 翻訳には、google/madlad400-3b-mtを利用しています。\n- HuggingFace で公開されている、ms_marco と同等の構造で保存しています。\n- 翻訳品質はそれほど高くありません。繁体字などが含まれるデータもあります。Google Translate API を用いて翻訳された、マルチリンガルms_marcoデータセットである、mMARCOの方が品質が高いです。そのため、このデータセットを利用の際は、他の翻訳データセットとの比較をお勧めします。\n- wellFormedAnswers カラムは翻訳していません\n- 翻訳にかかった時間は、高速化のためsanthosh/madlad400-3b-ct2を利用し、対象のデータ約1000万文に対して RTX3090 で8日ほどでした。",
"## 利用方法",
"## ライセンス\n\n- ms_marco と同等とします。"
] | [
"TAGS\n#license-other #region-us \n",
"# ms_marco_japanese\n\n- ms_marco の日本語翻訳データです。\n- 翻訳には、google/madlad400-3b-mtを利用しています。\n- HuggingFace で公開されている、ms_marco と同等の構造で保存しています。\n- 翻訳品質はそれほど高くありません。繁体字などが含まれるデータもあります。Google Translate API を用いて翻訳された、マルチリンガルms_marcoデータセットである、mMARCOの方が品質が高いです。そのため、このデータセットを利用の際は、他の翻訳データセットとの比較をお勧めします。\n- wellFormedAnswers カラムは翻訳していません\n- 翻訳にかかった時間は、高速化のためsanthosh/madlad400-3b-ct2を利用し、対象のデータ約1000万文に対して RTX3090 で8日ほどでした。",
"## 利用方法",
"## ライセンス\n\n- ms_marco と同等とします。"
] | [
11,
193,
4,
17
] | [
"passage: TAGS\n#license-other #region-us \n# ms_marco_japanese\n\n- ms_marco の日本語翻訳データです。\n- 翻訳には、google/madlad400-3b-mtを利用しています。\n- HuggingFace で公開されている、ms_marco と同等の構造で保存しています。\n- 翻訳品質はそれほど高くありません。繁体字などが含まれるデータもあります。Google Translate API を用いて翻訳された、マルチリンガルms_marcoデータセットである、mMARCOの方が品質が高いです。そのため、このデータセットを利用の際は、他の翻訳データセットとの比較をお勧めします。\n- wellFormedAnswers カラムは翻訳していません\n- 翻訳にかかった時間は、高速化のためsanthosh/madlad400-3b-ct2を利用し、対象のデータ約1000万文に対して RTX3090 で8日ほどでした。## 利用方法## ライセンス\n\n- ms_marco と同等とします。"
] |
660ea158fa5d5507447ed695366b063f98dce6d3 |
# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B-Fixed
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-Ruby-7B-Fixed](https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed",
"harness_winogrande_5",
split="train")
```
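The aggregated metrics mentioned above can be loaded the same way from the "results" configuration (a sketch, assuming the "latest" split defined for this repo's configs):
```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed",
	"results",
	split="latest")
```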
## Latest results
These are the [latest results from run 2024-02-17T00:20:49.422540](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed/blob/main/results_2024-02-17T00-20-49.422540.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6347473013166217,
"acc_stderr": 0.032259927653459516,
"acc_norm": 0.6365178163764577,
"acc_norm_stderr": 0.03290211517224385,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5648879973341684,
"mc2_stderr": 0.01540236564556069
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6683927504481179,
"acc_stderr": 0.004698285350019216,
"acc_norm": 0.8522206731726748,
"acc_norm_stderr": 0.00354155826377909
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163227,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163227
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013003,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013003
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5648879973341684,
"mc2_stderr": 0.01540236564556069
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089693
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729818
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed | [
"region:us"
] | 2024-02-17T00:23:08+00:00 | {"pretty_name": "Evaluation run of Yuma42/KangalKhan-Ruby-7B-Fixed", "dataset_summary": "Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-Ruby-7B-Fixed](https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T00:20:49.422540](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed/blob/main/results_2024-02-17T00-20-49.422540.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6347473013166217,\n \"acc_stderr\": 0.032259927653459516,\n \"acc_norm\": 0.6365178163764577,\n \"acc_norm_stderr\": 0.03290211517224385,\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5648879973341684,\n \"mc2_stderr\": 0.01540236564556069\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6683927504481179,\n \"acc_stderr\": 0.004698285350019216,\n \"acc_norm\": 0.8522206731726748,\n \"acc_norm_stderr\": 0.00354155826377909\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163227,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163227\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n 
\"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013003,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013003\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5648879973341684,\n \"mc2_stderr\": 0.01540236564556069\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089693\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \"acc_stderr\": 0.013373971277729818\n }\n}\n```", "repo_url": 
"https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-20-49.422540.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-20-49.422540.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-20-49.422540.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-20-49.422540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-20-49.422540.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-20-49.422540.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["**/details_harness|winogrande|5_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T00-20-49.422540.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T00_20_49.422540", "path": ["results_2024-02-17T00-20-49.422540.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T00-20-49.422540.parquet"]}]}]} | 2024-02-17T00:23:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B-Fixed
Dataset automatically created during the evaluation run of model Yuma42/KangalKhan-Ruby-7B-Fixed on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
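A minimal sketch of that loading call is shown below; the repository id `open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed` is inferred from the naming convention these leaderboard detail datasets follow and should be verified on the Hub, while `harness_winogrande_5` is one of the configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Repository id inferred from the usual open-llm-leaderboard naming scheme
# for this model; "harness_winogrande_5" is one of the 63 task configurations.
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed",
	"harness_winogrande_5",
	split="train")
```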
## Latest results
These are the latest results from run 2024-02-17T00:20:49.422540 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B-Fixed\n\n\n\nDataset automatically created during the evaluation run of model Yuma42/KangalKhan-Ruby-7B-Fixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:20:49.422540(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B-Fixed\n\n\n\nDataset automatically created during the evaluation run of model Yuma42/KangalKhan-Ruby-7B-Fixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:20:49.422540(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B-Fixed\n\n\n\nDataset automatically created during the evaluation run of model Yuma42/KangalKhan-Ruby-7B-Fixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T00:20:49.422540(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
5101f04e961b12bf8db5111585f147a795f19916 |
# Dataset Card for Evaluation run of jeiku/Cookie_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeiku/Cookie_7B](https://huggingface.co/jeiku/Cookie_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeiku__Cookie_7B",
"harness_winogrande_5",
split="train")
```
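If you are not sure which configuration name to pass, you can enumerate them first. This is a small sketch using the standard `datasets` helper rather than anything specific to this card:

```python
from datasets import get_dataset_config_names

# List the 63 per-task configurations (plus "results") available in this repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_jeiku__Cookie_7B")
print(configs)
```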
## Latest results
These are the [latest results from run 2024-02-17T00:21:45.959538](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__Cookie_7B/blob/main/results_2024-02-17T00-21-45.959538.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6489384774347392,
"acc_stderr": 0.032131421920616465,
"acc_norm": 0.6498781521461111,
"acc_norm_stderr": 0.032783132740631354,
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6687534212220169,
"mc2_stderr": 0.015263939252034519
},
"harness|arc:challenge|25": {
"acc": 0.6791808873720137,
"acc_stderr": 0.01364094309194653,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.7105158334993029,
"acc_stderr": 0.004525960965551707,
"acc_norm": 0.8757219677355108,
"acc_norm_stderr": 0.0032922425436373417
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887027,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887027
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46368715083798884,
"acc_stderr": 0.016678341894533166,
"acc_norm": 0.46368715083798884,
"acc_norm_stderr": 0.016678341894533166
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6687534212220169,
"mc2_stderr": 0.015263939252034519
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.010941877955676207
},
"harness|gsm8k|5": {
"acc": 0.6118271417740713,
"acc_stderr": 0.013423607564002757
}
}
```
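The aggregated numbers above can also be pulled programmatically. A minimal sketch, assuming this card follows the same "results" configuration and "latest" split layout as its sibling leaderboard detail datasets:

```python
from datasets import load_dataset

# The "results" configuration aggregates all tasks; the "latest" split
# always points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_jeiku__Cookie_7B",
	"results",
	split="latest")
print(results[0])  # one row of aggregated metrics
```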
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jeiku__Cookie_7B | [
"region:us"
] | 2024-02-17T00:24:04+00:00 | {"pretty_name": "Evaluation run of jeiku/Cookie_7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [jeiku/Cookie_7B](https://huggingface.co/jeiku/Cookie_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeiku__Cookie_7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T00:21:45.959538](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__Cookie_7B/blob/main/results_2024-02-17T00-21-45.959538.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6489384774347392,\n \"acc_stderr\": 0.032131421920616465,\n \"acc_norm\": 0.6498781521461111,\n \"acc_norm_stderr\": 0.032783132740631354,\n \"mc1\": 0.5128518971848225,\n \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6687534212220169,\n \"mc2_stderr\": 0.015263939252034519\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.01364094309194653,\n \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n \"acc_stderr\": 0.004525960965551707,\n \"acc_norm\": 0.8757219677355108,\n \"acc_norm_stderr\": 0.0032922425436373417\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 
0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887027,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887027\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 
0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46368715083798884,\n \"acc_stderr\": 0.016678341894533166,\n \"acc_norm\": 0.46368715083798884,\n \"acc_norm_stderr\": 0.016678341894533166\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6687534212220169,\n \"mc2_stderr\": 0.015263939252034519\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6118271417740713,\n \"acc_stderr\": 0.013423607564002757\n }\n}\n```", "repo_url": "https://huggingface.co/jeiku/Cookie_7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-21-45.959538.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-21-45.959538.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-21-45.959538.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-21-45.959538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-21-45.959538.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-21-45.959538.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["**/details_harness|winogrande|5_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T00-21-45.959538.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T00_21_45.959538", "path": ["results_2024-02-17T00-21-45.959538.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T00-21-45.959538.parquet"]}]}]} | 2024-02-17T00:24:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jeiku/Cookie_7B
Dataset automatically created during the evaluation run of model jeiku/Cookie_7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
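```python
from datasets import load_dataset

# Per-task details for the 5-shot Winogrande eval of this run
data = load_dataset("open-llm-leaderboard/details_jeiku__Cookie_7B",
    "harness_winogrande_5",
    split="train")
```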
## Latest results
These are the latest results from run 2024-02-17T00:21:45.959538 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
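Each per-task configuration also exposes a "latest" split; a minimal sketch, using the "harness_gsm8k_5" config name declared in this card's metadata:
```python
from datasets import load_dataset

# "latest" points at the most recent eval of this task
gsm8k = load_dataset("open-llm-leaderboard/details_jeiku__Cookie_7B",
    "harness_gsm8k_5",
    split="latest")
```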
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jeiku/Cookie_7B\n\n\n\nDataset automatically created during the evaluation run of model jeiku/Cookie_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:21:45.959538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jeiku/Cookie_7B\n\n\n\nDataset automatically created during the evaluation run of model jeiku/Cookie_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:21:45.959538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jeiku/Cookie_7B\n\n\n\nDataset automatically created during the evaluation run of model jeiku/Cookie_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T00:21:45.959538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
1953844eb698af211069a78ce94e32be41ee1b63 |
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Each configuration corresponds to one task; the "train" split always
# points to the latest results for that task.
data = load_dataset("open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s",
	"harness_winogrande_5",
	split="train")
```
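
If you only need the aggregated scores rather than the per-sample details, the `results` configuration can be loaded the same way. A minimal sketch; the exact column layout of the results parquet is not documented here and may vary between harness versions:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# its "latest" split points to the most recent results file.
results = load_dataset(
	"open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s",
	"results",
	split="latest",
)
print(results.column_names)
```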
## Latest results
These are the [latest results from run 2024-02-17T00:25:52.922442](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s/blob/main/results_2024-02-17T00-25-52.922442.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6135127589001832,
"acc_stderr": 0.032796760940438034,
"acc_norm": 0.6157670560754505,
"acc_norm_stderr": 0.033451817306662635,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405327,
"mc2": 0.5477195184186756,
"mc2_stderr": 0.015358664393160576
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839162
},
"harness|hellaswag|10": {
"acc": 0.6493726349332802,
"acc_stderr": 0.00476191251170751,
"acc_norm": 0.841167098187612,
"acc_norm_stderr": 0.003647731723938848
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013316,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013316
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786751,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786751
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306422,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306422
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.0165952597103993,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.0165952597103993
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.047184714852195865,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.047184714852195865
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026622,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026622
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257796,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257796
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765844,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765844
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553707,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553707
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405327,
"mc2": 0.5477195184186756,
"mc2_stderr": 0.015358664393160576
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836676
},
"harness|gsm8k|5": {
"acc": 0.5640636846095527,
"acc_stderr": 0.013658968058849159
}
}
```
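
Because the scores above are plain JSON, they are easy to post-process. Below is a small sketch that recomputes the unweighted mean accuracy over the MMLU (`hendrycksTest`) subjects, assuming the dictionary shown above has been saved locally as `results.json` (a hypothetical filename); note that the leaderboard's own aggregation may weight tasks differently:

```python
import json

# Load a local copy of the results dictionary shown above (hypothetical path).
with open("results.json") as f:
    results = json.load(f)

# Average the accuracy of every MMLU ("hendrycksTest") subject.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subjects, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```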
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s | [
"region:us"
] | 2024-02-17T00:28:11+00:00 | {"pretty_name": "Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T00:25:52.922442](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s/blob/main/results_2024-02-17T00-25-52.922442.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6135127589001832,\n \"acc_stderr\": 0.032796760940438034,\n \"acc_norm\": 0.6157670560754505,\n \"acc_norm_stderr\": 0.033451817306662635,\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405327,\n \"mc2\": 0.5477195184186756,\n \"mc2_stderr\": 0.015358664393160576\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839162\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6493726349332802,\n \"acc_stderr\": 0.00476191251170751,\n \"acc_norm\": 0.841167098187612,\n \"acc_norm_stderr\": 0.003647731723938848\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013316,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013316\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 
0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786751,\n \"acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786751\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 
0.025416343096306422,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306422\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.0165952597103993,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.0165952597103993\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.047184714852195865,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.047184714852195865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026622,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026622\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n 
\"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257796,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257796\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n \"acc_stderr\": 0.012635799922765844,\n \"acc_norm\": 0.4276401564537158,\n \"acc_norm_stderr\": 0.012635799922765844\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553707,\n \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553707\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405327,\n \"mc2\": 0.5477195184186756,\n \"mc2_stderr\": 0.015358664393160576\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 
0.011835872164836676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5640636846095527,\n \"acc_stderr\": 0.013658968058849159\n }\n}\n```", "repo_url": "https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-25-52.922442.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-25-52.922442.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-25-52.922442.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-25-52.922442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-25-52.922442.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["**/details_harness|winogrande|5_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-17T00-25-52.922442.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T00_25_52.922442", "path": ["results_2024-02-17T00-25-52.922442.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T00-25-52.922442.parquet"]}]}]} | 2024-02-17T00:28:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s
Dataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
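A minimal sketch follows; the dataset id is inferred from the leaderboard's `details_<org>__<model>` naming convention (an assumption, since the id is not spelled out in this record), while `harness_winogrande_5` is one of the configurations listed in this record's metadata:

```python
from datasets import load_dataset

# Dataset id assumed from the Open LLM Leaderboard naming pattern
# "open-llm-leaderboard/details_<org>__<model>"; any of the 63 task
# configurations can be passed as the second argument.
data = load_dataset(
    "open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s",
    "harness_winogrande_5",
    split="train",
)
```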
## Latest results
These are the latest results from run 2024-02-17T00:25:52.922442 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:25:52.922442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:25:52.922442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
225,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T00:25:52.922442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations"
] |
15ce75e7a29f1ced3816d1349b09cf9e3a9c6928 |
# Dataset Card for Evaluation run of mayacinka/NeuralZephyr-Beagle-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mayacinka/NeuralZephyr-Beagle-7B](https://huggingface.co/mayacinka/NeuralZephyr-Beagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B",
"harness_winogrande_5",
split="train")
```
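The second argument selects one of the 63 task configurations; their names are listed in this card's metadata. For instance, loading the GSM8K details from the most recent run (both the configuration name and the `latest` split appear in that configuration list):

```python
# "harness_gsm8k_5" and the "latest" split are both listed in this card's
# configuration metadata.
data = load_dataset("open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B",
	"harness_gsm8k_5",
	split="latest")
```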
## Latest results
These are the [latest results from run 2024-02-17T00:55:18.728023](https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B/blob/main/results_2024-02-17T00-55-18.728023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6499857973810573,
"acc_stderr": 0.0322179748198474,
"acc_norm": 0.651015316120668,
"acc_norm_stderr": 0.032873289143278944,
"mc1": 0.4944920440636475,
"mc1_stderr": 0.01750243899045106,
"mc2": 0.6516576799205165,
"mc2_stderr": 0.01520679103207334
},
"harness|arc:challenge|25": {
"acc": 0.6578498293515358,
"acc_stderr": 0.013864152159177275,
"acc_norm": 0.6860068259385665,
"acc_norm_stderr": 0.013562691224726297
},
"harness|hellaswag|10": {
"acc": 0.6852220673172674,
"acc_stderr": 0.004634782156128581,
"acc_norm": 0.8637721569408484,
"acc_norm_stderr": 0.0034232928816321498
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045702,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045702
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487046,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487046
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4944920440636475,
"mc1_stderr": 0.01750243899045106,
"mc2": 0.6516576799205165,
"mc2_stderr": 0.01520679103207334
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019816
},
"harness|gsm8k|5": {
"acc": 0.6345716451857468,
"acc_stderr": 0.013264282030266633
}
}
```
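If you only need these aggregated numbers rather than the per-sample details, a minimal sketch (assuming `huggingface_hub` is installed, and that the raw file may nest the metrics under a `results` key) is to download the results JSON linked above directly:

```python
import json

from huggingface_hub import hf_hub_download

# Repo id and file name are taken from the link above; repo_type="dataset"
# is required because the details live in a dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B",
    filename="results_2024-02-17T00-55-18.728023.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Fall back to the top level if the metrics are not nested (an assumption
# about the file layout).
metrics = raw.get("results", raw)
print(metrics["all"]["acc"])  # 0.6499... -> average accuracy across tasks
```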
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B | [
"region:us"
] | 2024-02-17T00:57:39+00:00 | {"pretty_name": "Evaluation run of mayacinka/NeuralZephyr-Beagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [mayacinka/NeuralZephyr-Beagle-7B](https://huggingface.co/mayacinka/NeuralZephyr-Beagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T00:55:18.728023](https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B/blob/main/results_2024-02-17T00-55-18.728023.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499857973810573,\n \"acc_stderr\": 0.0322179748198474,\n \"acc_norm\": 0.651015316120668,\n \"acc_norm_stderr\": 0.032873289143278944,\n \"mc1\": 0.4944920440636475,\n \"mc1_stderr\": 0.01750243899045106,\n \"mc2\": 0.6516576799205165,\n \"mc2_stderr\": 0.01520679103207334\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.013864152159177275,\n \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.013562691224726297\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6852220673172674,\n \"acc_stderr\": 0.004634782156128581,\n \"acc_norm\": 0.8637721569408484,\n \"acc_norm_stderr\": 0.0034232928816321498\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045702,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045702\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487046,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487046\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4944920440636475,\n \"mc1_stderr\": 0.01750243899045106,\n \"mc2\": 0.6516576799205165,\n \"mc2_stderr\": 0.01520679103207334\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019816\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6345716451857468,\n \"acc_stderr\": 0.013264282030266633\n }\n}\n```", "repo_url": 
"https://huggingface.co/mayacinka/NeuralZephyr-Beagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-55-18.728023.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-55-18.728023.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-55-18.728023.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T00-55-18.728023.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-55-18.728023.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T00-55-18.728023.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["**/details_harness|winogrande|5_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T00-55-18.728023.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T00_55_18.728023", "path": ["results_2024-02-17T00-55-18.728023.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T00-55-18.728023.parquet"]}]}]} | 2024-02-17T00:58:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mayacinka/NeuralZephyr-Beagle-7B
Dataset automatically created during the evaluation run of model mayacinka/NeuralZephyr-Beagle-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
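A minimal sketch, assuming the repository id follows the leaderboard's usual `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Repository id assumed from the model name, following the
# open-llm-leaderboard convention details_<org>__<model>.
data = load_dataset("open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B",
	"harness_winogrande_5",
	split="train")
```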
## Latest results
These are the latest results from run 2024-02-17T00:55:18.728023 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mayacinka/NeuralZephyr-Beagle-7B\n\n\n\nDataset automatically created during the evaluation run of model mayacinka/NeuralZephyr-Beagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:55:18.728023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mayacinka/NeuralZephyr-Beagle-7B\n\n\n\nDataset automatically created during the evaluation run of model mayacinka/NeuralZephyr-Beagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T00:55:18.728023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mayacinka/NeuralZephyr-Beagle-7B\n\n\n\nDataset automatically created during the evaluation run of model mayacinka/NeuralZephyr-Beagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T00:55:18.728023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
3b14c002f73c1b5f5bd61b6b1a37c89b3cc3ec67 | Dialogue rewriting dataset, totaling about 1.6 million entries.
File format: jsonl
Example of a single line:
```
{"dialogue": ["仁波: 我记得古代的金属工艺很厉害啊,那个啥,锻打技术超前的。", "冯艳婷: 没错,像那个景泰蓝,多美啊。"], "last_utterance_rewrite": "冯艳婷: 确实,例如景泰蓝这种金属工艺品,它的外观非常美丽。"}
``` | infgrad/dialogue_rewrite_llm | [
"license:mit",
"region:us"
] | 2024-02-17T02:29:18+00:00 | {"license": "mit"} | 2024-02-17T02:33:52+00:00 | [] | [] | TAGS
#license-mit #region-us
| Dialogue rewriting dataset, totaling about 1.6 million entries.
File format: jsonl
Example of a single line:
| [] | [
"TAGS\n#license-mit #region-us \n"
] | [
11
] | [
"passage: TAGS\n#license-mit #region-us \n"
] |
b8e0ecfdc8e0175a99b997c1237a9b936133ce5f |
# Dataset Card for Evaluation run of ConvexAI/Luminex-34B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/Luminex-34B-v0.1](https://huggingface.co/ConvexAI/Luminex-34B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1",
"harness_winogrande_5",
split="train")
```
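Each configuration also exposes a timestamped split for the underlying run, so a single evaluation can be loaded directly. A short sketch; the config and split names below are taken from this card's metadata:

```python
from datasets import load_dataset

# Load the GSM8K details for this specific run; the split name is the
# run timestamp listed in the card metadata.
gsm8k = load_dataset("open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1",
	"harness_gsm8k_5",
	split="2024_02_17T02_55_24.187790")
```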
## Latest results
These are the [latest results from run 2024-02-17T02:55:24.187790](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1/blob/main/results_2024-02-17T02-55-24.187790.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.763525493697814,
"acc_stderr": 0.02835769353284993,
"acc_norm": 0.7666743377682688,
"acc_norm_stderr": 0.02890561343087252,
"mc1": 0.5250917992656059,
"mc1_stderr": 0.01748144680410401,
"mc2": 0.6967876357426273,
"mc2_stderr": 0.014243776412915276
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266129,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.012875929151297061
},
"harness|hellaswag|10": {
"acc": 0.6719776936865166,
"acc_stderr": 0.00468533484403866,
"acc_norm": 0.8658633738299144,
"acc_norm_stderr": 0.0034010255178737255
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066654,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066654
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.024270227737522715,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.024270227737522715
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846945,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846945
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999998,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999998
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.02767845257821239,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.02767845257821239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277696,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277696
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7380952380952381,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.7380952380952381,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.03295797566311271,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.03295797566311271
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695315,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637306,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637306
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.03038416923235083,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.03038416923235083
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02300545944667394,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02300545944667394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5364238410596026,
"acc_stderr": 0.04071636065944216,
"acc_norm": 0.5364238410596026,
"acc_norm_stderr": 0.04071636065944216
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862083,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862083
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6712962962962963,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.6712962962962963,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342344,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342344
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8404907975460123,
"acc_stderr": 0.02876748172598386,
"acc_norm": 0.8404907975460123,
"acc_norm_stderr": 0.02876748172598386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.01500631280644693,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.01500631280644693
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466143,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466143
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9106002554278416,
"acc_stderr": 0.0102030178476883,
"acc_norm": 0.9106002554278416,
"acc_norm_stderr": 0.0102030178476883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7631284916201118,
"acc_stderr": 0.014219570788103986,
"acc_norm": 0.7631284916201118,
"acc_norm_stderr": 0.014219570788103986
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.020464175124332625,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.020464175124332625
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.01830386880689179,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.01830386880689179
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614098,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5925684485006519,
"acc_stderr": 0.012549473714212224,
"acc_norm": 0.5925684485006519,
"acc_norm_stderr": 0.012549473714212224
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.02352924218519311,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.02352924218519311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.01558864349537047,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.01558864349537047
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916635,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916635
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659407,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659407
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5250917992656059,
"mc1_stderr": 0.01748144680410401,
"mc2": 0.6967876357426273,
"mc2_stderr": 0.014243776412915276
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370616
},
"harness|gsm8k|5": {
"acc": 0.7247915087187263,
"acc_stderr": 0.012302114305862656
}
}
```
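To work with these aggregated numbers programmatically, the "results" configuration can be loaded as well. A minimal sketch, using the config and split names from this card's metadata (the record layout mirrors the JSON above):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split points at the most recent one.
results = load_dataset("open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1",
	"results",
	split="latest")
print(results[0])  # inspect the first aggregated record
```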
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1 | [
"region:us"
] | 2024-02-17T02:57:38+00:00 | {"pretty_name": "Evaluation run of ConvexAI/Luminex-34B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/Luminex-34B-v0.1](https://huggingface.co/ConvexAI/Luminex-34B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T02:55:24.187790](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1/blob/main/results_2024-02-17T02-55-24.187790.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.763525493697814,\n \"acc_stderr\": 0.02835769353284993,\n \"acc_norm\": 0.7666743377682688,\n \"acc_norm_stderr\": 0.02890561343087252,\n \"mc1\": 0.5250917992656059,\n \"mc1_stderr\": 0.01748144680410401,\n \"mc2\": 0.6967876357426273,\n \"mc2_stderr\": 0.014243776412915276\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.012875929151297061\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n \"acc_stderr\": 0.00468533484403866,\n \"acc_norm\": 0.8658633738299144,\n \"acc_norm_stderr\": 0.0034010255178737255\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066654,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066654\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.024270227737522715,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.024270227737522715\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846945,\n \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846945\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.024774516250440182,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.024774516250440182\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999998,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999998\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.02767845257821239,\n \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.02767845257821239\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277696,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277696\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7380952380952381,\n \"acc_stderr\": 0.022644212615525214,\n \"acc_norm\": 0.7380952380952381,\n \"acc_norm_stderr\": 0.022644212615525214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.03295797566311271,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.03295797566311271\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695315,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695315\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.8076923076923077,\n \"acc_stderr\": 0.019982347208637306,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637306\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.03038416923235083,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.03038416923235083\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02300545944667394,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02300545944667394\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5364238410596026,\n \"acc_stderr\": 0.04071636065944216,\n \"acc_norm\": 0.5364238410596026,\n \"acc_norm_stderr\": 0.04071636065944216\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862083,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862083\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6712962962962963,\n \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.6712962962962963,\n \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.02876748172598386,\n \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.02876748172598386\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.01500631280644693,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.01500631280644693\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466143,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466143\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9106002554278416,\n \"acc_stderr\": 0.0102030178476883,\n \"acc_norm\": 0.9106002554278416,\n 
\"acc_norm_stderr\": 0.0102030178476883\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7631284916201118,\n \"acc_stderr\": 0.014219570788103986,\n \"acc_norm\": 0.7631284916201118,\n \"acc_norm_stderr\": 0.014219570788103986\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.020464175124332625,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.020464175124332625\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614098,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614098\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5925684485006519,\n \"acc_stderr\": 0.012549473714212224,\n \"acc_norm\": 0.5925684485006519,\n \"acc_norm_stderr\": 0.012549473714212224\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.02352924218519311,\n \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.02352924218519311\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.01558864349537047,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.01558864349537047\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916635,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916635\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659407,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659407\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5250917992656059,\n \"mc1_stderr\": 0.01748144680410401,\n \"mc2\": 0.6967876357426273,\n \"mc2_stderr\": 0.014243776412915276\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370616\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7247915087187263,\n \"acc_stderr\": 0.012302114305862656\n }\n}\n```", "repo_url": "https://huggingface.co/ConvexAI/Luminex-34B-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|arc:challenge|25_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|gsm8k|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hellaswag|10_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T02-55-24.187790.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T02-55-24.187790.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T02-55-24.187790.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T02-55-24.187790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T02-55-24.187790.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T02-55-24.187790.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["**/details_harness|winogrande|5_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T02-55-24.187790.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T02_55_24.187790", "path": ["results_2024-02-17T02-55-24.187790.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T02-55-24.187790.parquet"]}]}]} | 2024-02-17T02:58:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ConvexAI/Luminex-34B-v0.1
Dataset automatically created during the evaluation run of model ConvexAI/Luminex-34B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
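A minimal sketch, following the loading template these leaderboard detail cards share; the repo id below is an assumption, inferred from the model name via the leaderboard's `details_<org>__<model>` naming convention, and the configuration name is one example among the 63:

```python
# Sketch: load one evaluation split of this details dataset.
# The repo id and config name are assumptions inferred from the card above.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_ConvexAI__Luminex-34B-v0.1",
    "harness_winogrande_5",  # any of the 63 task configurations works here
    split="train",
)
```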
## Latest results
These are the latest results from run 2024-02-17T02:55:24.187790 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
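The full per-task table survives in the metadata blob above; for quick reference, its closing aggregate entries read:

```python
{
    "harness|truthfulqa:mc|0": {
        "mc1": 0.5250917992656059,
        "mc1_stderr": 0.01748144680410401,
        "mc2": 0.6967876357426273,
        "mc2_stderr": 0.014243776412915276
    },
    "harness|winogrande|5": {
        "acc": 0.8342541436464088,
        "acc_stderr": 0.010450899545370616
    },
    "harness|gsm8k|5": {
        "acc": 0.7247915087187263,
        "acc_stderr": 0.012302114305862656
    }
}
```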
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ConvexAI/Luminex-34B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Luminex-34B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T02:55:24.187790(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ConvexAI/Luminex-34B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Luminex-34B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T02:55:24.187790(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ConvexAI/Luminex-34B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Luminex-34B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T02:55:24.187790(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
6c3308c4202509ce7d234c8958a0819a6b07449d | Literally just https://huggingface.co/datasets/teknium/OpenHermes-2.5 but converted to be usable in MLX lora training (assumes ChatML format) | N8Programs/openhermes-2.5-mlx | [
"region:us"
] | 2024-02-17T03:48:00+00:00 | {} | 2024-02-17T04:06:54+00:00 | [] | [] | TAGS
#region-us
| Literally just URL but converted to be usable in MLX lora training (assumes ChatML format) | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
e73989ed2fa96e4cdaccab1c830b5ce70d04c022 |
# Dataset Card for Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/WhyAreWeStillHere-7B-slerp](https://huggingface.co/DreadPoor/WhyAreWeStillHere-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp",
"harness_winogrande_5",
split="train")
```
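To enumerate the available configurations programmatically, the standard `datasets` helper can be used; a small sketch, assuming network access to the Hub:

```python
# Sketch: list the 63 per-task configurations of this details dataset.
from datasets import get_dataset_config_names

configs = get_dataset_config_names(
    "open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp"
)
print(len(configs), configs[:3])
```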
## Latest results
These are the [latest results from run 2024-02-17T03:56:00.783844](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp/blob/main/results_2024-02-17T03-56-00.783844.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6545570581109811,
"acc_stderr": 0.032029886164546564,
"acc_norm": 0.6542595014266129,
"acc_norm_stderr": 0.03270002296417162,
"mc1": 0.5385556915544676,
"mc1_stderr": 0.017451384104637452,
"mc2": 0.6812168345875693,
"mc2_stderr": 0.015141577387322332
},
"harness|arc:challenge|25": {
"acc": 0.6988054607508533,
"acc_stderr": 0.013406741767847632,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7187811192989444,
"acc_stderr": 0.004486752200430352,
"acc_norm": 0.8824935271858195,
"acc_norm_stderr": 0.0032136470410029467
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908352,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849929,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849929
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.01666979959211203,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.01666979959211203
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5385556915544676,
"mc1_stderr": 0.017451384104637452,
"mc2": 0.6812168345875693,
"mc2_stderr": 0.015141577387322332
},
"harness|winogrande|5": {
"acc": 0.8547750591949487,
"acc_stderr": 0.009902153904760824
},
"harness|gsm8k|5": {
"acc": 0.6535253980288097,
"acc_stderr": 0.013107179054313403
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp | [
"region:us"
] | 2024-02-17T03:58:18+00:00 | {"pretty_name": "Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [DreadPoor/WhyAreWeStillHere-7B-slerp](https://huggingface.co/DreadPoor/WhyAreWeStillHere-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T03:56:00.783844](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp/blob/main/results_2024-02-17T03-56-00.783844.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545570581109811,\n \"acc_stderr\": 0.032029886164546564,\n \"acc_norm\": 0.6542595014266129,\n \"acc_norm_stderr\": 0.03270002296417162,\n \"mc1\": 0.5385556915544676,\n \"mc1_stderr\": 0.017451384104637452,\n \"mc2\": 0.6812168345875693,\n \"mc2_stderr\": 0.015141577387322332\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847632,\n \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7187811192989444,\n \"acc_stderr\": 0.004486752200430352,\n \"acc_norm\": 0.8824935271858195,\n \"acc_norm_stderr\": 0.0032136470410029467\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908352,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849929,\n \"acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849929\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n 
\"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992002,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992002\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n \"acc_stderr\": 0.01666979959211203,\n \"acc_norm\": 0.46033519553072627,\n \"acc_norm_stderr\": 0.01666979959211203\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5385556915544676,\n \"mc1_stderr\": 0.017451384104637452,\n \"mc2\": 0.6812168345875693,\n \"mc2_stderr\": 0.015141577387322332\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8547750591949487,\n \"acc_stderr\": 0.009902153904760824\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \"acc_stderr\": 0.013107179054313403\n }\n}\n```", "repo_url": 
"https://huggingface.co/DreadPoor/WhyAreWeStillHere-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|arc:challenge|25_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|gsm8k|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hellaswag|10_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["**/details_harness|winogrande|5_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T03-56-00.783844.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T03_56_00.783844", "path": ["results_2024-02-17T03-56-00.783844.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T03-56-00.783844.parquet"]}]}]} | 2024-02-17T03:58:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp
Dataset automatically created during the evaluation run of model DreadPoor/WhyAreWeStillHere-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
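The loading snippet below is reproduced from this card's metadata; the `harness_winogrande_5` configuration is just one example of the 63 available:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp",
    "harness_winogrande_5",
    split="train")
```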
## Latest results
These are the latest results from run 2024-02-17T03:56:00.783844 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
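The aggregate scores below are abridged from the full results JSON stored in this repo's metadata (the per-subject MMLU entries are omitted here):

```json
{
  "all": {"acc": 0.6545570581109811, "acc_norm": 0.6542595014266129, "mc2": 0.6812168345875693},
  "harness|arc:challenge|25": {"acc_norm": 0.7167235494880546},
  "harness|hellaswag|10": {"acc_norm": 0.8824935271858195},
  "harness|truthfulqa:mc|0": {"mc2": 0.6812168345875693},
  "harness|winogrande|5": {"acc": 0.8547750591949487},
  "harness|gsm8k|5": {"acc": 0.6535253980288097}
}
```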
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/WhyAreWeStillHere-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T03:56:00.783844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/WhyAreWeStillHere-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T03:56:00.783844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/WhyAreWeStillHere-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T03:56:00.783844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
de415925d4729d9f8da8f461a5c49c655eda32e7 |
Retrieval training data with hard negative examples. About 200K samples.
File format: jsonl. A single-line example:
```
{"Query": "大熊猫的饮食习性", "Positive Document": "大熊猫主要以竹子为食,但也会吃水果和小型动物。它们拥有强壮的颌部和牙齿,能够咬碎竹子坚硬的外壳。", "Hard Negative Document": "老虎是肉食性动物,主要捕食鹿、野猪等大型动物。它们的牙齿和爪子非常锋利,是捕猎的利器。"}
``` | infgrad/retrieval_data_llm | [
"size_categories:100K<n<1M",
"language:zh",
"license:mit",
"region:us"
] | 2024-02-17T04:13:09+00:00 | {"language": ["zh"], "license": "mit", "size_categories": ["100K<n<1M"]} | 2024-02-17T04:16:25+00:00 | [] | [
"zh"
] | TAGS
#size_categories-100K<n<1M #language-Chinese #license-mit #region-us
|
Retrieval training data with hard negative examples. About 200K samples.
File format: jsonl. A single-line example:
| [] | [
"TAGS\n#size_categories-100K<n<1M #language-Chinese #license-mit #region-us \n"
] | [
28
] | [
"passage: TAGS\n#size_categories-100K<n<1M #language-Chinese #license-mit #region-us \n"
] |
24a917147cd8a20e4702b782e3c934c9780ccb52 | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset is a condensed subset derived from the SetFit/enron_spam dataset. It comprises two primary columns: 'Text' and 'Label'. The dataset contains 1000 samples for training and 1000 samples for testing, making it suitable for binary text classification tasks.
## Dataset Details
### Columns
- **Text:** Represents the content of an email.
- **Label:** Indicates whether the email is categorized as 'spam' or 'ham' (non-spam).
### Dataset Sources
- **Repository:** https://huggingface.co/datasets/SetFit/enron_spam
## Uses
### Direct Use
```
from datasets import load_dataset

# Returns a DatasetDict with "train" and "validation" splits (1,000 examples each)
spam_dataset = load_dataset("likhith231/enron_spam_small")
``` | likhith231/enron_spam_small | [
"task_categories:text-classification",
"region:us"
] | 2024-02-17T04:36:55+00:00 | {"task_categories": ["text-classification"], "dataset_info": {"features": [{"name": "label", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1452847, "num_examples": 1000}, {"name": "validation", "num_bytes": 1685310, "num_examples": 1000}], "download_size": 1637839, "dataset_size": 3138157}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-17T04:57:46+00:00 | [] | [] | TAGS
#task_categories-text-classification #region-us
| # Dataset Card for Dataset Name
This dataset is a condensed subset derived from the SetFit/enron_spam dataset. It comprises two primary columns: 'Text' and 'Label'. The dataset contains 1000 samples for training and 1000 samples for testing, making it suitable for binary text classification tasks.
## Dataset Details
### Columns
- Text: Represents the content of an email.
- Label: Indicates whether the email is categorized as 'spam' or 'ham' (non-spam).
### Dataset Sources
- Repository: URL
## Uses
### Direct Use
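As an illustration, a simple baseline can be trained directly on the two columns (a hypothetical sketch using scikit-learn; the model choice is illustrative and not part of the original card):

```python
from datasets import load_dataset
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

ds = load_dataset("likhith231/enron_spam_small")
train, valid = ds["train"], ds["validation"]

# TF-IDF features over the raw email text.
vectorizer = TfidfVectorizer(max_features=20_000)
X_train = vectorizer.fit_transform(train["text"])
X_valid = vectorizer.transform(valid["text"])

# Binary spam/ham classifier on the integer labels.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, train["label"])
print(accuracy_score(valid["label"], clf.predict(X_valid)))
```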
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset is a condensed subset derived from the SetFit/enron_spam dataset. It comprises two primary columns: 'Text' and 'Label'. The dataset contains 1000 samples for training and 1000 samples for testing, making it suitable for binary text classification tasks.",
"## Dataset Details",
"### Columns\n\n- Text: Represents the content of an email.\n- Label: Indicates whether the email is categorized as 'spam' or 'ham' (non-spam).",
"### Dataset Sources \n\n- Repository: URL",
"## Uses",
"### Direct Use"
] | [
"TAGS\n#task_categories-text-classification #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset is a condensed subset derived from the SetFit/enron_spam dataset. It comprises two primary columns: 'Text' and 'Label'. The dataset contains 1000 samples for training and 1000 samples for testing, making it suitable for binary text classification tasks.",
"## Dataset Details",
"### Columns\n\n- Text: Represents the content of an email.\n- Label: Indicates whether the email is categorized as 'spam' or 'ham' (non-spam).",
"### Dataset Sources \n\n- Repository: URL",
"## Uses",
"### Direct Use"
] | [
17,
80,
4,
42,
12,
3,
4
] | [
"passage: TAGS\n#task_categories-text-classification #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset is a condensed subset derived from the SetFit/enron_spam dataset. It comprises two primary columns: 'Text' and 'Label'. The dataset contains 1000 samples for training and 1000 samples for testing, making it suitable for binary text classification tasks.## Dataset Details### Columns\n\n- Text: Represents the content of an email.\n- Label: Indicates whether the email is categorized as 'spam' or 'ham' (non-spam).### Dataset Sources \n\n- Repository: URL## Uses### Direct Use"
] |
a50e3e8f18abe2f8e2d89c3b19ae281066e6cb95 |
Total train samples: 26381
Total test samples: 9868
Total tasks: 8
| Task | Train | Test |
| ---- | ----- | ---- |
|reference_number_association_without_question_boxes/2024-02-12|3026|633|
|reference_numbers/2024-02-12|3058|678|
|reference_number_association_with_question_boxes/2024-02-12|3032|633|
|table_cell_incremental_without_question_boxes/2024-02-12|3438|1481|
|table_cell_incremental_with_question_boxes/2024-02-12|4053|1422|
|table_header_with_question_boxes/2024-02-12|3714|3731|
|key_value/2024-02-12|3020|682|
|label_and_location/2024-02-12|3040|608|
Total artifact_qids: 5764
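A minimal loading sketch (the repo id comes from this card; how the individual tasks above are exposed — as configs or as fields — is not specified here, so treat this as an assumption to verify):

```python
from datasets import load_dataset

# Default configuration; per-task access (e.g. "key_value/2024-02-12")
# depends on the repo layout and should be checked before use.
ds = load_dataset("looppayments/question_answering_token_classification_2024_02_01")
print(ds)
```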
| looppayments/question_answering_token_classification_2024_02_01 | [
"region:us"
] | 2024-02-17T04:59:58+00:00 | {"pretty_name": "Question Answering Token Classification"} | 2024-02-17T05:32:45+00:00 | [] | [] | TAGS
#region-us
| Total train samples: 26381
Total test samples: 9868
Total tasks: 8
Task: reference\_number\_association\_without\_question\_boxes/2024-02-12, Train: 3026, Test: 633
Task: reference\_numbers/2024-02-12, Train: 3058, Test: 678
Task: reference\_number\_association\_with\_question\_boxes/2024-02-12, Train: 3032, Test: 633
Task: table\_cell\_incremental\_without\_question\_boxes/2024-02-12, Train: 3438, Test: 1481
Task: table\_cell\_incremental\_with\_question\_boxes/2024-02-12, Train: 4053, Test: 1422
Task: table\_header\_with\_question\_boxes/2024-02-12, Train: 3714, Test: 3731
Task: key\_value/2024-02-12, Train: 3020, Test: 682
Task: label\_and\_location/2024-02-12, Train: 3040, Test: 608
Total artifact\_qids: 5764
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
faf87157dcc4af9a03b0feaa31da854354c3c5f7 |
# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca_dare_ties",
"harness_winogrande_5",
split="train")
```
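Replace `"harness_winogrande_5"` with any other config name from this repo (for example `"harness_gsm8k_5"` or one of the `harness_hendrycksTest_*_5` configs) to load the details for a different task.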
## Latest results
These are the [latest results from run 2024-02-17T05:56:05.382821](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca_dare_ties/blob/main/results_2024-02-17T05-56-05.382821.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6492929541395905,
"acc_stderr": 0.03204314290781419,
"acc_norm": 0.6502356687496237,
"acc_norm_stderr": 0.032693608758353816,
"mc1": 0.41003671970624234,
"mc1_stderr": 0.017217844717449325,
"mc2": 0.5797198662912402,
"mc2_stderr": 0.015180976093776475
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175456,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277366
},
"harness|hellaswag|10": {
"acc": 0.6803425612427804,
"acc_stderr": 0.004653907471785644,
"acc_norm": 0.8638717386974706,
"acc_norm_stderr": 0.003422238702226359
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337128,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337128
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.02394672474156398,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.02394672474156398
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079064,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02767846864214472,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02767846864214472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378459,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378459
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090081,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090081
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41003671970624234,
"mc1_stderr": 0.017217844717449325,
"mc2": 0.5797198662912402,
"mc2_stderr": 0.015180976093776475
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019811
},
"harness|gsm8k|5": {
"acc": 0.6535253980288097,
"acc_stderr": 0.013107179054313401
}
}
```
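As a sketch of how these numbers can be consumed programmatically (the dict layout follows the JSON above; the stand-in values below are copied from it, and the aggregation shown is illustrative rather than the leaderboard's official formula):

```python
# Stand-in for the full dict printed above; in practice, parse it from the
# results JSON file linked in this card.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6776315789473685},
    "harness|winogrande|5": {"acc": 0.8113654301499605},
}

# Mean accuracy over the hendrycksTest (MMLU) subtasks only.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU tasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```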
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca_dare_ties | [
"region:us"
] | 2024-02-17T05:58:24+00:00 | {"pretty_name": "Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca_dare_ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T05:56:05.382821](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca_dare_ties/blob/main/results_2024-02-17T05-56-05.382821.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6492929541395905,\n \"acc_stderr\": 0.03204314290781419,\n \"acc_norm\": 0.6502356687496237,\n \"acc_norm_stderr\": 0.032693608758353816,\n \"mc1\": 0.41003671970624234,\n \"mc1_stderr\": 0.017217844717449325,\n \"mc2\": 0.5797198662912402,\n \"mc2_stderr\": 0.015180976093776475\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175456,\n \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277366\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6803425612427804,\n \"acc_stderr\": 0.004653907471785644,\n \"acc_norm\": 0.8638717386974706,\n \"acc_norm_stderr\": 0.003422238702226359\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337128,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337128\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n 
\"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.02394672474156398,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.02394672474156398\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079064,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079064\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02767846864214472,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02767846864214472\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41003671970624234,\n \"mc1_stderr\": 0.017217844717449325,\n \"mc2\": 0.5797198662912402,\n \"mc2_stderr\": 0.015180976093776475\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019811\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6535253980288097,\n \"acc_stderr\": 0.013107179054313401\n }\n}\n```", "repo_url": "https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|arc:challenge|25_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|gsm8k|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hellaswag|10_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T05-56-05.382821.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T05-56-05.382821.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T05-56-05.382821.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T05-56-05.382821.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T05-56-05.382821.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["**/details_harness|winogrande|5_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-17T05-56-05.382821.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T05_56_05.382821", "path": ["results_2024-02-17T05-56-05.382821.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T05-56-05.382821.parquet"]}]}]} | 2024-02-17T05:58:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties
Dataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
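For example, a minimal sketch (assuming the leaderboard's standard `details_<org>__<model>` repository naming for this model; the config and split names are the ones listed in this card):

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's details_<org>__<model> naming.
data = load_dataset(
    "open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca_dare_ties",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # "train" always points to the latest run
)
```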
## Latest results
These are the latest results from run 2024-02-17T05:56:05.382821 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T05:56:05.382821(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T05:56:05.382821(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Maid_Samantha_Hermes_Orca_dare_ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T05:56:05.382821(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
bc77c550cf1aef5d2f0645e048826f1c05bb68e5 |
# Dataset Card for Evaluation run of macadeliccc/SmaugDolphin-60B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/SmaugDolphin-60B](https://huggingface.co/macadeliccc/SmaugDolphin-60B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_macadeliccc__SmaugDolphin-60B",
	"harness_winogrande_5",  # one of the 63 task configurations
	split="train")  # "train" always points to the latest run
```
## Latest results
These are the [latest results from run 2024-02-17T06:01:33.258417](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SmaugDolphin-60B/blob/main/results_2024-02-17T06-01-33.258417.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7651915724176592,
"acc_stderr": 0.02813245689172726,
"acc_norm": 0.7689266432773564,
"acc_norm_stderr": 0.02866672384399442,
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6743783536659786,
"mc2_stderr": 0.014425170450824752
},
"harness|arc:challenge|25": {
"acc": 0.697098976109215,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523217
},
"harness|hellaswag|10": {
"acc": 0.6674965146385182,
"acc_stderr": 0.004701474865207032,
"acc_norm": 0.8654650468034256,
"acc_norm_stderr": 0.003405288007233201
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7574468085106383,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.7574468085106383,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277696,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277696
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7275132275132276,
"acc_stderr": 0.022930973071633363,
"acc_norm": 0.7275132275132276,
"acc_norm_stderr": 0.022930973071633363
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103452,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103452
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6798029556650246,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.6798029556650246,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656177,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656177
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084329,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084329
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.019776601086550032,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.019776601086550032
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.44814814814814813,
"acc_stderr": 0.030321167196316286,
"acc_norm": 0.44814814814814813,
"acc_norm_stderr": 0.030321167196316286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.021863258494852107,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.021863258494852107
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116241,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116241
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356517,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.026222235171477374,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.026222235171477374
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035216,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035216
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.031457038543062504,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.031457038543062504
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9106002554278416,
"acc_stderr": 0.0102030178476883,
"acc_norm": 0.9106002554278416,
"acc_norm_stderr": 0.0102030178476883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7955307262569833,
"acc_stderr": 0.013488813404711917,
"acc_norm": 0.7955307262569833,
"acc_norm_stderr": 0.013488813404711917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.020464175124332625,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.020464175124332625
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281127,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281127
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6418439716312057,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.6418439716312057,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5853976531942634,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.5853976531942634,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.022571771025494746,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.022571771025494746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736854,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736854
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6743783536659786,
"mc2_stderr": 0.014425170450824752
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237428
},
"harness|gsm8k|5": {
"acc": 0.709628506444276,
"acc_stderr": 0.012503592481818954
}
}
```
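As an illustrative sketch (not part of the evaluation output), the aggregated numbers above can also be fetched programmatically from the `results` configuration, whose "latest" split mirrors the most recent run:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points
# at the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__SmaugDolphin-60B",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics per run
```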
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__SmaugDolphin-60B | [
"region:us"
] | 2024-02-17T06:03:46+00:00 | {"pretty_name": "Evaluation run of macadeliccc/SmaugDolphin-60B", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/SmaugDolphin-60B](https://huggingface.co/macadeliccc/SmaugDolphin-60B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__SmaugDolphin-60B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T06:01:33.258417](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SmaugDolphin-60B/blob/main/results_2024-02-17T06-01-33.258417.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7651915724176592,\n \"acc_stderr\": 0.02813245689172726,\n \"acc_norm\": 0.7689266432773564,\n \"acc_norm_stderr\": 0.02866672384399442,\n \"mc1\": 0.5006119951040392,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6743783536659786,\n \"mc2_stderr\": 0.014425170450824752\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523217\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6674965146385182,\n \"acc_stderr\": 0.004701474865207032,\n \"acc_norm\": 0.8654650468034256,\n \"acc_norm_stderr\": 0.003405288007233201\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.024774516250440182,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.024774516250440182\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7574468085106383,\n \"acc_stderr\": 0.02802022627120022,\n \"acc_norm\": 0.7574468085106383,\n \"acc_norm_stderr\": 0.02802022627120022\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277696,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277696\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7275132275132276,\n \"acc_stderr\": 0.022930973071633363,\n \"acc_norm\": 0.7275132275132276,\n \"acc_norm_stderr\": 0.022930973071633363\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.01730838128103452,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.01730838128103452\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6798029556650246,\n \"acc_stderr\": 0.032826493853041504,\n \"acc_norm\": 0.6798029556650246,\n \"acc_norm_stderr\": 0.032826493853041504\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656177,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656177\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084329,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084329\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.8128205128205128,\n \"acc_stderr\": 0.019776601086550032,\n \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.019776601086550032\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.44814814814814813,\n \"acc_stderr\": 0.030321167196316286,\n \"acc_norm\": 0.44814814814814813,\n \"acc_norm_stderr\": 0.030321167196316286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.021863258494852107,\n \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.021863258494852107\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116241,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116241\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n \"acc_stderr\": 0.025998379092356517,\n \"acc_norm\": 0.8161434977578476,\n \"acc_norm_stderr\": 0.025998379092356517\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.026222235171477374,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.026222235171477374\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035216,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035216\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.031457038543062504,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.031457038543062504\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9106002554278416,\n \"acc_stderr\": 0.0102030178476883,\n 
\"acc_norm\": 0.9106002554278416,\n \"acc_norm_stderr\": 0.0102030178476883\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7955307262569833,\n \"acc_stderr\": 0.013488813404711917,\n \"acc_norm\": 0.7955307262569833,\n \"acc_norm_stderr\": 0.013488813404711917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.020464175124332625,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.020464175124332625\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281127,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281127\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6418439716312057,\n \"acc_stderr\": 0.028602085862759422,\n \"acc_norm\": 0.6418439716312057,\n \"acc_norm_stderr\": 0.028602085862759422\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5853976531942634,\n \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.5853976531942634,\n \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.022571771025494746,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.022571771025494746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736854,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736854\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5006119951040392,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6743783536659786,\n \"mc2_stderr\": 0.014425170450824752\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.709628506444276,\n \"acc_stderr\": 0.012503592481818954\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/SmaugDolphin-60B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|arc:challenge|25_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|gsm8k|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hellaswag|10_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-01-33.258417.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-01-33.258417.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-01-33.258417.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T06-01-33.258417.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-01-33.258417.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-01-33.258417.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["**/details_harness|winogrande|5_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T06-01-33.258417.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T06_01_33.258417", "path": ["results_2024-02-17T06-01-33.258417.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T06-01-33.258417.parquet"]}]}]} | 2024-02-17T06:04:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/SmaugDolphin-60B
Dataset automatically created during the evaluation run of model macadeliccc/SmaugDolphin-60B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
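The snippet below is a minimal sketch of that call; the repository id is assumed here from the leaderboard's usual `details_{org}__{model}` naming pattern, and the `harness_winogrande_5` configuration is used purely as an example.

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run.
data = load_dataset("open-llm-leaderboard/details_macadeliccc__SmaugDolphin-60B",
	"harness_winogrande_5",
	split="train")
```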
## Latest results
These are the latest results from run 2024-02-17T06:01:33.258417 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/SmaugDolphin-60B\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SmaugDolphin-60B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T06:01:33.258417(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/SmaugDolphin-60B\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SmaugDolphin-60B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T06:01:33.258417(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of macadeliccc/SmaugDolphin-60B\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SmaugDolphin-60B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T06:01:33.258417(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
d4020a57e34373d70476747119c1ed4861b90eb1 |
# Dataset Card for Evaluation run of fzzhang/toten_gsm8k_merged_s
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/toten_gsm8k_merged_s](https://huggingface.co/fzzhang/toten_gsm8k_merged_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s",
"harness_winogrande_5",
split="train")
```
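If you want the aggregated metrics for the run rather than a single task's details, the same call works with the `results` configuration; the sketch below uses the `latest` split, which always resolves to the most recent evaluation timestamp:

```python
from datasets import load_dataset

# Aggregated results of the run; "latest" points at the newest timestamped split.
results = load_dataset("open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s",
	"results",
	split="latest")
print(results)
```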
## Latest results
These are the [latest results from run 2024-02-17T06:12:40.059000](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s/blob/main/results_2024-02-17T06-12-40.059000.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6303593896054516,
"acc_stderr": 0.032532314502917783,
"acc_norm": 0.6323776185375855,
"acc_norm_stderr": 0.033183029878923596,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5491954242070916,
"mc2_stderr": 0.015160040858276722
},
"harness|arc:challenge|25": {
"acc": 0.6100682593856656,
"acc_stderr": 0.014252959848892896,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.01391303452962045
},
"harness|hellaswag|10": {
"acc": 0.6602270464050985,
"acc_stderr": 0.00472664053256204,
"acc_norm": 0.847042421828321,
"acc_norm_stderr": 0.0035921097436286175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943766,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943766
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089552,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431388,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431388
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397108,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397108
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787687,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787687
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5491954242070916,
"mc2_stderr": 0.015160040858276722
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.5921152388172858,
"acc_stderr": 0.013536742075643088
}
}
```
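As a quick sanity check on the numbers above, the per-task MMLU accuracies can be averaged directly from this JSON. A minimal sketch, assuming the block above has been saved locally as `results.json` (a hypothetical path; the actual file in this repo is `results_2024-02-17T06-12-40.059000.json`):

```python
import json

# Average the accuracy over all MMLU (hendrycksTest) subtasks in the results JSON.
with open("results.json") as f:
    results = json.load(f)

mmlu_accs = [m["acc"] for task, m in results.items()
             if task.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```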
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s | [
"region:us"
] | 2024-02-17T06:14:58+00:00 | {"pretty_name": "Evaluation run of fzzhang/toten_gsm8k_merged_s", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/toten_gsm8k_merged_s](https://huggingface.co/fzzhang/toten_gsm8k_merged_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T06:12:40.059000](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s/blob/main/results_2024-02-17T06-12-40.059000.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6303593896054516,\n \"acc_stderr\": 0.032532314502917783,\n \"acc_norm\": 0.6323776185375855,\n \"acc_norm_stderr\": 0.033183029878923596,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5491954242070916,\n \"mc2_stderr\": 0.015160040858276722\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6100682593856656,\n \"acc_stderr\": 0.014252959848892896,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.01391303452962045\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6602270464050985,\n \"acc_stderr\": 0.00472664053256204,\n \"acc_norm\": 0.847042421828321,\n \"acc_norm_stderr\": 0.0035921097436286175\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943766,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943766\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.02447224384089552,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.02447224384089552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431388,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431388\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n \"acc_stderr\": 0.015366860386397108,\n \"acc_norm\": 0.3027932960893855,\n \"acc_norm_stderr\": 0.015366860386397108\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n \"acc_stderr\": 0.012689708167787687,\n \"acc_norm\": 0.4439374185136897,\n \"acc_norm_stderr\": 0.012689708167787687\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5491954242070916,\n \"mc2_stderr\": 0.015160040858276722\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5921152388172858,\n \"acc_stderr\": 0.013536742075643088\n 
}\n}\n```", "repo_url": "https://huggingface.co/fzzhang/toten_gsm8k_merged_s", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|arc:challenge|25_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|gsm8k|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hellaswag|10_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-12-40.059000.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-12-40.059000.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-12-40.059000.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T06-12-40.059000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-12-40.059000.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T06_12_40.059000", "path": ["**/details_harness|winogrande|5_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T06-12-40.059000.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T06_12_40.059000", "path": ["results_2024-02-17T06-12-40.059000.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T06-12-40.059000.parquet"]}]}]} | 2024-02-17T06:15:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fzzhang/toten_gsm8k_merged_s
Dataset automatically created during the evaluation run of model fzzhang/toten_gsm8k_merged_s on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
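The snippet itself is omitted in this condensed card text; a minimal sketch is given below. It assumes this run follows the same `details_{org}__{model}` repository naming used by the other cards in this dump, and it uses a config name and split (`harness_gsm8k_5`, `latest`) taken from this record's metadata.

```python
from datasets import load_dataset

# Per-sample details for the 5-shot GSM8K eval of this run; the
# "latest" split mirrors the most recent timestamped run.
# Repository name assumed from the details_{org}__{model} pattern.
data = load_dataset(
    "open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s",
    "harness_gsm8k_5",
    split="latest",
)
```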
## Latest results
These are the latest results from run 2024-02-17T06:12:40.059000 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
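For the aggregated numbers referenced here, a similar sketch (under the same repository-naming assumption as above) loads the `results` configuration, whose `latest` split is listed in this record's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "results" and its "latest" split
# are taken from the configs listed in this record's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s",
    "results",
    split="latest",
)
```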
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fzzhang/toten_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/toten_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T06:12:40.059000(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fzzhang/toten_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/toten_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T06:12:40.059000(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fzzhang/toten_gsm8k_merged_s\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/toten_gsm8k_merged_s on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T06:12:40.059000(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
67fdf7b49f7c6d3e7a35286ee6c810668bc1bf94 |
# Dataset Card for Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/WizardLM-Math-70B-v0.1](https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# Per-sample details for the 5-shot Winogrande task; the "train"
# split points at the latest results.
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1",
"harness_winogrande_5",
split="train")
```
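To enumerate the 63 configurations before loading one, the standard `datasets` helper can be used; a short sketch (not part of the original card) follows:

```python
from datasets import get_dataset_config_names

# Lists every evaluation config, e.g. "harness_arc_challenge_25",
# "harness_gsm8k_5", ..., plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1"
)
print(configs)
```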
## Latest results
These are the [latest results from run 2024-02-18T02:24:17.988962](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1/blob/main/results_2024-02-18T02-24-17.988962.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6914116069568377,
"acc_stderr": 0.03063431437342948,
"acc_norm": 0.6938613221179539,
"acc_norm_stderr": 0.031238741076549784,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5707095526544473,
"mc2_stderr": 0.01525040450448649
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168482,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6746664011153157,
"acc_stderr": 0.0046754187743142306,
"acc_norm": 0.8600876319458275,
"acc_norm_stderr": 0.0034618713240671846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822502,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822502
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465953,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465953
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.02626502460827588,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.02626502460827588
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683776,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265593,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265593
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097655,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097655
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216054,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5553072625698324,
"acc_stderr": 0.016619881988177012,
"acc_norm": 0.5553072625698324,
"acc_norm_stderr": 0.016619881988177012
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.02240967454730417,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.02240967454730417
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.029736592526424445,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.029736592526424445
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5586701434159062,
"acc_stderr": 0.012682016335646683,
"acc_norm": 0.5586701434159062,
"acc_norm_stderr": 0.012682016335646683
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625162,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625162
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.017242385828779627,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.017242385828779627
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5707095526544473,
"mc2_stderr": 0.01525040450448649
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267207
},
"harness|gsm8k|5": {
"acc": 0.6444275966641395,
"acc_stderr": 0.013185402252713852
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1 | [
"region:us"
] | 2024-02-17T07:29:49+00:00 | {"pretty_name": "Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [MaziyarPanahi/WizardLM-Math-70B-v0.1](https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-18T02:24:17.988962](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1/blob/main/results_2024-02-18T02-24-17.988962.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6914116069568377,\n \"acc_stderr\": 0.03063431437342948,\n \"acc_norm\": 0.6938613221179539,\n \"acc_norm_stderr\": 0.031238741076549784,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5707095526544473,\n \"mc2_stderr\": 0.01525040450448649\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168482,\n \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6746664011153157,\n \"acc_stderr\": 0.0046754187743142306,\n \"acc_norm\": 0.8600876319458275,\n \"acc_norm_stderr\": 0.0034618713240671846\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822502,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822502\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465953,\n \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465953\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02626502460827588,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02626502460827588\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683776,\n \"acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265593,\n \"acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265593\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097655,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097655\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8697318007662835,\n \"acc_stderr\": 0.012036729568216054,\n \"acc_norm\": 0.8697318007662835,\n \"acc_norm_stderr\": 0.012036729568216054\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5553072625698324,\n \"acc_stderr\": 0.016619881988177012,\n \"acc_norm\": 0.5553072625698324,\n \"acc_norm_stderr\": 0.016619881988177012\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.024739981355113592,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.024739981355113592\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.02240967454730417,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.02240967454730417\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.029736592526424445,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.029736592526424445\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5586701434159062,\n \"acc_stderr\": 0.012682016335646683,\n \"acc_norm\": 0.5586701434159062,\n \"acc_norm_stderr\": 0.012682016335646683\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625162,\n \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625162\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.017242385828779627,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.017242385828779627\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5707095526544473,\n \"mc2_stderr\": 0.01525040450448649\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6444275966641395,\n \"acc_stderr\": 0.013185402252713852\n 
}\n}\n```", "repo_url": "https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|arc:challenge|25_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|arc:challenge|25_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|arc:challenge|25_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|gsm8k|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|gsm8k|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|gsm8k|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hellaswag|10_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hellaswag|10_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hellaswag|10_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T07-27-21.521286.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T07-27-21.521286.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T08-53-12.931134.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": 
"2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|winogrande|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|winogrande|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|winogrande|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["results_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": 
["results_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["results_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["results_2024-02-18T02-24-17.988962.parquet"]}]}]} | 2024-02-17T08:55:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1
Dataset automatically created during the evaluation run of model MaziyarPanahi/WizardLM-Math-70B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
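For example (a minimal sketch; the repository id below follows the `details_<org>__<model>` naming convention these leaderboard detail datasets use, and `harness_winogrande_5` is one of the 63 configurations):

```python
from datasets import load_dataset

# Load the details of a single evaluated task; the "train" split
# always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1",
    "harness_winogrande_5",
    split="train",
)
```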
## Latest results
These are the latest results from run 2024-02-18T02:24:17.988962 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model MaziyarPanahi/WizardLM-Math-70B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-18T02:24:17.988962(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model MaziyarPanahi/WizardLM-Math-70B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-18T02:24:17.988962(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model MaziyarPanahi/WizardLM-Math-70B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-18T02:24:17.988962(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
4548bcf07b77c84a36fbbeb136ffb4dd4978a2c7 |
# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/NeuralTrix-7B-dpo-relaser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-relaser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser",
"harness_winogrande_5",
split="train")
```
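The aggregated metrics can be loaded the same way; for instance (a minimal sketch, assuming only that the `results` configuration and the `latest` split exist as described above):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; "latest"
# always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser",
    "results",
    split="latest",
)
```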
## Latest results
These are the [latest results from run 2024-02-17T09:02:39.812409](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser/blob/main/results_2024-02-17T09-02-39.812409.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6457696459415244,
"acc_stderr": 0.032218510960558784,
"acc_norm": 0.6454366964555287,
"acc_norm_stderr": 0.032889882572498676,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7797569119907752,
"mc2_stderr": 0.01376117718949703
},
"harness|arc:challenge|25": {
"acc": 0.6885665529010239,
"acc_stderr": 0.013532472099850947,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274772
},
"harness|hellaswag|10": {
"acc": 0.6978689504082852,
"acc_stderr": 0.004582433109636472,
"acc_norm": 0.8840868352917746,
"acc_norm_stderr": 0.0031946652660786025
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546954,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546954
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608452,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608452
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092427,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092427
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7797569119907752,
"mc2_stderr": 0.01376117718949703
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.6815769522365428,
"acc_stderr": 0.012832225723075408
}
}
```
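For a quick sanity check on these numbers, the per-task JSON above can be aggregated directly; a sketch (assuming the block has been saved locally as `results.json`, a hypothetical filename):

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Average accuracy over the "hendrycksTest" (MMLU) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc: {sum(mmlu) / len(mmlu):.4f}")
```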
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser | [
"region:us"
] | 2024-02-17T09:05:00+00:00 | {"pretty_name": "Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/NeuralTrix-7B-dpo-relaser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-relaser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T09:02:39.812409](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser/blob/main/results_2024-02-17T09-02-39.812409.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6457696459415244,\n \"acc_stderr\": 0.032218510960558784,\n \"acc_norm\": 0.6454366964555287,\n \"acc_norm_stderr\": 0.032889882572498676,\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7797569119907752,\n \"mc2_stderr\": 0.01376117718949703\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6885665529010239,\n \"acc_stderr\": 0.013532472099850947,\n \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274772\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6978689504082852,\n \"acc_stderr\": 0.004582433109636472,\n \"acc_norm\": 0.8840868352917746,\n \"acc_norm_stderr\": 0.0031946652660786025\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546954,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546954\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608452,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608452\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092427,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092427\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 
0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7797569119907752,\n \"mc2_stderr\": 0.01376117718949703\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6815769522365428,\n \"acc_stderr\": 0.012832225723075408\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-relaser", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|arc:challenge|25_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|gsm8k|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hellaswag|10_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|winogrande|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["results_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T09-02-39.812409.parquet"]}]}]} | 2024-02-17T09:05:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser
Dataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-relaser on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
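A minimal sketch (the repository name below is inferred from the model id, and "harness_winogrande_5" is one of the configs declared in this dataset's metadata, so treat both as assumptions):

```python
from datasets import load_dataset

# Repository name inferred from the model id; the config name selects one task's details.
data = load_dataset("open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser",
	"harness_winogrande_5",
	split="train")
```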
## Latest results
These are the latest results from run 2024-02-17T09:02:39.812409 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-relaser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T09:02:39.812409(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-relaser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T09:02:39.812409(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-relaser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T09:02:39.812409(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
2790f707bb56e84073bd13d330f001a06859ad8d | # Dataset Card for "RefGPT-Fact-v2"
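Based on the split names declared in this dataset's metadata (`zh` and `en`), a minimal loading sketch would look like this:

```python
from datasets import load_dataset

# Split names ("zh", "en") come from the dataset_info metadata; the declared
# columns are "dialogue", "reference", "language", and "type".
en = load_dataset("Mutonix/RefGPT-Fact-v2", split="en")
print(en[0]["dialogue"][:200])
```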
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Mutonix/RefGPT-Fact-v2 | [
"region:us"
] | 2024-02-17T09:05:53+00:00 | {"dataset_info": {"features": [{"name": "dialogue", "dtype": "string"}, {"name": "reference", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "type", "dtype": "string"}], "splits": [{"name": "zh", "num_bytes": 605149267, "num_examples": 143154}, {"name": "en", "num_bytes": 4303277735, "num_examples": 522354}], "download_size": 599049240, "dataset_size": 4908427002}} | 2024-02-17T13:04:19+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "RefGPT-Fact-v2"
More Information needed | [
"# Dataset Card for \"RefGPT-Fact-v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"RefGPT-Fact-v2\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"RefGPT-Fact-v2\"\n\nMore Information needed"
] |
975a5a58fa42f9fa11249f888d823b156a34b74e | from diffusers import DiffusionPipeline
pipeline = DiffusionPipeline.from_pretrained("stabilityai/stable-video-diffusion-img2vid-xt") | dsplz/dasdasd | [
"region:us"
] | 2024-02-17T09:22:56+00:00 | {} | 2024-02-17T09:23:05+00:00 | [] | [] | TAGS
#region-us
| from diffusers import DiffusionPipeline
pipeline = DiffusionPipeline.from_pretrained("stabilityai/stable-video-diffusion-img2vid-xt") | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
4646a7d3b5383437f10eb5f2ea590d15b66f72b9 | # Degarbayan-SC: A Colloquial Paraphrase Farsi using pre-trained mT5
This is the dataset of the [Degarbayan-SC paper](https://ieeexplore.ieee.org/abstract/document/9959983).
You can fine-tune transformers models with this dataset using [Github](https://github.com/m0javad/Degarbayan-SC).
```python
from datasets import load_dataset
dataset = load_dataset("m0javad/Degarbayan-SC-dataset")
```
### Statistics
![Length of sentences](https://i.ibb.co/C1RJhTZ/lenght.jpg)
Our sentence length distribution ranges from 3 to 19 words, with an average of 8 words per sentence. This makes sense because movie subtitles are shown for a limited span of time and we matched the sentence pairs by those time spans; humans can only say a certain number of words in a given period of time. Our collected sentences contain 128,699 unique words.
As you can see in the table above, our dataset contains a large number of paraphrase sentences in various forms, such as syntactic, semantic, and conceptual paraphrases.
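As a rough illustration (not code from the paper), the word-level statistics above can be computed along these lines; the sample sentences are placeholders:

```python
from statistics import mean

def length_stats(sentences):
    # Word counts per sentence, as used for the distribution reported above.
    lengths = [len(s.split()) for s in sentences]
    return min(lengths), max(lengths), mean(lengths)

# Placeholder sentences; in practice, iterate over the loaded dataset's text fields.
print(length_stats(["یک جمله کوتاه", "این جمله کمی بلندتر از جمله قبلی است"]))
```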
### Contact
Contact me for contributions and possible future work at: [email protected] | m0javad/Degarbayan-SC-dataset | [
"task_categories:text-generation",
"task_categories:conversational",
"task_categories:text2text-generation",
"size_categories:100M<n<1B",
"language:fa",
"region:us"
] | 2024-02-17T09:49:03+00:00 | {"language": ["fa"], "size_categories": ["100M<n<1B"], "task_categories": ["text-generation", "conversational", "text2text-generation"]} | 2024-02-17T10:00:27+00:00 | [] | [
"fa"
] | TAGS
#task_categories-text-generation #task_categories-conversational #task_categories-text2text-generation #size_categories-100M<n<1B #language-Persian #region-us
| # Degarbayan-SC: A Colloquial Paraphrase Farsi using pre-trained mT5
This is the dataset of the Degarbayan-SC paper.
You can fine-tune transformers models with this dataset using Github.
### Statistics
!Length of sentences
Our sentence length distribution ranges from 3 to 19 words, with an average of 8 words per sentence. This makes sense because movie subtitles are shown for a limited span of time and we matched the sentence pairs by those time spans; humans can only say a certain number of words in a given period of time. Our collected sentences contain 128,699 unique words.
As you can see in the table above, our dataset contains a large number of paraphrase sentences in various forms, such as syntactic, semantic, and conceptual paraphrases.
### Contact
Contact me for contributions and possible future work at: URL@URL | [
"# Degarbayan-SC: A Colloquial Paraphrase Farsi using pre-trained mT5\n\nThis is the Dataset of Degarbayan-SC paper.\nYou can Finetune with this dataset on the transformers models using Github.",
"### Statistic\n!Lenght of sentences\n\nour sentence length distribution is between 3 and 19 words and sentences are an average of 8 words. This makes sense because in the movie subtitles, sentences are shown in a range of time and we matched them with timespans. Humans can say a certain number of words in a certain period of time. Our collected sentences have 128,699 unique words.\n\nas you see in the table above, our dataset contains a large number of paraphrasing sentences in various forms such syntactic, semantic and conceptual paraphrases.",
"### contact\n\ncontact me for contribution and future possible works at: URL@URL"
] | [
"TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-text2text-generation #size_categories-100M<n<1B #language-Persian #region-us \n",
"# Degarbayan-SC: A Colloquial Paraphrase Farsi using pre-trained mT5\n\nThis is the Dataset of Degarbayan-SC paper.\nYou can Finetune with this dataset on the transformers models using Github.",
"### Statistic\n!Lenght of sentences\n\nour sentence length distribution is between 3 and 19 words and sentences are an average of 8 words. This makes sense because in the movie subtitles, sentences are shown in a range of time and we matched them with timespans. Humans can say a certain number of words in a certain period of time. Our collected sentences have 128,699 unique words.\n\nas you see in the table above, our dataset contains a large number of paraphrasing sentences in various forms such syntactic, semantic and conceptual paraphrases.",
"### contact\n\ncontact me for contribution and future possible works at: URL@URL"
] | [
57,
55,
129,
16
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-text2text-generation #size_categories-100M<n<1B #language-Persian #region-us \n# Degarbayan-SC: A Colloquial Paraphrase Farsi using pre-trained mT5\n\nThis is the Dataset of Degarbayan-SC paper.\nYou can Finetune with this dataset on the transformers models using Github.### Statistic\n!Lenght of sentences\n\nour sentence length distribution is between 3 and 19 words and sentences are an average of 8 words. This makes sense because in the movie subtitles, sentences are shown in a range of time and we matched them with timespans. Humans can say a certain number of words in a certain period of time. Our collected sentences have 128,699 unique words.\n\nas you see in the table above, our dataset contains a large number of paraphrasing sentences in various forms such syntactic, semantic and conceptual paraphrases.### contact\n\ncontact me for contribution and future possible works at: URL@URL"
] |
4cd2ff54497a83d2d1f24cce672e7d38ba20b850 | # Strix
![image/png](https://cdn-uploads.huggingface.co/production/uploads/65aa2d4b356bf23b4a4da247/P0fjr265dE_KFfv-dn35t.png)
134k question-answer pairs based on [AiresPucrs'](https://huggingface.co/datasets/AiresPucrs/stanford-encyclopedia-philosophy) [stanford-encyclopedia-philosophy](https://huggingface.co/datasets/AiresPucrs/stanford-encyclopedia-philosophy) dataset. | sayhan/strix-philosophy-qa | [
"task_categories:question-answering",
"language:en",
"license:apache-2.0",
"philosophy",
"region:us"
] | 2024-02-17T09:50:32+00:00 | {"language": ["en"], "license": "apache-2.0", "task_categories": ["question-answering"], "tags": ["philosophy"]} | 2024-02-17T10:34:28+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #language-English #license-apache-2.0 #philosophy #region-us
| # Strix
!image/png
134k question-answer pairs based on AiresPucrs' stanford-encyclopedia-philosophy dataset. | [
"# Strix\n!image/png\n\n134k question-answer pairs based on AiresPucrs' stanford-encyclopedia-philosophy dataset."
] | [
"TAGS\n#task_categories-question-answering #language-English #license-apache-2.0 #philosophy #region-us \n",
"# Strix\n!image/png\n\n134k question-answer pairs based on AiresPucrs' stanford-encyclopedia-philosophy dataset."
] | [
33,
33
] | [
"passage: TAGS\n#task_categories-question-answering #language-English #license-apache-2.0 #philosophy #region-us \n# Strix\n!image/png\n\n134k question-answer pairs based on AiresPucrs' stanford-encyclopedia-philosophy dataset."
] |
b7608b1f149519dcfb3d84706bcbf09f69878b8c |
This is a dataset that was created from [HuggingFaceH4/OpenHermes-2.5-1k-longest](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest).
The purpose is to be able to use it in an [axolotl](https://github.com/OpenAccess-AI-Collective/axolotl) config by adding:
```yaml
datasets:
- path: Mihaiii/OpenHermes-2.5-1k-longest-curated
type: alpaca
```
I eliminated rows that:
1) Had a system prompt (only 3 rows eliminated)
2) Contained an output with a character repeated 10 times in a row (478 rows eliminated)
So from a 1,000-row dataset, I ended up with a 519-row dataset.
See the [OpenHermes-2.5-1k-longest-curated.ipynb](https://huggingface.co/datasets/Mihaiii/OpenHermes-2.5-1k-longest-curated/blob/main/OpenHermes-2.5-1k-longest-curated.ipynb) notebook for details on how the dataset was constructed.
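For reference, a minimal sketch of the two filters described above (the `system` field name is an assumption, and the exact notebook logic may differ):

```python
import re

def keep(row):
    # 1) Drop rows that carry a system prompt (field name assumed).
    if row.get("system"):
        return False
    # 2) Drop rows whose output repeats any single character 10+ times in a row.
    #    Note: this also matches 10 consecutive indentation spaces in valid code,
    #    which is the flaw acknowledged in the later edit below.
    if re.search(r"(.)\1{9}", row["output"]):
        return False
    return True
```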
**Later edit**: after a more in-depth analysis of the dataset, I noticed that:
1) The imported subset is `test_sft`, but this is the 2nd chunk of the top 1k records. The first one is in the `train_sft` subset.
2) Valid code records that contained 10 repeated spaces for indentation were also eliminated. | Mihaiii/OpenHermes-2.5-1k-longest-curated | [
"region:us"
] | 2024-02-17T09:52:59+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4176433, "num_examples": 519}], "download_size": 1835764, "dataset_size": 4176433}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T12:36:56+00:00 | [] | [] | TAGS
#region-us
|
This is a dataset that was created from HuggingFaceH4/OpenHermes-2.5-1k-longest.
The purpose is to be able to use it in an axolotl config by adding:
I eliminated rows that:
1) Had a system prompt (only 3 rows eliminated)
2) Contained an output with a character repeated 10 times in a row (478 rows eliminated)
So from a 1,000-row dataset, I ended up with a 519-row dataset.
See the URL notebook for details on how the dataset was constructed.
Later edit: after a more in-depth analysis of the dataset, I noticed that:
1) The imported subset is 'test_sft', but this is the 2nd chunk of the top 1k records. The first one is in the 'train_sft' subset.
2) Valid code records that contained 10 repeated spaces for indentation were also eliminated. | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
dab2692ca42a3ec59762114f7719d046296edbb9 |
# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-truthyDPO-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-truthyDPO-1](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-truthyDPO-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T10:15:19.081933](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1/blob/main/results_2024-02-17T10-15-19.081933.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6304857303424044,
"acc_stderr": 0.03249871750345685,
"acc_norm": 0.635809758335016,
"acc_norm_stderr": 0.033161164304669005,
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.7112471524136447,
"mc2_stderr": 0.014852535681165156
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168478,
"acc_norm": 0.6732081911262798,
"acc_norm_stderr": 0.01370666597558733
},
"harness|hellaswag|10": {
"acc": 0.6648078072097192,
"acc_stderr": 0.004710928569985755,
"acc_norm": 0.8598884684325832,
"acc_norm_stderr": 0.003463933286063884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887044,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887044
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073406,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073406
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.7112471524136447,
"mc2_stderr": 0.014852535681165156
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.3707354056103108,
"acc_stderr": 0.01330426770545843
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1 | [
"region:us"
] | 2024-02-17T10:17:35+00:00 | {"pretty_name": "Evaluation run of freeCS-dot-org/OpenAGI-testing-truthyDPO-1", "dataset_summary": "Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-truthyDPO-1](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-truthyDPO-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T10:15:19.081933](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1/blob/main/results_2024-02-17T10-15-19.081933.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304857303424044,\n \"acc_stderr\": 0.03249871750345685,\n \"acc_norm\": 0.635809758335016,\n \"acc_norm_stderr\": 0.033161164304669005,\n \"mc1\": 0.5361077111383109,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.7112471524136447,\n \"mc2_stderr\": 0.014852535681165156\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168478,\n \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.01370666597558733\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6648078072097192,\n \"acc_stderr\": 0.004710928569985755,\n \"acc_norm\": 0.8598884684325832,\n \"acc_norm_stderr\": 0.003463933286063884\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887044,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887044\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073406,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073406\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n 
\"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5361077111383109,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.7112471524136447,\n \"mc2_stderr\": 0.014852535681165156\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3707354056103108,\n \"acc_stderr\": 0.01330426770545843\n }\n}\n```", "repo_url": 
"https://huggingface.co/freeCS-dot-org/OpenAGI-testing-truthyDPO-1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|winogrande|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T10_15_19.081933", "path": ["results_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T10-15-19.081933.parquet"]}]}]} | 2024-02-17T10:17:55+00:00 | [] | [] | TAGS
e620e4336aff109feb588cc3b3c2a41d8617d196 | # What Is This
This is the raw data of the datasets used to train the models in [RVC-Models](https://huggingface.co/kuwacom/RVC-Models).

When using it, please convert the data into a format suited to your training environment.
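For example, if the training pipeline expects mono WAV files at a fixed sample rate, a small Python sketch like the one below could do the conversion. The 40 kHz target and the `raw`/`converted` directory names are placeholders, not part of this dataset; use whatever your RVC setup expects.

```python
import librosa
import soundfile as sf
from pathlib import Path

TARGET_SR = 40_000  # placeholder; match your training environment
out_dir = Path("converted")
out_dir.mkdir(exist_ok=True)

for src in Path("raw").glob("*.wav"):
    # Load as mono and resample to the target rate.
    audio, _ = librosa.load(src, sr=TARGET_SR, mono=True)
    sf.write(out_dir / src.name, audio, TARGET_SR)
```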
# About Dataset
## yukkuri
A video of about 100 minutes created in Yukkuri Movie Maker 4 from Japanese text generated using ChatGPT. | kuwacom/Character-Dataset | [
"license:mit",
"region:us"
] | 2024-02-17T10:25:03+00:00 | {"license": "mit", "pretty_name": "d"} | 2024-02-17T13:14:04+00:00 | [] | [] | TAGS
#license-mit #region-us
| # What Is This
This is the raw data of the datasets used to train the models in RVC-Models.

When using it, please convert the data into a format suited to your training environment.
# About Dataset
## yukkuri
A video of about 100 minutes created in Yukkuri Movie Maker 4 from Japanese text generated using ChatGPT. | [
"# What Is This\nこれは RVC-Models のモデルの学習に利用したデータセットの生データです。\n\n利用する際は学習環境にあったデータに変換して利用してください。",
"# About Dataset",
"## yukkuri\nゆっくりムービーメーカー4でChatGPTを利用して生成した日本語の文章を約100分作成した動画です。"
] | [
"TAGS\n#license-mit #region-us \n",
"# What Is This\nこれは RVC-Models のモデルの学習に利用したデータセットの生データです。\n\n利用する際は学習環境にあったデータに変換して利用してください。",
"# About Dataset",
"## yukkuri\nゆっくりムービーメーカー4でChatGPTを利用して生成した日本語の文章を約100分作成した動画です。"
] | [
11,
40,
4,
29
] | [
"passage: TAGS\n#license-mit #region-us \n# What Is This\nこれは RVC-Models のモデルの学習に利用したデータセットの生データです。\n\n利用する際は学習環境にあったデータに変換して利用してください。# About Dataset## yukkuri\nゆっくりムービーメーカー4でChatGPTを利用して生成した日本語の文章を約100分作成した動画です。"
] |
977c295328baa4ac59b4409a3c43771d65b219e2 |
# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-intelDPO-2](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-intelDPO-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2",
"harness_winogrande_5",
split="train")
```
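The aggregated metrics referenced above live in the separate "results" configuration. As a minimal sketch (assuming that configuration exposes the same "latest" split as the task configurations in this repo's metadata), it can be loaded the same way:

```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2",
    "results",
    split="latest",
)
```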
## Latest results
These are the [latest results from run 2024-02-17T10:25:04.402687](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2/blob/main/results_2024-02-17T10-25-04.402687.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6269237223761134,
"acc_stderr": 0.03285871940715557,
"acc_norm": 0.6302370765791581,
"acc_norm_stderr": 0.03352050314315757,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5828319150839333,
"mc2_stderr": 0.015305820815113032
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.01437035863247244,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844463
},
"harness|hellaswag|10": {
"acc": 0.6442939653455487,
"acc_stderr": 0.004777483159634023,
"acc_norm": 0.8463453495319657,
"acc_norm_stderr": 0.00359880385546063
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.02582210611941589,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.02582210611941589
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203624,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069723,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069723
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5828319150839333,
"mc2_stderr": 0.015305820815113032
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223178
},
"harness|gsm8k|5": {
"acc": 0.5094768764215315,
"acc_stderr": 0.01377001065116882
}
}
```
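As an illustrative aside (not part of the evaluation harness), the per-task entries above can be re-aggregated by hand. The sketch below macro-averages `acc` over the `hendrycksTest` (MMLU) subtasks, weighting each subtask equally; applied to the full dictionary shown above it yields an MMLU-style aggregate:

```python
def mmlu_macro_average(results: dict) -> float:
    """Equal-weight average of `acc` over the MMLU (hendrycksTest) subtasks."""
    accs = [metrics["acc"] for name, metrics in results.items()
            if name.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

# Tiny self-contained example using two entries copied verbatim from the JSON above.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5555555555555556},
}
print(mmlu_macro_average(sample))  # 0.4577777777777778
```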
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2 | [
"region:us"
] | 2024-02-17T10:27:23+00:00 | {"pretty_name": "Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-intelDPO-2](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-intelDPO-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T10:25:04.402687](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2/blob/main/results_2024-02-17T10-25-04.402687.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6269237223761134,\n \"acc_stderr\": 0.03285871940715557,\n \"acc_norm\": 0.6302370765791581,\n \"acc_norm_stderr\": 0.03352050314315757,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5828319150839333,\n \"mc2_stderr\": 0.015305820815113032\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.01437035863247244,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844463\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6442939653455487,\n \"acc_stderr\": 0.004777483159634023,\n \"acc_norm\": 0.8463453495319657,\n \"acc_norm_stderr\": 0.00359880385546063\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.02582210611941589,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.02582210611941589\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n 
\"acc_norm_stderr\": 0.026148483469153314\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203624,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294407,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294407\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069723,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069723\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5828319150839333,\n \"mc2_stderr\": 0.015305820815113032\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223178\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5094768764215315,\n \"acc_stderr\": 
0.01377001065116882\n }\n}\n```", "repo_url": "https://huggingface.co/freeCS-dot-org/OpenAGI-testing-intelDPO-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|winogrande|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T10_25_04.402687", "path": ["results_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T10-25-04.402687.parquet"]}]}]} | 2024-02-17T10:27:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2
Dataset automatically created during the evaluation run of model freeCS-dot-org/OpenAGI-testing-intelDPO-2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
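A minimal sketch (the details repository id below is inferred from the model name, following the leaderboard's usual `details_<org>__<model>` naming convention, so treat it as an assumption rather than a confirmed path; the `harness_winogrande_5` configuration name does appear in this repo's configuration list):

```python
from datasets import load_dataset

# Repo id assumed from the standard leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2",
	"harness_winogrande_5",
	split="train")
```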
## Latest results
These are the latest results from run 2024-02-17T10:25:04.402687 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2\n\n\n\nDataset automatically created during the evaluation run of model freeCS-dot-org/OpenAGI-testing-intelDPO-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T10:25:04.402687(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2\n\n\n\nDataset automatically created during the evaluation run of model freeCS-dot-org/OpenAGI-testing-intelDPO-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T10:25:04.402687(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2\n\n\n\nDataset automatically created during the evaluation run of model freeCS-dot-org/OpenAGI-testing-intelDPO-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T10:25:04.402687(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
f517da1233a70b6127468ede8f1d000adbd58206 |
# phrase-ticker Dataset
## Description
The Phrase Ticker Dataset enables the extraction of stock ticker symbols from natural language queries. The dataset pairs natural language utterances commonly associated with S&P 500 companies with their corresponding ticker symbols, providing a simple resource for understanding how companies are referred to in various contexts.
## Structure
The dataset comprises two columns (a couple of illustrative rows follow the list):
- `phrase`: This column contains natural language phrases that reference or describe companies in ways that are commonly used in financial news, reports, and discussions. These include not only formal company names and products but also informal and colloquial references.
- `ticker`: Each phrase is associated with a unique stock ticker symbol, identifying the company mentioned or described in the phrase.
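
As a rough illustration of the pairing, rows might look like the following (these examples are invented for this card, not necessarily verbatim entries from the dataset):

```python
# Hypothetical rows, for illustration only:
sample_rows = [
    {"phrase": "the iPhone maker", "ticker": "AAPL"},
    {"phrase": "Jeff Bezos' e-commerce giant", "ticker": "AMZN"},
]
```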
## Primary Use Case
**Ticker Extraction from Natural Language Queries**: The main application of this dataset is to train models that can accurately identify and extract stock ticker symbols from text. This capability is crucial for automating the analysis of financial news, social media mentions, analyst reports, and any textual content where companies are discussed without directly mentioning their ticker symbols.
## Getting Started
To begin working with the phrase-ticker Dataset in your projects, you can load it using the Hugging Face `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("rohanmahen/phrase-ticker")
```
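
Once loaded, one deliberately naive way to use the data is as a lookup table for substring matching. The sketch below assumes a `train` split and the `phrase`/`ticker` columns described above; it is an illustration, not a recommended extraction model:

```python
from datasets import load_dataset

# Assumes the dataset exposes a "train" split with "phrase" and "ticker" columns.
dataset = load_dataset("rohanmahen/phrase-ticker", split="train")

# Build a lowercase phrase -> ticker lookup table.
phrase_to_ticker = {row["phrase"].lower(): row["ticker"] for row in dataset}

def extract_tickers(text: str) -> set:
    """Return tickers whose known phrases appear verbatim in the text (naive substring match)."""
    lowered = text.lower()
    return {ticker for phrase, ticker in phrase_to_ticker.items() if phrase in lowered}
```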
## Contributions
Contributions to the phrase-ticker Dataset are welcome, including the addition of new phrases, refinement of existing data, and suggestions for improvement. Please check out the repository on [GitHub](https://github.com/rohanmahen/phrase-ticker) for more info.
| rohanmahen/phrase-ticker | [
"license:mit",
"region:us"
] | 2024-02-17T12:04:15+00:00 | {"license": "mit"} | 2024-02-17T12:55:53+00:00 | [] | [] | TAGS
#license-mit #region-us
|
# phrase-ticker Dataset
## Description
The Phrase Ticker Dataset enables the extraction of stock ticker symbols from natural language queries. The dataset pairs NL utterances commonly associated with S&P 500 companies with their corresponding ticker symbols, providing a simple resource for understanding how companies are referred to in various contexts.
## Structure
The dataset comprises two columns:
- 'phrase': This column contains natural language phrases that reference or describe companies in ways that are commonly used in financial news, reports, and discussions. These include not only formal company names and products but also informal and colloquial references.
- 'ticker': Each phrase is associated with a unique stock ticker symbol, identifying the company mentioned or described in the phrase.
## Primary Use Case
Ticker Extraction from Natural Language Queries: The main application of this dataset is to train models that can accurately identify and extract stock ticker symbols from text. This capability is crucial for automating the analysis of financial news, social media mentions, analyst reports, and any textual content where companies are discussed without directly mentioning their ticker symbols.
## Getting Started
To begin working with the phrase-ticker Dataset in your projects, you can load it using the Hugging Face 'datasets' library:
## Contributions
Contributions to the phrase-ticker Dataset are welcomed, including the addition of new phrases, refinement of existing data, and suggestions for improvement. Please checkout the repository on github for more info.
| [
"# phrase-ticker Dataset",
"## Description\n\nThe Phrase Ticker Dataset enables the extraction of stock ticker symbols from natural language queries. The dataset pairs NL utterances commonly associated with S&P 500 companies with their corresponding ticker symbols, providing a simple resource for understanding how companies are referred to in various contexts.",
"## Structure\n\nThe dataset comprises two columns:\n\n- 'phrase': This column contains natural language phrases that reference or describe companies in ways that are commonly used in financial news, reports, and discussions. These include not only formal company names and products but also informal and colloquial references.\n- 'ticker': Each phrase is associated with a unique stock ticker symbol, identifying the company mentioned or described in the phrase.",
"## Primary Use Case\n\nTicker Extraction from Natural Language Queries: The main application of this dataset is to train models that can accurately identify and extract stock ticker symbols from text. This capability is crucial for automating the analysis of financial news, social media mentions, analyst reports, and any textual content where companies are discussed without directly mentioning their ticker symbols.",
"## Getting Started\n\nTo begin working with the phrase-ticker Dataset in your projects, you can load it using the Hugging Face 'datasets' library:",
"## Contributions\n\nContributions to the phrase-ticker Dataset are welcomed, including the addition of new phrases, refinement of existing data, and suggestions for improvement. Please checkout the repository on github for more info."
] | [
"TAGS\n#license-mit #region-us \n",
"# phrase-ticker Dataset",
"## Description\n\nThe Phrase Ticker Dataset enables the extraction of stock ticker symbols from natural language queries. The dataset pairs NL utterances commonly associated with S&P 500 companies with their corresponding ticker symbols, providing a simple resource for understanding how companies are referred to in various contexts.",
"## Structure\n\nThe dataset comprises two columns:\n\n- 'phrase': This column contains natural language phrases that reference or describe companies in ways that are commonly used in financial news, reports, and discussions. These include not only formal company names and products but also informal and colloquial references.\n- 'ticker': Each phrase is associated with a unique stock ticker symbol, identifying the company mentioned or described in the phrase.",
"## Primary Use Case\n\nTicker Extraction from Natural Language Queries: The main application of this dataset is to train models that can accurately identify and extract stock ticker symbols from text. This capability is crucial for automating the analysis of financial news, social media mentions, analyst reports, and any textual content where companies are discussed without directly mentioning their ticker symbols.",
"## Getting Started\n\nTo begin working with the phrase-ticker Dataset in your projects, you can load it using the Hugging Face 'datasets' library:",
"## Contributions\n\nContributions to the phrase-ticker Dataset are welcomed, including the addition of new phrases, refinement of existing data, and suggestions for improvement. Please checkout the repository on github for more info."
] | [
11,
7,
71,
99,
84,
36,
53
] | [
"passage: TAGS\n#license-mit #region-us \n# phrase-ticker Dataset## Description\n\nThe Phrase Ticker Dataset enables the extraction of stock ticker symbols from natural language queries. The dataset pairs NL utterances commonly associated with S&P 500 companies with their corresponding ticker symbols, providing a simple resource for understanding how companies are referred to in various contexts.## Structure\n\nThe dataset comprises two columns:\n\n- 'phrase': This column contains natural language phrases that reference or describe companies in ways that are commonly used in financial news, reports, and discussions. These include not only formal company names and products but also informal and colloquial references.\n- 'ticker': Each phrase is associated with a unique stock ticker symbol, identifying the company mentioned or described in the phrase.## Primary Use Case\n\nTicker Extraction from Natural Language Queries: The main application of this dataset is to train models that can accurately identify and extract stock ticker symbols from text. This capability is crucial for automating the analysis of financial news, social media mentions, analyst reports, and any textual content where companies are discussed without directly mentioning their ticker symbols.## Getting Started\n\nTo begin working with the phrase-ticker Dataset in your projects, you can load it using the Hugging Face 'datasets' library:## Contributions\n\nContributions to the phrase-ticker Dataset are welcomed, including the addition of new phrases, refinement of existing data, and suggestions for improvement. Please checkout the repository on github for more info."
] |
2a3b3406e9ef2fe616c0cd1ccf1d47193f0fbeb4 |
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64e7616c7df33432812e3c57/z-Rf2b0rfvg02unmYAbfk.png) | PotatoOff/Milo | [
"task_categories:text-generation",
"task_categories:question-answering",
"language:en",
"region:us"
] | 2024-02-17T12:15:22+00:00 | {"language": ["en"], "task_categories": ["text-generation", "question-answering"]} | 2024-02-17T12:15:56+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-question-answering #language-English #region-us
|
!image/png | [] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #language-English #region-us \n"
] | [
33
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-question-answering #language-English #region-us \n"
] |
09e0ccbbdd7909573be28f8131fd037e9a3f11e4 |
# Dataset Card for Evaluation run of cloudyu/Mixtral_13B_Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_13B_Chat](https://huggingface.co/cloudyu/Mixtral_13B_Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat",
"harness_winogrande_5",
split="train")
```
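
The aggregated metrics can be pulled the same way via the "results" configuration; a minimal sketch (the config and split names below are taken from the configuration list in this repo's metadata):

```python
from datasets import load_dataset

# "results" config with a "latest" split, per this dataset's configuration list.
results = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat",
	"results",
	split="latest")
```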
## Latest results
These are the [latest results from run 2024-02-17T13:01:58.551979](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat/blob/main/results_2024-02-17T13-01-58.551979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6471647618048562,
"acc_stderr": 0.03218980683733778,
"acc_norm": 0.6495327471727932,
"acc_norm_stderr": 0.032835191770398446,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5897994402086952,
"mc2_stderr": 0.015625316517181305
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.01399057113791876,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693247
},
"harness|hellaswag|10": {
"acc": 0.6725751842262497,
"acc_stderr": 0.004683146373232271,
"acc_norm": 0.8586934873531169,
"acc_norm_stderr": 0.0034762555096445303
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5897994402086952,
"mc2_stderr": 0.015625316517181305
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218324
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064685
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat | [
"region:us"
] | 2024-02-17T13:04:13+00:00 | {"pretty_name": "Evaluation run of cloudyu/Mixtral_13B_Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_13B_Chat](https://huggingface.co/cloudyu/Mixtral_13B_Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T13:01:58.551979](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat/blob/main/results_2024-02-17T13-01-58.551979.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6471647618048562,\n \"acc_stderr\": 0.03218980683733778,\n \"acc_norm\": 0.6495327471727932,\n \"acc_norm_stderr\": 0.032835191770398446,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5897994402086952,\n \"mc2_stderr\": 0.015625316517181305\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.01399057113791876,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693247\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6725751842262497,\n \"acc_stderr\": 0.004683146373232271,\n \"acc_norm\": 0.8586934873531169,\n \"acc_norm_stderr\": 0.0034762555096445303\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 
0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829193,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829193\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n 
\"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5897994402086952,\n \"mc2_stderr\": 0.015625316517181305\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \"acc_stderr\": 0.013650728047064685\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Mixtral_13B_Chat", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|arc:challenge|25_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|gsm8k|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hellaswag|10_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|winogrande|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["results_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T13-01-58.551979.parquet"]}]}]} | 2024-02-17T13:04:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Mixtral_13B_Chat
Dataset automatically created during the evaluation run of model cloudyu/Mixtral_13B_Chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
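A minimal sketch (the details repository name is assumed to follow the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` pattern; the config name comes from this card's metadata):

```python
from datasets import load_dataset

# repository name assumed from the leaderboard's naming convention
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat",
	"harness_winogrande_5",
	split="train")
```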
## Latest results
These are the latest results from run 2024-02-17T13:01:58.551979 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
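The aggregated numbers can be pulled from the "results" config listed in this card's metadata (a sketch; repository name assumed as above):

```python
from datasets import load_dataset

# "results" stores the aggregated scores; the "latest" split points to the newest run
results = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat",
	"results",
	split="latest")
print(results[0])
```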
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Mixtral_13B_Chat\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_13B_Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T13:01:58.551979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Mixtral_13B_Chat\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_13B_Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T13:01:58.551979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cloudyu/Mixtral_13B_Chat\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_13B_Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T13:01:58.551979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
974623f6718e1fdbd75f7c83b2f858cf85b65d5e | # Dataset Card for "RefGPT-Reason"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Mutonix/RefGPT-Reason | [
"region:us"
] | 2024-02-17T13:31:17+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "best_answer", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 351127497, "num_examples": 288285}], "download_size": 210897840, "dataset_size": 351127497}} | 2024-02-17T13:32:56+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "RefGPT-Reason"
More Information needed | [
"# Dataset Card for \"RefGPT-Reason\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"RefGPT-Reason\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"RefGPT-Reason\"\n\nMore Information needed"
] |
7c73af7e30499b0ff3eefbf8b88eca92fd1a2960 | # Dataset Card for "vikwiki-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | boapps/vikwiki-qa | [
"region:us"
] | 2024-02-17T13:44:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "correct_answer", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1267180.5238698774, "num_examples": 5325}, {"name": "test", "num_bytes": 422631.4761301225, "num_examples": 1776}], "download_size": 983857, "dataset_size": 1689812.0}} | 2024-02-17T13:44:20+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vikwiki-qa"
More Information needed | [
"# Dataset Card for \"vikwiki-qa\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vikwiki-qa\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"vikwiki-qa\"\n\nMore Information needed"
] |
bfaf34347c4f74efe93807194962edc9ed4c5156 |
# Dataset of conte_di_cavour/コンテ·ディ·カブール/加富尔伯爵 (Azur Lane)
This is the dataset of conte_di_cavour/コンテ·ディ·カブール/加富尔伯爵 (Azur Lane), containing 9 images and their tags.
The core tags of this character are `grey_hair, yellow_eyes, short_hair, bangs, beret, hat, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 9 | 6.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 9 | 5.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 13 | 7.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 9 | 6.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 13 | 8.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/conte_di_cavour_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
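For instance, a small sketch that mines one cluster, building on the loader above (the tag name comes from the cluster tables below; membership testing on `item.meta['tags']` is an assumption based on the print statement in the loading example):

```python
from waifuc.source import LocalSource

# reuse the directory extracted in the loading example above
source = LocalSource('dataset_dir')

# keep only items whose tags mention 'cape' (a tag from cluster 0 below)
cape_items = [item for item in source if 'cape' in item.meta['tags']]
for item in cape_items:
    print(item.meta['filename'], item.image.size)
```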
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, weapon, black_thighhighs, closed_mouth, holding, looking_at_viewer, white_gloves, cape, full_body, simple_background, water, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | weapon | black_thighhighs | closed_mouth | holding | looking_at_viewer | white_gloves | cape | full_body | simple_background | water | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:-------------------|:---------------|:----------|:--------------------|:---------------|:-------|:------------|:--------------------|:--------|:-------------------|
| 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/conte_di_cavour_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T13:50:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T14:04:47+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of conte\_di\_cavour/コンテ·ディ·カブール/加富尔伯爵 (Azur Lane)
==========================================================
This is the dataset of conte\_di\_cavour/コンテ·ディ·カブール/加富尔伯爵 (Azur Lane), containing 9 images and their tags.
The core tags of this character are 'grey\_hair, yellow\_eyes, short\_hair, bangs, beret, hat, black\_headwear', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
44,
61,
5,
4
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
4db4eedcf9ae6fc375c9199c93fe38a73b9481d2 | # Dataset Card for "ytrends5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | firstgradeai/ytrends5 | [
"region:us"
] | 2024-02-17T14:02:11+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13395866.378974993, "num_examples": 9069}, {"name": "test", "num_bytes": 5741507.621025008, "num_examples": 3887}], "download_size": 9624751, "dataset_size": 19137374.0}} | 2024-02-17T14:05:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ytrends5"
More Information needed | [
"# Dataset Card for \"ytrends5\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ytrends5\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ytrends5\"\n\nMore Information needed"
] |
ce6e3ed267388f8725bff3b07a7bb18ec475276f |
# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-50k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-sft-50k](https://huggingface.co/DatPySci/pythia-1b-sft-50k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T14:41:59.887810](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k/blob/main/results_2024-02-17T14-41-59.887810.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24467360878008643,
"acc_stderr": 0.03024534477539282,
"acc_norm": 0.24555697815658845,
"acc_norm_stderr": 0.030978837434188194,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.01450904517148729,
"mc2": 0.37005968856579075,
"mc2_stderr": 0.014337009699291485
},
"harness|arc:challenge|25": {
"acc": 0.27986348122866894,
"acc_stderr": 0.01311904089772592,
"acc_norm": 0.3003412969283277,
"acc_norm_stderr": 0.013395909309957009
},
"harness|hellaswag|10": {
"acc": 0.3906592312288389,
"acc_stderr": 0.00486901015228075,
"acc_norm": 0.4910376419040032,
"acc_norm_stderr": 0.004988979750014438
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123394,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123394
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.02749566368372407,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.02749566368372407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02767845257821238,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02767845257821238
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187504,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187504
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.0333333333333333,
"acc_norm": 0.2,
"acc_norm_stderr": 0.0333333333333333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055952,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055952
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1717171717171717,
"acc_stderr": 0.026869716187429917,
"acc_norm": 0.1717171717171717,
"acc_norm_stderr": 0.026869716187429917
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877795,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877795
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127244,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127244
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567978,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567978
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30275229357798167,
"acc_stderr": 0.019698711434756343,
"acc_norm": 0.30275229357798167,
"acc_norm_stderr": 0.019698711434756343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437295,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.20098039215686275,
"acc_stderr": 0.028125972265654355,
"acc_norm": 0.20098039215686275,
"acc_norm_stderr": 0.028125972265654355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035296,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.015866243073215037,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.015866243073215037
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432403,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432403
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.028888193103988644,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.028888193103988644
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.018185218954318086,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.018185218954318086
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.19883040935672514,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.19883040935672514,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.01450904517148729,
"mc2": 0.37005968856579075,
"mc2_stderr": 0.014337009699291485
},
"harness|winogrande|5": {
"acc": 0.5406471981057617,
"acc_stderr": 0.014005973823825136
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.003605486867998255
}
}
```
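The `acc_stderr` fields above make it straightforward to attach rough uncertainty bands to each score. A minimal sketch (the two entries are copied verbatim from the JSON block above; the 1.96 multiplier assumes a normal approximation for a 95% interval):

```python
# Compute approximate 95% confidence intervals from the reported
# accuracies and standard errors. Values copied from the results above.
results = {
    "harness|winogrande|5": {"acc": 0.5406471981057617, "acc_stderr": 0.014005973823825136},
    "harness|gsm8k|5": {"acc": 0.017437452615617893, "acc_stderr": 0.003605486867998255},
}

for task, metrics in results.items():
    acc, se = metrics["acc"], metrics["acc_stderr"]
    # Normal-approximation interval: acc +/- 1.96 * stderr.
    print(f"{task}: acc={acc:.4f}, 95% CI [{acc - 1.96 * se:.4f}, {acc + 1.96 * se:.4f}]")
```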
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
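One direct use consistent with this card is loading the per-task evaluation details for inspection. A minimal sketch, assuming the configuration and split names listed in this repo's metadata (`harness_winogrande_5` is one of the listed configurations, and `latest` aliases the newest timestamped run):

```python
from datasets import load_dataset

# Fetch the most recent run's per-example details for one benchmark task.
details = load_dataset(
    "open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k",
    "harness_winogrande_5",
    split="latest",
)
print(details)
```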
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
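Pending a full description, the configuration layout can be inspected programmatically. A minimal sketch, assuming the per-task configurations and timestamped splits listed in this repo's metadata:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k"

# One configuration per evaluated task, plus an aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each config carries one split per run timestamp plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_gsm8k_5"))
```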
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k | [
"region:us"
] | 2024-02-17T14:29:57+00:00 | {"pretty_name": "Evaluation run of DatPySci/pythia-1b-sft-50k", "dataset_summary": "Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-sft-50k](https://huggingface.co/DatPySci/pythia-1b-sft-50k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T14:41:59.887810](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k/blob/main/results_2024-02-17T14-41-59.887810.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24467360878008643,\n \"acc_stderr\": 0.03024534477539282,\n \"acc_norm\": 0.24555697815658845,\n \"acc_norm_stderr\": 0.030978837434188194,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.01450904517148729,\n \"mc2\": 0.37005968856579075,\n \"mc2_stderr\": 0.014337009699291485\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.27986348122866894,\n \"acc_stderr\": 0.01311904089772592,\n \"acc_norm\": 0.3003412969283277,\n \"acc_norm_stderr\": 0.013395909309957009\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3906592312288389,\n \"acc_stderr\": 0.00486901015228075,\n \"acc_norm\": 0.4910376419040032,\n \"acc_norm_stderr\": 0.004988979750014438\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123394,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123394\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.02749566368372407,\n \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.02749566368372407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 
0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02767845257821238,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02767845257821238\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.038924311065187504,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.038924311065187504\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055952,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055952\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.1717171717171717,\n \"acc_stderr\": 0.026869716187429917,\n \"acc_norm\": 0.1717171717171717,\n \"acc_norm_stderr\": 0.026869716187429917\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877795,\n \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877795\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.2230769230769231,\n \"acc_stderr\": 0.021107730127244,\n \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127244\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882385,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882385\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567978,\n \"acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567978\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.30275229357798167,\n \"acc_stderr\": 0.019698711434756343,\n \"acc_norm\": 0.30275229357798167,\n \"acc_norm_stderr\": 0.019698711434756343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437295,\n \"acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.20098039215686275,\n \"acc_stderr\": 0.028125972265654355,\n \"acc_norm\": 0.20098039215686275,\n \"acc_norm_stderr\": 0.028125972265654355\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035296,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934724,\n \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934724\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n \"acc_stderr\": 0.015866243073215037,\n \"acc_norm\": 
0.26947637292464877,\n \"acc_norm_stderr\": 0.015866243073215037\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888156,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432403,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432403\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.028888193103988644,\n \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.028888193103988644\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.018185218954318086,\n \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.018185218954318086\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.19883040935672514,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.19883040935672514,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.01450904517148729,\n \"mc2\": 0.37005968856579075,\n \"mc2_stderr\": 0.014337009699291485\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5406471981057617,\n \"acc_stderr\": 0.014005973823825136\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 0.003605486867998255\n }\n}\n```", "repo_url": 
"https://huggingface.co/DatPySci/pythia-1b-sft-50k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-13.386969.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-13.386969.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-30.118401.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": 
"2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|winogrande|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|winogrande|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|winogrande|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["results_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": 
["results_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["results_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T14-41-59.887810.parquet"]}]}]} | 2024-02-17T14:43:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-50k
Dataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-50k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
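A minimal sketch of such a load call, assuming the leaderboard's usual "details_<org>__<model>" repository naming for this model (the repository id below is an assumption; the `harness_winogrande_5` configuration is listed in this card's metadata):

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this evaluation run.
# NOTE: the repository id follows the leaderboard's usual
# "details_<org>__<model>" pattern and is assumed here.
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k",
	"harness_winogrande_5",
	split="train")
```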
## Latest results
These are the latest results from run 2024-02-17T14:41:59.887810 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-50k\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-50k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T14:41:59.887810(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-50k\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-50k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T14:41:59.887810(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-50k\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-50k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T14:41:59.887810(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
6711d2f2a9c7d585e61ecee5138c3252d3367fb1 |
## Sources of the corpus used
- [Opus](https://opus.nlpl.eu/results/en&eu/corpus-result-table)
- [Orai](https://www.orai.eus/en/resources)
| itzune/basque-parallel-corpus | [
"task_categories:translation",
"language:eu",
"language:en",
"language:es",
"language:fr",
"region:us"
] | 2024-02-17T14:36:43+00:00 | {"language": ["eu", "en", "es", "fr"], "task_categories": ["translation"], "pretty_name": "Parallel Basque Corpus"} | 2024-02-17T14:52:38+00:00 | [] | [
"eu",
"en",
"es",
"fr"
] | TAGS
#task_categories-translation #language-Basque #language-English #language-Spanish #language-French #region-us
|
## Sources of the corpus used
- Opus
- Orai
| [
"## Sources of the corpus used\n\n- Opus\n- Orai"
] | [
"TAGS\n#task_categories-translation #language-Basque #language-English #language-Spanish #language-French #region-us \n",
"## Sources of the corpus used\n\n- Opus\n- Orai"
] | [
35,
13
] | [
"passage: TAGS\n#task_categories-translation #language-Basque #language-English #language-Spanish #language-French #region-us \n## Sources of the corpus used\n\n- Opus\n- Orai"
] |
b4020ab3a4b832f42e8b226e9be64efb7f3b8f43 |
# Dataset Card for Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [athirdpath/Orca-2-13b-Alpaca-Uncensored](https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this evaluation run;
# the "train" split points to the latest results (see note above).
data = load_dataset("open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored",
	"harness_winogrande_5",
	split="train")
```
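If you want the aggregated numbers rather than per-example details, the "results" configuration described above can be loaded the same way; this is a sketch assuming the "latest" split naming used throughout this card:

```python
# Aggregated metrics for the most recent run (the "results" config
# and "latest" split follow the naming convention described above).
results = load_dataset("open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored",
	"results",
	split="latest")
```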
## Latest results
These are the [latest results from run 2024-02-17T14:58:45.053086](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored/blob/main/results_2024-02-17T14-58-45.053086.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6002703838763287,
"acc_stderr": 0.032941926494755115,
"acc_norm": 0.6047089287996544,
"acc_norm_stderr": 0.03361522848210774,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5358987502340753,
"mc2_stderr": 0.01565060464007792
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6100378410675165,
"acc_stderr": 0.004867445945277156,
"acc_norm": 0.792670782712607,
"acc_norm_stderr": 0.004045648954769832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.036117805602848975,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.036117805602848975
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764805,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764805
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630642,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630642
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710855,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710855
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153186,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153186
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900933,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900933
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567657,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235402,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235402
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5358987502340753,
"mc2_stderr": 0.01565060464007792
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902545
},
"harness|gsm8k|5": {
"acc": 0.38286580742987114,
"acc_stderr": 0.013389223491820465
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored | [
"region:us"
] | 2024-02-17T15:01:02+00:00 | {"pretty_name": "Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [athirdpath/Orca-2-13b-Alpaca-Uncensored](https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T14:58:45.053086](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored/blob/main/results_2024-02-17T14-58-45.053086.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6002703838763287,\n \"acc_stderr\": 0.032941926494755115,\n \"acc_norm\": 0.6047089287996544,\n \"acc_norm_stderr\": 0.03361522848210774,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5358987502340753,\n \"mc2_stderr\": 0.01565060464007792\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6100378410675165,\n \"acc_stderr\": 0.004867445945277156,\n \"acc_norm\": 0.792670782712607,\n \"acc_norm_stderr\": 0.004045648954769832\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.036117805602848975,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.036117805602848975\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764805,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764805\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630642,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630642\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710855,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710855\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153186,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153186\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900933,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900933\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n \"acc_stderr\": 0.012671902782567657,\n \"acc_norm\": 0.4380704041720991,\n \"acc_norm_stderr\": 0.012671902782567657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235402,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235402\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5358987502340753,\n \"mc2_stderr\": 0.01565060464007792\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902545\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38286580742987114,\n \"acc_stderr\": 0.013389223491820465\n 
}\n}\n```", "repo_url": "https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|winogrande|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T14_58_45.053086", "path": ["results_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T14-58-45.053086.parquet"]}]}]} | 2024-02-17T15:01:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored
Dataset automatically created during the evaluation run of model athirdpath/Orca-2-13b-Alpaca-Uncensored on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
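For instance (a minimal sketch — the repository id below follows the leaderboard's usual `details_<org>__<model>` naming convention and the config name is taken from the list in this card, so adjust both if they differ):

```python
from datasets import load_dataset

# Load the details of one task from this evaluation run. The repository id is
# assumed to follow the leaderboard's "details_<org>__<model>" convention;
# "harness_winogrande_5" is one of the configurations listed in this card.
data = load_dataset(
    "open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored",
    "harness_winogrande_5",
    split="train",
)
```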
## Latest results
These are the latest results from run 2024-02-17T14:58:45.053086 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored\n\n\n\nDataset automatically created during the evaluation run of model athirdpath/Orca-2-13b-Alpaca-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T14:58:45.053086(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored\n\n\n\nDataset automatically created during the evaluation run of model athirdpath/Orca-2-13b-Alpaca-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T14:58:45.053086(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored\n\n\n\nDataset automatically created during the evaluation run of model athirdpath/Orca-2-13b-Alpaca-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T14:58:45.053086(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
33a5335cb3c30d270cb984f980894c95bbe1fc73 | 1. edjudgement.zip consists of the originally scraped PDF files
2. pdf_to_text_raw.csv is processed data (stage 1, non-refined) from the scraped PDFs | izardy/malaysia-ejudgement | [
"region:us"
] | 2024-02-17T15:28:45+00:00 | {} | 2024-02-17T16:01:56+00:00 | [] | [] | TAGS
#region-us
| 1. URL consists of the originally scraped PDF files
2. pdf_to_text_raw.csv is processed data (stage 1, non-refined) from the scraped PDFs | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
4570667d64bcd1741501c700899b11f321198570 | # Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)
## Project Overview
The Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.
## Background and Rationale
Durham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.
## Data Sources and Methodology
### Data Sources
We will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:
- `GS_TreeInventory.shp`
- `Trees_&_Planting_Sites.csv`
- `Trees_%26_Planting_Sites.geojson`
# Dataset Card for Urban Tree Inventory
## Dataset Description
This dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.
### Spatial Data (GeoJSON)
**Format:** GeoJSON
**Content:**
- **Type:** `FeatureCollection` - A collection of feature objects.
- **Features:** Each feature object represents a tree and contains:
- **Type:** `Feature`
- **Geometry:** `Point` (includes longitude and latitude of the tree location).
- **Properties:** Detailed information about the tree (some fields may overlap with the CSV structure below).
### Tabular Data (CSV)
**Format:** CSV
**Columns:**
- **X, Y:** Coordinates of the tree location.
- **OBJECTID:** Unique identifier for the tree.
- **streetaddress:** Street address nearest to the tree.
- **city:** City where the tree is located.
- **zipcode:** Zip code for the location of the tree.
- **facilityid:** Identifier for the facility associated with the tree, if any.
- **present:** Indication of whether the tree is currently present.
- **genus, species, commonname:** Botanical and common names of the tree.
- **plantingdate:** Date when the tree was planted.
- **diameterin:** Diameter of the tree trunk in inches.
- **heightft:** Height of the tree in feet.
- **condition:** Health condition of the tree.
- **contractwork:** Indicates if the tree has had any contract work done.
- **neighborhood:** Neighborhood where the tree is located.
- **program:** The program under which the tree was planted.
- **plantingw:** Width of the planting site.
- **plantingcond:** Condition of the planting site.
- **underpwerlins:** Whether the tree is under power lines.
- **matureheight:** The mature height of the tree.
- **GlobalID:** A global unique identifier for the tree.
- **created_user:** The user who created the record.
- **created_date:** The date the record was created.
- **last_edited_user:** The user who last edited the record.
- **last_edited_date:** The date the record was last edited.
#### Environmental and Economic Data:
- **isoprene, monoterpene, vocs:** Emissions and absorption data for various compounds.
- **coremoved_ozperyr, o3removed_ozperyr, etc.:** Annual pollutant removal metrics.
- **o2production_lbperyr:** Annual oxygen production.
- **carbonstorage_lb, carbonstorage_dol:** Carbon storage metrics.
- **grosscarseq_lbperyr, grosscarseq_dolperyr:** Gross carbon sequestration.
- **avoidrunoff_ft2peryr, avoidrunoff_dol2peryr:** Metrics related to stormwater runoff avoidance.
- **totannbenefits_dolperyr:** Total annual dollar benefits from the tree.
- **leafarea_sqft, potevapotran_cuftperyr, etc.:** Metrics related to the water cycle.
- **heating_mbtuperyr, cooling_kwhperyr, etc.:** Energy savings related to the tree's impact on building energy use.
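To give a feel for working with these columns, here is a minimal pandas sketch (the file name comes from the Data Sources list above, and the column names are assumed to match the CSV headers exactly):

```python
import pandas as pd

# Load the tabular inventory listed under Data Sources.
trees = pd.read_csv("Trees_&_Planting_Sites.csv")

# Summarize a few of the ecosystem-service columns per neighborhood.
summary = trees.groupby("neighborhood").agg(
    tree_count=("OBJECTID", "count"),
    annual_benefit_dol=("totannbenefits_dolperyr", "sum"),
    carbon_storage_lb=("carbonstorage_lb", "sum"),
)
print(summary.sort_values("annual_benefit_dol", ascending=False).head())
```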
### Example Record
**GeoJSON Feature:**
```json
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [-78.90863, 36.00441]
},
"properties": {
"OBJECTID": 2840940,
"commonname": "Willow Oak",
// Additional properties...
}
}
```
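Features like the one above can be read straight into a GeoDataFrame for analysis; a minimal sketch (the file name comes from the Data Sources list — note the URL-encoded `%26` in it):

```python
import geopandas as gpd

# Each GeoJSON feature becomes a row; the Point geometry holds the
# longitude/latitude shown in the example record above.
gdf = gpd.read_file("Trees_%26_Planting_Sites.geojson")
print(gdf.crs)  # coordinate reference system of the layer
print(gdf[["OBJECTID", "commonname", "geometry"]].head())
```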
The `GS_TreeInventory.shp` file encompasses a range of attributes for each record:
- **OBJECTID:** Unique identifier for each record.
- **streetaddr:** Street address where the tree or planting site is located.
- **city:** The city name, which is Durham.
- **zipcode:** Postal code for the location.
- **facilityid:** Identifier possibly linked to a facility or area associated with the tree.
- **present:** Type of feature present, such as a tree or a planting site.
- **genus:** Genus of the tree.
- **species:** Species of the tree.
- **commonname:** Common name of the tree.
- **plantingda:** Date or year range when the tree was planted or the planting site was established.
- ...
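A minimal sketch for inspecting these attributes (this assumes the companion `.dbf`, `.shx`, and `.prj` files sit alongside the `.shp`, as the shapefile format requires):

```python
import geopandas as gpd

# Shapefile field names are capped at 10 characters (hence "plantingda"),
# so they differ slightly from the CSV headers.
inventory = gpd.read_file("GS_TreeInventory.shp")
print(inventory.columns.tolist())
print(inventory[["OBJECTID", "genus", "species", "commonname"]].head())
```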
### Objectives
1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.
2. Apply Python libraries to uncover relationships between tree canopy data and urban development.
3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.
4. Produce analyses and visualizations with the GeoJSON file.
### Methodology
Our analytical process within Google Colab will encompass:
- **Data Preparation and Integration:** Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.
- **Geospatial Analysis:** Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.
- **Visualization and Optimization:** Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.
## Deliverables
1. A collection of Google Colab Python notebooks that outline our analytical processes.
2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics (a minimal example is sketched after this list).
3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.
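The example referenced in the second deliverable could look like the following folium sketch (folium is one option among several and is not named in this card; the map center comes from the example GeoJSON record above):

```python
import folium
import geopandas as gpd

gdf = gpd.read_file("Trees_%26_Planting_Sites.geojson")
m = folium.Map(location=[36.00441, -78.90863], zoom_start=13)
for _, row in gdf.head(200).iterrows():  # a small sample keeps the map light
    folium.CircleMarker(
        location=[row.geometry.y, row.geometry.x],
        radius=2,
        popup=row.get("commonname"),
    ).add_to(m)
m.save("durham_tree_sample.html")
```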
## Limitations
- **Computational Resources:** The limited computational offerings of Google Colab may pose a challenge to the size of the datasets or the complexity of models we can employ.
- **Data Quality:** The accuracy and currency of the data ultimately affect the precision of our recommendations.
- **Sociopolitical Considerations:** Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.
## Conclusion
DUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment. | Ziyuan111/Urban_Tree_Canopy_in_Durham2 | [
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-17T15:32:47+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"]} | 2024-02-17T16:36:59+00:00 | [] | [
"en"
] | TAGS
#size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
| # Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)
## Project Overview
The Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.
## Background and Rationale
Durham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.
## Data Sources and Methodology
### Data Sources
We will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:
- 'GS_TreeInventory.shp'
- 'Trees_&_Planting_Sites.csv'
- 'Trees_%26_Planting_Sites.geojson'
# Dataset Card for Urban Tree Inventory
## Dataset Description
This dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.
### Spatial Data (GeoJSON)
Format: GeoJSON
Content:
- Type: 'FeatureCollection' - A collection of feature objects.
- Features: Each feature object represents a tree and contains:
- Type: 'Feature'
- Geometry: 'Point' (includes longitude and latitude of the tree location).
- Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).
### Tabular Data (CSV)
Format: CSV
Columns:
- X, Y: Coordinates of the tree location.
- OBJECTID: Unique identifier for the tree.
- streetaddress: Street address nearest to the tree.
- city: City where the tree is located.
- zipcode: Zip code for the location of the tree.
- facilityid: Identifier for the facility associated with the tree, if any.
- present: Indication of whether the tree is currently present.
- genus, species, commonname: Botanical and common names of the tree.
- plantingdate: Date when the tree was planted.
- diameterin: Diameter of the tree trunk in inches.
- heightft: Height of the tree in feet.
- condition: Health condition of the tree.
- contractwork: Indicates if the tree has had any contract work done.
- neighborhood: Neighborhood where the tree is located.
- program: The program under which the tree was planted.
- plantingw: Width of the planting site.
- plantingcond: Condition of the planting site.
- underpwerlins: Whether the tree is under power lines.
- matureheight: The mature height of the tree.
- GlobalID: A global unique identifier for the tree.
- created_user: The user who created the record.
- created_date: The date the record was created.
- last_edited_user: The user who last edited the record.
- last_edited_date: The date the record was last edited.
#### Environmental and Economic Data:
- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.
- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.
- o2production_lbperyr: Annual oxygen production.
- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.
- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.
- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.
- totannbenefits_dolperyr: Total annual dollar benefits from the tree.
- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.
- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.
### Example Record
GeoJSON Feature:
The 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:
- OBJECTID: Unique identifier for each record.
- streetaddr: Street address where the tree or planting site is located.
- city: The city name, which is Durham.
- zipcode: Postal code for the location.
- facilityid: Identifier possibly linked to a facility or area associated with the tree.
- present: Type of feature present, such as a tree or a planting site.
- genus: Genus of the tree.
- species: Species of the tree.
- commonname: Common name of the tree.
- plantingda: Date or year range when the tree was planted or the planting site was established.
- ...
### Objectives
1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.
2. Apply Python libraries to uncover relationships between tree canopy data and urban development.
3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.
4. Produce analyses and visualizations with the GeoJSON file.
### Methodology
Our analytical process within Google Colab will encompass:
- Data Preparation and Integration: Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.
- Geospatial Analysis: Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.
- Visualization and Optimization: Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.
## Deliverables
1. A collection of Google Colab Python notebooks that outline our analytical processes.
2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.
3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.
## Limitations
- Computational Resources: Google Colab's limited computational resources may constrain the size of the datasets we can handle and the complexity of the models we can employ.
- Data Quality: The accuracy and currency of the data ultimately affect the precision of our recommendations.
- Sociopolitical Considerations: Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.
## Conclusion
DUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment. | [
"# Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)",
"## Project Overview\n\nThe Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.",
"## Background and Rationale\n\nDurham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.",
"## Data Sources and Methodology",
"### Data Sources\n\nWe will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:\n\n- 'GS_TreeInventory.shp'\n- 'Trees_&_Planting_Sites.csv'\n- 'Trees_%26_Planting_Sites.geojson'",
"# Dataset Card for Urban Tree Inventory",
"## Dataset Description\n\nThis dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.",
"### Spatial Data (GeoJSON)\n\nFormat: GeoJSON\n\nContent:\n\n- Type: 'FeatureCollection' - A collection of feature objects.\n- Features: Each feature object represents a tree and contains:\n - Type: 'Feature'\n - Geometry: 'Point' (includes longitude and latitude of the tree location).\n - Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).",
"### Tabular Data (CSV)\n\nFormat: CSV\n\nColumns:\n\n- X, Y: Coordinates of the tree location.\n- OBJECTID: Unique identifier for the tree.\n- streetaddress: Street address nearest to the tree.\n- city: City where the tree is located.\n- zipcode: Zip code for the location of the tree.\n- facilityid: Identifier for the facility associated with the tree, if any.\n- present: Indication of whether the tree is currently present.\n- genus, species, commonname: Botanical and common names of the tree.\n- plantingdate: Date when the tree was planted.\n- diameterin: Diameter of the tree trunk in inches.\n- heightft: Height of the tree in feet.\n- condition: Health condition of the tree.\n- contractwork: Indicates if the tree has had any contract work done.\n- neighborhood: Neighborhood where the tree is located.\n- program: The program under which the tree was planted.\n- plantingw: Width of the planting site.\n- plantingcond: Condition of the planting site.\n- underpwerlins: Whether the tree is under power lines.\n- matureheight: The mature height of the tree.\n- GlobalID: A global unique identifier for the tree.\n- created_user: The user who created the record.\n- created_date: The date the record was created.\n- last_edited_user: The user who last edited the record.\n- last_edited_date: The date the record was last edited.",
"#### Environmental and Economic Data:\n\n- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.\n- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.\n- o2production_lbperyr: Annual oxygen production.\n- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.\n- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.\n- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.\n- totannbenefits_dolperyr: Total annual dollar benefits from the tree.\n- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.\n- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.",
"### Example Record\n\nGeoJSON Feature:\n\nThe 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:\n\n- OBJECTID: Unique identifier for each record.\n- streetaddr: Street address where the tree or planting site is located.\n- city: The city name, which is Durham.\n- zipcode: Postal code for the location.\n- facilityid: Identifier possibly linked to a facility or area associated with the tree.\n- present: Type of feature present, such as a tree or a planting site.\n- genus: Genus of the tree.\n- species: Species of the tree.\n- commonname: Common name of the tree.\n- plantingda: Date or year range when the tree was planted or the planting site was established.\n- ...",
"### Objectives\n\n1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.\n2. Apply Python libraries to uncover relationships between tree canopy data and urban development.\n3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.\n4. Produce analyses and visualizations with the GeoJSON file.",
"### Methodology\n\nOur analytical process within Google Colab will encompass:\n\n- Data Preparation and Integration: Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.\n- Geospatial Analysis: Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.\n- Visualization and Optimization: Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.",
"## Deliverables\n\n1. A collection of Google Colab Python notebooks that outline our analytical processes.\n2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.\n3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.",
"## Limitations\n\n- Computational Resources: The limited computational offerings of Google Colab may pose a challenge to the size of the datasets or the complexity of models we can employ.\n- Data Quality: The accuracy and currency of the data ultimately affect the precision of our recommendations.\n- Sociopolitical Considerations: Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.",
"## Conclusion\n\nDUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment."
] | [
"TAGS\n#size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n",
"# Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)",
"## Project Overview\n\nThe Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.",
"## Background and Rationale\n\nDurham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.",
"## Data Sources and Methodology",
"### Data Sources\n\nWe will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:\n\n- 'GS_TreeInventory.shp'\n- 'Trees_&_Planting_Sites.csv'\n- 'Trees_%26_Planting_Sites.geojson'",
"# Dataset Card for Urban Tree Inventory",
"## Dataset Description\n\nThis dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.",
"### Spatial Data (GeoJSON)\n\nFormat: GeoJSON\n\nContent:\n\n- Type: 'FeatureCollection' - A collection of feature objects.\n- Features: Each feature object represents a tree and contains:\n - Type: 'Feature'\n - Geometry: 'Point' (includes longitude and latitude of the tree location).\n - Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).",
"### Tabular Data (CSV)\n\nFormat: CSV\n\nColumns:\n\n- X, Y: Coordinates of the tree location.\n- OBJECTID: Unique identifier for the tree.\n- streetaddress: Street address nearest to the tree.\n- city: City where the tree is located.\n- zipcode: Zip code for the location of the tree.\n- facilityid: Identifier for the facility associated with the tree, if any.\n- present: Indication of whether the tree is currently present.\n- genus, species, commonname: Botanical and common names of the tree.\n- plantingdate: Date when the tree was planted.\n- diameterin: Diameter of the tree trunk in inches.\n- heightft: Height of the tree in feet.\n- condition: Health condition of the tree.\n- contractwork: Indicates if the tree has had any contract work done.\n- neighborhood: Neighborhood where the tree is located.\n- program: The program under which the tree was planted.\n- plantingw: Width of the planting site.\n- plantingcond: Condition of the planting site.\n- underpwerlins: Whether the tree is under power lines.\n- matureheight: The mature height of the tree.\n- GlobalID: A global unique identifier for the tree.\n- created_user: The user who created the record.\n- created_date: The date the record was created.\n- last_edited_user: The user who last edited the record.\n- last_edited_date: The date the record was last edited.",
"#### Environmental and Economic Data:\n\n- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.\n- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.\n- o2production_lbperyr: Annual oxygen production.\n- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.\n- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.\n- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.\n- totannbenefits_dolperyr: Total annual dollar benefits from the tree.\n- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.\n- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.",
"### Example Record\n\nGeoJSON Feature:\n\nThe 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:\n\n- OBJECTID: Unique identifier for each record.\n- streetaddr: Street address where the tree or planting site is located.\n- city: The city name, which is Durham.\n- zipcode: Postal code for the location.\n- facilityid: Identifier possibly linked to a facility or area associated with the tree.\n- present: Type of feature present, such as a tree or a planting site.\n- genus: Genus of the tree.\n- species: Species of the tree.\n- commonname: Common name of the tree.\n- plantingda: Date or year range when the tree was planted or the planting site was established.\n- ...",
"### Objectives\n\n1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.\n2. Apply Python libraries to uncover relationships between tree canopy data and urban development.\n3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.\n4. Produce analyses and visualizations with the GeoJSON file.",
"### Methodology\n\nOur analytical process within Google Colab will encompass:\n\n- Data Preparation and Integration: Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.\n- Geospatial Analysis: Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.\n- Visualization and Optimization: Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.",
"## Deliverables\n\n1. A collection of Google Colab Python notebooks that outline our analytical processes.\n2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.\n3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.",
"## Limitations\n\n- Computational Resources: The limited computational offerings of Google Colab may pose a challenge to the size of the datasets or the complexity of models we can employ.\n- Data Quality: The accuracy and currency of the data ultimately affect the precision of our recommendations.\n- Sociopolitical Considerations: Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.",
"## Conclusion\n\nDUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment."
] | [
30,
20,
108,
86,
7,
83,
10,
43,
105,
343,
235,
182,
86,
135,
69,
96,
63
] | [
"passage: TAGS\n#size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n# Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)## Project Overview\n\nThe Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.## Background and Rationale\n\nDurham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.## Data Sources and Methodology### Data Sources\n\nWe will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:\n\n- 'GS_TreeInventory.shp'\n- 'Trees_&_Planting_Sites.csv'\n- 'Trees_%26_Planting_Sites.geojson'# Dataset Card for Urban Tree Inventory## Dataset Description\n\nThis dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.### Spatial Data (GeoJSON)\n\nFormat: GeoJSON\n\nContent:\n\n- Type: 'FeatureCollection' - A collection of feature objects.\n- Features: Each feature object represents a tree and contains:\n - Type: 'Feature'\n - Geometry: 'Point' (includes longitude and latitude of the tree location).\n - Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).",
"passage: ### Tabular Data (CSV)\n\nFormat: CSV\n\nColumns:\n\n- X, Y: Coordinates of the tree location.\n- OBJECTID: Unique identifier for the tree.\n- streetaddress: Street address nearest to the tree.\n- city: City where the tree is located.\n- zipcode: Zip code for the location of the tree.\n- facilityid: Identifier for the facility associated with the tree, if any.\n- present: Indication of whether the tree is currently present.\n- genus, species, commonname: Botanical and common names of the tree.\n- plantingdate: Date when the tree was planted.\n- diameterin: Diameter of the tree trunk in inches.\n- heightft: Height of the tree in feet.\n- condition: Health condition of the tree.\n- contractwork: Indicates if the tree has had any contract work done.\n- neighborhood: Neighborhood where the tree is located.\n- program: The program under which the tree was planted.\n- plantingw: Width of the planting site.\n- plantingcond: Condition of the planting site.\n- underpwerlins: Whether the tree is under power lines.\n- matureheight: The mature height of the tree.\n- GlobalID: A global unique identifier for the tree.\n- created_user: The user who created the record.\n- created_date: The date the record was created.\n- last_edited_user: The user who last edited the record.\n- last_edited_date: The date the record was last edited.#### Environmental and Economic Data:\n\n- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.\n- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.\n- o2production_lbperyr: Annual oxygen production.\n- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.\n- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.\n- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.\n- totannbenefits_dolperyr: Total annual dollar benefits from the tree.\n- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.\n- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.### Example Record\n\nGeoJSON Feature:\n\nThe 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:\n\n- OBJECTID: Unique identifier for each record.\n- streetaddr: Street address where the tree or planting site is located.\n- city: The city name, which is Durham.\n- zipcode: Postal code for the location.\n- facilityid: Identifier possibly linked to a facility or area associated with the tree.\n- present: Type of feature present, such as a tree or a planting site.\n- genus: Genus of the tree.\n- species: Species of the tree.\n- commonname: Common name of the tree.\n- plantingda: Date or year range when the tree was planted or the planting site was established.\n- ...### Objectives\n\n1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.\n2. Apply Python libraries to uncover relationships between tree canopy data and urban development.\n3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.\n4. Produce analyses and visualizations with the GeoJSON file."
] |
03605da446ad99e0d52506c72536900e270a768f |
<div align="center">
<h1> HALvest </h1>
<h3> Multilingual Research Papers Harvested from HAL </h3>
</div>
## Dataset Description
- **Repository:** [GitHub](https://github.com/Madjakul/HALvesting/tree/main)
- **Papers:** TBD
## Dataset Summary
### Overview:

HALvest is a dataset comprising the full text of open papers from [Hyper Articles en Ligne (HAL)](https://hal.science/). Our dump gathers papers written in TBD languages across TBD domains from TBD/date to TBD/date.
You can download the dataset using Hugging Face datasets:
*You may need to follow these instructions to set up authentication before downloading the dataset: [https://huggingface.co/docs/huggingface_hub/quick-start#login](https://huggingface.co/docs/huggingface_hub/quick-start#login)*
```py
from datasets import load_dataset
ds = load_dataset(
"Madjakul/halvest",
"en",
trust_remote_code=True
)
```
### Details:
- We first request [HAL's API](https://api.archives-ouvertes.fr/docs) in order to gather open research papers and parse the responses -- effectively sorting papers by language, year of production, and domain (a minimal sketch of this step follows the list).
- We then download the PDF of the fetched papers.
- Using [GROBID](https://github.com/kermitt2/grobid), we convert each PDF to `xml-tei` format in order to have structured data.
- Using [TBD](https://github.com), we convert each `xml-tei` file to an LLM-compatible `txt` format and concatenate it with the paper's metadata fetched earlier.
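For illustration, the first step could look like the sketch below. The endpoint comes from HAL's API documentation linked above, but the open-access filter and the field names in the query are assumptions and should be verified against the docs.

```py
import requests

# Query HAL's Solr-based search API (endpoint from the docs linked above).
# The `fq` filter and the `fl` field names are assumptions -- verify them
# against the API documentation before relying on them.
resp = requests.get(
    "https://api.archives-ouvertes.fr/search/",
    params={
        "q": "*:*",
        "fq": "openAccess_bool:true",             # assumed open-access filter
        "fl": "docid,language_s,producedDate_s",  # assumed field names
        "rows": 100,
        "wt": "json",
    },
    timeout=30,
)
docs = resp.json()["response"]["docs"]
print(len(docs), "records fetched")
```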
### Languages
The supported languages and statistics for our dataset can be found below:
TBD
### Domains
The supported domains and statistics for our dataset can be found below:
TBD
### Dataset Structure
```json
{
"halid": ...,
"lang": ...,
"domain": ...,
"timestamp": ...,
"year": ...,
"url": ...,
"text": ...
}
```
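As an example of working with these fields, the hedged snippet below filters the split loaded above down to recent papers. It assumes a `train` split exists and that `year` can be cast to an integer; both assumptions should be checked once the final schema is published.

```py
# Keep only recent papers from the dataset loaded above.
# Assumes a "train" split and an integer-castable "year" field.
recent = ds["train"].filter(lambda row: int(row["year"]) >= 2020)
print(recent[0]["halid"], recent[0]["url"])
```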
## Considerations for Using the Data
HALvest is a direct conversion of research papers found on HAL. Hence, the full names of individuals, as well as their research e-mail addresses, can be found within the dataset. This must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.
## Dataset Copyright
The licence terms for HALvest strictly follow those of HAL. Please refer to the license below when using this dataset.
- [HAL license](https://doc.archives-ouvertes.fr/en/legal-aspects/)
## Citation
```bibtex
@proceedings{TBD
}
```
## Acknowledgement
This dataset is built upon the following works:
```
GROBID: A machine learning software for extracting information from scholarly documents
https://github.com/kermitt2/grobid
harvesting: Collection of data parser for harvested data in [...]
[...]
```
| Madjakul/halvest | [
"task_categories:text-generation",
"task_categories:fill-mask",
"academia",
"research",
"region:us"
] | 2024-02-17T15:42:06+00:00 | {"annotations_creators": ["no-annotation"], "multilinguality": ["multilingual"], "source_datasets": ["original"], "task_categories": ["text-generation", "fill-mask"], "task_ids": ["language-modeling", "masked-language-modeling"], "pretty_name": "HALvest", "tags": ["academia", "research"]} | 2024-02-17T15:53:20+00:00 | [] | [] | TAGS
#task_categories-text-generation #task_categories-fill-mask #academia #research #region-us
|
<div align="center">
<h1> HALvest </h1>
<h3> Multilingual Research Papers Harvested from HAL </h3>
</div>
## Dataset Description
- Repository: GitHub
- Papers: TBD
## Dataset Summary
### Overview:

HALvest is a dataset comprising the full text of open papers from Hyper Articles en Ligne (HAL). Our dump gathers papers written in TBD languages across TBD domains from TBD/date to TBD/date.
You can download the dataset using Hugging Face datasets:
*You may need to follow these instructions to set up authentication before downloading the dataset: URL
### Details:
- We first request HAL's API in order to gather open research papers and parse the responses -- effectively sorting papers by language, year of production, and domain.
- We then download the PDF of the fetched papers.
- Using GROBID, we convert each PDF to 'xml-tei' format in order to have structured data.
- Using TBD, we convert each 'xml-tei' file to an LLM-compatible 'txt' format and concatenate it with the paper's metadata fetched earlier.
### Languages
The supported languages and statistics for our dataset can be found below:
TBD
### Domains
The supported domains and statistics for our dataset can be found below:
TBD
### Dataset Structure
## Considerations for Using the Data
HALvest is a direct conversion of research papers found on HAL. Hence, the full names of individuals, as well as their research e-mail addresses, can be found within the dataset. This must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.
## Dataset Copyright
The licence terms for HALvest strictly follow those of HAL. Please refer to the license below when using this dataset.
- HAL license
## Acknowledgement
This dataset is built upon the following works:
| [
"## Dataset Description\n\n- Repository: GitHub\n- Papers: TBD",
"## Dataset Summary",
"### overview:\n\nHALvest is dataset comprise of fulltext from open papers from Hyper Articles en Ligne (HAL). Our dump gather papers written in TBD languages across TBD domains from TBD/date to TBD/date.\n\nYou can download the dataset using Hugging Face datasets:\n\n*You may need to follow these instructions to setup authentication before downloading the dataset: URL",
"### Details:\n\n- We first request HAL's API in order to gather open research papers and parse it -- effectively sorting papers by language, year of production and domain.\n- We then download the PDF of the fetched papers.\n- Using GROBID, we convert each PDF to 'xml-tei' format in order to have structured data.\n- Using TBD, we convert each 'xml-tei' to LLM's compatible 'txt' format and concatenate it with the papaper's metadata fetched earlier.",
"### Languages\n\nThe supported languages and statistics for our dataset can be found below:\n\nTBD",
"### Domains\n\nThe supported domains and statistics for our dataset can be found below:\n\nTBD",
"### Dataset Structure",
"## Considerations for Using the Data\n\nHALvest is a direct conversion of reasearch papers found on HAL. Hence, fullnames of individuals, as well as their research e-mail adress can be found within the dataset. This must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.",
"## Dataset Copyright\n\nThe licence terms for HALvest strictly follows those of HAL. Please refer to the below license when using this dataset.\n- HAL license",
"## Acknowledegment\n\nThis dataset is built upon the following works:"
] | [
"TAGS\n#task_categories-text-generation #task_categories-fill-mask #academia #research #region-us \n",
"## Dataset Description\n\n- Repository: GitHub\n- Papers: TBD",
"## Dataset Summary",
"### overview:\n\nHALvest is dataset comprise of fulltext from open papers from Hyper Articles en Ligne (HAL). Our dump gather papers written in TBD languages across TBD domains from TBD/date to TBD/date.\n\nYou can download the dataset using Hugging Face datasets:\n\n*You may need to follow these instructions to setup authentication before downloading the dataset: URL",
"### Details:\n\n- We first request HAL's API in order to gather open research papers and parse it -- effectively sorting papers by language, year of production and domain.\n- We then download the PDF of the fetched papers.\n- Using GROBID, we convert each PDF to 'xml-tei' format in order to have structured data.\n- Using TBD, we convert each 'xml-tei' to LLM's compatible 'txt' format and concatenate it with the papaper's metadata fetched earlier.",
"### Languages\n\nThe supported languages and statistics for our dataset can be found below:\n\nTBD",
"### Domains\n\nThe supported domains and statistics for our dataset can be found below:\n\nTBD",
"### Dataset Structure",
"## Considerations for Using the Data\n\nHALvest is a direct conversion of reasearch papers found on HAL. Hence, fullnames of individuals, as well as their research e-mail adress can be found within the dataset. This must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.",
"## Dataset Copyright\n\nThe licence terms for HALvest strictly follows those of HAL. Please refer to the below license when using this dataset.\n- HAL license",
"## Acknowledegment\n\nThis dataset is built upon the following works:"
] | [
35,
18,
5,
92,
121,
23,
23,
7,
73,
33,
16
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-fill-mask #academia #research #region-us \n## Dataset Description\n\n- Repository: GitHub\n- Papers: TBD## Dataset Summary### overview:\n\nHALvest is dataset comprise of fulltext from open papers from Hyper Articles en Ligne (HAL). Our dump gather papers written in TBD languages across TBD domains from TBD/date to TBD/date.\n\nYou can download the dataset using Hugging Face datasets:\n\n*You may need to follow these instructions to setup authentication before downloading the dataset: URL### Details:\n\n- We first request HAL's API in order to gather open research papers and parse it -- effectively sorting papers by language, year of production and domain.\n- We then download the PDF of the fetched papers.\n- Using GROBID, we convert each PDF to 'xml-tei' format in order to have structured data.\n- Using TBD, we convert each 'xml-tei' to LLM's compatible 'txt' format and concatenate it with the papaper's metadata fetched earlier.### Languages\n\nThe supported languages and statistics for our dataset can be found below:\n\nTBD### Domains\n\nThe supported domains and statistics for our dataset can be found below:\n\nTBD### Dataset Structure## Considerations for Using the Data\n\nHALvest is a direct conversion of reasearch papers found on HAL. Hence, fullnames of individuals, as well as their research e-mail adress can be found within the dataset. This must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.## Dataset Copyright\n\nThe licence terms for HALvest strictly follows those of HAL. Please refer to the below license when using this dataset.\n- HAL license## Acknowledegment\n\nThis dataset is built upon the following works:"
] |
b53c97aeaeb9e1b4459299f38b282bd76a3c5586 |
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v5.6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6",
"harness_winogrande_5",
split="train")
```
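Because there are 63 configurations, it can help to enumerate them before picking one. The snippet below uses the standard `datasets` helper; nothing is hard-coded beyond the repository id.

```python
from datasets import get_dataset_config_names

configs = get_dataset_config_names(
    "open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6"
)
print(len(configs), configs[:5])
```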
## Latest results
These are the [latest results from run 2024-02-17T15:44:22.548008](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6/blob/main/results_2024-02-17T15-44-22.548008.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6501937924638518,
"acc_stderr": 0.032025249091365754,
"acc_norm": 0.6494469200952948,
"acc_norm_stderr": 0.03269529478578274,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7781424860062839,
"mc2_stderr": 0.013751565023330138
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.01327307786590759,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869147
},
"harness|hellaswag|10": {
"acc": 0.7137024497112129,
"acc_stderr": 0.004511063351278702,
"acc_norm": 0.8899621589324835,
"acc_norm_stderr": 0.00312297363203947
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931055,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931055
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993469,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993469
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863937,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863937
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7781424860062839,
"mc2_stderr": 0.013751565023330138
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433535
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.012661502663418697
}
}
```
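To work with these numbers programmatically, one option is to download the raw results file referenced above and read it with the standard library. This is a minimal sketch; the exact nesting of the aggregate block inside the file is assumed, so the key lookup may need adjusting.

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results file shown above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6",
    filename="results_2024-02-17T15-44-22.548008.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The aggregate block may sit at the top level or under a "results" key.
results = data.get("results", data)
print(results["all"]["acc"], results["all"]["mc2"])
```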
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6 | [
"region:us"
] | 2024-02-17T15:46:42+00:00 | {"pretty_name": "Evaluation run of bardsai/jaskier-7b-dpo-v5.6", "dataset_summary": "Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T15:44:22.548008](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6/blob/main/results_2024-02-17T15-44-22.548008.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501937924638518,\n \"acc_stderr\": 0.032025249091365754,\n \"acc_norm\": 0.6494469200952948,\n \"acc_norm_stderr\": 0.03269529478578274,\n \"mc1\": 0.6303549571603427,\n \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7781424860062839,\n \"mc2_stderr\": 0.013751565023330138\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869147\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7137024497112129,\n \"acc_stderr\": 0.004511063351278702,\n \"acc_norm\": 0.8899621589324835,\n \"acc_norm_stderr\": 0.00312297363203947\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931055,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931055\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993469,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993469\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n \"acc_stderr\": 0.012755368722863937,\n \"acc_norm\": 0.4758800521512386,\n \"acc_norm_stderr\": 0.012755368722863937\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7781424860062839,\n \"mc2_stderr\": 0.013751565023330138\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433535\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \"acc_stderr\": 0.012661502663418697\n }\n}\n```", "repo_url": 
"https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|arc:challenge|25_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|gsm8k|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hellaswag|10_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|winogrande|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["results_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T15-44-22.548008.parquet"]}]}]} | 2024-02-17T15:47:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v5.6
Dataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v5.6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
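A minimal loading sketch, mirroring the pattern used by the other Open LLM Leaderboard detail datasets in this collection. The config name `harness_winogrande_5` appears in this card's config list, but the repo id is assumed from the leaderboard's usual `details_<org>__<model>` naming convention rather than confirmed:

```python
from datasets import load_dataset

# Repo id assumed from the "open-llm-leaderboard/details_<org>__<model>"
# convention; any of the 63 task configs in this card can be requested.
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6",
	"harness_winogrande_5",
	split="train")
```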
## Latest results
These are the latest results from run 2024-02-17T15:44:22.548008 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v5.6\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v5.6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T15:44:22.548008(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v5.6\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v5.6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T15:44:22.548008(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v5.6\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v5.6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T15:44:22.548008(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
f425906b3e9e0c339730ce5cfd630c706b601b70 | # Dataset Card for "samantar_with_idx_merged_with_train_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mlsquare/samantar_with_idx_merged_with_train_val | [
"region:us"
] | 2024-02-17T15:52:01+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "idx", "dtype": "int64"}, {"name": "tgt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11263548923.940233, "num_examples": 79638793}, {"name": "valid", "num_bytes": 2815887337.0597663, "num_examples": 19909699}], "download_size": 9506498914, "dataset_size": 14079436261.0}} | 2024-02-17T16:17:45+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "samantar_with_idx_merged_with_train_val"
More Information needed | [
"# Dataset Card for \"samantar_with_idx_merged_with_train_val\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"samantar_with_idx_merged_with_train_val\"\n\nMore Information needed"
] | [
6,
27
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"samantar_with_idx_merged_with_train_val\"\n\nMore Information needed"
] |
6fef3d2cfc13f70cb2a4dcfda0485c202db55e39 |
Not a serious or usable dataset; it was produced by using the https://github.com/Pleias/marginalia library and an edited version of this [notebook](https://colab.research.google.com/drive/1dbliAbWRoidQgMqFPq4Rs8qooJ4S0Ye1?usp=sharing) to check the quality of the selections in the jondurbin/truthy-dpo-v0.1 dataset | eren23/truthy-dpo-v0.1-pruning-test-result | [
"region:us"
] | 2024-02-17T15:57:03+00:00 | {"dataset_info": {"features": [{"name": "original_source", "dtype": "string"}, {"name": "reference_evaluate", "dtype": "string"}, {"name": "reference_result", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1241977, "num_examples": 909}], "download_size": 622347, "dataset_size": 1241977}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T16:00:48+00:00 | [] | [] | TAGS
#region-us
|
Not a serious or usable dataset; it was produced by using this URL library and an edited version of this notebook to check the quality of the selections in the jondurbin/truthy-dpo-v0.1 dataset | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
156ca922a6c621448d5e05d40aa521357f32be99 |
# Dataset of antonina/アントニーナ/安冬妮娜 (Neural Cloud)
This is the dataset of antonina/アントニーナ/安冬妮娜 (Neural Cloud), containing 34 images and their tags.
The core tags of this character are `long_hair, yellow_eyes, hat, headphones, bangs, hair_between_eyes, white_headwear, aqua_hair, green_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 55.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 27.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 72 | 54.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 45.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 72 | 83.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/antonina_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, black_gloves, holding, long_sleeves, jacket, blush, closed_mouth, open_clothes, black_thighhighs, sitting, black_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_gloves | holding | long_sleeves | jacket | blush | closed_mouth | open_clothes | black_thighhighs | sitting | black_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:----------|:---------------|:---------|:--------|:---------------|:---------------|:-------------------|:----------|:--------------|
| 0 | 34 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/antonina_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T15:59:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:08:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of antonina/アントニーナ/安冬妮娜 (Neural Cloud)
==============================================
This is the dataset of antonina/アントニーナ/安冬妮娜 (Neural Cloud), containing 34 images and their tags.
The core tags of this character are 'long\_hair, yellow\_eyes, hat, headphones, bangs, hair\_between\_eyes, white\_headwear, aqua\_hair, green\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
44,
61,
5,
4
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
54e96e381906fa487acf1a4d77217e581efc148d | 100k Sentence Pairs for Nepali Spelling Correction | dura-garage/nep-spell-100k | [
"license:mit",
"region:us"
] | 2024-02-17T15:59:51+00:00 | {"license": "mit"} | 2024-02-17T16:01:55+00:00 | [] | [] | TAGS
#license-mit #region-us
| 100k Sentence Pairs for Nepali Spelling Correction | [] | [
"TAGS\n#license-mit #region-us \n"
] | [
11
] | [
"passage: TAGS\n#license-mit #region-us \n"
] |
37fcd93d6e30fec318c4cfccf1b7044e764b0394 |
# Dataset Card for Evaluation run of vilm/Quyen-Mini-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vilm/Quyen-Mini-v0.1](https://huggingface.co/vilm/Quyen-Mini-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
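# Each run is a timestamped split; the "train" split always points to the latest results.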
data = load_dataset("open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T16:04:03.826177](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1/blob/main/results_2024-02-17T16-04-03.826177.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43840376506422085,
"acc_stderr": 0.03434968269403666,
"acc_norm": 0.4413503435966419,
"acc_norm_stderr": 0.03507262298104947,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487360988,
"mc2": 0.4643634101883703,
"mc2_stderr": 0.015588466941925255
},
"harness|arc:challenge|25": {
"acc": 0.3703071672354949,
"acc_stderr": 0.014111298751674948,
"acc_norm": 0.39334470989761094,
"acc_norm_stderr": 0.014275101465693024
},
"harness|hellaswag|10": {
"acc": 0.4660426209918343,
"acc_stderr": 0.004978260641742204,
"acc_norm": 0.6056562437761402,
"acc_norm_stderr": 0.004877104939356237
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.03056159042673184,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.03056159042673184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437689,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437689
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.03878372113711274,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.03878372113711274
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5707070707070707,
"acc_stderr": 0.03526552724601199,
"acc_norm": 0.5707070707070707,
"acc_norm_stderr": 0.03526552724601199
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.03597524411734577,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.03597524411734577
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41794871794871796,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.41794871794871796,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184409,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184409
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.0324773433444811,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.0324773433444811
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5596330275229358,
"acc_stderr": 0.02128431062376155,
"acc_norm": 0.5596330275229358,
"acc_norm_stderr": 0.02128431062376155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656628,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656628
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524866,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4349775784753363,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.4349775784753363,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436972,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436972
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5363984674329502,
"acc_stderr": 0.01783252407959326,
"acc_norm": 0.5363984674329502,
"acc_norm_stderr": 0.01783252407959326
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214263,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.44694533762057875,
"acc_stderr": 0.02823776942208533,
"acc_norm": 0.44694533762057875,
"acc_norm_stderr": 0.02823776942208533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3520208604954368,
"acc_stderr": 0.012198140605353599,
"acc_norm": 0.3520208604954368,
"acc_norm_stderr": 0.012198140605353599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.02909720956841194,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.02909720956841194
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.40032679738562094,
"acc_stderr": 0.01982184368827178,
"acc_norm": 0.40032679738562094,
"acc_norm_stderr": 0.01982184368827178
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5380116959064327,
"acc_stderr": 0.03823727092882307,
"acc_norm": 0.5380116959064327,
"acc_norm_stderr": 0.03823727092882307
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487360988,
"mc2": 0.4643634101883703,
"mc2_stderr": 0.015588466941925255
},
"harness|winogrande|5": {
"acc": 0.5911602209944752,
"acc_stderr": 0.013816954295135686
},
"harness|gsm8k|5": {
"acc": 0.2744503411675512,
"acc_stderr": 0.0122915811708149
}
}
```
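As a quick sanity check on the numbers above, the snippet below is a minimal sketch that operates directly on the aggregated results dict shown here (only two entries are reproduced for brevity) and computes the macro-average accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
# `results` is assumed to hold the aggregated dict printed above, e.g. parsed
# from the results JSON file; only two subtask entries are shown here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.37037037037037035},
    # ... remaining subtask entries elided ...
}

# Macro-average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc over {len(mmlu_accs)} subtasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```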
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-17T16:06:12+00:00 | {"pretty_name": "Evaluation run of vilm/Quyen-Mini-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [vilm/Quyen-Mini-v0.1](https://huggingface.co/vilm/Quyen-Mini-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T16:04:03.826177](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1/blob/main/results_2024-02-17T16-04-03.826177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43840376506422085,\n \"acc_stderr\": 0.03434968269403666,\n \"acc_norm\": 0.4413503435966419,\n \"acc_norm_stderr\": 0.03507262298104947,\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487360988,\n \"mc2\": 0.4643634101883703,\n \"mc2_stderr\": 0.015588466941925255\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3703071672354949,\n \"acc_stderr\": 0.014111298751674948,\n \"acc_norm\": 0.39334470989761094,\n \"acc_norm_stderr\": 0.014275101465693024\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4660426209918343,\n \"acc_stderr\": 0.004978260641742204,\n \"acc_norm\": 0.6056562437761402,\n \"acc_norm_stderr\": 0.004877104939356237\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.03056159042673184,\n \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.03056159042673184\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 
0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.02418049716437689,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.02418049716437689\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.03878372113711274,\n \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.03878372113711274\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5707070707070707,\n \"acc_stderr\": 0.03526552724601199,\n \"acc_norm\": 0.5707070707070707,\n \"acc_norm_stderr\": 0.03526552724601199\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.03597524411734577,\n \"acc_norm\": 0.538860103626943,\n \"acc_norm_stderr\": 0.03597524411734577\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.41794871794871796,\n \"acc_stderr\": 0.02500732988246122,\n \"acc_norm\": 0.41794871794871796,\n \"acc_norm_stderr\": 0.02500732988246122\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184409,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184409\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.0324773433444811,\n \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.0324773433444811\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5596330275229358,\n \"acc_stderr\": 0.02128431062376155,\n \"acc_norm\": 0.5596330275229358,\n \"acc_norm_stderr\": 0.02128431062376155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656628,\n \"acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656628\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524866,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524866\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4349775784753363,\n \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.4349775784753363,\n \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436972,\n \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436972\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5363984674329502,\n \"acc_stderr\": 0.01783252407959326,\n \"acc_norm\": 
0.5363984674329502,\n \"acc_norm_stderr\": 0.01783252407959326\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214263,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.44694533762057875,\n \"acc_stderr\": 0.02823776942208533,\n \"acc_norm\": 0.44694533762057875,\n \"acc_norm_stderr\": 0.02823776942208533\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3520208604954368,\n \"acc_stderr\": 0.012198140605353599,\n \"acc_norm\": 0.3520208604954368,\n \"acc_norm_stderr\": 0.012198140605353599\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.02909720956841194,\n \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.02909720956841194\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.40032679738562094,\n \"acc_stderr\": 0.01982184368827178,\n \"acc_norm\": 0.40032679738562094,\n \"acc_norm_stderr\": 0.01982184368827178\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.03823727092882307,\n \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.03823727092882307\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487360988,\n \"mc2\": 0.4643634101883703,\n \"mc2_stderr\": 0.015588466941925255\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5911602209944752,\n \"acc_stderr\": 0.013816954295135686\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2744503411675512,\n \"acc_stderr\": 0.0122915811708149\n }\n}\n```", "repo_url": "https://huggingface.co/vilm/Quyen-Mini-v0.1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|winogrande|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["results_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T16-04-03.826177.parquet"]}]}]} | 2024-02-17T16:06:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vilm/Quyen-Mini-v0.1
Dataset automatically created during the evaluation run of model vilm/Quyen-Mini-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
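A minimal sketch of that load call, assuming the standard leaderboard naming convention for this model's details repo (`open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1`):

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 evaluation configs; swap in any other
data = load_dataset("open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1",
	"harness_winogrande_5",
	split="train")
```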
## Latest results
These are the latest results from run 2024-02-17T16:04:03.826177 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vilm/Quyen-Mini-v0.1\n\n\n\nDataset automatically created during the evaluation run of model vilm/Quyen-Mini-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T16:04:03.826177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vilm/Quyen-Mini-v0.1\n\n\n\nDataset automatically created during the evaluation run of model vilm/Quyen-Mini-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-17T16:04:03.826177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vilm/Quyen-Mini-v0.1\n\n\n\nDataset automatically created during the evaluation run of model vilm/Quyen-Mini-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-17T16:04:03.826177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
5a6ee4a27482addf31eb78ee73750872816fda0d |
# Dataset of sol/ソル/苏尔 (Neural Cloud)
This is the dataset of sol/ソル/苏尔 (Neural Cloud), containing 21 images and their tags.
The core tags of this character are `long_hair, blonde_hair, hair_between_eyes, yellow_eyes, ponytail, very_long_hair, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 26.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 17.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 28.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 24.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 36.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sol_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
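The same call pattern works for the other packages in the table above; for instance, a sketch for fetching the 800-pixel IMG+TXT package (filename taken from the package table):

```python
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/sol_neuralcloud',
    repo_type='dataset',
    filename='dataset-800.zip',
)
```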
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, fingerless_gloves, black_gloves, looking_at_viewer, white_shirt, belt, crop_top, midriff, navel, necklace, orange_jacket, open_jacket, black_pants, long_sleeves, standing, fur-trimmed_jacket, holding, outdoors, black_choker, boots, sky |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | fingerless_gloves | black_gloves | looking_at_viewer | white_shirt | belt | crop_top | midriff | navel | necklace | orange_jacket | open_jacket | black_pants | long_sleeves | standing | fur-trimmed_jacket | holding | outdoors | black_choker | boots | sky |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:---------------|:--------------------|:--------------|:-------|:-----------|:----------|:--------|:-----------|:----------------|:--------------|:--------------|:---------------|:-----------|:---------------------|:----------|:-----------|:---------------|:--------|:------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sol_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:20:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:24:39+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sol/ソル/苏尔 (Neural Cloud)
===================================
This is the dataset of sol/ソル/苏尔 (Neural Cloud), containing 21 images and their tags.
The core tags of this character are 'long\_hair, blonde\_hair, hair\_between\_eyes, yellow\_eyes, ponytail, very\_long\_hair, bangs, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
44,
61,
5,
4
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
24a786ac7e4153e368cef1536280e3645ff4919c |
# Dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud)
This is the dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud), containing 19 images and their tags.
The core tags of this character are `long_hair, red_eyes, eyepatch, bangs, breasts, white_hair, large_breasts, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 38.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 19.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 39.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 33.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 58.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/evelyn_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, nipples, solo, cum_on_breasts, nude, blush, closed_mouth, navel, pussy |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, closed_mouth, holding_gun, pantyhose, red_gloves, tactical_clothes, black_footwear, boots, cape, bulletproof_vest, handgun, long_sleeves, looking_at_viewer, outdoors, pouch, rifle, shotgun_shell, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | nipples | solo | cum_on_breasts | nude | blush | closed_mouth | navel | pussy | holding_gun | pantyhose | red_gloves | tactical_clothes | black_footwear | boots | cape | bulletproof_vest | handgun | long_sleeves | outdoors | pouch | rifle | shotgun_shell | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------|:-------|:-----------------|:-------|:--------|:---------------|:--------|:--------|:--------------|:------------|:-------------|:-------------------|:-----------------|:--------|:-------|:-------------------|:----------|:---------------|:-----------|:--------|:--------|:----------------|:-----------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/evelyn_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:35:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:41:38+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud)
=========================================
This is the dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud), containing 19 images and their tags.
The core tags of this character are 'long\_hair, red\_eyes, eyepatch, bangs, breasts, white\_hair, large\_breasts, grey\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
44,
61,
5,
4
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
9129570552b03519306f3084d489b144f5f0d034 | # Dataset Card for "Gold-alpaca-med-small"
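No description is provided yet; as a minimal loading sketch, the repo metadata lists a single `train` split with `Input`, `Output`, `Instruction`, and `Text` columns:

```python
from datasets import load_dataset

# column names per the repo metadata: Input, Output, Instruction, Text
ds = load_dataset("hemanth955/Gold-alpaca-med-small", split="train")
print(ds.column_names)
```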
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hemanth955/Gold-alpaca-med-small | [
"region:us"
] | 2024-02-17T16:39:32+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "Input", "dtype": "string"}, {"name": "Output", "dtype": "int64"}, {"name": "Instruction", "dtype": "string"}, {"name": "Text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 47684340, "num_examples": 27360}], "download_size": 0, "dataset_size": 47684340}} | 2024-02-17T16:53:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Gold-alpaca-med-small"
More Information needed | [
"# Dataset Card for \"Gold-alpaca-med-small\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Gold-alpaca-med-small\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"Gold-alpaca-med-small\"\n\nMore Information needed"
] |
954d50162ffedba0f3bfc4d7a8b317f0e8e6d944 |
# BEE-spoke-data/consumer-finance-complaints
`consumer-finance-complaints` but in a format that actually works.
Pulled Feb 2024
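A minimal loading sketch; the config names come from this repo's metadata, and `has-text` presumably keeps only the rows that include a consumer complaint narrative:

```python
from datasets import load_dataset

# full dump of complaints (default config)
ds = load_dataset("BEE-spoke-data/consumer-finance-complaints", split="train")

# smaller config with complaint text ("has-text", per the repo metadata)
ds_text = load_dataset("BEE-spoke-data/consumer-finance-complaints", "has-text", split="train")
print(ds_text[0]["Consumer complaint narrative"])
```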
| BEE-spoke-data/consumer-finance-complaints | [
"task_categories:text-classification",
"task_categories:text-generation",
"source_datasets:consumer-finance-complaints",
"license:cc0-1.0",
"region:us"
] | 2024-02-17T16:42:13+00:00 | {"language": ["en"], "license": "cc0-1.0", "size_categories": ["1M<n<10M"], "source_datasets": "consumer-finance-complaints", "task_categories": ["text-classification", "text-generation"], "dataset_info": [{"config_name": "default", "features": [{"name": "Date received", "dtype": "string"}, {"name": "Product", "dtype": "string"}, {"name": "Sub-product", "dtype": "string"}, {"name": "Issue", "dtype": "string"}, {"name": "Sub-issue", "dtype": "string"}, {"name": "Consumer complaint narrative", "dtype": "string"}, {"name": "Company public response", "dtype": "string"}, {"name": "Company", "dtype": "string"}, {"name": "State", "dtype": "string"}, {"name": "ZIP code", "dtype": "string"}, {"name": "Tags", "dtype": "string"}, {"name": "Consumer consent provided?", "dtype": "string"}, {"name": "Submitted via", "dtype": "string"}, {"name": "Date sent to company", "dtype": "string"}, {"name": "Company response to consumer", "dtype": "string"}, {"name": "Timely response?", "dtype": "string"}, {"name": "Consumer disputed?", "dtype": "string"}, {"name": "Complaint ID", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3427420677, "num_examples": 4707579}], "download_size": 1061488683, "dataset_size": 3427420677}, {"config_name": "has-text", "features": [{"name": "Date received", "dtype": "string"}, {"name": "Product", "dtype": "string"}, {"name": "Sub-product", "dtype": "string"}, {"name": "Issue", "dtype": "string"}, {"name": "Sub-issue", "dtype": "string"}, {"name": "Consumer complaint narrative", "dtype": "string"}, {"name": "Company public response", "dtype": "string"}, {"name": "Company", "dtype": "string"}, {"name": "State", "dtype": "string"}, {"name": "ZIP code", "dtype": "string"}, {"name": "Tags", "dtype": "string"}, {"name": "Consumer consent provided?", "dtype": "string"}, {"name": "Submitted via", "dtype": "string"}, {"name": "Date sent to company", "dtype": "string"}, {"name": "Company response to consumer", "dtype": "string"}, {"name": "Timely response?", "dtype": "string"}, {"name": "Consumer disputed?", "dtype": "string"}, {"name": "Complaint ID", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1229876941.3934054, "num_examples": 1689573}], "download_size": 925128908, "dataset_size": 1229876941.3934054}], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}, {"config_name": "has-text", "data_files": [{"split": "train", "path": "has-text/train-*"}]}], "tags": ["finance", "government data", "2024"]} | 2024-02-17T16:46:00+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #task_categories-text-generation #source_datasets-consumer-finance-complaints #license-cc0-1.0 #region-us
|
# BEE-spoke-data/consumer-finance-complaints
'consumer-finance-complaints' but in a format that actually works.
Pulled Feb 2024
| [
"# BEE-spoke-data/consumer-finance-complaints\n\n\n'consumer-finance-complaints' but in a format that actually works. \n\nPulled Feb 2024"
] | [
"TAGS\n#task_categories-text-classification #task_categories-text-generation #source_datasets-consumer-finance-complaints #license-cc0-1.0 #region-us \n",
"# BEE-spoke-data/consumer-finance-complaints\n\n\n'consumer-finance-complaints' but in a format that actually works. \n\nPulled Feb 2024"
] | [
53,
44
] | [
"passage: TAGS\n#task_categories-text-classification #task_categories-text-generation #source_datasets-consumer-finance-complaints #license-cc0-1.0 #region-us \n# BEE-spoke-data/consumer-finance-complaints\n\n\n'consumer-finance-complaints' but in a format that actually works. \n\nPulled Feb 2024"
] |
53a1781ba62188ec9b507f07a2ae9fb2f8b11ddf |
# Dataset of turing/チューリング/图灵 (Neural Cloud)
This is the dataset of turing/チューリング/图灵 (Neural Cloud), containing 22 images and their tags.
The core tags of this character are `animal_ears, brown_hair, long_hair, hair_ornament, breasts, bangs, animal_ear_fluff, large_breasts, braid, hair_between_eyes, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 34.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 17.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 53 | 39.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 29.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 53 | 57.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/turing_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, smile, white_shirt, simple_background, yellow_gloves, cleavage_cutout, closed_mouth, black_pants, open_clothes, white_background, id_card, long_sleeves, white_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | smile | white_shirt | simple_background | yellow_gloves | cleavage_cutout | closed_mouth | black_pants | open_clothes | white_background | id_card | long_sleeves | white_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------------|:--------------------|:----------------|:------------------|:---------------|:--------------|:---------------|:-------------------|:----------|:---------------|:---------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/turing_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:44:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:50:22+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of turing/チューリング/图灵 (Neural Cloud)
==========================================
This is the dataset of turing/チューリング/图灵 (Neural Cloud), containing 22 images and their tags.
The core tags of this character are 'animal\_ears, brown\_hair, long\_hair, hair\_ornament, breasts, bangs, animal\_ear\_fluff, large\_breasts, braid, hair\_between\_eyes, tail', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
44,
61,
5,
4
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
dd382831addcffdecab68b96b411e2222bfcf2d8 | 0: ENOUGH_INFO
1: NOT_ENOUGH_INFO | iestynmullinor/evidence_reranker_fever_balanced | [
"region:us"
] | 2024-02-17T16:46:56+00:00 | {} | 2024-02-17T17:27:53+00:00 | [] | [] | TAGS
#region-us
| 0: ENOUGH_INFO
1: NOT_ENOUGH_INFO | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
8f344b292b1d35da6eaf09dba8acce269dada4f5 |
# Dataset Card for Evaluation run of DreadPoor/JustToSuffer-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/JustToSuffer-7B-slerp](https://huggingface.co/DreadPoor/JustToSuffer-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T16:45:59.965925](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp/blob/main/results_2024-02-17T16-45-59.965925.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495053299013857,
"acc_stderr": 0.03213728048297641,
"acc_norm": 0.6511170368490796,
"acc_norm_stderr": 0.03278191957728323,
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6269022526171036,
"mc2_stderr": 0.015516860373603584
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820167,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053067
},
"harness|hellaswag|10": {
"acc": 0.7030472017526389,
"acc_stderr": 0.00455981758918207,
"acc_norm": 0.8678550089623581,
"acc_norm_stderr": 0.003379562298387565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700476,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834838,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.016442830654715544,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.016442830654715544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6269022526171036,
"mc2_stderr": 0.015516860373603584
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625856
},
"harness|gsm8k|5": {
"acc": 0.5974222896133434,
"acc_stderr": 0.013508523063663423
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp | [
"region:us"
] | 2024-02-17T16:48:17+00:00 | {"pretty_name": "Evaluation run of DreadPoor/JustToSuffer-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [DreadPoor/JustToSuffer-7B-slerp](https://huggingface.co/DreadPoor/JustToSuffer-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T16:45:59.965925](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp/blob/main/results_2024-02-17T16-45-59.965925.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495053299013857,\n \"acc_stderr\": 0.03213728048297641,\n \"acc_norm\": 0.6511170368490796,\n \"acc_norm_stderr\": 0.03278191957728323,\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6269022526171036,\n \"mc2_stderr\": 0.015516860373603584\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820167,\n \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053067\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7030472017526389,\n \"acc_stderr\": 0.00455981758918207,\n \"acc_norm\": 0.8678550089623581,\n \"acc_norm_stderr\": 0.003379562298387565\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700476,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700476\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.013664230995834838,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834838\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n \"acc_stderr\": 0.016442830654715544,\n \"acc_norm\": 0.40893854748603353,\n \"acc_norm_stderr\": 0.016442830654715544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6269022526171036,\n \"mc2_stderr\": 0.015516860373603584\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625856\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5974222896133434,\n \"acc_stderr\": 0.013508523063663423\n 
}\n}\n```", "repo_url": "https://huggingface.co/DreadPoor/JustToSuffer-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|winogrande|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T16_45_59.965925", "path": ["results_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T16-45-59.965925.parquet"]}]}]} | 2024-02-17T16:48:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DreadPoor/JustToSuffer-7B-slerp
Dataset automatically created during the evaluation run of model DreadPoor/JustToSuffer-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
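```python
from datasets import load_dataset

# load the per-sample details of one evaluated task (here: 5-shot winogrande)
data = load_dataset("open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp",
                    "harness_winogrande_5",
                    split="train")
```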
## Latest results
These are the latest results from run 2024-02-17T16:45:59.965925 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
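Aggregate metrics from this run (abridged; the full per-task breakdown, including stderr values, is kept in the run metadata above):

```json
{
    "all": {
        "acc": 0.6495053299013857,
        "acc_norm": 0.6511170368490796,
        "mc1": 0.4541003671970624,
        "mc2": 0.6269022526171036
    },
    "harness|arc:challenge|25": { "acc_norm": 0.689419795221843 },
    "harness|hellaswag|10": { "acc_norm": 0.8678550089623581 },
    "harness|winogrande|5": { "acc": 0.8003157063930545 },
    "harness|gsm8k|5": { "acc": 0.5974222896133434 }
}
```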
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
55b8e6131e6bb7d576c9b5a3bd1af46c0d6616ff |
# Dataset of willow/ウィロウ/薇洛儿 (Neural Cloud)
This is the dataset of willow/ウィロウ/薇洛儿 (Neural Cloud), containing 22 images and their tags.
The core tags of this character are `pink_hair, breasts, red_eyes, short_hair, large_breasts, hat, eyewear_on_head, sunglasses, bangs, bow, hair_ornament, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 44.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 19.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 43.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 36.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 71.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/willow_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
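To fetch one of the preprocessed packages from the table above instead, swap the `filename` argument in the same `hf_hub_download` call, e.g. `filename='dataset-stage3-p480-800.zip'` for the 3-stage cropped variant.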
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, gloves, open_mouth, black_thighhighs, jacket, smile, holding, bowtie, skirt, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | gloves | open_mouth | black_thighhighs | jacket | smile | holding | bowtie | skirt | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:-------------|:-------------------|:---------|:--------|:----------|:---------|:--------|:--------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/willow_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:55:50+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T17:05:02+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
143d5b2a9777a7162d3c167fec17cdbaa3be6982 |
# Dataset Card for Spider-Realistic
This dataset variant contains only the Spider-Realistic dataset used in "Structure-Grounded Pretraining for Text-to-SQL". The dataset is created based on the dev split of the Spider dataset (2020-06-07 version from https://yale-lily.github.io/spider). The authors of the dataset modified the original questions to remove explicit mentions of column names, while keeping the SQL queries unchanged, to better evaluate a model's capability in aligning the NL utterance and the DB schema. For more details, please refer to the authors' paper: https://arxiv.org/abs/2010.12773. The SQL queries and databases from the original Spider dataset are kept unchanged.
For the official database files, please refer to the Spider release site: https://yale-lily.github.io/spider.
This dataset was copied from Zenodo: https://zenodo.org/records/5205322.
This dataset is distributed under the CC BY-SA 4.0 license.
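A minimal sketch for loading this Hub copy with the Hugging Face `datasets` library is below. The split and column names are not stated on this card, so the snippet only loads the repo and inspects them; the repo id is taken from this dataset's path, and everything else is an assumption.

```python
from datasets import load_dataset

# Load the Hub copy of Spider-Realistic; the available split names are not
# documented on this card, so inspect them instead of hard-coding one.
ds = load_dataset("aherntech/spider-realistic")
print(ds)  # shows available splits and columns

first_split = next(iter(ds.values()))
print(first_split[0])  # one question/SQL pair, whatever the field names are
```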
## Paper Abstract
> Learning to capture text-table alignment is essential for tasks like text-to-SQL. A model needs to correctly recognize natural language references to columns and values and to ground them in the given database schema. In this paper, we present a novel weakly supervised Structure-Grounded pretraining framework (StruG) for text-to-SQL that can effectively learn to capture text-table alignment based on a parallel text-table corpus. We identify a set of novel prediction tasks: column grounding, value grounding and column-value mapping, and leverage them to pretrain a text-table encoder. Additionally, to evaluate different methods under more realistic text-table alignment settings, we create a new evaluation set Spider-Realistic based on Spider dev set with explicit mentions of column names removed, and adopt eight existing text-to-SQL datasets for cross-database evaluation. STRUG brings significant improvement over BERT-LARGE in all settings. Compared with existing pretraining methods such as GRAPPA, STRUG achieves similar performance on Spider, and outperforms all baselines on more realistic sets.
## Citation Information
If you use the dataset, please cite the following papers, including the original Spider dataset, Finegan-Dollak et al., 2018, and the original datasets for Restaurants, GeoQuery, Scholar, Academic, IMDB, and Yelp.
```
@article{deng2020structure,
title={Structure-Grounded Pretraining for Text-to-SQL},
author={Deng, Xiang and Awadallah, Ahmed Hassan and Meek, Christopher and Polozov, Oleksandr and Sun, Huan and Richardson, Matthew},
journal={arXiv preprint arXiv:2010.12773},
year={2020}
}
@inproceedings{Yu&al.18c,
year = 2018,
title = {Spider: A Large-Scale Human-Labeled Dataset for Complex and Cross-Domain Semantic Parsing and Text-to-SQL Task},
booktitle = {EMNLP},
author = {Tao Yu and Rui Zhang and Kai Yang and Michihiro Yasunaga and Dongxu Wang and Zifan Li and James Ma and Irene Li and Qingning Yao and Shanelle Roman and Zilin Zhang and Dragomir Radev }
}
@InProceedings{P18-1033,
author = "Finegan-Dollak, Catherine
and Kummerfeld, Jonathan K.
and Zhang, Li
and Ramanathan, Karthik
and Sadasivam, Sesh
and Zhang, Rui
and Radev, Dragomir",
title = "Improving Text-to-SQL Evaluation Methodology",
booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
year = "2018",
publisher = "Association for Computational Linguistics",
pages = "351--360",
location = "Melbourne, Australia",
url = "http://aclweb.org/anthology/P18-1033"
}
@InProceedings{data-sql-imdb-yelp,
dataset = {IMDB and Yelp},
author = {Navid Yaghmazadeh and Yuepeng Wang and Isil Dillig and Thomas Dillig},
title = {SQLizer: Query Synthesis from Natural Language},
booktitle = {International Conference on Object-Oriented Programming, Systems, Languages, and Applications, ACM},
month = {October},
year = {2017},
pages = {63:1--63:26},
url = {http://doi.org/10.1145/3133887},
}
@article{data-academic,
dataset = {Academic},
author = {Fei Li and H. V. Jagadish},
title = {Constructing an Interactive Natural Language Interface for Relational Databases},
journal = {Proceedings of the VLDB Endowment},
volume = {8},
number = {1},
month = {September},
year = {2014},
pages = {73--84},
url = {http://dx.doi.org/10.14778/2735461.2735468},
}
@InProceedings{data-atis-geography-scholar,
dataset = {Scholar, and Updated ATIS and Geography},
author = {Srinivasan Iyer and Ioannis Konstas and Alvin Cheung and Jayant Krishnamurthy and Luke Zettlemoyer},
title = {Learning a Neural Semantic Parser from User Feedback},
booktitle = {Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
year = {2017},
pages = {963--973},
location = {Vancouver, Canada},
url = {http://www.aclweb.org/anthology/P17-1089},
}
@inproceedings{data-geography-original,
dataset = {Geography, original},
author = {John M. Zelle and Raymond J. Mooney},
title = {Learning to Parse Database Queries Using Inductive Logic Programming},
booktitle = {Proceedings of the Thirteenth National Conference on Artificial Intelligence - Volume 2},
year = {1996},
pages = {1050--1055},
location = {Portland, Oregon},
url = {http://dl.acm.org/citation.cfm?id=1864519.1864543},
}
@inproceedings{data-restaurants-logic,
author = {Lappoon R. Tang and Raymond J. Mooney},
title = {Automated Construction of Database Interfaces: Integrating Statistical and Relational Learning for Semantic Parsing},
booktitle = {2000 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora},
year = {2000},
pages = {133--141},
location = {Hong Kong, China},
url = {http://www.aclweb.org/anthology/W00-1317},
}
@inproceedings{data-restaurants-original,
author = {Ana-Maria Popescu and Oren Etzioni and Henry Kautz},
title = {Towards a Theory of Natural Language Interfaces to Databases},
booktitle = {Proceedings of the 8th International Conference on Intelligent User Interfaces},
year = {2003},
location = {Miami, Florida, USA},
pages = {149--157},
url = {http://doi.acm.org/10.1145/604045.604070},
}
@inproceedings{data-restaurants,
author = {Alessandra Giordani and Alessandro Moschitti},
title = {Automatic Generation and Reranking of SQL-derived Answers to NL Questions},
booktitle = {Proceedings of the Second International Conference on Trustworthy Eternal Systems via Evolving Software, Data and Knowledge},
year = {2012},
location = {Montpellier, France},
pages = {59--76},
url = {https://doi.org/10.1007/978-3-642-45260-4_5},
}
``` | aherntech/spider-realistic | [
"task_categories:text2text-generation",
"size_categories:n<1K",
"language:en",
"license:cc-by-4.0",
"text-to-sql",
"arxiv:2010.12773",
"region:us"
] | 2024-02-17T17:00:56+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["n<1K"], "task_categories": ["text2text-generation"], "pretty_name": "Spider-Realistic", "tags": ["text-to-sql"]} | 2024-02-17T17:19:05+00:00 | [
"2010.12773"
] | [
"en"
] | TAGS
#task_categories-text2text-generation #size_categories-n<1K #language-English #license-cc-by-4.0 #text-to-sql #arxiv-2010.12773 #region-us
|
# Dataset Card for Spider-Realistic
This dataset variant contains only the Spider Realistic dataset used in "Structure-Grounded Pretraining for Text-to-SQL". The dataset is based on the dev split of the Spider dataset (2020-06-07 version from URL). The authors of the dataset modified the original questions to remove the explicit mention of column names while keeping the SQL queries unchanged to better evaluate the model's capability in aligning the NL utterance and the DB schema. For more details, please refer to the authors' paper: URL. The SQL queries and databases from the original Spider dataset are kept unchanged.
For the official database files, please refer to the Spider release site: URL
This dataset was copied from Zenodo: URL
This dataset is distributed under the CC BY-SA 4.0 license.
## Paper Abstract
> Learning to capture text-table alignment is essential for tasks like text-to-SQL. A model needs to correctly recognize natural language references to columns and values and to ground them in the given database schema. In this paper, we present a novel weakly supervised Structure-Grounded pretraining framework (StruG) for text-to-SQL that can effectively learn to capture text-table alignment based on a parallel text-table corpus. We identify a set of novel prediction tasks: column grounding, value grounding and column-value mapping, and leverage them to pretrain a text-table encoder. Additionally, to evaluate different methods under more realistic text-table alignment settings, we create a new evaluation set Spider-Realistic based on Spider dev set with explicit mentions of column names removed, and adopt eight existing text-to-SQL datasets for cross-database evaluation. STRUG brings significant improvement over BERT-LARGE in all settings. Compared with existing pretraining methods such as GRAPPA, STRUG achieves similar performance on Spider, and outperforms all baselines on more realistic sets.
If you use the dataset, please cite the following papers, including the original Spider dataset paper, Finegan-Dollak et al. (2018), and the original papers for the Restaurants, GeoQuery, Scholar, Academic, IMDB, and Yelp datasets.
| [
"# Dataset Card for Spider-Releastic\n\nThis dataset variant contains only the Spider Realistic dataset used in \"Structure-Grounded Pretraining for Text-to-SQL\". The dataset is created based on the dev split of the Spider dataset (2020-06-07 version from URL The authors of the dataset modified the original questions to remove the explicit mention of column names while keeping the SQL queries unchanged to better evaluate the model's capability in aligning the NL utterance and the DB schema. For more details, please refer to the authors paper URL The SQL queries and databases from the original Spider dataset are kept unchanged.\n\nFor the official database files, please refer to the Spider release site: URL\n\nThis dataset was copied from Zenodo: URL\n\nThis dataset is distributed under the CC BY-SA 4.0 license.",
"## Paper Abstract\n\n> Learning to capture text-table alignment is essential for tasks like text-to-SQL. A model needs to correctly recognize natural language references to columns and values and to ground them in the given database schema. In this paper, we present a novel weakly supervised Structure-Grounded pretraining framework (StruG) for text-to-SQL that can effectively learn to capture text-table alignment based on a parallel text-table corpus. We identify a set of novel prediction tasks: column grounding, value grounding and column-value mapping, and leverage them to pretrain a text-table encoder. Additionally, to evaluate different methods under more realistic text-table alignment settings, we create a new evaluation set Spider-Realistic based on Spider dev set with explicit mentions of column names removed, and adopt eight existing text-to-SQL datasets for cross-database evaluation. STRUG brings significant improvement over BERT-LARGE in all settings. Compared with existing pretraining methods such as GRAPPA, STRUG achieves similar performance on Spider, and outperforms all baselines on more realistic sets. \n\n\n\nIf you use the dataset, please cite the following papers including the original Spider datasets, Finegan-Dollak et al., 2018 and the original datasets for Restaurants, GeoQuery, Scholar, Academic, IMDB, and Yelp."
] | [
"TAGS\n#task_categories-text2text-generation #size_categories-n<1K #language-English #license-cc-by-4.0 #text-to-sql #arxiv-2010.12773 #region-us \n",
"# Dataset Card for Spider-Releastic\n\nThis dataset variant contains only the Spider Realistic dataset used in \"Structure-Grounded Pretraining for Text-to-SQL\". The dataset is created based on the dev split of the Spider dataset (2020-06-07 version from URL The authors of the dataset modified the original questions to remove the explicit mention of column names while keeping the SQL queries unchanged to better evaluate the model's capability in aligning the NL utterance and the DB schema. For more details, please refer to the authors paper URL The SQL queries and databases from the original Spider dataset are kept unchanged.\n\nFor the official database files, please refer to the Spider release site: URL\n\nThis dataset was copied from Zenodo: URL\n\nThis dataset is distributed under the CC BY-SA 4.0 license.",
"## Paper Abstract\n\n> Learning to capture text-table alignment is essential for tasks like text-to-SQL. A model needs to correctly recognize natural language references to columns and values and to ground them in the given database schema. In this paper, we present a novel weakly supervised Structure-Grounded pretraining framework (StruG) for text-to-SQL that can effectively learn to capture text-table alignment based on a parallel text-table corpus. We identify a set of novel prediction tasks: column grounding, value grounding and column-value mapping, and leverage them to pretrain a text-table encoder. Additionally, to evaluate different methods under more realistic text-table alignment settings, we create a new evaluation set Spider-Realistic based on Spider dev set with explicit mentions of column names removed, and adopt eight existing text-to-SQL datasets for cross-database evaluation. STRUG brings significant improvement over BERT-LARGE in all settings. Compared with existing pretraining methods such as GRAPPA, STRUG achieves similar performance on Spider, and outperforms all baselines on more realistic sets. \n\n\n\nIf you use the dataset, please cite the following papers including the original Spider datasets, Finegan-Dollak et al., 2018 and the original datasets for Restaurants, GeoQuery, Scholar, Academic, IMDB, and Yelp."
] | [
57,
191,
331
] | [
"passage: TAGS\n#task_categories-text2text-generation #size_categories-n<1K #language-English #license-cc-by-4.0 #text-to-sql #arxiv-2010.12773 #region-us \n# Dataset Card for Spider-Releastic\n\nThis dataset variant contains only the Spider Realistic dataset used in \"Structure-Grounded Pretraining for Text-to-SQL\". The dataset is created based on the dev split of the Spider dataset (2020-06-07 version from URL The authors of the dataset modified the original questions to remove the explicit mention of column names while keeping the SQL queries unchanged to better evaluate the model's capability in aligning the NL utterance and the DB schema. For more details, please refer to the authors paper URL The SQL queries and databases from the original Spider dataset are kept unchanged.\n\nFor the official database files, please refer to the Spider release site: URL\n\nThis dataset was copied from Zenodo: URL\n\nThis dataset is distributed under the CC BY-SA 4.0 license."
] |
d3ef3e2957d4568807ba310c2acc921f7502768e | # Dataset Card for "Gold-alpaca-med-new-small"
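The card is otherwise a stub, but the repository metadata below lists a single `train` split of 3,000 rows with four columns (`Input`, `Output`, `Instruction`, `Text`). A minimal loading sketch, assuming that layout holds:

```python
# Minimal loading sketch; the column names (Input, Output, Instruction,
# Text) are taken from the repository metadata and assumed unchanged.
from datasets import load_dataset

ds = load_dataset("hemanth955/Gold-alpaca-med-new-small", split="train")
print(ds)  # expected: ~3000 rows with Input/Output/Instruction/Text

row = ds[0]
print(row["Instruction"])
print(row["Input"])
print(row["Output"])  # an int64 label
```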
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hemanth955/Gold-alpaca-med-new-small | [
"region:us"
] | 2024-02-17T17:06:37+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "Input", "dtype": "string"}, {"name": "Output", "dtype": "int64"}, {"name": "Instruction", "dtype": "string"}, {"name": "Text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5229084, "num_examples": 3000}], "download_size": 1204919, "dataset_size": 5229084}} | 2024-02-17T17:06:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Gold-alpaca-med-new-small"
More Information needed | [
"# Dataset Card for \"Gold-alpaca-med-new-small\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Gold-alpaca-med-new-small\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"Gold-alpaca-med-new-small\"\n\nMore Information needed"
] |
4d8a6247a05c59706a9e6a17cbf19ed0e0360fd0 |
# Annotation
Each date in the Products-Real and Products-Synth datasets is annotated with class, bounding box coordinates, date transcription, image width, and image height. Four classes are defined in the training sets: date, due, prod, and code. Unlike in the training set, expiration dates in the test set of Products-Real are labeled with a dedicated "exp" class for easy evaluation. Each component in the Date-Real and Date-Synth datasets is annotated with class, bounding box, and transcription; the day, month, and year serve as the classes for the individual date components. Finally, the Components-Real and Components-Synth datasets consist of the individual day, month, and year components and their transcriptions.
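To make the annotation layout concrete, here is a hypothetical sketch of one Products-Real record and a filter for date boxes. The attribute set (class, bounding box, transcription, image size) follows the paragraph above, but the exact serialization, including the dict layout and the [x1, y1, x2, y2] bbox convention, is an assumption.

```python
# Hypothetical annotation record; the attributes follow the description
# above, while the dict layout and bbox convention are assumptions.
from typing import Any

record: dict[str, Any] = {
    "image_width": 1280,
    "image_height": 720,
    "annotations": [
        {"class": "date", "bbox": [412, 88, 590, 120], "transcription": "17.05.2022"},
        {"class": "prod", "bbox": [100, 80, 180, 110], "transcription": "PROD"},
    ],
}

def date_boxes(rec: dict[str, Any]) -> list[dict[str, Any]]:
    """Keep boxes labeled as dates ('date' in the training sets,
    'exp' in the Products-Real test split)."""
    return [a for a in rec["annotations"] if a["class"] in ("date", "exp")]

print(date_boxes(record))
```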
# Citation
The dataset was originally published in `A Generalized Framework for Recognition of Expiration Dates on Product Packages Using Fully Convolutional Networks`:
@article{seker2022generalized,
title={A generalized framework for recognition of expiration dates on product packages using fully convolutional networks},
author={Seker, Ahmet Cagatay and Ahn, Sang Chul},
journal={Expert Systems with Applications},
pages={117310},
year={2022},
publisher={Elsevier}
} | dimun/ExpirationDate | [
"task_categories:object-detection",
"language:en",
"license:afl-3.0",
"region:us"
] | 2024-02-17T17:16:08+00:00 | {"language": ["en"], "license": "afl-3.0", "task_categories": ["object-detection"]} | 2024-02-17T17:27:54+00:00 | [] | [
"en"
] | TAGS
#task_categories-object-detection #language-English #license-afl-3.0 #region-us
|
# Annotation
Each date in the Products-Real and Products-Synth datasets is annotated with class, bounding box coordinates, date transcription, image width, and height. There are four classes defined: date, due, prod, and code in the training sets. Expiration dates in the test set of Product-Real are specifically labeled as "exp" class for easy evaluation, unlike the training set of Product-Real. Each component in the Date-Real and Date-Synth datasets is annotated with class, bounding box, and transcription. The day, month, and year are used as the classes for each component of the dates. Moreover, Components-Real and Components-Synth datasets consist of the components of the day, month, and year and their transcriptions.
Dataset published originally in 'A Generalized Framework for Recognition of Expiration Date on Product Packages Using Fully Convolutional Networks'
@article{seker2022generalized,
title={A generalized framework for recognition of expiration dates on product packages using fully convolutional networks},
author={Seker, Ahmet Cagatay and Ahn, Sang Chul},
journal={Expert Systems with Applications},
pages={117310},
year={2022},
publisher={Elsevier}
} | [
"# Annotation\nEach date in the Products-Real and Products-Synth datasets is annotated with class, bounding box coordinates, date transcription, image width, and height. There are four classes defined: date, due, prod, and code in the training sets. Expiration dates in the test set of Product-Real are specifically labeled as \"exp\" class for easy evaluation, unlike the training set of Product-Real. Each component in the Date-Real and Date-Synth datasets is annotated with class, bounding box, and transcription. The day, month, and year are used as the classes for each component of the dates. Moreover, Components-Real and Components-Synth datasets consist of the components of the day, month, and year and their transcriptions.\n\nDataset published originally in 'A Generalized Framework for Recognition of Expiration Date on Product Packages Using Fully Convolutional Networks'\n@article{seker2022generalized,\n title={A generalized framework for recognition of expiration dates on product packages using fully convolutional networks},\n author={Seker, Ahmet Cagatay and Ahn, Sang Chul},\n journal={Expert Systems with Applications},\n pages={117310},\n year={2022},\n publisher={Elsevier}\n }"
] | [
"TAGS\n#task_categories-object-detection #language-English #license-afl-3.0 #region-us \n",
"# Annotation\nEach date in the Products-Real and Products-Synth datasets is annotated with class, bounding box coordinates, date transcription, image width, and height. There are four classes defined: date, due, prod, and code in the training sets. Expiration dates in the test set of Product-Real are specifically labeled as \"exp\" class for easy evaluation, unlike the training set of Product-Real. Each component in the Date-Real and Date-Synth datasets is annotated with class, bounding box, and transcription. The day, month, and year are used as the classes for each component of the dates. Moreover, Components-Real and Components-Synth datasets consist of the components of the day, month, and year and their transcriptions.\n\nDataset published originally in 'A Generalized Framework for Recognition of Expiration Date on Product Packages Using Fully Convolutional Networks'\n@article{seker2022generalized,\n title={A generalized framework for recognition of expiration dates on product packages using fully convolutional networks},\n author={Seker, Ahmet Cagatay and Ahn, Sang Chul},\n journal={Expert Systems with Applications},\n pages={117310},\n year={2022},\n publisher={Elsevier}\n }"
] | [
29,
308
] | [
"passage: TAGS\n#task_categories-object-detection #language-English #license-afl-3.0 #region-us \n# Annotation\nEach date in the Products-Real and Products-Synth datasets is annotated with class, bounding box coordinates, date transcription, image width, and height. There are four classes defined: date, due, prod, and code in the training sets. Expiration dates in the test set of Product-Real are specifically labeled as \"exp\" class for easy evaluation, unlike the training set of Product-Real. Each component in the Date-Real and Date-Synth datasets is annotated with class, bounding box, and transcription. The day, month, and year are used as the classes for each component of the dates. Moreover, Components-Real and Components-Synth datasets consist of the components of the day, month, and year and their transcriptions.\n\nDataset published originally in 'A Generalized Framework for Recognition of Expiration Date on Product Packages Using Fully Convolutional Networks'\n@article{seker2022generalized,\n title={A generalized framework for recognition of expiration dates on product packages using fully convolutional networks},\n author={Seker, Ahmet Cagatay and Ahn, Sang Chul},\n journal={Expert Systems with Applications},\n pages={117310},\n year={2022},\n publisher={Elsevier}\n }"
] |
2305e7834fca367897ad93968493a07007577b27 | STT with OpenAI Whisper large-v3, used to select the better-quality part of the voice recordings | sarpba/common_voice_16.1_hu_texts | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T17:16:36+00:00 | {"license": "apache-2.0"} | 2024-02-17T17:19:07+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| STT with OpenAI Whisper large-v3, used to select the better-quality part of the voice recordings | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] | [
14
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |