{"success":true,"database":"eegdash","data":{"_id":"6965f3e1ac44fa1028dc6316","dataset_id":"ds007172","associated_paper_doi":null,"authors":["Petunia Reinke","Lisa Deneke","Sebastian Ocklenburg"],"bids_version":"1.7.0","contact_info":["Petunia Reinke"],"contributing_labs":null,"data_processed":false,"dataset_doi":"doi:10.18112/openneuro.ds007172.v1.0.0","datatypes":["eeg"],"demographics":{"subjects_count":100,"ages":[],"age_min":null,"age_max":null,"age_mean":null,"species":null,"sex_distribution":{"m":46,"f":53,"o":1},"handedness_distribution":{"r":52,"l":48}},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds007172","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"2f5c59df4fd6f867bd9cb2a4d35d23316899430cf5b94cb9b80942c6550b96fd","license":"CC0","n_contributing_labs":null,"name":"EEG-Asymmetries Dataset","readme":"References BIDS\n---------------\nAppelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Höchenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., & Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4(44), 1896. https://doi.org/10.21105/joss.01896\nPernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., & Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8\nReferences Dataset\n------------------\nReinke, P., Deneke, L., & Ocklenburg, S. (2025). Hemispheric asymmetries in the EEG: Is there an association between N1 lateralization and alpha asymmetry? Laterality, 1–50. Advance online publication. 
https://doi.org/10.1080/1357650X.2025.2591660\nDataset description\n-------------------\nThe dataset comprises 100 participants (53 females, 46 males, 1 diverse individual). 27 of the females were right-handed; the rest were non-right-hand dominant. Of the males, 24 were right-handed, while the rest were non-right-hand dominant. The mean age of the participants was 25.6 years (SD = 4.91). All participants reported normal or corrected-to-normal vision, had no unilateral sensory or motor deficits, no history of mental illnesses or neurologic disorders, and were currently not taking any medication.\nAll participants started with a resting state (RS) of approximately eight minutes, which included periods of open and closed eyes (each period was 63 seconds, leading to 4.2 minutes of eyes-open and 4.2 minutes of eyes-closed resting state). After the RS, each participant completed four tasks in a randomized order. Each task was constructed in the same way: the participants were instructed verbally as well as in written form directly before each trial began. They were told to react only to the target stimuli (animal names, female faces, and houses with pitched roofs) by pressing the space bar. Each task consisted of three blocks: one short practice block, one block in which answers were to be given with the right hand, and one block in which answers were to be given with the left hand. The starting hand was randomized across participants. During the trials, words (words task) or pictures (faces, emotions, and houses task) were shown in the center of the screen for one second, followed by a fixation cross for 500–700 ms. After 80 stimuli, the response hand was changed, leading to a total of 160 stimulus presentations for each task. For more details, see: Reinke, P., Deneke, L., & Ocklenburg, S. (2025). Hemispheric asymmetries in the EEG: Is there an association between N1 lateralization and alpha asymmetry? Laterality, 1–50. 
Advance online publication. https://doi.org/10.1080/1357650X.2025.2591660\nTrigger description\n-------------------\nResting State (“rest”):\nRest/Open: 1\nRest/Closed: 2\nWords Task (“words”):\nRight Hand & animal name: 13\nRight Hand & non-animal word: 14\nLeft Hand & animal name: 23\nLeft Hand & non-animal word: 24\nFaces Task (“faces”):\nRight Hand – Male – Black: 117\nRight Hand – Male – White: 118\nRight Hand – Female – Black: 127\nRight Hand – Female – White: 128\nLeft Hand – Male – Black: 217\nLeft Hand – Male – White: 218\nLeft Hand – Female – Black: 227\nLeft Hand – Female – White: 228\nEmotions Task (“emotions”):\nRight Hand – Male – Angry: 111\nRight Hand – Male – Fearful: 112\nRight Hand – Male – Happy (mouth open): 113\nRight Hand – Male – Happy (mouth closed): 114\nRight Hand – Female – Angry: 121\nRight Hand – Female – Fearful: 122\nRight Hand – Female – Happy (mouth open): 123\nRight Hand – Female – Happy (mouth closed): 124\nLeft Hand – Male – Angry: 211\nLeft Hand – Male – Fearful: 212\nLeft Hand – Male – Happy (mouth open): 213\nLeft Hand – Male – Happy (mouth closed): 214\nLeft Hand – Female – Angry: 221\nLeft Hand – Female – Fearful: 222\nLeft Hand – Female – Happy (mouth open): 223\nLeft Hand – Female – Happy (mouth closed): 224\nHouses Task (“houses”):\nRight Hand & Pitched Roof: 11\nRight Hand & Flat Roof: 12\nLeft Hand & Pitched Roof: 21\nLeft Hand & Flat Roof: 22","recording_modality":["eeg"],"senior_author":"Sebastian 
Ocklenburg","sessions":[],"size_bytes":11852106397,"source":"openneuro","storage":{"backend":"s3","base":"s3://openneuro.org/ds007172","raw_key":"dataset_description.json","dep_keys":["CHANGES","README","participants.json","participants.tsv"]},"study_design":null,"study_domain":null,"tasks":["emotions","emotios","faces","houses","rest","words"],"timestamps":{"digested_at":"2026-04-22T12:30:05.612894+00:00","dataset_created_at":"2026-01-05T08:14:16.940Z","dataset_modified_at":"2026-01-06T13:23:50.000Z"},"total_files":501,"tagger_meta":{"config_hash":"4a051be509a0e3d0","metadata_hash":"5f66c67e89595dec","model":"openai/gpt-5.2","tagged_at":"2026-01-20T19:30:53.582611+00:00"},"tags":{"pathology":["Healthy"],"modality":["Visual"],"type":["Attention"],"confidence":{"pathology":0.8,"modality":0.8,"type":0.72},"reasoning":{"few_shot_analysis":"Closest convention match is the few-shot example \"EEG: DPX Cog Ctl Task in Acute Mild TBI\" labeled as Type=Attention with a visual cue/probe paradigm and instructed responses. Although this dataset is not clinical, it similarly uses visual stimuli with target vs non-target processing and button-press responses, which (by the few-shot convention) maps better to Attention than Perception when the instruction is to selectively respond to targets. Few-shot examples also reinforce that Modality should follow the stimulus channel (e.g., visual screen stimuli in multiple examples are labeled Modality=Visual), not the response modality.","metadata_analysis":"Population/health status: \"All participants reported ... had no ... history of mental illnesses ... 
or neurologic disorders\" and \"were currently not taking any medication.\" Also described as \"The dataset comprises 100 participants\" with no clinical recruitment described.\nTask/stimuli: After resting state, \"words (words task) or pictures (faces, emotions, and houses task) were shown in the center of the screen\" and participants were instructed \"to only react to the target stimuli ... via a press on the space bar.\" Resting state is also included: \"All participants started with a resting state (RS) ... where periods of open and closed eyes were included.\"","paper_abstract_analysis":"No useful paper information. (Only a citation is provided; no abstract text in the metadata.)","evidence_alignment_check":"Pathology: Metadata says participants had \"no history of mental illnesses ... or neurologic disorders\" and no medication, implying a non-clinical cohort. Few-shot pattern suggests using Healthy when no disorder-based recruitment is present. ALIGN.\nModality: Metadata says stimuli are \"words ... or pictures ... shown in the center of the screen\" (visual), and resting state eyes open/closed is also present. Few-shot pattern suggests labeling modality by dominant stimulus channel; for screen-based tasks this is Visual. ALIGN (Visual is dominant across multiple tasks; resting is secondary).\nType: Metadata says participants \"only react to the target stimuli\" (target detection/selection) and the study concerns N1/alpha asymmetries during tasks and rest. Few-shot convention for target/non-target control-demanding paradigms (e.g., DPX) maps to Attention rather than Perception when selective responding is central. Mostly ALIGN; mild ambiguity with Perception because N1 is sensory-evoked, but task instructions emphasize selective target responding.","decision_summary":"Pathology top-2: (1) Healthy — supported by \"no history of mental illnesses ... 
or neurologic disorders\" and \"not taking any medication\" and absence of any clinical group; (2) Unknown — would apply if health status were not stated. Winner: Healthy (alignment clear). \nModality top-2: (1) Visual — supported by \"words ... or pictures ... shown in the center of the screen\" and fixation cross timing; (2) Resting State — because \"resting state ... open and closed eyes\" is included. Winner: Visual because multiple active visual tasks dominate the dataset content beyond the initial RS block. \nType top-2: (1) Attention — supported by \"only react to the target stimuli\" (selective responding/target detection) and block structure with go/no-go-like targets; (2) Perception — plausible given focus on N1 lateralization (early sensory ERP). Winner: Attention, following few-shot convention that target vs non-target response selection paradigms map to Attention; confidence is moderate due to the plausible Perception interpretation."}},"computed_title":"EEG-Asymmetries Dataset","nchans_counts":[{"val":32,"count":496},{"val":29,"count":5}],"sfreq_counts":[{"val":500.0,"count":496},{"val":1000.0,"count":5}],"stats_computed_at":"2026-04-22T23:16:00.312278+00:00","total_duration_s":183542.403,"canonical_name":null,"name_confidence":0.62,"name_meta":{"suggested_at":"2026-04-14T10:18:35.343Z","model":"openai/gpt-5.2 + openai/gpt-5.4-mini + deterministic_fallback"},"name_source":"canonical","author_year":"Reinke2026"}}