{"success":true,"database":"eegdash","data":{"_id":"69d16e04897a7725c66f4c70","dataset_id":"nm000135","associated_paper_doi":null,"authors":["R. Leeb","C. Brunner","G. R. Müller-Putz","A. Schlögl","G. Pfurtscheller","F. Lee","C. Keinrath","R. Scherer","H. Bischof"],"bids_version":"1.9.0","contact_info":null,"contributing_labs":null,"data_processed":false,"dataset_doi":"10.82901/nemar.nm000135","datatypes":["eeg"],"demographics":{"subjects_count":1,"ages":[21],"age_min":21,"age_max":21,"age_mean":21.0,"species":null,"sex_distribution":{"f":1},"handedness_distribution":{"r":1}},"experimental_modalities":null,"external_links":{"source_url":"https://nemar.org/dataexplorer/detail/nm000135","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"e6207f5e412f05dab46470d957f87a8e9f741e7f33629ddeff0c82977be317f0","license":"CC-BY-ND-4.0","n_contributing_labs":null,"name":"BNCI 2014-004 Motor Imagery dataset","readme":"[![DOI](https://img.shields.io/badge/DOI-10.82901%2Fnemar.nm000135-blue)](https://doi.org/10.82901/nemar.nm000135)\n# BNCI 2014-004 Motor Imagery dataset\nBNCI 2014-004 Motor Imagery dataset.\n## Dataset Overview\n- **Code**: BNCI2014-004\n- **Paradigm**: imagery\n- **DOI**: 10.1109/TNSRE.2007.906956\n- **Subjects**: 9\n- **Sessions per subject**: 5\n- **Events**: left_hand=1, right_hand=2\n- **Trial interval**: [3, 7.5] s\n- **Session IDs**: 01T, 02T, 03T, 04E, 05E\n- **File format**: GDF\n## Acquisition\n- **Sampling rate**: 250.0 Hz\n- **Number of channels**: 3\n- **Channel types**: eeg=3, eog=3\n- **Channel names**: C3, C4, Cz, EOG1, EOG2, EOG3\n- **Montage**: standard_1020\n- **Hardware**: g.tec\n- **Software**: rtsBCI (MATLAB/Simulink)\n- **Reference**: left mastoid\n- **Ground**: Fz\n- **Sensor type**: EEG\n- **Line frequency**: 50.0 Hz\n- **Online filters**: 0.5-100 Hz bandpass, 50 Hz notch\n- **Cap manufacturer**: Easycap\n- **Electrode material**: Ag/AgCl\n- **Auxiliary channels**: EOG (3 ch, horizontal, vertical, 
radial)\n## Participants\n- **Number of subjects**: 9\n- **Health status**: healthy\n- **Age**: mean=24.7, std=3.3\n- **Handedness**: right\n- **BCI experience**: naive\n- **Species**: human\n## Experimental Protocol\n- **Paradigm**: imagery\n- **Task type**: motor_imagery\n- **Number of classes**: 2\n- **Class labels**: left_hand, right_hand\n- **Trial duration**: 7.5 s\n- **Tasks**: left_hand_imagery, right_hand_imagery\n- **Study design**: Two-class motor imagery: left hand and right hand. Screening sessions (01T, 02T) without feedback, feedback sessions (03T, 04E, 05E) with smiley feedback.\n- **Study domain**: brain-computer interface\n- **Feedback type**: visual\n- **Stimulus type**: arrow_cue\n- **Stimulus modalities**: visual, auditory\n- **Primary modality**: visual\n- **Synchronicity**: cue_based\n- **Mode**: both\n- **Training/test split**: True\n- **Instructions**: Subjects selected their best motor imagery strategy (e.g., squeezing a ball or pulling a brake) and performed kinesthetic motor imagery of left or right hand movements.\n## HED Event Annotations\nSchema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser\n```\n  left_hand\n    ├─ Sensory-event\n    │  ├─ Experimental-stimulus\n    │  ├─ Visual-presentation\n    │  └─ Leftward, Arrow\n    └─ Agent-action\n       └─ Imagine\n          ├─ Move\n          └─ Left, Hand\n  right_hand\n    ├─ Sensory-event\n    │  ├─ Experimental-stimulus\n    │  ├─ Visual-presentation\n    │  └─ Rightward, Arrow\n    └─ Agent-action\n       └─ Imagine\n          ├─ Move\n          └─ Right, Hand\n```\n## Paradigm-Specific Parameters\n- **Detected paradigm**: motor_imagery\n- **Imagery tasks**: left_hand, right_hand\n- **Cue duration**: 1.25 s\n- **Imagery duration**: 4.0 s\n## Data Structure\n- **Trials per session**: screening=120, feedback=160\n## Preprocessing\n- **Data state**: raw with online filtering\n- **Preprocessing applied**: True\n- **Steps**: bandpass 
filtering, notch filtering\n- **Highpass filter**: 0.5 Hz\n- **Lowpass filter**: 100.0 Hz\n- **Bandpass filter**: 0.5-100.0 Hz\n- **Notch filter**: 50.0 Hz\n- **Filter type**: analog\n- **Notes**: Online bandpass (0.5-100 Hz) and notch (50 Hz) filters applied during recording. Artifact trials marked with event type 1023. EOG channels provided for user-applied artifact correction.\n## Signal Processing\n- **Classifiers**: LDA\n- **Feature extraction**: Bandpower (BP)\n## Cross-Validation\n- **Method**: 10x10 cross-validation\n- **Folds**: 10\n- **Evaluation type**: within_subject\n## BCI Application\n- **Applications**: motor_control\n- **Environment**: laboratory\n- **Online feedback**: True\n## Tags\n- **Pathology**: Healthy\n- **Modality**: Motor\n- **Type**: Motor Imagery\n## Documentation\n- **Description**: BCI Competition 2008 - Graz data set B: Two-class motor imagery dataset (left/right hand) with screening sessions (no feedback) and smiley feedback sessions. 9 subjects, 3 bipolar EEG channels (C3, Cz, C4) + 3 EOG channels, 250 Hz.\n- **DOI**: 10.1109/TNSRE.2007.906956\n- **License**: CC-BY-ND-4.0\n- **Investigators**: R. Leeb, C. Brunner, G. R. Müller-Putz, A. Schlögl, G. Pfurtscheller, F. Lee, C. Keinrath, R. Scherer, H. Bischof\n- **Senior author**: G. Pfurtscheller\n- **Institution**: Graz University of Technology\n- **Department**: Institute for Knowledge Discovery\n- **Country**: AT\n- **Repository**: BNCI Horizon\n- **Data URL**: http://biosig.sourceforge.net/\n- **Publication year**: 2007\n- **Keywords**: brain-computer interface, BCI, electroencephalogram, EEG, motor imagery, BCI competition, smiley feedback\n## External Links\n- **Source**: http://biosig.sourceforge.net/\n## Abstract\nBCI Competition 2008 Graz data set B. EEG data from 9 subjects performing two-class motor imagery (left hand vs right hand). 
Two screening sessions without feedback (120 trials each) and three feedback sessions with smiley feedback (160 trials each). Three bipolar EEG channels (C3, Cz, C4) and three EOG channels recorded at 250 Hz.\n## Methodology\nSubjects performed kinesthetic motor imagery of left or right hand movements. Two screening sessions (01T, 02T) without feedback: 6 runs x 20 trials = 120 trials per session. Three feedback sessions (03T, 04E, 05E) with smiley feedback: 4 runs x 40 trials (20 per class) = 160 trials per session. Screening trials: fixation cross + beep at t=0, arrow cue at ~t=2 for 1.25s, imagery for 4s, break. Feedback trials: smiley at t=0, beep at t=2, cue from t=3 to t=7.5 with continuous smiley feedback. Three bipolar EEG channels (C3, Cz, C4) plus three monopolar EOG channels recorded at 250 Hz with 0.5-100 Hz bandpass and 50 Hz notch filter. EEG ground at Fz, EOG reference at left mastoid. Amplifier: g.tec. Software: rtsBCI (MATLAB/Simulink).\n## References\nTangermann, M., Muller, K.R., Aertsen, A., Birbaumer, N., Braun, C., Brunner, C., Leeb, R., Mehring, C., Miller, K.J., Mueller-Putz, G. and Nolte, G., 2012. Review of the BCI competition IV. Frontiers in Neuroscience, 6, p.55.\n## Notes\n> **Note:** `BNCI2014_004` was previously named `BNCI2014004`. `BNCI2014004` will be removed in version 1.1.\n*Added in version 0.4.0.*\nThis dataset is commonly referred to as \"BCI Competition IV Dataset 2b\". It is widely used for binary motor imagery classification tasks.\n## See Also\n- BNCI2014_001: 4-class motor imagery (Dataset 2a)\n- BNCI2014_002: 2-class motor imagery with Laplacian derivations\nAppelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. 
https://doi.org/10.21105/joss.01896\nPernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8\n---\nGenerated by MOABB 1.4.3 (Mother of All BCI Benchmarks)\nhttps://github.com/NeuroTechX/moabb","recording_modality":["eeg"],"senior_author":null,"sessions":["0train","1train","2train","3test","4test"],"size_bytes":23698887,"source":"nemar","storage":{"backend":"nemar","base":"s3://nemar/nm000135","raw_key":"dataset_description.json","dep_keys":["README.md","participants.json","participants.tsv"]},"study_design":null,"study_domain":null,"tasks":["imagery"],"timestamps":{"digested_at":"2026-04-30T14:08:38.585744+00:00","dataset_created_at":null,"dataset_modified_at":"2026-04-17T13:44:25Z"},"total_files":5,"computed_title":"BNCI 2014-004 Motor Imagery dataset","nchans_counts":[{"val":3,"count":5}],"sfreq_counts":[{"val":250.0,"count":5}],"stats_computed_at":"2026-05-01T13:49:34.644974+00:00","total_duration_s":10267.752,"tagger_meta":{"config_hash":"3557b68bca409f28","metadata_hash":"fc3974991e555444","model":"openai/gpt-5.2","tagged_at":"2026-04-07T09:32:40.872789+00:00"},"tags":{"pathology":["Healthy"],"modality":["Visual"],"type":["Motor"],"confidence":{"pathology":0.85,"modality":0.9,"type":0.9},"reasoning":{"few_shot_analysis":"Most similar few-shot example: \"EEG Motor Movement/Imagery Dataset\" (Schalk et al.). It is a motor execution/imagery paradigm with on-screen targets/cues, labeled as Pathology=Healthy, Modality=Visual, Type=Motor. 
This guides the convention that motor-imagery BCI datasets are typed as Motor, while Modality is assigned based on the stimulus/cue channel (often visual) rather than the participant’s imagined movement.\nA second supporting convention appears in other task-based datasets (e.g., gambling/RL) where Modality follows presented stimuli (visual/auditory), not button presses; this matches the instruction to infer modality from stimulus type, not response.","metadata_analysis":"Key metadata facts:\n1) Population: \"Health status: healthy\" and also \"Pathology: Healthy\" (in Tags).\n2) Task/purpose: \"Paradigm: imagery\" and \"Task type: motor_imagery\" with \"Tasks: left_hand_imagery, right_hand_imagery\"; also \"Study domain: brain-computer interface\".\n3) Stimulus channel: \"Stimulus type: arrow_cue\" and \"Feedback type: visual\" plus \"Primary modality: visual\"; additionally \"Stimulus modalities: visual, auditory\" (beep mentioned in methodology: \"fixation cross + beep ... arrow cue\").","paper_abstract_analysis":"No useful paper information. (Only a short dataset abstract restating motor imagery and feedback; no added disambiguation beyond the metadata.)","evidence_alignment_check":"Pathology:\n- Metadata says: \"Health status: healthy\" / \"Pathology: Healthy\".\n- Few-shot suggests: Motor imagery datasets like EEGmmidb are commonly \"Healthy\".\n- Alignment: ALIGN.\n\nModality:\n- Metadata says: \"Stimulus type: arrow_cue\", \"Primary modality: visual\", and \"Stimulus modalities: visual, auditory\".\n- Few-shot suggests: Similar motor imagery cueing datasets are labeled \"Visual\" modality (EEGmmidb example).\n- Alignment: ALIGN (choose dominant/primary stimulus channel = visual).\n\nType:\n- Metadata says: \"Task type: motor_imagery\" and \"Subjects ... 
performed kinesthetic motor imagery of left or right hand movements\".\n- Few-shot suggests: Motor imagery/execution datasets are labeled Type=\"Motor\" (EEGmmidb example).\n- Alignment: ALIGN.","decision_summary":"Top-2 candidates (with head-to-head comparison):\n\nPathology:\n1) Healthy (selected) — explicit: \"Health status: healthy\"; also tag: \"Pathology: Healthy\".\n2) Unknown — would apply if no population info; rejected due to explicit health-status statements.\nAlignment: aligned with few-shot motor-imagery example.\n\nModality:\n1) Visual (selected) — explicit: \"Primary modality: visual\"; supported by \"Stimulus type: arrow_cue\" and \"Feedback type: visual\".\n2) Multisensory — plausible because \"Stimulus modalities: visual, auditory\" and \"beep\" is present; rejected because metadata explicitly states \"Primary modality: visual\" and the task is cue-based with visual arrow cues.\nAlignment: aligned with few-shot convention for motor-imagery datasets using visual cues.\n\nType:\n1) Motor (selected) — explicit: \"Task type: motor_imagery\" and left/right hand imagery tasks.\n2) Perception — could be considered if focusing on cue processing; rejected because the research aim is BCI motor imagery classification/control (\"Study domain: brain-computer interface\", \"Applications: motor_control\").\nAlignment: aligned with few-shot motor/imagery dataset labeling.\n\nConfidence justification quotes/features:\n- Pathology: \"Health status: healthy\"; \"Pathology: Healthy\".\n- Modality: \"Primary modality: visual\"; \"Stimulus type: arrow_cue\"; \"Feedback type: visual\".\n- Type: \"Task type: motor_imagery\"; \"performed kinesthetic motor imagery of left or right hand movements\"; \"Study domain: brain-computer interface\"."}},"canonical_name":null,"name_confidence":0.86,"name_meta":{"suggested_at":"2026-04-14T10:18:35.343Z","model":"openai/gpt-5.2 + openai/gpt-5.4-mini + deterministic_fallback"},"name_source":"canonical","author_year":"Leeb2014"}}