{"success":true,"database":"eegdash","data":{"_id":"69d16e04897a7725c66f4c77","dataset_id":"nm000142","associated_paper_doi":null,"authors":["Xiaoli Wu","Wenhui Zhang","Zhibo Fu","Roy T.H. Cheung","Rosa H.M. Chan"],"bids_version":"1.9.0","contact_info":null,"contributing_labs":null,"data_processed":true,"dataset_doi":"10.82901/nemar.nm000142","datatypes":["eeg"],"demographics":{"subjects_count":6,"ages":[25,25,25,25,25,25],"age_min":25,"age_max":25,"age_mean":25.0,"species":null,"sex_distribution":null,"handedness_distribution":{"r":6}},"experimental_modalities":null,"external_links":{"source_url":"https://nemar.org/dataexplorer/detail/nm000142","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"1e2a90b5908ee1f34bfcea97f314f8829d6b774d6e02332c97fe15b43294bcfe","license":"CC-BY-4.0","n_contributing_labs":null,"name":"Ear-EEG motor execution dataset from Wu et al 2020","readme":"[![DOI](https://img.shields.io/badge/DOI-10.82901%2Fnemar.nm000142-blue)](https://doi.org/10.82901/nemar.nm000142)\n# Ear-EEG motor execution dataset from Wu et al 2020\nEar-EEG motor execution dataset from Wu et al 2020.\n## Dataset Overview\n- **Code**: Wu2020\n- **Paradigm**: imagery\n- **DOI**: 10.1088/1741-2552/abc1b6\n- **Subjects**: 6\n- **Sessions per subject**: 1\n- **Events**: left_hand=1, right_hand=2\n- **Trial interval**: [0, 4] s\n- **File format**: Curry\n## Acquisition\n- **Sampling rate**: 1000.0 Hz\n- **Number of channels**: 122\n- **Channel types**: eeg=122, misc=10\n- **Montage**: standard_1005\n- **Hardware**: Neuroscan SynAmps2\n- **Reference**: scalp REF\n- **Ground**: scalp GRD\n- **Sensor type**: Ag/AgCl\n- **Line frequency**: 50.0 Hz\n- **Online filters**: {'bandpass': [0.5, 100]}\n## Participants\n- **Number of subjects**: 6\n- **Health status**: healthy\n- **Age**: mean=25.0, min=22.0, max=28.0\n- **Gender distribution**: female=4, male=2\n- **Handedness**: right-handed\n- **Species**: human\n## Experimental Protocol\n- **Paradigm**: imagery\n- **Number of classes**: 2\n- **Class labels**: left_hand, right_hand\n- **Trial duration**: 4.0 s\n- **Study design**: Motor execution (fist clenching) with simultaneous scalp and ear-EEG recording\n- **Feedback type**: none\n- **Stimulus type**: arrow cues\n- **Stimulus modalities**: visual, auditory\n- **Primary modality**: visual\n- **Synchronicity**: synchronous\n- **Mode**: offline\n## HED Event Annotations\nSchema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser\n```\n  left_hand\n    ├─ Sensory-event, Experimental-stimulus, Visual-presentation\n    └─ Agent-action\n       └─ Imagine\n          ├─ Move\n          └─ Left, Hand\n  right_hand\n    ├─ Sensory-event, Experimental-stimulus, Visual-presentation\n    └─ Agent-action\n       └─ Imagine\n          ├─ Move\n          └─ Right, Hand\n```\n## Paradigm-Specific Parameters\n- **Detected paradigm**: motor_imagery\n- **Imagery tasks**: left_hand, right_hand\n## Data Structure\n- **Trials**: 1114\n- **Trials context**: S1: 240, S2: 160, S3: 160, S4: 80, S5: 234, S6: 240 = 1114\n## Signal Processing\n- **Classifiers**: EEGNet\n## Cross-Validation\n- **Evaluation type**: within_subject\n## BCI Application\n- **Applications**: motor_control\n- **Environment**: laboratory\n- **Online feedback**: False\n## Tags\n- **Pathology**: Healthy\n- **Modality**: Motor\n- **Type**: Research\n## Documentation\n- **DOI**: 10.1088/1741-2552/abc1b6\n- **License**: CC-BY-4.0\n- **Investigators**: Xiaoli Wu, Wenhui Zhang, Zhibo Fu, Roy T.H. Cheung, Rosa H.M. 
Chan\n- **Institution**: City University of Hong Kong\n- **Country**: HK\n- **Repository**: Zenodo\n- **Data URL**: https://zenodo.org/records/18961128\n- **Publication year**: 2020\n## References\nWu, X., Zhang, W., Fu, Z., Cheung, R. T. H., & Chan, R. H. M. (2020). An investigation of in-ear sensing for motor task classification. Journal of Neural Engineering, 17(6), 066029. https://doi.org/10.1088/1741-2552/abc1b6\nAppelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software 4: (1896). https://doi.org/10.21105/joss.01896\nPernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8\n---\nGenerated by MOABB 1.4.3 (Mother of All BCI Benchmarks)\nhttps://github.com/NeuroTechX/moabb","recording_modality":["eeg"],"senior_author":null,"sessions":["0"],"size_bytes":5290421629,"source":"nemar","storage":{"backend":"nemar","base":"s3://nemar/nm000142","raw_key":"dataset_description.json","dep_keys":["README.md","participants.json","participants.tsv"]},"study_design":null,"study_domain":null,"tasks":["imagery"],"timestamps":{"digested_at":"2026-04-30T14:08:40.084428+00:00","dataset_created_at":null,"dataset_modified_at":"2026-04-17T14:14:27Z"},"total_files":13,"computed_title":"Ear-EEG motor execution dataset from Wu et al 2020","nchans_counts":[{"val":122,"count":13}],"sfreq_counts":[{"val":1000.0,"count":13}],"stats_computed_at":"2026-05-01T13:49:34.645083+00:00","total_duration_s":14420.187,"tagger_meta":{"config_hash":"3557b68bca409f28","metadata_hash":"c1c807210a958846","model":"openai/gpt-5.2","tagged_at":"2026-04-07T09:32:40.872789+00:00"},"tags":{"pathology":["Healthy"],"modality":["Visual"],"type":["Motor"],"confidence":{"pathology":0.85,"modality":0.9,"type":0.9},"reasoning":{"few_shot_analysis":"Most similar few-shot example: 'EEG Motor Movement/Imagery Dataset' (Schalk et al.). It involves healthy participants performing motor movement/imagery cued by on-screen targets, and is labeled Pathology=Healthy, Modality=Visual, Type=Motor. 
The current dataset is likewise a motor imagery/execution paradigm with arrow (visual) cues in healthy subjects, so the same label mapping convention applies (visual cue modality; motor-focused construct/type).","metadata_analysis":"Key metadata facts:\n- Population: explicitly healthy: \"Health status: healthy\" and also \"Tags\\n- **Pathology**: Healthy\".\n- Task/purpose is motor imagery/execution: \"Paradigm: imagery\", \"Detected paradigm: motor_imagery\", and \"Study design: Motor execution (fist clenching) with simultaneous scalp and ear-EEG recording\".\n- Stimulus modality is primarily visual: \"Stimulus type: arrow cues\", \"Stimulus modalities: visual, auditory\", and \"Primary modality: visual\".\n- Classes/events confirm left vs right hand motor condition: \"Events: left_hand=1, right_hand=2\" and \"Class labels: left_hand, right_hand\".","paper_abstract_analysis":"No useful paper information (no abstract provided in the metadata; only the DOI/citation is listed).","evidence_alignment_check":"Pathology:\n- Metadata says: \"Health status: healthy\" / \"Tags\\n- **Pathology**: Healthy\".\n- Few-shot pattern suggests: motor imagery datasets are typically Healthy unless a clinical diagnosis is stated (e.g., Parkinson's, TBI).\n- ALIGN.\n\nModality:\n- Metadata says: \"Stimulus type: arrow cues\" and \"Primary modality: visual\" (also notes \"Stimulus modalities: visual, auditory\").\n- Few-shot pattern suggests: when cues are on-screen targets/arrows, label Modality=Visual even if responses are motor.\n- ALIGN.\n\nType:\n- Metadata says: \"Study design: Motor execution (fist clenching)...\" and \"Detected paradigm: motor_imagery\".\n- Few-shot pattern suggests: motor execution/imagery as the research focus maps to Type=Motor.\n- ALIGN.","decision_summary":"Top-2 comparative selections:\n\nPathology candidates:\n1) Healthy (WINNER) — supported by \"Health status: healthy\", \"Tags\\n- **Pathology**: Healthy\", and \"Subjects: 6\" with no clinical condition mentioned.\n2) Unknown (runner-up) — would apply if health status were not stated.\nAlignment: Aligns with few-shot convention (motor imagery benchmark datasets typically healthy unless stated otherwise).\nConfidence basis: 2+ explicit health-status/pathology quotes.\n\nModality candidates:\n1) Visual (WINNER) — \"Stimulus type: arrow cues\" + \"Primary modality: visual\".\n2) Multisensory (runner-up) — because \"Stimulus modalities: visual, auditory\" mentions both.\nAlignment: Aligns with few-shot convention to pick the dominant/primary stimulus channel.\nConfidence basis: 3 explicit stimulus/modality quotes; clear primary modality.\n\nType candidates:\n1) Motor (WINNER) — \"Motor execution (fist clenching)\", \"Detected paradigm: motor_imagery\", and \"Paradigm: imagery\" indicate motor/MI is the construct of interest.\n2) Perception (runner-up) — only weakly plausible because cues are sensory, but the study aim is motor classification.\nAlignment: Aligns with few-shot motor imagery example labeled Type=Motor.\nConfidence basis: 3 explicit motor-paradigm/design quotes."}},"canonical_name":null,"name_confidence":0.74,"name_meta":{"suggested_at":"2026-04-14T10:18:35.343Z","model":"openai/gpt-5.2 + openai/gpt-5.4-mini + deterministic_fallback"},"name_source":"author_year","author_year":"Wu2020"}}
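The record above is self-describing enough to sanity-check programmatically. A minimal sketch, assuming the response has been saved locally as nm000142.json (the filename is illustrative, not part of the record): it verifies the per-file channel and sampling-rate tallies against total_files, summarizes size and duration, and flags the age fields, since the demographics block (all ages 25) and the embedded README (min=22.0, max=28.0) disagree.

```python
import json

# Assumes the API response has been saved locally as "nm000142.json".
with open("nm000142.json") as f:
    record = json.load(f)

data = record["data"]
print(data["dataset_id"], "-", data["name"])

# Per-file tallies: all 13 files report 122 channels at 1000.0 Hz.
n_files = data["total_files"]
assert all(c["count"] == n_files for c in data["nchans_counts"])
assert all(c["count"] == n_files for c in data["sfreq_counts"])

# Size/duration summary from the stats fields.
hours = data["total_duration_s"] / 3600
gib = data["size_bytes"] / 2**30
print(f"{n_files} files, {hours:.2f} h of EEG, {gib:.2f} GiB")

# The demographics block (every age 25) contradicts the README's
# "mean=25.0, min=22.0, max=28.0"; flag rather than trust either side.
demo = data["demographics"]
if demo["age_min"] == demo["age_max"]:
    print("warning: ages look placeholder-like; cross-check participants.tsv")
```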
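The embedded README also gives everything needed to epoch the recordings: events left_hand=1 and right_hand=2, a [0, 4] s trial interval, and the task name imagery. A minimal epoching sketch with MNE-BIDS, assuming a local BIDS copy of nm000142; the subject label "1" is hypothetical, session "0" comes from the record's sessions field, and the annotation descriptions are assumed to match the README's event names.

```python
import mne
from mne_bids import BIDSPath, read_raw_bids

# Hypothetical local BIDS root and subject label; task/session per the record.
bids_path = BIDSPath(root="./nm000142", subject="1", session="0",
                     task="imagery", datatype="eeg", suffix="eeg")
raw = read_raw_bids(bids_path)

# Map annotations onto the README's event codes: left_hand=1, right_hand=2.
event_id = {"left_hand": 1, "right_hand": 2}
events, _ = mne.events_from_annotations(raw, event_id=event_id)

# Trial interval [0, 4] s relative to cue onset; no baseline correction.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=0.0, tmax=4.0, baseline=None, preload=True)
print(epochs)
```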
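Finally, the storage block points at s3://nemar/nm000142, with dataset_description.json as the raw key and README.md, participants.json, and participants.tsv as dependent keys. A fetch sketch with boto3, assuming the keys are relative to storage.base and that the nemar bucket permits anonymous reads (the record does not confirm either; supply credentials if unsigned access fails).

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous S3 client; an assumption, not stated by the record.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

bucket, prefix = "nemar", "nm000142"  # from storage.base = s3://nemar/nm000142

# raw_key plus dep_keys, assumed relative to the base prefix.
for key in ["dataset_description.json", "README.md",
            "participants.json", "participants.tsv"]:
    s3.download_file(bucket, f"{prefix}/{key}", key)
```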