{"success":true,"database":"eegdash","data":{"_id":"6953f4249276ef1ee07a3477","dataset_id":"ds006940","associated_paper_doi":null,"authors":["Shantanu Sarkar","Kevin Nathan","Jose L. Contreras-Vidal"],"bids_version":"1.8.0","contact_info":["Shantanu Sarkar"],"contributing_labs":null,"data_processed":true,"dataset_doi":"doi:10.18112/openneuro.ds006940.v1.0.0","datatypes":["eeg"],"demographics":{"subjects_count":7,"ages":[],"age_min":null,"age_max":null,"age_mean":null,"species":null,"sex_distribution":null,"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds006940","osf_url":null,"github_url":null,"paper_url":null},"funding":["UH–Methodist Graduate Fellowship for Translational Research","NSF IUCRC BRAIN Center Award #2137255"],"ingestion_fingerprint":"e8cfda16f85a6c5931e74c1136cb381a78a8aab5d1c5a89e24af465a912b31ac","license":"CC0","n_contributing_labs":null,
"name":"Dataset: EEG-Controlled Exoskeleton for Walking and Standing - A Longitudinal Study of Healthy Individuals","readme":"EEG-Controlled Exoskeleton for Walking and Standing\nA Longitudinal Motor Imagery Study in Healthy Adults\nDataset Overview\nThis dataset contains multimodal recordings from a brain–machine interface (BMI) training study involving seven healthy adult participants (ages 20–30, Mean = 24.3, SD = 3.8). The study focused on open-loop and closed-loop control of a lower-limb exoskeleton (Rex Bionics) using EEG and inertial sensor data. Each participant completed nine sessions over several weeks, structured into training and trial phases.\nExperimental Design\n* Participants: 7 healthy adults (4 male, 3 female)\n* Sessions: 9 per participant\n* Training Phase: Motor imagery calibration\n* Trial Phase: Closed-loop BMI control (walk/stop)\n* Conditions: Walk / Stop (motor imagery)\nTask Structure and Naming Convention\nEach session includes multiple motor imagery tasks organized as follows:\nTraining: The training phase is used to calibrate the BMI decoder. Participants perform motor imagery tasks without feedback.\nTrialXX:\nThe trial phase consists of 12 closed-loop BMI trials per session, labeled trial01 to trial12. During these trials, participants use motor imagery to control the exoskeleton in real time.\nBlock 1: Trials 1–4\nBlock 2: Trials 5–8\nBlock 3: Trials 9–12\nwalk6min / stop6min:\nAfter completing the 12 trials, participants perform two extended motor imagery tasks:\nwalk6min – Imagining continuous walking for 6 minutes\nstop6min – Imagining standing still for 6 minutes\nData Modalities\n* EEG: 60 scalp channels + 4 EOG channels\n* IMU: 3-axis accelerometer, gyroscope, magnetometer, and quaternion\n* Sensor Placement: IMUs mounted on participant forehead and exosuit back brace\n* Decoder Signals/Feedback: Logged control signals and BMI predictions\nAdditional Materials\n* MIQ-RS: Motor Imagery Questionnaire – Revised Second Version (PDFs in derivatives/MIQ-RS/)\n* Validation Tables: Data availability, synchronization, and electrode placement (derivatives/validation/)\n* Raw Data: Provided without filtering or artifact removal\nBIDS Structure\n* dataset_description.json: Metadata and provenance\n* sub-XX/ses-YY/: EEG and IMU recordings per session\n* derivatives/: MIQ-RS responses and validation spreadsheets","recording_modality":["eeg"],"senior_author":"Jose L. Contreras-Vidal",
"sessions":["01","02","03","04","05","06","07","08","09"],"size_bytes":3883627381,"source":"openneuro","study_design":null,"study_domain":null,"tasks":["stop6min","training","trial01","trial02","trial03","trial04","trial05","trial06","trial07","trial08","trial09","trial10","trial11","trial12","walk6min"],"timestamps":{"digested_at":"2026-04-22T12:29:51.555926+00:00","dataset_created_at":"2025-11-15T17:02:22.664Z","dataset_modified_at":"2025-11-15T23:11:01.000Z"},"total_files":935,"storage":{"backend":"s3","base":"s3://openneuro.org/ds006940","raw_key":"dataset_description.json","dep_keys":["CHANGES","README.md"]},"tagger_meta":{"config_hash":"4a051be509a0e3d0","metadata_hash":"09047ebb176c11aa","model":"openai/gpt-5.2","tagged_at":"2026-01-20T19:15:18.323557+00:00"},
"tags":{"pathology":["Healthy"],"modality":["Motor"],"type":["Motor"],"confidence":{"pathology":0.8,"modality":0.7,"type":0.85},"reasoning":{"few_shot_analysis":"Most similar few-shot reference is the \"EEG Motor Movement/Imagery Dataset\" example (Healthy, Motor-focused motor/imagery paradigm). That example shows the convention that motor imagery paradigms are labeled with Type=Motor. Its Modality was set to Visual because the task explicitly used on-screen targets; in the current dataset, cues/stimuli are not described as visual, so we should not copy the Visual modality convention and instead map the task’s dominant channel to Motor given the explicit motor imagery/exoskeleton control focus.","metadata_analysis":"Key metadata facts:\n1) Population: \"seven healthy adult participants (ages 20–30\" and \"Participants: 7 healthy adults\".\n2) Paradigm: \"A Longitudinal Motor Imagery Study in Healthy Adults\" and \"Training Phase: Motor imagery calibration\".\n3) Closed-loop BMI/movement context: \"closed-loop control of a lower-limb exoskeleton (Rex Bionics) using EEG\" and \"participants use motor imagery to control the exoskeleton in real time.\"","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology:\n- Metadata says: \"seven healthy adult participants\" / \"Participants: 7 healthy adults\".\n- Few-shot suggests: motor imagery datasets often use Healthy if volunteers.\n- Alignment: ALIGN.\n\nModality:\n- Metadata says: \"Motor imagery\" for \"Walk / Stop (motor imagery)\" and exoskeleton walking/standing control; no explicit auditory/visual/tactile stimulus description.\n- Few-shot suggests: motor imagery tasks can be Motor modality, but may be Visual when explicit screen targets are present (as in the motor imagery example).\n- Alignment: PARTIAL; few-shot indicates Visual only when visual stimuli are explicit, which is not the case here. Metadata-driven inference favors Motor.\n\nType:\n- Metadata says: \"Motor Imagery Study\" and \"Motor imagery calibration\" with BMI control of walking/standing.\n- Few-shot suggests: motor imagery paradigms map to Type=Motor.\n- Alignment: ALIGN.","decision_summary":"Top-2 candidates and selection:\n\nPathology:\n1) Healthy — Supported by: \"seven healthy adult participants\"; \"Participants: 7 healthy adults\".\n2) Unknown — would apply if no recruitment info were provided.\nDecision: Healthy (clear explicit recruitment description). Confidence supported by 2 explicit quotes.\n\nModality:\n1) Motor — Supported by: \"Walk / Stop (motor imagery)\"; \"Motor imagery calibration\"; \"control of a lower-limb exoskeleton... using EEG\".\n2) Multisensory — plausible because closed-loop exoskeleton control likely involves visual/proprioceptive feedback, but not explicitly described as stimuli.\nDecision: Motor (motor imagery is the dominant task/stimulus channel described; no explicit external sensory stimuli detailed). Confidence is moderate due to some inference about absence of explicit cues.\n\nType:\n1) Motor — Supported by: \"Longitudinal Motor Imagery Study\"; \"Motor imagery calibration\"; \"use motor imagery to control the exoskeleton\".\n2) Clinical/Intervention — could apply if it were a rehabilitation/clinical cohort intervention, but participants are healthy and focus is BMI training.\nDecision: Motor (primary construct is motor imagery/BMI motor control). High confidence supported by multiple explicit task-description quotes."}},
"computed_title":"Dataset: EEG-Controlled Exoskeleton for Walking and Standing - A Longitudinal Study of Healthy Individuals","nchans_counts":[{"val":64,"count":935}],"sfreq_counts":[{"val":100.0,"count":935}],"stats_computed_at":"2026-04-22T23:16:00.312191+00:00","total_duration_s":122479.48,"author_year":"Sarkar2025_StudyOF","canonical_name":null}}