{"success":true,"database":"eegdash","data":{"_id":"6953f4249276ef1ee07a341c","dataset_id":"ds005841","associated_paper_doi":null,"authors":["Elena Karakashevska","Alexis Makin","Michael Batterley"],"bids_version":"1.6.0","contact_info":["Elena Karakashevska"],"contributing_labs":null,"data_processed":false,"dataset_doi":"doi:10.18112/openneuro.ds005841.v1.0.0","datatypes":["eeg"],"demographics":{"subjects_count":48,"ages":[18,18,18,18,18,19,19,18,19,18,19,18,18,26,18,18,55,18,21,19,21,26,18,18,19,26,38,18,18,19,18,19,18,19,18,18,18,18,18,18,26,19,19,18,19,20,27,19],"age_min":18,"age_max":55,"age_mean":20.479166666666668,"species":null,"sex_distribution":{"f":39,"m":9},"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds005841","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"155f3fc5afa074b8ce76690c5883854b9ef76b9eb0c7e75bd882633891197461","license":"CC0","n_contributing_labs":null,"name":"EEG Experiment measuring ERPs in VR","readme":"# EEG Experiment Measuring ERPs in VR\nThis dataset contains EEG recordings from a study investigating event-related potentials (ERPs) during different visual tasks in virtual reality.\n## Study Design\n- **Participants**: 48 participants\n- **Tasks**:\n  - Lumfront\n  - Lumperp\n  - Regfront\n  - Regperp\n  - Signalscreen\n  - Signalvr\n- **Modality**: EEG (512 Hz sampling rate)\n## Dataset Organization\nThe dataset follows the BIDS specification (version 1.6.0). Each subject folder contains EEG recordings and associated metadata.\n## Funding and Acknowledgements\nThis study was supported by a doctoral studentship awarded to EK. We thank the participants for their time.","recording_modality":["eeg"],"senior_author":"Michael Batterley","sessions":[],"size_bytes":7889278693,"source":"openneuro","study_design":null,"study_domain":null,"tasks":["lumfront","lumperp","regfront","regperp","signalscreen","signalvr"],"timestamps":{"digested_at":"2026-04-22T12:28:55.420504+00:00","dataset_created_at":"2025-01-14T13:48:40.783Z","dataset_modified_at":"2025-01-14T18:15:10.000Z"},"total_files":288,"storage":{"backend":"s3","base":"s3://openneuro.org/ds005841","raw_key":"dataset_description.json","dep_keys":["CHANGES","README","events.json","participants.json","participants.tsv"]},"tagger_meta":{"config_hash":"4a051be509a0e3d0","metadata_hash":"7c1426487d24c789","model":"openai/gpt-5.2","tagged_at":"2026-01-20T18:40:04.534495+00:00"},"tags":{"pathology":["Healthy"],"modality":["Visual"],"type":["Perception"],"confidence":{"pathology":0.8,"modality":0.8,"type":0.7},"reasoning":{"few_shot_analysis":"Closest few-shot conventions are the visually driven ERP/perceptual paradigms labeled as Visual+Perception (e.g., the schizophrenia dataset that uses a \"visual discrimination task\" and is labeled with modality=Visual and type=Perception). This guides mapping a stimulus-driven visual ERP paradigm (even if in VR) to Modality=Visual and Type=Perception rather than Motor/Resting-state. For pathology, multiple few-shots show that when no disorder recruitment is stated, the convention is Pathology=Healthy.","metadata_analysis":"Key metadata facts from the README: (1) Purpose/construct: \"investigating event-related potentials (ERPs) during different visual tasks in virtual reality.\" (2) Population description lacks any diagnosis: \"Participants: 48 participants\" with no mention of patients/clinical groups. 
(3) Stimulus channel: \"different visual tasks in virtual reality\" and task names include \"Signalscreen\" and \"Signalvr\" (both suggesting visually presented signals).","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology: Metadata says only \"Participants: 48 participants\" with no clinical recruitment stated; few-shot pattern suggests Healthy when no disorder is named; ALIGN.\nModality: Metadata says \"different visual tasks in virtual reality\"; few-shot pattern maps stimulus-driven visual paradigms to Visual; ALIGN.\nType: Metadata says \"investigating event-related potentials (ERPs) during different visual tasks\" (ERP study of visual processing) but does not specify a higher-level construct like memory/learning/decision; few-shot pattern suggests sensory-stimulus ERP tasks are typically labeled Perception; ALIGN (with some residual ambiguity vs Attention).","decision_summary":"Top-2 candidates per category:\n- Pathology: (1) Healthy vs (2) Unknown. Evidence for Healthy: no diagnosis/clinical recruitment mentioned (\"Participants: 48 participants\"); study framed as generic ERP/VR task study. Winner: Healthy. Alignment: aligns with few-shot convention. Confidence: moderate-high because absence of any pathology is clear.\n- Modality: (1) Visual vs (2) Multisensory. Evidence for Visual: \"different visual tasks in virtual reality\"; tasks named \"Signalscreen\"/\"Signalvr\"; no mention of auditory/tactile stimuli. Winner: Visual. Alignment: aligns with few-shot convention for visual ERP tasks. Confidence: moderate-high.\n- Type: (1) Perception vs (2) Attention. Evidence for Perception: explicit ERP study during \"visual tasks\" (sensory/visual processing focus); no explicit attentional manipulation described. Evidence for Attention: task labels like front/perp (front vs peripheral) could imply spatial attention, but not stated. Winner: Perception. Alignment: aligns with few-shot convention mapping visual ERP paradigms to Perception when no other cognitive construct is specified. Confidence: moderate due to limited task detail."}},"computed_title":"EEG Experiment measuring ERPs in VR","nchans_counts":[],"sfreq_counts":[{"val":512.0,"count":288}],"stats_computed_at":"2026-04-22T23:16:00.310967+00:00","total_duration_s":null,"author_year":"Karakashevska2025","size_human":"7.3 GB","canonical_name":null}}
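
The record above is plain JSON, so the standard library is enough to pull out the summary fields. A minimal sketch, assuming the response has been saved locally as ds005841.json (the file name and the saving step are not part of the record):

```python
import json

# Parse the API response shown above (assumed to be saved as "ds005841.json").
with open("ds005841.json") as fh:
    record = json.load(fh)["data"]

print(record["dataset_id"])                    # ds005841
print(record["name"])                          # EEG Experiment measuring ERPs in VR
print(record["tasks"])                         # ['lumfront', 'lumperp', 'regfront', ...]
print(record["sfreq_counts"][0])               # {'val': 512.0, 'count': 288} -> 512 Hz in all 288 files
print(record["external_links"]["source_url"])  # https://openneuro.org/datasets/ds005841
```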
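
The demographics block stores both the raw ages array and derived summaries (age_min, age_max, age_mean, sex_distribution), so those summaries can be checked directly against the record. The size_human string appears to be size_bytes divided by 1024**3 and rounded to one decimal. A sketch that verifies both, reusing record from the snippet above:

```python
# Recompute the demographic summaries and the human-readable size from the raw fields.
demo = record["demographics"]
ages = demo["ages"]

assert len(ages) == demo["subjects_count"]                                # 48
assert min(ages) == demo["age_min"]                                       # 18
assert max(ages) == demo["age_max"]                                       # 55
assert abs(sum(ages) / len(ages) - demo["age_mean"]) < 1e-9               # 20.4791...
assert sum(demo["sex_distribution"].values()) == demo["subjects_count"]   # 39 f + 9 m

# "size_human" looks like binary gigabytes (bytes / 1024**3) rounded to one decimal.
print(f'{record["size_bytes"] / 1024**3:.1f} GB')                         # 7.3 GB
```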
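
The timestamps mix two ISO 8601 spellings: explicit +00:00 offsets (digested_at, tagged_at, stats_computed_at) and a trailing Z (dataset_created_at, dataset_modified_at). A small normalisation step makes both parseable with datetime.fromisoformat; again reusing record from the first snippet:

```python
from datetime import datetime

def parse_ts(value: str) -> datetime:
    # Older fromisoformat() versions reject a trailing "Z", so rewrite it as an explicit offset.
    return datetime.fromisoformat(value.replace("Z", "+00:00"))

ts = record["timestamps"]
created = parse_ts(ts["dataset_created_at"])    # 2025-01-14 13:48:40.783000+00:00
modified = parse_ts(ts["dataset_modified_at"])  # 2025-01-14 18:15:10+00:00
digested = parse_ts(ts["digested_at"])          # 2026-04-22 12:28:55.420504+00:00

print(modified - created)   # time between dataset creation and last modification on OpenNeuro
print(digested - created)   # lag until eegdash ingested the dataset
```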
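
The storage block points at the public OpenNeuro S3 bucket (base s3://openneuro.org/ds005841), with raw_key and dep_keys naming the descriptor and sidecar files. A hedged sketch of fetching those files anonymously with boto3; it assumes the bucket permits unsigned reads and that the listed keys are relative to the dataset prefix, which the record implies but does not state:

```python
# Anonymous download of the files referenced by storage.raw_key / storage.dep_keys.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

BUCKET = "openneuro.org"   # from storage.base: s3://openneuro.org/ds005841
PREFIX = "ds005841"

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

for key in ["dataset_description.json",                        # storage.raw_key
            "CHANGES", "README", "events.json",
            "participants.json", "participants.tsv"]:          # storage.dep_keys
    body = s3.get_object(Bucket=BUCKET, Key=f"{PREFIX}/{key}")["Body"].read()
    print(f"{key}: {len(body)} bytes")
```

If any listed key is absent at the dataset root, get_object raises a client error, so a more careful version would list objects under the prefix before fetching.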