{"success":true,"database":"eegdash","data":{"_id":"6953f4249276ef1ee07a342b","dataset_id":"ds005953","associated_paper_doi":null,"authors":["Jonathan Winawer","Dora Hermes"],"bids_version":"1.0.2","contact_info":["Dora Hermes"],"contributing_labs":null,"data_processed":true,"dataset_doi":"doi:10.18112/openneuro.ds005953.v1.0.0","datatypes":["ieeg"],"demographics":{"subjects_count":2,"ages":[],"age_min":null,"age_max":null,"age_mean":null,"species":null,"sex_distribution":null,"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds005953","osf_url":null,"github_url":null,"paper_url":null},"funding":["This research has received funding from the National Institute Of Mental Health of the National Institutes of Health under Award Number R01MH111417"],"ingestion_fingerprint":"053a80c804431a0f1916f8108e3bc57b7328b81bfef61de66f6ad222546b63fc","license":"CC0","n_contributing_labs":null,"name":"iEEG_visual","readme":"Information\n===========\nThis folder contains the ECoG data from 2 subjects performing a visual task used in the publications of Hermes et al., 2015, Cerebral Cortex \"Stimulus Dependence of Gamma Oscillations in Human Visual Cortex\" and Hermes et al., 2017, PLOS Biology “Neuronal synchrony and the relation between the blood-oxygen-level dependent response and the local field potential”.\nContact: Dora Hermes (dorahermes@gmail.com)\nCiting this dataset\n--------\nIf you use this data as a part of any publications, please use the following citation:\n[1] Hermes D, Miller KJ, Wandell BA, Winawer J (2015). Stimulus dependence of gamma oscillations in human visual cortex. Cerebral Cortex 25(9):2951-9. https://doi.org/10.1093/cercor/bhu091\n[2] Hermes D, Nguyen M, Winawer J. (2017). Neuronal synchrony and the relation between the BOLD response and the local field potential. PLOS Biology 15(7). 
https://doi.org/10.1371/journal.pbio.2001461\nThis dataset was made available with the support of the Netherlands Organization for Scientific Research www.nwo.nl under award number 016.VENI.178.048 to Dora Hermes and the National Institute Of Mental Health of the National Institutes of Health under Award Number R01MH111417 to Natalia Petridou and Jonathan Winawer. The ECoG data collection was facilitated by the Stanford Human Intracranial Cognitive Electrophysiology Program (SHICEP).\nLicense\n-------\nThis dataset is made available under the Public Domain Dedication and License v1.0,\nwhose full text can be found at http://www.opendatacommons.org/licenses/pddl/1.0/.\nTask Description\n----------------\nSubjects were presented with images on a computer screen. The images spanned about 25x25 degrees of visual angle. Subjects fixated on a dot in the center of the screen that alternated between red and green, changing colors at random times. Subject 1 pressed a button when the fixation dot changed color. Subject 2 fixated on the dot but did not make manual responses because these responses were found to interfere with visual fixation.\nDataset and Stimuli\n===================\nThis data is organized according to the Brain Imaging Data Structure specification, a community-driven specification for organizing neurophysiology data along with its metadata. For more information on this data specification, see https://bids-specification.readthedocs.io/en/stable/\nEach subject has their own folder (e.g., `sub-01`) which contains the raw ECoG data for that subject, as well as the metadata needed to understand the raw data and event timing.
In addition, the `stimuli/` folder contains the .png files of the presented images.\nStimuli\n-------\nStimuli included high contrast vertical gratings (0.16, 0.33, 0.65, or 1.3 duty cycles per degree square wave) and noise patterns (spectral power distributions of k/f^4; k/f^2; and k/f^0).\nRaw data\n--------\nRaw data is stored in the BrainVision data format. This can be read into memory with the following tools:\n* Python: The `pybv` package (https://github.com/bids-standard/pybv)\n* Matlab: BrainVision analyzer (https://www.mathworks.com/products/connections/product_detail/brainvision-analyzer.html)","recording_modality":["ieeg"],"senior_author":"Dora Hermes","sessions":["01"],"size_bytes":605346653,"source":"openneuro","study_design":null,"study_domain":null,"tasks":["visual"],"timestamps":{"digested_at":"2026-04-22T12:29:00.879535+00:00","dataset_created_at":"2025-03-03T15:16:54.906Z","dataset_modified_at":"2025-03-03T17:00:05.000Z"},"total_files":3,"storage":{"backend":"s3","base":"s3://openneuro.org/ds005953","raw_key":"dataset_description.json","dep_keys":["CHANGES","README","participants.tsv"]},"tagger_meta":{"config_hash":"4a051be509a0e3d0","metadata_hash":"3b10774664ad3a99","model":"openai/gpt-5.2","tagged_at":"2026-01-20T18:45:33.035412+00:00"},"tags":{"pathology":["Surgery"],"modality":["Visual"],"type":["Perception"],"confidence":{"pathology":0.6,"modality":0.9,"type":0.8},"reasoning":{"few_shot_analysis":"Most similar few-shot convention is the visual discrimination dataset (Meta-rdk) labeled as Modality=Visual and Type=Perception: it involves visually presented stimuli and the scientific goal is perceptual processing rather than motor responses. Here, the current dataset likewise presents visual stimuli (gratings/noise) while subjects fixate, aligning with the same Visual→Perception mapping convention. 
The motor action (button press on fixation color change) is incidental, similar to how choice responses do not force a Motor modality/type in the few-shot conventions.","metadata_analysis":"Key quoted facts from README: (1) \"ECoG data from 2 subjects performing a visual task\". (2) \"Subjects were presented with images presented on a computer screen.\" (3) \"Subjects fixated on a dot in the center of the screen\" and \"Subject 1 pressed a button when the fixation dot changed color.\" (4) Stimulus description: \"high contrast vertical gratings ... and noise patterns\" and stimuli stored as \"the `stimuli/` folder contains the .png files of the presented images.\" Also context: \"Stanford Human Intracranial Cognitive Electrophysiology Program (SHICEP)\" indicates intracranial recording but does not explicitly name the clinical diagnosis.","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology: Metadata SAYS (quoted) \"ECoG data\" and references \"Intracranial Cognitive Electrophysiology\" but gives no explicit diagnosis (no mention of epilepsy, tumor, etc.). Few-shot pattern SUGGESTS that invasive intracranial recordings are typically collected in clinical/surgical patients, making 'Surgery' plausible. ALIGNMENT: partial (clinical context implied, diagnosis not stated). Decision uses inference due to lack of explicit pathology facts.\nModality: Metadata SAYS \"presented with images\" and lists \"vertical gratings\" and \"noise patterns\". Few-shot pattern SUGGESTS Visual for screen-based image stimuli. ALIGNMENT: yes.\nType: Metadata SAYS stimuli are gratings/noise and publications focus on \"Stimulus Dependence of Gamma Oscillations in Human Visual Cortex\"; task is fixation with image viewing. Few-shot pattern SUGGESTS Perception for sensory stimulus processing tasks (even with button press). 
ALIGNMENT: yes.","decision_summary":"Top-2 candidates per category:\nPathology: (1) Surgery — evidence: intracranial context \"ECoG\" and \"Intracranial ... Program (SHICEP)\" implies recordings obtained in a neurosurgical/clinical setting; (2) Unknown — evidence: no explicit diagnosis stated anywhere in provided metadata. Head-to-head: Surgery slightly stronger due to clear invasive clinical context, but uncertainty remains. (Alignment: partial; inference-only).\nModality: (1) Visual — evidence quotes: \"presented with images\", \"computer screen\", \"vertical gratings ... and noise patterns\"; (2) Resting State — counterpoint: fixation-only could resemble passive viewing, but stimuli are explicit. Winner: Visual. (Alignment: yes).\nType: (1) Perception — evidence quotes: \"Stimulus dependence of gamma oscillations in human visual cortex\" and explicit visual stimuli (gratings/noise) indicate sensory/perceptual processing; (2) Attention — counterpoint: fixation dot color-change detection/button press suggests attentional monitoring, but appears secondary to visual stimulus-response characterization. Winner: Perception. (Alignment: yes).\nConfidence justification: Pathology confidence limited because evidence is contextual inference without explicit diagnosis; Modality/Type higher because multiple explicit stimulus/task quotes support them."}},"computed_title":"iEEG_visual","nchans_counts":[{"val":96,"count":2},{"val":118,"count":1}],"sfreq_counts":[{"val":1525.9,"count":2},{"val":3051.76,"count":1}],"stats_computed_at":"2026-04-22T23:16:00.311161+00:00","total_duration_s":700.9159661701202,"author_year":"Winawer2025","canonical_name":null}}