{"success":true,"database":"eegdash","data":{"_id":"69a33a3b897a7725c66f3ee3","dataset_id":"ds007056","associated_paper_doi":null,"authors":["Couperus, J.W.","Bukach, C.M.","Reed, C.L."],"bids_version":"1.8.0","contact_info":["Jane Couperus"],"contributing_labs":null,"data_processed":false,"dataset_doi":"doi:10.18112/openneuro.ds007056.v1.1.1","datatypes":["eeg"],"demographics":{"subjects_count":286,"ages":[18,18,20,18,19,19,19,19,19,18,18,21,18,18,19,19,20,19,20,18,18,19,18,21,19,22,19,20,18,20,18,20,20,19,19,19,18,21,20,19,19,18,19,19,18,20,21,22,18,18,23,19,19,18,18,21,18,20,18,18,18,18,18,18,20,18,19,18,18,18,19,18,18,18,19,19,19,20,19,22,18,21,18,19,18,19,19,30,21,20,19,18,22,20,18,18,18,18,19,18,18,18,19,22,21,19,20,19,21,18,21,19,19,24,21,18,20,20,20,19,20,20,20,21,22,19,22,19,22,18,18,19,20,19,21,20,19,20,19,19,21,19,26,19,22,33,20,20,21,19,19,18,19,20,19,18,18,18,21,19,21,21,18,22,19,24,21,21,19,18,18,19,20,20,19,22,19,22,21,21,21,21,20,19,23,20,21,20,23,21,19,19,20,21,21,19,19,21,19,18,20,21,19,18,19,19,19,18,20,20,19,18,19,22,19,20,19,21,21,19,20,18,18,19,19,18,21,21,21,18,18,21,18,18,19,21,21,18,20,19,20,19,19,20,21,18,20,19,20,20,19,20,20,18,21,21,20,19,19,21,19,21,21,18,21,20,23,19,23,21,21,19,19,19,18,18,22,20],"age_min":18,"age_max":33,"age_mean":19.6294964028777,"species":null,"sex_distribution":{"o":276},"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds007056","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"f5b2c97c92526199fc137a7b71ea6202c23c3df547f02fa6522a27712ca6ab5a","license":"CC0","n_contributing_labs":null,"name":"PURSUE P300 Visual Oddball","readme":"Visual Oddball Experiment from the PURSUE project (pursureerp.com). Data collected from participants at 3 different primarily undergraduate academic institutions (Southern California, Massachusetts, and Virginia) in 2017 and 2018.\nThe task design can be found in the publication by Kappenman et al. (2021). ERP CORE: An open resource for human event-related potential research. NeuroImage, 225, 117465. Details of task are found in the supplementary materials.\nRace Key: \"Levels\": { \"x1\": \"White\", \"x2\": \"Black/African American\", \"x3\": \"Native American\", \"x4\": \"Asian\", \"x5\": \"Pacific Islander\", \"x6\": \"Hispanic/Latino\", \"x7\": \"Other\", \"x8\": \"Prefer not to respond\", \"x9\": \"Chose more than one response\", \"\" : \"empty\" }","recording_modality":["eeg"],"senior_author":"Reed, C.L.","sessions":[],"size_bytes":8413923956,"source":"openneuro","storage":{"backend":"s3","base":"s3://openneuro.org/ds007056","raw_key":"dataset_description.json","dep_keys":["CHANGES","README","datacite.yml","participants.json","participants.tsv","task-VisualOddball_events.json"]},"study_design":null,"study_domain":null,"tasks":["VisualOddball"],"timestamps":{"digested_at":"2026-04-22T12:29:58.258810+00:00","dataset_created_at":"2025-12-10T14:00:33.648Z","dataset_modified_at":"2025-12-11T14:40:26.000Z"},"total_files":286,"computed_title":"PURSUE P300 Visual Oddball","nchans_counts":[{"val":32,"count":286}],"sfreq_counts":[{"val":500.0,"count":286}],"stats_computed_at":"2026-04-22T23:16:00.312502+00:00","total_duration_s":125524.55,"tagger_meta":{"config_hash":"3557b68bca409f28","metadata_hash":"f5e42e2b8d3e46c0","model":"openai/gpt-5.2","tagged_at":"2026-04-07T09:32:40.872789+00:00"},"tags":{"pathology":["Healthy"],"modality":["Visual"],"type":["Attention"],"confidence":{"pathology":0.7,"modality":0.9,"type":0.8},"reasoning":{"few_shot_analysis":"Most similar few-shot by paradigm is the \"Cross-modal Oddball Task\" example (oddball design; target/standard/oddball structure), which illustrates the convention that oddball paradigms are primarily categorized by the attentional/cognitive-control construct rather than by the motor response mechanics. Another related convention is seen in the \"EEG: Three-Stim Auditory Oddball...\" example, showing oddball datasets are strongly tied to P300/oddball-style attention constructs (even when auditory). These examples guide labeling this dataset's Type toward Attention rather than Perception, and Modality toward the stimulus channel (here explicitly visual). For Pathology, several few-shots demonstrate that when no clinical recruitment is stated, datasets are labeled Healthy rather than inferring a disorder.","metadata_analysis":"Key quoted metadata facts:\n1) Task/paradigm: \"PURSUE P300 Visual Oddball\" and \"Visual Oddball Experiment from the PURSUE project\".\n2) Context/population source: \"Data collected from participants at 3 different primarily undergraduate academic institutions ... in 2017 and 2018\".\n3) Task name: tasks includes \"VisualOddball\".\nNo explicit clinical diagnosis/recruitment is mentioned anywhere in the provided metadata; the participants are described only by basic demographics (\"Age range: 18-33\").","paper_abstract_analysis":"No useful paper information. (Only a citation to Kappenman et al. (2021) / ERP CORE is provided, but no abstract content is included in the metadata here.)","evidence_alignment_check":"Pathology:\n- Metadata says: participants are from \"primarily undergraduate academic institutions\" with \"Age range: 18-33\" and no diagnosis mentioned.\n- Few-shot pattern suggests: in non-clinical student/volunteer ERP datasets without a disorder, label as Healthy.\n- Alignment: ALIGN (no clinical-population facts to override).\n\nModality:\n- Metadata says: \"P300 Visual Oddball\" and \"Visual Oddball Experiment\" and task \"VisualOddball\".\n- Few-shot pattern suggests: oddball modality follows stimulus channel; visual oddball -> Visual.\n- Alignment: ALIGN.\n\nType:\n- Metadata says: \"Visual Oddball Experiment\" with \"P300\" focus.\n- Few-shot pattern suggests: oddball/P300 paradigms are typically categorized as Attention (target detection/oddball processing) rather than Perception unless the stated aim is sensory discrimination.\n- Alignment: ALIGN (no contrary statement indicating perceptual-threshold or discrimination as the primary construct).","decision_summary":"Top-2 candidates (with head-to-head selection):\n\nPathology:\n1) Healthy (selected): Supported by \"participants at ... primarily undergraduate academic institutions\" and no mention of any diagnosis; typical normative ERP participant pool.\n2) Unknown (runner-up): Because metadata does not explicitly state \"healthy controls\".\nDecision: Healthy wins because the dataset is clearly a general undergraduate sample with no clinical recruitment indicated.\nConfidence basis: 1 strong contextual quote + absence of any clinical-population facts.\n\nModality:\n1) Visual (selected): Explicit in title/description: \"P300 Visual Oddball\" and \"Visual Oddball Experiment\"; task \"VisualOddball\".\n2) Unknown (runner-up): Only as a fallback if modality were not stated (but it is).\nDecision: Visual wins by explicit labeling.\nConfidence basis: 2+ explicit mentions of visual/visual oddball.\n\nType:\n1) Attention (selected): Oddball/P300 paradigms primarily probe attention/target detection; matches few-shot oddball convention.\n2) Perception (runner-up): Possible if interpreted as simple visual stimulus processing, but oddball/P300 is more classically attentional.\nDecision: Attention wins because P300 oddball is conventionally an attentional/target-detection ERP paradigm.\nConfidence basis: explicit oddball/P300 paradigm + strong few-shot convention match, but no abstract/extra task-detail text here to further narrow aims."}},"canonical_name":null,"name_confidence":0.61,"name_meta":{"suggested_at":"2026-04-14T10:18:35.343Z","model":"openai/gpt-5.2 + openai/gpt-5.4-mini + deterministic_fallback"},"name_source":"author_year","author_year":"Couperus2025_P300"}}