{"success":true,"database":"eegdash","data":{"_id":"69a33a3b897a7725c66f3ee6","dataset_id":"ds007137","associated_paper_doi":null,"authors":["Couperus, J.W.","Bukach, C.M.","Reed, C.L."],"bids_version":"1.8.0","contact_info":["Jane Couperus"],"contributing_labs":null,"data_processed":false,"dataset_doi":"doi:10.18112/openneuro.ds007137.v1.0.0","datatypes":["eeg"],"demographics":{"subjects_count":294,"ages":[22,19,19,20,18,19,18,20,18,19,19,19,19,19,18,18,18,19,21,18,18,19,19,19,20,18,18,19,18,21,19,22,19,20,18,20,18,20,20,19,19,19,18,21,20,19,19,18,19,19,18,20,21,22,18,18,23,19,19,18,18,21,18,20,18,18,18,18,18,18,20,18,19,18,18,18,19,18,18,18,19,19,19,20,19,22,18,21,18,18,19,18,19,19,30,18,21,20,21,18,19,18,20,22,20,18,18,18,18,19,18,18,18,19,22,21,19,20,19,21,18,21,19,19,24,21,18,20,20,20,19,20,20,20,21,22,19,22,22,19,22,18,18,19,20,19,21,20,19,20,19,19,19,21,19,26,19,22,33,20,20,21,19,19,18,19,20,19,22,18,18,18,21,19,21,21,18,22,19,24,21,21,19,18,18,19,20,20,19,22,19,22,21,21,21,21,20,19,23,20,21,20,23,21,19,19,20,21,21,19,19,21,19,18,20,21,19,18,19,19,19,18,20,20,19,18,19,22,19,19,20,19,21,21,19,20,18,18,19,19,18,21,21,21,18,18,21,18,18,19,21,21,18,20,19,20,19,19,20,21,18,20,19,20,20,19,20,20,18,21,21,20,19,19,21,19,21,21,18,21,20,23,19,23,21,21,19,19,19,18,18,22,20],"age_min":18,"age_max":33,"age_mean":19.62457337883959,"species":null,"sex_distribution":{"o":293},"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds007137","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"8ddf5deedf6444c52c0a0012ace38b91b4fc126b35900b29e4d1caa3126b5b16","license":"CC0","n_contributing_labs":null,"name":"PURSUE N2pc Visual Search","readme":"Visual Search Experiment from the PURSUE project (pursureerp.com). Data collected from participants at 3 different primarily undergraduate academic institutions (Southern California, Massachusetts, and Virginia) in 2017 and 2018.\nThe task design can be found in the publication by Kappenman et al. (2021). ERP CORE: An open resource for human event-related potential research. NeuroImage, 225, 117465. Details of the task are found in the supplementary materials.\nRace Key: \"Levels\": { \"x1\": \"White\", \"x2\": \"Black/African American\", \"x3\": \"Native American\", \"x4\": \"Asian\", \"x5\": \"Pacific Islander\", \"x6\": \"Hispanic/Latino\", \"x7\": \"Other\", \"x8\": \"Prefer not to respond\", \"x9\": \"Chose more than one response\", \"\" : \"empty\" }","recording_modality":["eeg"],"senior_author":"Reed, C.L.","sessions":[],"size_bytes":13143663368,"source":"openneuro","storage":{"backend":"s3","base":"s3://openneuro.org/ds007137","raw_key":"dataset_description.json","dep_keys":["CHANGES","README","participants.json","participants.tsv","task-VisualSearch_events.json"]},"study_design":null,"study_domain":null,"tasks":["VisualSearch"],"timestamps":{"digested_at":"2026-04-22T12:30:03.127807+00:00","dataset_created_at":"2025-12-27T16:32:55.898Z","dataset_modified_at":"2025-12-28T21:03:16.000Z"},"total_files":294,"computed_title":"PURSUE N2pc Visual Search","nchans_counts":[{"val":32,"count":294}],"sfreq_counts":[{"val":500.0,"count":294}],"stats_computed_at":"2026-04-22T23:16:00.312540+00:00","total_duration_s":195987.17,"tagger_meta":{"config_hash":"3557b68bca409f28","metadata_hash":"f072d624fcb64152","model":"openai/gpt-5.2","tagged_at":"2026-04-07T09:32:40.872789+00:00"},"tags":{"pathology":["Healthy"],"modality":["Visual"],"type":["Attention"],"confidence":{"pathology":0.7,"modality":0.8,"type":0.8},"reasoning":{"few_shot_analysis":"Most similar few-shot example by paradigm/construct is the TBI DPX cognitive control dataset (few-shot: \"EEG: DPX Cog Ctl Task in Acute Mild TBI\"), which is labeled Type=Attention for a task designed to elicit attention/cognitive control ERPs. Although PURSUE is not clinical, it is also an ERP task centered on visual selection (N2pc), which conventionally maps to Attention rather than Perception. The schizophrenia visual motion discrimination example (Type=Perception) guides the contrast: discrimination of sensory evidence (left/right motion) is Perception, whereas visual search/target selection is more squarely Attention.","metadata_analysis":"Key facts from provided metadata:\n- Task/paradigm: \"PURSUE N2pc Visual Search\" and \"Visual Search Experiment from the PURSUE project\" (readme). Also task list: \"VisualSearch\".\n- Population: participants are typical undergraduates/young adults with no stated diagnosis: \"participants at 3 different primarily undergraduate academic institutions\" and \"Age range: 18-33\" with \"Subjects: 294\" (participants_overview).\n- Modality/stimuli: explicitly \"Visual Search\" and dataset title includes \"Visual Search\" and \"N2pc\" (a visual-spatial attention ERP component), supporting Visual modality.","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology:\n1) Metadata says: \"participants at 3 different primarily undergraduate academic institutions\" and \"Age range: 18-33\" (no diagnosis mentioned).\n2) Few-shot pattern suggests: undergraduate/typical participants without clinical descriptors map to Healthy (seen across multiple healthy examples).\n3) Alignment: ALIGN.\n\nModality:\n1) Metadata says: \"Visual Search Experiment\" and task name \"VisualSearch\".\n2) Few-shot pattern suggests: tasks explicitly described as visual (e.g., visual discrimination; visual search) map to Visual modality.\n3) Alignment: ALIGN.\n\nType:\n1) Metadata says: \"N2pc Visual Search\" / \"Visual Search Experiment\" (N2pc is classically tied to selective visual attention during search).\n2) Few-shot pattern suggests: tasks probing selection/control (e.g., DPX cognitive control) are labeled Attention; simple sensory discrimination is Perception.\n3) Alignment: ALIGN (visual search/N2pc fits Attention better than Perception).","decision_summary":"Top-2 candidates and decision:\n\nPathology:\n- Healthy (winner): No recruitment based on disorder; \"primarily undergraduate academic institutions\" and \"Age range: 18-33\"; large normative sample \"Subjects: 294\".\n- Unknown (runner-up): Could be considered if recruitment criteria were missing, but metadata context strongly implies non-clinical.\nFinal: Healthy. Confidence=0.7 (explicit non-clinical context but no direct \"healthy\" statement).\n\nModality:\n- Visual (winner): \"Visual Search Experiment\"; task \"VisualSearch\"; title includes \"Visual Search\"/\"N2pc\".\n- Unknown (runner-up): Only if stimulus channel were not stated, but it is explicit.\nFinal: Visual. Confidence=0.8 (multiple explicit metadata cues).\n\nType:\n- Attention (winner): Visual search and N2pc are selective-attention constructs (target selection among distractors); aligns with few-shot convention where control/selection paradigms map to Attention.\n- Perception (runner-up): Could apply if the focus were purely sensory discrimination, but \"Visual Search\"/N2pc more directly indexes attentional selection.\nFinal: Attention. Confidence=0.8 (explicit paradigm + strong ERP/construct linkage and few-shot support)."}},"canonical_name":null,"name_confidence":0.86,"name_meta":{"suggested_at":"2026-04-14T10:18:35.343Z","model":"openai/gpt-5.2 + openai/gpt-5.4-mini + deterministic_fallback"},"name_source":"author_year","author_year":"Couperus2025_N2PC"}}
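A record like the one above can be consumed with the standard library's `json` module. The sketch below is illustrative only: the field names are taken directly from the record, but `resp` is an inline excerpt standing in for the raw response body a real client would fetch from the API.

```python
import json

# Minimal excerpt of the response above; a real client would read the
# full body from the API instead of this inline sample.
resp = """{"success": true, "database": "eegdash", "data": {
  "dataset_id": "ds007137",
  "demographics": {"subjects_count": 294, "age_min": 18, "age_max": 33},
  "nchans_counts": [{"val": 32, "count": 294}],
  "sfreq_counts": [{"val": 500.0, "count": 294}],
  "total_duration_s": 195987.17}}"""

record = json.loads(resp)
assert record["success"]  # API-level status flag

data = record["data"]
demo = data["demographics"]
print(f"{data['dataset_id']}: {demo['subjects_count']} subjects, "
      f"ages {demo['age_min']}-{demo['age_max']}")

# Per-recording stats are stored as {val, count} histograms,
# e.g. sampling rate and channel count across all 294 files.
for entry in data["sfreq_counts"]:
    print(f"sampling rate {entry['val']} Hz in {entry['count']} recordings")
```

Note that the `readme` and `reasoning` values in the full record contain `\n` escapes, which `json.loads` unescapes into real newlines automatically.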