{"success":true,"database":"eegdash","data":{"_id":"6953f4249276ef1ee07a3362","dataset_id":"ds004563","associated_paper_doi":null,"authors":["Sophie Smit","Denise Moerel","Regine Zopf","Anina N Rich"],"bids_version":"v1.8.0","contact_info":["Sophie Smit"],"contributing_labs":null,"data_processed":true,"dataset_doi":"doi:10.18112/openneuro.ds004563.v1.0.1","datatypes":["eeg"],"demographics":{"subjects_count":40,"ages":[18,18,21,20,18,40,28,18,28,28,25,18,18,18,25,19,19,27,32,19,17,18,22,20,25,23,19,22,21,23,19,34,18,19,18,20,18,19,20,19],"age_min":17,"age_max":40,"age_mean":21.775,"species":null,"sex_distribution":{"f":27,"m":13},"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds004563","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"6d054ee02574ddffe0335e30996611412503188aeba9692e8399e4c9a1ca62c5","license":"CC0","n_contributing_labs":null,"name":"Vicarious touch: overlapping neural patterns between seeing and feeling touch","readme":"Data collection took place at Macquarie University in Sydney, Australia. The study was approved by the Macquarie University Ethics Committee.\nWe used time-resolved multivariate pattern analysis on whole-brain EEG data from people with and without vicarious touch experiences to test whether seen touch evokes overlapping neural representations with the first-hand experience of touch. Participants felt touch to the fingers (tactile trials) or watched carefully matched videos of touch to another person’s fingers (visual trials).\nThere were 12 runs in total, each divided into four blocks of 36 trials (with alternating sets of nine tactile and nine visual trials), resulting in a total of 1728 trials (864 tactile and 864 visual). There were an additional 240 target trials (20 per run), which were excluded from analysis.\nBetween trials there was an inter-trial interval of 800 ms. 
Each run lasted approximately 7-8 minutes with short breaks between blocks and runs.\nWhole-brain 64-channel EEG data were recorded using a BioSemi ActiveTwo system (BioSemi, Inc.) at 2048 Hz with standard 10-20 caps. Stimuli were presented using MATLAB (MathWorks) and Psychtoolbox (Brainard, 1997). The experiment presentation script, all analysis code, and stimuli are made available (see code and stimuli folder). The data are made available both in raw form (see each participant's file) and after processing (see derivatives).","recording_modality":["eeg"],"senior_author":"Anina N Rich","sessions":["01","02","03"],"size_bytes":108333359388,"source":"openneuro","study_design":null,"study_domain":null,"tasks":["touchdecoding"],"timestamps":{"digested_at":"2026-04-22T12:26:42.445954+00:00","dataset_created_at":"2023-05-10T10:26:34.294Z","dataset_modified_at":"2023-07-06T22:12:40.000Z"},"total_files":119,"storage":{"backend":"s3","base":"s3://openneuro.org/ds004563","raw_key":"dataset_description.json","dep_keys":["CHANGES","README","participants.json","participants.tsv","task-touchdecoding_events.json"]},"tagger_meta":{"config_hash":"4a051be509a0e3d0","metadata_hash":"ea10bb5a25c5e6de","model":"openai/gpt-5.2","tagged_at":"2026-01-20T10:43:02.368334+00:00"},"tags":{"pathology":["Other"],"modality":["Multisensory"],"type":["Perception"],"confidence":{"pathology":0.7,"modality":0.85,"type":0.7},"reasoning":{"few_shot_analysis":"Most similar few-shot conventions:\n- The \"Braille letters - EEG\" example shows that when the primary stimulus is delivered to the fingers, the Modality is labeled \"Tactile\" (it uses \"Braille cells under the left and right index fingers\"). 
This guides identifying the tactile component here (\"Participants felt touch to the fingers\").\n- The \"Cross-modal Oddball Task\" example demonstrates that when trials include multiple sensory channels in the same overall paradigm, Modality can be labeled \"Multisensory\" (it includes both a \"visual pre-cue\" and an \"auditory pre-cue\"). This guides labeling this dataset as Multisensory because it explicitly contains both tactile and visual trial types.\nAlso, the schizophrenia visual discrimination example supports mapping discrimination/representation of sensory stimuli to Type=\"Perception\", which matches this dataset’s goal of comparing neural representations of touch across first-hand and seen touch.","metadata_analysis":"Key factual lines from the dataset metadata:\n- Population/recruitment: \"whole-brain EEG data from people with and without vicarious touch experiences\".\n- Stimulus modalities: \"Participants felt touch to the fingers (tactile trials) or watched carefully matched videos of touch to another person’s fingers (visual trials).\"\n- Study aim/type: \"to test whether seen touch evokes overlapping neural representations with the first-hand experience of touch.\"","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology:\n1) Metadata says: \"people with and without vicarious touch experiences\" (a specific atypical perceptual/sensory trait used to define groups).\n2) Few-shot pattern suggests: when recruitment targets a specific non-standard population not in the main clinical labels, use \"Other\" (e.g., visually deprived population labeled \"Other\" in the Braille example).\n3) Alignment: ALIGN.\n\nModality:\n1) Metadata says: \"felt touch to the fingers (tactile trials)\" AND \"watched ... videos of touch ... 
(visual trials)\".\n2) Few-shot pattern suggests: multiple stimulus channels in one dataset/task can be labeled \"Multisensory\" (Cross-modal Oddball example).\n3) Alignment: ALIGN.\n\nType:\n1) Metadata says: goal is to test overlap of neural representations for \"seen touch\" vs \"first-hand experience of touch\".\n2) Few-shot pattern suggests: sensory representation/discrimination tasks map to \"Perception\" (e.g., visual discrimination task labeled Perception; tactile braille sensory-to-perceptual representations also align with Perception/Learning conventions).\n3) Alignment: ALIGN (primary aim is sensory/perceptual representation rather than intervention or decision policy).","decision_summary":"Top-2 comparative selections:\n\nPathology candidates:\n- Other: Supported by explicit recruitment around a specific condition/trait not listed in allowed pathology labels: \"people with and without vicarious touch experiences\". Few-shot convention supports using \"Other\" for non-listed special populations.\n- Healthy: Possible because the metadata does not name a standard medical diagnosis, and includes a non-affected comparison group.\nHead-to-head: \"Other\" is stronger because the dataset explicitly targets a special population/phenotype (vicarious touch experiencers) as a grouping variable.\nConfidence evidence: 1 explicit quote supporting a non-standard recruited trait.\n\nModality candidates:\n- Multisensory: Supported by explicit inclusion of tactile and visual stimulus trials: \"felt touch to the fingers (tactile trials)\" and \"watched ... videos ... 
(visual trials)\"; matches few-shot multisensory convention.\n- Tactile: Possible because half the trials are direct tactile stimulation and the research question concerns touch.\nHead-to-head: \"Multisensory\" is stronger because the design explicitly contains two stimulus modalities central to the research question (comparison of first-hand vs seen touch).\nConfidence evidence: 2 explicit quotes naming tactile and visual trials + strong few-shot analog.\n\nType candidates:\n- Perception: Supported by the aim to test whether \"seen touch evokes overlapping neural representations\" with actual touch, i.e., sensory/perceptual representation.\n- Other: Possible if interpreted primarily as social/vicarious processing rather than sensory perception.\nHead-to-head: \"Perception\" is stronger because the core manipulation is sensory (tactile vs visual touch observation) and the stated goal concerns representations of touch.\nConfidence evidence: 1 explicit quote about representational overlap for touch, plus consistent few-shot mapping for sensory tasks."}},"nemar_citation_count":1,"computed_title":"Vicarious touch: overlapping neural patterns between seeing and feeling touch","nchans_counts":[{"val":64,"count":119}],"sfreq_counts":[{"val":2048.0,"count":119}],"stats_computed_at":"2026-04-22T23:16:00.307896+00:00","total_duration_s":null,"author_year":"Smit2023","canonical_name":null}}
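The response above is plain JSON, so a short Python sketch can illustrate how a client might read it and sanity-check the summary fields against the raw values. This is a minimal sketch, not an official eegdash client: a trimmed copy of the payload is inlined for self-containment (in practice the full body would come from `json.loads()`), and only field names that appear verbatim in the response are used.

```python
# Sketch of consuming the eegdash response for ds004563. A trimmed-down copy
# of the payload is inlined here; every key below appears in the response.
response = {
    "success": True,
    "database": "eegdash",
    "data": {
        "dataset_id": "ds004563",
        "demographics": {
            "subjects_count": 40,
            "ages": [18, 18, 21, 20, 18, 40, 28, 18, 28, 28, 25, 18, 18, 18,
                     25, 19, 19, 27, 32, 19, 17, 18, 22, 20, 25, 23, 19, 22,
                     21, 23, 19, 34, 18, 19, 18, 20, 18, 19, 20, 19],
            "age_mean": 21.775,
            "sex_distribution": {"f": 27, "m": 13},
        },
        "nchans_counts": [{"val": 64, "count": 119}],
        "sfreq_counts": [{"val": 2048.0, "count": 119}],
    },
}

demo = response["data"]["demographics"]

# Cross-check the stored summary statistics against the raw ages list.
assert len(demo["ages"]) == demo["subjects_count"]
mean_age = sum(demo["ages"]) / len(demo["ages"])
print(mean_age)  # → 21.775 (matches the stored "age_mean")

# Sex counts should also sum to the subject count.
print(sum(demo["sex_distribution"].values()))  # → 40

# All 119 recordings share one montage and sampling rate.
nchans = response["data"]["nchans_counts"][0]["val"]
sfreq = response["data"]["sfreq_counts"][0]["val"]
print(nchans, sfreq)  # → 64 2048.0
```

The per-value count lists (`nchans_counts`, `sfreq_counts`) each contain a single entry covering all 119 files, which is a quick way to confirm the dataset is homogeneous before batch processing.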