{"success":true,"database":"eegdash","data":{"_id":"69d16e04897a7725c66f4c4b","dataset_id":"ds007471","associated_paper_doi":null,"authors":["Zijun Zhou","Anna Zamm","Justin Christensen","Vinesh Rao","Janeen Loehr"],"bids_version":"1.8.0","contact_info":["Zijun Zhou"],"contributing_labs":null,"data_processed":true,"dataset_doi":"doi:10.18112/openneuro.ds007471.v1.0.0","datatypes":["eeg"],"demographics":{"subjects_count":31,"ages":[],"age_min":null,"age_max":null,"age_mean":null,"species":null,"sex_distribution":null,"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds007471","osf_url":null,"github_url":null,"paper_url":null},"funding":[],"ingestion_fingerprint":"e08957b60ad6b84915b5db8f3cd99ade1bcf7b9f7f027ba413d1f409827ec04d","license":"CC0","n_contributing_labs":null,"name":"Joint agency EEG dataset","readme":"# Behavioural and EEG data from an EEG hyperscanning study examining cognitive and neural signals underlying the sense of joint agency during a musical joint action task\n---\n# Dataset Structure\nThe primary folder includes a separate folder for each pair:sub-##\nEach pair folder contains:\n## Behavioural Data\nLocated in:sub-##/beh/\nFile:sub-##_task-jointaction_beh.tsv\n## EEG Data\nLocated in:sub-##/eeg/\nFiles (BrainVision format):\nsub-##_task-jointaction_eeg.eeg\nsub-##_task-jointaction_eeg.vhdr\nsub-##_task-jointaction_eeg.vmrk\n---\n# Derivatives Folder\nThe `derivatives/` folder contains:\n- `behavioural_all.tsv`\n  Compiled behavioural data across all pairs.\n- `32chanElectrodePositions.elp`\n  Electrode positions used for EEG data acquisition and analysis.\n---\n# Behavioural Data Description\nThe following column descriptions apply to both:\n- `behavioural_all.tsv`\n- `sub-##_task-jointaction_beh.tsv`\n---\n## Pair Number\nValues: 1–32\n---\n## Participant Number\n- The first one or two digits represent the pair number.\n- The last digit represents seating position:\n  - `1` = left participant\n  - `2` = right participant\nExamples:\n- `11` = left participant in pair 1\n- `202` = right participant in pair 20\n---\n## Block Number\nTest block number for a given trial (1–8).\n---\n## Trial Number\nEach pair performed:\n- 8 tone sequences\n  - 4 musical duets\n  - 4 constant pitch sequences\n- 5 joint trials per sequence\nTotal:\n- 40 test trials per pair\n- Trial numbers range from 1–40\n---\n## Experimental Condition\n- `0` = constant pitch sequences\n- `1` = musical duets\n---\n## Part Performed\nIndicates which part of the tone sequence the participant performed:\n- `0` = higher-pitch part (for constant pitch sequences) or melody part (for musical duets)\n- `1` = lower-pitch part (for constant pitch sequences) or accompaniment part (for musical duets)\n---\n## Tone Sequence\n1. Twinkle Twinkle Little Star\n2. Hush Little Baby\n3. B.I.N.G.O.\n4. Yankee Doodle\n5. Constant pitch sequence with A4 as higher-pitch part\n6. Constant pitch sequence with C5 as higher-pitch part\n7. Constant pitch sequence with E♭5 as higher-pitch part\n8. Constant pitch sequence with F♯5 as higher-pitch part\n---\n## Joint Agency Ratings\nSelf-reported rating scale: 1–7\n---\n## Mean Synchronization Performance\nThe mean synchronization performance for each trial was calculated as follows. First, we calculated the absolute asynchrony between the two participants’ note onsets at each beat. 
Then, we converted each asynchrony to a proportion of the inter-onset interval (IOI) from the preceding note onset to the current note onset, which we averaged across the two participants and across all beats in the sequence.\n---\n## Standard Deviation (SD) of Synchronization Performance\nThe SD of synchronization performance was defined as the standard deviation of the asynchronies across all beats in a given trial.\n---\n# EEG Data Description\nFor each EEG dataset within each pair’s folder:\n- Channels 1–32: left participant EEG\n- Channels 33–64: right participant EEG\nData are stored in BrainVision format.\n---\n# Event Codes (Test Section)\nThe following event markers are present during the test section (see Figure 1 for schematic reference):\n- **S1** – the beginning of the test trials portion of the experiment\n- **S10** – a condition marker indicating the beginning of a block of musical duets\n- **S11** – a condition marker indicating the beginning of a block of constant pitch sequences\n- **S105** – the start of each trial, triggered by pressing the space bar\n- **S128** – the first five S128s mark the metronome tone onsets; the remaining S128s mark the tone onsets from the left participant’s e-music box\n- **S4** – tone onsets from the right participant’s e-music box\n- **S2** – the end of the left participant’s performance, marked one beat after the last beat of their 16-beat tone sequence\n- **S3** – the end of the right participant’s performance, marked one beat after the last beat of their 16-beat tone sequence\n- **S106** – the end of each trial after the rating scales were completed\n- **S107** – the end of each block\n---\n# Figure\n![Illustration of the event codes occurring over time in the dataset.](derivatives/figures/Figure1_EventCodes.png)\n---\n# Notes\n- Data are organized in BIDS format.\n- BrainVision files (.eeg, .vhdr, .vmrk) contain raw hyperscanning EEG data.\n- Behavioural data are provided per pair and as a compiled dataset in the derivatives folder.","recording_modality":["eeg"],"senior_author":"Janeen Loehr","sessions":[],"size_bytes":8658914750,"source":"openneuro","storage":{"backend":"s3","base":"s3://openneuro.org/ds007471","raw_key":"dataset_description.json","dep_keys":["CHANGES","README.md","participants.json","participants.tsv"]},"study_design":null,"study_domain":null,"tasks":["jointaction"],"timestamps":{"digested_at":"2026-04-22T12:30:18.948830+00:00","dataset_created_at":"2026-03-04T18:09:46.578Z","dataset_modified_at":"2026-03-04T18:35:09.000Z"},"total_files":31,"computed_title":"Joint agency EEG dataset","nchans_counts":[{"val":64,"count":31}],"sfreq_counts":[{"val":1000.0,"count":31}],"stats_computed_at":"2026-04-22T23:16:00.312805+00:00","total_duration_s":null,"tagger_meta":{"config_hash":"3557b68bca409f28","metadata_hash":"5d6e97f90cd47b2a","model":"openai/gpt-5.2","tagged_at":"2026-04-07T09:32:40.872789+00:00"},"tags":{"pathology":["Healthy"],"modality":["Auditory"],"type":["Other"],"confidence":{"pathology":0.65,"modality":0.85,"type":0.7},"reasoning":{"few_shot_analysis":"The most similar few-shot convention is the “EEG Motor Movement/Imagery Dataset”, which is labeled Type=Motor when the primary construct is movement execution/imagery. This joint-action dataset also contains overt performance (synchronized tone production), so Motor is a plausible runner-up Type by convention.
However, unlike the motor-imagery example, the stated goal here is the “sense of joint agency” (a social/agency construct not represented as a dedicated Type label in the allowed set), which pushes the Type toward Other.\nFor Modality, few-shot examples show that stimulus channel (not response) drives Modality; e.g., the digit-span example is Modality=Auditory because digits are presented auditorily. Here, the task is built around tones/metronome/tone sequences, making Auditory the closest match by that same convention.","metadata_analysis":"Key quoted facts from the dataset README:\n1) Study aim/construct: “EEG hyperscanning study examining cognitive and neural signals underlying the sense of joint agency during a musical joint action task”.\n2) Auditory stimulus content: “Each pair performed: 8 tone sequences – 4 musical duets – 4 constant pitch sequences”.\n3) Explicit tone-onset events/metronome: “The first five S128s mark the metronome tone onsets. Remaining S128s mark the tone onsets…”, and “S4 – tone onsets from the right participant’s e-music box”.\n4) Overt action/synchronization framing: “we calculated the absolute asynchrony between the two participants’ note onsets at each beat”.\nParticipants field is minimal (“Subjects: 31”) and contains no diagnostic/clinical recruitment information.","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology:\n- Metadata says: no diagnosis/clinical recruitment is mentioned; only “Subjects: 31”.\n- Few-shot pattern suggests: when no disorder is indicated, label as Healthy (normative cohort).\n- Alignment: ALIGN (no conflicting explicit clinical facts).\n\nModality:\n- Metadata says: musical/tone stimuli are central: “8 tone sequences”, “metronome tone onsets”, “tone onsets”.\n- Few-shot pattern suggests: classify by dominant stimulus channel (e.g., digit-span with auditory stimuli -> Auditory).\n- Alignment: ALIGN (auditory stimulation is explicit and dominant).\n\nType:\n- Metadata says: primary construct is “sense of joint agency” in a “musical joint action task”.\n- Few-shot pattern suggests: overt movement paradigms can map to Type=Motor (as in the motor movement/imagery example), but when the primary construct is not a standard cognitive label, use Other.\n- Alignment: PARTIAL (task involves motor performance, but stated research purpose emphasizes agency/social joint action rather than motor control per se). 
Metadata emphasis supports Other over Motor.","decision_summary":"Top-2 candidates and final selections:\n\nPathology:\n1) Healthy (selected): No clinical/diagnostic terms; only “Subjects: 31” and an experimental joint-action study description.\n2) Unknown: participant health status is not explicitly stated.\nDecision: Healthy is stronger because the dataset is framed as a typical cognitive neuroscience hyperscanning experiment with no recruitment by disorder.\n\nModality:\n1) Auditory (selected): explicit auditory/tone content—“tone sequences” and “metronome tone onsets”.\n2) Multisensory: participants both hear tones and perform actions; could involve combined sensorimotor context.\nDecision: Auditory wins because the described stimuli/events are explicitly tone/metronome onset markers and musical sequences, and Modality is defined by stimulus channel.\n\nType:\n1) Other (selected): explicit construct focus—“sense of joint agency” during “musical joint action”.\n2) Motor: overt synchronized performance and note-onset timing could be treated as motor control.\nDecision: Other wins because the stated research purpose is joint agency (social/agency construct), which is not directly represented among the allowed Type labels; Motor is a secondary aspect.\n\nConfidence notes (quotes/features used):\n- Pathology confidence limited by lack of explicit “healthy” wording (only absence of clinical terms).\n- Modality confidence supported by multiple explicit tone/metronome quotes.\n- Type confidence moderate due to plausible Motor runner-up despite explicit agency emphasis."}},"canonical_name":null,"name_confidence":0.64,"name_meta":{"suggested_at":"2026-04-14T10:18:35.343Z","model":"openai/gpt-5.2 + openai/gpt-5.4-mini + deterministic_fallback"},"name_source":"author_year","author_year":"Zhou2026"}}
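The readme embedded in this record defines the trial-level synchronization metrics only in prose. Below is a minimal Python sketch of that computation (not part of the API payload), assuming beat-aligned note-onset times in seconds for the two participants and assuming the first beat is dropped because it has no preceding IOI; the function and argument names are illustrative.

```python
import numpy as np

def sync_performance(onsets_left, onsets_right):
    """Trial-level synchronization metrics per the ds007471 readme.

    onsets_left / onsets_right: beat-aligned 1-D arrays of note-onset
    times in seconds, one entry per beat. The names, the alignment
    convention, and dropping the first beat (which has no preceding
    IOI) are assumptions, not documented in the dataset.
    """
    left = np.asarray(onsets_left, dtype=float)
    right = np.asarray(onsets_right, dtype=float)

    # Absolute asynchrony between the two participants at each beat.
    asyn = np.abs(left - right)

    # IOI from the preceding onset to the current onset, per participant.
    ioi_left = np.diff(left)
    ioi_right = np.diff(right)

    # Each asynchrony as a proportion of each participant's preceding
    # IOI, averaged across the two participants, then across beats.
    prop = 0.5 * (asyn[1:] / ioi_left + asyn[1:] / ioi_right)
    mean_sync = prop.mean()

    # The readme defines the SD metric over "the asynchronies"; taken
    # here as the raw asynchronies (the IOI-proportional reading is
    # equally plausible).
    sd_sync = asyn[1:].std(ddof=1)
    return mean_sync, sd_sync
```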
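Likewise, the readme states that each pair's BrainVision file interleaves both participants (channels 1–32 left, 33–64 right) and that events arrive as S-markers. The MNE-Python sketch below shows one way to split a recording by participant; the pair label `sub-01` and the epoching workflow are assumptions, not prescribed by the dataset.

```python
import mne

# Read one pair's hyperscanning recording (pair label is illustrative).
raw = mne.io.read_raw_brainvision(
    "sub-01/eeg/sub-01_task-jointaction_eeg.vhdr", preload=True
)

# Per the readme above: channels 1-32 belong to the left participant,
# channels 33-64 to the right participant.
left = raw.copy().pick(raw.ch_names[:32])
right = raw.copy().pick(raw.ch_names[32:64])

# BrainVision markers (S1, S4, S105, S128, ...) are exposed as
# annotations; convert them to an MNE events array for epoching.
events, event_id = mne.events_from_annotations(raw)
```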