{"success":true,"database":"eegdash","data":{"_id":"6953f4249276ef1ee07a346f","dataset_id":"ds006866","associated_paper_doi":null,"authors":["Szymon Mąka","Marta Chrustowicz","Łukasz Okruszek"],"bids_version":"1.8.0","contact_info":["Szymon Mąka","Łukasz Okruszek"],"contributing_labs":null,"data_processed":true,"dataset_doi":"doi:10.18112/openneuro.ds006866.v1.0.0","datatypes":["eeg"],"demographics":{"subjects_count":148,"ages":[29,35,21,23,32,21,26,32,28,24,25,26,28,22,21,20,26,31,26,34,25,30,31,25,22,27,31,23,23,33,32,34,28,28,31,25,23,20,21,29,31,20,23,24,19,20,22,32,21,33,24,25,21,27,19,29,22,19,23,25,27,25,20,22,20,25,21,22,26,29,23,26,34,20,20,23,24,35,24,20,21,24,32,28,32,27,28,35,31,35,27,23,27,22,25,21,26,31,21,21,26,28,23,19,22,22,26,20,22,31,22,24,24,24,23,21,28,28,29,24,22,24,21,35,22,23,26,23,29,32,22,19,18,25,21,22,24,31,23,26,24,22,28,19,35,20,24,26],"age_min":18,"age_max":35,"age_mean":25.304054054054053,"species":null,"sex_distribution":{"o":148},"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds006866","osf_url":null,"github_url":null,"paper_url":null},"funding":["National Science Centre, Poland (Grant No: 2019/35/B/HS6/00517)"],"ingestion_fingerprint":"7cf23d91db853ac1bcaf319c5989cd3ad682133021c7e17e2c42c92d16b7bc30","license":"CC0","n_contributing_labs":null,"name":"Discrepancy between self-report and neurophysiological markers of socio-affective responses in lonely individuals","readme":"# Emotion Processing and Regulation Task (Static Stimuli) — EEG Dataset\nThis dataset contains EEG recordings and behavioral data from the **Emotion Processing and Regulation** task with static emotional stimuli.\n* **Preregistration:** [https://osf.io/g8qey](https://osf.io/g8qey)\n* **Preprint:** [https://osf.io/preprints/psyarxiv/v9dt3_v2](https://osf.io/preprints/psyarxiv/v9dt3_v2)\n---\n## Participants\n* **N = 148** right-handed, neurologically healthy adults with normal or 
corrected-to-normal vision.\n---\n## Experimental Design\nParticipants completed 240 trials in a single session, evenly allocated to a 2 (content: social, non-social) × 3 (regulation requirement: watch-neutral, watch-negative, reappraise-negative) factorial design. On each trial, they viewed a static image for 5 s and either passively watched it or reappraised it as instructed. They then rated arousal and subsequently valence of the image on separate 9-point scales.\n---\n## EEG Data Acquisition\n* **EEG Cap:** 64-channel QuickCap\n* **Amplifier:** Neuroscan SynampsRT\n* **Sampling rate:** 1000 Hz\n* **Impedance:** below 5 kΩ\n### Recorded Channels\nFP1, FPZ, FP2, AF3, AF4, F7, F5, F3, F1, FZ, F2, F4, F6, F8,\nFT7, FC5, FC3, FC1, FCZ, FC2, FC4, FC6, FT8, T7, C5, C3, C1, CZ, C2, C4, C6, T8,\nM1, TP7, CP5, CP3, CP1, CPZ, CP2, CP4, CP6, TP8, M2, P7, P5, P3, P1, PZ, P2, P4, P6, P8,\nPO7, PO5, PO3, POZ, PO4, PO6, PO8, CB1, O1, OZ, O2, CB2\n### Additional Sensors\n* HEO (Horizontal EOG)\n* VEO (Vertical EOG)\n* EKG\n* GSR/EDA\n---\n## EEG Preprocessing\nAll preprocessing was conducted in **MATLAB R2020b** using **EEGLAB 2023.0** and **ERPLAB 9.10**. The preprocessing pipeline and fully commented scripts are available in `code/Preprocessing_EEG.m`.\n### Summary of preprocessing steps\n1. **Band-pass filtering:** 0.1–30 Hz (zero-phase Hamming-windowed FIR)\n2. **Downsampling:** 250 Hz\n3. **Re-referencing:** average mastoids (M1, M2)\n4. **Automatic bad-channel rejection:** *clean_rawdata* (autocorrelation = 0.8)\n5. **ICA:** *runica* algorithm\n6. **Automatic component rejection:** **ADJUST** and **SASICA**\n7. **Spherical interpolation** of removed channels\n8. **Epoching:** −200 to 5000 ms relative to stimulus onset\n9. **Baseline correction:** −200 ms pre-stimulus\n10. **Artifact rejection:** ±100 µV within 200 ms moving window (100 ms step)\n11. 
**Condition averaging:** using ERPLAB\n---\n## Derivatives & Supplemental Data\n### `derivatives/processed_erps/`\nContains averaged ERP files (`.erp`) for each participant after preprocessing.\n### `code/`\nIncludes MATLAB preprocessing scripts and documentation (`Preprocessing_EEG.m`).","recording_modality":["eeg"],"senior_author":"Łukasz Okruszek","sessions":[],"size_bytes":124810279277,"source":"openneuro","study_design":null,"study_domain":null,"tasks":["EmotionProcessingandRegulationtask"],"timestamps":{"digested_at":"2026-04-22T12:29:44.580804+00:00","dataset_created_at":"2025-10-29T19:00:26.774Z","dataset_modified_at":"2025-10-29T23:48:25.000Z"},"total_files":148,"storage":{"backend":"s3","base":"s3://openneuro.org/ds006866","raw_key":"dataset_description.json","dep_keys":["CHANGES","README.md","participants.json","participants.tsv","task-EmotionProcessingandRegulationtask_events.json"]},"tagger_meta":{"config_hash":"4a051be509a0e3d0","metadata_hash":"ef248156a800ec66","model":"openai/gpt-5.2","tagged_at":"2026-01-20T19:11:46.871392+00:00"},"tags":{"pathology":["Healthy"],"modality":["Visual"],"type":["Affect"],"confidence":{"pathology":0.85,"modality":0.85,"type":0.9},"reasoning":{"few_shot_analysis":"Closest few-shot by construct and stimulus channel is the example titled \"EEG: Three armed bandit gambling task\" (Healthy + Visual + Affect). While that task is gambling/reward, the labeling convention is that when the primary scientific target is emotional/affective processing (rather than, e.g., pure perception), Type is set to \"Affect\" and Modality reflects the presented stimuli channel (visual). This guides mapping the current emotion regulation paradigm with static images to (Visual, Affect) in a healthy cohort.","metadata_analysis":"Key population and task facts from the dataset README: (1) Population: \"N = 148 right-handed, neurologically healthy adults\". 
(2) Visual affective stimuli and regulation: \"Emotion Processing and Regulation task with static emotional stimuli\" and \"they viewed a static image for 5 s and either passively watched it or reappraised it as instructed\". (3) Affective ratings: \"They then rated arousal and subsequently valence of the image\". These lines indicate a healthy sample performing an affective (emotion processing/regulation) task driven by visual image presentation.","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology — Metadata says: \"neurologically healthy adults\". Few-shot pattern suggests: when explicitly healthy, label \"Healthy\". ALIGN.\nModality — Metadata says: \"static emotional stimuli\" and \"viewed a static image\". Few-shot pattern suggests: image-based paradigms map to \"Visual\" (even if physiology like EDA is recorded). ALIGN.\nType — Metadata says: \"Emotion Processing and Regulation\" plus valence/arousal ratings and reappraisal instruction (emotion regulation). Few-shot pattern suggests: emotion/affect-focused tasks map to Type \"Affect\" (see Healthy+Visual+Affect example). ALIGN.","decision_summary":"Pathology top-2: (1) Healthy — supported by \"neurologically healthy adults\". (2) Unknown — would apply only if no population description; weaker because health status is explicit. Final: Healthy (confidence driven by explicit participant description).\nModality top-2: (1) Visual — supported by \"static image\" / \"viewed a static image\". (2) Multisensory — possible only if stimuli were multi-channel; weaker because additional channels listed (EOG/EKG/GSR) are recordings, not stimulus modalities. Final: Visual.\nType top-2: (1) Affect — supported by \"Emotion Processing and Regulation\" and \"rated arousal\" and \"valence\" and \"reappraise\". (2) Attention — could be involved but not the primary construct emphasized. 
Final: Affect."}},"computed_title":"Discrepancy between self-report and neurophysiological markers of socio-affective responses in lonely individuals","nchans_counts":[{"val":69,"count":148}],"sfreq_counts":[{"val":1000.0,"count":148}],"stats_computed_at":"2026-04-22T23:16:00.312084+00:00","total_duration_s":444062.728,"author_year":"Maka2025_Discrepancy","canonical_name":null}}