{"success":true,"database":"eegdash","data":{"_id":"6953f4249276ef1ee07a3470","dataset_id":"ds006890","associated_paper_doi":null,"authors":["Huixiang Yang","Ryohei Fukuma","Tomoyuki Namima","Kotaro Okuda","Asaya Nishi","Takamitsu Iwata","Abdi Reza","Kota S Sasaki","Taro Kaiju","Gurlal Gill","Haruhiko Kishima","Shinji Nishimoto","Takufumi Yanagisawa"],"bids_version":"1.9.0","contact_info":["Huixiang Yang"],"contributing_labs":null,"data_processed":true,"dataset_doi":"doi:10.18112/openneuro.ds006890.v1.0.0","datatypes":["ieeg"],"demographics":{"subjects_count":2,"ages":[9,8],"age_min":8,"age_max":9,"age_mean":8.5,"species":null,"sex_distribution":{"f":2},"handedness_distribution":null},"experimental_modalities":null,"external_links":{"source_url":"https://openneuro.org/datasets/ds006890","osf_url":null,"github_url":null,"paper_url":null},"funding":["Grant JPMJER1801 from JST, JPMJMS2012 (TY) from Moonshot R&D, JPMJCR18A5 (TY) from CREST, JPMJCR24U2 (TY) from AIP"],"ingestion_fingerprint":"76c94f70f2cedd6f5ad6d7df7088367fbb07684c3d91121e0ff5a14193abb584","license":"CC0","n_contributing_labs":null,"name":"Longitudinal Multitask Wireless ECoG Data from Two Fully Implanted Macaca fuscata","readme":"Longitudinal Multitask Wireless ECoG Data from Two Fully Implanted Macaca fuscata — README\n## Overview\nThis repository contains a wireless subdural ECoG (iEEG) dataset from *Macaca fuscata* monkeys,\norganized in compliance with the iEEG-BIDS specification.\nRecordings were acquired several times each week using a wireless, inductively powered implant. The data were curated and organized in BIDS format to facilitate reproducible research in neuroscience.\nKeywords: wireless subdural ECoG, iEEG, Macaca fuscata, BIDS-compliant dataset,\nlongitudinal recordings, task-based neurophysiology\n## BIDS Organization\n- dataset_description.json\n- participants.tsv, participants.json\n- README.md, CHANGES.md\n- sub-<id>/ses-<index>/ieeg/ (with *_ieeg.edf, *_ieeg.json, *_channels.tsv, *_events.tsv, *_scans.tsv, *_electrodes.tsv, *_electrodes.json, *_coordsystem.json)\n## Tasks\nTasks include rest, pressing, reaching, listening, sep.\nOnly curated and validated tasks are exported.\n## Signals and Channels\n- Uniform sampling rate per file.\n- channels.tsv lists physiological (ECoG), trigger (TRIGGER) and auxiliary channels (MISC).\n## Usage\nThis dataset can be loaded with BIDS-compatible toolboxes such as MNE-Python, FieldTrip, or EEGLAB.\nInspect *_events.tsv for task timing and *_channels.tsv for channel information.\n## Participants\nEach subject corresponds to an individual monkey (e.g., sub-monkeyb, sub-monkeyc).\n## Ethics\nAll animal procedures complied with Japanese laws and institutional regulations, including the Science Council of Japan Guidelines for Proper Conduct of Animal Experiments and national standards on pain relief and euthanasia, and were approved by the Animal Experiment Committee — The University of Osaka (approval FBS-25-002).\n## License and Citation\nLicense: CC BY 4.0\nCitation: [Authors], “[Dataset Title],” [Repository/DOI], [Year].\n## Contact\nMaintainer: Huixiang Yang, The University of Osaka, yanghuixiang@bci.med.osaka-u.ac.jp\nFor issues, please use the repository issue tracker.","recording_modality":["ieeg"],"senior_author":"Takufumi Yanagisawa","sessions":["day05","day06","day08","day09","day102","day104","day105","day107","day110","day111","day113","day117","day118","day120","day123","day125","day127","day13","day130","day132","day134","day138","day140","day141","day147","day149","day15","day152","day154","day155","day159","day161","day162","day166","day168","day169","day170","day174","day177","day178","day180","day182","day183","day184","day187","day188","day189","day19","day190","day195","day197","day199","day202","day203","day206","day208","day209","day211","day212","day217","day218","day219","day22","day225","day226","day229","day231","day232","day237","day238","day24","day240","day244","day245","day246","day250","day251","day252","day253","day257","day26","day260","day261","day265","day267","day268","day27","day274","day275","day277","day279","day280","day281","day282","day284","day286","day287","day288","day29","day290","day293","day294","day295","day296","day298","day300","day301","day302","day305","day307","day308","day310","day312","day314","day315","day318","day319","day321","day322","day324","day326","day328","day329","day33","day333","day334","day335","day336","day338","day341","day342","day343","day346","day348","day350","day351","day355","day356","day359","day36","day362","day363","day365","day366","day368","day369","day371","day373","day377","day378","day379","day380","day383","day384","day385","day386","day387","day388","day389","day39","day392","day393","day394","day396","day397","day400","day402","day403","day404","day406","day407","day408","day409","day41","day410","day413","day418","day420","day422","day426","day427","day43","day432","day433","day434","day435","day437","day439","day441","day444","day446","day448","day45","day450","day452","day453","day455","day458","day461","day462","day463","day464","day465","day466","day468","day469","day471","day475","day476","day478","day48","day483","day486","day490","day492","day497","day499","day50","day503","day506","day510","day513","day517","day52","day520","day521","day524","day525","day53","day56","day57","day59","day64","day66","day71","day73","day75","day76","day77","day80","day82","day83","day85","day87","day91","day92","day94","day95","day97","day98","day99"],"size_bytes":44207054000,"source":"openneuro","study_design":null,"study_domain":null,"tasks":["listening","pressing","reaching","rest","sep"],"timestamps":{"digested_at":"2026-04-22T12:29:45.249286+00:00","dataset_created_at":"2025-11-03T13:21:47.212Z","dataset_modified_at":"2025-11-04T05:20:04.000Z"},"total_files":870,"storage":{"backend":"s3","base":"s3://openneuro.org/ds006890","raw_key":"dataset_description.json","dep_keys":["CHANGES","README.md","participants.json","participants.tsv"]},"tagger_meta":{"config_hash":"4a051be509a0e3d0","metadata_hash":"764f1e4db0021f2d","model":"openai/gpt-5.2","tagged_at":"2026-01-20T19:12:22.616344+00:00"},"tags":{"pathology":["Healthy"],"modality":["Multisensory"],"type":["Motor"],"confidence":{"pathology":0.6,"modality":0.8,"type":0.7},"reasoning":{"few_shot_analysis":"Closest few-shot conventions:\n1) Cross-modal Oddball Task (Parkinson's) is labeled Modality=Multisensory because the paradigm explicitly combines visual and auditory cues (\"visual pre-cue and an auditory pre-cue, which occurred at the same time\"). This guides labeling when multiple stimulus channels are present.\n2) EEG Motor Movement/Imagery Dataset is labeled Type=Motor because the primary experimental content is pressing/movement/imagery tasks (movement execution/imagery as research focus → Motor). This guides choosing Type=Motor when the task list is dominated by reaching/pressing style behaviors, even if other tasks exist.\nThese examples guide STYLE: (a) multiple stimulus/task modalities → Multisensory; (b) motor execution tasks → Motor type.","metadata_analysis":"Key metadata facts from README:\n- Population/species and recording context: \"wireless subdural ECoG (iEEG) dataset from *Macaca fuscata* monkeys\" and \"Each subject corresponds to an individual monkey\".\n- Task set explicitly includes multiple modalities: \"Tasks include rest, pressing, reaching, listening, sep.\".\n- The dataset is described as \"Longitudinal Multitask\" and includes \"task-based neurophysiology\" recordings acquired \"several times each week\".\nThese lines indicate: non-human participants with no stated disease recruitment; tasks spanning motor (pressing/reaching), auditory (listening), and somatosensory/evoked potential stimulation (SEP), plus rest.","paper_abstract_analysis":"No useful paper information.","evidence_alignment_check":"Pathology:\n- Metadata says: \"*Macaca fuscata* monkeys\" / \"Each subject corresponds to an individual monkey\" (no diagnosis described).\n- Few-shot pattern suggests: when no disorder is stated and cohort is normative, label as Healthy (e.g., multiple healthy datasets).\n- Alignment: ALIGN (no explicit pathology; treat as non-clinical cohort).\n\nModality:\n- Metadata says: \"Tasks include rest, pressing, reaching, listening, sep.\" (implies multiple stimulus/input channels: auditory + somatosensory stimulation + motor task context).\n- Few-shot pattern suggests: explicit multi-channel tasks → Multisensory (as in Cross-modal Oddball labeled Multisensory).\n- Alignment: ALIGN.\n\nType:\n- Metadata says: \"Tasks include rest, pressing, reaching, listening, sep.\" and dataset is \"Longitudinal Multitask\".\n- Few-shot pattern suggests: when movement execution/imagery is a main focus, Type → Motor (as in EEG Motor Movement/Imagery Dataset).\n- Alignment: PARTIAL (dataset is multitask including rest/listening/SEP, but motor tasks are prominently listed and are typical primary behavioral components).","decision_summary":"Top-2 comparative selection:\n\nPathology candidates:\n1) Healthy — Evidence: no clinical recruitment/diagnosis stated; participants are simply \"*Macaca fuscata* monkeys\" and \"Each subject corresponds to an individual monkey\".\n2) Other — Evidence: non-human primate cohort could be considered outside typical human clinical categories.\nHead-to-head: Healthy is stronger because Pathology is defined as clinical condition used to recruit participants, and none is given; the dataset appears non-clinical/normative.\nConfidence basis: only indirect evidence (absence of diagnosis + animal cohort), no explicit \"healthy\" statement.\n\nModality candidates:\n1) Multisensory — Evidence: task list spans \"listening\" (auditory) and \"sep\" (somatosensory evoked potentials) alongside motor/rest: \"Tasks include rest, pressing, reaching, listening, sep.\"; also \"Longitudinal Multitask\".\n2) Motor — Evidence: \"pressing\" and \"reaching\" are motor tasks.\nHead-to-head: Multisensory is stronger because the dataset explicitly includes both auditory (listening) and SEP (somatosensory stimulation) tasks rather than a single dominant stimulus channel.\nConfidence basis: 2 explicit quotes about multitask/task list + strong few-shot analog (cross-modal oddball → Multisensory).\n\nType candidates:\n1) Motor — Evidence: motor behaviors are explicitly listed: \"pressing\" and \"reaching\" within the task set.\n2) Other — Evidence: it is a \"Longitudinal Multitask\" dataset also including \"rest\", \"listening\", and \"sep\", so the overarching purpose could be general/methodological rather than a single cognitive construct.\nHead-to-head: Motor is slightly stronger because the clearest cognitive/behavioral construct present in the metadata is movement execution (pressing/reaching), whereas rest/listening/SEP are less specific about higher-level constructs.\nConfidence basis: 1 clear task-list quote + reasonable inference from task composition."}},"computed_title":"Longitudinal Multitask Wireless ECoG Data from Two Fully Implanted Macaca fuscata","nchans_counts":[{"val":50,"count":471},{"val":66,"count":399}],"sfreq_counts":[{"val":1000.0,"count":870}],"stats_computed_at":"2026-04-22T23:16:00.312097+00:00","total_duration_s":380957.001,"author_year":"Yang2025_Longitudinal","canonical_name":null}}
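The numeric fields in the record above are mutually redundant, so a consumer can sanity-check a fetched record before using it. A minimal sketch in Python: the `record` dict below simply restates fields from the response (it is not an eegdash API call), and the checks assume the histograms in `nchans_counts`/`sfreq_counts` are meant to cover every file counted in `total_files`.

```python
from statistics import mean

# Fields restated from the API response above (not fetched live).
record = {
    "demographics": {
        "subjects_count": 2,
        "ages": [9, 8],
        "age_min": 8,
        "age_max": 9,
        "age_mean": 8.5,
        "sex_distribution": {"f": 2},
    },
    "total_files": 870,
    "nchans_counts": [{"val": 50, "count": 471}, {"val": 66, "count": 399}],
    "sfreq_counts": [{"val": 1000.0, "count": 870}],
}

demo = record["demographics"]
# Age summary statistics should follow from the raw ages list.
assert demo["age_min"] == min(demo["ages"])
assert demo["age_max"] == max(demo["ages"])
assert demo["age_mean"] == mean(demo["ages"])
# The sex distribution should account for every subject.
assert sum(demo["sex_distribution"].values()) == demo["subjects_count"]
# Channel-count and sampling-rate histograms should each cover all files.
assert sum(c["count"] for c in record["nchans_counts"]) == record["total_files"]
assert sum(c["count"] for c in record["sfreq_counts"]) == record["total_files"]
print("record is internally consistent")
```

For this record the checks pass: 471 + 399 channel-count entries match the 870 total files, all of which are listed at 1000 Hz, and the mean of ages [9, 8] is the reported 8.5.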