Abstract

The COGITATE consortium is a large-scale adversarial collaboration in which preregistered predictions from two theories of consciousness (Global Neuronal Workspace Theory, GNWT, and Integrated Information Theory, IIT) are tested using the same experimental design, with MEG, fMRI and iEEG across seven testing sites, bringing together experts from over 10 research centres. From its inception, the consortium has strived to adhere to the highest standards of Open Science. This is not an easy task, especially at the scale of the consortium. How do you create and amend a preregistration? How do you make methodological decisions? How do you lower the barriers to reusing code and data? In this talk, I will offer a first-hand account of the challenges we faced and provide concrete advice that can be applied to projects of any scale.


Bio

After obtaining a degree in Psychology in Argentina, Yamil completed a PhD in Cognitive Neuroscience, co-supervised by Jacques Mehler (SISSA, Italy) and Tristan Bekinschtein (University of Cambridge, UK), where he employed EEG and pupillometry to study predictive processes in auditory speech processing. Later, Yamil joined the lab of Davide Crepaldi (SISSA, Italy) to study how the processing of visual words is supported by the statistical learning abilities of our visual system.

In late 2021, Yamil joined the Predictive Brain lab as a member of the COGITATE consortium, a large-scale collaboration contrasting theories of consciousness. Yamil is currently working on a project in auditory perception, funded by a Marie Skłodowska-Curie grant. He uses MEG and challenging auditory signal detection tasks to study how sensory evidence and update signals interact to give rise to our auditory experience.