2022 GAC 1
How can we optimally use neuroscience data to guide the next generation of brain models of primate visual and linguistic cognition?
Authors:
Kohitij Kar, York University, MIT
Greta Tuckute, MIT
Joel Zylberberg, York University
SueYeon Chung, NYU
Alona Fyshe, University of Alberta
Evelina Fedorenko, MIT
Konrad Kording, University of Pennsylvania
Nikolaus Kriegeskorte, Columbia University
Abstract:
With the advent of deep learning and the availability of more sophisticated models, our hypothesis space has become much richer. Neuroscientific data is now used to falsify large-scale models such as feedforward deep convolutional neural networks for vision and transformer models for language processing. We posit that neuroscientific data serves two vital purposes. First, data can be used to falsify current models. Second, data can be used to develop better alternative models. Current standard practices in systems neuroscience do deliver on both fronts. However, the interaction with data is often indirect and therefore limited. Here we propose to discuss and debate the full scope of using neural and behavioral datasets. Can we use neural data more efficiently for model building (beyond serving as a “motivator” for various computational motifs)? We aim to identify and implement novel ways to use neural data directly during model training. Theoretical discussions will address: 1) How much data is enough? 2) What type of data is needed (neural spiking? fMRI BOLD? is behavioral data sufficient?)? Lastly, we will discuss more efficient data-collection approaches for model evaluation (beyond using data for post-hoc model validation). Our goal is to closely integrate neuroscience data collection and model development.
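As a purely illustrative sketch (not part of the proposal itself), one way to use neural data directly during training, rather than only for post-hoc validation, is to add a neural-predictivity term to the model's objective: the model is optimized for its task while a linear readout of its features is simultaneously fit to recorded responses. Everything below is hypothetical, including the module names, the number of recorded sites, and the weighting hyperparameter alpha; the snippet uses PyTorch and assumes trial-averaged responses aligned with the stimulus batch.

# Hypothetical sketch: jointly optimize a task objective and a neural-fit
# term, so recorded responses shape the model during training instead of
# serving only as a post-hoc benchmark. All names and shapes are illustrative.
import torch
import torch.nn as nn

class SmallVisionModel(nn.Module):
    def __init__(self, n_classes=10, n_features=128, n_sites=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, n_features), nn.ReLU(),
        )
        self.classifier = nn.Linear(n_features, n_classes)
        # Linear readout mapping model features to recorded neural sites
        self.neural_readout = nn.Linear(n_features, n_sites)

    def forward(self, x):
        feats = self.encoder(x)
        return self.classifier(feats), self.neural_readout(feats)

model = SmallVisionModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
task_loss_fn = nn.CrossEntropyLoss()
neural_loss_fn = nn.MSELoss()
alpha = 0.5  # weight on the neural-fit term; a free hyperparameter

# Dummy batch standing in for images, labels, and trial-averaged responses
images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
neural_responses = torch.randn(8, 64)  # e.g., 64 recorded sites

logits, predicted_responses = model(images)
loss = task_loss_fn(logits, labels) + alpha * neural_loss_fn(
    predicted_responses, neural_responses
)
optimizer.zero_grad()
loss.backward()
optimizer.step()

In such a setup, the weight alpha trades off task performance against neural fit, and the proposal's questions about how much data is enough and what type of data is needed would directly govern how a term like this is constructed.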