Next-generation recurrent network models for cognitive neuroscience

Dr. Guangyu Robert Yang, Dept. of Brain and Cognitive Sciences (BCS), EECS Dept., Schwarzman College of Computing (SCC), MIT

Abstract: Recurrent Neural Networks (RNNs) trained with machine learning techniques on cognitive tasks have become a widely accepted tool for neuroscientists. Compared to traditional computational models in neuroscience, RNNs offer substantial advantages in explaining complex behavior and neural activity patterns, and they allow rapid generation of mechanistic hypotheses for cognitive computations. RNNs further provide a natural way to flexibly combine bottom-up biological knowledge with top-down computational goals in network models. However, early work with this approach faced fundamental challenges. In this talk, I will discuss some of these challenges and several recent steps we have taken to partly address them and to build next-generation RNN models for cognitive neuroscience.
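The general approach the abstract describes, training an RNN with standard machine-learning techniques on a cognitive task, can be sketched in a few dozen lines. The code below is an illustrative toy, not the speaker's method: a vanilla tanh RNN trained by backpropagation through time on a simple working-memory task (hold a cue across a delay and report its sign at the end of the trial). The task design, network size, and all hyperparameters are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 32, 10          # hidden units, trial length (assumed values)

# Recurrent weights scaled so the spectral radius is near 1,
# a common initialization for trainable vanilla RNNs.
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # recurrent
U = rng.normal(0.0, 1.0, N)                     # input
w = rng.normal(0.0, 1.0 / np.sqrt(N), N)        # linear readout

def trial(cue):
    """One trial: cue delivered at t=0, zero input during the delay."""
    xs = np.zeros(T)
    xs[0] = cue
    hs = [np.zeros(N)]
    for t in range(T):
        hs.append(np.tanh(W @ hs[-1] + U * xs[t]))
    return xs, hs, w @ hs[-1]   # readout from the final hidden state

def bptt(cue):
    """Gradients of 0.5*(y - cue)**2 via backpropagation through time."""
    xs, hs, y = trial(cue)
    err = y - cue
    dW, dU, dw = np.zeros_like(W), np.zeros_like(U), err * hs[-1]
    dh = err * w                      # gradient flowing into h_T
    for t in range(T - 1, -1, -1):
        da = dh * (1.0 - hs[t + 1] ** 2)   # back through tanh
        dW += np.outer(da, hs[t])
        dU += da * xs[t]
        dh = W.T @ da                 # pass gradient to h_{t-1}
    return dW, dU, dw

def clip(g, c=1.0):
    """Norm-clip gradients to keep BPTT stable."""
    n = np.linalg.norm(g)
    return g if n < c else g * (c / n)

lr = 0.05
for step in range(500):
    for cue in (+1.0, -1.0):          # the two trial types
        dW, dU, dw = bptt(cue)
        W -= lr * clip(dW)
        U -= lr * clip(dU)
        w -= lr * clip(dw)
```

After training, the network's final-state readout has the cue's sign, so its hidden dynamics must carry the cue through the delay; inspecting those trained dynamics is what yields the mechanistic hypotheses the abstract refers to. Practical work typically uses an autodiff framework and richer task sets rather than hand-written BPTT.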

CBMM10 Panel: Research on Intelligence in the Age of AI

The Road to Intelligence

Evidence that recurrent circuits are critical to the ventral stream’s execution of core object...

MIT Center for Brains, Minds, and Machines Summer School: Boris Katz and Andrei Barbu

Fast Recurrent Processing via Ventrolateral Prefrontal Cortex Is Needed by the Primate Ventral St...

Panel Discussion: What is the relationship between biological brains and AI algorithms?

Simulating a Primary Visual Cortex at the Front of CNNs Improves Robustness to Image Perturbations

What would it mean to understand intelligence?

Successes and challenges in modern artificial intelligence

Can computers learn like humans?

Probing memory circuits in the primate brain: from single neurons to neural networks

BMM Virtual Summer Course 2020 Introduction

Characterizing models of visual intelligence

Brains, Minds & Machines Seminar Series: Computer Vision that is changing our lives

Recurrent computations for visual pattern completion (publication release video)

Inferior temporal cortex potential cortical precursor of orthographic processing in untrained monkey

Closing Remarks

CBMM Season's Greetings 2021

How does an AI "see" what's in a photo?

Feedforward and feedback processes in visual recognition

CBMM10 Panel: Neuroscience to AI and back again

Quest | CBMM History and Future: Tommy Poggio

Keynote Panel: Why is it Time to Try Again? A Look to the Future

CBMM10 - Marc Raibert

How brain computations can inspire new paths in AI - Part 2

The Debate Over “Understanding” in AI’s Large Language Models

CBMM Undergraduate Summer Neuroscience Research Internship Alum: Heather Kosakowski

Social visual representations in humans and machines

Apical dendrites as a site for gradient calculations

CBMM-Siemens Graduate Fellowship: Tejas Kulkarni
