The Biophysical Principles Underlying Computation in Neural Substrates

Iran Roman will present and discuss the following:

Abstract

This presentation offers an overview of NeuroAI’s development, starting with seminal biophysical models such as the Hodgkin-Huxley equations, which elucidate the ionic mechanisms underlying neuronal action potentials. We then discuss the Wilson-Cowan model, which captures the interactions between excitatory and inhibitory neuronal populations. Advances in the 2010s used recurrent neural networks (RNNs) whose dynamics parallel the complex computations observed in macaque cortex. We next turn to dynamical systems approaches, emphasizing their efficacy in modeling the temporal evolution of human brain activity, including unsupervised Hebbian learning. Finally, we explore bifurcation theory’s role in identifying critical parameters that govern perception-action coupling within neural substrates, illustrating how minor parameter variations can lead to significant shifts in neural computation.
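To make the population-level modeling concrete, here is a minimal sketch of a Wilson-Cowan-style simulation. It is not the presenter's exact formulation: the sigmoidal response function, coupling weights, time constants, and external drives below are illustrative assumptions, integrated with a simple forward-Euler scheme.

```python
import numpy as np

def sigmoid(x, gain=1.0, theta=4.0):
    # Sigmoidal population response with gain and threshold (illustrative values).
    return 1.0 / (1.0 + np.exp(-gain * (x - theta)))

def wilson_cowan(T=200.0, dt=0.1, P_E=1.25, P_I=0.0):
    # Coupling weights E->E, I->E, E->I, I->I; chosen for demonstration only.
    w_EE, w_EI, w_IE, w_II = 16.0, 12.0, 15.0, 3.0
    tau_E, tau_I = 1.0, 1.0            # population time constants
    n = int(T / dt)
    E = np.zeros(n)                     # excitatory population activity
    I = np.zeros(n)                     # inhibitory population activity
    for t in range(n - 1):
        # Each population relaxes toward a sigmoidal function of its net input.
        dE = (-E[t] + sigmoid(w_EE * E[t] - w_EI * I[t] + P_E)) / tau_E
        dI = (-I[t] + sigmoid(w_IE * E[t] - w_II * I[t] + P_I)) / tau_I
        E[t + 1] = E[t] + dt * dE
        I[t + 1] = I[t] + dt * dI
    return E, I

E, I = wilson_cowan()
print(E[-5:], I[-5:])  # late-time activity of the two populations
```

Sweeping the external drive (here the assumed parameter P_E) is one way to see the abstract's point about bifurcations: for some drive values the coupled populations settle to a fixed point, while for others they sustain oscillations, so a small parameter change produces a qualitative shift in the computation.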

Related Work

(Mante et al., 2013)

(Zemlianova et al., 2024)

(Roman et al., 2023)

(Driscoll et al., 2024)

References

  1. Context-dependent computation by recurrent dynamics in prefrontal cortex
    Valerio Mante, David Sussillo, Krishna V Shenoy, and 1 more author
    Nature, Apr 2013
  2. Dynamical mechanisms of how an RNN keeps a beat, uncovered with a low-dimensional reduced model
    Klavdia Zemlianova, Amitabha Bose, and John Rinzel
    Scientific Reports, Apr 2024
  3. Hebbian learning with elasticity explains how the spontaneous motor tempo affects music performance synchronization
    Iran R Roman, Adrian S Roman, Ji Chul Kim, and 1 more author
    PLOS Computational Biology, Apr 2023
  4. Flexible multitask computation in recurrent networks utilizes shared dynamical motifs
    Laura N Driscoll, Krishna Shenoy, and David Sussillo
    Nature Neuroscience, Apr 2024
