Modeling Neural Circuits Made Simple with Python

Paperback
$45.00 US
On sale Mar 19, 2024 | 168 pages | ISBN 9780262548083

About

An accessible undergraduate textbook in computational neuroscience that provides an introduction to the mathematical and computational modeling of neurons and networks of neurons.

Understanding the brain is a major frontier of modern science. Given the complexity of neural circuits, advancing that understanding requires mathematical and computational approaches. Starting with the biophysics of single neurons, Robert Rosenbaum incrementally builds to explanations of neural coding, learning, and the relationship between biological and artificial neural networks. Examples with real neural data demonstrate how computational models can be used to understand phenomena observed in neural recordings. Based on years of classroom experience, the material has been carefully streamlined to provide all the content needed to build a foundation for modeling neural circuits in a one-semester course.

  • Proven in the classroom
  • Example-rich, student-friendly approach
  • Includes Python code and a mathematical appendix reviewing the requisite background in calculus, linear algebra, and probability
  • Ideal for engineering, science, and mathematics majors and for self-study
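
To give a flavor of the Python-based approach described in the bullets above, here is a minimal, hypothetical sketch, written for this page rather than taken from the book, of the kind of model its opening chapter and appendix name: a leaky integrator membrane equation (section 1.1) integrated with the forward Euler method (section A.5). All parameter values and variable names are illustrative.

    import numpy as np

    # Leaky integrator membrane equation, tau * dV/dt = -(V - EL) + I(t),
    # integrated with the forward Euler method. Parameters are illustrative.
    tau = 10.0   # membrane time constant (ms)
    EL = -72.0   # leak (resting) potential (mV)
    dt = 0.1     # Euler time step (ms)
    T = 200.0    # simulation duration (ms)

    t = np.arange(0, T, dt)
    I = np.zeros_like(t)
    I[(t >= 50) & (t < 150)] = 5.0   # step input, already scaled to mV

    V = np.empty_like(t)
    V[0] = EL
    for k in range(len(t) - 1):
        # forward Euler update: V(t + dt) = V(t) + dt * dV/dt
        V[k + 1] = V[k] + dt * (-(V[k] - EL) + I[k]) / tau

When run, the membrane potential relaxes exponentially toward EL + I during the input step and back toward EL afterward, the basic behavior analyzed in sections 1.1 and A.5.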

Table of Contents

List of Figures ix
Preface xi

1 Modeling Single Neurons 1
1.1 The Leaky Integrator Model 1
1.2 The EIF Model 5
1.3 Modeling Synapses 10

2 Measuring and Modeling Neural Variability 15
2.1 Spike Train Variability, Firing Rates, and Tuning 15
2.2 Modeling Spike Train Variability with Poisson Processes 21
2.3 Modeling a Neuron with Noisy Synaptic Input 25

3 Modeling Networks of Neurons 33
3.1 Feedforward Spiking Networks and Their Mean-Field Approximation 33
3.2 Recurrent Spiking Networks and Their Mean-Field Approximation 37
3.3 Modeling Surround Suppression with Rate Network Models 43

4 Modeling Plasticity and Learning 49
4.1 Synaptic Plasticity 49
4.2 Feedforward Artificial Neural Networks 54

Appendix A: Mathematical Background 61
A.1 Introduction to ODEs 61
A.2 Exponential Decay as a Linear, Autonomous ODE 63
A.3 Convolutions 65
A.4 One-Dimensional Linear ODEs with Time-Dependent Forcing 69
A.5 The Forward Euler Method 71
A.6 Fixed Points, Stability, and Bifurcations in One-Dimensional ODEs 74
A.7 Dirac Delta Functions 78
A.8 Fixed Points, Stability, and Bifurcations in Systems of ODEs 81

Appendix B: Additional Models and Concepts 89
B.1 Ion Channel Currents and the HH Model 89
B.2 Other Simplified Models of Single Neurons 97
B.3 Conductance-Based Synapse Models 113
B.4 Neural Coding 115
B.5 Derivations and Alternative Formulations of Rate Network Models 124
B.6 Hopfield Networks 127
B.7 Training Readouts from Chaotic RNNs 131
B.8 DNNs and Backpropagation 136

References 141
Index 147

Author

Robert Rosenbaum is Associate Professor of Applied and Computational Mathematics and Statistics at the University of Notre Dame. His research in computational neuroscience focuses on using models of neural circuits to understand the dynamics and statistics of the neural activity underlying sensory processing and learning.
