
What is Statistics?

What is Statistics? by Emery N. Brown and Robert E. Kass

Statistical Inference: The Big Picture

Statistical Inference: The Big Picture, in Statistical Science, 2011, Vol. 26, No. 1, 1-9.

Statistical Models of the Brain

Statistical Models of the Brain is a course in computational neuroscience that I began teaching in 2011. Brent Doiron co-taught it with me on four occasions, from 2016 to 2019. The course has evolved considerably, mainly through a sharper focus and refined pedagogy.

A quick video summary (less than 5 minutes) of the Spring 2021 version may be found here: Wrap up

The course emphasizes statistical models, but the term is understood broadly so that it includes models that come from ideas and techniques in statistical physics.

In 2021 I decided to create short video lectures (15-30 minutes) that students would view before class. They are linked below, together with the topic and the specific reading(s) on which each one is based. Students were required to submit comments and/or questions ahead of each class meeting; I went through them to find commonalities, and class time was then devoted entirely to the issues the students raised.

The full syllabus is here.

Corrections to the book Analysis of Neural Data by Kass, Eden, and Brown (KEB) may be found here.

Topics

  • 1. What is computational neuroscience?
    • Video
    • Additional background video on tuning curves
    • Reading: Section 1 (Introduction) in Kass, R.E. and 24 others (2018) Computational neuroscience: Mathematical and statistical perspectives, Ann. Rev. Statist. Appl., 5: 183-214.
    • Attention: the brain-as-computer metaphor; Marr's three levels of analysis; tuning curve video.
  • 2. (Background) Random variables; What is a statistical model? Fitting statistical models to data.
    • Video
    • Reading: Kass, Eden, Brown (KEB), Chapter 1, especially 1.2.1; Chapter 3 through Equation (3.1); Section 3.2 through 3.2.3 (reminder to see corrections).
    • Attention: "Signal" and "noise" in Examples 1.4 and 1.5; Equation (1.4)
    • Additional reading: Read the rest of Chapter 3, especially 3.2.4 (reminder to see corrections); pay attention to Figure 1.1; Sections 1.2.5, 1.2.6 (see also Section 8.1), 3.2.4.
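    • Code sketch (an added illustration, not from KEB; the numbers are made up): a toy "signal plus noise" model fit by least squares, assuming Python with NumPy.
      ```python
      # Toy statistical model: y = a + b*x + noise, fit by least squares
      # (which is also the maximum-likelihood fit under Gaussian noise).
      import numpy as np

      rng = np.random.default_rng(0)
      x = np.linspace(0, 10, 50)
      a_true, b_true, sigma = 2.0, 0.5, 1.0                          # "signal" parameters and noise SD
      y = a_true + b_true * x + sigma * rng.standard_normal(x.size)  # data = signal + noise

      X = np.column_stack([np.ones_like(x), x])                      # design matrix
      beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)               # fitted parameters
      print("estimated intercept and slope:", beta_hat)
      ```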
  • 3. (Background) Statistical independence of random variables; log transformations.
    • Video 1 independence
    • Video 2 log transformations
    • Reading: KEB Ch 2, esp. 2.2.1; Ch 4 through 4.2.2; 4.3.1 through Equation (4.26); 5.1-5.3; 5.4.2
    • Attention: Secs 2.2.1, 5.2.1, 5.4.2; Figs 2.5, 2.6.
    • Additional reading: Read the rest of 4.3.1 and Ch 5; Attention: Secs 4.2.4, 5.5.
  • 4. (Background) The Law of Large Numbers and the Central Limit Theorem; statistical estimation; least-squares linear regression and the linear algebra concept of a basis.
    • Video
    • Reading: KEB Ch 6 through 6.1.1; 6.2.1; 7.1, 7.2, 7.3.1; Introduction to Ch 12; 12.5 through 12.5.1; appendices A.7 and A.9; 12.5.3 through equation (12.57) on p. 342; 12.5.8.
    • Attention: 6.2.1; Fig 7.2; Introduction to 12.5 and 12.5.8; A.7; Fig 12.9 (which is the same as the bottom of Fig A.2).
    • Additional reading: 6.1.2, 6.3.2; 12.5.5, 12.5.7; A.8; attention to 12.5.7. Secs 7.3.8, 7.3.9 are recommended.
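    • Code sketch (an added illustration, not from KEB): least-squares regression of noisy data onto a small set of basis functions, assuming Python with NumPy; the basis and noise level are arbitrary choices.
      ```python
      # Least-squares regression onto the polynomial basis {1, x, x^2}:
      # the fitted curve is the projection of the data onto the span of the basis.
      import numpy as np

      rng = np.random.default_rng(1)
      x = np.linspace(-1, 1, 100)
      y = np.sin(2 * x) + 0.2 * rng.standard_normal(x.size)   # smooth signal plus noise

      B = np.column_stack([np.ones_like(x), x, x**2])          # basis functions as columns
      coef, *_ = np.linalg.lstsq(B, y, rcond=None)             # least-squares coefficients
      fit = B @ coef                                           # fitted values
      print("basis coefficients:", coef)
      print("RMS residual:", np.sqrt(np.mean((y - fit) ** 2)))
      ```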
  • 5. Random walk integrate-and-fire models and balanced excitation and inhibition
    • Video 1 background: introduction to point processes
    • Video 2
    • Reading: KEB Sec 5.4.6; Introduction to Ch 19.
      Shadlen, M.N. and Newsome, W.T. (1998) The variable discharge of cortical neurons: implications for connectivity, computation, and information coding. J. Neurosci., 18: 3870–3896. Up to Section 2, p. 3877, and concluding remarks.
      Stein, R.B., Gossen, E.R., and Jones, K.E. (2005) Neuronal variability: noise or part of the signal? Nat. Rev. Neuro., 6:389-397. Only Figure 1 and Figure 2, pp. 390-391. The histograms are explained by this statement on p. 392, “The ability of the neuron to transmit signals faithfully is only evident after analysing many cycles of the stimulus. However, transmission by a population of neurons, rather than a single neuron, would allow the signal to be evident in real time.” See my video lecture for a bit more on this.
    • Attention: Shadlen and Newsome, "price of dynamic range is noise," Fig 2; Stein et al., Fig 2.
    • Additional reading: KEB Secs 19.1-19.2: attention to the theorem in 19.2.1.
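    • Code sketch (an added illustration, not from the readings): a random-walk integrate-and-fire caricature in Python/NumPy; with excitation and inhibition balanced, the interspike intervals come out highly irregular, which is the point Shadlen and Newsome emphasize.
      ```python
      # Random-walk integrate-and-fire: the "voltage" steps up on excitatory input and
      # down on inhibitory input; a spike occurs when it reaches threshold, then it resets.
      import numpy as np

      rng = np.random.default_rng(2)
      n_steps, threshold = 20000, 15
      p_exc = 0.5                                  # balanced: excitation and inhibition equally likely
      v, spike_times = 0, []

      for t in range(n_steps):
          v += 1 if rng.random() < p_exc else -1   # excitatory (+1) or inhibitory (-1) input
          v = max(v, 0)                            # crude reflecting barrier at rest
          if v >= threshold:                       # threshold crossing = spike
              spike_times.append(t)
              v = 0                                # reset

      isi = np.diff(spike_times)
      print("CV of interspike intervals:", isi.std() / isi.mean())   # large CV = irregular firing
      ```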
  • 6. Population vectors
    • Video 1
    • Video 2
    • Reading: KEB, Sec 12.5.4.
      Georgopoulos, A.P., Lurito, J.T., Petrides, M., Schwartz, A.B., and Massey, J.T. (1989) Mental rotations of the neuronal population vector, Science, 243: 234–236.
      Black, M.J. and Donoghue, J.P. (2007) Probabilistically modeling and decoding neural population activity in motor cortex, in G. Dornhege, J. del R. Millan, T. Hinterberger, D. McFarland, K.-R. Muller (eds.), Toward Brain-Computer Interfacing, MIT Press, pp. 147–159.
    • Attention: KEB, Example 12.6; Figure 2 of Georgopoulos et al.; Equation (5) of Black and Donoghue.
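    • Code sketch (an added illustration, not from the readings): population-vector decoding with cosine-tuned neurons, assuming Python/NumPy; the tuning parameters are made up.
      ```python
      # Population vector: each neuron "votes" for its preferred direction, weighted by
      # how far its firing rate is above baseline; the vector sum points near the true direction.
      import numpy as np

      rng = np.random.default_rng(3)
      n_neurons = 100
      pref = rng.uniform(0, 2 * np.pi, n_neurons)               # preferred directions
      baseline, amplitude = 10.0, 8.0
      true_dir = np.pi / 3                                       # movement direction to decode

      rates = baseline + amplitude * np.cos(true_dir - pref)     # cosine tuning curves
      counts = rng.poisson(rates)                                # noisy spike counts

      weights = counts - baseline                                # activity relative to baseline
      pop_x = np.sum(weights * np.cos(pref))
      pop_y = np.sum(weights * np.sin(pref))
      decoded = np.arctan2(pop_y, pop_x) % (2 * np.pi)
      print("true direction:", true_dir, " decoded direction:", decoded)
      ```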
  • 7. Information theory in human discrimination
    • Video
    • Background reading: KEB Section 4.3.2, especially comments about entropy and channel capacity, pp. 95-97, including Examples 4.5 and 4.6.
    • Reading: Miller, G.A. (1956) The magical number seven, plus or minus two, Psychol. Rev., 63: 343-355.
    • Attention: KEB Example 4.6; Miller, Figure 2.
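    • Code sketch (an added illustration, not from Miller or KEB): transmitted information computed from a stimulus-response confusion matrix, assuming Python/NumPy; the perfect-identification matrix is a made-up example.
      ```python
      # Transmitted information I(S;R) = sum_{s,r} p(s,r) log2[ p(s,r) / (p(s) p(r)) ],
      # the quantity Miller uses to summarize absolute-judgment experiments.
      import numpy as np

      def transmitted_information(counts):
          p = counts / counts.sum()                  # joint probabilities p(s, r)
          ps = p.sum(axis=1, keepdims=True)          # marginal over responses: p(s)
          pr = p.sum(axis=0, keepdims=True)          # marginal over stimuli:   p(r)
          nz = p > 0                                 # avoid log(0)
          return np.sum(p[nz] * np.log2(p[nz] / (ps @ pr)[nz]))

      perfect = 20 * np.eye(7)                       # 7 categories, always identified correctly
      print("perfect identification of 7 items:", transmitted_information(perfect), "bits")  # log2(7) ~ 2.8
      ```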
  • 8. (Background) Differential equations
    • Video by 3Blue1Brown
      Here are some questions that should help you better understand the video:
      Q1. Under what circumstances are differential equations used?
      Q2. What is the order of a differential equation?
      Q3. What is a phase space?
      Q4. What is an attracting state?
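    • Code sketch (an added illustration, tied to Q1-Q4): Euler integration of a simple first-order differential equation whose fixed point is attracting; plain Python, with made-up numbers.
      ```python
      # dx/dt = -x + 1 has a single fixed point x* = 1, and it is attracting:
      # trajectories started from different initial conditions all converge to it.
      dt, n_steps = 0.01, 1000
      for x0 in (-2.0, 0.0, 3.0):            # several initial conditions
          x = x0
          for _ in range(n_steps):
              x += dt * (-x + 1.0)           # one Euler step
          print("start", x0, "-> after", n_steps * dt, "time units:", round(x, 4))
      ```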
  • 9. Electrical circuit model of a neuron; passive synaptic dynamics and phenomenological models of spiking; integrate-and-fire dynamics
    • Video (Nour Riman)
    • Reading: Ermentrout and Terman (2010) Mathematical Foundations of Neuroscience, Springer. Secs 1.1-1.5 (an electronic version of this book is freely available to all Pitt and CMU students).
    • Attention: Nernst equation and how it differs from GHK equation; time constant of RC model.
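    • Code sketch (an added illustration, not from Ermentrout and Terman): a leaky integrate-and-fire neuron, i.e., the passive RC circuit plus a threshold-and-reset rule, in Python/NumPy; the parameter values are generic textbook-style numbers, not taken from the reading.
      ```python
      # Leaky integrate-and-fire: tau * dV/dt = -(V - E_L) + R * I, with spike and reset at threshold.
      import numpy as np

      dt, T = 0.1, 200.0                    # time step and duration (ms)
      tau, E_L, R = 10.0, -65.0, 10.0       # membrane time constant (ms), rest (mV), resistance (MOhm)
      V_th, V_reset = -50.0, -65.0          # spike threshold and reset (mV)
      I = 2.0                               # constant injected current (nA)

      t = np.arange(0.0, T, dt)
      V = np.full(t.size, E_L)
      spikes = []
      for k in range(1, t.size):
          V[k] = V[k - 1] + (-(V[k - 1] - E_L) + R * I) * (dt / tau)  # Euler step of the RC equation
          if V[k] >= V_th:                                            # threshold crossing
              spikes.append(t[k])
              V[k] = V_reset                                          # reset after the spike
      print("number of spikes in", T, "ms:", len(spikes))
      ```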
  • 10. The Hodgkin-Huxley model of action potential generation
    • Video (Nour Riman)
    • Reading: Ermentrout and Terman (2010) Mathematical Foundations of Neuroscience, Springer. Secs 1.7-1.10.
    • Attention: voltage clamp; distinction between sodium and potassium conductances.
  • 11. Network dynamics (No Video)
    • Reading: Vogels, TP; Rajan, K; Abbott, LF. (2005) Neural network dynamics. Ann. Rev. Neurosci. 28: 357–376.
    • Attention: From a network perspective, the advantages of ongoing activity due to balanced excitation and inhibition (as in Figure 3d,e).
  • 12. (Background) Bayes' Theorem; optimality of Bayesian classifiers; mean squared error; Bayes and maximum likelihood.
    • Video
    • Reading: KEB Secs 4.3.3-4.3.4 through p. 101; 8.1-8.2; 8.3.3.
    • Attention: Theorem on p. 182; Equation (8.10); Figure 8.8.
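    • Code sketch (an added illustration, not from KEB): Bayes' theorem used as a classifier for two classes with Gaussian likelihoods, in Python/NumPy; the means, variance, and priors are made up.
      ```python
      # Bayes classifier for two classes with equal-variance Gaussian likelihoods:
      # compute P(class 1 | x) by Bayes' theorem and classify to the larger posterior.
      import numpy as np

      def posterior_class1(x, prior1=0.5, mu0=0.0, mu1=2.0, sigma=1.0):
          def gauss(v, mu):
              return np.exp(-0.5 * ((v - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
          like1, like0 = gauss(x, mu1), gauss(x, mu0)
          return prior1 * like1 / (prior1 * like1 + (1 - prior1) * like0)

      x = 1.3
      p1 = posterior_class1(x)
      print("P(class 1 | x = 1.3) =", p1, "-> classify as class", 1 if p1 > 0.5 else 0)
      ```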
  • 13. Cognition and optimality; ACT-R
    • Video
    • Background reading: KEB, pp. 102-103, through Example 4.9.
    • Reading: Anderson (2007) How Can the Human Mind Occur in the Physical Universe?, Chapter 1.
    • Attention: Anderson's three "shortcuts"
  • 14. (Background) Statistical tests, ROC curves, signal detection theory
    • Video
    • Reading: KEB Chapter 10 up to the beginning of Sec 10.1.1 (p. 249); Secs 10.4.1; 10.4.3-10.4.4, especially Figure 10.3.
    • Attention: Figures 10.3 and 10.4.
    • Additional reading: The rest of Chapter 10.
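    • Code sketch (an added illustration, not from KEB): an empirical ROC curve from the standard signal-detection setup, in Python/NumPy; d' and the sample sizes are arbitrary.
      ```python
      # Signal detection: noise trials ~ N(0,1), signal trials ~ N(d',1). Sweeping the
      # decision criterion traces out (false-alarm rate, hit rate) pairs, i.e., the ROC curve.
      import numpy as np

      rng = np.random.default_rng(4)
      d_prime = 1.5
      noise = rng.standard_normal(5000)              # internal responses, noise-only trials
      signal = d_prime + rng.standard_normal(5000)   # internal responses, signal trials

      criteria = np.linspace(-4, 6, 200)
      hits = np.array([(signal > c).mean() for c in criteria])          # P("yes" | signal)
      false_alarms = np.array([(noise > c).mean() for c in criteria])   # P("yes" | noise)

      auc = np.trapz(hits[::-1], false_alarms[::-1])     # area under the ROC curve
      print("empirical area under the ROC curve:", auc)  # about 0.86 for d' = 1.5
      ```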
  • 15. Optimal observers in perception and action
    • Video
    • Background reading: KEB Chapter 16 through equation (16.18) on p. 449, especially Example 16.1; see also Example 8.1.
    • Reading: Kording, K.P. and Wolpert, D.M. (2004) Bayesian integration in sensorimotor learning, Nature, 427: 244-247.
    • Additional reading: The rest of Chapter 16.
    • Attention: KEB, Equations (16.11), (16.12); Kording and Wolpert Figure 2.
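    • Code sketch (an added illustration in the spirit of Kording and Wolpert, but with made-up numbers): optimal combination of a Gaussian prior with a Gaussian observation, in Python/NumPy; the posterior mean is a precision-weighted average.
      ```python
      # Bayesian integration with Gaussian prior and Gaussian likelihood:
      # posterior mean = weighted average of prior mean and observation,
      # with weights proportional to the precisions (inverse variances).
      import numpy as np

      mu_prior, sigma_prior = 0.0, 1.0     # prior over the true lateral shift
      sigma_obs = 0.5                      # sensory noise SD of the feedback
      x_obs = 2.0                          # noisy observation on this trial

      w = (1 / sigma_obs**2) / (1 / sigma_obs**2 + 1 / sigma_prior**2)   # weight on the observation
      mu_post = w * x_obs + (1 - w) * mu_prior
      sigma_post = np.sqrt(1 / (1 / sigma_obs**2 + 1 / sigma_prior**2))
      print("posterior mean:", mu_post, " posterior SD:", sigma_post)
      ```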
  • 16. (Background) Regression and generalized regression.
    • Video
    • Reading: KEB Chapter 14 through 14.1 (can skip 14.1.2, 14.1.5); 15.2 through 15.2.4.
  • 17. Firing rate and neural coding; spike trains as point processes
    • Video
    • Reading: KEB Example 14.5, pp. 410-411; Chapter 19 through page 569.
    • Additional reading: Check the rest of Ch 19 and read what interests you; read Sections 1 and 2, and Figures 6 and 9, of Weber and Pillow (2017).
      Weber, A.I. and Pillow, J.W. (2017) Capturing the dynamical repertoire of single neurons with generalized linear models, Neural Comput., 29: 3260-3289.
      Also of potential interest: Chen, Y., Xin, Q., Ventura, V., and Kass, R.E. (2018) Stability of point process spiking neuron models, J. Comput. Neurosci., 46:19-42, especially Figure 8c.
    • Attention: the three types of point processes identified in the lecture, which are needed for the next reading.
    • Additional reading: KEB Figure 19.9 and Weber and Pillow Figure 9.
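    • Code sketch (an added illustration, not from KEB or Weber and Pillow): simulating a spike train as an inhomogeneous Poisson process by thinning, in Python/NumPy; the rate function is made up.
      ```python
      # Thinning: generate candidate spikes at the maximal rate lambda_max, then keep each
      # candidate at time t with probability lambda(t) / lambda_max.
      import numpy as np

      rng = np.random.default_rng(5)
      T, lam_max = 2.0, 60.0                                  # duration (s) and maximal rate (spikes/s)
      lam = lambda t: 30.0 + 25.0 * np.sin(2 * np.pi * t)     # time-varying firing rate

      n_candidates = rng.poisson(lam_max * T)
      candidates = np.sort(rng.uniform(0, T, n_candidates))   # homogeneous process at rate lam_max
      keep = rng.uniform(0, lam_max, n_candidates) < lam(candidates)
      spike_times = candidates[keep]
      print("number of spikes:", spike_times.size, " expected:", 30.0 * T)
      ```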
  • 18. Information theory in neural coding (2 videos)
    • Video 1
    • Video 2
    • Background reading: KEB, Example 4.6.
    • Reading: Nirenberg, S., Carcieri, S.M., Jacobs, A.L. and Latham, P.E. (2001) Retinal ganglion cells act largely as independent encoders, Nature, 411: 698–701.
      Jacobs, A.L., Fridman, G., Douglas, R.M., Alam, N.M., Latham, P.E., Prusky, G.T., and Nirenberg, S. (2009) Ruling out and ruling in neural codes, Proc. Nat. Acad. Sci., 106: 5936–5941.
    • Attention: Figure 3 of Nirenberg et al. and Figure 2 of Jacobs et al.
    • Recommended, especially for comp students: Rieke, F., Warland, D., de Ruyter van Steveninck, R., Bialek, W. (1997) Spikes: Exploring the Neural Code, MIT Press. Read pages 101-113, 148-156. (See Readings)
  • 19. Neural implementation of Bayesian inference
    • Video
    • Reading: Salinas, E. (2006) Noisy neurons can certainly compute, Nature Neurosci., 9: 1349–1350.
      Additional reading: Ma, W.J., Beck, J.M., Latham, P.E., and Pouget, A. (2006) Bayesian inference with probabilistic population codes, Nature Neurosci., 9: 1432–1438.
      Orellana, J., Rodu, J., and Kass, R.E. (2017) Population vectors can provide near optimal integration of information, Neural Comput., 29: 2021-2029.
    • Attention: Figure 1 of Salinas.
  • 20. Population-wide variability: spike count correlations; dimensionality reduction
    • Video
    • Background reading: KEB, Example 6.1, p. 141 (review), and Section 17.3.1, especially the examples.
    • Reading: Averbeck, B.B., Latham, P.E., and Pouget, A. (2006) Neural correlations, population coding, and computation, Nature Reviews Neurosci., 7: 358-366, only through page 360.
      Cunningham, J.P. and Yu, B.M. (2014) Dimensionality reduction for large-scale neural recordings, Nat. Neurosci., 17: 1500-1509, only through p. 1504, up to “Selecting a dimensionality reduction method.”
    • Attention: Figure 1 of Averbeck et al. and Figures 1 and 2 of Cunningham and Yu.
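    • Code sketch (an added illustration, not from Cunningham and Yu): dimensionality reduction by principal component analysis on simulated population spike counts, in Python/NumPy; the latent structure is made up.
      ```python
      # A one-dimensional latent signal drives many neurons; PCA (via the SVD of the
      # centered data matrix) recovers that shared dimension as the first principal component.
      import numpy as np

      rng = np.random.default_rng(6)
      n_trials, n_neurons = 200, 30
      latent = rng.standard_normal(n_trials)                     # shared latent variable
      loadings = rng.standard_normal(n_neurons)                  # how strongly each neuron follows it
      counts = 5.0 * np.outer(latent, loadings) + rng.standard_normal((n_trials, n_neurons))

      centered = counts - counts.mean(axis=0)
      U, S, Vt = np.linalg.svd(centered, full_matrices=False)
      var_explained = S**2 / np.sum(S**2)
      print("fraction of variance along the first principal component:", var_explained[0])
      ```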
  • 21. Neural basis of decision making (systems level)
    • Video
    • Background reading: KEB, Section 11.1.5 and the discussion of SDT in Section 10.4.4.
    • Reading: Gold and Shadlen (2007) The neural basis of decision-making, Ann. Rev. Neuroscience, 30: 535-574, only through the discussion of Figure 5.
    • Attention: Figures 4 and 5c.
  • 22. Network models of working memory and decision-making
    • Video (Chengcheng Huang)
    • Reading: Wang, X.-J. (2008) Decision making in recurrent neuronal circuits, Neuron, 60: 215-234, through the section “Recurrent Cortical Circuit Mechanism,” which ends on p. 223.
  • 23. Oscillations and coordinated neural activity.
    • Video
    • Background reading: KEB, Chapter 18 through Section 18.2.2 and Section 18.5 through 18.5.1.
    • Reading: Fries, P. (2005) A mechanism for cognitive dynamics: neuronal communication through neuronal coherence, TRENDS in Cognitive Sciences, 9: 474-480.
    • Attention: Figure 3a.
  • 24. Reinforcement learning
    • Video
    • Reading: Glimcher, P. (2011) Understanding dopamine and reinforcement learning: The dopamine reward prediction error hypothesis, PNAS, 108: 15647–15654 (with corrections, pp. 17568–17569), through the interpretation of Figure 3.
    • Attention: Figures 2 and 3.
    • Recommended, especially for computational students: Niv, Y. (2009) Reinforcement learning in the brain, J. Math. Psychol., 53: 139-154.
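    • Code sketch (an added illustration, not from Glimcher or Niv): reward-prediction-error learning in its simplest (Rescorla-Wagner) form, in Python/NumPy; the reward probability and learning rate are made up.
      ```python
      # The value estimate V for a cue is nudged by alpha times the prediction error
      # delta = r - V; delta is the quantity the dopamine reward-prediction-error
      # hypothesis identifies with phasic dopamine signals.
      import numpy as np

      rng = np.random.default_rng(7)
      alpha = 0.1                      # learning rate
      V = 0.0                          # initial value estimate for the cue

      for trial in range(200):
          r = rng.binomial(1, 0.8)     # reward delivered on 80% of cue presentations
          delta = r - V                # reward prediction error
          V += alpha * delta           # update toward the experienced outcome
      print("learned value (should be near 0.8):", V)
      ```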
  • 25. Artificial neural networks and systems neuroscience
    • Video
    • Reading: Richards, Lillicrap, Therien, Kording and many others (2019) A deep learning framework for neuroscience, Nature Neuroscience, 22: 761-770.
    • Attention: The three design components of artificial neural networks.
  • 26. Graphs and networks
    • Video
    • Reading: Baum, G.L., ..., Bassett, D.S., and Satterthwaite, T.D. (2017) Modular segregation of structural brain networks supports the development of executive function in youth, Current Biol., 27: 1561-1572.
    • Attention: Graphical abstract and Figure 7 (the methods are explained, briefly, in the video lecture).
    • Recommended: Bassett, D.S., Zurn, P., and Gold, J.I. (2018) On the nature and use of models in network neuroscience, Nature Reviews Neurosci., 19:566-578, only up until "Density of study in this 3D space," p. 571.
  • 27. What is science? (Additional video: wrap-up)
