You'll find here a list of additional material to help you understand the course and go deeper in the concepts.
This is a growing selection accumulated over many previous years; it is by no means exhaustive, and some of it may no longer be relevant.
Additional resources
Student notes
Below are some notes made by previous students. They may contain errors; you are strongly encouraged to report any you find and to contribute corrections. There is significant overlap in some areas, so pick the perspective you prefer.
Theoretical Neuroscience
Topic | Link | Comments |
---|---|---|
Full TN Notes | Jorge's TN notes | A somewhat comprehensive set of notes covering the main parts of the course. Should be particularly useful for mean-field theory of networks. Also includes a list of useful papers to read. |
Differential correlations | Jorge's correlations notes | Some handwritten notes on Peter Latham's lecture on differential correlations (Moreno-Bote et al, 2014) |
Learning rules | Learning rules (Kirsty) | Mainly plasticity/Hebbian rules, summarised from Dayan & Abbott. |
Biophysics notes | Biophysics (Kirsty) | Covers most of Peter Latham's biophysics, at a higher level than Jorge's more comprehensive notes |
Information theory for TN | Information Theory (TN) (Kirsty) | High level definitions and relationships between Information Theory terms |
Machine Learning (including kernels)
Topic | Link | Comments |
---|---|---|
Full ML Notes | Jorge's ML notes | A pretty comprehensive set of notes covering almost everything in the course. There are likely many mistakes; please find them and tell me. If you would like access to the LaTeX code to update it yourself, feel free to ask and I'll give you the Overleaf link. |
Graphical models | Graphical models (Kirsty) | Examples of simple graphical models, understanding independence structures, etc. |
Probability Theory | Stats background notes (Lea) | Basic probability theory, Bayesian Inference, etc. |
Kernels cribsheet | Kernels notes (Kirsty) | Some useful kernels definitions |
Kernels Background | Kernels background notes (Kirsty) | Some useful linear algebra |
Kernels handwritten notes | Full notes (messy), Comparing Prob Distributions (clean), Convex Optimization & SVMs (clean) | Jorge's handwritten kernels notes (there may be mistakes) |
Latex template
It is highly recommended that you use LaTeX for your assignments in this course and others. You may find it helpful to start with this sample assignment template, which demonstrates figures, maths, and other useful things.
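If you have not used LaTeX before, a minimal assignment skeleton looks something like the following. This is only an illustrative sketch; the document class and packages here are common defaults, not necessarily what the linked template uses.

```latex
\documentclass[11pt]{article}
\usepackage{amsmath, amssymb} % maths environments and symbols
\usepackage{graphicx}         % figure inclusion

\title{TN Assignment 1}
\author{Your Name}

\begin{document}
\maketitle

\section{Question 1}
The passive membrane equation can be written as
\begin{equation}
  C \frac{dV}{dt} = -g_L (V - E_L) + I_{\mathrm{ext}}.
\end{equation}

% Figures are included like this (file name is a placeholder):
% \includegraphics[width=0.7\linewidth]{figure1.pdf}

\end{document}
```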
Past exam questions
Some past exam papers can be found on the (old) Gatsby internal wiki. Note that this is only accessible from within the Gatsby network. If any papers are missing, please chase up the PIs and update the wiki.
General reading
- The main course textbook is Theoretical Neuroscience by Dayan and Abbott. The appendix of the book covers a fair amount of the maths needed for the course.
- One recommended and comprehensive maths textbook is Mathematical Methods for Physics and Engineering by Riley, Hobson, and Bence. Strang's Linear Algebra and Its Applications is also excellent.
- For some reading on neuroscience, the most comprehensive textbook is probably Principles of Neural Science by Kandel. If you want something more concise, these Instant Notes in Neuroscience are really good and efficient.
- A nice readable introduction to dynamical systems and phase plane analysis with particular emphasis on single neuron models can be found in Dynamical Systems in Neuroscience by Izhikevich.
- Liam Paninski also has some truly wonderful lecture notes here, if you want to read a little more about the statistical analysis of neural data.
- Some quick references: background cribsheet, matrix identities, Fourier transforms and matrix cookbook.
Plasticity
- Review of plasticity; see in particular Fig. 2:
L. F. Abbott and Sacha B. Nelson (2000). Synaptic plasticity: taming the beast. Nat Neurosci, 3: 1178-1183.
Systems
Kandel (see above) is a great source of information about many aspects of neuroscience. Some additional reading:
- Crick, F. and Asanuma, C. (1986). Certain aspects of the anatomy and physiology of the cerebral cortex. In J. L. McClelland and D. E. Rumelhart (eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. II, chapter 20, pp. 333-371. MIT Press
- Hubel, DH. and Wiesel, T. (1962). Receptive fields, binocular interaction, and functional architecture in the cat’s visual cortex. J. Physiol, 160: 106-154.
- King, AJ. and Nelken, I. (2009). Unraveling the principles of auditory cortical processing: can we learn from the visual system?. Nat Neurosci, 12(6): 698-701
Neural Coding
- An excellent review of the spike-triggered average and related techniques can be found in the following paper:
Schwartz O, Pillow JW, Rust NC, Simoncelli EP. (2006). Spike-triggered neural characterization. Journal of Vision, 6(4):484-507
- Liam Paninski’s notes on the statistical analysis of neural data are also particularly relevant for this part of the course.
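To make the core idea of that review concrete, here is a minimal sketch of spike-triggered averaging in Python/NumPy. Everything here is simulated: the stimulus, the "ground-truth" filter, and the exponential spiking nonlinearity are all made-up illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated experiment (all values made up): Gaussian white-noise stimulus
# passed through a hypothetical linear filter and an exponential nonlinearity.
T, lags = 20000, 15                            # time bins, filter length
stim = rng.standard_normal(T)
true_filter = np.exp(-np.arange(lags) / 4.0)   # assumed ground-truth kernel

# Spike counts with a rate that depends on the filtered stimulus.
drive = np.convolve(stim, true_filter, mode="full")[:T]
rate = np.clip(np.exp(drive - 2.0), 0.0, 10.0)
spikes = rng.poisson(rate)

# Spike-triggered average: spike-count-weighted mean of the stimulus
# segment preceding each spike (index 0 = the spike-time bin).
sta = np.zeros(lags)
n_spikes = 0
for t in range(lags, T):
    if spikes[t] > 0:
        sta += spikes[t] * stim[t - lags + 1 : t + 1][::-1]
        n_spikes += spikes[t]
sta /= n_spikes

# For Gaussian white noise, the STA recovers the filter up to scale;
# compare the shapes via their correlation.
corr = np.corrcoef(sta, true_filter)[0, 1]
print(f"correlation between STA and true filter: {corr:.2f}")
```

The review covers why this works for Gaussian white noise (and when it fails, e.g. for symmetric nonlinearities, where spike-triggered covariance is needed).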
Here are some additional papers selected by R. Williamson:
- This first set of papers provides an excellent background to spike-triggered neural characterization and the use of generalized linear models in neural data analysis.
- Pillow JW, Shlens J, Paninski L, Sher A, Litke AM, Chichilnisky EJ, Simoncelli EP. (2008). Spatio-temporal correlations and visual signaling in a complete neuronal population.. Nature, 454: 995-999.
- Paninski L, Pillow JW, and Lewi J. (2007). Statistical models for neural encoding, decoding, and optimal stimulus design. In Computational Neuroscience: Theoretical Insights Into Brain Function, eds. P Cisek, T Drew, & J Kalaska.
- Pillow JW. (2007). Likelihood-based approaches to modeling the neural code. In Bayesian Brain: Probabilistic Approaches to Neural Coding, eds. K Doya, S Ishii, A Pouget & R Rao. MIT press. 53-70.
- This paper provides an excellent review of the use of nonlinear systems identification in neural coding:
Wu MC, David SV, and Gallant JL. (2006). Complete functional characterization of sensory neurons by systems identification. Annu Rev Neurosci, 29:477-505
- Here are some general references on receptive field estimation (and the issue of nonlinearities); the two Sahani & Linden papers are certainly worth a read.
- Sahani M, and Linden JF. (2003). Evidence optimization techniques for estimating stimulus-response functions. In S. Becker, S. Thrun, and K. Obermayer, eds., Advances in Neural Information Processing Systems, vol. 15, pp. 301–308, Cambridge, MA, 2003. MIT Press.
- [Mentioned in class] Sahani M, and Linden JF (2003). How linear are auditory cortical responses? In S. Becker, S. Thrun, and K. Obermayer, eds., Advances in Neural Information Processing Systems, vol. 15, pp. 109–116, Cambridge, MA, 2003. MIT Press.
- Suzuki K. (2004). Determining the receptive field of a neural filter. J. Neural Eng. I: 228-237
- Klein DJ, Depireux DA, Simon JZ & Shamma, SA. (2000). Robust spectrotemporal reverse correlation for the auditory system: Optimizing stimulus design. J. Comp. Neurosci. 9, 85-111.
- Aertsen, AM. & Johannesma, PI. (1981). The spectro-temporal receptive field. A functional characteristic of auditory neurons. Biol Cybern, 42: 133-43.
- Linden JF, Liu RC, Sahani M, Schreiner CE and Merzenich MM (2003). Spectrotemporal Structure of Receptive Fields in Areas AI and AAF of Mouse Auditory Cortex. J Neurophysiol 90:2660-2675.
- Depireux DA, Simon JZ, Klein DJ, Shamma SA. (2001). Spectro-temporal response field characterization with dynamic ripples in ferret primary auditory cortex. J Neurophysiol, 85(3):1220-34.
Biophysics
- Dayan and Abbott cover biophysics thoroughly, though note that their notation differs from the lectures.
- Dynamical Systems in Neuroscience is also a good source of information, particularly for phase plane analysis and for an overview of many neuron models.
- Spiking Neuron Models by W. Gerstner and W. Kistler covers all of these models as well. The mathematical derivations are well explained, and it also covers population dynamics. See also Scholarpedia.
- Here are Peter Latham's handwritten notes. He used them to teach the course, but they are now a few years old, so the material no longer corresponds closely to the lectures. They may still be useful as a supplement, but they are probably impossible to understand without attending the lectures, and may contain mistakes.
- Lecture 1 - passive neurons and Hodgkin Huxley
- Lecture 2 - simplified model, suitable for nullcline analysis
- Lecture 3 - a little philosophy, and nullcline analysis
- Lecture 4 - passive dendrites and axons
- Lecture 5 - synapses
- Lecture 6 - also synapses (but an older set of notes)
- Lecture 7 - philosophy and phase plane analysis
- Lecture 8 - grand summary of biophysics
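As a small taste of what the first lecture covers, here is a minimal sketch of forward-Euler integration of the passive membrane equation. The parameter values are illustrative choices, not Peter's.

```python
import numpy as np

# Passive membrane equation:  C dV/dt = -g_L (V - E_L) + I_ext
# Parameter values below are made up for illustration.
C = 1.0      # nF
g_L = 0.1    # uS   -> membrane time constant tau = C / g_L = 10 ms
E_L = -70.0  # mV
I_ext = 1.0  # nA (constant current step)

dt = 0.1                        # ms
t = np.arange(0.0, 100.0, dt)
V = np.empty_like(t)
V[0] = E_L

# Forward-Euler integration
for i in range(1, len(t)):
    dVdt = (-g_L * (V[i - 1] - E_L) + I_ext) / C
    V[i] = V[i - 1] + dt * dVdt

# The voltage relaxes exponentially (time constant tau = 10 ms)
# towards the new steady state E_L + I_ext / g_L = -60 mV.
print(f"V after 100 ms: {V[-1]:.2f} mV")
```

Hodgkin-Huxley dynamics add voltage-dependent conductances on top of this passive skeleton, which is where the notes pick up.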
- Also here is an introductory lecture from another course courtesy of Peter Latham:
biophysics.pptx
Networks
- Peter Latham's handwritten notes. Same comments as above.
- Lecture 9 - Wilson-Cowan equations
- Lecture 10 - Computations
- Lecture 11 - Hopfield networks 1
- Lecture 12 - Hopfield networks 2
- Lecture 13 - Line attractors
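To complement the Hopfield lectures, here is a minimal sketch of the standard construction: Hebbian storage plus asynchronous threshold updates. The network size, number of patterns, and seed are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hopfield network with +/-1 units; sizes below are arbitrary.
N, P = 100, 5                             # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights: W = (1/N) sum_mu xi_mu xi_mu^T, no self-connections.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Start from pattern 0 with 10% of its bits flipped.
state = patterns[0].copy()
flipped = rng.choice(N, size=N // 10, replace=False)
state[flipped] *= -1

# Asynchronous updates until no unit changes (or a sweep cap is hit).
for _ in range(20):
    changed = False
    for i in rng.permutation(N):
        new = 1 if W[i] @ state >= 0 else -1
        if new != state[i]:
            state[i] = new
            changed = True
    if not changed:
        break

# Overlap of 1.0 means the stored pattern was recalled exactly.
overlap = state @ patterns[0] / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

With P/N = 0.05, well below the classic capacity of roughly 0.138 N, recall from a lightly corrupted cue should succeed.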
Extra resources
Please give feedback to the TAs about which resources you found helpful, so this list can improve over time.
Population coding
Recommended papers:
- Implications of Neuronal Diversity on Population Coding
- A computational analysis of the Relationship between Neuronal and Behavioral Responses to Visual Motion
- The Effect of Noise Correlations in Populations of Diversely Tuned Neurons
- Robust information propagation through noisy neural circuits
- Neuronal Tuning: To Sharpen or Broaden?
- Information-limiting correlations
Legacy lecture slides
- Population coding slides (contain some of the derivations Peter went through in lecture)
- Correlations slides (might be useful for Gatsby assignment)
Biophysics
Online resources
- Lecture notes from Mark van Rossum
- Neuronal Dynamics online textbook
- Neuronal Dynamics video lectures
- Also see Extra resources page for legacy notes/slides from Peter L.
Textbooks
Math notes from Peter L
(see lecture list above)
Important papers
- Intrinsic dynamics in neuronal networks. Analyses Wilson-Cowan dynamics in different regimes, including lots of nullclines.
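If you want to play with Wilson-Cowan dynamics yourself, here is a minimal sketch. The sigmoid gain function and all weights, inputs, and time constants are made-up illustrative values (chosen to give a single stable fixed point), not taken from the paper or the lectures.

```python
import numpy as np

def f(x):
    """Sigmoid gain function (an illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-x))

# Wilson-Cowan rate equations for an excitatory (E) and inhibitory (I)
# population; parameter values below are made up:
#   tau_E dE/dt = -E + f(w_EE*E - w_EI*I + h_E)
#   tau_I dI/dt = -I + f(w_IE*E - w_II*I + h_I)
w_EE, w_EI, w_IE, w_II = 2.0, 1.5, 2.0, 1.0
h_E, h_I = 0.0, -0.5
tau_E, tau_I = 10.0, 5.0   # ms

dt, T = 0.1, 1000.0        # Euler step and total time (ms)
E, I = 0.1, 0.1
for _ in range(int(T / dt)):
    dE = (-E + f(w_EE * E - w_EI * I + h_E)) / tau_E
    dI = (-I + f(w_IE * E - w_II * I + h_I)) / tau_I
    E += dt * dE
    I += dt * dI

print(f"state after {T:.0f} ms: E={E:.3f}, I={I:.3f}")

# On the E-nullcline, dE/dt = 0 (similarly for I); at a fixed point
# both residuals vanish.
res_E = -E + f(w_EE * E - w_EI * I + h_E)
res_I = -I + f(w_IE * E - w_II * I + h_I)
print(f"nullcline residuals: {res_E:.5f}, {res_I:.5f}")
```

Plotting the two nullcline curves in the (E, I) plane, and varying the weights to move between fixed-point and oscillatory regimes, is a good way to connect this to the regimes analysed in the paper.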
Systems
- Foundational Neuroscience questions and answers. These look like well-written answers to systems-style questions; at least 50% of them seem relevant to our TN course, similar to the questions in the systems section of the short-question exam. Let us know if you find them useful.