Wuhan course in computational neuroscience
Peter Latham
pel@gatsby.ucl.ac.uk
Previous courses:
2021
2022
2023
2024 course
summary of 2022 course
Hopefully this will eventually be updated, but don't hold your breath. ;)
Lectures
1. 24.04.15 Recurrent network of randomly connected neurons, part I.
zoom recording Passcode: W0k8h4w#
youtube
2. 24.04.17 Recurrent network of randomly connected neurons, part II.
zoom recording Passcode: cg4Ovv?7
youtube
3. 24.04.22 Recurrent network of randomly connected neurons, part III.
Very sorry -- I forgot to record the lecture. But almost all of what I
said can be found in the second and third lectures from
last year's course.
4. 24.04.24 Recurrent network of randomly connected neurons, part IV, plus start of Hopfield networks.
zoom recording Passcode: cu23F$NE
youtube
5. 24.04.28 Recurrent network of randomly connected neurons, part V.
Very sorry -- I forgot to record the lecture again. But almost all of what I
said can be found in the fifth and sixth lectures from
last year's course.
6. 24.05.06 Synaptic transmission
(Shangbin Chen PowerPoint lecture notes)
7. 24.05.08 Synaptic plasticity
(Shangbin Chen PowerPoint lecture notes)
8. 24.05.13 Learning in the brain and in deep networks.
zoom recording Passcode: 8+gH^8w2
youtube
9. 24.05.15 Learning -- mainly linear regression, but in the overparameterized regime.
zoom recording Passcode: ^Sa.f7+x
youtube
10. 24.05.20 Backprop in feedforward networks, RNNs and start of transformers.
zoom recording Passcode: K?8Gj4s+
youtube
11. 24.05.22 A little bit on transformers, but mainly RL.
zoom recording Passcode: reqShK3*
youtube
Notes, papers and problems
Randomly connected networks:
The Wilson-Cowan model, which considered the dynamics of average excitatory and inhibitory firing rates
van Vreeswijk and Sompolinsky's classic paper. Hard, but thorough.
firing rate dynamics -- possibly most useful for the construction of
nullclines
followup experimental paper
randomly connected networks
A very detailed writeup of randomly connected networks. Probably way too much information.
Homework problems (to see if you really understand randomly
connected networks)
Problem set 1
Problem 4: practice nullclines
Problem set 2
Problem 1: Wilson-Cowan nullclines
Problem 3: sparse connectivity
Problem 4: continuous time Hopfield network
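For readers working through the Wilson-Cowan material above, here is a minimal numerical sketch of the two-population rate equations. All parameter values are illustrative assumptions, not taken from the course notes.

```python
import numpy as np

# Two-population Wilson-Cowan rate equations (Euler integration):
#   tau_E dE/dt = -E + f(w_EE*E - w_EI*I + h_E)
#   tau_I dI/dt = -I + f(w_IE*E - w_II*I + h_I)
# All weights, inputs, and time constants below are illustrative.

def f(x):
    """Sigmoidal gain function mapping input to a rate in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def simulate(T=200.0, dt=0.1, tau_E=1.0, tau_I=0.5,
             w_EE=12.0, w_EI=10.0, w_IE=10.0, w_II=2.0,
             h_E=-2.0, h_I=-4.0):
    E, I = 0.1, 0.1
    traj = np.empty((int(T / dt), 2))
    for t in range(traj.shape[0]):
        dE = (-E + f(w_EE * E - w_EI * I + h_E)) / tau_E
        dI = (-I + f(w_IE * E - w_II * I + h_I)) / tau_I
        E, I = E + dt * dE, I + dt * dI
        traj[t] = E, I
    return traj

rates = simulate()
print(rates[-1])   # final (E, I) rates
```

Setting dE/dt = 0 and dI/dt = 0 in the equations above gives the E and I nullclines practiced in the homework problems.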
Attractor networks
Hopfield's original paper
Notes
on the Hopfield model
realistic Hopfield networks, simple version
realistic Hopfield networks, more complicated version
line attractor networks
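The Hopfield model covered above can be sketched as follows. This is the standard binary version with a Hebbian outer-product learning rule; network size, pattern count, and noise level are illustrative choices.

```python
import numpy as np

# Binary Hopfield network: store P random +/-1 patterns in N units,
# then recover a stored pattern from a corrupted cue.
rng = np.random.default_rng(0)
N, P = 100, 5                       # neurons, stored patterns (illustrative)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product rule, no self-connections
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Asynchronous updates: each unit aligns with its local field."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern (flip ~10% of units) and let the network clean it up
noisy = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)
overlap = np.mean(recall(noisy) == patterns[0])
print(overlap)   # fraction of units matching the stored pattern
```

With P/N well below the classic capacity of roughly 0.138, the stored patterns are stable fixed points and the corrupted cue is cleaned up.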
Biologically plausible deep learning
one possible method; see the paragraph starting on line 59 for a list
of recent approaches
Synaptic plasticity
Spike-timing dependent plasticity (STDP)
How to stabilize STDP
BCM rule
Ocular dominance columns
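A minimal sketch of the pair-based STDP window discussed in the plasticity lectures: causal pre-before-post pairings potentiate, anti-causal pairings depress. The amplitudes and time constants below are illustrative assumptions, not values from the notes.

```python
import numpy as np

# Pair-based STDP with exponential windows (illustrative parameters):
#   dw = +A_plus  * exp(-dt/tau_plus)   if post follows pre (dt > 0)
#   dw = -A_minus * exp(+dt/tau_minus)  if pre follows post (dt <= 0)
A_plus, A_minus = 0.01, 0.012        # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0     # window time constants (ms)

def dw(delta_t):
    """Weight change as a function of delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:                  # pre before post -> potentiation
        return A_plus * np.exp(-delta_t / tau_plus)
    else:                            # post before pre -> depression
        return -A_minus * np.exp(delta_t / tau_minus)

print(dw(10.0), dw(-10.0))           # potentiation vs. depression
```

Making A_minus * tau_minus slightly larger than A_plus * tau_plus, as here, is one common way to bias the rule toward depression for uncorrelated spiking, which relates to the stabilization question above.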