Seminar Neural Networks

Understanding how information is processed in our brain is one of the great challenges in the natural sciences, where progress has come from a combination of approaches from biology, physics, and information science. Theoretical neuroscience aims to develop models starting from the constituents of our brains (the neurons) to explain, inter alia, perception, sensory-motor integration, and the storage and recall of memory. Machine learning is an important offspring in which impressive progress has been achieved recently. The seminar will introduce important models and concepts in neural networks and the prevalent theoretical techniques for analyzing them. Presentations will focus on hands-on examples with Jupyter notebooks.

Seminar topics include: neuron models, the perceptron, the Boltzmann machine, convolutional networks, the Hopfield model, Bayesian inference, replica theory, backpropagation, Hebbian and reinforcement learning.
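As a taste of the hands-on format, the perceptron from the list above can be demonstrated in a few lines of Python. The sketch below (an illustrative example, not taken from the seminar materials) trains a perceptron with the classic error-driven update rule on the linearly separable AND function:

```python
import numpy as np

# Illustrative perceptron sketch: learn the logical AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # input patterns
y = np.array([0, 0, 0, 1])                      # AND targets

w = np.zeros(2)  # weights
b = 0.0          # bias
lr = 0.1         # learning rate

for epoch in range(20):
    for xi, yi in zip(X, y):
        pred = int(w @ xi + b > 0)      # threshold unit
        # Perceptron rule: update only on misclassified patterns
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

predictions = [int(w @ xi + b > 0) for xi in X]
print(predictions)  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees that this loop finds a separating hyperplane after finitely many updates.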

Literature:

  • Theory of Neural Information Processing Systems, Coolen, Kühn & Sollich (Oxford University Press, 2005)
  • Information Theory, Inference, and Learning Algorithms, MacKay (Cambridge University Press, 2003)
  • Reviews and recent publications as provided