34814 - Neural Systems (2nd Cycle)

Academic Year 2019/2020

  • Teacher: Mauro Ursino
  • Credits: 9
  • SSD: ING-INF/06
  • Language: Italian
  • Modules: Mauro Ursino (Module 1), Elisa Magosso (Module 2)
  • Teaching Mode: Traditional lectures (Module 1); Traditional lectures (Module 2)
  • Campus: Cesena
  • Course: Second cycle degree programme (LM) in Biomedical Engineering (cod. 9243)

Learning outcomes

At the end of the course, the student masters the main theoretical and practical tools concerning the principal neural models, both artificial and physiologically inspired, the learning techniques for neural networks, and the classes of problems that each network type can address. He/she can computationally simulate the behavior of simple neural networks and critically examine the results obtained. He/she can connect the knowledge acquired through the use of mathematical models to many aspects of neurophysiology and cognitive science. He/she knows the main problems related to neuroimaging and electroencephalography.

Course contents

Models of neural cells

The fundamental properties of neural cells. The membrane potential and the role of the main ions. The excitable cell and the action potential. The electric circuit of the cell membrane. The Hodgkin-Huxley model. Introduction to “integrate and fire” models and their advantages. Analysis of the discharge frequency of the “integrate and fire” model excited by a constant current. The insertion of synaptic conductances in the “integrate and fire” model. Model of a neural network of interconnected “integrate and fire” neurons. Advantages and limits of these models.
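
A minimal sketch (in Python) of a leaky “integrate and fire” neuron driven by a constant current, integrated with the forward Euler method, is given below; all parameter values are illustrative assumptions rather than course data.

```python
import numpy as np

def simulate_lif(I=2e-9, tau_m=20e-3, R=1e7, V_rest=-70e-3,
                 V_thresh=-54e-3, V_reset=-80e-3, dt=1e-4, T=1.0):
    """Spike times (s) of a leaky integrate-and-fire neuron under a constant current I (A)."""
    V = V_rest
    spikes = []
    for step in range(int(T / dt)):
        # Membrane equation: tau_m * dV/dt = -(V - V_rest) + R * I
        V += dt / tau_m * (-(V - V_rest) + R * I)
        if V >= V_thresh:            # threshold crossing: emit a spike
            spikes.append(step * dt)
            V = V_reset              # reset after the action potential
    return np.array(spikes)

spikes = simulate_lif()
print(f"Discharge frequency: {len(spikes) / 1.0:.1f} Hz")  # rate over the 1 s window
```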

Models of neural networks

Model simplifications: from the “spiking” neuron to the “firing rate” neuron. Advantages and limits of the simplified model, with indications for choosing the most suitable one. General characteristics of a neural network. Some examples of elementary neural networks: a pure feedforward network, the feedforward + feedback model, excitatory and inhibitory neurons. Synaptic learning, connectionism and the Hebb rule. Experimental evidence on the Hebb rule: homosynaptic and heterosynaptic reinforcement and weakening.
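
As an illustration of the Hebb rule mentioned above, the following is a minimal sketch of the plain Hebbian update for a linear firing-rate unit; the network size, learning rate and input patterns are illustrative assumptions.

```python
# Plain Hebbian update for a linear firing-rate network:
# delta_w_ij = eta * y_i * x_j (weights grow with correlated pre/post activity).
import numpy as np

def hebb_train(patterns, n_out=4, eta=0.05, epochs=5, seed=0):
    rng = np.random.default_rng(seed)
    n_in = patterns.shape[1]
    W = 0.01 * rng.standard_normal((n_out, n_in))   # small random initial weights
    for _ in range(epochs):
        for x in patterns:
            y = W @ x                   # postsynaptic firing rates
            W += eta * np.outer(y, x)   # Hebb rule (growth is unbounded unless
                                        # a normalisation term is added)
    return W

patterns = np.eye(6)                    # six orthogonal input patterns
print(hebb_train(patterns).shape)       # (4, 6) weight matrix
```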

The associative memory

Introduction to hetero-associative memories. The conditioned and unconditioned stimulus. Examples of hetero-associative memories trained with the Hebbian rule. The storage of orthogonal patterns and the interference among non-orthogonal patterns. Main advantages of these memories (robustness, insensitivity to disturbances). Introduction to auto-associative memories. The Hopfield model. Energy in the Hopfield network and the convergence theorem. Content-addressable memories. Analysis of the storage capacity of the Hopfield network. The Hopfield network as a model for the hippocampus. Main anatomical and functional aspects of the hippocampus. Short-term episodic memories.
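
A minimal sketch of a Hopfield auto-associative memory with bipolar units follows: patterns are stored with the Hebbian prescription and recalled from a corrupted cue by asynchronous updates. The number of patterns, their size and the noise level are illustrative assumptions.

```python
import numpy as np

def hopfield_store(patterns):
    """Hebbian weight matrix W = (1/N) * sum_p x_p x_p^T with zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, state, steps=5, rng=np.random.default_rng(0)):
    """Asynchronous updates; the energy E = -1/2 s^T W s never increases."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 100))    # 3 random bipolar patterns
W = hopfield_store(patterns)
cue = patterns[0].copy()
flip = rng.choice(100, size=15, replace=False)   # corrupt 15% of the bits
cue[flip] *= -1
print("overlap after recall:", hopfield_recall(W, cue) @ patterns[0] / 100)
```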

Networks with supervised learning (or error correction networks)

Introduction to networks with a supervisor. Rosenblatt's perceptron. The perceptron learning rule and the convergence theorem. The perceptron as a linear classifier: virtues and limitations. The exclusive-or problem. Extension of the perceptron to networks with a continuous activation function: the delta rule. Multilayer feedforward networks. The backpropagation learning algorithm: training of output and hidden neurons. Advantages and limitations of networks trained with the backpropagation algorithm. Neurophysiological relevance of error-correction networks. The anatomical structure and function of the cerebellum. The cerebellum as a perceptron. Forced learning (or learning with a critic) and the interaction with the environment. Algorithm for forced learning in a stochastic network.
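
A minimal sketch of the Rosenblatt perceptron learning rule on a linearly separable toy problem (the logical AND) is shown below; the learning rate and number of epochs are illustrative assumptions.

```python
import numpy as np

def perceptron_train(X, t, eta=0.5, epochs=20):
    """Return weights w (bias included as an extra weight) for targets t in {0, 1}."""
    Xb = np.hstack([X, np.ones((len(X), 1))])      # append constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for x, target in zip(Xb, t):
            y = 1 if w @ x > 0 else 0              # threshold (step) unit
            w += eta * (target - y) * x            # update only on errors
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 0, 0, 1])                         # AND is linearly separable
w = perceptron_train(X, t)
Xb = np.hstack([X, np.ones((4, 1))])
print((Xb @ w > 0).astype(int))                    # expected: [0 0 0 1]
```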

The self-organized networks

Introduction to learning without a supervisor: its aim and main characteristics. Neural networks for the principal component analysis of a random vector. Lateral inhibition and its role in the neurosensory systems. Competitive networks. Contrast enhancement in a model of the compound eye. The formation of categories through self-organized networks. The “winner takes all” networks. Main limitations of these networks. The Kohonen network and the formation of topological maps. Examples of topological maps in the cerebral cortex for sensory perception.
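
The following is a minimal sketch of a one-dimensional Kohonen map trained on two-dimensional stimuli: the winning unit and its neighbours move toward each input, so neighbouring units come to represent similar stimuli. Map size, learning rate and neighbourhood width are illustrative assumptions.

```python
import numpy as np

def train_som(data, n_units=20, eta=0.2, sigma=2.0, epochs=50, seed=0):
    """Train a 1-D map of n_units prototype vectors on the rows of `data`."""
    rng = np.random.default_rng(seed)
    W = rng.random((n_units, data.shape[1]))       # random initial prototypes
    idx = np.arange(n_units)
    for _ in range(epochs):
        for x in rng.permutation(data):
            winner = np.argmin(np.linalg.norm(W - x, axis=1))    # competition
            h = np.exp(-(idx - winner) ** 2 / (2 * sigma ** 2))  # neighbourhood
            W += eta * h[:, None] * (x - W)        # move winner and neighbours
    return W

data = np.random.default_rng(1).random((200, 2))   # uniform 2-D stimuli
print(train_som(data)[:3])                         # first three ordered prototypes
```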

The retina

Introduction to vision: the retina in higher animals. The photoreceptors, the horizontal cells and the bipolar cells in the retina. Contrast enhancement and motion detection in the retina. The perception of colour: basic principles.
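
A minimal sketch of contrast enhancement by lateral inhibition follows: each unit is excited by its own input and inhibited by its neighbours, so the response overshoots and undershoots at a luminance edge (a Mach-band-like effect). The kernel weights are illustrative assumptions.

```python
import numpy as np

# Step stimulus: 20 dark samples followed by 20 bright samples.
luminance = np.concatenate([np.full(20, 1.0), np.full(20, 2.0)])

# Centre-surround weighting: self-excitation plus weaker inhibition
# from the two neighbours on each side.
kernel = np.array([-0.2, -0.2, 1.0, -0.2, -0.2])
response = np.convolve(luminance, kernel, mode="same")

print(response[17:23])   # undershoot on the dark side, overshoot on the bright side
```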

Large-scale organisation of the brain

Current hypotheses on brain organisation. Principles of information processing in the posterior cortex: the unimodal visual processing streams (what and where) and somatosensory processing. Association among different sensory modalities: the amygdala and the orbitofrontal cortex. The need for alternative types of memories and the corresponding neural networks. The role of the hippocampus. Integrative memory, episodic memory and working memory (prefrontal cortex).

Elements of electroencephalography

The main characteristics of the EEG signal and the cortical column. Principal Component Analysis (PCA) and Independent Component Analysis (ICA): application to the EEG signal. The main EEG rhythms and their physiological significance. Methods for the suppression of artifacts from the EEG. The problem of reconstructing cortical sources from high-density EEG measurements on the scalp. The use of neural mass models for the simulation of EEG signals: the model of a single cortical column and the simulation of networks of interconnected cortical columns.
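
A minimal sketch of PCA applied to a synthetic multichannel recording follows: the components are obtained from the SVD of the centered data, and the dominant component (here a large simulated blink-like transient) is projected out before reconstructing the channels. The signals, mixing weights and amplitudes are illustrative assumptions, not real EEG data.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, T, n_ch = 250, 4, 8                       # sampling rate (Hz), duration (s), channels
t = np.arange(fs * T) / fs

alpha = np.sin(2 * np.pi * 10 * t)            # 10 Hz "alpha-like" rhythm
blink = 20 * np.exp(-((t - 2) ** 2) / 0.01)   # large blink-like transient
mixing = rng.random((n_ch, 2))                # how each source projects to the channels
X = mixing @ np.vstack([alpha, blink]) + 0.1 * rng.standard_normal((n_ch, len(t)))

Xc = X - X.mean(axis=1, keepdims=True)        # centre each channel
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
print("explained variance ratio:", (S**2 / (S**2).sum()).round(3))

scores = U.T @ Xc                             # principal-component time courses
scores[0] = 0.0                               # suppress the dominant (artifact) component
X_clean = U @ scores + X.mean(axis=1, keepdims=True)
```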

Readings/Bibliography

Lecture notes provided by the teacher. This material will be uploaded on the platform for the repository of educational material made available by the University.

The following texts can be useful to deepen some aspects after the exam:

For a comprehensive analysis of the different models of neurons:

· P. Dayan, L.F. Abbott. “Theoretical Neuroscience. Computational and Mathematical Modeling of Neural Systems”. The MIT Press, London, England, 2001.

For the mathematical aspects and the rigorous demonstration of the main neural network properties:

· J. Hertz, A. Krogh, R. G. Palmer. “Introduction to the Theory of Neural Computation”. Addison-Wesley, New York, 1991.

· S. Haykin. “Neural Networks. A Comprehensive Foundation”. IEEE Press, New York, 1994.

For the relationships with neuroscience and cognitive science:

· J.A. Anderson. “An Introduction to Neural Networks”. The MIT Press, Cambridge, MA, 1995.

· E.T. Rolls, A. Treves. “Neural Networks and Brain Function”. Oxford University Press. Oxford, 1998.

· R. C. O'Reilly, Y. Munakata. “Computational Explorations in Cognitive Neuroscience”. The MIT Press, Cambridge, MA, 2000.

For the physiological aspects of neuroscience:

· E.R. Kandel, J.H. Schwartz, T.M. Jessell. “Principles of Neural Science”. McGraw-Hill, 2005.

Teaching methods

The course comprises both ex-cathedra lessons and practical computer exercises using the MATLAB software package. The aim of the lessons is to provide the students with theoretical knowledge about the main models of neurons and the fundamental classes of neural networks, and to make them aware of the advantages and limitations of each available technique. The practical exercises aim at training the students to solve simple real problems with neural networks, and at showing the potential benefits but also the shortcomings of the main models introduced during the course.

Attendance at lectures is strongly recommended, both for the ex-cathedra lessons and for the MATLAB exercises, since all the topics provided in the teaching materials are deepened and commented on in detail by the teacher in the classroom.


Assessment methods

The final exam is an oral interview with the student (duration 45-50 minutes). During the interview, two questions will be asked on two different aspects of the course concerning neuronal modeling (models of neurons, associative networks, error-correction networks, self-organized networks, including a discussion of the exercises). A third question will concern the aspects related to electroencephalography. Each question counts for one third of the total mark.

The aim of the exam is to assess the achievement of the course objectives, in particular:

- knowledge of the main neuron models;

- knowledge of the main kinds of neural networks and their potential applications;

- knowledge of the main problems in computational neuroscience;

- knowledge of the main techniques for processing the electroencephalographic signal;

- the skill to apply the knowledge acquired during the course.

The student's analytical and synthetic skills, command of language, and clarity of exposition also contribute to the final mark.

To be awarded the laude (honours), the student must exhibit very good mastery of the subject in each of the three questions asked. The score is then scaled based on the number and severity of the mistakes made.


Office hours

See the website of Mauro Ursino

See the website of Elisa Magosso