- Docente: Flavio Zabini
- Credits: 6
- SSD: ING-INF/03
- Language: English
- Teaching Mode: Traditional lectures
- Campus: Bologna
Course:
Second cycle degree programme (LM) in Telecommunications Engineering (cod. 9205)
Also valid for Second cycle degree programme (LM) in Electronic Engineering (cod. 0934)
from Sep 18, 2023 to Dec 20, 2023
Learning outcomes
The course aims at providing the basic concepts of Big Data (i.e., volume, velocity, variability, variety, veracity, value). In addition, students will become familiar with three of the main paradigm shifts in communications driven by Big Data: finite-length information theory for a proper study of the machine-type communications required by the IoT, multidimensional stochastic sampling (instead of regular sampling in the time domain) required by crowdsensing, and the use of neural networks (instead of the Von Neumann machine) for applications and services in wireless communications (e.g., broadband, 5G).
Course contents
1. Introduction and Examples (6 hours)
- Considerations and scenarios. What does Big Data mean (the six "V"s) - 1 hour
- Examples of paradigm shifts: from Von Neumann machine to Neural Networks; from regular sampling theory to stochastic sampling theory - 1 hour
- Examples of environmental monitoring (wide areas with real or virtual sensors) and coverage - 4 hours
2. Network services with Big Data (6 hours)
- Basic networking: network layers, SDN and VNF
- Batch processing vs stream processing: constraints and network designs
- Consumer requirements and network design: the value chain
- Data center networks: structure and components, topology, spanning trees, addressing and routing, traffic characteristics
3. Random Sampling and Reconstruction with Big Data (20 hours)
- Regular and irregular sampling. Cauchy formulation. WKS Sampling Theorem. Levinson Theorem (the classical WKS reconstruction formula is recalled after this list) - 1 hour
- From Shannon Sampling Theory to Random Sampling Theory: WKS Sampling Theorem revised - 2 hours
- One-dimensional Poisson sampling (problem formulation, Poisson sampling process in the time domain, Marvasti's spectral theorem) - 2 hours
- Spatial Point Process Theory - 3 hours
- Multidimensional Signal Reconstruction via random sampling - 3 hours
- Uncertainties in realistic scenarios and Big Data related topics - 2 hours
- Application to the example of environmental monitoring - 3 hours
- Application to the example of coverage - 3 hours
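As a reference point for the shift from regular to random sampling listed above, the classical WKS (Whittaker-Kotelnikov-Shannon) reconstruction formula can be recalled. The statement below is a generic textbook formulation, not material taken verbatim from the lectures: a signal x(t) bandlimited to |f| <= B and sampled with period T <= 1/(2B) is perfectly recovered from its samples x(nT):
\[
x(t) \;=\; \sum_{n=-\infty}^{+\infty} x(nT)\,\operatorname{sinc}\!\left(\frac{t - nT}{T}\right),
\qquad
\operatorname{sinc}(u) \triangleq \frac{\sin(\pi u)}{\pi u}.
\]
Random sampling (e.g., Poisson sampling) replaces the regular grid {nT} with a stochastic point process, which is the generalization developed in this part of the course.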
4. Neural networks for Big Data Communications (10 hours)
- Introduction to Neural Networks: components and architectures. Rosenblatt's perceptron model - 1 hour
- Multilayer perceptron model. Cybenko's theorem - 1 hour
- Logical functions (AND, OR, XOR) with neural networks: exercises. Examples of activation functions - 2 hours
- Supervised learning and approximation. Delta rule: proof. Example of supervised learning: the least mean square error problem with a perceptron - 2 hours
- Backpropagation. Limitations of traditional networks - 1 hour
- Laboratory experience ("make your own neural network"; a minimal perceptron sketch is given after this list) - 3 hours
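As a purely illustrative sketch of the topics above (an assumed example, not the course's laboratory code), the snippet below trains a single Rosenblatt-style perceptron with the perceptron learning rule, closely related to the delta rule discussed in the lectures, to realize the logical AND function; the dataset, learning rate, and number of epochs are arbitrary choices for the example.

# Illustrative sketch only (assumed example, not the official lab material):
# a single perceptron trained to realize the logical AND function.
import numpy as np

def step(z):
    # Threshold activation: 1 if z >= 0, else 0
    return np.where(z >= 0, 1, 0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # AND truth table inputs
y = np.array([0, 0, 0, 1])                      # AND targets

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)  # weights
b = 0.0                            # bias
eta = 0.1                          # learning rate

for epoch in range(50):
    for xi, ti in zip(X, y):
        out = step(xi @ w + b)
        err = ti - out             # error drives the weight update
        w += eta * err * xi
        b += eta * err

print("predictions:", step(X @ w + b))  # expected: [0 0 0 1]

The same single unit cannot realize XOR, since XOR is not linearly separable; this is precisely the limitation that motivates the multilayer perceptron and Cybenko's approximation theorem covered in this section.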
5. Deep Neural Networks (3 hours)
- Why Deep Networks
- Choice of the network size: example
- Laboratory experience
6. Bayesian Networks (7 hours)
- Structure (the underlying factorization is recalled after this list) - 3 hours
- Inference - 2 hours
- Applications - 2 hours
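As general background for the structure and inference items above (an illustrative note, not lecture material), a Bayesian network over variables X_1, ..., X_n encodes the factorization of the joint distribution induced by its directed acyclic graph:
\[
p(x_1, \dots, x_n) \;=\; \prod_{i=1}^{n} p\bigl(x_i \mid \mathrm{pa}(x_i)\bigr),
\]
where pa(x_i) denotes the values of the parents of X_i in the graph; inference then amounts to computing posterior distributions of some variables given evidence observed on others.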
7. Boltzmann Machines (8 hours)
- Hidden Markov Networks - 1 hour
- Unit state probability. Proof of the sigmoidal pdf. Equilibrium state. - 2 hours
- Restricted Boltzmann Machines (part I: definition, energy function, conditional independence; the standard formulas are recalled after this list). - 2 hours
- Restricted Boltzmann Machines (part II: marginal probabilities, visible units distribution, interpretation, sigmoidal function with proof, the gradient of the log-likelihood). - 3 hours
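For reference on the items above, the standard formulation of a restricted Boltzmann machine with visible units v and hidden units h can be summarized as follows (a generic textbook statement; the notation may differ from that used in the lectures):
\[
E(\mathbf{v}, \mathbf{h}) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i,j} v_i w_{ij} h_j,
\qquad
p(\mathbf{v}, \mathbf{h}) = \frac{e^{-E(\mathbf{v}, \mathbf{h})}}{Z},
\]
\[
p(h_j = 1 \mid \mathbf{v}) = \sigma\Bigl(b_j + \sum_i v_i w_{ij}\Bigr),
\qquad
\sigma(x) = \frac{1}{1 + e^{-x}},
\]
with Z the partition function; the conditional independence of the hidden units given the visible units (and vice versa) follows from the bipartite structure of the graph.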
Readings/Bibliography
Suggested books:
- I. Goodfellow, Y. Bengio, A. Courville, "Deep Learning", The MIT Press, 2016, www.deeplearningbook.org
- S. Theodoridis, K. Koutroumbas, "Pattern Recognition", Elsevier Academic Press, 2009
- L. E. Sucar, "Probabilistic Graphical Models", Springer, 2015
Teaching methods
Traditional lectures and experimental activity ("make your own neural network") on the teacher's and students' laptops.
Assessment methods
Oral exam at the end of the course.
Students have to achieve a sufficient mark in each of the main topics.
Teaching tools
Blackboard, slides, laptop, DataCamp.
Office hours
See the website of Flavio Zabini