87450 - Models and Numerical Methods in Physics

Academic Year 2023/2024

  • Teaching Mode: Traditional lectures
  • Campus: Bologna
  • Degree Programme: Second cycle degree programme (LM) in Physics of the Earth System (cod. 8626)

    Also valid for Second cycle degree programme (LM) in Physics (cod. 9245)

Learning outcomes

"At the end of the course the student will acquire the tools to build up dynamical models for the evolution of the classical physical systems formed by interacting particles under the influence of external fields. He/she will be able to use numerical techniques for the solution of the corresponding differential equation even in the case of fluctuating fields. In particular, in the limit of a large number of particles the kinetic and the fluid approximations will be developed; in the case of long range interactions the average field equations will be considered, together with self-consistent solutions and collision models based on stochastic processes."

Course contents

"..a large Language Model is just something that compress part of the Internet ....and then it dreams about...."

(Andrej Karpathy, https://youtu.be/zjkBMFhNj_g?si=_CjyJdSOKhvyVZXk)


The course focuses on developing the mathematical and computational tools for understanding the close relation between Entropy, Information and Compression, starting from stochastic processes over finite alphabets, together with a discussion of some concrete applications to Large Language Models and other generative A.I. models.

Methods and techniques lie at the intersection of Dynamical Systems, Statistical Mechanics and Information Theory. The course will combine lectures at the blackboard with numerical investigations (in Python).
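
To give a flavour of these numerical investigations, the following minimal Python sketch (illustrative only, not part of the official course material) estimates the empirical entropy of a text over its finite alphabet and compares it with the rate achieved by a standard compressor:

    # Minimal sketch: empirical entropy vs. compressed size (bits per symbol).
    import math
    import zlib
    from collections import Counter

    def empirical_entropy(text: str) -> float:
        """Shannon entropy (bits/symbol) of the empirical symbol distribution."""
        counts = Counter(text)
        n = len(text)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    sample = "abracadabra " * 200                  # toy source over a small alphabet
    h = empirical_entropy(sample)
    compressed_bits = 8 * len(zlib.compress(sample.encode("utf-8")))
    print(f"empirical entropy : {h:.3f} bits/symbol")
    print(f"zlib (LZ77-based) : {compressed_bits / len(sample):.3f} bits/symbol")

Because the toy sample is periodic, the compressor exploits long-range repetitions and achieves a rate well below the single-symbol entropy estimate, which only sees letter frequencies.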

This online programme will be refined over time, but here is a first list of topics:


- Review of probability theory and dynamical systems.

- Shift spaces over finite alphabets; ergodicity and coverings.

- Entropy of a random variable.

- The Shannon-McMillan-Breiman (SMB) theorem. Example: Shannon's source coding theorem.

- Entropy and coding: asymptotic optimality and Shannon's theorem.

- Coding and entropy: the Lempel-Ziv parsing and coding.

- Relative entropy and entropy production.

- Byte Pair Encoding (see the minimal sketch after this list).

- Deep compression: convolutional and recurrent (LSTM) networks, Variational Auto-Encoders and Transformers (GPT).

- Deep entropy and applications: GANs (Generative Adversarial Networks), probabilistic diffusion models.
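
To fix ideas on the Byte Pair Encoding topic above, here is a minimal Python sketch (an illustration on a toy example, not the implementation used in class): it repeatedly merges the most frequent adjacent pair of symbols into a new token.

    # Minimal, illustrative BPE sketch: repeatedly merge the most frequent
    # adjacent pair of tokens into a new token (toy example, not course code).
    from collections import Counter

    def most_frequent_pair(tokens):
        """Return the most frequent adjacent pair of tokens, or None if empty."""
        pairs = Counter(zip(tokens, tokens[1:]))
        return pairs.most_common(1)[0][0] if pairs else None

    def merge_pair(tokens, pair, new_token):
        """Replace every non-overlapping occurrence of `pair` by `new_token`."""
        merged, i = [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
                merged.append(new_token)
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        return merged

    tokens = list("low lower lowest ")       # start from single characters
    for step in range(5):                    # a handful of merges, for illustration
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        new_token = "".join(pair)
        tokens = merge_pair(tokens, pair, new_token)
        print(f"merge {step + 1}: {pair} -> {new_token!r}")
    print(tokens)

After a few merges, frequent substrings such as "low" become single tokens, which is the vocabulary-building mechanism behind GPT-style tokenizers.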

Readings/Bibliography

Here are just the basic reference books used in the course. All sources, books and papers, will be made available to students in digital format:

  • Notes "Entropy. Information and Large Language Models", M.- Degli Esposti (2024)
  • Shields, P.C. The Ergodic Theory of Discrete Sample Paths. Graduate Studies in Mathematics, AMS 1996.
  • Cover, T.M., and Thomas, J.A.: Elements of Information Theory. John Wiley & Sons, 1991.
  • Andrej Karpathy's Lecture (YouTube)

Teaching methods

Lectures and Numerical Simulations (with Python)

Assessment methods

To be defined.

Teaching tools

Blackboard and Python

Office hours

See the website of Mirko Degli Esposti