**May 11, 2020 at 09:30 - May 15, 2020**
- BCAM

**Yonatan Aljadeff (Weizmann Institute of Science, Israel)**

**DATES:** 11-15 May 2020 (5 sessions)

**TIME:** 9:30 - 11:30 (a total of 10 hours)

**LOCATION:** BCAM Seminar room

**ABSTRACT:**
In this concentrated course I will review seminal results in statistical physics that have been the bedrock of modern theoretical neuroscience. The course will be divided into two parts: first, I will discuss Random Matrix Theory (RMT) and its connections to the dynamics of recurrent neural networks, analyzed using Dynamic Mean-Field Theory (DMFT); second, I will introduce the Perceptron model and its analysis using the replica method.

Since the publication of the original papers in the 1980s, many scientists have extended and generalized these theories pertaining to artificial neural networks, so that these results could be used more directly to understand biological neural networks. Accordingly, the course will introduce the audience to the nontrivial mathematical methods through an in-depth derivation of the classical results, and will discuss a number of notable recent extensions of these results.

This is a unique opportunity to get hands-on with advanced mathematical topics relevant both within and outside of neuroscience. These topics are often left out of curricula, rendering them inaccessible to both students and professionals.

The course is tailored to students and professionals with a background in mathematics, physics, engineering, or computer science. Indeed, the course’s material is highly relevant to current topics in these fields, such as free probability, weight initialization of deep networks, and compressed sensing. The models that will be analyzed are motivated by neurobiological experiments, so the course will also be accessible to neuroscientists with strong quantitative skills, as well as biologists interested in large systems. The lectures will include both slide presentations and blackboard derivations.

**PROGRAMME:**
1. Random matrices: the circular and elliptic laws. A statistical physics derivation of Girko’s circular law (the spectral density of an asymmetric random matrix) and of the elliptic law (the spectral density of a random matrix with correlated entries).
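As a quick numerical illustration of the circular law (a sketch for orientation, not part of the course materials): the eigenvalues of an N x N matrix with i.i.d. Gaussian entries of variance 1/N fill the unit disk in the complex plane as N grows.

```python
import numpy as np

# Circular law: eigenvalues of an N x N matrix with i.i.d. entries of
# variance 1/N approach a uniform density on the unit disk as N -> infinity.
rng = np.random.default_rng(0)
N = 1000
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
eigs = np.linalg.eigvals(J)

# Nearly all eigenvalues lie inside (or very near) the unit circle.
radii = np.abs(eigs)
print(f"max |lambda| = {radii.max():.3f}")
print(f"fraction inside unit disk = {(radii <= 1.0).mean():.3f}")
```

Introducing correlations between the entry pairs (J_ij, J_ji) deforms the disk into an ellipse, which is the content of the elliptic law covered in this session.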

2. Transition to chaos of random networks. Dynamic mean-field analysis of a neural network model with random synaptic weights.
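The transition studied in this session can be previewed with a small simulation (again an illustrative sketch, not course material) of the classical random rate network, dx_i/dt = -x_i + g Σ_j J_ij tanh(x_j) with J_ij ~ N(0, 1/N): DMFT predicts that activity decays to zero for gain g < 1 and becomes chaotic for g > 1.

```python
import numpy as np

# Random rate network: dx_i/dt = -x_i + g * sum_j J_ij * tanh(x_j),
# with J_ij ~ N(0, 1/N). DMFT predicts a transition to chaos at g = 1.
def simulate(g, N=500, T=200.0, dt=0.1, seed=1):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, size=N)
    for _ in range(int(T / dt)):  # simple Euler integration
        x += dt * (-x + g * J @ np.tanh(x))
    return np.sqrt(np.mean(x**2))  # RMS activity at the end of the run

print(f"g = 0.5: final RMS activity = {simulate(0.5):.4f}")  # decays toward 0
print(f"g = 1.5: final RMS activity = {simulate(1.5):.4f}")  # stays O(1)
```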

3. Random networks with structure. Extending the RMT and DMFT results to partially random and partially structured systems. This models, for example, systems with cell-type-dependent interactions, distance-dependent interactions, heterogeneous and correlated degree distributions, and so on.

4. The classical perceptron. Replica symmetric calculation of the capacity of a classical perceptron.
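The replica-symmetric capacity derived in this session, α_c = P/N = 2, can be checked against Cover's exact function-counting theorem: the number of dichotomies of P points in general position in R^N realizable by a perceptron through the origin is C(P, N) = 2 Σ_{k=0}^{N-1} binom(P-1, k). A short computation (illustrative only, not course material) of the separable fraction C(P, N) / 2^P shows the sharp drop at P = 2N:

```python
from math import comb

# Cover's counting theorem: fraction of the 2^P random labelings of P points
# in general position in R^N that a perceptron (through the origin) can realize.
def separable_fraction(P, N):
    return 2 * sum(comb(P - 1, k) for k in range(N)) / 2**P

N = 20
for alpha in (1.0, 2.0, 3.0):
    P = int(alpha * N)
    print(f"alpha = P/N = {alpha}: separable fraction = {separable_fraction(P, N):.4f}")
# At alpha = 2 the fraction is exactly 0.5, marking the classical capacity.
```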

**Registration is free, but mandatory before May 8th.** To sign up, go to https://forms.gle/Edin1kpn72GCwM3EA and fill in the registration form.

**Student grants are available.** Please let us know in the registration form if you need support for travel and accommodation expenses before **April 6th**.