My research group will be primarily focused on four subject areas, namely:

**Mathematical Modelling of Multi-scale Spatio-Temporal Neuronal Dynamics:** This research theme aims to construct parsimonious mathematical models that lead to empirically testable predictions of both normal and pathological brain states. These states are measured at different spatio-temporal scales, from intracellular activity (e.g. protein interactions, chemical and electrical signalling) to large-scale activity (e.g. from single-cell recordings to EEG/LFP, i.e. Electroencephalogram/Local Field Potentials). As an example of recent work, we investigated how the synaptic molecular machinery finely tunes the timing of neurotransmitter release via different modes (synchronous, delayed and spontaneous) and short-term plasticity [1]. Combining experiments and modelling (based on principles of slow-fast dynamical systems), we unravelled a new mathematical structure, which we termed activity-induced transcritical canards, that explains how the synaptic molecular machinery achieves this homeostatic tuning. Figure 2 depicts the multi-scale processes we accounted for: protein-protein interactions (the so-called SNARE-SM biological framework characterised by the 2013 Nobel Prize winner Prof. Thomas Südhof at Stanford University), neurotransmitter release dynamics and dual whole-cell recordings (measuring the electrical activity of hippocampal cells, specifically a pre-synaptic CCK cell and a post-synaptic pyramidal cell). Panel B of Figure 2 compares the model's electrical activity output with measurements of the pyramidal cell. We are now incorporating these results into a new model of a hippocampal microcircuit involving CCK cells, fast-spiking PV cells and pyramidal cells, which have been found to play a fundamental role in epilepsy and Alzheimer's disease.
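The slow-fast mechanism underlying transcritical canards can be conveyed with a minimal toy system (this is an illustrative sketch, not the SNARE-SM model of [1]; the equations and the value of ε below are assumptions chosen for clarity): a fast variable x obeying εẋ = x(y − x), whose critical manifold has two branches, x = 0 and x = y, exchanging stability at a transcritical point, and a slow drift ẏ = −1 that carries the system through that point.

```python
import numpy as np
from scipy.integrate import solve_ivp

EPS = 0.01  # timescale separation: x is fast, y is slow

def slow_fast(t, u):
    # eps * dx/dt = x*(y - x): the critical manifold has branches x = 0
    # and x = y, which exchange stability at the transcritical point (0, 0).
    # dy/dt = -1: the slow variable drifts the system through that point.
    x, y = u
    return [x * (y - x) / EPS, -1.0]

# start on the attracting branch x = y with y = 1
sol = solve_ivp(slow_fast, (0.0, 2.0), [1.0, 1.0],
                max_step=0.001, rtol=1e-8, atol=1e-10)
x, y = sol.y
# while y > 0 the trajectory tracks the attracting branch x = y;
# after the slow passage through the transcritical point the branches
# have exchanged stability and the trajectory settles onto x = 0
```

Plotting x against y makes the exchange of stability visible; the full activity-induced transcritical canard of [1] arises when synaptic activity itself drives such a slow passage.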

*Figure 2: A: Multi-scale model that bridges the gap between protein-protein interactions and neuronal electrical activity. B: Comparison of the model with the measured neuronal activity [1]. NB: the dual whole-cell recording at the bottom of panel A is adapted with permission from [2].*

**Neuronal Control:** To systematically address the question of neuronal controllability, we are developing a novel theoretical framework that combines three well-established theories: bifurcation theory for dynamical systems, pseudo-arclength numerical continuation methods for tracking stability boundaries, and feedback control theory [5]. Combined, these theories enable a robust method that tracks nonlinear oscillations and their bifurcations directly from noisy experimental data (i.e. model-free), measured in closed-loop experiments such as dynamic clamp (see Figure 3). This is ongoing research, and importantly the aim is to provide a framework whereby the experimentalist can visit new cellular states that are unattainable by conventional experiments. This technology will enable the experimentalist to design novel experimental paradigms by placing the cell, for example, at a stability boundary or in an unstable regime, thereby allowing a number of hypotheses to be tested. We envisage that this technology will lead to a better understanding of neuronal dynamics (for example, the onset of pathological states near stability boundaries), serve to validate mathematical models, and facilitate the design of intelligent machine-brain interfaces.
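The core numerical ingredient, pseudo-arclength continuation, can be sketched on a toy scalar system (this is an illustrative assumption, not our closed-loop dynamic-clamp implementation; the function f and all parameter values are chosen for the example): equilibria of ẋ = λ + x − x³ form a branch with two folds, which naive parameter stepping in λ cannot traverse but arclength stepping can.

```python
import numpy as np

def f(x, lam):
    # toy equilibrium condition with two folds: 0 = lam + x - x^3
    return lam + x - x**3

def fx(x, lam):
    return 1.0 - 3.0 * x**2

def flam(x, lam):
    return 1.0

def continue_branch(u0, t0, ds=0.1, nsteps=200, newton_iters=8):
    """Pseudo-arclength continuation: predictor step along the tangent t,
    Newton corrector on the bordered system [f = 0, t.(u - pred) = 0]."""
    us = [np.array(u0, float)]
    t = np.array(t0, float)
    t /= np.linalg.norm(t)
    for _ in range(nsteps):
        u = us[-1]
        pred = u + ds * t          # predictor
        v = pred.copy()
        for _ in range(newton_iters):   # corrector
            x, lam = v
            F = np.array([f(x, lam), t @ (v - pred)])
            J = np.array([[fx(x, lam), flam(x, lam)], t])
            v = v - np.linalg.solve(J, F)
        # secant approximation of the next tangent keeps the orientation,
        # so the branch is traversed through folds without turning back
        t_new = v - u
        t = t_new / np.linalg.norm(t_new)
        us.append(v)
    return np.array(us)

# start on the branch at x = -2 (so lam = x^3 - x = -6), heading toward larger x
branch = continue_branch([-2.0, -6.0], [1.0, 11.0])
```

The bordered Jacobian stays nonsingular at the folds (where fx vanishes), which is precisely why arclength parameterisation can track stability boundaries; in the experimental setting the role of f is played by the measured, feedback-controlled system rather than an explicit model.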

*Figure 3: A: Dynamic-clamp electrophysiology, a computer-controlled closed-loop experiment, which we endow with our novel framework. B: Real-time tracking of solution branches (under parameter variation), stability boundaries and unstable states.*

**Neuronal Computation:** This research line aims at understanding neuronal computations and unveiling their neural code. Under the hypothesis that certain brain regions (in particular the lower brain structures) are equivalent to Turing machines and therefore perform symbolic computations, the question becomes: how do biological neuronal circuits implement Turing machines? We are exploring this question by using notions from representation theory to derive an invertible map that takes symbolic dynamics to a vector space and therefore Turing machines to neural networks (see Figure 4) [6]. Assuming that electrophysiological measurements (observables) live in this vector space, our theoretical framework suggests a possible route to decode neural information from sampled time series (e.g. obtained from central pattern generators), thereby allowing the reconstruction of the associated Turing computation. We envisage that these results will be important in developing neuromorphic circuits (circuits that mimic neuronal circuits), which could then be used as intelligent machine-brain interfaces to treat pathologies, to understand certain brain functions, and in robotics.
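The flavour of such a symbolic-to-vectorial map can be conveyed with a simple hedged example (a Gödel-style encoding of binary words, chosen for illustration; it is not the representation-theoretic construction of [6]): symbol sequences are encoded as points of the unit interval, and the symbolic shift becomes an affine map on those points.

```python
# Goedel-style encoding of a finite binary word s_0 s_1 ... s_{n-1}
# into [0, 1): the word maps to sum_k s_k * 2^{-(k+1)}.
def encode(word):
    return sum(int(s) * 2.0 ** -(k + 1) for k, s in enumerate(word))

# The symbolic shift (dropping the leading symbol s_0) becomes an
# affine map on the encoded value: x -> 2x - s_0.
def shift(x, s0):
    return 2.0 * x - s0

x = encode("1011")
x_shifted = shift(x, 1)   # drop the leading '1'
# x_shifted equals encode("011"): the symbolic dynamics and the
# vector-space dynamics commute under the (invertible on finite
# words) encoding
```

Piecewise-affine maps of this kind are exactly the sort of dynamics that simple neural units can realise, which is the intuition behind mapping Turing machines to neural networks.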

*Figure 4: Mapping Turing machines to multi-scale neuro-circuits. Observables of the model are then compared/correlated with measurements, and via our mapping the associated computation and neural code could potentially be determined [6].*

**Dynamic-based Machine Learning:** Machine Learning (ML) has been successfully applied to numerous applications, from engineering to biomedical data. In ML applied to biomedical data, a typical problem is that of identifying classifiers in large data sets, for example, distinguishing from epileptic data which patients are treatable by anti-epileptic drugs (AED) and which are insensitive to AED. Techniques such as kernel methods and support vector machines lift the data (via an operator φ) to a higher-dimensional space (the feature space), where classification becomes possible (see Figure 5 (a)). However, these techniques are unable to extract dynamical modes, since they assume a static (i.e. atemporal) structure in the data (parameterised by a fixed parameter ω). Therefore, typical ML methods have difficulty dealing with data emerging from multi-scale spatio-temporal systems in which complex oscillations vary across different epochs of the data, as observed, for example, in 3Hz EEG (poly)spike-wave data in absence seizures (see Figure 5 (b2)). To circumvent this, we are investigating the idea of endowing ML with knowledge about multi-scale models. To illustrate the point, we have developed a mean-field model that replicates 3Hz EEG (poly)spike-wave data (see Figure 5 (c1-c2)) and consistently relates changes in specific model parameters to clinical recordings (see Figure 5 (b1-b2 and d)) [4]. Interestingly, fitting the model to epochs of data allows us to trace a pathway in a two-parameter bifurcation diagram (see Figure 5 (d)) that is consistent across different seizure episodes of the same patient. Moreover, the pathway (with some variability) dictates whether or not the patient is responsive to AED.
This insight reveals that multi-scale spatio-temporal models with slowly varying quantities (parameters) provide a robust technique for classifying dynamically varying complex data, complementary to purely data-based approaches. The hope is that these techniques will prove robust enough to be employed in a clinical setting.
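The static lifting idea of panel (a) in Figure 5 can be sketched with a standard toy example (illustrative only; the feature map φ and the threshold below are assumptions for the sketch, not the classifiers used on the clinical data): points on two concentric circles are not linearly separable in the plane, but become separable after lifting with φ(x₁, x₂) = (x₁, x₂, x₁² + x₂²).

```python
import numpy as np

rng = np.random.default_rng(0)

# two classes: points on a circle of radius 1 (class 0) and radius 3 (class 1)
n = 200
theta = rng.uniform(0.0, 2.0 * np.pi, n)
radius = np.where(np.arange(n) < n // 2, 1.0, 3.0)
X = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])
labels = (np.arange(n) >= n // 2).astype(int)

# static lifting operator phi: append the squared norm as a third feature
def phi(X):
    return np.column_stack([X, (X ** 2).sum(axis=1)])

# in the feature space a single linear threshold on the lifted coordinate
# separates the classes (radius^2 = 1 vs radius^2 = 9; threshold at 4)
predictions = (phi(X)[:, 2] > 4.0).astype(int)
accuracy = (predictions == labels).mean()
```

A fixed φ of this kind is exactly the "static structure" limitation discussed above: it cannot capture how the oscillatory regime changes from epoch to epoch, which is what fitting slowly varying model parameters per epoch adds.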

*Figure 5: (a) Schematic representation of a learning algorithm that classifies data by lifting it to a feature space. (b1) One-parameter bifurcation diagram of the model [4]. (b2) 3Hz (poly)spike-wave EEG seizure data whose dynamic evolution can be mapped, epoch by epoch, to the various solution branches shown in panel (b1). (c1-c2) Comparison between data (c1) and model (c2). (d) Two-parameter bifurcation diagram allowing a classification of the data.*

REFERENCES

[1] Rodrigues S., Desroches M., Krupa M., Cortes J. M., Ali A. B., Sejnowski T. J., Proc. Natl. Acad. Sci. USA, 113, E1108-E1115, 2016.

[2] Afia et al., Journal of Neuroscience, 21(9), 2992-2999, 2001.

[3] Desroches M., Guillamon A., Ponce E., Prohens R., Rodrigues S., Teruel A. E., SIAM Review, 58(4):653-691, 2016.

[4] Marten F., Rodrigues S., Benjamin O., Richardson M. P., Terry J. R., Phil.Trans. of the Royal Society A, 367,1145-1161, 2009.

[5] Sieber J., Rapaport A., Rodrigues S., Desroches M., Bioproc. Biosys. Eng 36(10), 1497-1507, 2013.

[6] Carmantini G., beim Graben P., Desroches M., Rodrigues S., Neural Networks, 85(1), 85-105, 2017.