Verónica Álvarez will defend her doctoral thesis on Monday, 25th September

  • The defense will be held at the Faculty of Computer Science of the UPV/EHU in Donostia

Verónica Álvarez received her Bachelor’s degree in Mathematics from the University of Salamanca in 2019 and her Master’s degree in Mathematical Research from the Polytechnic University of Valencia in 2020.

She joined the Basque Center for Applied Mathematics – BCAM in July 2019. Her main scientific interests include statistics, data science, and machine learning.

Her PhD thesis, Supervised Learning in Time-dependent Environments with Performance Guarantees, has been supervised by Santiago Mazuelas (BCAM) and Jose Antonio Lozano, Scientific Director at BCAM.

The defense will be held at the Faculty of Computer Science of the UPV/EHU in Donostia on Monday, 25th September at 11:00.

On behalf of all BCAM members, we would like to wish Verónica the best of luck in her upcoming thesis defense.


PhD Thesis Title:

Supervised Learning in Time-dependent Environments with Performance Guarantees


In practical scenarios, it is common to learn from a sequence of related problems (tasks). Such tasks are usually time-dependent in the sense that consecutive tasks are often significantly more similar than tasks that are far apart in the sequence. Time-dependency is common in multiple applications such as load forecasting, spam mail filtering, and face emotion recognition. For instance, in the problem of load forecasting, the consumption patterns in consecutive time periods are significantly more similar because human habits and weather factors change gradually over time.

Learning from a sequence of tasks holds promise to enable accurate performance even with few samples per task by leveraging information from the different tasks. However, harnessing the benefits of learning from a sequence of tasks is challenging because the tasks are characterized by different underlying distributions. Most existing techniques are designed for situations where the tasks’ similarities do not depend on their order in the sequence. Existing techniques designed for time-dependent tasks adapt to changes between consecutive tasks accounting for a scalar rate of change by using a carefully chosen parameter such as a learning rate or a weight factor. However, the tasks’ changes are commonly multidimensional, i.e., the time-dependency often varies across the different statistical characteristics that describe the tasks. For instance, in the problem of load forecasting, the statistical characteristics related to weather factors often change differently from those related to generation.

In this dissertation, we establish methodologies for supervised learning from a sequence of time-dependent tasks that effectively exploit information from all tasks, provide multidimensional adaptation to the tasks’ changes, and offer computable, tight performance guarantees. We develop methods for supervised learning settings where tasks arrive over time, including techniques for supervised classification under concept drift (SCD) and techniques for continual learning (CL). In addition, we present techniques for load forecasting that can adapt to time changes in consumption patterns and assess intrinsic uncertainties in load demand. The numerical results show that the proposed methodologies can significantly improve the performance of existing methods on multiple benchmark datasets. This dissertation makes theoretical contributions leading to efficient algorithms for multiple machine learning scenarios that provide computable performance guarantees and performance superior to state-of-the-art techniques.
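As a toy illustration of the scalar-versus-multidimensional distinction described in the abstract (a minimal sketch, not code from the thesis; the function name, rates, and simulated data are all hypothetical), the snippet below tracks the per-feature means of a drifting data stream with exponential forgetting. A single scalar rate forces every statistical characteristic to adapt at the same speed, whereas one rate per feature can follow a fast-drifting feature closely while keeping a nearly constant feature stable:

```python
import random

def track_means(samples, forget):
    """Exponentially weighted tracking of per-feature means.

    `forget` is a list with one forgetting rate per feature; passing the
    same value for every feature recovers the usual scalar-rate update.
    """
    mean = list(samples[0])
    for x in samples[1:]:
        # Per-dimension update: each feature adapts at its own rate.
        mean = [(1 - a) * m + a * v for m, a, v in zip(mean, forget, x)]
    return mean

rng = random.Random(0)
T = 200
# Simulated stream: feature 0 drifts linearly from 0 to 5,
# feature 1 stays at 1; both carry small Gaussian noise.
samples = [
    [5.0 * t / (T - 1) + 0.1 * rng.gauss(0, 1), 1.0 + 0.1 * rng.gauss(0, 1)]
    for t in range(T)
]

scalar_est = track_means(samples, [0.05, 0.05])  # one rate for everything
vector_est = track_means(samples, [0.3, 0.01])   # fast rate only where drift is fast

print("scalar-rate estimate:", scalar_est)
print("per-feature estimate:", vector_est)
```

With a single rate, the estimate either lags behind the drifting feature or becomes noisy on the stable one; the per-feature rates avoid this trade-off. This is only meant to convey the intuition behind multidimensional adaptation, not the actual methods developed in the dissertation.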