
IDA Machine Learning Seminars - Spring 2024


The IDA Machine Learning Seminars are a series of research presentations given by nationally and internationally recognized researchers in the field of machine learning.

• You can subscribe to the email list used for announcing upcoming seminars here.
• You can subscribe to the seminar series' calendar using this ics link.


Friday, March 15, 10:30, 2024

Learning Paradigms for Timeseries (TS) and SpatioTemporal (ST) Data (and Tasks):
Towards Generative AI for TS and ST
Flora Salim, Professor in the School of Computer Science and Engineering, University of New South Wales (UNSW)

Abstract: The initial release of ChatGPT saw a worldwide uptake of more than 100 million users within just two months of its launch in November 2022. Several key milestones led to the development of the foundation models that underpin ChatGPT, including the introduction of the Transformer architecture and the self-supervised learning paradigm. How have these underpinning technologies been applied in the pervasive computing domain, such as for human behaviour modelling and for traffic and weather forecasting? Access to annotated human behaviour data has been expensive and often infeasible. This demands new ways of modelling behaviours at scale, moving away from discriminative, fully-supervised learning approaches and from narrow tasks. The heterogeneity of both the data sources and the downstream tasks, as well as the lack of annotations, makes self-supervised learning a compelling choice, as it requires no labelled data and can be made compact and generalisable. I will present our self-supervised learning (SSL) pretraining approaches for multimodal sensor data, as well as recent work on multimodal self-supervision. I will show why the Transformer architecture, designed for sequence-to-sequence modelling with its multi-head attention mechanism, is a perfect fit for time-series data. When combined with graph structures, it becomes a powerful combination for spatiotemporal prediction tasks. I will also present our work on leveraging Large Language Models (LLMs) for time-series modelling, such as traffic forecasting and energy demand forecasting, using natural language prompts. Finally, I will discuss open issues around these models, including fairness and explainability, and present our ongoing projects to address them.
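
As an illustration of the kind of self-supervised pretraining for sensor time series mentioned in the abstract, here is a minimal masked-reconstruction sketch with a Transformer encoder in PyTorch. It is not the speaker's code; the model, masking scheme, and all shapes and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the speaker's code): masked-reconstruction self-supervised
# pretraining of a Transformer encoder on multichannel sensor time series.
import torch
import torch.nn as nn

class TSTransformer(nn.Module):
    def __init__(self, n_channels=6, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)            # per-timestep projection
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)  # multi-head self-attention over time
        self.head = nn.Linear(d_model, n_channels)             # reconstruct raw channels

    def forward(self, x):                                      # x: (batch, time, channels)
        return self.head(self.encoder(self.embed(x)))

def pretrain_step(model, x, mask_ratio=0.15):
    # Zero out random timesteps and reconstruct them -- no labels required.
    mask = torch.rand(x.shape[:2], device=x.device) < mask_ratio   # (batch, time)
    recon = model(x.masked_fill(mask.unsqueeze(-1), 0.0))
    return ((recon - x)[mask] ** 2).mean()                         # MSE on masked positions only

model = TSTransformer()
x = torch.randn(8, 128, 6)        # 8 windows of 128 timesteps, 6 hypothetical sensor channels
loss = pretrain_step(model, x)
loss.backward()
```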

Bio: Flora Salim is a Professor in the School of Computer Science and Engineering (CSE), the inaugural Cisco Chair of Digital Transport & AI, University of New South Wales (UNSW) Sydney, and the Deputy Director (Engagement) of the UNSW AI Institute. Her research is on machine learning for time-series and multimodal sensor data and on trustworthy AI. She has received several prestigious fellowships, including the Humboldt-Bayer Fellowship, the Humboldt Fellowship, the Victoria Fellowship, and an ARC Australian Postdoctoral (Industry) Fellowship. She was a recipient of the Women in AI Awards 2022 Australia and New Zealand (Defence and Intelligence Category). She has worked with many industry and government partners and managed large-scale research and innovation projects, leading to several patents and deployed systems. She is a member of the Australian Research Council (ARC) College of Experts. She has served as a Senior Area Chair / Area Chair of AAAI, WWW, NeurIPS, and many other top-tier conferences in AI and ubiquitous computing. She is an Editor of IMWUT, Associate Editor-in-Chief of IEEE Pervasive Computing, Associate Editor of ACM Transactions on Spatial Algorithms and Systems, a Steering Committee member of ACM UbiComp, and an Associate of ELLIS Alicante.

Location: Alan Turing



Wednesday, March 20, 15:15, 2024

Can neural ODEs forecast global weather?
Markus Heinonen, Department of Computer Science, Aalto University

Abstract: Neural ODEs have emerged over the last decade as a new perspective on modelling dynamics: the time derivative that drives the system's evolution forward is learned as a neural network. While successful on systems of limited complexity, large-scale demonstrations have been lacking. Recently, large autoregressive transformer models have achieved breakthrough performance in global weather forecasting, but at times with little consideration of the underlying dynamics. We consider global weather as a continuous-time PDE with mass-preserving dynamics, and show how simple convolutional networks can achieve state-of-the-art weather prediction performance with just a few million parameters. This talk is based on the paper ClimODE: Climate Forecasting With Physics-informed Neural ODEs, accepted for an oral presentation at ICLR 2024.
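
To make the core idea concrete, a minimal neural-ODE sketch is given below: a neural network parameterises the time derivative of the state, and a forecast is obtained by integrating it forward. This is a generic illustration under assumed names and sizes, not the ClimODE implementation.

```python
# Minimal sketch (not the ClimODE code): a neural network models du/dt and a
# forecast is produced by integrating that learned derivative forward in time.
import torch
import torch.nn as nn

class Derivative(nn.Module):
    """Learned du/dt = f_theta(u, t)."""
    def __init__(self, dim=32, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, t, u):
        return self.net(u)          # time-invariant dynamics, for simplicity

def integrate(f, u0, t0=0.0, t1=1.0, steps=100):
    # Fixed-step Euler integration; in practice an adaptive ODE solver
    # (e.g. torchdiffeq.odeint) would typically be used instead.
    u, t = u0, t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        u = u + dt * f(t, u)
        t = t + dt
    return u

f = Derivative()
u0 = torch.randn(4, 32)    # batch of initial states (e.g. flattened gridded weather features)
u1 = integrate(f, u0)      # forecast at t1; training matches u1 to observed future states
```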

Bio: Markus Heinonen is an Academy Research Fellow at Aalto University, Finland, with a PhD from the University of Helsinki (2013). His research interests centre on probabilistic machine learning, with an emphasis on understanding the uncertainty of deep learning through Bayesian perspectives on neural networks. In addition, he has worked on learning ODEs and PDEs with neural networks and Gaussian processes.

Location: Ada Lovelace



Wednesday, April 17, 15:15, 2024

TBA
Bertrand Charpentier, Data Analytics and Machine Learning Group, Technical University of Munich

Abstract: TBA
Location: TBA



Wednesday, May 15, 15:15, 2024

TBA
Yingzhen Li, Department of Computing, Imperial College London

Abstract: TBA
Location: TBA




Future Seminars

Fall 2024


Past Seminars

Fall 2023   |   Spring 2023   |   Fall 2022   |   Spring 2022   |   Spring 2021   |   Spring 2020   |   Fall 2019   |   Spring 2019   |   Fall 2018   |   Spring 2018   |   Fall 2017   |   Spring 2017   |   Fall 2016   |   Spring 2016  |   Fall 2015   |   Spring 2015   |   Fall 2014



The seminars are typically held every fourth Wednesday at 15:15-16:15 in Alan Turing.
For further information, or if you want to be notified about the seminars by e-mail, please contact Fredrik Lindsten.


Page responsible: Fredrik Lindsten
Last updated: 2024-03-14