Agenda

17 Apr 2024 15:00

Physics-inspired learning on graphs

Aula Epsilon 1, Edificio EPSILON - Campus Scientifico, via Torino

Speaker: Michael Bronstein, Oxford University

Abstract:
The message-passing paradigm has been the “workhorse” of deep learning on graphs for several years, making graph neural networks (GNNs) highly successful in a wide range of applications, from particle physics to protein design. From a theoretical viewpoint, it established a link to the Weisfeiler-Lehman hierarchy, making it possible to analyse the expressive power of GNNs. We argue that the very “node-and-edge”-centric mindset of current graph deep learning schemes may hinder future progress in the field. As an alternative, we propose physics-inspired “continuous” learning models that open up a new trove of tools from differential geometry, algebraic topology, and differential equations that have so far been largely unexplored in graph ML.
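
To make the contrast concrete, the short sketch below (an illustration added to this announcement, not material from the talk) compares one discrete message-passing update with one forward-Euler step of the graph heat equation dx/dt = -Lx, where L is the graph Laplacian; the toy graph, the step size tau, and all variable names are assumptions made purely for this example.

    # Illustrative only: a 4-node path graph with one scalar feature per node.
    import numpy as np

    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)   # adjacency matrix
    D = np.diag(A.sum(axis=1))                  # degree matrix
    L = D - A                                   # combinatorial graph Laplacian
    x = np.array([1.0, 0.0, 0.0, 0.0])          # node features

    # Discrete message passing: each node aggregates (here, averages) its neighbours.
    x_mp = (A @ x) / A.sum(axis=1)

    # Continuous, physics-inspired view: one forward-Euler step of dx/dt = -Lx.
    tau = 0.1                                   # step size (assumed)
    x_diff = x - tau * (L @ x)

    print("message passing:", x_mp)    # [0.0, 0.5, 0.0, 0.0]
    print("diffusion step: ", x_diff)  # [0.9, 0.1, 0.0, 0.0]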

Bio Sketch:
Michael Bronstein is the DeepMind Professor of AI at the University of Oxford. He previously served as Head of Graph Learning Research at Twitter, professor at Imperial College London, and held visiting appointments at Stanford, MIT, and Harvard. He is the recipient of the Royal Society Wolfson Research Merit Award, Royal Academy of Engineering Silver Medal, Turing World-Leading AI Research Fellowship, five ERC grants, two Google Faculty Research Awards, and two Amazon AWS ML Research Awards. He is a Member of the Academia Europaea, Fellow of IEEE, IAPR, BCS, and ELLIS, ACM Distinguished Speaker, and World Economic Forum Young Scientist. In addition to his academic career, Michael is a serial entrepreneur and founder of multiple startup companies, including Novafora, Invision (acquired by Intel in 2012), Videocites, and Fabula AI (acquired by Twitter in 2019). He is the Chief Scientist at VantAI and scientific advisor at Recursion Pharmaceuticals.

Language

The event will be held in English

Organized by

Dipartimento di Scienze Ambientali, Informatica e Statistica - Luca Cosmo
