Authors: Fey, Matthias
Title: On the power of message passing for learning on graph-structured data
Language (ISO): en
Abstract: This thesis proposes novel approaches for machine learning on irregularly structured input data such as graphs, point clouds and manifolds. Specifically, we break with the regularity restriction of conventional deep learning techniques, and propose solutions for designing, implementing and scaling up deep end-to-end representation learning on graph-structured data, known as Graph Neural Networks (GNNs). GNNs capture local graph structure and feature information by following a neural message passing scheme, in which node representations are recursively updated in a trainable and purely local fashion. In this thesis, we demonstrate the generality of message passing through a unified framework suitable for a wide range of operators and learning tasks. Specifically, we analyze the limitations and inherent weaknesses of GNNs and propose efficient solutions to overcome them, both theoretically and in practice, e.g., by conditioning messages via continuous B-spline kernels, by utilizing hierarchical message passing, or by leveraging positional encodings. In addition, we ensure that our proposed methods scale naturally to large input domains. In particular, we propose novel methods to fully eliminate the exponentially increasing dependency of nodes over layers that is inherent to message passing GNNs. Lastly, we introduce PyTorch Geometric, a deep learning library for implementing and working with graph-based neural network building blocks, built upon PyTorch.
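The neural message passing scheme described in the abstract — each node recursively updating its representation from its own features and aggregated neighbor messages, using only local information — can be illustrated with a minimal sketch. The following is not code from the thesis or from PyTorch Geometric; it is a hypothetical NumPy illustration in which messages are linear transforms of neighbor features, aggregation is a sum, and the update applies a ReLU:

```python
import numpy as np

def message_passing_layer(x, edges, w_self, w_neigh):
    """One illustrative message passing layer (hypothetical, not the thesis' API).

    x:       [num_nodes, in_dim] node feature matrix
    edges:   list of directed (src, dst) pairs
    w_self:  [in_dim, out_dim] weight for a node's own features
    w_neigh: [in_dim, out_dim] weight applied to incoming messages
    """
    num_nodes = x.shape[0]
    agg = np.zeros((num_nodes, w_neigh.shape[1]))
    for src, dst in edges:
        # message from src to dst, accumulated by sum aggregation
        agg[dst] += x[src] @ w_neigh
    # update: combine self-transform and aggregated messages, then ReLU
    return np.maximum(x @ w_self + agg, 0.0)

# usage: a 3-node path graph 0 - 1 - 2 with edges in both directions
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
w_self = rng.normal(size=(4, 8))
w_neigh = rng.normal(size=(4, 8))
h = message_passing_layer(x, edges, w_self, w_neigh)
print(h.shape)  # (3, 8)
```

Stacking such layers makes each node's representation depend on an exponentially growing neighborhood — precisely the scaling issue the abstract says the thesis addresses.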
Subject Headings: Graph neural networks
Deep learning
Subject Headings (RSWK): Graph
Deep Learning
URI: http://hdl.handle.net/2003/41059
http://dx.doi.org/10.17877/DE290R-22906
Issue Date: 2022
Appears in Collections:LS 07 Graphische Systeme

Files in This Item:
File: dissertation_fey.pdf | Description: DNB | Size: 10.78 MB | Format: Adobe PDF


This item is protected by original copyright.

Items in Eldorado are protected by copyright, with all rights reserved, unless otherwise indicated.