Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Mutzel, Petra | -
dc.contributor.author | Morris, Christopher | -
dc.date.accessioned | 2020-02-19T06:22:38Z | -
dc.date.available | 2020-02-19T06:22:38Z | -
dc.date.issued | 2019 | -
dc.identifier.uri | http://hdl.handle.net/2003/38718 | -
dc.identifier.uri | http://dx.doi.org/10.17877/DE290R-20637 | -
dc.description.abstract | Linked data arise in many real-world settings, from chemical molecules and protein-protein interaction networks to social networks. Hence, in recent years, machine learning methods for graphs have become an important area of research. The present work deals with (supervised) graph classification, i.e., given a set of (class-)labeled graphs, we want to train a model that can predict the labels of unlabeled graphs. To this end, we map each graph to a (usually) high-dimensional vector so that standard machine learning techniques can be applied. In the first part of this thesis, we present kernel methods for graphs. Specifically, we present a framework for scalable kernels that can handle continuous vertex and edge labels, which often arise in real-world graphs, e.g., as chemical and physical properties of atoms. Moreover, we present a graph kernel that can take global or higher-order graph properties into account, which are usually not captured by other graph kernels. To this end, we propose a local version of the $k$-dimensional Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem. We show that a variant of our local algorithm has at least the same power as the original algorithm, while taking the sparsity of the underlying graph into account and preventing overfitting. To make this kernel scalable, we present an approximation algorithm with theoretical guarantees. Subsequently, we introduce a theoretical framework for the analysis of graph kernels, show that most kernels are not capable of distinguishing simple graph-theoretic properties, and propose a theoretically well-founded graph kernel that can distinguish these properties. The second part of this thesis deals with neural approaches to graph classification and their connection to kernel methods. We show that the expressivity of so-called graph neural networks is upper-bounded by the $1$-dimensional Weisfeiler-Leman graph isomorphism heuristic. We then leverage this insight to propose a generalization that is provably more powerful than graph neural networks at distinguishing non-isomorphic graphs. | en
dc.language.iso | en | de
dc.subject | Graph kernel | en
dc.subject | Machine learning with graphs | en
dc.subject | Graph neural networks | en
dc.subject.ddc | 004 | -
dc.title | Learning with graphs: kernel and neural approaches | en
dc.type | Text | de
dc.contributor.referee | Kersting, Kristian | -
dc.date.accepted | 2019-12-20 | -
dc.type.publicationtype | doctoralThesis | de
dc.subject.rswk | Kern <Mathematik> | de
dc.subject.rswk | Graph | de
dc.subject.rswk | Maschinelles Lernen | de
dc.subject.rswk | Neuronales Netz | de
dcterms.accessRights | open access | -
eldorado.secondarypublication | false | de
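
The abstract above refers repeatedly to the Weisfeiler-Leman heuristic. The following is a minimal sketch of its $1$-dimensional variant (color refinement), included only for illustration; the function name, the string encoding of colors, and the fixed number of refinement rounds are assumptions of this sketch, not details taken from the thesis.

from collections import Counter

def wl_1_histogram(adjacency, labels, rounds=3):
    """One graph's color histogram after `rounds` of 1-WL color refinement.

    adjacency: dict mapping each vertex to a list of its neighbors
    labels:    dict mapping each vertex to a discrete initial label
    """
    # Colors are encoded as strings so that histograms of different graphs
    # are directly comparable; practical implementations compress them.
    colors = {v: str(labels[v]) for v in adjacency}
    for _ in range(rounds):
        colors = {
            # New color = old color plus the sorted multiset of neighbor colors.
            v: colors[v] + "|" + ",".join(sorted(colors[u] for u in adjacency[v]))
            for v in adjacency
        }
    return Counter(colors.values())

# Different histograms certify non-isomorphism; equal histograms are inconclusive.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
same = wl_1_histogram(triangle, {v: 0 for v in triangle}) == \
       wl_1_histogram(path, {v: 0 for v in path})
print(same)  # False: 1-WL distinguishes the triangle from the path

Graph kernels based on this refinement compare such histograms across graphs, and, as the abstract states, the expressivity of standard graph neural networks is bounded by exactly this heuristic.
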
Appears in Collections: LS 11

Files in This Item:
File | Description | Size | Format
main.pdf | DNB | 1.46 MB | Adobe PDF


This item is protected by original copyright