A. Kazi, L. Cosmo, N. Navab, M. Bronstein
Differentiable Graph Module (DGM) for Graph Convolutional Networks. arXiv preprint, available at https://arxiv.org/pdf/2002.04999.pdf
Graph deep learning has recently emerged as a powerful ML concept that allows generalizing successful deep neural architectures to non-Euclidean structured data. Such methods have shown promising results on a broad spectrum of applications ranging from social science, biomedicine, and particle physics to computer vision, graphics, and chemistry. One of the limitations of the majority of current graph neural network architectures is that they are often restricted to the transductive setting and rely on the assumption that the underlying graph is known and fixed. In many settings, such as those arising in medical and healthcare applications, this assumption is not necessarily true, since the graph may be noisy, partially or even completely unknown, and one is thus interested in inferring it from the data. This is especially important in inductive settings when dealing with nodes not present in the graph at training time. Furthermore, sometimes such a graph itself may convey insights that are even more important than the downstream task. In this paper, we introduce the Differentiable Graph Module (DGM), a learnable function predicting the edge probability in the graph relevant for the task, which can be combined with convolutional graph neural network layers and trained in an end-to-end fashion. We provide an extensive evaluation of applications from the domains of healthcare (disease prediction), brain imaging (gender and age prediction), computer graphics (3D point cloud segmentation), and computer vision (zero-shot learning). We show that our model provides a significant improvement over baselines in both transductive and inductive settings and achieves state-of-the-art results.
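To make the core idea of the abstract concrete, the sketch below is a rough, hypothetical illustration (not the authors' implementation) of learning a graph jointly with the features: node features are embedded into a latent space, pairwise edge probabilities are derived from latent distances with a learnable temperature, and the resulting soft adjacency drives a simple graph-convolution-style aggregation. The class name, the row-normalized dense adjacency, and the single-layer setup are assumptions for illustration; the paper samples sparse k-NN graphs rather than using a dense soft adjacency.

    import torch
    import torch.nn as nn

    class SoftGraphModule(nn.Module):
        """Illustrative DGM-style layer: the graph is predicted from the data
        and trained end-to-end together with the feature transform."""

        def __init__(self, in_dim, latent_dim, out_dim):
            super().__init__()
            self.embed = nn.Linear(in_dim, latent_dim)   # latent space used only for graph learning
            self.theta = nn.Linear(in_dim, out_dim)      # feature transform for the conv step
            self.log_t = nn.Parameter(torch.zeros(1))    # learnable temperature for edge probabilities

        def forward(self, x):
            z = self.embed(x)                            # (N, latent_dim) latent node embeddings
            d2 = torch.cdist(z, z).pow(2)                # pairwise squared latent distances
            adj = torch.exp(-self.log_t.exp() * d2)      # edge probabilities from distances
            adj = adj / adj.sum(dim=-1, keepdim=True)    # row-normalized soft adjacency
            return adj @ self.theta(x), adj              # graph-conv-style aggregation + learned graph

    # Toy usage: 10 nodes with 16-dimensional features, no input graph required
    x = torch.randn(10, 16)
    layer = SoftGraphModule(in_dim=16, latent_dim=8, out_dim=32)
    out, adj = layer(x)
    print(out.shape, adj.shape)  # torch.Size([10, 32]) torch.Size([10, 10])

Because every step is differentiable, gradients from the downstream task flow back into the embedding that defines the graph, which is what allows the edge structure itself to be learned rather than assumed known and fixed.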
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.