EEGNN: edge enhanced graph neural network with a Bayesian nonparametric graph model
Training deep graph neural networks (GNNs) is challenging, as their performance may deteriorate as the number of hidden message-passing layers grows. The literature has largely attributed this deterioration to over-smoothing and under-reaching. In this paper, we propose a new explanation for the degraded performance of deep GNNs: mis-simplification, that is, mistakenly simplifying graphs by excluding self-loops and forcing edges to be unweighted. We show that such simplification reduces the capacity of message-passing layers to capture the structural information of graphs. In view of this, we propose a new framework, the edge enhanced graph neural network (EEGNN). EEGNN uses structural information extracted from the proposed Dirichlet mixture Poisson graph model (DMPGM), a Bayesian nonparametric model for graphs, to improve the performance of various deep message-passing GNNs. We also propose a Markov chain Monte Carlo inference framework for DMPGM. Experiments on different datasets show that our method achieves a considerable performance increase over baselines.
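The sketch below is not the authors' implementation; it is a minimal illustration of the mis-simplification idea described in the abstract, using a hypothetical 3-node multigraph. Collapsing a graph with self-loops and edge multiplicities into a simple unweighted graph discards structural information that a message-passing layer could otherwise exploit.

```python
# Minimal sketch (assumed example, not the paper's code) of how
# "mis-simplification" loses structural information.
import numpy as np

# Hypothetical 3-node multigraph: entry (i, j) counts parallel edges;
# the diagonal records self-loops.
A_multi = np.array([
    [2.0, 3.0, 0.0],   # node 0: 2 self-loops, 3 parallel edges to node 1
    [3.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
])

# "Mis-simplified" version: self-loops removed, edges forced to be unweighted.
A_simple = (A_multi > 0).astype(float)
np.fill_diagonal(A_simple, 0.0)

def mean_aggregate(A, H):
    """One mean-aggregation message-passing step: each node averages its
    neighbours' features, weighted by edge counts when A holds multiplicities."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0              # guard against isolated nodes
    return (A @ H) / deg

H = np.eye(3)                        # one-hot node features for illustration
print(mean_aggregate(A_multi, H))    # multiplicities and self-loops shape the message
print(mean_aggregate(A_simple, H))   # the simplified graph yields a coarser signal
```

Running both aggregations shows that the multigraph assigns different weights to neighbours according to edge multiplicity and retains each node's own signal through its self-loops, whereas the simplified graph treats all neighbours identically.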
| Field | Value |
|---|---|
| Item Type | Article |
| Copyright holders | © 2023 The Author(s) |
| Departments | LSE > Academic Departments > Statistics |
| Date Deposited | 04 Aug 2023 |
| Acceptance Date | 20 Jan 2023 |
| URI | https://researchonline.lse.ac.uk/id/eprint/119918 |
Explore Further
- https://www.scopus.com/pages/publications/85165165285 (Scopus publication)
- https://www.lse.ac.uk/statistics/people/xinghao-qiao (Author)
- https://proceedings.mlr.press/v206/liu23a.html
- https://proceedings.mlr.press/v206/ (Official URL)