Citation: HOU Jingyi, TANG Yuxin, YU Xinbo, LIU Zhijie. Inferring local topology via variational convolution for graph representation[J]. Chinese Journal of Engineering, 2023, 45(10): 1750-1758. doi: 10.13374/j.issn2095-9389.2022.07.24.005
[1] Ying R, He R N, Chen K F, et al. Graph convolutional neural networks for web-scale recommender systems // Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. London, 2018: 974
[2] Dai H J, Li C T, Coley C W, et al. Retrosynthesis prediction with conditional graph logic network // Advances in Neural Information Processing Systems. Vancouver, 2019: 8870
[3] Han K, Wang Y, Guo J, et al. Vision GNN: An image is worth graph of nodes // Advances in Neural Information Processing Systems. New Orleans, 2022
[4] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need // Advances in Neural Information Processing Systems. Long Beach, 2017: 5998
[5] Cho K, van Merrienboer B, Gulcehre C, et al. Learning phrase representations using RNN encoder–decoder for statistical machine translation // Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Doha, 2014: 1724
[6] Hamilton W L, Ying R, Leskovec J. Inductive representation learning on large graphs // Advances in Neural Information Processing Systems. Long Beach, 2017: 1024
[7] Veličković P, Cucurull G, Casanova A, et al. Graph attention networks // International Conference on Learning Representations. Vancouver, 2018
[8] Wijesinghe A, Wang Q. A New Perspective on "How graph neural networks go beyond Weisfeiler-Lehman?" // International Conference on Learning Representations. Online, 2022
[9] Liu J W, Liu J W, Luo X L. Research progress in attention mechanism in deep learning. Chin J Eng, 2021, 43(11): 1499
[10] Ying C X, Cai T L, Luo S J, et al. Do transformers really perform badly for graph representation? // Advances in Neural Information Processing Systems. Online, 2021: 28877
[11] Alon U, Yahav E. On the bottleneck of graph neural networks and its practical implications[J/OL]. arXiv preprint (2020-6-9) [2022-7-24]. https://arxiv.org/abs/2006.05205
[12] Jin W G, Barzilay R, Jaakkola T. Junction tree variational autoencoder for molecular graph generation // International Conference on Machine Learning. Stockholm, 2018: 2323
[13] Chen Z D, Chen L, Villar S, et al. Can graph neural networks count substructures? // Proceedings of the 34th International Conference on Neural Information Processing Systems. Vancouver, 2020: 10383
[14] Bouritsas G, Frasca F, Zafeiriou S, et al. Improving graph neural network expressivity via subgraph isomorphism counting. IEEE Trans Pattern Anal Mach Intell, 2023, 45(1): 657 doi: 10.1109/TPAMI.2022.3154319
[15] Yu H, Zhao S Y, Shi J Y. STNN-DDI: A substructure-aware tensor neural network to predict drug-drug interactions. Brief Bioinform, 2022, 23(4): bbac209 doi: 10.1093/bib/bbac209
[16] Hu W, Fey M, Zitnik M, et al. Open graph benchmark: Datasets for machine learning on graphs // Advances in Neural Information Processing Systems. Online, 2020
[17] Lowe D. Chemical reactions from US patents (1976-Sep2016)[J/OL]. Figshare (2017-6-14) [2022-7-24]. https://doi.org/10.6084/m9.figshare.5104873.v1
[18] Ahneman D T, Estrada J G, Lin S, et al. Predicting reaction performance in C-N cross-coupling using machine learning. Science, 2018, 360(6385): 186 doi: 10.1126/science.aar5169
[19] Wu F, Fan A, Baevski A, et al. Pay less attention with lightweight and dynamic convolutions // International Conference on Learning Representations. New Orleans, 2019
[20] Wu Z H, Liu Z J, Lin J, et al. Lite transformer with long-short range attention[J/OL]. arXiv preprint (2020-4-24) [2022-7-24]. https://arxiv.org/abs/2004.11886
[21] Gulati A, Qin J, Chiu C C, et al. Conformer: Convolution-augmented transformer for speech recognition // Interspeech Conference. Shanghai, 2020: 5036
[22] Wang Y Q, Xu Z L, Wang X L, et al. End-to-end video instance segmentation with transformers // IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). Nashville, 2021: 8737
[23] Wu H P, Xiao B, Codella N, et al. CvT: Introducing convolutions to vision transformers // IEEE/CVF International Conference on Computer Vision (ICCV). Montreal, 2021: 22
[24] Si C Y, Yu W H, Zhou P, et al. Inception transformer[J/OL]. arXiv preprint (2022-5-25) [2022-7-24]. https://arxiv.org/abs/2205.12956
[25] Szegedy C, Vanhoucke V, Ioffe S, et al. Rethinking the inception architecture for computer vision // IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Las Vegas, 2016: 2818
[26] Szegedy C, Ioffe S, Vanhoucke V, et al. Inception-v4, Inception-ResNet and the impact of residual connections on learning // Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence. San Francisco, 2017: 4278
[27] Zhou B L, Andonian A, Oliva A, et al. Temporal relational reasoning in videos // Proceedings of the European Conference on Computer Vision. Munich, 2018: 803
[28] Kim Y. Convolutional neural networks for sentence classification[J/OL]. arXiv preprint (2014-8-25) [2022-7-24]. https://arxiv.org/abs/1408.5882
[29] Kingma D P, Welling M. Auto-encoding variational Bayes // International Conference on Learning Representations. Banff, 2014: 1
[30] Rezende D J, Mohamed S, Wierstra D. Stochastic backpropagation and approximate inference in deep generative models // International Conference on Machine Learning. Beijing, 2014: 1278
[31] Maddison C J, Mnih A, Teh Y W. The concrete distribution: A continuous relaxation of discrete random variables[J/OL]. arXiv preprint (2016-11-2) [2022-7-24]. https://arxiv.org/abs/1611.00712
[32] Schwaller P, Vaucher A C, Laino T, et al. Prediction of chemical reaction yields using deep learning. Mach Learn: Sci Technol, 2021, 2(1): 015016 doi: 10.1088/2632-2153/abc81d
[33] Landrum G. RDKit documentation[J/OL]. RDKit (2012-12-1) [2022-7-24]. http://www.rdkit.org/RDKit_Docs.2012_12_1.pdf
[34] Schwaller P, Laino T, Gaudin T, et al. Molecular transformer: A model for uncertainty-calibrated chemical reaction prediction. ACS Cent Sci, 2019, 5(9): 1572 doi: 10.1021/acscentsci.9b00576
[35] Schwaller P, Probst D, Vaucher A C, et al. Mapping the space of chemical reactions using attention-based neural networks. Nat Mach Intell, 2021, 3(2): 144 doi: 10.1038/s42256-020-00284-w
[36] Tailor S A, Opolka F L, Liò P, et al. Do we need anisotropic graph neural networks?[J/OL]. arXiv preprint (2021-4-3) [2022-7-24]. https://arxiv.org/abs/2104.01481
[37] Zhang M, Li P. Nested graph neural networks // Advances in Neural Information Processing Systems. Online, 2021: 15734
[38] Chuang K V, Keiser M J. Comment on "Predicting reaction performance in C–N cross-coupling using machine learning". Science, 2018, 362(6416): 186
[39] Sandfort F, Strieth-Kalthoff F, Kühnemund M, et al. A structure-based platform for predicting chemical reactivity. Chem, 2020, 6(6): 1379 doi: 10.1016/j.chempr.2020.02.017