TY - JOUR
T1 - Graph Transformer Networks
T2 - Learning meta-path graphs to improve GNNs
AU - Yun, Seongjun
AU - Jeong, Minbyul
AU - Yoo, Sungdong
AU - Lee, Seunghun
AU - Yi, Sean S.
AU - Kim, Raehyun
AU - Kang, Jaewoo
AU - Kim, Hyunwoo J.
N1 - Funding Information:
This research was supported by the following funding sources: National Research Foundation of Korea (NRF-2020R1A2C3010638); ICT Creative Consilience program (IITP-2022-2020-0-01819) and Research on CPU vulnerability detection and validation (No. 2019-0-00533) supervised by the IITP (Institute for Information & communications Technology Planning & Evaluation) funded by the Korea government (MSIT).
Publisher Copyright:
© 2022 The Authors
PY - 2022/9
Y1 - 2022/9
N2 - Graph Neural Networks (GNNs) have been widely applied to various fields due to their powerful representations of graph-structured data. Despite the success of GNNs, most existing GNNs are designed to learn node representations on fixed, homogeneous graphs. These limitations become especially problematic when learning representations on a misspecified graph or a heterogeneous graph that consists of various types of nodes and edges. To address these limitations, we propose Graph Transformer Networks (GTNs), which are capable of generating new graph structures that exclude noisy connections and include useful connections (e.g., meta-paths) for a given task, while learning effective node representations on the new graphs in an end-to-end fashion. We further propose an enhanced version of GTNs, Fast Graph Transformer Networks (FastGTNs), which improves the scalability of graph transformations. Compared to GTNs, FastGTNs are up to 230× and 150× faster in inference and training, respectively, and use up to 100× and 148× less memory, while performing graph transformations identical to those of GTNs. In addition, we extend graph transformations to the semantic proximity of nodes, allowing non-local operations beyond meta-paths. Extensive experiments on both homogeneous and heterogeneous graphs show that GTNs and FastGTNs with non-local operations achieve state-of-the-art performance on node classification tasks. The code is available at https://github.com/seongjunyun/Graph_Transformer_Networks
AB - Graph Neural Networks (GNNs) have been widely applied to various fields due to their powerful representations of graph-structured data. Despite the success of GNNs, most existing GNNs are designed to learn node representations on fixed, homogeneous graphs. These limitations become especially problematic when learning representations on a misspecified graph or a heterogeneous graph that consists of various types of nodes and edges. To address these limitations, we propose Graph Transformer Networks (GTNs), which are capable of generating new graph structures that exclude noisy connections and include useful connections (e.g., meta-paths) for a given task, while learning effective node representations on the new graphs in an end-to-end fashion. We further propose an enhanced version of GTNs, Fast Graph Transformer Networks (FastGTNs), which improves the scalability of graph transformations. Compared to GTNs, FastGTNs are up to 230× and 150× faster in inference and training, respectively, and use up to 100× and 148× less memory, while performing graph transformations identical to those of GTNs. In addition, we extend graph transformations to the semantic proximity of nodes, allowing non-local operations beyond meta-paths. Extensive experiments on both homogeneous and heterogeneous graphs show that GTNs and FastGTNs with non-local operations achieve state-of-the-art performance on node classification tasks. The code is available at https://github.com/seongjunyun/Graph_Transformer_Networks
KW - Graph Neural Networks
KW - Heterogeneous graphs
KW - Machine learning on graphs
KW - Network analysis
UR - http://www.scopus.com/inward/record.url?scp=85132351038&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2022.05.026
DO - 10.1016/j.neunet.2022.05.026
M3 - Article
C2 - 35716619
AN - SCOPUS:85132351038
VL - 153
SP - 104
EP - 119
JO - Neural Networks
JF - Neural Networks
SN - 0893-6080
ER -