Residential College: false
Status: Published
Temporal inductive path neural network for temporal knowledge graph reasoning
Dong, Hao1,2; Wang, Pengyang3; Xiao, Meng1,2; Ning, Zhiyuan1,2; Wang, Pengfei1,2; Zhou, Yuanchun1,2
2024-02-01
Source Publication: Artificial Intelligence
ISSN: 0004-3702
Volume: 329; Pages: 104085
Abstract

Temporal Knowledge Graph (TKG) is an extension of the traditional Knowledge Graph (KG) that incorporates the dimension of time. Reasoning on TKGs is a crucial task that aims to predict future facts based on historical occurrences, and its key challenge lies in uncovering structural dependencies within historical subgraphs together with temporal patterns. Most existing approaches model TKGs by relying on entity modeling, since nodes in the graph play a crucial role in knowledge representation. However, real-world scenarios often involve a vast number of entities, with new entities emerging over time. This makes it difficult for entity-dependent methods to cope with such large entity sets and to handle newly emerging entities effectively. We therefore propose the Temporal Inductive Path Neural Network (TiPNN), which models historical information from an entity-independent perspective. Specifically, TiPNN adopts a unified graph, namely the history temporal graph, to comprehensively capture and encapsulate information from history. It then uses query-aware temporal paths defined on the history temporal graph to model historical path information related to each query for reasoning. Extensive experiments illustrate that the proposed model not only attains significant performance improvements but also handles inductive settings, while additionally providing reasoning evidence through history temporal graphs.
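
To make the notions of a history temporal graph and query-aware temporal paths more concrete, the following minimal Python sketch is included for illustration only. It is not the authors' TiPNN implementation: the quadruple fact format, the toy facts, and the helpers build_history_temporal_graph and temporal_paths are assumptions introduced here, and a real model would learn to encode and score such paths rather than simply enumerate them.

# Illustrative sketch only; NOT the authors' TiPNN implementation.
# Assumes TKG facts are quadruples (subject, relation, object, timestamp).
from collections import defaultdict

# Toy historical facts: (subject, relation, object, timestamp)
history = [
    ("A", "visits",   "B", 1),
    ("B", "talks_to", "C", 2),
    ("A", "visits",   "C", 2),
    ("C", "invites",  "D", 3),
]

def build_history_temporal_graph(facts):
    """Merge all historical snapshots into one graph whose edges keep their timestamps."""
    graph = defaultdict(list)          # subject -> [(relation, object, timestamp), ...]
    for s, r, o, t in facts:
        graph[s].append((r, o, t))
    return graph

def temporal_paths(graph, source, max_hops=3):
    """Enumerate (relation, timestamp) paths starting from the query subject.

    Each path describes, in an entity-independent way, how the source
    connects to a candidate target through timestamped relations.
    """
    paths = []                         # (target_entity, [(relation, timestamp), ...])
    stack = [(source, [])]
    while stack:
        node, path = stack.pop()
        if path:
            paths.append((node, path))
        if len(path) >= max_hops:
            continue
        for r, o, t in graph.get(node, []):
            stack.append((o, path + [(r, t)]))
    return paths

if __name__ == "__main__":
    htg = build_history_temporal_graph(history)
    # Query subject "A": collect candidate targets and the temporal paths that
    # reach them; a learned scoring function would rank these for prediction.
    for target, path in temporal_paths(htg, "A"):
        print(target, path)

Under these assumptions, the printed paths (for example C reached via [("visits", 1), ("talks_to", 2)]) are the kind of entity-independent evidence that a path-based reasoner can use for both seen and newly emerging entities.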

Keywords: Graph Neural Networks; Knowledge Graph Reasoning; Temporal Knowledge Graph; Temporal Reasoning
DOI: 10.1016/j.artint.2024.104085
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence
WOS ID: WOS:001185065000001
Publisher: ELSEVIER, RADARWEG 29, 1043 NX AMSTERDAM, NETHERLANDS
Scopus ID: 2-s2.0-85184028054
Document Type: Journal article
Collection: Department of Computer and Information Science; The State Key Laboratory of Internet of Things for Smart City (University of Macau)
Corresponding Author: Wang, Pengyang; Wang, Pengfei
Affiliation:
1. Computer Network Information Center, Chinese Academy of Sciences, 2 Dongsheng South Rd, Haidian District, Beijing, China
2. University of Chinese Academy of Sciences, 1 Yanqihu East Rd, Huairou District, Beijing, China
3. Department of Computer and Information Science, The State Key Laboratory of Internet of Things for Smart City, University of Macau, Avenida da Universidade, Taipa, Macau, China
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Dong, Hao, Wang, Pengyang, Xiao, Meng, et al. Temporal inductive path neural network for temporal knowledge graph reasoning[J]. Artificial Intelligence, 2024, 329: 104085.
APA: Dong, Hao, Wang, Pengyang, Xiao, Meng, Ning, Zhiyuan, Wang, Pengfei, & Zhou, Yuanchun (2024). Temporal inductive path neural network for temporal knowledge graph reasoning. Artificial Intelligence, 329, 104085.
MLA: Dong, Hao, et al. "Temporal inductive path neural network for temporal knowledge graph reasoning." Artificial Intelligence 329 (2024): 104085.
Files in This Item:
There are no files associated with this item.
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.