Journal of Guangdong University of Technology ›› 2022, Vol. 39 ›› Issue (04): 24-31.doi: 10.12052/gdutxb.210062


A Knowledge Representation Model Based on Bi-Objective Alternate Optimization Under Probability

Zhang Xin, Wang Zhen-you   

  1. School of Mathematics and Statistics, Guangdong University of Technology, Guangzhou 510520, China
  • Received:2021-04-25 Online:2022-07-10 Published:2022-06-29

Abstract: To address the problems that the TransD model has a large number of parameters and that the two representations of an entity are unrelated, an improved knowledge representation model, PTransD, is proposed. It reduces the number of parameters by cutting the number of entity projection vectors and clustering the entities, while using the Kullback-Leibler (K-L) divergence to constrain each entity projection to the same probability distribution as its corresponding entity class. During training, the triple loss and the K-L loss are optimized alternately, and the quality of negative examples is improved by replacing entities with entities from widely separated classes. Experimental results on triple classification and link prediction over knowledge graph datasets show significant improvement on all metrics, so the model can be applied to knowledge graph completion and reasoning.
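The abstract's two ideas can be sketched in code. The following is a minimal, hypothetical illustration (not the paper's implementation): a TransD-style projection whose per-entity projection vectors are replaced by per-cluster vectors (reducing parameters), plus a K-L term pulling an entity's distribution toward that of its class. All names, dimensions, and the softmax construction of the distributions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8              # embedding dimension (illustrative)
N_ENT, N_REL = 20, 4
N_CLUSTERS = 3       # hypothetical cluster count

# Entity/relation embeddings. Per-entity projection vectors are replaced by
# per-cluster vectors, shrinking N_ENT projection vectors to N_CLUSTERS.
ent = rng.normal(size=(N_ENT, DIM))
rel = rng.normal(size=(N_REL, DIM))
rel_p = rng.normal(size=(N_REL, DIM))
cluster_of = rng.integers(0, N_CLUSTERS, size=N_ENT)  # entity -> class
cluster_p = rng.normal(size=(N_CLUSTERS, DIM))        # shared projections

def project(e, e_p, r_p):
    """TransD dynamic mapping: e_perp = r_p * (e_p . e) + e,
    i.e. the rank-1 matrix (r_p e_p^T + I) applied to e."""
    return r_p * (e_p @ e) + e

def triple_score(h_i, r_i, t_i):
    """Translation distance ||h_perp + r - t_perp||_2; lower = more plausible."""
    h_p = cluster_p[cluster_of[h_i]]   # shared per class, not per entity
    t_p = cluster_p[cluster_of[t_i]]
    h_perp = project(ent[h_i], h_p, rel_p[r_i])
    t_perp = project(ent[t_i], t_p, rel_p[r_i])
    return float(np.linalg.norm(h_perp + rel[r_i] - t_perp))

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def kl_loss(e_i):
    """K-L divergence pulling an entity's distribution toward its class's;
    during training this would be minimized alternately with the triple loss."""
    p = softmax(ent[e_i])
    q = softmax(cluster_p[cluster_of[e_i]])
    return float(np.sum(p * np.log(p / q)))
```

Under this reading, one training epoch would alternate a margin-based step on `triple_score` (with negatives drawn by swapping in entities from distant clusters) and a gradient step on `kl_loss`.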

Key words: knowledge graph, representation learning, alternate optimization, triple classification, link prediction

CLC Number: TP3