Journal of Guangdong University of Technology ›› 2021, Vol. 38 ›› Issue (03): 1-8. doi: 10.12052/gdutxb.200173

An Attention Text Summarization Model Based on Syntactic Structure Fusion

Teng Shao-hua, Dong Pu, Zhang Wei   

  1. School of Computers, Guangdong University of Technology, Guangzhou 510006, China
  • Received: 2020-12-17  Online: 2021-05-10  Published: 2021-03-12

Abstract: Traditional sequence-based text summarization models do not consider the contextual semantic information of words, so the generated summaries are inaccurate and do not conform to human language habits. To address this inadequacy, a structure-based attention sequence-to-sequence model (SBA) is proposed. It extends an attention-based sequence-to-sequence generation model by introducing the syntactic structure information of the text, so that the context vector produced by the attention mechanism contains both the semantic and the syntactic structure information of the source text, from which the summary is generated. Experimental results on the Gigaword dataset show that the proposed method effectively improves the accuracy and readability of the generated summaries.

Key words: text summarization, sequence to sequence model, attention mechanism, syntactic structure
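
To make the fusion idea concrete, below is a minimal PyTorch sketch of one plausible reading of the abstract, not the paper's actual SBA model: encoder semantic states and per-token syntactic-structure embeddings are scored jointly by an additive (Bahdanau-style) attention, and the resulting context vector mixes both kinds of features. All names and dimensions here (SyntaxFusedAttention, hidden_dim, syntax_dim) are hypothetical.

import torch
import torch.nn as nn

class SyntaxFusedAttention(nn.Module):
    """Attention whose context vector carries semantic AND syntactic information."""

    def __init__(self, hidden_dim: int, syntax_dim: int):
        super().__init__()
        # Score each source position from the decoder state, the encoder
        # semantic state, and the token's syntactic-structure embedding.
        self.score = nn.Sequential(
            nn.Linear(hidden_dim * 2 + syntax_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )
        # Project the concatenated semantic + syntactic context back to hidden_dim.
        self.fuse = nn.Linear(hidden_dim + syntax_dim, hidden_dim)

    def forward(self, dec_state, enc_states, syn_embeds):
        # dec_state:  (batch, hidden_dim)            current decoder hidden state
        # enc_states: (batch, src_len, hidden_dim)   encoder semantic states
        # syn_embeds: (batch, src_len, syntax_dim)   syntactic-structure embeddings
        src_len = enc_states.size(1)
        query = dec_state.unsqueeze(1).expand(-1, src_len, -1)
        energy = self.score(torch.cat([query, enc_states, syn_embeds], dim=-1))
        weights = torch.softmax(energy.squeeze(-1), dim=-1)   # (batch, src_len)
        # Context is an attention-weighted mix of semantic and syntactic features.
        mixed = torch.cat([enc_states, syn_embeds], dim=-1)
        context = torch.bmm(weights.unsqueeze(1), mixed).squeeze(1)
        return self.fuse(context), weights

# Toy usage with random tensors standing in for real encoder outputs.
attn = SyntaxFusedAttention(hidden_dim=256, syntax_dim=64)
ctx, w = attn(torch.randn(2, 256), torch.randn(2, 10, 256), torch.randn(2, 10, 64))
print(ctx.shape, w.shape)  # torch.Size([2, 256]) torch.Size([2, 10])

Additive attention is chosen here only because the abstract describes an attention-based sequence-to-sequence model; concatenating syntax embeddings into both the scoring function and the context vector is one simple fusion strategy among several the paper could be using.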

CLC Number: TP391