JOURNAL OF BEIJING UNIVERSITY OF POSTS AND TELECOM ›› 2019, Vol. 42 ›› Issue (3): 21-28. doi: 10.13190/j.jbupt.2018-289
• Papers •
English Textual Entailment Recognition Using Capsules
ZHU Hao, TAN Yong-mei
Intelligence Science and Technology Center, Beijing University of Posts and Telecommunications, Beijing 100876, China
Received: 2018-11-16
Online: 2019-06-28
Published: 2019-06-20
Cite this article
ZHU Hao, TAN Yong-mei. English Textual Entailment Recognition Using Capsules[J]. JOURNAL OF BEIJING UNIVERSITY OF POSTS AND TELECOM, 2019, 42(3): 21-28.
URL: https://journal.bupt.edu.cn/EN/10.13190/j.jbupt.2018-289