[1] Bowman S R, Angeli G, Potts C, et al. A large annotated corpus for learning natural language inference[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon: ACL Press, 2015: 632-642.
[2] Williams A, Nangia N, Bowman S R. A broad-coverage challenge corpus for sentence understanding through inference[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Volume 1, Long Papers. New Orleans: ACL Press, 2018: 1112-1122.
[3] Rocktäschel T, Grefenstette E, Hermann K M, et al. Reasoning about entailment with neural attention[EB/OL]. 2016(2016-03-01)[2018-11-16]. https://arxiv.org/abs/1509.06664.
[4] Parikh A P, Täckström O, Das D, et al. A decomposable attention model for natural language inference[C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Austin: ACL Press, 2016: 2249-2255.
[5] Sabour S, Frosst N, Hinton G E. Dynamic routing between capsules[C]//Advances in Neural Information Processing Systems. Long Beach: Curran Associates, 2017: 3859-3869.
[6] Wang Y, Sun A, Han J, et al. Sentiment analysis by capsules[C]//Proceedings of the 2018 World Wide Web Conference. Lyon: International World Wide Web Conferences Steering Committee, 2018: 1165-1174.
[7] Liu Y, Sun C, Lin L, et al. Learning natural language inference using bidirectional LSTM model and inner-attention[EB/OL]. 2016(2016-05-30)[2018-11-16]. https://arxiv.org/abs/1605.09090.
[8] Munkhdalai T, Yu H. Neural semantic encoders[C]//Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics. Valencia: ACL Press, 2017: 397-407.
[9] Wang S, Jiang J. Learning natural language inference with LSTM[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego: ACL Press, 2016: 1442-1451.
[10] Munkhdalai T, Yu H. Neural tree indexers for text understanding[C]//Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics. Valencia: ACL Press, 2017: 11-21.
[11] Sha L, Chang B, Sui Z, et al. Reading and thinking: re-read LSTM unit for textual entailment recognition[C]//Proceedings of the 26th International Conference on Computational Linguistics: Technical Papers. Osaka: The COLING 2016 Organizing Committee, 2016: 2870-2879.
[12] Gong Y, Luo H, Zhang J. Natural language inference over interaction space[EB/OL]. 2018(2018-05-26)[2018-11-16]. https://arxiv.org/abs/1709.04348.
[13] Srivastava R K, Greff K, Schmidhuber J. Highway networks[EB/OL]. 2015(2015-11-03)[2018-11-16]. https://arxiv.org/abs/1505.00387.
[14] Bowman S R, Gauthier J, Rastogi A, et al. A fast unified model for parsing and sentence understanding[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin: ACL Press, 2016: 1466-1477.
[15] Mou L, Men R, Li G, et al. Natural language inference by tree-based convolution and heuristic matching[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin: ACL Press, 2016: 130-136.
[16] Shen T, Zhou T, Long G, et al. DiSAN: directional self-attention network for RNN/CNN-free language understanding[C]//Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans: AAAI Press, 2018: 5446-5455.
[17] Chen Q, Zhu X, Ling Z, et al. Enhanced LSTM for natural language inference[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver: ACL Press, 2017: 1657-1668.
[18] Balazs J A, Marrese-Taylor E, Loyola P, et al. Refining raw sentence representations for textual entailment recognition via attention[C]//Proceedings of the 2nd Workshop on Evaluating Vector Space Representations for NLP. Copenhagen: ACL Press, 2017: 51-55.
[19] Nie Y, Bansal M. Shortcut-stacked sentence encoders for multi-domain inference[C]//Proceedings of the 2nd Workshop on Evaluating Vector Space Representations for NLP. Copenhagen: ACL Press, 2017: 41-45.