Please use this identifier to cite or link to this item: https://idr.l3.nitk.ac.in/jspui/handle/123456789/8318
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Sharma, A.
dc.contributor.author: Harithas, C.
dc.date.accessioned: 2020-03-30T10:18:24Z
dc.date.available: 2020-03-30T10:18:24Z
dc.date.issued: 2019
dc.identifier.citation: Proceedings - 17th IEEE International Conference on Machine Learning and Applications, ICMLA 2018, 2019, pp. 1-7 [en_US]
dc.identifier.uri: http://idr.nitk.ac.in/jspui/handle/123456789/8318
dc.description.abstract: In this paper, we focus on the non-factoid question answering (QA) problem, using a bidirectional LSTM with an inner attention mechanism and indexing for better accuracy. Non-factoid QA is an important task with significant applications in constructing useful knowledge bases and extracting valuable information. The advantage of using deep learning frameworks for this kind of problem is that they require neither feature engineering nor other linguistic tools. The proposed approach extends an LSTM (Long Short-Term Memory) model in two directions: one with a convolutional layer, and the other with the inner attention mechanism proposed by Bingning Wang et al., to generate answer representations in accordance with the question. On top of this deep learning model, we use an information retrieval model based on indexing to generate answers and improve accuracy. The proposed methodology showed an improvement in accuracy over the referred model and the respective baselines, and also across the answer lengths used. The models are tested on two non-factoid QA data sets: TREC-QA and InsuranceQA. © 2018 IEEE. [en_US] (See the illustrative code sketch after the metadata record below.)
dc.title: Inner Attention Based bi-LSTMs with Indexing for non-Factoid Question Answering [en_US]
dc.type: Conference paper [en_US]
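
The two components named in the abstract, a question-conditioned (inner attention) bi-LSTM and an indexing-based retrieval step, can be illustrated with a short sketch. What follows is a minimal, hypothetical illustration rather than the authors' implementation: PyTorch is assumed, and every name and size here (InnerAttentionBiLSTM, embed_dim, hidden_dim, mean pooling for the question, cosine-similarity scoring, the term-overlap retriever) is a placeholder, since the record above does not specify the paper's configuration.

import re
from collections import defaultdict

import torch
import torch.nn as nn
import torch.nn.functional as F

class InnerAttentionBiLSTM(nn.Module):
    """Question-conditioned answer encoder (illustrative sizes only)."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Maps the pooled question vector into the space used to score
        # each answer time step (the "inner attention" weights).
        self.attn = nn.Linear(2 * hidden_dim, 2 * hidden_dim)

    def encode_question(self, q_ids):
        h, _ = self.lstm(self.embed(q_ids))      # (B, Tq, 2H)
        return h.mean(dim=1)                     # mean pooling (assumed)

    def encode_answer(self, a_ids, q_vec):
        h, _ = self.lstm(self.embed(a_ids))      # (B, Ta, 2H)
        # Inner attention: weight every answer hidden state by its
        # affinity to the question before pooling, so the answer
        # representation is built in accordance with the question.
        scores = torch.bmm(h, self.attn(q_vec).unsqueeze(2))  # (B, Ta, 1)
        alpha = torch.softmax(scores, dim=1)
        return (alpha * h).sum(dim=1)            # attended pooling

    def forward(self, q_ids, a_ids):
        q_vec = self.encode_question(q_ids)
        a_vec = self.encode_answer(a_ids, q_vec)
        # Cosine similarity as the question-answer matching score.
        return F.cosine_similarity(q_vec, a_vec, dim=1)

# Toy indexing front end: an inverted index narrows the candidate pool
# before neural re-ranking. Plain term-overlap scoring is an assumption;
# the record does not describe the paper's retrieval model.
def build_index(answers):
    index = defaultdict(set)
    for i, text in enumerate(answers):
        for tok in re.findall(r"\w+", text.lower()):
            index[tok].add(i)
    return index

def retrieve(question, index, k=10):
    overlap = defaultdict(int)
    for tok in set(re.findall(r"\w+", question.lower())):
        for i in index.get(tok, ()):
            overlap[i] += 1
    return sorted(overlap, key=overlap.get, reverse=True)[:k]

A hypothetical usage, again with made-up sizes:

model = InnerAttentionBiLSTM(vocab_size=10000)
q = torch.randint(0, 10000, (4, 12))   # 4 questions, 12 tokens each
a = torch.randint(0, 10000, (4, 60))   # 4 candidate answers, 60 tokens each
print(model(q, a))                     # 4 matching scores in [-1, 1]

In a pipeline of this shape, retrieve() would shortlist candidates from the indexed corpus and the neural model would re-rank the shortlist. Such rankers are commonly trained with a max-margin loss over positive and negative answer pairs, though the record does not state the paper's training objective.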
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.