Computational Natural Language Inference : Robust and Interpretable Question Answering.

Record type:
Bibliographic - Language material, manuscript : Monograph/item
Title/Author:
Computational Natural Language Inference :
Other title:
Robust and Interpretable Question Answering.
Author:
Sharp, Rebecca Reynolds.
Physical description:
1 online resource (164 pages)
Notes:
Source: Dissertation Abstracts International, Volume: 78-12(E), Section: A.
Subject:
Linguistics.
Electronic resource:
click for full text (PQDT)
ISBN:
9780355227802
Sharp, Rebecca Reynolds. Computational Natural Language Inference : Robust and Interpretable Question Answering. - 1 online resource (164 pages)
Source: Dissertation Abstracts International, Volume: 78-12(E), Section: A.
Thesis (Ph.D.)--The University of Arizona, 2017.
Includes bibliographical references
We address the challenging task of computational natural language inference, by which we mean bridging two or more natural language texts while also providing an explanation of how they are connected. In the context of question answering (i.e., finding short answers to natural language questions), this inference connects the question with its answer, and we learn to approximate this inference with machine learning. In particular, we present four approaches to question answering, each of which shows a significant improvement in performance over baseline methods. In our first approach, we make use of the discourse structure inherent in free text (i.e., whether the text contains an explanation, elaboration, contrast, etc.) to increase the amount of training data for (and subsequently the performance of) a monolingual alignment model. In our second work, we propose a framework for training customized lexical semantics models such that each one represents a single semantic relation. We use causality as a use case and demonstrate that our customized model is able both to identify causal relations and to significantly improve our ability to answer causal questions. We then propose two approaches that seek to answer questions by learning to rank human-readable justifications for the answers, such that the model selects the answer with the best justification. The first uses a graph-structured representation of the background knowledge and performs information aggregation to construct multi-sentence justifications. The second reduces pre-processing costs by limiting itself to a single sentence and using a neural network to learn a latent representation of the background knowledge. For each of these, we show that, in addition to a significant improvement in correctly answering questions, we also outperform a strong baseline in terms of the quality of the answer justifications given.
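The selection rule described in the abstract (the model answers with whichever candidate has the highest-ranked justification) can be sketched minimally. The function name, candidate data, and scores below are illustrative assumptions, not the dissertation's actual ranking models:

```python
def best_answer(candidates):
    """candidates: list of (answer, justification_scores) pairs, where
    justification_scores is a list of scores for that answer's candidate
    justifications. Returns the answer whose strongest justification
    scores highest (hypothetical stand-in for a learned ranker)."""
    return max(candidates, key=lambda pair: max(pair[1]))[0]

# Hypothetical scored candidates: answer B's best justification (0.9)
# outranks answer A's best (0.5), so B is selected.
candidates = [
    ("answer A", [0.2, 0.5]),
    ("answer B", [0.9, 0.1]),
]
print(best_answer(candidates))  # → answer B
```

In the dissertation's framing, the scores would come from a trained justification-ranking model; the point of the sketch is only the argmax-over-best-justification selection step.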
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2018.
Mode of access: World Wide Web.
ISBN: 9780355227802
Subjects--Topical Terms: Linguistics.
Index Terms--Genre/Form: Electronic books.
LDR    03122ntm a2200325K 4500
001    913851
005    20180628103545.5
006    m o u
007    cr mn||||a|a||
008    190606s2017 xx obm 000 0 eng d
020    $a 9780355227802
035    $a (MiAaPQ)AAI10622193
035    $a (MiAaPQ)arizona:15800
035    $a AAI10622193
040    $a MiAaPQ $b eng $c MiAaPQ
100 1  $a Sharp, Rebecca Reynolds. $3 1186857
245 10 $a Computational Natural Language Inference : $b Robust and Interpretable Question Answering.
264  0 $c 2017
300    $a 1 online resource (164 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertation Abstracts International, Volume: 78-12(E), Section: A.
500    $a Advisers: Michael Hammond; Mihai Surdeanu.
502    $a Thesis (Ph.D.)--The University of Arizona, 2017.
504    $a Includes bibliographical references
520    $a We address the challenging task of computational natural language inference, by which we mean bridging two or more natural language texts while also providing an explanation of how they are connected. In the context of question answering (i.e., finding short answers to natural language questions), this inference connects the question with its answer, and we learn to approximate this inference with machine learning. In particular, we present four approaches to question answering, each of which shows a significant improvement in performance over baseline methods. In our first approach, we make use of the discourse structure inherent in free text (i.e., whether the text contains an explanation, elaboration, contrast, etc.) to increase the amount of training data for (and subsequently the performance of) a monolingual alignment model. In our second work, we propose a framework for training customized lexical semantics models such that each one represents a single semantic relation. We use causality as a use case and demonstrate that our customized model is able both to identify causal relations and to significantly improve our ability to answer causal questions. We then propose two approaches that seek to answer questions by learning to rank human-readable justifications for the answers, such that the model selects the answer with the best justification. The first uses a graph-structured representation of the background knowledge and performs information aggregation to construct multi-sentence justifications. The second reduces pre-processing costs by limiting itself to a single sentence and using a neural network to learn a latent representation of the background knowledge. For each of these, we show that, in addition to a significant improvement in correctly answering questions, we also outperform a strong baseline in terms of the quality of the answer justifications given.
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538    $a Mode of access: World Wide Web
650  4 $a Linguistics. $3 557829
650  4 $a Computer science. $3 573171
655  7 $a Electronic books. $2 local $3 554714
690    $a 0290
690    $a 0984
710 2  $a ProQuest Information and Learning Co. $3 1178819
710 2  $a The University of Arizona. $b Linguistics. $3 1183978
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10622193 $z click for full text (PQDT)
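Each MARC field above pairs a three-digit tag (with optional indicators) and $-delimited subfields. A minimal sketch of splitting one such textual display line, assuming the conventional "$x value" notation; the helper name and regex are assumptions, and real MARC data should be handled with a dedicated MARC library rather than display-format parsing:

```python
import re

def parse_marc_line(line):
    """Split a textual MARC display field like
    '245 10 $a Title : $b Subtitle.' into
    (tag, indicators, {subfield_code: value})."""
    tag, rest = line.split(" ", 1)
    # Alternate between subfield codes and their values.
    parts = re.split(r"\$(\w)\s*", rest)
    indicators = parts[0].strip()
    subfields = {
        parts[i]: parts[i + 1].strip()
        for i in range(1, len(parts) - 1, 2)
    }
    return tag, indicators, subfields

tag, ind, sub = parse_marc_line(
    "245 10 $a Computational Natural Language Inference : "
    "$b Robust and Interpretable Question Answering."
)
# tag == "245", ind == "10", sub["b"] is the subtitle
```

This only recovers the display text; fixed-length fields such as 008 encode positional data that this kind of split does not interpret.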