Cheng, Yong.
Joint Training for Neural Machine Translation
Record Type:
Bibliographic - language material, printed : Monograph/item
Title/Author:
Joint Training for Neural Machine Translation / by Yong Cheng.
Author:
Cheng, Yong.
Physical Description:
XIII, 78 p., 23 illus., 9 illus. in color : online resource.
Contained By:
Springer Nature eBook
Subject:
Natural language processing (Computer science).
Electronic Resource:
https://doi.org/10.1007/978-981-32-9748-7
ISBN:
9789813297487
Cheng, Yong.
Joint Training for Neural Machine Translation [electronic resource] / by Yong Cheng. - 1st ed. 2019. - XIII, 78 p., 23 illus., 9 illus. in color : online resource. - (Springer Theses, Recognizing Outstanding Ph.D. Research, 2190-5053).
1. Introduction -- 2. Neural Machine Translation -- 3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation -- 4. Semi-supervised Learning for Neural Machine Translation -- 5. Joint Training for Pivot-based Neural Machine Translation -- 6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning -- 7. Related Work -- 8. Conclusion.
This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach to help the two complementary models agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly it describes an end-to-end bidirectional NMT model to connect the source-to-target and target-to-source translation models, allowing the interaction of parameters between these two directional models.
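As a rough sketch of the agreement-based idea summarized above (our reconstruction with hypothetical notation, not a formula quoted from the book): the two directional models are trained jointly, with a penalty whenever their word-alignment (attention) matrices disagree. Writing θ for the two models' parameters and A for their alignment matrices over a sentence pair (x, y) drawn from the parallel corpus D:

```latex
J(\theta_{x \to y}, \theta_{y \to x}) =
  \sum_{(x,y) \in D} \Big[
      \log P(y \mid x;\, \theta_{x \to y})
    + \log P(x \mid y;\, \theta_{y \to x})
    - \lambda\, \Delta\!\big(A_{x \to y},\, A_{y \to x}^{\top}\big)
  \Big]
```

Here Δ is some disagreement measure between the two alignment matrices (e.g., an element-wise squared difference), and λ trades translation likelihood against alignment agreement.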
ISBN: 9789813297487
Standard No.: 10.1007/978-981-32-9748-7 doi
Subjects--Topical Terms: Natural language processing (Computer science).
LC Class. No.: QA76.9.N38
Dewey Class. No.: 006.35
LDR    02620nam a22003975i 4500
001    1009917
003    DE-He213
005    20200704201045.0
007    cr nn 008mamaa
008    210106s2019 si | s |||| 0|eng d
020    $a 9789813297487 $9 978-981-32-9748-7
024 7  $a 10.1007/978-981-32-9748-7 $2 doi
035    $a 978-981-32-9748-7
050  4 $a QA76.9.N38
072 7  $a UYQL $2 bicssc
072 7  $a COM073000 $2 bisacsh
072 7  $a UYQL $2 thema
082 04 $a 006.35 $2 23
100 1  $a Cheng, Yong. $e author. $4 aut $4 http://id.loc.gov/vocabulary/relators/aut $3 1303938
245 10 $a Joint Training for Neural Machine Translation $h [electronic resource] / $c by Yong Cheng.
250    $a 1st ed. 2019.
264  1 $a Singapore : $b Springer Singapore : $b Imprint: Springer, $c 2019.
300    $a XIII, 78 p. 23 illus., 9 illus. in color. $b online resource.
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
347    $a text file $b PDF $2 rda
490 1  $a Springer Theses, Recognizing Outstanding Ph.D. Research, $x 2190-5053
505 0  $a 1. Introduction -- 2. Neural Machine Translation -- 3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation -- 4. Semi-supervised Learning for Neural Machine Translation -- 5. Joint Training for Pivot-based Neural Machine Translation -- 6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning -- 7. Related Work -- 8. Conclusion.
520    $a This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach to help the two complementary models agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly it describes an end-to-end bidirectional NMT model to connect the source-to-target and target-to-source translation models, allowing the interaction of parameters between these two directional models.
650  0 $a Natural language processing (Computer science). $3 802180
650  0 $a Artificial intelligence. $3 559380
650  0 $a Computer logic. $3 786340
650 14 $a Natural Language Processing (NLP). $3 1254293
650 24 $a Logic in AI. $3 1228083
710 2  $a SpringerLink (Online service) $3 593884
773 0  $t Springer Nature eBook
776 08 $i Printed edition: $z 9789813297470
776 08 $i Printed edition: $z 9789813297494
830  0 $a Springer Theses, Recognizing Outstanding Ph.D. Research, $x 2190-5053 $3 1253569
856 40 $u https://doi.org/10.1007/978-981-32-9748-7
912    $a ZDB-2-SCS
912    $a ZDB-2-SXCS
950    $a Computer Science (SpringerNature-11645)
950    $a Computer Science (R0) (SpringerNature-43710)
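For readers who want to work with a record like the one above programmatically, here is a minimal sketch in plain Python (no MARC library; the `tag indicators $code value` line format is an assumption based on this page's display, and the function is hypothetical, not part of any catalog API). It splits one display-formatted data field into its tag, indicators, and subfield pairs:

```python
def parse_marc_line(line):
    """Split a display-formatted MARC data field line such as
    '856 40 $u https://doi.org/10.1007/978-981-32-9748-7'
    into (tag, indicators, [(code, value), ...]).

    A sketch for subfielded data fields in this page's layout only;
    control fields (001-008) and real MARC transmission format
    need a proper library instead.
    """
    head, _, rest = line.partition("$")
    parts = head.split()
    tag = parts[0]
    # Indicators may be absent (e.g. field 020 above); default to blanks.
    indicators = parts[1] if len(parts) > 1 else "  "
    # Subfields are kept as an ordered list because codes can repeat
    # (e.g. the two $4 subfields in field 100 above).
    subfields = []
    if rest:
        for chunk in ("$" + rest).split("$")[1:]:
            subfields.append((chunk[0], chunk[1:].strip()))
    return tag, indicators, subfields

tag, ind, sf = parse_marc_line(
    "856 40 $u https://doi.org/10.1007/978-981-32-9748-7"
)
```

Returning a list of `(code, value)` tuples rather than a dict is deliberate: several fields in the record above repeat a subfield code, and a dict would silently drop the earlier occurrence.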