Attention to Deep Structure in Recurrent Neural Networks.
Record type:
Bibliographic - Language material, manuscript : Monograph/item
Title/Author:
Attention to Deep Structure in Recurrent Neural Networks.
Author:
Sharpe, Spencer S.
Description:
1 online resource (62 pages)
Notes:
Source: Masters Abstracts International, Volume: 57-01.
Subject:
Artificial intelligence.
Electronic resource:
click for full text (PQDT)
ISBN:
9780355537383
Attention to Deep Structure in Recurrent Neural Networks / Sharpe, Spencer S. - 1 online resource (62 pages)
Source: Masters Abstracts International, Volume: 57-01.
Thesis (M.S.)--University of Wyoming, 2017.
Includes bibliographical references
Deep recurrent networks can build complex representations of sequential data, enabling them to do things such as translate text from one language into another. These networks often utilize an attention mechanism that allows them to focus on important input representations. Network attention can be used to provide insight into network strategies for identifying structure in the input sequence. This study investigates attention in a deep recurrent network, trained to annotate text, to determine how the distribution of information in the text affects learning and network performance. Results suggest that reversing the input text makes it difficult for the network to discover higher-order linguistic structure. This study contributes to an understanding of how our brains might process sequential data.
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2018
Mode of access: World Wide Web
ISBN: 9780355537383
Subjects--Topical Terms: Artificial intelligence.
Index Terms--Genre/Form: Electronic books.
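The abstract describes a soft-attention mechanism layered on a deep recurrent network: the network scores its hidden states, normalizes the scores into a distribution, and takes a weighted sum as the representation it "focuses" on. Below is a minimal Python/numpy sketch of that general mechanism, using dot-product scoring; it is an illustration of the technique the abstract names, not the thesis's actual model, and every variable and dimension in it is hypothetical.

    # Minimal soft-attention sketch (illustrative only; hypothetical
    # dimensions, not the thesis's model). Requires numpy.
    import numpy as np

    rng = np.random.default_rng(0)
    T, d = 5, 8                   # sequence length, hidden size (arbitrary)
    H = rng.normal(size=(T, d))   # stand-in for RNN hidden states h_1..h_T
    q = rng.normal(size=d)        # stand-in for a query/decoder state

    scores = H @ q                # dot-product score per time step, shape (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()      # softmax: the attention distribution

    context = weights @ H         # attention-weighted sum of states, shape (d,)
    print("attention weights:", np.round(weights, 3))
    print("context vector shape:", context.shape)

Bahdanau-style additive attention, common in the sequence-annotation setting the abstract mentions, replaces the dot product with a small learned feed-forward scorer, but the normalize-and-weight structure is the same.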
LDR  01964ntm a2200337K 4500
001  912170
005  20180608102941.5
006  m o u
007  cr mn||||a|a||
008  190606s2017 xx obm 000 0 eng d
020     $a 9780355537383
035     $a (MiAaPQ)AAI10619110
035     $a (MiAaPQ)uwyo:12614
035     $a AAI10619110
040     $a MiAaPQ $b eng $c MiAaPQ
100  1  $a Sharpe, Spencer S. $3 1184406
245  10 $a Attention to Deep Structure in Recurrent Neural Networks.
264   0 $c 2017
300     $a 1 online resource (62 pages)
336     $a text $b txt $2 rdacontent
337     $a computer $b c $2 rdamedia
338     $a online resource $b cr $2 rdacarrier
500     $a Source: Masters Abstracts International, Volume: 57-01.
500     $a Advisers: Cameron H.G. Wright; Steve Barrett.
502     $a Thesis (M.S.)--University of Wyoming, 2017.
504     $a Includes bibliographical references
520     $a Deep recurrent networks can build complex representations of sequential data, enabling them to do things such as translate text from one language into another. These networks often utilize an attention mechanism that allows them to focus on important input representations. Network attention can be used to provide insight into network strategies for identifying structure in the input sequence. This study investigates attention in a deep recurrent network, trained to annotate text, to determine how the distribution of information in the text affects learning and network performance. Results suggest that reversing the input text makes it difficult for the network to discover higher-order linguistic structure. This study contributes to an understanding of how our brains might process sequential data.
533     $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538     $a Mode of access: World Wide Web
650   4 $a Artificial intelligence. $3 559380
650   4 $a Information science. $3 561178
650   4 $a Computer science. $3 573171
655   7 $a Electronic books. $2 local $3 554714
690     $a 0800
690     $a 0723
690     $a 0984
710  2  $a ProQuest Information and Learning Co. $3 1178819
710  2  $a University of Wyoming. $b Neuroscience. $3 1184407
856  40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10619110 $z click for full text (PQDT)