Burhani, Hasham.
SPAF-network with Saturating Pretraining Neurons.
Record type: Bibliographic - language material, manuscript : Monograph/item
Title/Author: SPAF-network with Saturating Pretraining Neurons.
Author: Burhani, Hasham.
Description: 1 online resource (144 pages)
Notes: Source: Masters Abstracts International, Volume: 55-04.
Subject: Artificial intelligence.
Electronic resource: click for full text (PQDT)
ISBN: 9781339647159
Burhani, Hasham.
SPAF-network with Saturating Pretraining Neurons. - 1 online resource (144 pages)
Source: Masters Abstracts International, Volume: 55-04.
Thesis (M.S.)--Trent University (Canada), 2016.
Includes bibliographical references
In this work, various aspects of neural networks pre-trained with denoising autoencoders (DAE) are explored. To saturate neurons more quickly for feature learning in the DAE, an activation function that offers higher gradients is introduced. Moreover, the introduction of sparsity functions applied to the hidden-layer representations is studied. More importantly, a technique that swaps the activation functions of a fully trained DAE to logistic functions is studied; networks trained using this technique are referred to as SPAF-networks. For evaluation, the popular MNIST dataset as well as all 3 sub-datasets of the Chars74k dataset are used for classification purposes. The SPAF-network is also analyzed for the features it learns with a logistic, a ReLU, and a custom activation function. Lastly, a future roadmap is proposed for enhancements to the SPAF-network.
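The abstract describes the central trick concretely enough to sketch: pretrain a denoising autoencoder whose hidden units use a faster-saturating, higher-gradient activation, then swap that activation for the standard logistic in the fully trained network. The Python sketch below is a minimal illustration of that idea, not the thesis implementation: modelling the higher-gradient activation as a logistic with slope k > 1 is an assumption, as are all names (DAE, steep_k, swap_to_logistic), the MSE objective, the untied weights, and the hyperparameters; the sparsity penalties the abstract mentions are omitted for brevity.

import numpy as np

rng = np.random.default_rng(0)

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

class DAE:
    """Denoising autoencoder whose hidden activation can be swapped."""

    def __init__(self, n_in, n_hid, steep_k=4.0):
        # steep_k > 1 gives logistic(k*z): larger gradients near zero, so
        # hidden units are driven toward saturation faster during pretraining.
        self.k = steep_k
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hid))
        self.b1 = np.zeros(n_hid)
        self.W2 = rng.normal(0.0, 0.1, (n_hid, n_in))
        self.b2 = np.zeros(n_in)

    def encode(self, x):
        return logistic(self.k * (x @ self.W1 + self.b1))

    def train_step(self, x, noise=0.3, lr=0.5):
        # One gradient step on the squared reconstruction error of the
        # clean input from a masking-corrupted copy.
        x_noisy = x * (rng.random(x.shape) > noise)
        h = self.encode(x_noisy)
        r = logistic(h @ self.W2 + self.b2)
        n = x.shape[0]
        d_a2 = (r - x) * r * (1.0 - r) / n            # logistic output layer
        d_a1 = (d_a2 @ self.W2.T) * self.k * h * (1.0 - h)
        self.W2 -= lr * h.T @ d_a2
        self.b2 -= lr * d_a2.sum(axis=0)
        self.W1 -= lr * x_noisy.T @ d_a1
        self.b1 -= lr * d_a1.sum(axis=0)
        return float(0.5 * np.mean((r - x) ** 2))

    def swap_to_logistic(self):
        # The SPAF-style swap: keep the pretrained weights but evaluate the
        # hidden units with the plain logistic (k = 1) from here on.
        self.k = 1.0

# Toy usage on random binary data; the thesis evaluates on MNIST and the
# three Chars74k sub-datasets instead.
x = (rng.random((256, 64)) > 0.5).astype(float)
dae = DAE(n_in=64, n_hid=32)
for _ in range(200):
    dae.train_step(x)
dae.swap_to_logistic()
features = dae.encode(x)   # hidden representations for a downstream classifier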
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2018.
Mode of access: World Wide Web.
ISBN: 9781339647159
Subjects--Topical Terms: Artificial intelligence.
Index Terms--Genre/Form: Electronic books.
LDR 01996ntm a2200337K 4500
001 915742
005 20180823122923.5
006 m o u
007 cr mn||||a|a||
008 190606s2016 xx obm 000 0 eng d
020 $a 9781339647159
035 $a (MiAaPQ)AAI10100845
035 $a (MiAaPQ)trentu:10315
035 $a AAI10100845
040 $a MiAaPQ $b eng $c MiAaPQ
100 1 $a Burhani, Hasham. $3 1189226
245 1 0 $a SPAF-network with Saturating Pretraining Neurons.
264 0 $c 2016
300 $a 1 online resource (144 pages)
336 $a text $b txt $2 rdacontent
337 $a computer $b c $2 rdamedia
338 $a online resource $b cr $2 rdacarrier
500 $a Source: Masters Abstracts International, Volume: 55-04.
500 $a Adviser: Feng Wenying.
502 $a Thesis (M.S.)--Trent University (Canada), 2016.
504 $a Includes bibliographical references
520 $a In this work, various aspects of neural networks pre-trained with denoising autoencoders (DAE) are explored. To saturate neurons more quickly for feature learning in the DAE, an activation function that offers higher gradients is introduced. Moreover, the introduction of sparsity functions applied to the hidden-layer representations is studied. More importantly, a technique that swaps the activation functions of a fully trained DAE to logistic functions is studied; networks trained using this technique are referred to as SPAF-networks. For evaluation, the popular MNIST dataset as well as all 3 sub-datasets of the Chars74k dataset are used for classification purposes. The SPAF-network is also analyzed for the features it learns with a logistic, a ReLU, and a custom activation function. Lastly, a future roadmap is proposed for enhancements to the SPAF-network.
533 $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538 $a Mode of access: World Wide Web
650 4 $a Artificial intelligence. $3 559380
650 4 $a Computer science. $3 573171
650 4 $a Mathematics. $3 527692
655 7 $a Electronic books. $2 local $3 554714
690 $a 0800
690 $a 0984
690 $a 0405
710 2 $a ProQuest Information and Learning Co. $3 1178819
710 2 $a Trent University (Canada). $b Applied Modeling and Quantitative Methods. $3 1189227
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10100845 $z click for full text (PQDT)