Wang, Ruohui.
Distributed Learning on Large-scale Data.
Record type: Bibliographic - Language material, manuscript : Monograph/item
Title/Author: Distributed Learning on Large-scale Data.
Author: Wang, Ruohui.
Description: 1 online resource (66 pages)
Notes: Source: Dissertation Abstracts International, Volume: 79-07(E), Section: B.
Subject: Computer science.
Electronic resource: click for full text (PQDT)
ISBN: 9780355599077
Distributed Learning on Large-scale Data.
Wang, Ruohui.
Distributed Learning on Large-scale Data. - 1 online resource (66 pages)
Source: Dissertation Abstracts International, Volume: 79-07(E), Section: B.
Thesis (Ph.D.)--The Chinese University of Hong Kong (Hong Kong), 2017.
Includes bibliographical references.
Big data helps machine learning produce better models, but it also brings computational challenges. Conventional machine learning algorithms are usually designed for small-scale data; when faced with large-scale data, especially data whose volume exceeds the capacity of a single computer, these algorithms become infeasible. Nowadays, people increasingly rely on distributed computing clusters to process large-scale data, and designing effective algorithms for distributed systems has become an important topic in machine learning. In this thesis, we focus on two specific algorithms and explore their extensions to distributed computing environments.
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2018.
Mode of access: World Wide Web.
ISBN: 9780355599077
Subjects--Topical Terms: Computer science.
Index Terms--Genre/Form: Electronic books.
LDR  02964ntm a2200349K 4500
001  912205
005  20180608102941.5
006  m o u
007  cr mn||||a|a||
008  190606s2017 xx obm 000 0 eng d
020  __ $a 9780355599077
035  __ $a (MiAaPQ)AAI10757597
035  __ $a AAI10757597
040  __ $a MiAaPQ $b eng $c MiAaPQ
100  1_ $a Wang, Ruohui. $3 1184456
245  10 $a Distributed Learning on Large-scale Data.
264  _0 $c 2017
300  __ $a 1 online resource (66 pages)
336  __ $a text $b txt $2 rdacontent
337  __ $a computer $b c $2 rdamedia
338  __ $a online resource $b cr $2 rdacarrier
500  __ $a Source: Dissertation Abstracts International, Volume: 79-07(E), Section: B.
500  __ $a Adviser: Xiaogang Wang.
502  __ $a Thesis (Ph.D.)--The Chinese University of Hong Kong (Hong Kong), 2017.
504  __ $a Includes bibliographical references.
520  __ $a Big data helps machine learning produce better models, but it also brings computational challenges. Conventional machine learning algorithms are usually designed for small-scale data; when faced with large-scale data, especially data whose volume exceeds the capacity of a single computer, these algorithms become infeasible. Nowadays, people increasingly rely on distributed computing clusters to process large-scale data, and designing effective algorithms for distributed systems has become an important topic in machine learning. In this thesis, we focus on two specific algorithms and explore their extensions to distributed computing environments.
520  __ $a In the first part of this work, we studied the problem of estimating Dirichlet process mixture models. We designed sampling algorithms that are suitable for distributed systems: they allow processors to discover new clusters independently while maintaining consistency via consolidation schemes. The developed algorithms incur low communication cost and can easily be applied to asynchronous settings. This part of the work was accepted at the 26th International Joint Conference on Artificial Intelligence (IJCAI 2017).
520  __ $a In the second part, we studied batch normalization techniques within deep neural networks. We reformulated the classical batch normalization method by treating batch statistics as model parameters and introducing proximal optimization procedures to update them across iterations. This design eliminates the communication required to accumulate batch statistics and thus scales well in distributed computing environments; at the same time, it effectively mitigates the performance issues of small mini-batches. This part of the work was submitted to the 31st Annual Conference on Neural Information Processing Systems (NIPS 2017).
533  __ $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538  __ $a Mode of access: World Wide Web.
650  _4 $a Computer science. $3 573171
650  _4 $a Artificial intelligence. $3 559380
650  _4 $a Information technology. $3 559429
655  _7 $a Electronic books. $2 local $3 554714
690  __ $a 0984
690  __ $a 0800
690  __ $a 0489
710  2_ $a ProQuest Information and Learning Co. $3 1178819
710  2_ $a The Chinese University of Hong Kong (Hong Kong). $b Information Engineering. $3 1184457
856  40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10757597 $z click for full text (PQDT)
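The first part of the abstract (520) describes samplers in which each processor discovers new mixture components independently and a consolidation scheme keeps workers consistent while exchanging little data. The thesis's actual DPMM sampler is not reproduced in this record; the following is only a minimal one-dimensional sketch of that consolidation idea, with all names (`local_cluster`, `consolidate`, the `radius` threshold) invented for illustration:

```python
def local_cluster(points, radius):
    """Greedy local clustering on one worker's shard: each point joins the
    nearest existing centre, or opens a new cluster when none is within
    `radius` (a stand-in for per-processor discovery of new components)."""
    centres, counts = [], []
    for x in points:
        best, best_d = None, float("inf")
        for i, c in enumerate(centres):
            d = abs(x - c)
            if d < best_d:
                best, best_d = i, d
        if best is None or best_d > radius:
            centres.append(x)      # open a new cluster locally
            counts.append(1)
        else:
            counts[best] += 1      # incremental mean update
            centres[best] += (x - centres[best]) / counts[best]
    return centres, counts

def consolidate(worker_results, radius):
    """Merge clusters discovered independently by different workers:
    centres closer than `radius` are treated as the same component and
    their weighted means combined. Only (centre, count) summaries travel
    over the network, which is the low-communication-cost aspect."""
    merged = []  # list of [centre, count]
    for centres, counts in worker_results:
        for c, n in zip(centres, counts):
            for m in merged:
                if abs(c - m[0]) <= radius:
                    total = m[1] + n
                    m[0] = (m[0] * m[1] + c * n) / total
                    m[1] = total
                    break
            else:
                merged.append([c, n])
    return merged

# Two workers see different shards of the same two-cluster data set.
w1 = local_cluster([0.1, 0.2, 5.1, 4.9], radius=1.0)
w2 = local_cluster([0.0, 5.0, 5.2], radius=1.0)
clusters = consolidate([w1, w2], radius=1.0)
print(len(clusters))  # the two local views collapse into two global clusters
```

Because only cluster summaries cross the network, the same consolidation step could also run asynchronously, merging whichever worker summaries have arrived so far.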
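The second part of the abstract reformulates batch normalization with the batch statistics kept as model parameters and updated by a proximal procedure, so no cross-worker reduction is needed per step. The exact update rule is not given in this record; below is a hypothetical one-dimensional sketch of the general idea, where the damping strength `rho` and the name `bn_stats_step` are assumptions of this illustration, not the thesis's notation:

```python
import math

def bn_stats_step(batch, mu, var, rho=0.5, eps=1e-5):
    """One iteration of the 'batch statistics as parameters' idea: instead
    of recomputing mean/variance from each mini-batch (which would need a
    cross-worker reduction), keep mu and var as model state and pull them
    toward the locally observed batch statistics with a damped,
    proximal-style update controlled by rho."""
    m = sum(batch) / len(batch)
    v = sum((x - m) ** 2 for x in batch) / len(batch)
    mu = mu + rho * (m - mu)      # damped update toward the batch mean
    var = var + rho * (v - var)   # damped update toward the batch variance
    # normalise with the *stored* statistics, not the raw batch statistics,
    # which also softens the noise of very small mini-batches
    normalised = [(x - mu) / math.sqrt(var + eps) for x in batch]
    return normalised, mu, var

# With a fixed batch the stored statistics converge to the batch statistics:
mu, var = 0.0, 1.0
for _ in range(20):
    out, mu, var = bn_stats_step([1.0, 2.0, 3.0], mu, var)
# mu ≈ 2.0 and var ≈ 2/3 after a few iterations
```

Since each worker updates its stored statistics from local data only, the per-step synchronisation of classical batch normalization disappears, which is the scaling property the abstract claims.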