Exploiting Heterogeneity for Efficient Deep Learning.
Record Type:
Language material, manuscript : Monograph/item
Title/Author:
Exploiting Heterogeneity for Efficient Deep Learning.
Author:
Lin, Shouxu.
Description:
1 online resource (56 pages)
Notes:
Source: Masters Abstracts International, Volume: 84-06.
Contained By:
Masters Abstracts International, 84-06.
Subject:
Information science.
Electronic Resource:
click for full text (PQDT)
ISBN:
9798363516306
Lin, Shouxu.
Exploiting Heterogeneity for Efficient Deep Learning. - 1 online resource (56 pages)
Source: Masters Abstracts International, Volume: 84-06.
Thesis (M.S.)--Carnegie Mellon University, 2022.
Includes bibliographical references
In recent years, the scale of Deep Neural Network (DNN) models, in terms of the number of training examples, the number of model parameters, or both, has grown by several orders of magnitude. This growth has been accompanied by a significant increase in the computational requirements for training these large DNN models. To accelerate training, several parallelization strategies, such as data and model parallelism, have been widely adopted; these distribute the computation across multiple accelerators. Several recent works automate the choice of parallelization strategy for either specific or general DNN models. This thesis goes beyond existing work in distributed deep learning by exploiting heterogeneity in both hardware accelerators and DNN models. We investigate the exploitation of heterogeneity through two case studies: a heterogeneity-aware DL cluster scheduler and a heterogeneity-aware DNN parallelization planner.
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2024.
Mode of access: World Wide Web
ISBN: 9798363516306
Subjects--Topical Terms:
Information science.
Subjects--Index Terms:
Heterogeneity
Index Terms--Genre/Form:
Electronic books.
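The abstract describes data parallelism, in which a training batch is split across multiple accelerators and the per-worker gradients are combined. As a minimal illustrative sketch (not code from the thesis), the following pure-Python example simulates two workers computing gradients of a linear model's squared loss on equal-sized shards and averaging them with an "all-reduce" step, which reproduces the full-batch gradient exactly:

```python
# Illustrative data-parallelism sketch: each "worker" computes the mean
# gradient of (w*x - y)^2 on its own shard; averaging per-worker
# gradients over equal-sized shards equals the full-batch gradient.

def grad(w, xs, ys):
    """Mean gradient of (w*x - y)^2 with respect to w over one shard."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

def all_reduce_mean(grads):
    """Stand-in for the collective op that averages worker gradients."""
    return sum(grads) / len(grads)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
w = 0.5

# Two workers, each holding half of the batch.
shards = [(xs[:2], ys[:2]), (xs[2:], ys[2:])]
worker_grads = [grad(w, sx, sy) for sx, sy in shards]
parallel_grad = all_reduce_mean(worker_grads)

full_grad = grad(w, xs, ys)
assert abs(parallel_grad - full_grad) < 1e-12
```

In real frameworks the averaging step is a collective communication across devices rather than a local function, but the arithmetic identity shown here is what makes synchronous data parallelism equivalent to large-batch training.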
LDR
02259ntm a22003617 4500
001
1147253
005
20240909100738.5
006
m o d
007
cr bn ---uuuuu
008
250605s2022 xx obm 000 0 eng d
020
$a
9798363516306
035
$a
(MiAaPQ)AAI30242225
035
$a
AAI30242225
040
$a
MiAaPQ
$b
eng
$c
MiAaPQ
$d
NTU
100
1
$a
Lin, Shouxu.
$3
1472934
245
1 0
$a
Exploiting Heterogeneity for Efficient Deep Learning.
264
0
$c
2022
300
$a
1 online resource (56 pages)
336
$a
text
$b
txt
$2
rdacontent
337
$a
computer
$b
c
$2
rdamedia
338
$a
online resource
$b
cr
$2
rdacarrier
500
$a
Source: Masters Abstracts International, Volume: 84-06.
500
$a
Advisor: Ganger, Greg.
502
$a
Thesis (M.S.)--Carnegie Mellon University, 2022.
504
$a
Includes bibliographical references
520
$a
In recent years, the scale of Deep Neural Network (DNN) models, in terms of the number of training examples, the number of model parameters, or both, has grown by several orders of magnitude. This growth has been accompanied by a significant increase in the computational requirements for training these large DNN models. To accelerate training, several parallelization strategies, such as data and model parallelism, have been widely adopted; these distribute the computation across multiple accelerators. Several recent works automate the choice of parallelization strategy for either specific or general DNN models. This thesis goes beyond existing work in distributed deep learning by exploiting heterogeneity in both hardware accelerators and DNN models. We investigate the exploitation of heterogeneity through two case studies: a heterogeneity-aware DL cluster scheduler and a heterogeneity-aware DNN parallelization planner.
533
$a
Electronic reproduction.
$b
Ann Arbor, Mich. :
$c
ProQuest,
$d
2024
538
$a
Mode of access: World Wide Web
650
4
$a
Information science.
$3
561178
650
4
$a
Computer science.
$3
573171
653
$a
Heterogeneity
653
$a
Deep Learning
653
$a
Deep Neural Networks
655
7
$a
Electronic books.
$2
local
$3
554714
690
$a
0984
690
$a
0723
710
2
$a
Carnegie Mellon University.
$b
Information Networking Institute.
$3
1466992
710
2
$a
ProQuest Information and Learning Co.
$3
1178819
773
0
$t
Masters Abstracts International
$g
84-06.
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30242225
$z
click for full text (PQDT)