Non-convex Optimization for Machine Learning : Design, Analysis, and Understanding.
Princeton University.
Record type:
Bibliographic - Language material, manuscript : Monograph/item
Title/Author:
Non-convex Optimization for Machine Learning :
Other title:
Design, Analysis, and Understanding.
Author:
Ma, Tengyu.
Physical description:
1 online resource (301 pages)
Notes:
Source: Dissertation Abstracts International, Volume: 79-05(E), Section: B.
Contained By:
Dissertation Abstracts International, 79-05B(E).
Subject:
Artificial intelligence.
Electronic resource:
click for full text (PQDT)
ISBN:
9780355480559
Non-convex Optimization for Machine Learning : Design, Analysis, and Understanding.
Ma, Tengyu.
Non-convex Optimization for Machine Learning :
Design, Analysis, and Understanding. - 1 online resource (301 pages)
Source: Dissertation Abstracts International, Volume: 79-05(E), Section: B.
Thesis (Ph.D.)--Princeton University, 2017.
Includes bibliographical references
Non-convex optimization is ubiquitous in modern machine learning: recent breakthroughs in deep learning require optimizing non-convex training objective functions, and problems that admit accurate convex relaxations can often be solved more efficiently with non-convex formulations. However, the theoretical understanding of non-convex optimization has remained rather limited. Can we extend the algorithmic frontier by efficiently optimizing a family of interesting non-convex functions? Can we successfully apply non-convex optimization to machine learning problems with provable guarantees? How do we interpret the complicated models in machine learning that demand non-convex optimizers?
Electronic reproduction.
Ann Arbor, Mich. :
ProQuest,
2018
Mode of access: World Wide Web
ISBN: 9780355480559
Subjects--Topical Terms:
Artificial intelligence.
Index Terms--Genre/Form:
Electronic books.
Non-convex Optimization for Machine Learning : Design, Analysis, and Understanding.
LDR
:03568ntm a2200397Ki 4500
001
916821
005
20180928111501.5
006
m o u
007
cr mn||||a|a||
008
190606s2017 xx obm 000 0 eng d
020
$a
9780355480559
035
$a
(MiAaPQ)AAI10638379
035
$a
(MiAaPQ)princeton:12361
035
$a
AAI10638379
040
$a
MiAaPQ
$b
eng
$c
MiAaPQ
$d
NTU
100
1
$a
Ma, Tengyu.
$3
1190666
245
1 0
$a
Non-convex Optimization for Machine Learning :
$b
Design, Analysis, and Understanding.
264
0
$c
2017
300
$a
1 online resource (301 pages)
336
$a
text
$b
txt
$2
rdacontent
337
$a
computer
$b
c
$2
rdamedia
338
$a
online resource
$b
cr
$2
rdacarrier
500
$a
Source: Dissertation Abstracts International, Volume: 79-05(E), Section: B.
500
$a
Adviser: Sanjeev Arora.
502
$a
Thesis (Ph.D.)--Princeton University, 2017.
504
$a
Includes bibliographical references
520
$a
Non-convex optimization is ubiquitous in modern machine learning: recent breakthroughs in deep learning require optimizing non-convex training objective functions, and problems that admit accurate convex relaxations can often be solved more efficiently with non-convex formulations. However, the theoretical understanding of non-convex optimization has remained rather limited. Can we extend the algorithmic frontier by efficiently optimizing a family of interesting non-convex functions? Can we successfully apply non-convex optimization to machine learning problems with provable guarantees? How do we interpret the complicated models in machine learning that demand non-convex optimizers?
520
$a
Towards addressing these questions, in this thesis we theoretically study various machine learning models, including sparse coding, topic models, matrix completion, linear dynamical systems, and word embeddings.
520
$a
We first consider how to find a coarse solution to serve as a good starting point for local improvement algorithms such as stochastic gradient descent. We propose efficient methods for sparse coding and topic inference with better provable guarantees.
520
$a
Second, we propose a framework for analyzing local improvement algorithms that start from a coarse solution. We apply it successfully to the sparse coding problem.
520
$a
Then, we consider a family of non-convex functions with the property that all local minima are also global minima (together with an additional regularity property). Such functions can be optimized efficiently by local improvement algorithms from a random or arbitrary starting point. The challenge that we address here, in turn, becomes proving that an objective function belongs to this class. We establish such results for the natural learning objectives of matrix completion and linear dynamical systems.
520
$a
Finally, we make steps towards interpreting the non-linear models that require non-convex training algorithms. We reflect on the principles of word embeddings in natural language processing. We give a generative model for text, with which we explain why different non-convex formulations such as word2vec and GloVe can learn similar word embeddings with surprisingly good performance: analogous words have embeddings with similar differences.
533
$a
Electronic reproduction.
$b
Ann Arbor, Mich. :
$c
ProQuest,
$d
2018
538
$a
Mode of access: World Wide Web
650
4
$a
Artificial intelligence.
$3
559380
650
4
$a
Computer science.
$3
573171
655
7
$a
Electronic books.
$2
local
$3
554714
690
$a
0800
690
$a
0984
710
2
$a
ProQuest Information and Learning Co.
$3
1178819
710
2
$a
Princeton University.
$b
Computer Science.
$3
1179801
773
0
$t
Dissertation Abstracts International
$g
79-05B(E).
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10638379
$z
click for full text (PQDT)