Reliable and Adaptive Stochastic Optimization in the Face of Messy Data.
Record type:
Bibliographic - language material, manuscript : Monograph/item
Title/Author:
Reliable and Adaptive Stochastic Optimization in the Face of Messy Data./
Author:
Xie, Miaolan.
Physical description:
1 online resource (151 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 85-12, Section: A.
Contained By:
Dissertations Abstracts International 85-12A.
Subject:
Mathematics. -
Electronic resource:
click for full text (PQDT)
ISBN:
9798382842257
Reliable and Adaptive Stochastic Optimization in the Face of Messy Data.
Xie, Miaolan.
Reliable and Adaptive Stochastic Optimization in the Face of Messy Data.
- 1 online resource (151 pages)
Source: Dissertations Abstracts International, Volume: 85-12, Section: A.
Thesis (Ph.D.)--Cornell University, 2024.
Includes bibliographical references
Solving real-world stochastic optimization problems (e.g., in machine learning) presents two key challenges: the messiness of real-world data, which can be noisy, biased, or corrupted due to factors like outliers, distribution shifts, and even adversarial attacks; and the laborious, time-intensive requirement of manually tuning step sizes in many existing algorithms. I study stochastic adaptive optimization algorithms under a simple, common framework. The algorithms in this framework avoid the need for manual step size tuning by adaptively adjusting it in each iteration based on the algorithm's progress. To address the issue of messy data, the framework only assumes access to function-related information through probabilistic oracles, which may be biased and corrupted. This framework is very general, encompassing a wide range of algorithms, and is applicable to multiple problem settings, such as expected loss minimization in machine learning, simulation optimization, and derivative-free optimization. We establish iteration complexity bounds for two algorithms within it, stochastic adaptive step search and the stochastic adaptive cubic-regularized Newton method, under reasonable oracle conditions. Additionally, we derive a meta-theorem to bound the sample complexity for any algorithm in the framework.
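The adaptive step-size mechanism the abstract describes (accept a candidate step when a sufficient-decrease test passes, then grow the step size; otherwise shrink it) can be sketched roughly as below. This is a simplified illustration, not the dissertation's actual algorithm: the function names, the sufficient-decrease constant `theta`, and the expansion factor `gamma` are all assumptions, and the oracles here may be noisy estimates of the true objective and gradient.

```python
def stochastic_adaptive_step_search(f_oracle, g_oracle, x0, alpha0=1.0,
                                    theta=0.1, gamma=2.0, max_iters=200):
    """Illustrative sketch of a stochastic adaptive step-search loop.

    f_oracle(x) and g_oracle(x) stand in for probabilistic (possibly noisy
    or biased) estimates of the objective value and gradient; no manual
    step-size schedule is needed, since alpha adapts to observed progress.
    """
    x, alpha = list(x0), alpha0
    for _ in range(max_iters):
        g = g_oracle(x)
        cand = [xi - alpha * gi for xi, gi in zip(x, g)]
        # Sufficient-decrease test with the (possibly noisy) value oracle:
        # accept if f(cand) <= f(x) - theta * alpha * ||g||^2.
        if f_oracle(cand) <= f_oracle(x) - theta * alpha * sum(gi * gi for gi in g):
            x, alpha = cand, alpha * gamma   # success: accept step, grow alpha
        else:
            alpha = alpha / gamma            # failure: reject step, shrink alpha
    return x
```

With exact (noise-free) oracles this reduces to gradient descent with an automatically tuned step size; the dissertation's analysis concerns the harder setting where the oracle outputs are only probabilistically accurate.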
Electronic reproduction.
Ann Arbor, Mich. :
ProQuest,
2024
Mode of access: World Wide Web
ISBN: 9798382842257
Subjects--Topical Terms:
Mathematics.
Subjects--Index Terms:
Mathematical optimization
Index Terms--Genre/Form:
Electronic books.
LDR  02731ntm a22004097 4500
001  1151738
005  20241118085748.5
006  m     o  d
007  cr mn ---uuuuu
008  250605s2024    xx obm         000 0 eng d
020    $a 9798382842257
035    $a (MiAaPQ)AAI31242760
035    $a AAI31242760
040    $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Xie, Miaolan. $3 1478559
245 10 $a Reliable and Adaptive Stochastic Optimization in the Face of Messy Data.
264  0 $c 2024
300    $a 1 online resource (151 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertations Abstracts International, Volume: 85-12, Section: A.
500    $a Advisor: Scheinberg, Katya.
502    $a Thesis (Ph.D.)--Cornell University, 2024.
504    $a Includes bibliographical references
520    $a Solving real-world stochastic optimization problems (e.g., in machine learning) presents two key challenges: the messiness of real-world data, which can be noisy, biased, or corrupted due to factors like outliers, distribution shifts, and even adversarial attacks; and the laborious, time-intensive requirement of manually tuning step sizes in many existing algorithms. I study stochastic adaptive optimization algorithms under a simple, common framework. The algorithms in this framework avoid the need for manual step size tuning by adaptively adjusting it in each iteration based on the algorithm's progress. To address the issue of messy data, the framework only assumes access to function-related information through probabilistic oracles, which may be biased and corrupted. This framework is very general, encompassing a wide range of algorithms, and is applicable to multiple problem settings, such as expected loss minimization in machine learning, simulation optimization, and derivative-free optimization. We establish iteration complexity bounds for two algorithms within it, stochastic adaptive step search and the stochastic adaptive cubic-regularized Newton method, under reasonable oracle conditions. Additionally, we derive a meta-theorem to bound the sample complexity for any algorithm in the framework.
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2024
538    $a Mode of access: World Wide Web
650  4 $a Mathematics. $3 527692
650  4 $a Computer science. $3 573171
650  4 $a Information science. $3 561178
653    $a Mathematical optimization
653    $a Nonlinear optimization
653    $a Simulation optimization
653    $a Stochastic optimization
653    $a Messy data
655  7 $a Electronic books. $2 local $3 554714
690    $a 0796
690    $a 0984
690    $a 0723
690    $a 0405
710 2  $a ProQuest Information and Learning Co. $3 1178819
710 2  $a Cornell University. $b Operations Research and Information Engineering. $3 1466270
773 0  $t Dissertations Abstracts International $g 85-12A.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31242760 $z click for full text (PQDT)