Random Models in Nonlinear Optimization.
Record type: Language material, manuscript : Monograph/item
Title/Author: Random Models in Nonlinear Optimization.
Author: Menickelly, Matt.
Description: 1 online resource (168 pages)
Notes: Source: Dissertation Abstracts International, Volume: 79-03(E), Section: B.
Contained by: Dissertation Abstracts International, 79-03B(E).
Subject: Industrial engineering.
Electronic resource: click for full text (PQDT)
ISBN: 9780355245844
LDR   03890ntm a2200385Ki 4500
001   910778
005   20180517112610.5
006   m o u
007   cr mn||||a|a||
008   190606s2017 xx obm 000 0 eng d
020   $a 9780355245844
035   $a (MiAaPQ)AAI10287265
035   $a (MiAaPQ)lehigh:11751
035   $a AAI10287265
040   $a MiAaPQ $b eng $c MiAaPQ
099   $a TUL $f hyy $c available through World Wide Web
100 1 $a Menickelly, Matt. $3 1182233
245 1 0 $a Random Models in Nonlinear Optimization.
264 0 $c 2017
300   $a 1 online resource (168 pages)
336   $a text $b txt $2 rdacontent
337   $a computer $b c $2 rdamedia
338   $a online resource $b cr $2 rdacarrier
500   $a Source: Dissertation Abstracts International, Volume: 79-03(E), Section: B.
500   $a Adviser: Katya Scheinberg.
502   $a Thesis (Ph.D.) $c Lehigh University $d 2017.
504   $a Includes bibliographical references
520   $a In recent years, there has been a tremendous increase in interest in applying techniques of deterministic optimization to stochastic settings, largely motivated by problems that come from machine learning. A natural question that arises in light of this interest is the extent to which iterative algorithms designed for deterministic (nonlinear, possibly non-convex) optimization must be modified in order to properly make use of inherently random information about a problem. This thesis is concerned with exactly this question, and adapts the model-based trust-region framework of derivative-free optimization (DFO) to situations where the objective function values, or the set of points at which an algorithm chooses to evaluate the objective, are random.
520   $a In the first part of this thesis, we consider an algorithmic framework called STORM (STochastic Optimization with Random Models), which, as an iterative method, is essentially identical to model-based trust-region methods for smooth DFO. However, by imposing fairly general probabilistic conditions, related to the concept of full linearity, on objective function models and objective function estimates, we prove that iterates of algorithms in the STORM framework exhibit almost sure convergence to first-order stationary points for a broad class of unconstrained stochastic functions. We then show that algorithms in the STORM framework enjoy the canonical rate of convergence for unconstrained non-convex optimization. Throughout the thesis, examples demonstrate how these probabilistic conditions might be satisfied through particular choices of model building and function value estimation.
520   $a In the second part of the thesis, we consider a framework called manifold sampling, intended for unconstrained DFO problems where the objective is nonsmooth, but enough is known a priori about the structure of the nonsmoothness that one can classify a given queried point as belonging to a certain smooth manifold of the objective surface. We particularly examine the case of sums of absolute values of (non-convex) black-box functions. Although we assume in this work that the individual black-box functions can be deterministically evaluated, we consider a variant of manifold sampling wherein random queries are made in each iteration to enhance the algorithm's "awareness" of the diversity of manifolds in a neighborhood of a current iterate. We then combine the ideas of STORM and manifold sampling to yield a practical algorithm intended for non-convex ℓ1-regularized empirical risk minimization.
533   $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538   $a Mode of access: World Wide Web
650 4 $a Industrial engineering. $3 679492
650 4 $a Applied mathematics. $3 1069907
650 4 $a Operations research. $3 573517
655 7 $a Electronic books. $2 local $3 554714
690   $a 0546
690   $a 0364
690   $a 0796
710 2 $a ProQuest Information and Learning Co. $3 1178819
710 2 $a Lehigh University. $b Industrial Engineering. $3 1182234
773 0 $t Dissertation Abstracts International $g 79-03B(E).
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10287265 $z click for full text (PQDT)
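
The second 520 abstract above outlines STORM as a model-based trust-region iteration in which both the local model and the function-value estimates are random but assumed accurate with sufficiently high probability. The Python sketch below shows the general shape of such an iteration; it is a minimal illustration under those assumptions, not the algorithm as specified in the thesis, and `build_model`, `estimate_f`, and all parameter values are hypothetical stand-ins.

```python
import numpy as np

def stochastic_trust_region(x0, estimate_f, build_model, max_iter=200,
                            delta0=1.0, eta=0.1, gamma=2.0):
    """Sketch of a STORM-like loop: build a random quadratic model, take a
    Cauchy-type trust-region step, and accept or reject it using noisy
    function-value estimates. Every component here is an illustrative
    placeholder, not the thesis's specification."""
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g, H = build_model(x, delta)            # random gradient/Hessian model
        # Cauchy step: minimize the quadratic model along -g within the region.
        t = delta / (np.linalg.norm(g) + 1e-16)
        gHg = g @ H @ g
        if gHg > 0:
            t = min(t, (g @ g) / gHg)
        s = -t * g
        # Noisy estimates of f at the current and trial points.
        f_x, f_trial = estimate_f(x, delta), estimate_f(x + s, delta)
        pred = -(g @ s + 0.5 * s @ H @ s)       # model-predicted decrease
        rho = (f_x - f_trial) / max(pred, 1e-16)
        if rho >= eta:
            x, delta = x + s, gamma * delta     # accept step and expand region
        else:
            delta = delta / gamma               # reject step and shrink region
    return x
```

In a STORM-style analysis, `build_model` and `estimate_f` would have to be sufficiently accurate (for example, fully linear within the trust region) with probability bounded away from one half; the sketch leaves those probabilistic conditions implicit.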
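The third 520 abstract describes manifold sampling for objectives whose nonsmoothness has known structure, such as sums of absolute values of black-box functions, and its combination with STORM for non-convex ℓ1-regularized empirical risk minimization. As a point of reference only, the sketch below writes out that objective structure and the sign-pattern ("manifold") classification of a query point; the names `sum_abs_objective`, `manifold_of`, and `l1_regularized_erm` are hypothetical and are not drawn from the thesis.

```python
import numpy as np

def sum_abs_objective(Fs, x):
    """f(x) = sum_i |F_i(x)| for black-box component functions F_i; the
    nonsmoothness comes only from the known absolute-value structure."""
    vals = np.array([F(x) for F in Fs])
    return np.sum(np.abs(vals)), vals

def manifold_of(vals, tol=1e-12):
    """Classify a query point by the sign pattern of its component values.
    For a fixed sign pattern sigma, sum_i sigma_i * F_i(x) is smooth, which is
    the structural knowledge that manifold sampling exploits."""
    return tuple(int(v) for v in np.where(np.abs(vals) <= tol, 0, np.sign(vals)))

def l1_regularized_erm(x, risk, lam):
    """The l1-regularized ERM case: risk(x) + lam * ||x||_1, where the |x_j|
    terms play the role of the absolute-value components above."""
    return risk(x) + lam * np.sum(np.abs(x))
```

A single evaluation here returns both the objective value and the component values, so the same query can also be used to identify which smooth manifold the point lies on.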