Lectures on nonsmooth optimization
Record Type:
Bibliographic - Language material, printed : Monograph/item
Title/Author:
Lectures on nonsmooth optimization / by Qinian Jin.
Author:
Jin, Qinian.
Publisher:
Cham : Springer Nature Switzerland, 2025.
Physical Description:
xiii, 560 p. : ill., digital ; 24 cm.
Contained By:
Springer Nature eBook
Subject:
Nonsmooth optimization.
Electronic Resource:
https://doi.org/10.1007/978-3-031-91417-1
ISBN:
9783031914171
Lectures on nonsmooth optimization [electronic resource] / by Qinian Jin. - Cham : Springer Nature Switzerland, 2025. - xiii, 560 p. : ill., digital ; 24 cm. - (Texts in applied mathematics, 2196-9949 ; v. 82). - (Texts in applied mathematics ; 40).
Preface -- Introduction -- Convex sets and convex functions -- Subgradient and mirror descent methods -- Proximal algorithms -- Karush-Kuhn-Tucker theory and Lagrangian duality -- ADMM: alternating direction method of multipliers -- Primal dual splitting algorithms -- Error bound conditions and linear convergence -- Optimization with Kurdyka-Łojasiewicz property -- Semismooth Newton methods -- Stochastic algorithms -- References -- Index.
This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across various fields, including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques. This book covers a variety of algorithms, both foundational and recent, for solving nonsmooth optimization problems. It first introduces basic facts on convex analysis and subdifferential calculus; various algorithms are then discussed, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Moreover, error bound conditions are discussed and the derivation of linear convergence is illustrated. A dedicated chapter delves into first-order methods for nonconvex optimization problems satisfying the Kurdyka-Łojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization. This book is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners who are interested in gaining a comprehensive understanding of nonsmooth optimization.
ISBN: 9783031914171
Standard No.: 10.1007/978-3-031-91417-1 doi
Subjects--Topical Terms:
Nonsmooth optimization.
LC Class. No.: QA402.5
Dewey Class. No.: 519.6
LDR 03002nam a2200337 a 4500
001 1166933
003 DE-He213
005 20250704131713.0
006 m d
007 cr nn 008maaau
008 251217s2025 sz s 0 eng d
020 $a 9783031914171 $q (electronic bk.)
020 $a 9783031914164 $q (paper)
024 7 $a 10.1007/978-3-031-91417-1 $2 doi
035 $a 978-3-031-91417-1
040 $a GP $c GP
041 0 $a eng
050 4 $a QA402.5
072 7 $a PBU $2 bicssc
072 7 $a MAT042000 $2 bisacsh
072 7 $a PBU $2 thema
082 0 4 $a 519.6 $2 23
090 $a QA402.5 $b .J61 2025
100 1 $a Jin, Qinian. $3 1495727
245 1 0 $a Lectures on nonsmooth optimization $h [electronic resource] / $c by Qinian Jin.
260 $a Cham : $b Springer Nature Switzerland : $b Imprint: Springer, $c 2025.
300 $a xiii, 560 p. : $b ill., digital ; $c 24 cm.
490 1 $a Texts in applied mathematics, $x 2196-9949 ; $v v. 82
505 0 $a Preface -- Introduction -- Convex sets and convex functions -- Subgradient and mirror descent methods -- Proximal algorithms -- Karush-Kuhn-Tucker theory and Lagrangian duality -- ADMM: alternating direction method of multipliers -- Primal dual splitting algorithms -- Error bound conditions and linear convergence -- Optimization with Kurdyka-Łojasiewicz property -- Semismooth Newton methods -- Stochastic algorithms -- References -- Index.
520 $a This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across various fields, including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques. This book covers a variety of algorithms, both foundational and recent, for solving nonsmooth optimization problems. It first introduces basic facts on convex analysis and subdifferential calculus; various algorithms are then discussed, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Moreover, error bound conditions are discussed and the derivation of linear convergence is illustrated. A dedicated chapter delves into first-order methods for nonconvex optimization problems satisfying the Kurdyka-Łojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization. This book is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners who are interested in gaining a comprehensive understanding of nonsmooth optimization.
650 0 $a Nonsmooth optimization. $3 527964
650 1 4 $a Optimization. $3 669174
650 2 4 $a Continuous Optimization. $3 888956
710 2 $a SpringerLink (Online service) $3 593884
773 0 $t Springer Nature eBook
830 0 $a Texts in applied mathematics ; $v 40. $3 888901
856 4 0 $u https://doi.org/10.1007/978-3-031-91417-1
950 $a Mathematics and Statistics (SpringerNature-11649)