Operator Theory for Analysis of Convex Optimization Methods in Machine Learning.
Record Type:
Language materials, manuscript : Monograph/item
Title/Author:
Operator Theory for Analysis of Convex Optimization Methods in Machine Learning.
Author:
Gallagher, Patrick W.
Description:
1 online resource (281 pages)
Notes:
Source: Dissertation Abstracts International, Volume: 76-05(E), Section: B.
Subject:
Mathematics. - Artificial intelligence.
Online resource:
click for full text (PQDT)
ISBN:
9781321401738
Gallagher, Patrick W.
Operator Theory for Analysis of Convex Optimization Methods in Machine Learning.
- 1 online resource (281 pages)
Source: Dissertation Abstracts International, Volume: 76-05(E), Section: B.
Thesis (Ph.D.)--University of California, San Diego, 2014.
Includes bibliographical references
As machine learning has more closely interacted with optimization, the concept of convexity has loomed large. Two properties beyond simple convexity have received particularly close attention: strong smoothness and strong convexity. These properties (and their relatives) underlie machine learning analyses from convergence rates to generalization bounds --- they are central and fundamental.
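For reference, the two properties named in the abstract have standard definitions (stated here for a differentiable f; textbook material, not quoted from the thesis):

```latex
\begin{aligned}
&\text{$\mu$-strong convexity:} &
f(y) &\ge f(x) + \langle \nabla f(x),\, y - x\rangle + \tfrac{\mu}{2}\,\|y - x\|^{2},\\
&\text{$L$-strong smoothness:} &
f(y) &\le f(x) + \langle \nabla f(x),\, y - x\rangle + \tfrac{L}{2}\,\|y - x\|^{2},
\end{aligned}
\qquad \text{for all } x,\, y .
```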
Electronic reproduction.
Ann Arbor, Mich. :
ProQuest,
2018
Mode of access: World Wide Web
ISBN: 9781321401738
Subjects--Topical Terms:
527692
Mathematics.
Index Terms--Genre/Form:
554714
Electronic books.
Operator Theory for Analysis of Convex Optimization Methods in Machine Learning.
LDR
:03474ntm a2200373K 4500
001
913756
005
20180622095237.5
006
m o u
007
cr mn||||a|a||
008
190606s2014 xx obm 000 0 eng d
020
$a
9781321401738
035
$a
(MiAaPQ)AAI3666862
035
$a
(MiAaPQ)ucsd:14496
035
$a
AAI3666862
040
$a
MiAaPQ
$b
eng
$c
MiAaPQ
100
1
$a
Gallagher, Patrick W.
$3
1186721
245
1 0
$a
Operator Theory for Analysis of Convex Optimization Methods in Machine Learning.
264
0
$c
2014
300
$a
1 online resource (281 pages)
336
$a
text
$b
txt
$2
rdacontent
337
$a
computer
$b
c
$2
rdamedia
338
$a
online resource
$b
cr
$2
rdacarrier
500
$a
Source: Dissertation Abstracts International, Volume: 76-05(E), Section: B.
500
$a
Advisers: Virginia de Sa; Philip Gill.
502
$a
Thesis (Ph.D.)--University of California, San Diego, 2014.
504
$a
Includes bibliographical references
520
$a
As machine learning has more closely interacted with optimization, the concept of convexity has loomed large. Two properties beyond simple convexity have received particularly close attention: strong smoothness and strong convexity. These properties (and their relatives) underlie machine learning analyses from convergence rates to generalization bounds --- they are central and fundamental.
520
$a
This thesis takes as its focus properties from operator theory that, in specific instances, relate to broadened conceptions of convexity, strong smoothness, and strong convexity. Some of the properties we consider coincide with strong smoothness and strong convexity in some settings, but represent broadenings of these concepts in other situations of interest. Our intention throughout is to take an approach that balances theoretical generality with ease of use and subsequent extension.
520
$a
Through this approach we establish a framework, novel in its scope of application, in which a single analysis serves to recover standard convergence rates (typically established via a variety of separate arguments) for convex optimization methods prominent in machine learning.
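To make the operator viewpoint concrete, here is a minimal numerical sketch (an illustration under standard assumptions, not code from the thesis): gradient descent on a μ-strongly convex, L-smooth quadratic is the repeated application of an update operator T(x) = x − α∇f(x), and a single contraction argument on T recovers the familiar linear rate q = max(|1 − αμ|, |1 − αL|).

```python
import math

# f(x) = 0.5 * sum(lam_i * x_i^2): mu-strongly convex and L-smooth,
# with mu = min(lam) and L = max(lam).
lam = [1.0, 3.25, 5.5, 7.75, 10.0]
mu, L, alpha = min(lam), max(lam), 0.1   # step size alpha = 1/L

def T(x):
    """The iterative-update operator: one gradient-descent step."""
    return [xi - alpha * li * xi for xi, li in zip(x, lam)]

# Contraction factor of T on this quadratic.
q = max(abs(1 - alpha * mu), abs(1 - alpha * L))   # = 0.9 here

norm = lambda x: math.sqrt(sum(xi * xi for xi in x))
x = [1.0] * 5
for _ in range(200):
    x_new = T(x)
    # each application of T contracts the distance to the fixed point x* = 0
    assert norm(x_new) <= q * norm(x) + 1e-12
    x = x_new

print(norm(x) < 1e-5)   # linear (geometric) convergence at rate q -> True
```

The same contraction argument, applied to other update operators, is the kind of single analysis the abstract describes.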
520
$a
The framework is based on a perspective in which the iterative update for each convex optimization method is regarded as the application of some operator. We establish a collection of correspondences, novel in its comprehensiveness, that exist between ''contractivity-type'' properties of the iterative update operator and ''monotonicity-type'' properties of the associated displacement operator. We call particular attention to the comparison between the broader range of properties that we discuss and the more restricted range considered in the contemporary literature, demonstrating as well the relationship between the broader and narrower range.
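One concrete instance of such a correspondence (a standard result, given here only to illustrate the pattern the abstract describes): for an update operator T with displacement operator G = I − T,

```latex
T \ \text{nonexpansive}
\quad\Longleftrightarrow\quad
G = I - T \ \text{is } \tfrac{1}{2}\text{-cocoercive, i.e. }\
\langle Gx - Gy,\ x - y\rangle \;\ge\; \tfrac{1}{2}\,\|Gx - Gy\|^{2}
\quad \forall x,\, y .
```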
520
$a
In support of our discussion of these property correspondences and the optimization method analyses based on them, we relate operator theory concepts that may be unfamiliar to a machine learning audience to more familiar concepts from convex analysis. In addition to grounding our discussion of operator theory, this turns out to provide a fresh perspective on many touchstone concepts from convex analysis.
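A familiar touchstone for this operator-theory/convex-analysis bridge (standard material, not specific to this thesis): the proximal operator of a convex function, which is the resolvent of its subdifferential, is firmly nonexpansive:

```latex
\operatorname{prox}_{f}(x)
\;=\; \arg\min_{u}\Big(f(u) + \tfrac{1}{2}\,\|u - x\|^{2}\Big)
\;=\; (I + \partial f)^{-1}(x),
\qquad
\|\operatorname{prox}_{f}x - \operatorname{prox}_{f}y\|^{2}
\;\le\; \big\langle \operatorname{prox}_{f}x - \operatorname{prox}_{f}y,\ x - y \big\rangle .
```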
533
$a
Electronic reproduction.
$b
Ann Arbor, Mich. :
$c
ProQuest,
$d
2018
538
$a
Mode of access: World Wide Web
650
4
$a
Mathematics.
$3
527692
650
4
$a
Artificial intelligence.
$3
559380
655
7
$a
Electronic books.
$2
local
$3
554714
690
$a
0405
690
$a
0800
710
2
$a
ProQuest Information and Learning Co.
$3
1178819
710
2
$a
University of California, San Diego.
$b
Cognitive Science.
$3
1186722
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3666862
$z
click for full text (PQDT)