Theory and Algorithms for Learning with Stratified Decisions.
Record type:
Bibliographic - Language material, manuscript : Monograph/item
Title/Author:
Theory and Algorithms for Learning with Stratified Decisions./
Author:
DeSalvo, Giulia.
Physical description:
1 online resource (290 pages)
Notes:
Source: Dissertation Abstracts International, Volume: 78-12(E), Section: B.
Contained By:
Dissertation Abstracts International, 78-12B(E).
Subjects:
Applied mathematics. - Computer science. - Artificial intelligence.
Electronic resources:
click for full text (PQDT)
ISBN:
9780355128628
Theory and Algorithms for Learning with Stratified Decisions.
LDR 04104ntm a2200385Ki 4500
001 909725
005 20180426091044.5
006 m o u
007 cr mn||||a|a||
008 190606s2017 xx obm 000 0 eng d
020 $a 9780355128628
035 $a (MiAaPQ)AAI10261671
035 $a (MiAaPQ)nyu:12957
035 $a AAI10261671
040 $a MiAaPQ $b eng $c MiAaPQ
099 $a TUL $f hyy $c available through World Wide Web
100 1 $a DeSalvo, Giulia. $3 1180640
245 10 $a Theory and Algorithms for Learning with Stratified Decisions.
264 0 $c 2017
300 $a 1 online resource (290 pages)
336 $a text $b txt $2 rdacontent
337 $a computer $b c $2 rdamedia
338 $a online resource $b cr $2 rdacarrier
500 $a Source: Dissertation Abstracts International, Volume: 78-12(E), Section: B.
500 $a Adviser: Mehryar Mohri.
502 $a Thesis (Ph.D.) $c New York University $d 2017.
504 $a Includes bibliographical references
520 $a This dissertation deals with several key problems in modern machine learning that are deeply related to stratified decisions: learning with an abstention option and learning with complex decision trees or cascades.
520 $a Classification with abstentions is a key learning scenario where the algorithm can abstain from making a prediction, at the price of incurring a fixed cost. This learning scenario arises in a wide range of applications including health, bioinformatics, astronomical event detection, active learning, and many others. We analyze learning with abstentions in its full generality for both the batch setting and the on-line setting. We first introduce a novel framework for analyzing this scenario that consists of simultaneously learning two functions: a classifier along with an abstention function. We present a full theoretical analysis of this framework, including new data-dependent learning bounds as well as several consistency and calibration results. These theoretical guarantees guide us in deriving new algorithms for binary classification and multi-class classification. We also report the results of several experiments suggesting that our algorithms provide a significant improvement in practice over confidence-based algorithms. In the on-line setting, we design several algorithms and derive regret guarantees in both the adversarial and stochastic settings. In the process, we derive new regret bounds for on-line learning with feedback graphs and also design a new algorithm for on-line learning with sleeping experts that takes advantage of time-varying feedback graphs. We present natural extensions of existing algorithms as a baseline, and we then design more sophisticated algorithms that explicitly exploit the structure of our problem. We empirically validate the improvement of these more sophisticated algorithms on several datasets.
520 $a Decision trees are ubiquitous learning models commonly used in classification, regression, and clustering applications. In order to tackle much harder tasks in several of these scenarios, we introduce a broad learning model formed by cascades of predictors, Deep Cascades, that is structured as a general decision tree in which leaf predictors or node questions may be members of rich function families. For both binary classification and multi-class classification, we present new data-dependent theoretical guarantees for learning with Deep Cascades in terms of the complexities of the sub-families composing these sets of predictors and the fraction of sample points reaching each leaf that are correctly classified. These guarantees can guide the design of a variety of different algorithms for deep cascade models, and we give a detailed description of such algorithms along with favorable experimental results.
533 $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538 $a Mode of access: World Wide Web
650 4 $a Applied mathematics. $3 1069907
650 4 $a Computer science. $3 573171
650 4 $a Artificial intelligence. $3 559380
655 7 $a Electronic books. $2 local $3 554714
690 $a 0364
690 $a 0984
690 $a 0800
710 2 $a ProQuest Information and Learning Co. $3 1178819
710 2 $a New York University. $b Mathematics. $3 1180487
773 0 $t Dissertation Abstracts International $g 78-12B(E).
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10261671 $z click for full text (PQDT)
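
The abstention scenario summarized in the 520 abstract fields can be illustrated with the kind of confidence-based baseline the dissertation reports improving on: predict only when the model's confidence clears a threshold, otherwise abstain and pay a fixed cost. The following is a minimal sketch of such a baseline, assuming scikit-learn, a synthetic dataset, and arbitrary values for the threshold and abstention cost; it is not the dissertation's algorithm.

```python
# Illustrative confidence-based abstention baseline (not the dissertation's
# algorithm). The dataset, threshold, and abstention cost are assumed values.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

c = 0.3          # fixed cost incurred whenever the model abstains (assumed)
threshold = 0.8  # abstain when the top class probability is below this (assumed)

proba = clf.predict_proba(X_te)
confidence = proba.max(axis=1)
abstain = confidence < threshold
pred = clf.classes_[proba.argmax(axis=1)]

# Abstention loss: 1 for each wrong non-abstained prediction, c per abstention.
wrong = (pred != y_te) & ~abstain
loss = wrong.mean() + c * abstain.mean()
print(f"abstention rate: {abstain.mean():.2f}  abstention loss: {loss:.3f}")
```

The printed quantity corresponds to the loss described informally in the abstract: full loss for an incorrect prediction and a fixed cost c whenever the model abstains.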
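
The Deep Cascades paragraph describes models structured as decision trees whose node questions and leaf predictors may come from rich function families. The sketch below conveys only that structure, assuming a hand-picked node question and logistic-regression leaf predictors on synthetic data; the actual Deep Cascades learning algorithms and their guarantees are developed in the dissertation itself.

```python
# Illustrative two-leaf cascade: a simple node question routes each example to a
# leaf predictor that is itself a learned model. This shows only the structure
# described in the abstract (node questions plus rich leaf predictors); the node
# question, leaf learners, and data are assumptions, not the Deep Cascades method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=1)

def node_question(points):
    """Assumed node question: branch on the sign of the first feature."""
    return points[:, 0] >= 0.0

# Fit one leaf predictor per branch of the node question.
branch = node_question(X)
leaves = {
    True: LogisticRegression(max_iter=1000).fit(X[branch], y[branch]),
    False: LogisticRegression(max_iter=1000).fit(X[~branch], y[~branch]),
}

def cascade_predict(points):
    """Route each point through the node question to its leaf predictor."""
    out = np.empty(len(points), dtype=int)
    side = node_question(points)
    for b, leaf in leaves.items():
        sel = side == b
        if sel.any():
            out[sel] = leaf.predict(points[sel])
    return out

print("training accuracy:", (cascade_predict(X) == y).mean())
```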