University of California, Riverside.
Divide and Conquer Algorithms for Machine Learning.
Record Type:
Language materials, manuscript : Monograph/item
Title/Author:
Divide and Conquer Algorithms for Machine Learning.
Author:
Izbicki, Michael John.
Description:
1 online resource (170 pages)
Notes:
Source: Dissertation Abstracts International, Volume: 79-07(E), Section: A.
Contained By:
Dissertation Abstracts International, 79-07A(E).
Subject:
Information science.
Online resource:
click for full text (PQDT)
ISBN:
9780355472417
LDR  03213ntm a2200361Ki 4500
001  920660
005  20181203094031.5
006  m o u
007  cr mn||||a|a||
008  190606s2017 xx obm 000 0 eng d
020    $a 9780355472417
035    $a (MiAaPQ)AAI10624359
035    $a (MiAaPQ)ucr:13088
035    $a AAI10624359
040    $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Izbicki, Michael John. $3 1195525
245 10 $a Divide and Conquer Algorithms for Machine Learning.
264  0 $c 2017
300    $a 1 online resource (170 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertation Abstracts International, Volume: 79-07(E), Section: A.
500    $a Adviser: Christian R. Shelton.
502    $a Thesis (Ph.D.)--University of California, Riverside, 2017.
504    $a Includes bibliographical references.
520    $a This thesis improves the scalability of machine learning by studying mergeable learning algorithms. In a mergeable algorithm, many processors independently solve the learning problem on small subsets of the data. Then a master processor merges the solutions together with only a single round of communication. Mergeable algorithms are popular because they are fast, easy to implement, and have strong privacy guarantees.
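The merge paradigm in the abstract can be sketched with a classic mergeable model: ordinary least squares, whose sufficient statistics add across shards. This illustrative Python is not from the thesis; the shard layout and function names are assumptions for exposition.

```python
import numpy as np

def local_fit(X, y):
    """Each processor summarizes its own shard by the OLS sufficient
    statistics (X'X, X'y); these add across shards, making OLS mergeable."""
    return X.T @ X, X.T @ y

def merge(stats):
    """The master sums the shard statistics (one round of communication)
    and solves the normal equations once."""
    XtX = sum(s[0] for s in stats)
    Xty = sum(s[1] for s in stats)
    return np.linalg.solve(XtX, Xty)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
w_true = np.arange(1.0, 6.0)
y = X @ w_true

shards = np.array_split(np.arange(1000), 4)      # 4 simulated "processors"
stats = [local_fit(X[i], y[i]) for i in shards]  # embarrassingly parallel
w = merge(stats)                                 # single merge step
```

Because the statistics are additive, the merged solution here equals the model trained on the full dataset, which is the ideal case the thesis generalizes beyond.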
520    $a Our first contribution is a novel fast cross validation procedure suitable for any mergeable algorithm. This fast cross validation procedure has a constant runtime independent of the number of folds and can be implemented on distributed systems. This procedure is also widely applicable. We show that 32 recently proposed learning algorithms are mergeable and therefore fit our cross validation framework. These learning algorithms come from many subfields of machine learning, including density estimation, regularized loss minimization, dimensionality reduction, submodular optimization, variational inference, and Markov chain Monte Carlo.
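One way to see why cross validation becomes cheap for a mergeable model: fit each fold's statistics once, then obtain every held-out model by subtracting one fold's statistics from the precomputed total, so the per-fold cost does not grow with the number of folds. A minimal sketch, again using OLS sufficient statistics as the mergeable model (an assumption; the thesis's procedure covers far more algorithms):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=600)

k = 10
folds = np.array_split(np.arange(600), k)

# One pass over the data: per-fold sufficient statistics, plus their totals.
stats = [(X[f].T @ X[f], X[f].T @ y[f]) for f in folds]
tot_XtX = sum(s[0] for s in stats)
tot_Xty = sum(s[1] for s in stats)

errs = []
for f, (XtX, Xty) in zip(folds, stats):
    # Held-out model = total statistics minus this fold's ("merge by subtraction").
    w = np.linalg.solve(tot_XtX - XtX, tot_Xty - Xty)
    errs.append(np.mean((X[f] @ w - y[f]) ** 2))
cv_error = np.mean(errs)
```

Each iteration costs a fixed-size subtraction and solve regardless of k, which is the flavor of the "constant runtime independent of the number of folds" claim.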
520    $a We also provide two new mergeable learning algorithms. In the context of regularized loss minimization, existing merge procedures either have high bias or slow runtimes. We introduce the optimal weighted average (OWA) merge procedure, which achieves both a low bias and a fast runtime. We also improve the cover tree data structure for fast nearest neighbor queries by providing a merge procedure. In doing so, we improve both the theoretical guarantees of the cover tree and its practical runtime. For example, the original cover tree was able to find nearest neighbors in time O(c_exp^12 log n), and we improve this bound to O(c_hole^4 log n) for i.i.d. data. Here, c_exp and c_hole are measures of the "intrinsic dimensionality" of the data, and on typical datasets c_exp > c_hole. Experiments on large-scale ad-click, genomics, and image classification tasks empirically validate these algorithms.
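The weighted-average idea can be sketched as follows: rather than uniformly averaging the local solutions, learn the merge weights with a small second-stage regression over the local models' predictions. This is a hypothetical illustration of the general idea only; the function `owa_merge`, the ridge penalties, and the data split are my assumptions, not the thesis's actual OWA construction.

```python
import numpy as np

def owa_merge(ws, X2, y2, lam=1e-3):
    """Combine local solutions ws by learning merge weights: a small ridge
    regression of y2 on each local model's predictions (one column per model)."""
    V = np.stack(ws, axis=1)          # d x m matrix of local solutions
    Z = X2 @ V                        # n2 x m predictions, one column per model
    a = np.linalg.solve(Z.T @ Z + lam * np.eye(V.shape[1]), Z.T @ y2)
    return V @ a                      # merged parameter vector in span(ws)

rng = np.random.default_rng(2)
d, m = 8, 4
X = rng.normal(size=(2000, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=2000)

shards = np.array_split(np.arange(2000), m)
# Local ridge solutions, one per shard (regularized loss minimization).
ws = [np.linalg.solve(X[i].T @ X[i] + np.eye(d), X[i].T @ y[i]) for i in shards]
# The merge step only solves an m-dimensional problem on a small sample.
X2, y2 = X[:200], y[:200]
w_owa = owa_merge(ws, X2, y2)
```

The second-stage problem has only m unknowns (one weight per processor), which is why this style of merge stays cheap while correcting the bias of naive uniform averaging.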
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538    $a Mode of access: World Wide Web
650  4 $a Information science. $3 561178
650  4 $a Artificial intelligence. $3 559380
655  7 $a Electronic books. $2 local $3 554714
690    $a 0723
690    $a 0800
710 2  $a ProQuest Information and Learning Co. $3 1178819
710 2  $a University of California, Riverside. $b Computer Science. $3 1195526
773 0  $t Dissertation Abstracts International $g 79-07A(E).
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10624359 $z click for full text (PQDT)