Swersky, Kevin.
Improving Bayesian Optimization for Machine Learning Using Expert Priors.
Record Type: Language materials, manuscript : Monograph/item
Title/Author: Improving Bayesian Optimization for Machine Learning Using Expert Priors.
Author: Swersky, Kevin.
Description: 1 online resource (119 pages)
Notes: Source: Dissertation Abstracts International, Volume: 79-04(E), Section: B.
Subject: Artificial intelligence.
Online resource: click for full text (PQDT)
ISBN: 9780355456615
Swersky, Kevin. Improving Bayesian Optimization for Machine Learning Using Expert Priors. - 1 online resource (119 pages)
Source: Dissertation Abstracts International, Volume: 79-04(E), Section: B.
Thesis (Ph.D.)--University of Toronto (Canada), 2017.
Includes bibliographical references.
Deep neural networks have recently become astonishingly successful at many machine learning problems, such as object recognition and speech recognition, and they are now also being used in many new and creative ways. However, their performance critically relies on the proper setting of numerous hyperparameters. Manual tuning by an expert researcher has traditionally been an effective approach; however, it is becoming increasingly infeasible as models grow more complex and machine learning systems become further embedded within larger automated systems. Bayesian optimization has recently been proposed as a strategy for intelligently optimizing the hyperparameters of deep neural networks and other machine learning systems; it has been shown in many cases to outperform experts, and it provides a promising way to reduce both the computational and human time required. Nevertheless, expert researchers can still be quite effective at hyperparameter tuning because they can incorporate contextual knowledge and intuition into their search, whereas traditional Bayesian optimization treats each problem as a black box and therefore cannot take advantage of this knowledge. In this thesis, we draw inspiration from these abilities and incorporate them into the Bayesian optimization framework as additional prior information. These extensions include the ability to transfer knowledge between problems, the ability to transform the problem domain into one that is easier to optimize, and the ability to terminate experiments when they are no longer deemed promising, without requiring their training to converge. We demonstrate in experiments across a range of machine learning models that these extensions significantly reduce the cost and increase the robustness of Bayesian optimization for automatic hyperparameter tuning.
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2018.
Mode of access: World Wide Web.
ISBN: 9780355456615
Subjects--Topical Terms: Artificial intelligence.
Index Terms--Genre/Form: Electronic books.
LDR  03002ntm a2200325K 4500
001  912150
005  20180608102940.5
006  m o u
007  cr mn||||a|a||
008  190606s2017 xx obm 000 0 eng d
020  $a 9780355456615
035  $a (MiAaPQ)AAI10241438
035  $a (MiAaPQ)toronto:15042
035  $a AAI10241438
040  $a MiAaPQ $b eng $c MiAaPQ
100  1  $a Swersky, Kevin. $3 1184377
245  10 $a Improving Bayesian Optimization for Machine Learning Using Expert Priors.
264  0  $c 2017
300  $a 1 online resource (119 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Dissertation Abstracts International, Volume: 79-04(E), Section: B.
500  $a Adviser: Richard S. Zemel.
502  $a Thesis (Ph.D.)--University of Toronto (Canada), 2017.
504  $a Includes bibliographical references
520  $a Deep neural networks have recently become astonishingly successful at many machine learning problems, such as object recognition and speech recognition, and they are now also being used in many new and creative ways. However, their performance critically relies on the proper setting of numerous hyperparameters. Manual tuning by an expert researcher has traditionally been an effective approach; however, it is becoming increasingly infeasible as models grow more complex and machine learning systems become further embedded within larger automated systems. Bayesian optimization has recently been proposed as a strategy for intelligently optimizing the hyperparameters of deep neural networks and other machine learning systems; it has been shown in many cases to outperform experts, and it provides a promising way to reduce both the computational and human time required. Nevertheless, expert researchers can still be quite effective at hyperparameter tuning because they can incorporate contextual knowledge and intuition into their search, whereas traditional Bayesian optimization treats each problem as a black box and therefore cannot take advantage of this knowledge. In this thesis, we draw inspiration from these abilities and incorporate them into the Bayesian optimization framework as additional prior information. These extensions include the ability to transfer knowledge between problems, the ability to transform the problem domain into one that is easier to optimize, and the ability to terminate experiments when they are no longer deemed promising, without requiring their training to converge. We demonstrate in experiments across a range of machine learning models that these extensions significantly reduce the cost and increase the robustness of Bayesian optimization for automatic hyperparameter tuning.
533  $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538  $a Mode of access: World Wide Web
650  4  $a Artificial intelligence. $3 559380
650  4  $a Computer science. $3 573171
655  7  $a Electronic books. $2 local $3 554714
690  $a 0800
690  $a 0984
710  2  $a ProQuest Information and Learning Co. $3 1178819
710  2  $a University of Toronto (Canada). $b Computer Science. $3 845521
856  40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10241438 $z click for full text (PQDT)
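The abstract catalogued above describes Bayesian optimization for hyperparameter tuning. For readers unfamiliar with the baseline technique, the following is a minimal, self-contained sketch of standard Gaussian-process Bayesian optimization with the expected-improvement acquisition function. It illustrates only the conventional black-box method that the thesis extends, not the thesis's own contributions; the objective function, kernel length scale, and all other settings here are hypothetical placeholders.

```python
import math
import random

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel on a 1-D hyperparameter (length scale assumed).
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def inv(A):
    # Gauss-Jordan inverse with partial pivoting; fine for small GP matrices.
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c]
                M[r] = [vr - f * vc for vr, vc in zip(M[r], M[c])]
    return [row[n:] for row in M]

def expected_improvement(mu, sigma, best):
    # EI for minimization under a Gaussian posterior at one candidate point.
    z = (best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (best - mu) * cdf + sigma * pdf

def objective(lr):
    # Hypothetical "validation loss" as a function of one tuning knob.
    return (lr - 0.6) ** 2 + 0.05 * math.sin(20.0 * lr)

random.seed(0)
xs = [random.random() for _ in range(3)]   # small random initial design
ys = [objective(x) for x in xs]
grid = [i / 100.0 for i in range(101)]     # candidate hyperparameter values

for _ in range(12):
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (1e-6 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    Kinv = inv(K)
    alpha = [sum(Kinv[i][j] * ys[j] for j in range(n)) for i in range(n)]
    best = min(ys)

    def ei_at(x):
        # GP posterior mean/variance at x, then the EI acquisition value.
        k = [rbf(a, x) for a in xs]
        mu = sum(k[i] * alpha[i] for i in range(n))
        kv = [sum(Kinv[i][j] * k[j] for j in range(n)) for i in range(n)]
        var = max(rbf(x, x) - sum(k[i] * kv[i] for i in range(n)), 1e-12)
        return expected_improvement(mu, math.sqrt(var), best)

    nxt = max(grid, key=ei_at)             # evaluate where EI is highest
    xs.append(nxt)
    ys.append(objective(nxt))

print("best setting:", xs[ys.index(min(ys))], "loss:", round(min(ys), 4))
```

Each iteration refits the surrogate to all observations and spends its next (expensive) evaluation where expected improvement is largest, which is exactly the black-box loop the thesis augments with transfer, input warping, and early termination.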