Black Box Variational Inference : Scalable, Generic Bayesian Computation and its Applications.
Record Type: Language materials, manuscript : Monograph/item
Title/Author: Black Box Variational Inference :
Remainder of title: Scalable, Generic Bayesian Computation and its Applications.
Author: Ranganath, Rajesh.
Description: 1 online resource (164 pages)
Notes: Source: Dissertation Abstracts International, Volume: 79-05(E), Section: B.
Contained By: Dissertation Abstracts International, 79-05B(E).
Subject: Computer science.
Online resource: click for full text (PQDT)
ISBN: 9780355480566
LDR    03688ntm a2200409Ki 4500
001    909182
005    20180419121557.5
006    m o u
007    cr mn||||a|a||
008    190606s2017 xx obm 000 0 eng d
020    $a 9780355480566
035    $a (MiAaPQ)AAI10638418
035    $a (MiAaPQ)princeton:12362
035    $a AAI10638418
040    $a MiAaPQ $b eng $c MiAaPQ
099    $a TUL $f hyy $c available through World Wide Web
100 1  $a Ranganath, Rajesh. $3 1179800
245 10 $a Black Box Variational Inference : $b Scalable, Generic Bayesian Computation and its Applications.
264  0 $c 2017
300    $a 1 online resource (164 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertation Abstracts International, Volume: 79-05(E), Section: B.
500    $a Adviser: David M. Blei.
502    $a Thesis (Ph.D.) $c Princeton University $d 2017.
504    $a Includes bibliographical references
506    $a This item is not available from ProQuest Dissertations & Theses.
520    $a Probabilistic generative models are robust to noise, uncover unseen patterns, and make predictions about the future. These models have been used successfully to solve problems in neuroscience, astrophysics, genetics, and medicine. The main computational challenge is computing the hidden structure given the data---posterior inference. For most models of interest, computing the posterior distribution requires approximations like variational inference. Variational inference transforms posterior inference into optimization. Classically, this optimization problem was feasible to deploy in only a small fraction of models.
520    $a This thesis develops black box variational inference. Black box variational inference is a variational inference algorithm that is easy to deploy on a broad class of models and has already found use in models for neuroscience and health care. It makes new kinds of models possible, ones that were too unruly for previous inference methods.
520    $a One set of models we develop is deep exponential families. Deep exponential families uncover new kinds of hidden patterns while being predictive of future data. Many existing models are deep exponential families. Black box variational inference makes it possible for us to quickly study a broad range of deep exponential families with minimal added effort for each new type of deep exponential family.
520    $a The ideas around black box variational inference also facilitate new kinds of variational methods. First, we develop hierarchical variational models. Hierarchical variational models improve the approximation quality of variational inference by building higher-fidelity approximations from coarser ones. We show that they help with inference in deep exponential families. Second, we introduce operator variational inference. Operator variational inference delves into the possible distance measures that can be used for the variational optimization problem. We show that this formulation categorizes various variational inference methods and enables variational approximations without tractable densities.
520    $a By developing black box variational inference, we have opened doors to new models, better posterior approximations, and new varieties of variational inference algorithms.
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538    $a Mode of access: World Wide Web
650  4 $a Computer science. $3 573171
650  4 $a Statistics. $3 556824
655  7 $a Electronic books. $2 local $3 554714
690    $a 0984
690    $a 0463
710 2  $a ProQuest Information and Learning Co. $3 1178819
710 2  $a Princeton University. $b Computer Science. $3 1179801
773 0  $t Dissertation Abstracts International $g 79-05B(E).
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10638418 $z click for full text (PQDT)
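The abstract's mention of hierarchical variational models can be made concrete with the standard auxiliary-variable construction: enrich the variational family by placing a prior over its parameters, then lower-bound the objective because the induced marginal is intractable. The notation below is mine and the details are a sketch; the thesis's exact development may differ.

```latex
% Enrich the variational family by mixing over its parameters \lambda:
q_{\mathrm{HVM}}(z;\theta) = \int q(z \mid \lambda)\, q(\lambda;\theta)\, d\lambda
% This marginal density is generally intractable, so introduce an auxiliary
% distribution r(\lambda \mid z) and bound the evidence from below
% (Jensen's inequality on the augmented model p(x,z)\,r(\lambda \mid z)):
\log p(x) \;\ge\; \mathbb{E}_{q(\lambda;\theta)\,q(z\mid\lambda)}
  \big[ \log p(x,z) + \log r(\lambda \mid z)
        - \log q(z \mid \lambda) - \log q(\lambda;\theta) \big]
```

The bound is tight when r(λ | z) matches the posterior over λ given z under q, which is the sense in which higher-fidelity approximations are built from coarser ones.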
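The abstract describes variational inference as optimization, and black box variational inference as a way to run that optimization on a broad class of models using only evaluations of the model's log joint density. A minimal sketch of the score-function gradient estimator at the heart of that idea, on a toy conjugate model (the model, step sizes, and the AdaGrad-style adaptive step are illustrative assumptions for this sketch, not details taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)            # observed data for the toy model

def log_joint(z):
    """log p(x, z): prior z ~ N(0, 1), likelihood x_i ~ N(z, 1), constants dropped."""
    return -0.5 * z ** 2 - 0.5 * np.sum((x - z) ** 2)

def log_q(z, mu, log_sigma):
    """Log density of the variational family q(z) = N(mu, exp(log_sigma)^2)."""
    s = np.exp(log_sigma)
    return -0.5 * ((z - mu) / s) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

def grad_log_q(z, mu, log_sigma):
    """Score function: gradient of log q with respect to (mu, log_sigma)."""
    s = np.exp(log_sigma)
    return np.array([(z - mu) / s ** 2, ((z - mu) / s) ** 2 - 1.0])

# Black box step: the ELBO gradient is estimated as a Monte Carlo average of
#   grad log q(z) * (log p(x, z) - log q(z)),
# so only log_joint evaluations are needed -- no model-specific derivations.
mu, log_sigma = 0.0, 0.0
lr, n_samples = 0.1, 64
G = np.zeros(2)                              # adaptive accumulator tames the noisy estimator
for step in range(2000):
    zs = mu + np.exp(log_sigma) * rng.normal(size=n_samples)
    g = np.mean([grad_log_q(z, mu, log_sigma) * (log_joint(z) - log_q(z, mu, log_sigma))
                 for z in zs], axis=0)
    G += g ** 2
    mu, log_sigma = np.array([mu, log_sigma]) + lr * g / np.sqrt(G + 1e-8)

post_mean = x.sum() / (len(x) + 1)           # exact posterior mean for this conjugate model
print(mu, post_mean)
```

For this conjugate toy model the exact posterior is Gaussian, so the fitted `mu` should approach the analytic posterior mean; the point of the estimator is that nothing in the update exploits that conjugacy.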
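The abstract also introduces deep exponential families: stacked layers of exponential-family latent variables, each layer parameterizing the one below. A purely illustrative top-down sampler for a two-layer construction (the layer sizes, gamma/Poisson choices, and softplus link are assumptions for this sketch, not the thesis's exact specification):

```python
import numpy as np

rng = np.random.default_rng(1)

def softplus(a):
    """A link that keeps the per-unit parameters positive."""
    return np.log1p(np.exp(a))

K2, K1, V = 3, 5, 10                      # top layer, bottom layer, observed dimension
W1 = rng.normal(0.0, 0.5, size=(K2, K1))  # weights tying layer 2 to layer 1
W0 = rng.normal(0.0, 0.5, size=(K1, V))   # weights tying layer 1 to the data

z2 = rng.gamma(shape=1.0, scale=1.0, size=K2)       # top-layer latents from a gamma prior
z1 = rng.gamma(shape=1.0, scale=softplus(z2 @ W1))  # each unit's scale set by the layer above
x = rng.poisson(softplus(z1 @ W0))                  # observed counts, Poisson given the bottom layer
print(x)
```

Each layer conditions on the one above only through an inner product passed through a positivity-preserving link, which is what lets a generic, gradient-free-of-derivations inference method like black box variational inference handle many variants of the family with the same code.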