Fan, Kai.
Deep Generative Models and Biological Applications.

Record type: Bibliographic - Language material, manuscript : Monograph/item
Title / Author: Deep Generative Models and Biological Applications. / Fan, Kai.
Author: Fan, Kai.
Description: 1 online resource (154 pages)
Notes: Source: Dissertation Abstracts International, Volume: 79-09(E), Section: B.
Contained by: Dissertation Abstracts International, 79-09B(E).
Subject: Statistics.
Electronic resource: click for full text (PQDT)
ISBN: 9780355870114

Fan, Kai.
Deep Generative Models and Biological Applications. - 1 online resource (154 pages)
Source: Dissertation Abstracts International, Volume: 79-09(E), Section: B.
Thesis (Ph.D.)--Duke University, 2017.
Includes bibliographical references
High-dimensional probability distributions are important objects in a wide variety of applications. Generative models provide an effective way to train on rich, readily available unlabeled data and to sample new data points from the underlying high-dimensional probability distribution. The recently proposed variational auto-encoder (VAE) framework is an efficient high-dimensional inference method that models complicated data manifolds in an approximate Bayesian way, i.e., by variational inference. We first discuss how to design a fast stochastic backpropagation algorithm for VAE-based amortized variational inference. In particular, we propose a second-order Hessian-free optimization method for Gaussian latent variable models and provide a theoretical justification for the convergence of the Monte Carlo estimation in our algorithm. We then apply amortized variational inference to a dynamic modeling application, a flu diffusion task. Compared with the traditional approximate Gibbs sampling algorithm, we make fewer assumptions about the infection rate. (An illustrative sketch of the reparameterization step appears after this description.)
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2018.
Mode of access: World Wide Web
ISBN: 9780355870114
Subjects--Topical Terms: Statistics.
Index Terms--Genre/Form: Electronic books.
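The abstract above describes stochastic backpropagation for VAE-style amortized variational inference with Gaussian latent variables. As a rough illustration only (this is not the dissertation's code), the following PyTorch sketch shows the general technique: the reparameterization trick and a single-sample Monte Carlo estimate of the negative ELBO. The GaussianVAE class, its layer sizes, and the use of a plain first-order Adam optimizer in place of the thesis's second-order Hessian-free method are all assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=20, h_dim=200):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.enc_mu = nn.Linear(h_dim, z_dim)      # mean of q(z|x)
        self.enc_logvar = nn.Linear(h_dim, z_dim)  # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def neg_elbo(self, x):
        h = torch.relu(self.enc(x))
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        # Reparameterization: z = mu + sigma * eps with eps ~ N(0, I), so the
        # Monte Carlo sample stays differentiable w.r.t. mu and logvar
        # (stochastic backpropagation).
        eps = torch.randn_like(mu)
        z = mu + torch.exp(0.5 * logvar) * eps
        logits = self.dec(z)
        # Single-sample Monte Carlo estimate of the negative ELBO:
        # Bernoulli reconstruction term plus analytic KL(q(z|x) || N(0, I)).
        recon = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
        kl = -0.5 * torch.sum(1.0 + logvar - mu.pow(2) - logvar.exp())
        return (recon + kl) / x.size(0)

model = GaussianVAE()
# The thesis proposes a second-order Hessian-free optimizer; a plain
# first-order optimizer is used here only to keep the sketch short.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 784)   # stand-in batch in [0, 1]; real data would go here
loss = model.neg_elbo(x)
opt.zero_grad()
loss.backward()           # gradients flow through the reparameterized sample
opt.step()
```

The point being illustrated is that writing z = mu + sigma * eps keeps the sampling noise in eps, so the Monte Carlo objective stays differentiable with respect to the variational parameters mu and logvar.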
LDR  03248ntm a2200361Ki 4500
001  916823
005  20180928111501.5
006  m o u
007  cr mn||||a|a||
008  190606s2017 xx obm 000 0 eng d
020    $a 9780355870114
035    $a (MiAaPQ)AAI10639562
035    $a (MiAaPQ)duke:14339
035    $a AAI10639562
040    $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Fan, Kai. $3 1190669
245 10 $a Deep Generative Models and Biological Applications.
264  0 $c 2017
300    $a 1 online resource (154 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertation Abstracts International, Volume: 79-09(E), Section: B.
500    $a Adviser: Katherine Heller.
502    $a Thesis (Ph.D.)--Duke University, 2017.
504    $a Includes bibliographical references
520    $a High-dimensional probability distributions are important objects in a wide variety of applications. Generative models provide an effective way to train on rich, readily available unlabeled data and to sample new data points from the underlying high-dimensional probability distribution. The recently proposed variational auto-encoder (VAE) framework is an efficient high-dimensional inference method that models complicated data manifolds in an approximate Bayesian way, i.e., by variational inference. We first discuss how to design a fast stochastic backpropagation algorithm for VAE-based amortized variational inference. In particular, we propose a second-order Hessian-free optimization method for Gaussian latent variable models and provide a theoretical justification for the convergence of the Monte Carlo estimation in our algorithm. We then apply amortized variational inference to a dynamic modeling application, a flu diffusion task. Compared with the traditional approximate Gibbs sampling algorithm, we make fewer assumptions about the infection rate.
520    $a Unlike the maximum likelihood approach of the VAE, Generative Adversarial Networks (GANs) attack the generation problem from a game-theoretic viewpoint. From this perspective, we design a VAE+GAN framework by placing a discriminator on top of an auto-encoder-based model and introducing an extra adversarial loss. The adversarial training induced by this classification loss drives the generative model to produce samples that the discriminator judges to be as real as samples from the true dataset. In practice, this technique improves the quality of generated samples, as demonstrated on image and text domains with carefully designed architectures. Additionally, we validate the importance of the generative adversarial loss with a conditional generative model in two biological applications: approximate generation of Turing-pattern PDEs in synthetic/systems biology, and automatic cardiovascular disease detection in medical image processing. (An illustrative sketch of this adversarial objective appears after this record.)
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538    $a Mode of access: World Wide Web
650  4 $a Statistics. $3 556824
650  4 $a Artificial intelligence. $3 559380
650  4 $a Bioinformatics. $3 583857
655  7 $a Electronic books. $2 local $3 554714
690    $a 0463
690    $a 0800
690    $a 0715
710 2  $a ProQuest Information and Learning Co. $3 1178819
710 2  $a Duke University. $b Computational Biology and Bioinformatics. $3 1190670
773 0  $t Dissertation Abstracts International $g 79-09B(E).
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10639562 $z click for full text (PQDT)
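The second abstract describes a VAE+GAN construction: a discriminator placed on top of an auto-encoder-based model, with an extra adversarial loss. The sketch below is a hypothetical PyTorch illustration of that kind of objective, not the dissertation's implementation: the deterministic encoder E, the networks G and D, their sizes, and the weighting factor adv_weight are all assumptions, and a plain MSE reconstruction term stands in for the VAE objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x_dim, z_dim, h_dim = 784, 20, 200
# Auto-encoder-style generator (a deterministic encoder stands in for the
# VAE encoder to keep the sketch short) and a binary discriminator.
E = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, z_dim))
G = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                  nn.Linear(h_dim, x_dim), nn.Sigmoid())
D = nn.Sequential(nn.Linear(x_dim, h_dim), nn.LeakyReLU(0.2), nn.Linear(h_dim, 1))

opt_g = torch.optim.Adam(list(E.parameters()) + list(G.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()
adv_weight = 0.1                # assumed trade-off between the two generator terms

x_real = torch.rand(32, x_dim)  # stand-in batch; real data would go here
x_fake = G(E(x_real))           # reconstruction produced by the auto-encoder

# 1) Discriminator step: label real samples 1 and reconstructions 0.
d_loss = bce(D(x_real), torch.ones(32, 1)) + bce(D(x_fake.detach()), torch.zeros(32, 1))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# 2) Generator step: reconstruction loss (standing in for the VAE objective)
#    plus the adversarial term asking D to believe the reconstruction is real.
x_fake = G(E(x_real))  # recompute so the generator graph is fresh
g_loss = F.mse_loss(x_fake, x_real) + adv_weight * bce(D(x_fake), torch.ones(32, 1))
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```

In a full VAE+GAN the reconstruction term would be the variational objective from the first abstract paragraph, and the same adversarial term can be attached to a conditional generator, as in the biological applications mentioned above.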