Generative Modeling with Differentiable Dynamics.
Record type: Language materials, manuscript : Monograph/item
Title/Author: Generative Modeling with Differentiable Dynamics.
Author: Chen, Ricky Tian Qi.
Collation: 1 online resource (120 pages)
Note: Source: Dissertations Abstracts International, Volume: 84-09, Section: B.
Contained by: Dissertations Abstracts International, 84-09B.
Subject: Computer science.
Electronic resource: click for full text (PQDT)
ISBN: 9798377616269
Thesis (Ph.D.)--University of Toronto (Canada), 2023.
Includes bibliographical references.
Deep learning has focused on building differentiable models for a variety of data modalities, but there are still some types of data that deep neural networks struggle to handle natively: in particular, data that come from high-dimensional distributions, time-stamped data that are sampled at irregular frequencies, and event-based data that exhibit both discrete- and continuous-time behaviors. These data modalities can be better modeled by specialized models with particular built-in structures. However, placing such structures within models can also come at the cost of extra computation, in both computing the output and its gradient.

We discuss tractable methods for the construction and application of generative models to high-dimensional and continuous-time data. We focus on building flexible probabilistic models that provide efficient algorithms for both sampling and log-likelihood evaluation, as these enable simple adoption within a wide variety of machine learning applications.

In the first part of this thesis, we construct unbiased gradient estimators that enable efficient training within two foundational probabilistic modeling frameworks: 1) the Residual Flow makes use of an invertible residual network and excels at both density estimation and hybrid modeling, and 2) SUMO enables unbiased estimation of log marginal likelihoods for latent variable models, enabling plug-and-play applications in posterior inference and reinforcement learning.

In the second part of this thesis, we present methods for dealing with discrete events localized in continuous time. We first construct a new framework for generative modeling based on ordinary differential equations (ODEs), called Continuous Normalizing Flows. We then embed it within spatio-temporal point process modeling to build efficient yet flexible models for data from a wide variety of scientific domains.

Finally, we show how to differentiate through event handling in ODE solvers, allowing us to extend Neural ODEs to implicitly defined termination times and enabling the learning of discrete events and discontinuous dynamics. This allows gradient-based training of switching linear dynamical systems, applicable in neuroscience and finance for decomposing a complex time series into intervals.
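The Continuous Normalizing Flows mentioned in the abstract define a density through an ODE: a state z(t) follows dz/dt = f(z, t), while its log-density evolves by the instantaneous change-of-variables formula d(log p)/dt = -tr(df/dz). A minimal sketch of that idea, not the thesis's implementation: here f is an assumed toy linear map (so the trace is constant and the result can be checked in closed form), integrated with SciPy.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

# Toy continuous normalizing flow with linear dynamics f(z, t) = A @ z.
# For linear f, the Jacobian trace tr(df/dz) is simply tr(A), so the
# log-density change over the integration is exactly -tr(A) * T.
A = np.array([[0.1, -0.3],
              [0.2, 0.05]])

def augmented_dynamics(t, state):
    z = state[:2]
    dz = A @ z                # state dynamics
    dlogp = -np.trace(A)      # log-density dynamics (constant here)
    return np.concatenate([dz, [dlogp]])

z0 = np.array([1.0, -1.0])
state0 = np.concatenate([z0, [0.0]])  # accumulated delta log p starts at 0
sol = solve_ivp(augmented_dynamics, (0.0, 1.0), state0, rtol=1e-8, atol=1e-8)

z1, delta_logp = sol.y[:2, -1], sol.y[2, -1]
# Sanity checks against the closed-form solution of the linear ODE:
assert np.allclose(z1, expm(A) @ z0, atol=1e-5)
assert abs(delta_logp + np.trace(A)) < 1e-6  # delta log p = -tr(A) * T, T = 1
```

In the general (nonlinear, neural) case the trace is not constant and must be computed or estimated at each step; the linear choice here only keeps the sketch verifiable.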
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2024. Mode of access: World Wide Web.
Subjects--Topical Terms: Computer science.
Subjects--Index Terms: Machine learning
Index Terms--Genre/Form: Electronic books.
LDR  03672ntm a22003977 4500
001  1143555
005  20240517104558.5
006  m o d
007  cr mn ---uuuuu
008  250605s2023 xx obm 000 0 eng d
020  $a 9798377616269
035  $a (MiAaPQ)AAI30243146
035  $a AAI30243146
040  $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Chen, Ricky Tian Qi. $3 1468277
245 10 $a Generative Modeling with Differentiable Dynamics.
264  0 $c 2023
300  $a 1 online resource (120 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Dissertations Abstracts International, Volume: 84-09, Section: B.
500  $a Advisor: Duvenaud, David.
502  $a Thesis (Ph.D.)--University of Toronto (Canada), 2023.
504  $a Includes bibliographical references.
520  $a Deep Learning has focused around building differentiable models for a variety of data modalities, but there are still some types of data that deep neural networks struggle to handle natively. In particular, data that come from high-dimensional distributions, time-stamped data that are sampled at irregular frequencies, and event-based data that exhibit both discrete- and continuous-time behaviors. These data modalities can be better modeled by specialized models with particular built-in structures. However, placing such structures within models can also come at the cost of extra computation, in both computing the output and its gradient. We will discuss tractable methods for the construction and application of generative models to high-dimensional and continuous-time data. We focus on building flexible probabilistic models that provide both an efficient algorithm for sampling and log-likelihood evaluations, as these enable simple adoption within a wide variety of machine learning applications. In the first part of this thesis, we construct unbiased gradient estimators for enabling efficient training within two foundational probabilistic modeling frameworks: 1) the Residual Flow makes use of an invertible residual network and excels at both density estimation and hybrid modeling, and 2) SUMO enables unbiased estimation of log marginal likelihoods for latent variable models, enabling them for plug-and-play applications in posterior inference and reinforcement learning. In the second part of this thesis, we present methods for dealing with discrete events localized in continuous time. We first construct a new framework for generative modeling based on ordinary differential equations (ODE) called Continuous Normalizing Flows. We then embed its usage within spatio-temporal point process modeling to build efficient yet flexible models for data from a wide variety of scientific domains. Finally, we show how to differentiate through event handling in ODE solvers, allowing us to extend Neural ODEs to cases of implicitly defined termination times and enabling learning of discrete events and discontinuous dynamics. This allows gradient-based training for switching linear dynamical systems, applicable for neuroscience and finance to decompose a complex time series into intervals.
533  $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2024
538  $a Mode of access: World Wide Web
650  4 $a Computer science. $3 573171
653  $a Machine learning
653  $a Data
653  $a Residual Flow
653  $a SUMO
653  $a Ordinary differential equations
653  $a Continuous Normalizing Flows
655  7 $a Electronic books. $2 local $3 554714
690  $a 0800
690  $a 0984
710 2  $a ProQuest Information and Learning Co. $3 1178819
710 2  $a University of Toronto (Canada). $b Computer Science. $3 845521
773 0  $t Dissertations Abstracts International $g 84-09B.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30243146 $z click for full text (PQDT)