A Study of Field Theories Via Neural Networks.
Record type: Bibliographic, language material, manuscript : Monograph/item
Title/Author: A Study of Field Theories Via Neural Networks. /
Author: Maiti, Anindita.
Physical description: 1 online resource (229 pages)
Notes: Source: Dissertations Abstracts International, Volume: 84-10, Section: B.
Contained by: Dissertations Abstracts International, 84-10B.
Subject: Theoretical physics.
Electronic resource: click for full text (PQDT)
ISBN: 9798379417994
LDR  03887ntm a22003737 4500
001  1148679
005  20240930100114.5
006  m o d
007  cr bn ---uuuuu
008  250605s2023 xx obm 000 0 eng d
020  $a 9798379417994
035  $a (MiAaPQ)AAI30419826
035  $a AAI30419826
040  $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Maiti, Anindita. $3 1474704
245 12 $a A Study of Field Theories Via Neural Networks.
264  0 $c 2023
300  $a 1 online resource (229 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Dissertations Abstracts International, Volume: 84-10, Section: B.
500  $a Advisor: Halverson, James.
502  $a Thesis (Ph.D.)--Northeastern University, 2023.
504  $a Includes bibliographical references
520  $a We propose a theoretical understanding of neural networks in terms of Wilsonian effective field theory. The correspondence relies on the fact that many asymptotic neural networks are drawn from Gaussian processes (GPs), the analog of non-interacting field theories. Moving away from the asymptotic limit yields a non-Gaussian process and corresponds to turning on particle interactions, allowing for the computation of correlation functions of neural network outputs with Feynman diagrams. Minimal non-Gaussian process likelihoods are determined by the most relevant non-Gaussian terms, according to the flow in their coefficients induced by the Wilsonian renormalization group. This yields a direct connection between overparameterization and simplicity of neural network likelihoods. Whether the coefficients are constants or functions may be understood in terms of GP limit symmetries, as expected from 't Hooft's technical naturalness. General theoretical calculations are matched to neural network experiments in the simplest class of models allowing the correspondence. Our formalism is valid for any of the many architectures that become a GP in an asymptotic limit, a property preserved under certain types of training. Parameter-space and function-space provide two different duality frames in which to study neural networks. We demonstrate that symmetries of network densities may be determined via dual computations of network correlation functions, even when the density is unknown and the network is not equivariant. Symmetry-via-duality relies on invariance properties of the correlation functions, which stem from the choice of network parameter distributions. Input and output symmetries of neural network densities are determined, which recover known Gaussian process results in the infinite width limit. The mechanism may also be utilized to determine symmetries during training, when parameters are correlated, as well as symmetries of the Neural Tangent Kernel. We demonstrate that the amount of symmetry in the initialization density affects the accuracy of networks trained on Fashion-MNIST, and that symmetry breaking helps only when it is in the direction of ground truth. We study the origin of non-Gaussianities in neural network field densities, and demonstrate two distinct methods to constrain these systematically. As examples, we engineer a few nonperturbative neural network field distributions. Lastly, we demonstrate a measure for the locality of neural network actions, via cluster decomposition of connected correlation functions of network output ensembles.
533  $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2024
538  $a Mode of access: World Wide Web
650  4 $a Theoretical physics. $3 1180318
653  $a Field theory
653  $a Machine learning theory
653  $a Neural networks
653  $a Wilsonian field theory
655  7 $a Electronic books. $2 local $3 554714
690  $a 0753
690  $a 0800
710 2  $a Northeastern University. $b Physics. $3 1186941
710 2  $a ProQuest Information and Learning Co. $3 1178819
773 0  $t Dissertations Abstracts International $g 84-10B.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30419826 $z click for full text (PQDT)
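The abstract's central correspondence, that randomly initialized networks approach a Gaussian process as width grows, with finite-width corrections acting like interactions, can be illustrated numerically. The sketch below is not from the dissertation; the architecture, scalings, and sample sizes are illustrative choices. It samples outputs of one-hidden-layer networks at a fixed input over random parameter draws and estimates the normalized connected 4-point function (excess kurtosis), which vanishes for a Gaussian and shrinks roughly like 1/width here.

```python
# Illustrative sketch (not the dissertation's code): finite-width deviation
# from the Gaussian-process limit, measured via excess kurtosis of outputs.
import numpy as np

rng = np.random.default_rng(0)

def sample_outputs(width, n_samples=20000, d_in=3):
    """Outputs f(x) at one fixed input x, over random parameter draws.

    NNGP-style scaling: input weights ~ N(0, 1/d_in),
    output weights ~ N(0, 1/width).
    """
    x = np.ones(d_in) / np.sqrt(d_in)
    W = rng.normal(0.0, 1.0 / np.sqrt(d_in), size=(n_samples, width, d_in))
    V = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_samples, width))
    h = np.tanh(W @ x)            # hidden activations, shape (n_samples, width)
    return np.sum(V * h, axis=1)  # one scalar output per parameter draw

def excess_kurtosis(f):
    """Connected 4-point function over variance^2; zero for a Gaussian."""
    f = f - f.mean()
    return np.mean(f**4) / np.mean(f**2) ** 2 - 3.0

for n in (2, 16, 128):
    print(f"width={n:4d}  excess kurtosis={excess_kurtosis(sample_outputs(n)):+.3f}")
```

As width increases, the estimated excess kurtosis drops toward zero, consistent with the abstract's statement that the asymptotic (wide) network is drawn from a Gaussian process and that non-Gaussian terms are finite-width effects.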