p-Bits and q-Bits = probabilistic and quantum computing /
Record Type:
Language materials, printed : Monograph/item
Title/Author:
p-Bits and q-Bits / Supriyo Datta.
Remainder of title:
probabilistic and quantum computing /
Author:
Datta, Supriyo, 1954-
Published:
Singapore : World Scientific, c2025.
Description:
1 online resource (287 p.) : ill.
Subject:
Statistical mechanics.
Online resource:
https://www.worldscientific.com/worldscibooks/10.1142/13877#t=toc
ISBN:
9789811294501
p-Bits and q-Bits = probabilistic and quantum computing /
Datta, Supriyo, 1954-
p-Bits and q-Bits [electronic resource] : probabilistic and quantum computing / Supriyo Datta. - 1st ed. - Singapore : World Scientific, c2025. - 1 online resource (287 p.) : ill. - (New era electronics : a lecture notes series ; vol. 3).
ISBN: 9789811294501
Subjects--Topical Terms: Statistical mechanics.
LC Class. No.: QC174.8 / .D38 2025
Dewey Class. No.: 530.13
p-Bits and q-Bits = probabilistic and quantum computing /
LDR
11744cam a2200349 a 4500
001
1168266
005
20251017072503.0
006
m o d
007
cr cnu---unuuu
008
251229s2025 si a o 000 0 eng d
020
$a
9789811294501
$q
(ebook for institutions)
020
$a
9789811294518
$q
(ebook for individuals)
020
$z
9789811294617
$q
(paperback)
020
$z
9789811294495
$q
(hardcover)
035
$a
9789811294501
040
$a
MiAaPQ
$b
eng
$c
MiAaPQ
$d
MiAaPQ
041
0
$a
eng
050
4
$a
QC174.8
$b
.D38 2025
082
0 4
$a
530.13
$2
23
100
1
$a
Datta, Supriyo,
$d
1954-
$3
1240290
245
1 0
$a
p-Bits and q-Bits
$h
[electronic resource] :
$b
probabilistic and quantum computing /
$c
Supriyo Datta.
250
$a
1st ed.
260
$a
Singapore :
$b
World Scientific,
$c
c2025.
300
$a
1 online resource (287 p.) :
$b
ill.
490
1
$a
New era electronics : a lecture notes series ;
$v
vol. 3
505
0
$a
Intro -- Contents -- Acknowledgements -- A Note to Readers -- 1. Prologue -- 1.1 Fig. 1.1: n versus 2n -- 1.2 Figs. 1.2-1.3: Spintronics -- 1.2.0.1 Fig. 1.2 -- 1.2.0.2 Fig. 1.3 -- 1.3 Fig. 1.4: It's the correlations! -- 1.4 Figs. 1.5-1.6: q-bits versus p-bits -- 1.4.0.1 Fig. 1.5: q-bits -- 1.4.0.2 Fig. 1.6: p-bits -- 1.5 Figs. 1.7-1.8: The key difference -- 1.5.0.1 Fig. 1.7 -- 1.5.0.2 Fig. 1.8 -- 1.6 Fig. 1.9: Hardware acceleration -- 1.7 Statistical mechanics -- 2. Statistical Mechanics -- 2.1 State Space -- 2.1.1 Fig. 2.1: Fermi function -- 2.1.2 Fig. 2.2: Boltzmann law -- 2.1.3 Figs. 2.3-2.4: Fermi function from Boltzmann law -- 2.1.3.1 Fig. 2.3 -- 2.1.3.2 Fig. 2.4: Boltzmann Law is NOT the Boltzmann approximation -- 2.1.4 Figs. 2.5-2.7: Two non-interacting energy levels -- 2.1.4.1 Fig. 2.5 -- 2.1.4.2 Fig. 2.6: E - μN -- 2.1.4.3 Fig. 2.7: Probabilities -- 2.1.5 Fig. 2.8: Two levels with interaction -- 2.1.6 Quiz -- 2.1.6.1 Question 1 -- 2.1.6.2 Question 2 -- 2.1.6.3 Question 3 -- 2.2 Boltzmann Law -- 2.2.1 Fig. 2.9: System and reservoir -- 2.2.2 Figs. 2.10-2.13: Justifying the law -- 2.2.2.1 Fig. 2.11 -- 2.2.2.2 Fig. 2.12 -- 2.2.2.3 Fig. 2.13 -- 2.2.3 Fig. 2.14: Canonical ensemble -- 2.2.4 Fig. 2.15: Grand canonical ensemble -- 2.2.5 Quiz -- 2.2.5.1 Question 1 -- 2.2.5.2 Question 2 -- 2.2.5.3 Question 3 -- 2.3 Entropy -- 2.3.1 Figs. 2.16-2.18: Entropy from reservoir model -- 2.3.1.1 Fig. 2.16: Model for reservoir -- 2.3.1.2 Fig. 2.17 -- 2.3.1.3 Fig. 2.18 -- 2.3.2 Figs. 2.19-2.20: Thermodynamic versus information entropy -- 2.3.2.1 Fig. 2.19 -- 2.3.2.2 Fig. 2.20: Expression for entropy -- 2.3.3 Figs. 2.21-2.22: Reservoir model with d-level units -- 2.3.3.1 Fig. 2.21 -- 2.3.3.2 Fig. 2.22 -- 2.3.4 Fig. 2.23: μ, T from entropy S -- 2.3.5 Quiz -- 2.3.5.1 Question 1 -- 2.3.5.2 Question 2 -- 2.3.5.3 Question 3 -- 2.4 Free Energy.
505
8
$a
2.4.1 Fig. 2.24: Free energy, F -- 2.4.2 Fig. 2.25: Gibbs' inequality -- 2.4.3 Fig. 2.26: Equilibrium free energy -- 2.4.4 Figs. 2.27-2.28: Entropy drives flow -- 2.4.4.1 Fig. 2.27 -- 2.4.4.2 Fig. 2.28: Flow driven by temperature -- 2.4.5 Quiz -- 2.4.5.1 Question 1 -- 2.4.5.2 Question 2 -- 2.4.5.3 Question 3 -- 2.5 Self-Consistent Field -- 2.5.1 Figs. 2.29-2.32: The exponential problem -- 2.5.1.1 Figs. 2.29 -- 2.5.1.2 Fig. 2.30: Toy example -- 2.5.1.3 Fig. 2.31 -- 2.5.1.4 Fig. 2.32 -- 2.5.2 Figs. 2.33-2.36: SCF method -- 2.5.2.1 Fig. 2.34 -- 2.5.2.2 Fig. 2.35 -- 2.5.2.3 Fig. 2.36 -- 2.5.3 Fig. 2.37: SCF and neural networks -- 2.5.4 Quiz -- 2.5.4.1 Question 1 -- 2.5.4.2 Question 2 -- 2.5.4.3 Question 3 -- 2.6 Fig. 2.38: 5-minute Summary -- 3. Boltzmann Machines -- 3.1 Sampling -- 3.1.1 Figs. 3.1-3.4: From f to n -- 3.1.1.1 Fig. 3.1: Recap -- 3.1.1.2 Fig. 3.2: Replace f with n -- 3.1.1.3 Fig. 3.3 -- 3.1.1.4 Fig. 3.4: Generating samples -- 3.1.2 Figs. 3.5-3.8: Synapse from interaction energy -- 3.1.2.1 Fig. 3.5 -- 3.1.2.2 Fig. 3.6 -- 3.1.2.3 Fig. 3.7 -- 3.1.2.4 Fig. 3.8: A simple code -- 3.1.3 Figs. 3.9-3.15: Toy example -- 3.1.3.1 Fig. 3.9: Solution, the Boltzmann way -- 3.1.3.2 Fig. 3.10 -- 3.1.3.3 Fig. 3.11: How NOT to sample -- 3.1.3.4 Fig. 3.12: Solution by sampling -- 3.1.3.5 Fig. 3.13 -- 3.1.3.6 Fig. 3.14 -- 3.1.3.7 Fig. 3.15: Sampling method, key points -- 3.1.4 Quiz -- 3.1.4.1 Question 1 -- 3.1.4.2 Question 2 -- 3.1.4.3 Question 3 -- 3.2 Orchestrating Interactions -- 3.2.1 Figs. 3.16-3.20: Generalizing the toy model -- 3.2.1.1 Fig. 3.16 -- 3.2.1.2 Fig. 3.17 -- 3.2.1.3 Fig. 3.18: Four level example -- 3.2.1.4 Fig. 3.19 -- 3.2.1.5 Fig. 3.20 -- 3.2.2 Figs. 3.21-3.23: From natural to orchestrated interactions -- 3.2.2.1 Fig. 3.21 -- 3.2.2.2 Fig. 3.22: Software implementation -- 3.2.2.3 Fig. 3.23 -- 3.2.3 Figs. 3.24-3.25: p-bits and q-bits.
505
8
$a
3.2.3.1 Fig. 3.24 -- 3.2.3.2 Fig. 3.25 -- 3.2.4 Quiz -- 3.2.4.1 Question 1 -- 3.2.4.2 Question 2 -- 3.3 Optimization -- 3.3.1 Figs. 3.26-3.27: Graph partitioning -- 3.3.1.1 Fig. 3.26 -- 3.3.1.2 Fig. 3.27 -- 3.3.2 Figs. 3.28-3.31: Defining energy -- 3.3.2.1 Fig. 3.28 -- 3.3.2.2 Fig. 3.29 -- 3.3.2.3 Figs. 3.30-3.31: Finding x,w -- 3.3.3 Figs. 3.32-3.38: Imposing constraints -- 3.3.3.1 Fig. 3.32: State space response -- 3.3.3.2 Figs. 3.33-3.35: Constraints through energy -- 3.3.3.3 Fig. 3.36 -- 3.3.3.4 Figs. 3.37-3.38: From min-cut to max-cut -- 3.3.3.5 Fig. 3.37 -- 3.3.3.6 Fig. 3.38 -- 3.3.4 Figs. 3.39-3.40: Summary -- 3.3.4.1 Fig. 3.39 -- 3.3.4.2 Fig. 3.40 -- 3.3.5 Quiz -- 3.3.5.1 Question 1 -- 3.3.5.2 Question 2 -- 3.4 Inference -- 3.4.1 Figs. 3.41-3.45: Logic gates -- 3.4.1.1 Fig. 3.41 -- 3.4.1.2 Fig. 3.42 -- 3.4.1.3 Figs. 3.43 and 3.44 -- 3.4.1.4 Fig. 3.45 -- 3.4.2 Fig. 3.46: Image classification -- 3.4.3 Figs. 3.47-3.48: A simple learning rule -- 3.4.3.1 Fig. 3.47 -- 3.4.3.2 Fig. 3.48 -- 3.4.4 Figs. 3.49-3.50: Binary-bipolar interconversion -- 3.4.5 Quiz -- 3.4.5.1 Question 1 -- 3.4.5.2 Question 2 -- 3.5 Learning -- 3.5.1 Fig. 3.51: Learning rule #1 -- 3.5.2 Figs. 3.52-3.55: Average value and correlation matrix -- 3.5.2.1 Fig. 3.53 -- 3.5.2.2 Fig. 3.54 -- 3.5.2.3 Fig. 3.55 -- 3.5.3 Figs. 3.56-3.58: Learning rule #2 -- 3.5.3.1 Fig. 3.57 -- 3.5.3.2 Fig. 3.58 -- 3.5.4 Fig. 3.59: Learning a full adder -- 3.5.5 Figs. 3.60-3.62: Learning with hidden units -- 3.5.5.1 Fig. 3.60 -- 3.5.5.2 Fig. 3.61 -- 3.5.5.3 Fig. 3.62 -- 3.5.6 Quiz -- 3.5.6.1 Question 1 -- 3.5.6.2 Question 2 -- 3.5.6.3 Question 3 -- 3.6 Fig. 3.63: 5-minute Summary -- 4. Transition Matrix -- 4.1 Markov Chain Monte Carlo -- 4.1.1 Figs. 4.1-4.4: Transition matrix -- 4.1.1.1 Fig. 4.1: Definition -- 4.1.1.2 Fig. 4.2: Properties -- 4.1.1.3 Fig. 4.3 -- 4.1.1.4 Fig. 4.4.
505
8
$a
4.1.2 Figs. 4.5-4.9: Stationary distribution -- 4.1.2.1 Fig. 4.5 -- 4.1.2.2 Fig. 4.6 -- 4.1.2.3 Fig. 4.7 -- 4.1.2.4 Fig. 4.8 -- 4.1.2.5 Fig. 4.9 -- 4.1.3 Fig. 4.10: Metropolis algorithm -- 4.1.4 Quiz -- 4.1.4.1 Question 1 -- 4.1.4.2 Question 2 -- 4.2 Gibbs' Sampling -- 4.2.1 Figs. 4.11-4.12: How it works -- 4.2.1.1 Fig. 4.11 -- 4.2.1.2 Fig. 4.12 -- 4.2.2 Figs. 4.13-4.20: Toy example with n = 2 -- 4.2.2.1 Fig. 4.13 -- 4.2.2.2 Fig. 4.14: Transition matrix for updating p-bit 1 -- 4.2.2.3 Figs. 4.15-4.16 -- 4.2.2.4 Figs. 4.17-4.18 -- 4.2.2.5 Fig. 4.19: Transition matrix for updating p-bit 2 -- 4.2.2.6 Fig. 4.20 -- 4.2.3 Quiz -- 4.2.3.1 Question 1 -- 4.2.3.2 Question 2 -- 4.3 Sequential Versus Simultaneous Updates -- 4.3.1 Figs. 4.21-4.23: Sequential update -- 4.3.1.1 Fig. 4.21: Toy example -- 4.3.1.2 Fig. 4.22: Transition matrix -- 4.3.1.3 Fig. 4.23 -- 4.3.2 Fig. 4.24: Simultaneous update -- 4.3.3 Figs. 4.25-4.26: Sequential versus simultaneous -- 4.3.3.1 Fig. 4.25 -- 4.3.3.2 Fig. 4.26 -- 4.3.4 Fig. 4.27: Restricted Boltzmann machine -- 4.3.5 Quiz -- 4.3.5.1 Question 1 -- 4.3.5.2 Question 2 -- 4.4 Bayesian Networks -- 4.4.1 Figs. 4.28-4.35: Bayesian versus reciprocal networks -- 4.4.1.1 Figs. 4.29-4.30: Bayesian networks -- 4.4.1.2 Fig. 4.31 -- 4.4.1.3 Fig. 4.32: Why no energy function -- 4.4.1.4 Fig. 4.33 -- 4.4.1.5 Fig. 4.34 -- 4.4.1.6 Fig. 4.35 -- 4.4.2 Figs. 4.36-4.37: Bayes theorem -- 4.4.3 Quiz -- 4.4.3.1 Question 1 -- 4.4.3.2 Question 2 -- 4.5 Feynman Paths -- 4.5.1 Figs. 4.38-4.42: Multiplying W-matrices -- 4.5.1.1 Fig. 4.38: Why W-matrix? -- 4.5.1.2 Fig. 4.39 -- 4.5.1.3 Fig. 4.40 -- 4.5.1.4 Figs. 4.41-4.42 -- 4.5.1.5 Fig. 4.42 -- 4.5.2 Figs. 4.43-4.45: Matrix multiplication as sum over paths -- 4.5.2.1 Fig. 4.44 -- 4.5.2.2 Fig. 4.45 -- 4.5.3 Quiz -- 4.5.3.1 Question 1 -- 4.5.3.2 Question 2 -- 4.5.3.3 Question 3 -- 4.6 Fig. 4.46: 5-minute Summary.
505
8
$a
5. Quantum Boltzmann Law -- 5.1 Quantum Spins -- 5.1.1 Figs. 5.1-5.3: Classical spin -- 5.1.1.1 Fig. 5.1 -- 5.1.1.2 Fig. 5.2 -- 5.1.1.3 Fig. 5.3 -- 5.1.2 Figs. 5.4-5.5: Quantum spins -- 5.1.2.1 Fig. 5.4 -- 5.1.2.2 Fig. 5.5 -- 5.1.3 Fig. 5.6: Density matrix -- 5.1.4 Figs. 5.7-5.10: Predicting measurements -- 5.1.4.1 Figs. 5.7 -- 5.1.4.2 Fig. 5.8 -- 5.1.4.3 Fig. 5.9 -- 5.1.4.4 Fig. 5.10 -- 5.1.5 Quiz -- 5.1.5.1 Question 1 -- 5.1.5.2 Question 2 -- 5.1.5.3 Question 3 -- 5.2 One q-bit System -- 5.2.1 Fig. 5.11: Hamiltonian -- 5.2.2 Figs. 5.12-5.14: Density matrix -- 5.2.2.1 Fig. 5.12 -- 5.2.2.2 Fig. 5.13 -- 5.2.2.3 Fig. 5.14 -- 5.2.3 Figs. 5.15-5.17: Predicting mz -- 5.2.3.1 Fig. 5.15 -- 5.2.3.2 Fig. 5.16 -- 5.2.3.3 Fig. 5.17 -- 5.2.4 Quiz -- 5.2.4.1 Question 1 -- 5.2.4.2 Question 2 -- 5.2.4.3 Question 3 -- 5.3 Spin-Spin Interactions -- 5.3.1 Figs. 5.18-5.19: Interaction Hamiltonian -- 5.3.1.1 Fig. 5.18 -- 5.3.1.2 Fig. 5.19 -- 5.3.2 Figs. 5.20-5.21: 2-spin matrices -- 5.3.2.1 Fig. 5.20 -- 5.3.2.2 Fig. 5.21 -- 5.3.3 Figs. 5.22-5.26: Product matrices -- 5.3.3.1 Fig. 5.22 -- 5.3.3.2 Fig. 5.23 -- 5.3.3.3 Fig. 5.24 -- 5.3.3.4 Fig. 5.25 -- 5.3.3.5 Fig. 5.26 -- 5.3.4 Fig. 5.27: n-spin matrices -- 5.3.5 Figs. 5.28-5.29: Why quantum computers? -- 5.3.5.1 Fig. 5.28 -- 5.3.5.2 Fig. 5.29 -- 5.3.6 Quiz -- 5.3.6.1 Question 1 -- 5.3.6.2 Question 2 -- 5.3.6.3 Question 3 -- 5.4 Two q-bit System -- 5.4.1 Figs. 5.30-5.31: Hamiltonian -- 5.4.1.1 Figs. 5.30 -- 5.4.1.2 Fig. 5.31 -- 5.4.2 Figs. 5.32-5.33: Ising spins -- 5.4.2.1 Fig. 5.32 -- 5.4.2.2 Fig. 5.33 -- 5.4.3 Figs. 5.34-5.35: Quantum spins -- 5.4.3.1 Fig. 5.34 -- 5.4.3.2 Fig. 5.35 -- 5.4.4 Quiz -- 5.4.4.1 Question 1 -- 5.4.4.2 Question 2 -- 5.4.4.3 Question 3 -- 5.5 Quantum Annealing -- 5.5.1 Figs. 5.36-5.38: Why anneal? -- 5.5.1.1 Fig. 5.36 -- 5.5.1.2 Fig. 5.37 -- 5.5.1.3 Fig. 5.38.
505
8
$a
5.5.2 Figs. 5.39-5.40: Translating to quantum spins.
520
$a
"This book is the third volume in the New Era Electronics lecture notes series, a compilation of volumes defining the important concepts tied to the electronics transition happening in the 21st century. The material is adapted from a unique course that connects three diverse fields - statistical mechanics, neural networks and quantum computing - using the unifying concept of a state-space with 2N dimensions defined by N binary bits. First, the seminal concepts of statistical mechanics, developed to describe natural interacting systems, are described. Then, these concepts are connected to engineering interacting systems like Boltzmann Machines (BM), which are cleverly designed to solve important problems in machine learning. Finally, we connect to engineered quantum systems, stressing the key role of quantum interference in distinguishing them from classical systems like BM. Assuming only a basic background in differential equations and linear algebra, this book is accessible to broader audiences across its described topics, including students in physics, engineering and computing, as well as professionals working actively in the technical fields looking for a primer to quantum computational methods"--
$c
Provided by publisher.
588
$a
Description based on online resource; title from digital title page (viewed on September 06, 2024)
650
0
$a
Statistical mechanics.
$3
676754
830
0
$a
New era electronics : a lecture notes series ;
$v
vol. 3.
$3
1497641
856
4 0
$u
https://www.worldscientific.com/worldscibooks/10.1142/13877#t=toc
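Note: the summary field (520) above centers on one unifying idea, a state space of 2^N configurations defined by N binary bits, with equilibrium probabilities given by the Boltzmann law and explored in practice by sampling (Boltzmann machines and Gibbs sampling appear in chapters 3-4 of the contents note). The sketch below is a minimal illustration of that idea only; it is not code from the book, and the two-p-bit couplings J, biases h, and unit temperature are arbitrary assumed values.

# Illustrative sketch (not from the book): a two-p-bit Boltzmann machine sampled
# with Gibbs updates and compared against the exact Boltzmann law over the full
# 2^N state space.
import itertools
import math
import random

J = [[0.0, 1.0],
     [1.0, 0.0]]      # symmetric coupling between the two p-bits (assumed values)
h = [0.5, -0.5]       # local biases (assumed values); kT is taken as 1

def energy(s):
    """E(s) = -sum_i h_i s_i - sum_{i<j} J_ij s_i s_j for bipolar s_i in {-1,+1}."""
    e = -sum(h[i] * s[i] for i in range(len(s)))
    for i in range(len(s)):
        for j in range(i + 1, len(s)):
            e -= J[i][j] * s[i] * s[j]
    return e

# Exact Boltzmann probabilities by enumerating all 2^N states (feasible only for small N).
states = list(itertools.product([-1, +1], repeat=2))
weights = [math.exp(-energy(s)) for s in states]
Z = sum(weights)
exact = {s: w / Z for s, w in zip(states, weights)}

# Gibbs sampling: update one p-bit at a time from its conditional distribution.
random.seed(0)
s = [random.choice([-1, +1]) for _ in range(2)]
counts = {st: 0 for st in states}
for sweep in range(20000):
    for i in range(2):
        field = h[i] + sum(J[i][j] * s[j] for j in range(2) if j != i)
        p_up = 1.0 / (1.0 + math.exp(-2.0 * field))   # P(s_i = +1 | other p-bit)
        s[i] = +1 if random.random() < p_up else -1
    counts[tuple(s)] += 1

total = sum(counts.values())
for st in states:
    print(st, "exact %.3f" % exact[st], "sampled %.3f" % (counts[st] / total))

For small N the exact enumeration over all 2^N states is feasible; for large N only the sampled estimate remains practical, which is the scaling point the summary raises about p-bits.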