The geometry of intelligence = foundations of transformer networks in deep learning /
Record Type:
Bibliographic - Language material, printed : Monograph/item
Title/Author:
The geometry of intelligence / by Pradeep Singh, Balasubramanian Raman.
Other Title:
foundations of transformer networks in deep learning
Author:
Singh, Pradeep.
Other Authors:
Raman, Balasubramanian.
Published:
Singapore : Springer Nature Singapore, 2025.
Description:
xxi, 361 p. : ill. (some col.), digital ; 24 cm.
Contained By:
Springer Nature eBook
Subject:
Machine Learning.
Electronic Resource:
https://doi.org/10.1007/978-981-96-4706-4
ISBN:
9789819647064
The geometry of intelligence [electronic resource] : foundations of transformer networks in deep learning / by Pradeep Singh, Balasubramanian Raman. - Singapore : Springer Nature Singapore, 2025. - xxi, 361 p. : ill. (some col.), digital ; 24 cm. - (Studies in big data, 2197-6511 ; v. 175). - (Studies in big data ; v. 1)
Foundations of Representation Theory in Transformers -- Word Embeddings and Positional Encoding -- Attention Mechanisms -- Transformer Architecture: Encoder and Decoder -- Transformers in Natural Language Processing -- Transformers in Computer Vision -- Time Series Forecasting with Transformers -- Signal Analysis and Transformers -- Advanced Topics and Future Directions -- Convergence of Transformer Models: A Dynamical Systems Perspective.
This book offers an in-depth exploration of the mathematical foundations underlying transformer networks, the cornerstone of modern AI across various domains. Unlike existing literature that focuses primarily on implementation, this work delves into the elegant geometry, symmetry, and mathematical structures that drive the success of transformers. Through rigorous analysis and theoretical insights, the book unravels the complex relationships and dependencies that these models capture, providing a comprehensive understanding of their capabilities. Designed for researchers, academics, and advanced practitioners, this text bridges the gap between practical application and theoretical exploration. Readers will gain a profound understanding of how transformers operate in abstract spaces, equipping them with the knowledge to innovate, optimize, and push the boundaries of AI. Whether you seek to deepen your expertise or pioneer the next generation of AI models, this book is an essential resource on the mathematical principles of transformers.
ISBN: 9789819647064
Standard No.: 10.1007/978-981-96-4706-4 (doi)
Subjects--Topical Terms:
Machine Learning.
LC Class. No.: Q335
Dewey Class. No.: 006.30151
LDR 02601nam a2200337 a 4500
001 1162427
003 DE-He213
005 20250522130240.0
006 m d
007 cr nn 008maaau
008 251029s2025 si s 0 eng d
020 $a 9789819647064 $q (electronic bk.)
020 $a 9789819647057 $q (paper)
024 7 $a 10.1007/978-981-96-4706-4 $2 doi
035 $a 978-981-96-4706-4
040 $a GP $c GP
041 0 $a eng
050 4 $a Q335
072 7 $a UYQ $2 bicssc
072 7 $a COM004000 $2 bisacsh
072 7 $a UYQ $2 thema
082 0 4 $a 006.30151 $2 23
090 $a Q335 $b .S617 2025
100 1 $a Singh, Pradeep. $3 1481436
245 1 4 $a The geometry of intelligence $h [electronic resource] : $b foundations of transformer networks in deep learning / $c by Pradeep Singh, Balasubramanian Raman.
260 $a Singapore : $b Springer Nature Singapore : $b Imprint: Springer, $c 2025.
300 $a xxi, 361 p. : $b ill. (some col.), digital ; $c 24 cm.
490 1 $a Studies in big data, $x 2197-6511 ; $v v. 175
505 0 $a Foundations of Representation Theory in Transformers -- Word Embeddings and Positional Encoding -- Attention Mechanisms -- Transformer Architecture: Encoder and Decoder -- Transformers in Natural Language Processing -- Transformers in Computer Vision -- Time Series Forecasting with Transformers -- Signal Analysis and Transformers -- Advanced Topics and Future Directions -- Convergence of Transformer Models: A Dynamical Systems Perspective.
520 $a This book offers an in-depth exploration of the mathematical foundations underlying transformer networks, the cornerstone of modern AI across various domains. Unlike existing literature that focuses primarily on implementation, this work delves into the elegant geometry, symmetry, and mathematical structures that drive the success of transformers. Through rigorous analysis and theoretical insights, the book unravels the complex relationships and dependencies that these models capture, providing a comprehensive understanding of their capabilities. Designed for researchers, academics, and advanced practitioners, this text bridges the gap between practical application and theoretical exploration. Readers will gain a profound understanding of how transformers operate in abstract spaces, equipping them with the knowledge to innovate, optimize, and push the boundaries of AI. Whether you seek to deepen your expertise or pioneer the next generation of AI models, this book is an essential resource on the mathematical principles of transformers.
650 2 4 $a Machine Learning. $3 1137723
650 2 4 $a Communications Engineering, Networks. $3 669809
650 2 4 $a Artificial Intelligence. $3 646849
650 1 4 $a Computational Intelligence. $3 768837
650 0 $a Transformations (Mathematics) $3 527919
650 0 $a Artificial intelligence $x Mathematics. $3 840891
700 1 $a Raman, Balasubramanian. $3 1202362
710 2 $a SpringerLink (Online service) $3 593884
773 0 $t Springer Nature eBook
830 0 $a Studies in big data ; $v v.1. $3 1020233
856 4 0 $u https://doi.org/10.1007/978-981-96-4706-4
950 $a Intelligent Technologies and Robotics (SpringerNature-42732)