Neural Representations of Natural Language
Record type: Bibliographic - Language material, printed : Monograph/item
Title / Author: Neural Representations of Natural Language / by Lyndon White, Roberto Togneri, Wei Liu, Mohammed Bennamoun.
Author: White, Lyndon.
Other authors: Togneri, Roberto.
Description: XIV, 122 p. 36 illus., 31 illus. in color. online resource.
Contained By: Springer Nature eBook
Subject: Computational intelligence.
Electronic resource: https://doi.org/10.1007/978-981-13-0062-2
ISBN: 9789811300622
White, Lyndon.
Neural Representations of Natural Language
[electronic resource] / by Lyndon White, Roberto Togneri, Wei Liu, Mohammed Bennamoun. - 1st ed. 2019. - XIV, 122 p. 36 illus., 31 illus. in color. online resource. - Studies in Computational Intelligence, 1860-949X ; 783. - Studies in Computational Intelligence ; 564.
Introduction -- Machine Learning for Representations -- Current Challenges in Natural Language Processing -- Word Representations -- Word Sense Representations -- Phrase Representations -- Sentence representations and beyond -- Character-Based Representations -- Conclusion.
This book offers an introduction to modern natural language processing using machine learning, focusing on how neural networks create a machine interpretable representation of the meaning of natural language. Language is crucially linked to ideas – as Webster’s 1923 “English Composition and Literature” puts it: “A sentence is a group of words expressing a complete thought”. Thus the representation of sentences and the words that make them up is vital in advancing artificial intelligence and other “smart” systems currently being developed. Providing an overview of the research in the area, from Bengio et al.’s seminal work on a “Neural Probabilistic Language Model” in 2003, to the latest techniques, this book enables readers to gain an understanding of how the techniques are related and what is best for their purposes. As well as an introduction to neural networks in general and recurrent neural networks in particular, this book details the methods used for representing words, senses of words, and larger structures such as sentences or documents. The book highlights practical implementations and discusses many aspects that are often overlooked or misunderstood. The book includes thorough instruction on challenging areas such as hierarchical softmax and negative sampling, to ensure the reader fully and easily understands the details of how the algorithms function. Combining practical aspects with a more traditional review of the literature, it is directly applicable to a broad readership. It is an invaluable introduction for early graduate students working in natural language processing; a trustworthy guide for industry developers wishing to make use of recent innovations; and a sturdy bridge for researchers already familiar with linguistics or machine learning wishing to understand the other.
ISBN: 9789811300622
Standard No.: 10.1007/978-981-13-0062-2 doi
Subjects--Topical Terms: Computational intelligence.
LC Class. No.: Q342
Dewey Class. No.: 006.3
LDR 03591nam a22004095i 4500
001 1008789
003 DE-He213
005 20200705022623.0
007 cr nn 008mamaa
008 210106s2019 si | s |||| 0|eng d
020 $a 9789811300622 $9 978-981-13-0062-2
024 7 $a 10.1007/978-981-13-0062-2 $2 doi
035 $a 978-981-13-0062-2
050 4 $a Q342
072 7 $a UYQ $2 bicssc
072 7 $a TEC009000 $2 bisacsh
072 7 $a UYQ $2 thema
082 0 4 $a 006.3 $2 23
100 1 $a White, Lyndon. $4 aut $4 http://id.loc.gov/vocabulary/relators/aut $3 1228394
245 1 0 $a Neural Representations of Natural Language $h [electronic resource] / $c by Lyndon White, Roberto Togneri, Wei Liu, Mohammed Bennamoun.
250 $a 1st ed. 2019.
264 1 $a Singapore : $b Springer Singapore : $b Imprint: Springer, $c 2019.
300 $a XIV, 122 p. 36 illus., 31 illus. in color. $b online resource.
336 $a text $b txt $2 rdacontent
337 $a computer $b c $2 rdamedia
338 $a online resource $b cr $2 rdacarrier
347 $a text file $b PDF $2 rda
490 1 $a Studies in Computational Intelligence, $x 1860-949X ; $v 783
505 0 $a Introduction -- Machine Learning for Representations -- Current Challenges in Natural Language Processing -- Word Representations -- Word Sense Representations -- Phrase Representations -- Sentence representations and beyond -- Character-Based Representations -- Conclusion.
520 $a This book offers an introduction to modern natural language processing using machine learning, focusing on how neural networks create a machine interpretable representation of the meaning of natural language. Language is crucially linked to ideas – as Webster’s 1923 “English Composition and Literature” puts it: “A sentence is a group of words expressing a complete thought”. Thus the representation of sentences and the words that make them up is vital in advancing artificial intelligence and other “smart” systems currently being developed. Providing an overview of the research in the area, from Bengio et al.’s seminal work on a “Neural Probabilistic Language Model” in 2003, to the latest techniques, this book enables readers to gain an understanding of how the techniques are related and what is best for their purposes. As well as an introduction to neural networks in general and recurrent neural networks in particular, this book details the methods used for representing words, senses of words, and larger structures such as sentences or documents. The book highlights practical implementations and discusses many aspects that are often overlooked or misunderstood. The book includes thorough instruction on challenging areas such as hierarchical softmax and negative sampling, to ensure the reader fully and easily understands the details of how the algorithms function. Combining practical aspects with a more traditional review of the literature, it is directly applicable to a broad readership. It is an invaluable introduction for early graduate students working in natural language processing; a trustworthy guide for industry developers wishing to make use of recent innovations; and a sturdy bridge for researchers already familiar with linguistics or machine learning wishing to understand the other.
650 0 $a Computational intelligence. $3 568984
650 0 $a Signal processing. $3 561459
650 0 $a Image processing. $3 557495
650 0 $a Speech processing systems. $3 564428
650 0 $a Pattern recognition. $3 1253525
650 0 $a Computational linguistics. $3 555811
650 1 4 $a Computational Intelligence. $3 768837
650 2 4 $a Signal, Image and Speech Processing. $3 670837
650 2 4 $a Pattern Recognition. $3 669796
650 2 4 $a Computational Linguistics. $3 670080
700 1 $a Togneri, Roberto. $4 aut $4 http://id.loc.gov/vocabulary/relators/aut $3 1063269
700 1 $a Liu, Wei. $4 aut $4 http://id.loc.gov/vocabulary/relators/aut $3 1069876
700 1 $a Bennamoun, Mohammed. $e author. $4 aut $4 http://id.loc.gov/vocabulary/relators/aut $3 1302636
710 2 $a SpringerLink (Online service) $3 593884
773 0 $t Springer Nature eBook
776 0 8 $i Printed edition: $z 9789811300615
776 0 8 $i Printed edition: $z 9789811300639
776 0 8 $i Printed edition: $z 9789811343209
830 0 $a Studies in Computational Intelligence, $x 1860-949X ; $v 564 $3 1253640
856 4 0 $u https://doi.org/10.1007/978-981-13-0062-2
912 $a ZDB-2-INR
912 $a ZDB-2-SXIT
950 $a Intelligent Technologies and Robotics (SpringerNature-42732)
950 $a Intelligent Technologies and Robotics (R0) (SpringerNature-43728)
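The record above is flat display text; anyone reusing it programmatically would normally download the library's MARCXML or ISO 2709 export and parse that (for example with a tool such as pymarc). As a rough illustration only, the short Python sketch below parses the kind of tag / indicator / subfield display shown above into a plain dictionary. parse_marc_display and the sample lines are hypothetical helpers written for this page's text layout, not part of any catalogue API.

# Minimal sketch: parse a human-readable MARC display such as
#   "245 1 0 $a Neural Representations of Natural Language $h [electronic resource] / $c by ..."
# into {tag: [field, ...]}. Control fields (LDR, 001-008) stay plain strings; other
# fields become {"indicators": ..., "subfields": [(code, value), ...]}.
# This targets the text layout of this page only, not real MARC exchange formats.
from collections import defaultdict

def parse_marc_display(text: str) -> dict:
    record = defaultdict(list)
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        tag, rest = line[:3], line[3:].strip()
        if tag == "LDR" or (tag.isdigit() and tag < "010"):
            # Leader and control fields carry no indicators or subfields.
            record[tag].append(rest)
            continue
        dollar = rest.find("$")                     # indicators come before the first subfield
        indicators = rest[:dollar].strip() if dollar >= 0 else rest
        body = rest[dollar:] if dollar >= 0 else ""
        subfields = [(chunk[0], chunk[1:].strip())  # "$a value" -> ("a", "value")
                     for chunk in body.split("$") if chunk]
        record[tag].append({"indicators": indicators, "subfields": subfields})
    return dict(record)

if __name__ == "__main__":
    sample = ("100 1 $a White, Lyndon. $4 aut\n"
              "245 1 0 $a Neural Representations of Natural Language "
              "$h [electronic resource] / $c by Lyndon White, Roberto Togneri, "
              "Wei Liu, Mohammed Bennamoun.")
    record = parse_marc_display(sample)
    # Print the title proper (subfield $a of field 245).
    print(dict(record["245"][0]["subfields"])["a"])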