Embeddings in natural language processing : theory and advances in vector representations of meaning
Record type:
Bibliographic - language material, printed : Monograph/item
Title/Author:
Embeddings in natural language processing / Mohammad Taher Pilehvar, Jose Camacho-Collados.
Other title:
theory and advances in vector representations of meaning
Author:
Pilehvar, Mohammad Taher.
Added author:
Camacho-Collados, Jose.
Publisher:
[San Rafael, Calif.] : Morgan & Claypool, c2021.
Physical description:
xvii, 157 p. : ill. ; 24 cm.
Subject:
Natural language processing (Computer science)
ISBN:
9781636390215 (pbk.)
Notes:
Includes bibliographical references (p. 111-155).
Abstract:
Embeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into a low-dimensional vector representation, which is easily integrable in modern machine learning models, has played a central role in the development of NLP. Embedding techniques initially focused on words, but the attention soon started to shift to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents. This book provides a high-level synthesis of the main embedding techniques in NLP, in the broad sense. The book starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves to other types of embeddings, such as word sense, sentence and document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP. Throughout the book, the reader can find both essential information for understanding a certain topic from scratch and a broad overview of the most successful techniques developed in the literature.
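The abstract describes encoding words as low-dimensional vectors whose geometry reflects meaning. A minimal sketch of that core idea, using invented 3-dimensional vectors (real Word2Vec or GloVe embeddings are learned from corpora and typically have 100-300 dimensions), comparing words by cosine similarity:

```python
import math

# Toy 3-d "embeddings" -- values invented purely for illustration.
vectors = {
    "king":   [0.9, 0.1, 0.4],
    "queen":  [0.85, 0.15, 0.45],
    "banana": [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words end up closer together in the vector space.
sim_royal = cosine(vectors["king"], vectors["queen"])
sim_fruit = cosine(vectors["king"], vectors["banana"])
```

With real embeddings the same comparison is what powers nearest-neighbour word lookup; here the vectors are hand-made, so only the mechanics (not the similarity values) are meaningful.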
LC Class. No.: QA76.9.N38 P554 2021
Dewey Class. No.: 006.3/5
MARC record:
LDR    02036cam a2200217 a 4500
001    1065664
003    NhCcYBP
005    20210316170745.6
008    220831s2021 caua b 000 0 eng d
020    $a 9781636390215 (pbk.) : $c NT1947
020    $a 9781636390239 (cloth)
020    $z 9781636390222 (ebook)
035    $a om827597362
040    $a YDX $b eng $c YDX $d YDX $d STF $d OCLCO $d OCLCF $d YDXIT $d NFU
050  4 $a QA76.9.N38 $b P554 2021
082 04 $a 006.3/5 $2 23
100 1  $a Pilehvar, Mohammad Taher. $3 1370993
245 10 $a Embeddings in natural language processing : $b theory and advances in vector representations of meaning / $c Mohammad Taher Pilehvar, Jose Camacho-Collados.
260    $a [San Rafael, Calif.] : $b Morgan & Claypool, $c c2021.
300    $a xvii, 157 p. : $b ill. ; $c 24 cm.
490 1  $a Synthesis lectures on human language technologies, $x 1947-4040 ; $v #47
504    $a Includes bibliographical references (p. 111-155).
520    $a Embeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into a low-dimensional vector representation, which is easily integrable in modern machine learning models, has played a central role in the development of NLP. Embedding techniques initially focused on words, but the attention soon started to shift to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents. This book provides a high-level synthesis of the main embedding techniques in NLP, in the broad sense. The book starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves to other types of embeddings, such as word sense, sentence and document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP. Throughout the book, the reader can find both essential information for understanding a certain topic from scratch and a broad overview of the most successful techniques developed in the literature.
650  0 $a Natural language processing (Computer science) $3 641811
650  0 $a Artificial intelligence. $3 559380
650  0 $a Programming languages (Electronic computers) $x Semantics. $3 674560
700 1  $a Camacho-Collados, Jose. $3 1370994
830  0 $a Synthesis lectures on human language technologies ; $v #18. $3 931364
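The MARC display above follows a regular "TAG INDICATORS $code value" layout, so it can be parsed mechanically. A minimal sketch in plain Python (no MARC library; the `$x`-marker display format is an assumption about this particular OPAC's output, and the sample lines are a small subset of the record above):

```python
import re

# A few display lines in "TAG IND $code value" form, copied from the record.
RAW = """100 1  $a Pilehvar, Mohammad Taher. $3 1370993
245 10 $a Embeddings in natural language processing : $b theory and advances in vector representations of meaning / $c Mohammad Taher Pilehvar, Jose Camacho-Collados.
700 1  $a Camacho-Collados, Jose. $3 1370994"""

def parse_display(text):
    """Split each line into (tag, [(subfield_code, value), ...]).

    Subfields are kept as an ordered list rather than a dict because
    codes may repeat (e.g. the multiple $d entries in field 040).
    """
    fields = []
    for line in text.splitlines():
        tag = line[:3]
        subfields = [(code, value.strip())
                     for code, value in re.findall(r"\$(\w) ([^$]*)", line)]
        fields.append((tag, subfields))
    return fields

fields = parse_display(RAW)
```

For production use a real MARC library (e.g. pymarc reading binary MARC or MARCXML) would be the safer choice; this sketch only handles the human-readable display form shown here.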
Holdings: 1 item
Barcode: E048044
Location: Library, 3F Stacks
Circulation category: General book (BOOK)
Material type: General book
Call number: 006.35 P637 2021
Use type: Normal
Loan status: On shelf
Reserves: 0