SpringerLink (Online service)
Composing Fisher Kernels from Deep Neural Models = A Practitioner's Approach /
Record type:
Bibliographic - language material, printed : Monograph/item
Title proper/Author:
Composing Fisher Kernels from Deep Neural Models / by Tayyaba Azim, Sarah Ahmed.
Other title:
A Practitioner's Approach /
Author:
Azim, Tayyaba.
Other author:
Ahmed, Sarah.
Physical description:
XIII, 59 p. 6 illus., 5 illus. in color. : online resource.
Contained By:
Springer Nature eBook
Subject:
Pattern recognition.
Electronic resource:
https://doi.org/10.1007/978-3-319-98524-4
ISBN:
9783319985244
Composing Fisher Kernels from Deep Neural Models [electronic resource] : A Practitioner's Approach / by Tayyaba Azim, Sarah Ahmed. - 1st ed. 2018. - XIII, 59 p. 6 illus., 5 illus. in color. : online resource. - (SpringerBriefs in Computer Science, 2191-5768).
Chapter 1. Kernel Based Learning: A Pragmatic Approach in the Face of New Challenges -- Chapter 2. Fundamentals of Fisher Kernels -- Chapter 3. Training Deep Models and Deriving Fisher Kernels: A Step Wise Approach -- Chapter 4. Large Scale Image Retrieval and Its Challenges -- Chapter 5. Open Source Knowledge Base for Machine Learning Practitioners.
This book shows machine learning enthusiasts and practitioners how to get the best of both worlds by deriving Fisher kernels from deep learning models. In addition, the book shares insight on how to store and retrieve high-dimensional Fisher vectors using feature selection and compression techniques. Feature selection and feature compression are two of the most popular off-the-shelf methods for reducing data's high-dimensional memory footprint and thus making it suitable for large-scale visual retrieval and classification. Kernel methods long remained the de facto standard for solving large-scale object classification tasks using low-level features, until the revival of deep models in 2006. Later, they made a comeback with improved Fisher vectors in 2010. However, their supremacy was always challenged by various versions of deep models, now considered to be the state of the art for solving various machine learning and computer vision tasks. Although the two research paradigms differ significantly, the excellent performance of Fisher kernels on the ImageNet large-scale object classification dataset has caught the attention of numerous kernel practitioners, and many have drawn parallels between the two frameworks for improving the empirical performance on benchmark classification tasks. Exploring concrete examples on different data sets, the book compares the computational and statistical aspects of different dimensionality reduction approaches and identifies metrics to show which approach is superior to the other for Fisher vector encodings. It also provides references to some of the most useful resources that could give practitioners and machine learning enthusiasts a quick start for learning and implementing a variety of deep learning models and kernel functions.
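The Fisher-kernel recipe the abstract refers to can be sketched briefly: given a generative model p(x|θ), each sample is mapped to its Fisher score g(x) = ∇θ log p(x|θ), and the kernel is K(x, y) = g(x)ᵀ F⁻¹ g(y). The toy below uses a diagonal-Gaussian model and the identity matrix in place of the Fisher information F; both are illustrative assumptions, not the book's deep-model derivation (which builds scores from models such as restricted Boltzmann machines).

```python
import numpy as np

def fisher_score(x, mu, var):
    """Gradient of log N(x; mu, diag(var)) w.r.t. mu and var."""
    d_mu = (x - mu) / var                          # d/d(mu)
    d_var = ((x - mu) ** 2 - var) / (2 * var ** 2)  # d/d(sigma^2)
    return np.concatenate([d_mu, d_var])

def fisher_kernel(X, Y, mu, var):
    """K(x, y) = g(x)^T F^{-1} g(y), with F approximated by the identity."""
    GX = np.array([fisher_score(x, mu, var) for x in X])
    GY = np.array([fisher_score(y, mu, var) for y in Y])
    return GX @ GY.T

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))             # 5 samples, 3 features
mu, var = X.mean(axis=0), X.var(axis=0)  # fit the toy generative model
K = fisher_kernel(X, X, mu, var)
print(K.shape)  # (5, 5)
```

The stacked Fisher scores (`GX`) are exactly the "Fisher vectors" whose high dimensionality motivates the feature selection and compression techniques the abstract mentions.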
ISBN: 9783319985244
Standard No.: 10.1007/978-3-319-98524-4 (doi)
Subjects--Topical Terms: Pattern recognition.
LC Class. No.: Q337.5
Dewey Class. No.: 006.4
LDR 03567nam a22004095i 4500
001 990693
003 DE-He213
005 20200703125930.0
007 cr nn 008mamaa
008 201225s2018 gw | s |||| 0|eng d
020 $a 9783319985244 $9 978-3-319-98524-4
024 7 $a 10.1007/978-3-319-98524-4 $2 doi
035 $a 978-3-319-98524-4
050 4 $a Q337.5
050 4 $a TK7882.P3
072 7 $a UYQP $2 bicssc
072 7 $a COM016000 $2 bisacsh
072 7 $a UYQP $2 thema
082 0 4 $a 006.4 $2 23
100 1 $a Azim, Tayyaba. $4 aut $4 http://id.loc.gov/vocabulary/relators/aut $3 1208235
245 1 0 $a Composing Fisher Kernels from Deep Neural Models $h [electronic resource] : $b A Practitioner's Approach / $c by Tayyaba Azim, Sarah Ahmed.
250 $a 1st ed. 2018.
264 1 $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2018.
300 $a XIII, 59 p. 6 illus., 5 illus. in color. $b online resource.
336 $a text $b txt $2 rdacontent
337 $a computer $b c $2 rdamedia
338 $a online resource $b cr $2 rdacarrier
347 $a text file $b PDF $2 rda
490 1 $a SpringerBriefs in Computer Science, $x 2191-5768
505 0 $a Chapter 1. Kernel Based Learning: A Pragmatic Approach in the Face of New Challenges -- Chapter 2. Fundamentals of Fisher Kernels -- Chapter 3. Training Deep Models and Deriving Fisher Kernels: A Step Wise Approach -- Chapter 4. Large Scale Image Retrieval and Its Challenges -- Chapter 5. Open Source Knowledge Base for Machine Learning Practitioners.
520 $a This book shows machine learning enthusiasts and practitioners how to get the best of both worlds by deriving Fisher kernels from deep learning models. In addition, the book shares insight on how to store and retrieve high-dimensional Fisher vectors using feature selection and compression techniques. Feature selection and feature compression are two of the most popular off-the-shelf methods for reducing data's high-dimensional memory footprint and thus making it suitable for large-scale visual retrieval and classification. Kernel methods long remained the de facto standard for solving large-scale object classification tasks using low-level features, until the revival of deep models in 2006. Later, they made a comeback with improved Fisher vectors in 2010. However, their supremacy was always challenged by various versions of deep models, now considered to be the state of the art for solving various machine learning and computer vision tasks. Although the two research paradigms differ significantly, the excellent performance of Fisher kernels on the ImageNet large-scale object classification dataset has caught the attention of numerous kernel practitioners, and many have drawn parallels between the two frameworks for improving the empirical performance on benchmark classification tasks. Exploring concrete examples on different data sets, the book compares the computational and statistical aspects of different dimensionality reduction approaches and identifies metrics to show which approach is superior to the other for Fisher vector encodings. It also provides references to some of the most useful resources that could give practitioners and machine learning enthusiasts a quick start for learning and implementing a variety of deep learning models and kernel functions.
650 0 $a Pattern recognition. $3 1253525
650 0 $a Signal processing. $3 561459
650 0 $a Image processing. $3 557495
650 0 $a Speech processing systems. $3 564428
650 0 $a Information storage and retrieval. $3 1069252
650 0 $a Mathematical statistics. $3 527941
650 0 $a Data structures (Computer science). $3 680370
650 0 $a Artificial intelligence. $3 559380
650 1 4 $a Pattern Recognition. $3 669796
650 2 4 $a Signal, Image and Speech Processing. $3 670837
650 2 4 $a Information Storage and Retrieval. $3 593926
650 2 4 $a Probability and Statistics in Computer Science. $3 669886
650 2 4 $a Data Storage Representation. $3 669777
650 2 4 $a Artificial Intelligence. $3 646849
700 1 $a Ahmed, Sarah. $4 aut $4 http://id.loc.gov/vocabulary/relators/aut $3 1208236
710 2 $a SpringerLink (Online service) $3 593884
773 0 $t Springer Nature eBook
776 0 8 $i Printed edition: $z 9783319985237
776 0 8 $i Printed edition: $z 9783319985251
830 0 $a SpringerBriefs in Computer Science, $x 2191-5768 $3 1255334
856 4 0 $u https://doi.org/10.1007/978-3-319-98524-4
912 $a ZDB-2-SCS
912 $a ZDB-2-SXCS
950 $a Computer Science (SpringerNature-11645)
950 $a Computer Science (R0) (SpringerNature-43710)