Minimum gamma-divergence for regression and classification problems
Record type: Bibliographic - language material, printed : Monograph/item
Title/Author: Minimum gamma-divergence for regression and classification problems / by Shinto Eguchi.
Author: Eguchi, Shinto.
Publisher: Singapore : Springer Nature Singapore : Imprint: Springer, 2025.
Description: viii, 118 p. : ill., digital ; 24 cm.
Contained by: Springer Nature eBook
Subject: Biostatistics.
Electronic resource: https://doi.org/10.1007/978-981-97-8880-4
ISBN: 9789819788804
Minimum gamma-divergence for regression and classification problems [electronic resource] / by Shinto Eguchi. - Singapore : Springer Nature Singapore : Imprint: Springer, 2025. - viii, 118 p. : ill., digital ; 24 cm. - (JSS research series in statistics, ISSN 2364-0065).
Contents: 1. Introduction -- 2. Framework of gamma-divergence -- 2.1. Scale invariance -- 2.2. GM divergence and HM divergence -- 3. Minimum divergence methods for generalized linear models -- 3.1. Bernoulli logistic model -- 3.2. Poisson log-linear model -- 3.3. Poisson point process model -- 4. Minimum divergence methods in machine learning -- 4.1. Multi-class AdaBoost -- 4.2. Boltzmann machine -- 5. Gamma-divergence for real-valued functions -- 6. Discussion.
Abstract: This book introduces the gamma-divergence, a measure of distance between probability distributions proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been studied extensively as a tool for robust estimation when the power index γ is positive. It can also be defined when γ is negative, as long as the integrability condition is satisfied; the author therefore considers the gamma-divergence defined on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence for negative γ. In particular, the gamma-divergence with γ equal to -1 is called the geometric-mean (GM) divergence. The book begins with an overview of the gamma-divergence and its properties, and then discusses its applications in areas including machine learning, statistics, and ecology. Bernoulli, categorical, Poisson, negative binomial, and Boltzmann distributions are treated as typical examples. The minimum gamma-divergence method is then applied to regression models that explicitly or implicitly assume these distributions for the dependent variable in generalized linear models. In ensemble learning, AdaBoost is derived from the exponential loss function via a weighted majority vote, and the exponential loss function turns out to be deeply connected to the GM divergence. In the Boltzmann machine, maximum likelihood estimation must rely on approximations such as the mean-field approximation because the partition function is intractable to compute. By considering the GM divergence and the exponential loss, however, it is shown that computing the partition function is unnecessary, so estimation can be carried out without variational inference.
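To orient the reader, here is a minimal sketch, in LaTeX, of the two objects the abstract centers on. The γ-cross entropy and γ-divergence are written in the standard form of the cited Fujisawa and Eguchi (2008) paper; the book's own normalization, and its exact construction of the GM divergence at γ = -1, may differ. The exponential loss is the textbook AdaBoost objective; its link to the GM divergence is the book's result and is only referenced here. The symbols d_γ, D_γ, L_exp, F, and n are notation introduced for this sketch.

% gamma-cross entropy between a data density g and a model density f,
% defined whenever the integrals are finite (including negative gamma
% on discrete distributions, the regime the book emphasizes):
\[
  d_\gamma(g,f) \;=\; -\frac{1}{\gamma}\,\log\!\int g(x)\,f(x)^{\gamma}\,dx
  \;+\; \frac{1}{1+\gamma}\,\log\!\int f(x)^{1+\gamma}\,dx .
\]
% gamma-divergence: the cross entropy recentred at f = g; for gamma > 0
% it is nonnegative and vanishes iff f = g (by Hoelder's inequality),
% and it recovers the Kullback-Leibler divergence as gamma -> 0:
\[
  D_\gamma(g,f) \;=\; d_\gamma(g,f) \;-\; d_\gamma(g,g).
\]
% exponential loss minimized stagewise by AdaBoost, for labels
% y_i in {-1,+1} and a score function F; per the abstract, this loss
% is deeply connected to the GM divergence (gamma = -1):
\[
  L_{\mathrm{exp}}(F) \;=\; \frac{1}{n}\sum_{i=1}^{n}\exp\bigl\{-y_i\,F(x_i)\bigr\}.
\]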
ISBN: 9789819788804
Standard No.: 10.1007/978-981-97-8880-4 (doi)
Subjects--Topical Terms: Biostatistics.
LC Class. No.: QA295
Dewey Class. No.: 519.536
MARC record:
LDR 03386nam a2200337 a 4500
001 1161552
003 DE-He213
005 20250312115238.0
006 m d
007 cr nn 008maaau
008 251029s2025 si s 0 eng d
020 $a 9789819788804 $q (electronic bk.)
020 $a 9789819788798 $q (paper)
024 7 $a 10.1007/978-981-97-8880-4 $2 doi
035 $a 978-981-97-8880-4
040 $a GP $c GP
041 0 $a eng
050 4 $a QA295
072 7 $a PBT $2 bicssc
072 7 $a MAT029000 $2 bisacsh
072 7 $a PBT $2 thema
082 04 $a 519.536 $2 23
090 $a QA295 $b .E32 2025
100 1 $a Eguchi, Shinto. $e author. $3 1308401
245 10 $a Minimum gamma-divergence for regression and classification problems $h [electronic resource] / $c by Shinto Eguchi.
260 $a Singapore : $b Springer Nature Singapore : $b Imprint: Springer, $c 2025.
300 $a viii, 118 p. : $b ill., digital ; $c 24 cm.
490 1 $a JSS research series in statistics, $x 2364-0065
505 0 $a 1. Introduction -- 2. Framework of gamma-divergence -- 2.1. Scale invariance -- 2.2. GM divergence and HM divergence -- 3. Minimum divergence methods for generalized linear models -- 3.1. Bernoulli logistic model -- 3.2. Poisson log-linear model -- 3.3. Poisson point process model -- 4. Minimum divergence methods in machine learning -- 4.1. Multi-class AdaBoost -- 4.2. Boltzmann machine -- 5. Gamma-divergence for real-valued functions -- 6. Discussion.
520 $a This book introduces the gamma-divergence, a measure of distance between probability distributions proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been studied extensively as a tool for robust estimation when the power index γ is positive. It can also be defined when γ is negative, as long as the integrability condition is satisfied; the author therefore considers the gamma-divergence defined on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence for negative γ. In particular, the gamma-divergence with γ equal to -1 is called the geometric-mean (GM) divergence. The book begins with an overview of the gamma-divergence and its properties, and then discusses its applications in areas including machine learning, statistics, and ecology. Bernoulli, categorical, Poisson, negative binomial, and Boltzmann distributions are treated as typical examples. The minimum gamma-divergence method is then applied to regression models that explicitly or implicitly assume these distributions for the dependent variable in generalized linear models. In ensemble learning, AdaBoost is derived from the exponential loss function via a weighted majority vote, and the exponential loss function turns out to be deeply connected to the GM divergence. In the Boltzmann machine, maximum likelihood estimation must rely on approximations such as the mean-field approximation because the partition function is intractable to compute. By considering the GM divergence and the exponential loss, however, it is shown that computing the partition function is unnecessary, so estimation can be carried out without variational inference.
650 24 $a Biostatistics. $3 783654
650 24 $a Linear Models and Regression. $3 1366135
650 24 $a Machine Learning. $3 1137723
650 24 $a Parametric Inference. $3 1366138
650 24 $a Stochastic Modelling in Statistics. $3 1391582
650 14 $a Statistical Theory and Methods. $3 671396
650 0 $a Regression analysis. $3 569541
650 0 $a Divergent series. $3 1110680
710 2 $a SpringerLink (Online service) $3 593884
773 0 $t Springer Nature eBook
830 0 $a JSS research series in statistics. $3 1211828
856 40 $u https://doi.org/10.1007/978-981-97-8880-4
950 $a Mathematics and Statistics (SpringerNature-11649)