Python for probability, statistics, and machine learning
Record Type:
Bibliographic - Language material, printed : Monograph/item
Title/Author:
Python for probability, statistics, and machine learning / by Jose Unpingco.
Author:
Unpingco, Jose.
Publisher:
Cham : Springer International Publishing, 2019.
Physical Description:
xiv, 384 p. : ill., digital ; 24 cm.
Contained By:
Springer eBooks
Subject:
Telecommunication.
Electronic Resource:
https://doi.org/10.1007/978-3-030-18545-9
ISBN:
9783030185459
Unpingco, Jose.
Python for probability, statistics, and machine learning
[electronic resource] / by Jose Unpingco. - 2nd ed. - Cham : Springer International Publishing, 2019. - xiv, 384 p. : ill., digital ; 24 cm.
Introduction -- Part 1 Getting Started with Scientific Python -- Installation and Setup -- Numpy -- Matplotlib -- Ipython -- Jupyter Notebook -- Scipy -- Pandas -- Sympy -- Interfacing with Compiled Libraries -- Integrated Development Environments -- Quick Guide to Performance and Parallel Programming -- Other Resources -- Part 2 Probability -- Introduction -- Projection Methods -- Conditional Expectation as Projection -- Conditional Expectation and Mean Squared Error -- Worked Examples of Conditional Expectation and Mean Square Error Optimization -- Useful Distributions -- Information Entropy -- Moment Generating Functions -- Monte Carlo Sampling Methods -- Useful Inequalities -- Part 3 Statistics -- Python Modules for Statistics -- Types of Convergence -- Estimation Using Maximum Likelihood -- Hypothesis Testing and P-Values -- Confidence Intervals -- Linear Regression -- Maximum A-Posteriori -- Robust Statistics -- Bootstrapping -- Gauss Markov -- Nonparametric Methods -- Survival Analysis -- Part 4 Machine Learning -- Introduction -- Python Machine Learning Modules -- Theory of Learning -- Decision Trees -- Boosting Trees -- Logistic Regression -- Generalized Linear Models -- Regularization -- Support Vector Machines -- Dimensionality Reduction -- Clustering -- Ensemble Methods -- Deep Learning -- Notation -- References -- Index.
This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning, illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python codes provided. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Detailed proofs for certain important results are also provided. Modern Python modules like Pandas, Sympy, Scikit-learn, Tensorflow, and Keras are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This updated edition now includes the Fisher Exact Test and the Mann-Whitney-Wilcoxon Test. A new section on survival analysis has been included, as well as substantial development of Generalized Linear Models. The new deep learning section for image processing includes an in-depth discussion of gradient descent methods that underpin all deep learning algorithms. As with the prior edition, there are new and updated *Programming Tips* that illustrate effective Python modules and methods for scientific programming and machine learning. There are 445 runnable code blocks with corresponding outputs that have been tested for accuracy. Over 158 graphical visualizations (almost all generated using Python) illustrate the concepts that are developed both in code and in mathematics. We also discuss and use key Python modules such as Numpy, Scikit-learn, Sympy, Scipy, Lifelines, CvxPy, Theano, Matplotlib, Pandas, Tensorflow, Statsmodels, and Keras. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge of Python programming.
ISBN: 9783030185459
Standard No.: 10.1007/978-3-030-18545-9 (doi)
Subjects--Topical Terms:
Telecommunication.
LC Class. No.: QA76.73.P98 / U57 2019
Dewey Class. No.: 621.382
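As an aside, the ISBN recorded above can be verified with the standard ISBN-13 check-digit rule: weight the first twelve digits alternately by 1 and 3; the thirteenth digit must bring the total to a multiple of 10. A minimal Python sketch follows (illustration only, not part of the catalog record; the function name is a placeholder):

    def isbn13_is_valid(isbn: str) -> bool:
        # Keep digits only, so hyphenated forms such as 978-3-030-18545-9 also work.
        digits = [int(ch) for ch in isbn if ch.isdigit()]
        if len(digits) != 13:
            return False
        # Weight the first 12 digits alternately by 1 and 3; the 13th is the check digit.
        total = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits[:12]))
        return (10 - total % 10) % 10 == digits[12]

    print(isbn13_is_valid("9783030185459"))  # True for this record's ISBN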
LDR  04360nam a2200337 a 4500
001  941196
003  DE-He213
005  20190629141136.0
006  m d
007  cr nn 008maaau
008  200417s2019 gw s 0 eng d
020  $a 9783030185459 $q (electronic bk.)
020  $a 9783030185442 $q (paper)
024 7 $a 10.1007/978-3-030-18545-9 $2 doi
035  $a 978-3-030-18545-9
040  $a GP $c GP
041 0 $a eng
050 4 $a QA76.73.P98 $b U57 2019
072 7 $a TJK $2 bicssc
072 7 $a TEC041000 $2 bisacsh
072 7 $a TJK $2 thema
082 0 4 $a 621.382 $2 23
090  $a QA76.73.P98 $b U58 2019
100 1 $a Unpingco, Jose. $3 1021098
245 1 0 $a Python for probability, statistics, and machine learning $h [electronic resource] / $c by Jose Unpingco.
250  $a 2nd ed.
260  $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2019.
300  $a xiv, 384 p. : $b ill., digital ; $c 24 cm.
505 0 $a Introduction -- Part 1 Getting Started with Scientific Python -- Installation and Setup -- Numpy -- Matplotlib -- Ipython -- Jupyter Notebook -- Scipy -- Pandas -- Sympy -- Interfacing with Compiled Libraries -- Integrated Development Environments -- Quick Guide to Performance and Parallel Programming -- Other Resources -- Part 2 Probability -- Introduction -- Projection Methods -- Conditional Expectation as Projection -- Conditional Expectation and Mean Squared Error -- Worked Examples of Conditional Expectation and Mean Square Error Optimization -- Useful Distributions -- Information Entropy -- Moment Generating Functions -- Monte Carlo Sampling Methods -- Useful Inequalities -- Part 3 Statistics -- Python Modules for Statistics -- Types of Convergence -- Estimation Using Maximum Likelihood -- Hypothesis Testing and P-Values -- Confidence Intervals -- Linear Regression -- Maximum A-Posteriori -- Robust Statistics -- Bootstrapping -- Gauss Markov -- Nonparametric Methods -- Survival Analysis -- Part 4 Machine Learning -- Introduction -- Python Machine Learning Modules -- Theory of Learning -- Decision Trees -- Boosting Trees -- Logistic Regression -- Generalized Linear Models -- Regularization -- Support Vector Machines -- Dimensionality Reduction -- Clustering -- Ensemble Methods -- Deep Learning -- Notation -- References -- Index.
520  $a This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning, illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python codes provided. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Detailed proofs for certain important results are also provided. Modern Python modules like Pandas, Sympy, Scikit-learn, Tensorflow, and Keras are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This updated edition now includes the Fisher Exact Test and the Mann-Whitney-Wilcoxon Test. A new section on survival analysis has been included, as well as substantial development of Generalized Linear Models. The new deep learning section for image processing includes an in-depth discussion of gradient descent methods that underpin all deep learning algorithms. As with the prior edition, there are new and updated *Programming Tips* that illustrate effective Python modules and methods for scientific programming and machine learning. There are 445 runnable code blocks with corresponding outputs that have been tested for accuracy. Over 158 graphical visualizations (almost all generated using Python) illustrate the concepts that are developed both in code and in mathematics. We also discuss and use key Python modules such as Numpy, Scikit-learn, Sympy, Scipy, Lifelines, CvxPy, Theano, Matplotlib, Pandas, Tensorflow, Statsmodels, and Keras. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge of Python programming.
650 0 $a Telecommunication. $3 568341
650 0 $a Computer science. $3 573171
650 0 $a Engineering mathematics. $3 562757
650 0 $a Statistics. $3 556824
650 0 $a Data mining. $3 528622
650 1 4 $a Communications Engineering, Networks. $3 669809
650 2 4 $a Probability and Statistics in Computer Science. $3 669886
650 2 4 $a Mathematical and Computational Engineering. $3 1139415
650 2 4 $a Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences. $3 782247
650 2 4 $a Data Mining and Knowledge Discovery. $3 677765
710 2 $a SpringerLink (Online service) $3 593884
773 0 $t Springer eBooks
856 4 0 $u https://doi.org/10.1007/978-3-030-18545-9
950  $a Engineering (Springer-11647)
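For anyone who wants to reuse the MARC fields above programmatically, the sketch below (plain Python, standard library only) parses a listing laid out exactly as printed here, one field per line in the form "tag [indicators] $x value $y value ...", and pulls out a few common data elements. The file name record.txt and the helper names are placeholders for this illustration; this is not the catalog system's own export format.

    import re

    def parse_marc_lines(lines):
        # One field per line: a 3-character tag (or LDR), optional indicators,
        # then zero or more "$x value" subfields.
        fields = []
        for line in lines:
            line = line.strip()
            if not line:
                continue
            tag, rest = line[:3], line[3:]
            subfields = [(code, value.strip())
                         for code, value in re.findall(r"\$(\w)\s*([^$]*)", rest)]
            fields.append((tag, subfields))
        return fields

    def first_value(fields, tag, code):
        # Return the first occurrence of subfield `code` in any field with this tag.
        for t, subs in fields:
            if t == tag:
                for c, v in subs:
                    if c == code:
                        return v
        return None

    with open("record.txt") as fh:          # hypothetical file holding the listing above
        record = parse_marc_lines(fh)

    print(first_value(record, "245", "a"))  # title proper
    print(first_value(record, "020", "a"))  # first ISBN
    print(first_value(record, "856", "u"))  # DOI link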