Python高維數(shù)據(jù)分析 版權(quán)信息
- ISBN:9787560655772
- 條形碼:9787560655772 ; 978-7-5606-5577-2
- 裝幀:一般膠版紙
- 冊數(shù):暫無
- 重量:暫無
- 所屬分類:>
Python高維數(shù)據(jù)分析 內(nèi)容簡介
本書介紹了矩陣計算的基本方法, 并從特征值分解和奇異值分解出發(fā), 給出一個超定矩陣的*小二乘法問題的模型建立、證明和一般求解方法, 并引出欠秩的多元線性方程組的求解方法問題, 然后介紹了兩種有損的降維方法。
Python高維數(shù)據(jù)分析 目錄
Chapter 1 Basis of Matrix Calculation
1.1 Fundamental Concepts
1.1.1 Notation
1.1.2 “Bigger-Block” Interpretations of Matrix Multiplication
1.1.3 Fundamental Linear Algebra
1.1.4 Four Fundamental Subspaces of a Matrix
1.1.5 Vector Norms
1.1.6 Determinants
1.1.7 Properties of Determinants
1.2 The Most Basic Matrix Decomposition
1.2.1 Gaussian Elimination
1.2.2 The LU Decomposition
1.2.3 The LDM Factorization
1.2.4 The LDL Decomposition for Symmetric Matrices
1.2.5 Cholesky Decomposition
1.2.6 Applications and Examples of the Cholesky Decomposition
1.2.7 Eigendecomposition
1.2.8 Matrix Norms
1.2.9 Covariance Matrices
1.3 Singular Value Decomposition (SVD)
1.3.1 Orthogonalization
1.3.2 Existence Proof of the SVD
1.3.3 Partitioning the SVD
1.3.4 Properties and Interpretations of the SVD
1.3.5 Relationship between SVD and ED
1.3.6 Ellipsoidal Interpretation of the SVD
1.3.7 An Interesting Theorem
1.4 The Quadratic Form
1.4.1 Quadratic Form Theory
1.4.2 The Gaussian Multi-Variate Probability Density Function
1.4.3 The Rayleigh Quotient
Chapter 2 The Solution of Least Squares Problems
2.1 Linear Least Squares Estimation
2.1.1 Example: Autoregressive Modelling
2.1.2 The Least-Squares Solution
2.1.3 Interpretation of the Normal Equations
2.1.4 Properties of the LS Estimate
2.1.5 Linear Least-Squares Estimation and the Cramér-Rao Lower Bound
2.2 A Generalized “Pseudo-Inverse” Approach to Solving the Least-Squares Problem
2.2.1 Least Squares Solution Using the SVD
2.2.2 Interpretation of the Pseudo-Inverse
Chapter 3 Principal Component Analysis
3.1 Introductory Example
3.2 Theory
3.2.1 Taking Linear Combinations
3.2.2 Explained Variation
3.2.3 PCA as a Model
3.2.4 Taking More Components
3.3 History of PCA
3.4 Practical Aspects
3.4.1 Preprocessing
3.4.2 Choosing the Number of Components
3.4.3 When Using PCA for Other Purposes
3.4.4 Detecting Outliers
References
3.5 Sklearn PCA
3.5.1 Source Code
3.5.2 Examples
3.6 Principal Component Regression
3.6.1 Source Code
3.6.2 K-Fold Cross-Validation
3.6.3 Examples
3.7 Subspace Methods for Dynamic Model Estimation in PAT Applications
3.7.1 Introduction
3.7.2 Theory
3.7.3 State Space Models in Chemometrics
3.7.4 Milk Coagulation Monitoring
3.7.5 State Space Based Monitoring
3.7.6 Results
3.7.7 Concluding Remarks
3.7.8 Appendix
References
Chapter 4 Partial Least Squares Analysis
4.1 Basic Concept
4.1.1 Partial Least Squares
4.1.2 Form of Partial Least Squares
4.1.3 PLS Regression
4.1.4 Statistic
Reference
4.2 NIPALS and SIMPLS Algorithm
4.2.1 NIPALS
4.2.2 SIMPLS
References
4.3 Programming Method of Standard Partial Least Squares
4.3.1 Cross-Validation
4.3.2 Procedure of NIPALS
4.4 Example Application
4.4.1 Demo of PLS
4.4.2 Corn Dataset
4.4.3 Wheat Dataset
4.4.4 Pharmaceutical Tablet Dataset
4.5 Stack Partial Least Squares
4.5.1 Introduction
4.5.2 Theory of Stack Partial Least Squares
4.5.3 Demo of SPLS
4.5.4 Experiments
References
Chapter 5 Regularization
5.1 Regularization
5.1.1 Classification
5.1.2 Tikhonov Regularization
5.1.3 Regularizers for Sparsity
5.1.4 Other Uses of Regularization in Statistics and Machine Learning
5.2 Ridge Regression: Biased Estimation for Nonorthogonal Problems
5.2.1 Properties of Best Linear Unbiased Estimation
5.2.2 Ridge Regression
5.2.3 The Ridge Trace
5.2.4 Mean Square Error Properties of Ridge Regression
5.2.5 A General Form of Ridge Regression
5.2.6 Relation to Other Work in Regression
5.2.7 Selecting a Better Estimate of k
References
5.3 Lasso
5.3.1 Introduction
5.3.2 Theory of the Lasso
References
5.4 The Example of Ridge Regression and Lasso Regression
5.4.1 Example
5.4.2 Practical Example
5.5 Sparse PCA
5.5.1 Introduction
5.5.2 Motivation and Method Details
5.5.3 SPCA for p ≫ n and Gene Expression Arrays
5.5.4 Demo of SPCA
References
Chapter 6 Transfer Method
6.1 Calibration Transfer of Spectral Models [1]
6.1.1 Introduction
6.1.2 Calibration Transfer Setting
6.1.3 Related Work
6.1.4 New or Adapted Methods
6.1.5 Standard-free Alternatives to Methods Requiring Transfer Standards
References
6.2 PLS Subspace Based Calibration Transfer for NIR Quantitative Analysis
6.2.1 Calibration Transfer Method
6.2.2 Experimental
6.2.3 Results and Discussion
6.2.4 Conclusion
References
6.3 Calibration Transfer Based on Affine Invariance for NIR without Standard Samples
6.3.1 Theory
6.3.2 Experimental
6.3.3 Results and Discussion
6.3.4 Conclusions