[Machine Learning] 12 Latent linear models
Chapter contents
12 Latent linear models
12.1 Factor analysis
12.1.1 FA is a low rank parameterization of an MVN
12.1.2 Inference of the latent factors
12.1.3 Unidentifiability
12.1.4 Mixtures of factor analysers
12.1.5 EM for factor analysis models
12.1.6 Fitting FA models with missing data
12.2 Principal components analysis (PCA)
12.2.1 Classical PCA: statement of the theorem
12.2.2 Proof *
12.2.3 Singular value decomposition (SVD)
12.2.4 Probabilistic PCA
12.2.5 EM algorithm for PCA
12.3 Choosing the number of latent dimensions
12.3.1 Model selection for FA/PPCA
12.3.2 Model selection for PCA
12.4 PCA for categorical data
12.5 PCA for paired and multi-view data
12.5.1 Supervised PCA (latent factor regression)
12.5.2 Partial least squares
12.5.3 Canonical correlation analysis
12.6 Independent Component Analysis (ICA)
12.6.1 Maximum likelihood estimation
12.6.2 The FastICA algorithm
12.6.3 Using EM
12.6.4 Other estimation principles *
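As a quick illustration of the chapter's central construction, classical PCA computed via the SVD (Sections 12.2.1 and 12.2.3), here is a minimal NumPy sketch. The function name pca_svd, the synthetic data, and all variable names are illustrative assumptions, not code from the book or its repository.

```python
import numpy as np

def pca_svd(X, L):
    """Minimal sketch: project N x D data onto the top-L principal components via the SVD."""
    mu = X.mean(axis=0)                 # center the data
    Xc = X - mu
    # Thin SVD: Xc = U @ diag(s) @ Vt; rows of Vt are the principal directions
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:L].T                        # D x L basis of principal directions
    Z = Xc @ W                          # N x L latent scores
    Xhat = Z @ W.T + mu                 # rank-L reconstruction of the data
    var_explained = (s[:L] ** 2) / (s ** 2).sum()
    return W, Z, Xhat, var_explained

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic data with a 2-dimensional latent structure embedded in 5 dimensions (illustrative)
    Z_true = rng.normal(size=(500, 2))
    A = rng.normal(size=(2, 5))
    X = Z_true @ A + 0.1 * rng.normal(size=(500, 5))
    W, Z, Xhat, ve = pca_svd(X, L=2)
    print("fraction of variance explained:", ve.sum())
```

For the probabilistic variants in the chapter (factor analysis in 12.1, probabilistic PCA in 12.2.4), this SVD solution is recovered as the limit of the maximum likelihood estimate when the observation noise variance goes to zero.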
GitHub download link: https://github.com/916718212/Machine-Learning-A-Probabilistic-Perspective-.git