[Machine Learning] 11 Mixture models and the EM algorithm
Chapter contents
11 Mixture models and the EM algorithm
11.1 Latent variable models
11.2 Mixture models
11.2.1 Mixtures of Gaussians
11.2.2 Mixture of multinoullis
11.2.3 Using mixture models for clustering
11.2.4 Mixtures of experts
11.3 Parameter estimation for mixture models
11.3.1 Unidentifiability
11.3.2 Computing a MAP estimate is non-convex
11.4 The EM algorithm
11.4.1 Basic idea
11.4.2 EM for GMMs
11.4.3 EM for mixture of experts
11.4.4 EM for DGMs with hidden variables
11.4.5 EM for the Student distribution *
11.4.6 EM for probit regression *
11.4.7 Theoretical basis for EM *
11.4.8 Online EM
11.4.9 Other EM variants *
11.5 Model selection for latent variable models
11.5.1 Model selection for probabilistic models
11.5.2 Model selection for non-probabilistic methods
11.6 Fitting models with missing data
11.6.1 EM for the MLE of an MVN with missing data
GitHub download link: https://github.com/916718212/Machine-Learning-A-Probabilistic-Perspective-.git