5. The EM Algorithm
Shu Kay Ng, Thriyambakam Krishnan, Geoffrey J. McLachlan
Subsections
5.1 Introduction
    5.1.1 Maximum Likelihood Estimation
    5.1.2 EM Algorithm: Incomplete-Data Structure
    5.1.3 Overview of the Chapter
5.2 Basic Theory of the EM Algorithm
    5.2.1 The E- and M-Steps
    5.2.2 Generalized EM Algorithm
    5.2.3 Convergence of the EM Algorithm
    5.2.4 Rate of Convergence of the EM Algorithm
    5.2.5 Properties of the EM Algorithm
5.3 Examples of the EM Algorithm
    5.3.1 Example 1: Normal Mixtures
    5.3.2 Example 2: Censored Failure-Time Data
    5.3.3 Example 3: Nonapplicability of EM Algorithm
    5.3.4 Starting Values for EM Algorithm
    5.3.5 Provision of Standard Errors
5.4 Variations on the EM Algorithm
    5.4.1 Complicated E-Step
        5.4.1.1 Example 4: Generalized Linear Mixed Models
    5.4.2 Complicated M-Step
        5.4.2.1 ECM and Multicycle ECM Algorithms
        5.4.2.2 Example 5: Single-Factor Analysis Model
    5.4.3 Speeding up Convergence
        5.4.3.1 ECME, AECM, and PX-EM Algorithms
        5.4.3.2 Extensions to the EM for Data Mining Applications
5.5 Miscellaneous Topics on the EM Algorithm
    5.5.1 EM Algorithm for MAP Estimation
    5.5.2 Iterative Simulation Algorithms
    5.5.3 Further Applications of the EM Algorithm
        5.5.3.1 Bioinformatics: Mixture of Factor Analyzers
        5.5.3.2 Image Analysis: Hidden Markov Models