Information geometry and maximum likelihood criteria

Download: PDF.

“Information geometry and maximum likelihood criteria” by W. Byrne. In Conference on Information Sciences and Systems, (Princeton, NJ), 1996.

Abstract

This paper presents a brief comparison of two information geometries as they are used to describe the EM algorithm for maximum likelihood estimation from incomplete data. The Alternating Minimization framework based on the I-geometry developed by Csiszár is presented first, followed by the em algorithm of Amari. After a comparison of these algorithms, a variation in the likelihood criterion is discussed. The EM algorithm is usually formulated to improve the marginal likelihood criterion, but closely related algorithms exist that are intended to maximize different likelihood criteria. The 1-Best criterion, for example, leads to the Viterbi training algorithm used in Hidden Markov Modeling. This criterion has an information-geometric description that results from a minor modification of the marginal likelihood formulation.
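The contrast between the marginal likelihood and 1-Best criteria can be illustrated with a small numerical sketch (not taken from the paper): classic EM updates parameters using posterior-weighted statistics over the hidden variables, while Viterbi-style 1-Best training replaces each posterior by a hard assignment to its single most likely value. The two-component Gaussian mixture and all names below are illustrative assumptions.

```python
# Hypothetical sketch: EM (marginal likelihood) vs. Viterbi / 1-Best training
# for a 2-component Gaussian mixture with unit variances and equal weights.
import math
import random

def em_step(data, means, soft=True):
    """One parameter update.

    soft=True  -> classic EM: means are re-estimated with posterior weights.
    soft=False -> 1-Best (Viterbi) training: each point is assigned wholly
                  to its most likely component before re-estimation.
    """
    resp = []
    for x in data:
        # Unnormalized component likelihoods under unit-variance Gaussians.
        lik = [math.exp(-0.5 * (x - m) ** 2) for m in means]
        z = sum(lik)
        post = [l / z for l in lik]
        if soft:
            resp.append(post)                      # full posterior
        else:
            k = max(range(2), key=lambda j: post[j])
            resp.append([1.0 if j == k else 0.0 for j in range(2)])  # 1-Best
    new_means = []
    for j in range(2):
        w = sum(r[j] for r in resp)
        if w > 0:
            new_means.append(sum(r[j] * x for r, x in zip(resp, data)) / w)
        else:
            new_means.append(means[j])             # no points assigned: keep
    return new_means

random.seed(0)
data = ([random.gauss(-2.0, 1.0) for _ in range(200)]
        + [random.gauss(2.0, 1.0) for _ in range(200)])

em_means, vit_means = [-1.0, 1.0], [-1.0, 1.0]
for _ in range(20):
    em_means = em_step(data, em_means, soft=True)    # marginal likelihood
    vit_means = em_step(data, vit_means, soft=False)  # 1-Best criterion
```

Both procedures recover means near the true values of -2 and 2 on this well-separated data; they differ only in whether the hidden assignment is summed over or maximized over, which is the modification of the likelihood criterion the abstract refers to.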

BibTeX entry:

@inproceedings{byrne:igmlc,
   author = {W. Byrne},
   title = {Information geometry and maximum likelihood criteria},
   booktitle = {Conference on Information Sciences and Systems},
   pages = {(6 pages)},
   address = {Princeton, NJ},
   year = {1996}
}
