Abstract for waterhouse_hme

In Proc. 1994 IEEE Workshop on Neural Networks for Signal Processing, pp. 177-186.

CLASSIFICATION USING HIERARCHICAL MIXTURES OF EXPERTS

Steve Waterhouse and Tony Robinson

September 1994

There has recently been widespread interest in the use of multiple models for classification and regression in the statistics and neural networks communities. The Hierarchical Mixture of Experts (HME) has been successful in a number of regression problems, yielding significantly faster training through the use of the Expectation Maximisation algorithm. In this paper we extend the HME to classification and report results for three common classification benchmarks: Exclusive-Or, N-input Parity and Two Spirals.
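
For readers unfamiliar with the architecture: the HME of Jordan and Jacobs arranges expert networks at the leaves of a tree, with gating networks at the internal nodes mixing their outputs, so that for a two-level tree the class posterior is P(y|x) = sum_i g_i(x) sum_j g_(j|i)(x) P_ij(y|x). The sketch below illustrates this forward pass for classification, assuming softmax gating networks and softmax experts; the 2-branch, 2-leaf layout, weight shapes and all names are illustrative assumptions, not the exact configuration used in the paper.

    # Minimal sketch of a depth-2 HME forward pass for classification.
    # Assumes softmax gates and softmax experts; layout is illustrative.
    import numpy as np

    def softmax(z):
        z = z - np.max(z)            # stabilise the exponentials
        e = np.exp(z)
        return e / e.sum()

    def hme_predict(x, top_gate_W, sub_gate_W, expert_W):
        """Class posterior P(y | x) for a two-level HME.

        top_gate_W : (branches, d)                   top-level gating weights
        sub_gate_W : (branches, leaves, d)           second-level gating weights
        expert_W   : (branches, leaves, classes, d)  expert weights
        """
        g_top = softmax(top_gate_W @ x)              # P(branch | x)
        p = np.zeros(expert_W.shape[2])
        for i in range(expert_W.shape[0]):
            g_sub = softmax(sub_gate_W[i] @ x)       # P(leaf | branch i, x)
            for j in range(expert_W.shape[1]):
                p_ij = softmax(expert_W[i, j] @ x)   # expert's class posterior
                p += g_top[i] * g_sub[j] * p_ij
        return p                                     # mixture over the tree

    # Usage: a 2-class problem with a 2-branch, 2-leaf tree and random weights.
    rng = np.random.default_rng(0)
    d, branches, leaves, classes = 3, 2, 2, 2        # d includes a bias term
    x = np.array([1.0, 0.0, 1.0])                    # input with bias appended
    p = hme_predict(x,
                    rng.normal(size=(branches, d)),
                    rng.normal(size=(branches, leaves, d)),
                    rng.normal(size=(branches, leaves, classes, d)))
    print(p, p.sum())                                # a valid distribution

Because each gate outputs a softmax distribution, the mixture weights over the leaves sum to one and the output is itself a valid class posterior; in the paper the parameters are fitted with the EM algorithm rather than the random weights used here.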


(ftp:) waterhouse_hme.ps.Z | (http:) waterhouse_hme.ps.Z
PDF (automatically generated from original PostScript document - may be badly aliased on screen):
  (ftp:) waterhouse_hme.pdf | (http:) waterhouse_hme.pdf

If you have difficulty with files ending '.gz' or '.Z', which are gzip- or compress-compressed, then you may be able to find tools to uncompress them at the gzip web site.

If you have difficulty viewing PostScript files (ending '.ps', '.ps.gz' or '.ps.Z'), then you may be able to find tools to view them at the gsview web site.
