Abstract for burrows_nnsp94

Proc. IEEE Workshop on Neural Networks for Signal Processing, 1994

THE USE OF RECURRENT NEURAL NETWORKS FOR CLASSIFICATION

Tina Burrows and Mahesan Niranjan

1994

Recurrent neural networks are widely used for context-dependent pattern classification tasks such as speech recognition. The feedback in these networks is generally claimed to integrate contextual information into the classification of the current input feature vector. This paper analyses the use of recurrent neural networks for such applications. We show that the contribution of the feedback connections is primarily a smoothing mechanism, achieved by moving the class boundary of an equivalent feedforward network classifier. We also show that when the sigmoidal hidden nodes of the network operate close to saturation, switching from one class to the next is delayed, and within a class the network decisions are insensitive to the order of presentation of the input vectors.
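
The boundary-shift and delayed-switching behaviour summarised above can be illustrated with a toy sketch. The code below is not the network studied in the paper; it assumes a single sigmoidal node whose previous output is fed back (centred about 0.5 so the feedback acts as hysteresis), with hand-picked hypothetical weights. Setting the feedback weight to zero recovers a plain feedforward classifier; a positive feedback weight smooths the output sequence and delays the switch between classes, consistent with the interpretation of feedback as a boundary-moving smoother.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def classify_sequence(xs, w, a):
    """Hypothetical single recurrent node: y_t = sigmoid(w*x_t + a*(y_{t-1} - 0.5)).

    a = 0 gives a plain feedforward classifier; a > 0 feeds the previous
    (centred) decision back, smoothing the outputs and delaying class switches.
    """
    y_prev, ys = 0.5, []          # start from the neutral state
    for x in xs:
        y = sigmoid(w * x + a * (y_prev - 0.5))
        ys.append(y)
        y_prev = y
    return np.array(ys)

# Toy 1-D input sequence that drifts from class 0 to class 1.
xs = np.array([-2.0, -1.5, -1.0, 0.5, 1.0, 1.5, 2.0])

print("feedforward (a=0):", np.round(classify_sequence(xs, w=2.0, a=0.0), 2))
print("recurrent   (a=4):", np.round(classify_sequence(xs, w=2.0, a=4.0), 2))
# The feedforward outputs cross 0.5 at x = 0.5, the recurrent outputs only at
# x = 1.0: the feedback smooths the transition and effectively shifts the
# class boundary of the equivalent feedforward classifier.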


  (ftp:) burrows_nnsp94.ps.Z | (http:) burrows_nnsp94.ps.Z
PDF (automatically generated from the original PostScript document; may be badly aliased on screen):
  (ftp:) burrows_nnsp94.pdf | (http:) burrows_nnsp94.pdf


If you have difficulty viewing files that are in PostScript, (ending '.ps' or '.ps.gz'), then you may be able to find tools to view them at the gsview web site.

We have attempted to provide automatically generated PDF copies of documents for which only PostScript versions have previously been available. These are clearly marked in the database. Due to the nature of the automatic conversion process, they are likely to be badly aliased when viewed at default resolution on screen by acroread.