Unsupervised Learning in Recurrent Neural Networks

M. Klapper-Rybicka, N. N. Schraudolph, and J. Schmidhuber. Unsupervised Learning in Recurrent Neural Networks. In Proc. Intl. Conf. Artificial Neural Networks (ICANN), Vienna, Austria, pp. 674–681. Springer Verlag, Berlin, 2001.

Download

pdf (186.8 kB)   djvu (89.1 kB)   ps.gz (74.8 kB)

Abstract

While much work has been done on unsupervised learning in feedforward neural network architectures, its potential with (theoretically more powerful) recurrent networks and time-varying inputs has rarely been explored. Here we train Long Short-Term Memory (LSTM) recurrent networks to maximize two information-theoretic objectives for unsupervised learning: Binary Information Gain Optimization and Nonparametric Entropy Optimization. LSTM learns to discriminate different types of temporal sequences and group them according to a variety of features.
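
The sketch below is a minimal, hypothetical illustration (not the authors' code) of the second objective: a leave-one-out Parzen-window entropy estimate of an LSTM's final outputs is optimized by gradient ascent. The library (PyTorch), layer sizes, kernel width, and the random placeholder data are all assumptions; the sign of the loss can be flipped to minimize rather than maximize the entropy estimate, and Binary Information Gain Optimization is omitted.

    import math
    import torch
    import torch.nn as nn

    def parzen_entropy(y, sigma=0.5):
        """Leave-one-out Parzen-window estimate of the entropy of outputs y (N x d)."""
        n, d = y.shape
        sq_dists = (y.unsqueeze(1) - y.unsqueeze(0)).pow(2).sum(-1)   # pairwise squared distances
        log_k = -sq_dists / (2 * sigma ** 2) \
                - d * math.log(sigma * math.sqrt(2 * math.pi))        # log Gaussian kernel values
        mask = torch.eye(n, dtype=torch.bool)
        log_k = log_k.masked_fill(mask, float('-inf'))                # exclude each point from its own estimate
        log_p = torch.logsumexp(log_k, dim=1) - math.log(n - 1)       # log density at each y_i
        return -log_p.mean()                                          # entropy estimate

    lstm = nn.LSTM(input_size=1, hidden_size=8, batch_first=True)
    head = nn.Linear(8, 2)
    opt = torch.optim.SGD(list(lstm.parameters()) + list(head.parameters()), lr=0.01)

    sequences = torch.randn(32, 50, 1)          # placeholder batch of time-varying input sequences
    for _ in range(100):
        _, (h_n, _) = lstm(sequences)           # final hidden state summarizes each sequence
        y = torch.tanh(head(h_n.squeeze(0)))    # low-dimensional code per sequence
        loss = -parzen_entropy(y)               # gradient ascent on the entropy estimate
        opt.zero_grad()
        loss.backward()
        opt.step()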

BibTeX Entry

@inproceedings{KlaSchSch01,
     author = {Magdalena Klapper-Rybicka and Nicol N. Schraudolph
               and J\"urgen Schmid\-huber},
      title = {\href{http://nic.schraudolph.org/pubs/KlaSchSch01.pdf}{
               Unsupervised Learning in Recurrent Neural Networks}},
      pages = {674--681},
     editor = {Georg Dorffner and Horst Bischof and Kurt Hornik},
  booktitle =  icann,
    address = {Vienna, Austria},
     volume =  2130,
     series = {\href{http://www.springer.de/comp/lncs/}{
               Lecture Notes in Computer Science}},
  publisher = {\href{http://www.springer.de/}{Springer Verlag}, Berlin},
       year =  2001,
   b2h_type = {Top Conferences},
  b2h_topic = {>Entropy Optimization},
   abstract = {
    While much work has been done on unsupervised learning in feedforward
    neural network architectures, its potential with (theoretically more
    powerful) recurrent networks and time-varying inputs has rarely been
    explored.  Here we train Long Short-Term Memory (LSTM) recurrent networks
    to maximize two information-theoretic objectives for unsupervised
    learning: \href{http://nic.schraudolph.org/bib2html/b2hd-nips92.html}{
    Binary Information Gain Optimization} and
    \href{http://nic.schraudolph.org/bib2html/b2hd-emma}{
    Nonparametric Entropy Optimization}.  LSTM learns to discriminate
    different types of temporal sequences and group them according to a
    variety of features.
}}
