Sequential learning and regularization in variational recurrent autoencoder

Jen-Tzung Chien, Chih Jung Tsai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Latent variable models based on the variational autoencoder (VAE) are influential in machine learning for signal processing. However, the VAE suffers from posterior collapse in sequential learning, where the variational posterior easily collapses to the prior, a standard Gaussian. The latent semantics are then neglected during optimization, so the recurrent decoder generates uninformative or repetitive sequence data. To capture sufficient latent semantics from sequence data, this study simultaneously applies an amortized regularization to the encoder, extends the latent-variable prior to a Gaussian mixture, and adds a skip connection to the decoder. The noise-robust prior, learned from the amortized encoder, is likely to be aware of temporal features. A variational prior based on this amortized mixture density is formulated in the implementation of a variational recurrent autoencoder for sequence reconstruction and representation. Owing to the skip connection, sequence samples are continuously predicted by the decoder with contextual precision at each time step. Experiments on language modeling and sentiment classification show that the proposed method mitigates posterior collapse and learns meaningful latent features that improve inference and generation for semantic representation.
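The paper's exact architecture is not reproduced here, but one ingredient can be sketched: replacing the standard-Gaussian prior (whose KL term drives posterior collapse) with a Gaussian-mixture prior. Below is a minimal NumPy illustration, assuming a diagonal Gaussian posterior; the closed-form KL to N(0, I) is compared with a Monte-Carlo estimate of the KL to a mixture prior, for which no closed form exists. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_standard_normal(mu, logvar):
    """Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ) for a diagonal Gaussian posterior."""
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

def log_mixture_prior(z, means, logvars, weights):
    """log p(z) under a diagonal Gaussian-mixture prior (parameters are illustrative)."""
    log_comps = [
        np.log(w) - 0.5 * np.sum(lv + np.log(2 * np.pi) + (z - m) ** 2 / np.exp(lv))
        for m, lv, w in zip(means, logvars, weights)
    ]
    return np.logaddexp.reduce(log_comps)

def kl_mixture_mc(mu, logvar, means, logvars, weights, n_samples=2000):
    """Monte-Carlo estimate of KL( q(z|x) || mixture prior ); no closed form exists."""
    std = np.exp(0.5 * logvar)
    total = 0.0
    for _ in range(n_samples):
        z = mu + std * rng.standard_normal(mu.shape)  # reparameterization trick
        log_q = -0.5 * np.sum(logvar + np.log(2 * np.pi) + (z - mu) ** 2 / np.exp(logvar))
        total += log_q - log_mixture_prior(z, means, logvars, weights)
    return total / n_samples

# Hypothetical 2-D posterior q(z|x) and a two-component mixture prior
mu, logvar = np.array([0.5, -0.3]), np.array([-0.2, 0.1])
means = [np.array([1.0, 1.0]), np.array([-1.0, -1.0])]
logvars = [np.zeros(2), np.zeros(2)]
weights = [0.5, 0.5]

print("KL to N(0, I):      %.3f" % kl_standard_normal(mu, logvar))
print("KL to mixture (MC): %.3f" % kl_mixture_mc(mu, logvar, means, logvars, weights))
```

With a learned (e.g. amortized) mixture prior, the posterior need not collapse to a single fixed Gaussian to make the KL term small, which is the intuition behind the paper's regularization.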

Original language: English
Title of host publication: 28th European Signal Processing Conference, EUSIPCO 2020 - Proceedings
Publisher: European Signal Processing Conference, EUSIPCO
Number of pages: 5
ISBN (Electronic): 9789082797053
State: Published - 24 Jan 2021
Event: 28th European Signal Processing Conference, EUSIPCO 2020 - Amsterdam, Netherlands
Duration: 24 Aug 2020 - 28 Aug 2020

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491


Conference: 28th European Signal Processing Conference, EUSIPCO 2020


Keywords

  • Bayesian learning
  • Language model
  • Recurrent neural network
  • Sequential learning
  • Variational autoencoder

