Latent-variable architecture enables data augmentation through hidden state modulation

Data augmentation seeks to derive additional training samples from existing data. The latent-variable architecture takes a different route: while predicting from an actual input, it fuzzily modulates hidden states in the latent space in order to stochastically generate alternative outputs. These output alternatives can provide a framework for expressing uncertainty in contrastive learning. Unlike GANs, whose generators synthesize samples from noise, the latent-variable architecture works from real dataset input. This is analogous to how dreaming enables a form of data augmentation.
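
To make the idea concrete, here is a minimal sketch in PyTorch of perturbing a hidden state to produce stochastic output alternatives from one real input. The model, layer sizes, `noise_scale`, and the use of Gaussian noise are all illustrative assumptions, not details from this note; the spread across alternatives is only a crude proxy for uncertainty.

```python
import torch
import torch.nn as nn

class LatentPerturbationModel(nn.Module):
    """Hypothetical encoder-decoder that fuzzily modulates its latent
    state to generate output alternatives from one real input."""

    def __init__(self, in_dim: int = 32, latent_dim: int = 16, out_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, latent_dim), nn.Tanh())
        self.decoder = nn.Linear(latent_dim, out_dim)

    def forward(self, x: torch.Tensor, n_alternatives: int = 4,
                noise_scale: float = 0.1) -> torch.Tensor:
        z = self.encoder(x)                        # hidden state from a real dataset input
        z = z.unsqueeze(0).expand(n_alternatives, *z.shape)
        z = z + noise_scale * torch.randn_like(z)  # fuzzily modulate the latent state
        return self.decoder(z)                     # one output per perturbed latent

model = LatentPerturbationModel()
x = torch.randn(8, 32)            # a batch standing in for actual inputs
alternatives = model(x)           # shape: (n_alternatives, batch, out_dim)
# Disagreement across the alternatives axis expresses predictive uncertainty.
print(alternatives.std(dim=0).mean())
```

In this sketch, `noise_scale` controls how far the hidden states are pushed, and therefore how diverse the generated alternatives are; the key contrast with a GAN is that the latent state is derived from a real input rather than sampled from pure noise.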

Resources

Backlinks