Early Inference in Energy-Based Models Approximates Back-Propagation
Yoshua Bengio and Asja Fischer
arXiv e-Print archive, 2015
Keywords: cs.LG
First published: 2015/10/09

Abstract: We show that Langevin MCMC inference in an energy-based model with latent
variables has the property that the early steps of inference, starting from a
stationary point, correspond to propagating error gradients into internal
layers, similarly to back-propagation. The error that is back-propagated is
with respect to visible units that have received an outside driving force
pushing them away from the stationary point. Back-propagated error gradients
correspond to temporal derivatives of the activation of hidden units. This
observation could be an element of a theory for explaining how brains perform
credit assignment in deep hierarchies as efficiently as back-propagation does.
In this theory, the continuous-valued latent variables correspond to averaged
voltage potential (across time, spikes, and possibly neurons in the same
minicolumn), and neural computation corresponds to approximate inference and
error back-propagation at the same time.
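To make the claimed correspondence concrete, here is a minimal NumPy sketch, not taken from the paper: it uses a hypothetical two-layer quadratic energy, a noise-free Langevin step (plain gradient descent on the energy), and a small outside driving force `beta` on the output units. The names `W1`, `W2`, `eps`, and `beta` are illustrative choices, not the authors' notation. Starting from a stationary point, the first inference step on the hidden units comes out proportional to the back-propagated gradient of the squared error.

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h, n_y = 5, 4, 3
W1 = rng.normal(size=(n_h, n_x))  # input -> hidden weights (illustrative)
W2 = rng.normal(size=(n_y, n_h))  # hidden -> output weights (illustrative)
x = rng.normal(size=n_x)
t = rng.normal(size=n_y)          # target driving the visible (output) units

# Hypothetical quadratic energy, chosen so the algebra is exact:
#   E(h, y) = 0.5*||h - W1 x||^2 + 0.5*||y - W2 h||^2
def grad_h(h, y):
    return (h - W1 @ x) - W2.T @ (y - W2 @ h)

# Stationary point of E: h* = W1 x, y* = W2 h* (all gradients vanish here).
h = W1 @ x
y = W2 @ h

beta = 0.01  # strength of the outside driving force on the visible units
eps = 0.1    # inference step size (Langevin step with the noise term dropped)

# Push the visible units slightly toward the target, then take one
# inference step on the hidden units starting from the stationary point.
y_nudged = y + beta * (t - y)
dh = -eps * grad_h(h, y_nudged)

# Back-propagated gradient of the loss L = 0.5*||t - y||^2 w.r.t. h:
bp = W2.T @ (t - y)

# The early temporal derivative of h aligns with the backprop gradient,
# proportional with factor eps*beta.
cos = dh @ bp / (np.linalg.norm(dh) * np.linalg.norm(bp))
print(f"cosine similarity: {cos:.6f}")    # ~1.0
print(f"elementwise ratio: {(dh / bp)[:3]}")  # ~eps*beta = 0.001
```

With this quadratic energy the match is exact (cosine similarity 1); the paper's claim is the more general one, that for energy-based models with latent variables the early steps of Langevin inference approximate back-propagated error gradients rather than reproducing them exactly.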