The paper studies the problem of memory storage with discrete (digital) synapses. Previous work established that memory capacity can be increased by adding a cascade of (latent) internal states, but the optimal state-transition dynamics were unknown and in practice were usually hand-picked using heuristic rules. In this paper the authors aim to derive the optimal transition dynamics for synaptic cascades. They first derive an upper bound on achievable memory capacity and then show that simple models with a linear chain structure can approach this bound.
The paper applies the theory of ergodic continuous-time Markov chains to the analysis of the memory properties of online learning in synapses with intrinsic states, extending earlier work by Abbott, Fusi, and their co-workers.
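The class of models under discussion can be sketched concretely. The following is a minimal illustration (not the authors' construction): a synapse with M internal states on a linear chain, where potentiation and depression events shift the state one step right or left with saturation at the ends, and the readout weight is binary. The memory signal for a pattern stored at time zero decays as subsequent random plasticity events overwrite it. The chain length M = 6 and the equal potentiation/depression rates are illustrative assumptions.

```python
import numpy as np

M = 6  # number of internal states per synapse (illustrative choice)

# Readout weight as a function of internal state:
# -1 in the "depressed" half of the chain, +1 in the "potentiated" half.
w = np.concatenate([-np.ones(M // 2), np.ones(M // 2)])

def shift_matrix(direction):
    """Stochastic matrix for a linear chain: each plasticity event moves
    the state one step right (+1) or left (-1), saturating at the ends."""
    T = np.zeros((M, M))
    for i in range(M):
        j = min(max(i + direction, 0), M - 1)
        T[i, j] = 1.0
    return T

M_pot = shift_matrix(+1)   # potentiation event
M_dep = shift_matrix(-1)   # depression event

# Ongoing plasticity: potentiation and depression arrive with equal probability.
W = 0.5 * (M_pot + M_dep)

# Stationary distribution of the ongoing dynamics
# (left eigenvector of W with eigenvalue 1).
evals, evecs = np.linalg.eig(W.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Memory signal: overlap of the expected weight with the stored pattern,
# t plasticity events after storage.
signal = [pi @ (M_pot - M_dep) @ np.linalg.matrix_power(W, t) @ w
          for t in range(50)]
print(signal[0], signal[10], signal[49])  # decays with the number of events
```

In this discrete-time sketch the signal decays with the number of intervening plasticity events; the paper's continuous-time treatment replaces the powers of W with the matrix exponential of a generator.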