DelugeNets: Deep Networks with Massive and Flexible Cross-layer Information Inflows
Jason Kuen, Xiangfei Kong, and Gang Wang
arXiv e-Print archive - 2016 via Local arXiv
Keywords:
cs.CV, cs.LG, cs.NE
First published: 2016/11/17
Abstract: Human brains are adept at dealing with the deluge of information they
continuously receive, by suppressing the non-essential inputs and focusing on
the important ones. Inspired by such capability, we propose Deluge Networks
(DelugeNets), a novel class of neural networks facilitating massive cross-layer
information inflows from preceding layers to succeeding layers. The connections
between layers in DelugeNets are efficiently established through cross-layer
depthwise convolutional layers with learnable filters, acting as a flexible
selection mechanism. By virtue of the massive cross-layer information inflows,
DelugeNets can propagate information across many layers with greater
flexibility and utilize network parameters more effectively, compared to
existing ResNet models. Experiments show the superior performances of
DelugeNets in terms of both classification accuracies and parameter
efficiencies. Remarkably, a DelugeNet model with just 20.2M parameters achieves
a state-of-the-art error of 19.02% on the CIFAR-100 dataset, outperforming a
DenseNet model with 27.2M parameters.
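The core mechanism can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes the simplest form of a cross-layer depthwise operation, where each output channel is a weighted combination of the same channel across all preceding layers' feature maps, with one learnable scalar per (channel, layer) pair. Function name and array shapes are illustrative.

```python
import numpy as np

def cross_layer_depthwise(features, weights):
    """Aggregate feature maps from preceding layers, depthwise per channel.

    features: list of L arrays, each of shape (C, H, W) -- the outputs of
              the L preceding layers (all assumed to share a spatial size).
    weights:  array of shape (C, L) -- a learnable scalar per channel per
              preceding layer, acting as the flexible selection mechanism.
    Returns an array of shape (C, H, W): for each channel c, the weighted
    sum over the layer axis of that channel's maps.
    """
    stacked = np.stack(features, axis=0)            # (L, C, H, W)
    # Contract the layer axis per channel: out[c,h,w] = sum_l w[l,c] * x[l,c,h,w]
    return np.einsum('lc,lchw->chw', weights.T, stacked)

# Toy example: two preceding layers, two channels, 3x3 maps.
feats = [np.ones((2, 3, 3)), 2 * np.ones((2, 3, 3))]
w = np.array([[1.0, 0.5],    # channel 0: 1*layer0 + 0.5*layer1
              [0.0, 1.0]])   # channel 1: passes through layer 1 only
out = cross_layer_depthwise(feats, w)
```

Contrast this with DenseNets, which concatenate all preceding feature maps and mix them with regular convolutions; the depthwise variant sketched here keeps the cross-layer combination per-channel, which is where the parameter efficiency comes from.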
It's not clear to me what the difference between DenseNets and DelugeNets is.
## Evaluation
* Cifar-10: 3.76% error (DenseNet: )
* Cifar-100: 19.02% error
## See also
* [reddit](https://www.reddit.com/r/MachineLearning/comments/5l0k6w/r_delugenets_deep_networks_with_massive_and/)
* [DenseNet](http://www.shortscience.org/paper?bibtexKey=journals/corr/1608.06993)