Revisiting Residual Networks with Nonlinear Shortcuts

Chaoning Zhang (KAIST), Francois Rameau (KAIST), Seokju Lee (KAIST), Junsik Kim (KAIST), Philipp Benz (KAIST), Dawit Mureja Argaw (KAIST), Jean-Charles Bazin (KAIST), In So Kweon (KAIST)

Residual networks (ResNets) with an identity shortcut have been widely used in various computer vision tasks due to their compelling performance and simple design. In this paper, we revisit the ResNet identity shortcut and propose RGSNets, which are built on a new nonlinear ReLU Group Normalization (RG) shortcut and outperform existing ResNets by a relatively large margin. Our work is inspired by previous findings that deep networks exhibit a trade-off between representational power and gradient stability, and that the identity shortcut reduces representational power. Our proposed nonlinear RG shortcut helps relatively shallow networks exploit their representational power more effectively, allowing them to outperform ResNets that are three to four times deeper, which demonstrates the high efficiency of the RG shortcut. Moreover, we explore variations of RGSNets; our experiments show that Res-RGSNet, which combines the proposed RG shortcut with the existing identity shortcut, achieves the best performance and is robust to network depth. Our code and models will be made publicly available.
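To make the abstract's idea concrete, here is a minimal NumPy sketch of a residual block whose shortcut applies ReLU followed by Group Normalization instead of (or in addition to) the identity. The operation order inside the RG shortcut, the group count, and the way Res-RGSNet combines the two shortcuts are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5):
    # Group Normalization over an (N, C, H, W) tensor:
    # split channels into groups and normalize each group
    # to zero mean and unit variance.
    n, c, h, w = x.shape
    g = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)
    return g.reshape(n, c, h, w)

def rg_shortcut(x, num_groups=4):
    # Hypothetical RG shortcut: ReLU followed by GroupNorm
    # (the exact ordering is an assumption).
    return group_norm(np.maximum(x, 0.0), num_groups)

def res_rgs_block(x, residual_fn, num_groups=4):
    # Res-RGSNet-style block (sketch): the residual branch plus
    # both the identity shortcut and the nonlinear RG shortcut.
    return residual_fn(x) + x + rg_shortcut(x, num_groups)
```

For example, calling `res_rgs_block(x, residual_fn)` with any shape-preserving residual branch (e.g. two convolutions) yields an output of the same shape as `x`, so RG shortcuts can be stacked exactly like identity shortcuts.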


Paper (PDF)

@inproceedings{Zhang_2019_BMVC,
  title={Revisiting Residual Networks with Nonlinear Shortcuts},
  author={Chaoning Zhang and Francois Rameau and Seokju Lee and Junsik Kim and Philipp Benz and Dawit Mureja Argaw and Jean-Charles Bazin and In So Kweon},
  booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
  publisher={BMVA Press},
  editor={Kirill Sidorov and Yulia Hicks},
  year={2019}
}