Review: RoR — ResNet of ResNet / Multilevel ResNet (Image Classification)

After the success of ResNet, which became a state-of-the-art deep learning approach and won numerous recognition competitions, many follow-up works studied how to generalize or improve ResNet, such as Pre-Activation ResNet, ResNet in ResNet (RiR), ResNet with Stochastic Depth (SD), and Wide Residual Network (WRN).

Concept of RoR (Residual Networks of Residual Networks)

[Figure: Original ResNet (left), RoR (right)]

The original ResNet is shown on the left above: numerous Residual Blocks are cascaded together to form a very deep network. Within a Residual Block, there are two paths:

- The convolution path, which performs convolutions to extract features.
- The shortcut connection path, which directly transmits the input signal to the next layer.

With shortcut connection paths, the vanishing-gradient problem is reduced, because the error signal can be propagated to early layers more easily during backpropagation.

RoR, shown on the right above, proposes that we can also add a shortcut connection across each group of Residual Blocks. On top of this, we can add yet another level of shortcut connection across a group of "groups of Residual Blocks".

The authors argue that:

- RoR transfers the learning problem to learning the residual mappings of residual mappings, which is simpler and easier to learn than the original ResNet. In other words, each group only needs to learn the residual that is left over after its inner blocks' shortcuts have done their part.
- Layers in upper blocks can also propagate information directly to layers in lower blocks.

The paper also studies RoR with different shortcut-level numbers (RoR-m) and versions built on Pre-Activation ResNet and WRN; hopefully I can cover it as well.
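To make the multilevel structure concrete, here is a minimal PyTorch sketch of a 3-level RoR. This is not the authors' code: the names ResidualBlock and RoR3, the fixed channel width, the group sizes, and the use of 1×1 convolutions for the group-level and root shortcuts are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Basic residual block: convolution path plus the usual
    block-level (innermost) identity shortcut."""

    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Convolution path + block-level shortcut.
        return self.relu(self.body(x) + x)


class RoR3(nn.Module):
    """Sketch of a 3-level RoR: block-level shortcuts inside each
    ResidualBlock, a group-level shortcut across each group of blocks,
    and a root shortcut across all groups."""

    def __init__(self, channels=16, blocks_per_group=3, num_groups=3,
                 num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, kernel_size=3, padding=1, bias=False)
        self.groups = nn.ModuleList([
            nn.Sequential(*[ResidualBlock(channels)
                            for _ in range(blocks_per_group)])
            for _ in range(num_groups)
        ])
        # Group-level shortcuts: one 1x1 projection per group (assumption).
        self.group_shortcuts = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=1, bias=False)
            for _ in range(num_groups)
        ])
        # Root shortcut skipping all groups at once (assumption).
        self.root_shortcut = nn.Conv2d(channels, channels, kernel_size=1, bias=False)
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x):
        x = self.stem(x)
        root_input = x
        for group, shortcut in zip(self.groups, self.group_shortcuts):
            group_input = x
            # The blocks inside the group apply their own shortcuts;
            # the group-level shortcut skips the whole group.
            x = group(group_input) + shortcut(group_input)
        # Root shortcut: the whole stack of groups learns only a residual.
        x = x + self.root_shortcut(root_input)
        x = x.mean(dim=(2, 3))  # global average pooling
        return self.head(x)


# Quick smoke test on a CIFAR-sized batch.
model = RoR3()
logits = model(torch.randn(2, 3, 32, 32))
print(logits.shape)  # torch.Size([2, 10])
```

In a full model, the spatial resolution and channel width change between groups, so the group-level and root shortcuts would need strided projection convolutions to match shapes; the sketch keeps them fixed for clarity.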
References

[2018 TCSVT] [RoR] Residual Networks of Residual Networks: Multilevel Residual Networks

My Related Reviews on Image Classification

[LeNet] [AlexNet] [ZFNet] [VGGNet] [SPPNet] [PReLU-Net] [GoogLeNet / Inception-v1] [BN-Inception / Inception-v2] [Inception-v3] [Inception-v4] [Xception] [MobileNetV1] [ResNet] [Pre-Activation ResNet] [RiR] [Stochastic Depth] [WRN] [DenseNet]