Research & Papers

Learn&Drop: Fast Learning of CNNs based on Layer Dropping

New method cuts ResNet-152 forward-pass FLOPs by 83.74% with minimal accuracy loss.

Deep Dive

Learn&Drop: Fast Learning of CNNs based on Layer Dropping, by Cruciata et al. from the University of Palermo and TU Delft, introduces a novel training efficiency technique for deep convolutional neural networks. Unlike prior work focused on inference-time compression or on reducing backpropagation cost, this method targets the forward pass: during training it evaluates each layer's parameter change and learning potential, and layers that show minimal change or learning are dropped, cutting the parameters and operations in subsequent forward passes. The approach was validated on VGG-11, VGG-16, ResNet-50, and ResNet-152 architectures using the MNIST, CIFAR-10, and Imagenette datasets.
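
This summary does not spell out the paper's exact drop criterion, so the following is a minimal PyTorch sketch of the general idea only: snapshot each block's weights at every check, then replace blocks whose relative weight change falls below a threshold with identity mappings so they vanish from later forward passes. `DROP_THRESHOLD`, `snapshot`, and `drop_converged_blocks` are illustrative names, not the authors' API, and the identity-replacement trick assumes ResNet-style blocks whose input and output shapes match.

```python
import torch.nn as nn

# Hypothetical threshold: a block whose relative weight change stays below
# this value between checks is treated as "no longer learning".
DROP_THRESHOLD = 1e-3

def snapshot(blocks: nn.ModuleList) -> list[dict]:
    """Detached copies of each block's parameters, taken at every check."""
    return [{n: p.detach().clone() for n, p in b.named_parameters()}
            for b in blocks]

def drop_converged_blocks(blocks: nn.ModuleList, prev: list[dict]) -> int:
    """Replace blocks whose weights barely changed with identity mappings.

    Stand-in criterion (relative L2 change); the paper's actual layer
    evaluation metric may differ. Only valid for blocks whose input and
    output shapes match, e.g. ResNet residual blocks.
    """
    dropped = 0
    for i in range(len(blocks)):
        if isinstance(blocks[i], nn.Identity):
            continue  # already dropped at an earlier check
        changes = [
            (p.detach() - prev[i][n]).norm() / (prev[i][n].norm() + 1e-12)
            for n, p in blocks[i].named_parameters()
        ]
        if changes and max(changes) < DROP_THRESHOLD:
            blocks[i] = nn.Identity()  # block is skipped in the forward pass
            dropped += 1
    return dropped
```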

Results show training time is more than halved with minimal accuracy loss. FLOPs reduction in forward propagation ranges from 17.83% for VGG-11 to 83.74% for ResNet-152. The method is especially beneficial for fine-tuning and online training where data arrives sequentially, enabling faster model adaptation without sacrificing performance (a sketch of such a loop follows). Accepted for publication in Springer's Neural Computing and Applications, the technique offers a practical way to accelerate CNN training in resource-constrained environments.
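
To make the online-training benefit concrete, here is a hypothetical fine-tuning loop built on the sketch above: it trains on a data stream and periodically drops converged blocks, so later steps pay for progressively cheaper forward passes. `online_finetune` and `check_every` are illustrative assumptions, not details from the paper.

```python
def online_finetune(model, blocks, stream, optimizer, loss_fn,
                    check_every=500):
    """Hypothetical loop: train on sequentially arriving data, checking
    periodically whether any blocks can be dropped from the forward pass."""
    prev = snapshot(blocks)
    for step, (x, y) in enumerate(stream, start=1):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        if step % check_every == 0:
            drop_converged_blocks(blocks, prev)
            prev = snapshot(blocks)  # re-baseline after every check
```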

Key Points
  • Learn&Drop reduces training time by over 50% on VGG and ResNet architectures
  • FLOPs reduction in forward propagation ranges from 17.83% (VGG-11) to 83.74% (ResNet-152)
  • Ideal for fine-tuning and online learning where data arrives sequentially

Why It Matters

Learn&Drop more than halves CNN training time with minimal accuracy loss, enabling faster fine-tuning and online adaptation in resource-constrained settings.