Primer: Only connect: The variety and splendor of neural network architectures
Data Sciences Platform
Beginning with a brief history of connectionism and Convolutional Neural Networks (CNNs), we will present several recent innovations in neural network architecture design. Motivated by the vanishing and exploding gradients problem, we will show how both residual and long-range skip connections allow models to grow deeper and more powerful. Skip connections organized hierarchically, as in the U-Net architecture, apply naturally to segmentation problems, such as the anatomical segmentation of cardiac MRI. We will conclude with a discussion of recent innovations in CNN training, including one-cycle learning and adaptive pooling.
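To make the residual-connection idea concrete, here is a minimal sketch in plain Python (not from the primer itself; the block shape and weights are illustrative assumptions). A residual block computes y = x + F(x): because the input is added back through an identity path, the block degrades gracefully to the identity when F contributes nothing, which is what lets gradients flow through very deep stacks.

```python
# Hedged sketch of a residual block y = x + F(x), where
# F(x) = W2 · relu(W1 · x). All names (relu, matvec, residual_block)
# and the toy weights are illustrative, not from the primer.

def relu(v):
    # Elementwise rectified linear unit
    return [max(0.0, z) for z in v]

def matvec(W, v):
    # Plain matrix-vector product on nested lists
    return [sum(w_ij * v_j for w_ij, v_j in zip(row, v)) for row in W]

def residual_block(x, W1, W2):
    # F(x) = W2 · relu(W1 · x); the skip connection adds x back
    fx = matvec(W2, relu(matvec(W1, x)))
    return [xi + fi for xi, fi in zip(x, fx)]

x = [1.0, -2.0, 0.5]
W1 = [[0.1, 0.0, 0.0],
      [0.0, 0.1, 0.0],
      [0.0, 0.0, 0.1]]
W_zero = [[0.0] * 3 for _ in range(3)]

# If F(x) is zero (here, W2 = 0), the block is exactly the identity,
# so stacking many such blocks cannot erase the signal.
assert residual_block(x, W1, W_zero) == x
print(residual_block(x, W1, W1))
```

The same addition that preserves the forward signal also routes gradients unchanged through the skip path during backpropagation, which is the key to training networks with hundreds of layers.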