No fun backstory for this post. Residual blocks and gradient boosting are just cool and important in machine learning. Turns out they're very algorithmically similar. So I'll explain how 🙂
Residual Blocks Review:
A residual block in a neural network is defined by the presence of skip connections, as seen in the id(H_{k-1}) connection in Figure 1. A skip connection adds the output of some earlier layer i−k to a later layer i. These blocks reduce the effect of vanishing gradients and enable the training of much deeper neural networks. This architecture helped Microsoft win the 2015 ImageNet challenge, and residual blocks have since become a staple of deep neural networks.
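To make the idea concrete, here is a minimal sketch of a residual block in PyTorch. The name ResidualBlock and the fully connected layers are my own illustrative choices (the original ResNet blocks use convolutions), but the core pattern is the same: compute a residual branch F(x), then add the identity x back in.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal residual block: output = relu(F(x) + x)."""

    def __init__(self, dim):
        super().__init__()
        self.fc1 = nn.Linear(dim, dim)
        self.fc2 = nn.Linear(dim, dim)
        self.relu = nn.ReLU()

    def forward(self, x):
        # F(x): the residual branch (two small layers here)
        residual = self.fc2(self.relu(self.fc1(x)))
        # The skip connection: add the identity id(x) back in,
        # so gradients can flow straight through the addition
        return self.relu(residual + x)

# Usage sketch:
# block = ResidualBlock(64)
# out = block(torch.randn(8, 64))
```

Because the block computes F(x) + x rather than F(x) alone, the layers only have to learn a residual correction to the identity, which is what makes very deep stacks of these blocks trainable.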