Abstract. The paper that popularised ReLU activations in modern deep learning (Nair & Hinton, ICML 2010). Shows that rectified linear hidden units outperform the binary stochastic (sigmoid-based) units traditionally used in restricted Boltzmann machines, helping establish ReLU as a cornerstone of the deep learning revival.
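For reference, a minimal sketch of the activation the paper studies; the function names and NumPy usage below are illustrative, not taken from the paper:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: zero for negative inputs, identity otherwise."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Subgradient of ReLU: 0 where x < 0, 1 where x > 0 (0 chosen at x == 0)."""
    return (x > 0).astype(x.dtype)

# Unlike sigmoid or tanh, ReLU does not saturate for positive inputs,
# so gradients pass through unattenuated there.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```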
Tags: neural-networks, activations, relu, url-only