References

Rectified Linear Units Improve Restricted Boltzmann Machines

Vinod Nair & Geoffrey E. Hinton (2010)

In Proceedings of the 27th International Conference on Machine Learning (ICML 2010), 807-814.

URL: https://icml.cc/Conferences/2010/papers/432.pdf

Abstract. The paper that popularised ReLU activations in modern deep learning. It shows that rectified linear hidden units outperform the standard binary stochastic (logistic sigmoid) units in restricted Boltzmann machines on object recognition and face verification tasks, establishing ReLU as a cornerstone of the deep learning revival.

Tags: neural-networks activations relu url-only
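
A minimal NumPy sketch, illustrative rather than the authors' code, of the two activations the paper discusses: the deterministic ReLU and the noisy rectified linear unit (NReLU), which the paper samples as max(0, x + N(0, sigmoid(x))), with the Gaussian noise variance given by the logistic sigmoid of the pre-activation. Function names here are my own.

    import numpy as np

    def relu(x):
        # Rectified linear unit: elementwise max(0, x).
        return np.maximum(0.0, x)

    def nrelu(x, rng=None):
        # Noisy rectified linear unit (NReLU): Gaussian noise with
        # variance sigmoid(x) is added before rectifying, approximating
        # an infinite stack of tied binary stochastic units.
        rng = rng or np.random.default_rng()
        variance = 1.0 / (1.0 + np.exp(-x))  # logistic sigmoid
        return np.maximum(0.0, x + rng.normal(0.0, np.sqrt(variance)))

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(relu(x))   # deterministic: [0. 0. 0. 0.5 2.]
    print(nrelu(x))  # stochastic; output varies run to run

In the paper's RBMs the stochastic NReLU is used for the hidden units during training, while the deterministic ReLU serves as its mean-field approximation at test time.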
