References

Dropout: A Simple Way to Prevent Neural Networks from Overfitting

Nitish Srivastava, Geoffrey E. Hinton, Alex Krizhevsky, Ilya Sutskever, & Ruslan Salakhutdinov (2014)

Journal of Machine Learning Research, 15(1), 1929-1958.

URL: https://jmlr.org/papers/v15/srivastava14a.html

Abstract. Introduces dropout as a regularisation technique that randomly deactivates neurons during training, forcing the network to develop redundant representations. Dropout is interpreted as an approximation to averaging over an exponential ensemble of thinned networks.

Tags: regularisation dropout url-only
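
The mechanism summarised in the abstract can be sketched in a few lines. This is a minimal illustration using "inverted" dropout, which rescales surviving activations by 1/(1-p) at training time so that no adjustment is needed at test time; the paper itself instead multiplies weights by p at test time, and the function name and signature here are illustrative, not from the paper.

```python
import numpy as np

def dropout(x, p=0.5, train=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, each unit is zeroed independently with probability p,
    and survivors are scaled by 1/(1-p) so the expected activation is
    unchanged. At test time the input passes through untouched, which
    approximates averaging over the exponential ensemble of thinned
    sub-networks described in the paper.
    """
    if not train or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)
```

Because of the 1/(1-p) scaling, the mean activation over a large input is preserved in expectation, while roughly a fraction p of units are silenced on any single forward pass.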
