References

Searching for Activation Functions

Prajit Ramachandran, Barret Zoph, & Quoc V. Le (2017)

arXiv preprint.

DOI: https://doi.org/10.48550/arXiv.1710.05941

Abstract. Uses automated search to discover novel activation functions and identifies Swish (x·sigmoid(x)) as a particularly effective alternative to ReLU, matching or exceeding it on a range of image and language tasks.
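As a quick illustration of the formula quoted in the abstract, a minimal sketch of Swish in plain Python (not code from the paper, just the f(x) = x·sigmoid(x) definition):

```python
import math

def swish(x: float) -> float:
    """Swish activation: f(x) = x * sigmoid(x)."""
    return x * (1.0 / (1.0 + math.exp(-x)))

# Unlike ReLU, Swish is smooth and non-monotonic: small negative
# inputs yield small negative outputs rather than being clamped to zero.
print(swish(0.0))   # → 0.0
print(swish(-1.0))  # small negative value
```

The paper also considers a scaled variant, x·sigmoid(βx), where β can be a learned per-channel parameter; the sketch above fixes β = 1.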

Tags: neural-networks activations swish
