References

Practical Bayesian Optimization of Machine Learning Algorithms

Jasper Snoek, Hugo Larochelle, & Ryan P. Adams (2012)

Advances in Neural Information Processing Systems 25.

URL: https://arxiv.org/abs/1206.2944

Abstract. Demonstrates that Bayesian optimisation with Gaussian-process surrogates is a practical and effective method for hyperparameter tuning of machine-learning models, including deep networks. The authors release Spearmint, a Bayesian-optimisation toolkit, and show that it outperforms grid search and random search across a range of machine-learning tasks. Bayesian optimisation maintains a probabilistic model of the validation-loss surface as a function of the hyperparameters and at each step selects the most promising next configuration by maximising an acquisition function (typically expected improvement). The paper helped make Bayesian hyperparameter search a standard practice in the deep-learning community.
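The loop the abstract describes can be sketched in a few dozen lines. The sketch below is illustrative only, not the paper's Spearmint implementation: it uses a 1-D toy objective standing in for a validation loss, a squared-exponential kernel with assumed (fixed) hyperparameters, a small hand-rolled linear solver instead of a library GP, and expected improvement maximised over a candidate grid rather than by continuous optimisation.

```python
import math

def rbf(a, b, length=0.15):
    # Squared-exponential kernel (length scale is an assumed hyperparameter)
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting, enough for the small GP system
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, xq, noise=1e-6):
    # GP posterior mean and variance at a query point xq
    K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    k = [rbf(x, xq) for x in xs]
    alpha = solve(K, ys)               # K^-1 y
    mean = sum(ki * ai for ki, ai in zip(k, alpha))
    v = solve(K, k)                    # K^-1 k
    var = max(rbf(xq, xq) - sum(ki * vi for ki, vi in zip(k, v)), 1e-12)
    return mean, var

def expected_improvement(mean, var, best):
    # EI for minimisation: E[max(best - f, 0)] under the GP posterior
    sd = math.sqrt(var)
    z = (best - mean) / sd
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    return (best - mean) * cdf + sd * pdf

def objective(x):
    # Toy stand-in for a validation loss over one hyperparameter (minimum at 0.3)
    return (x - 0.3) ** 2

def bayes_opt(n_iter=10):
    grid = [i / 100 for i in range(101)]
    xs = [0.05, 0.95]                  # two initial evaluations
    ys = [objective(x) for x in xs]
    for _ in range(n_iter):
        best = min(ys)
        # Evaluate the grid point that maximises expected improvement
        nxt = max((x for x in grid if x not in xs),
                  key=lambda x: expected_improvement(*gp_posterior(xs, ys, x), best))
        xs.append(nxt)
        ys.append(objective(nxt))
    return xs[ys.index(min(ys))]

print(bayes_opt())
```

Each iteration refits the GP to all observations so far and spends the next (expensive) evaluation where expected improvement is highest, which is what lets the method beat grid and random search on a limited evaluation budget.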

Tags: optimisation hyperparameters
