References

Language Models are Unsupervised Multitask Learners

Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, & Ilya Sutskever (2019)

OpenAI Technical Report.

URL: https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf

Abstract. Introduces GPT-2, a 1.5-billion-parameter decoder-only transformer trained on the WebText corpus. Demonstrates that sufficiently large language models achieve strong zero-shot performance on tasks such as question answering, machine translation, reading comprehension, and summarization, without any task-specific fine-tuning (see the prompting sketch below).

Tags: transformer gpt language-models url-only
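The zero-shot behaviour described in the abstract can be illustrated with a short sketch. This is not the paper's original code release; it assumes the Hugging Face transformers library and its public "gpt2" checkpoints, and the prompt format is only one plausible way to frame translation as plain text completion.

    # A minimal sketch of zero-shot prompting with a GPT-2 checkpoint,
    # assuming the Hugging Face transformers library (an assumption; the
    # paper itself shipped separate TensorFlow code).
    from transformers import pipeline

    # "gpt2" is the 124M-parameter public checkpoint; "gpt2-xl" corresponds
    # to the 1.5B-parameter model described in the paper.
    generator = pipeline("text-generation", model="gpt2")

    # Translation framed purely as a text-completion prompt, with no
    # task-specific fine-tuning; the exact prompt wording is illustrative.
    prompt = "English: The cat sat on the mat.\nFrench:"
    output = generator(prompt, max_new_tokens=12, do_sample=False)
    print(output[0]["generated_text"])

Greedy decoding (do_sample=False) keeps the example deterministic; the generated samples shown in the paper instead used top-k sampling.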
