References

Improving Language Understanding by Generative Pre-Training

Alec Radford, Karthik Narasimhan, Tim Salimans, & Ilya Sutskever (2018)

OpenAI Technical Report.

URL: https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf

Abstract. Introduces GPT (Generative Pre-trained Transformer), a decoder-only transformer pre-trained with an autoregressive language-modelling objective and then fine-tuned on downstream tasks. GPT established the decoder-only paradigm that underpins modern large language models.

Tags: transformer gpt language-models url-only
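
The autoregressive objective the abstract mentions can be illustrated with a short sketch: each position is trained to predict the next token, so the loss is the cross-entropy between shifted targets and the model's logits. This is a minimal sketch only, using a toy embedding-plus-linear stand-in rather than the paper's masked self-attention stack; all sizes and names here are illustrative assumptions, not the paper's code.

```python
# Minimal sketch of the autoregressive language-modelling objective
# (not the paper's implementation). A real GPT would replace the
# embed/lm_head pair with a stack of masked self-attention blocks.
import torch
import torch.nn.functional as F

vocab_size, d_model, seq_len = 100, 32, 8  # toy sizes, chosen arbitrarily

embed = torch.nn.Embedding(vocab_size, d_model)   # token embeddings
lm_head = torch.nn.Linear(d_model, vocab_size)    # maps states to logits

tokens = torch.randint(0, vocab_size, (1, seq_len))  # dummy token ids
logits = lm_head(embed(tokens))                      # (1, seq_len, vocab_size)

# Shift by one so position i predicts token i+1, i.e. maximise
# sum_i log P(w_{i+1} | w_{<=i}; theta) via cross-entropy.
loss = F.cross_entropy(
    logits[:, :-1].reshape(-1, vocab_size),  # predictions at positions 0..n-2
    tokens[:, 1:].reshape(-1),               # targets: the next token each time
)
loss.backward()  # fine-tuning reuses the pre-trained weights with a task head
```

The same loss drives both stages in the paper's recipe: pre-training optimises it over unlabelled text, and fine-tuning adds a small task-specific head on top of the pre-trained transformer.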
