Token
intermediate
The basic unit of text that language models read and generate. A token is roughly a word or word fragment: on average about four characters, or three-quarters of an English word. "Artificial intelligence," for example, is two tokens. Model pricing and context limits are both measured in tokens; GPT-4 Turbo, for instance, can process about 128,000 tokens (roughly 300 pages of text) in a single conversation.
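Because exact token counts depend on the model's tokenizer, a common shortcut is the four-characters-per-token rule of thumb. The sketch below is a minimal illustration of that heuristic, not a real tokenizer; the function names and the per-1,000-token price are hypothetical.

```python
def estimate_tokens(text: str) -> int:
    """Rough estimate using the ~4-characters-per-token heuristic
    for English text. Real tokenizers (e.g. BPE) can differ a lot:
    "Artificial intelligence" is 23 characters, so this estimates
    5 tokens, while GPT-style tokenizers split it into just 2."""
    return max(1, len(text) // 4)

def estimate_cost(text: str, price_per_1k_tokens: float) -> float:
    """Approximate input cost, since API pricing is per token.
    price_per_1k_tokens is a placeholder, not a real price."""
    return estimate_tokens(text) / 1000 * price_per_1k_tokens

print(estimate_tokens("Artificial intelligence"))   # heuristic says 5
print(estimate_cost("Artificial intelligence", 0.01))
```

For precise counts, use the model provider's own tokenizer library, since the same text can map to different numbers of tokens in different models.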