
Token

intermediate
The basic unit of text that language models process. A token is roughly a word or word fragment; "Artificial intelligence" is typically two or three tokens, depending on the tokenizer. AI model pricing and context limits are both measured in tokens. GPT-4 Turbo, for example, can process up to 128,000 tokens (roughly 300 pages of text) in a single conversation.
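Since exact counts depend on each model's tokenizer, a common rule of thumb is that one token corresponds to about four characters of English text. The sketch below uses that heuristic to estimate token counts and check them against a context limit; the function names and the 4-characters-per-token ratio are illustrative assumptions, not an official API.

```python
# Hedged sketch: real token counts come from the model's own tokenizer;
# this only applies the rough "one token is about four characters" heuristic.

def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token in English text."""
    return max(1, len(text) // 4)

def fits_in_context(text: str, context_limit: int = 128_000) -> bool:
    """Check whether a text is likely to fit in a model's context window."""
    return estimate_tokens(text) <= context_limit

page = "word " * 100            # ~500 characters of sample text
print(estimate_tokens(page))    # 125 tokens by this heuristic
print(fits_in_context(page))    # True
```

For production use, count tokens with the model provider's actual tokenizer, since heuristics can be off by a large margin for code, non-English text, or unusual formatting.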
Related Terms
Large Language Model (LLM)