
Token

The small chunks of text an AI reads and processes — roughly a word or part of a word.

What it actually means

AI models don't read text the way humans do, word by word. They break everything into tokens first — small units that could be a full word, a syllable, punctuation, or even a space. "Unbelievable" might be 3 tokens; "Hi" is 1. This matters because every AI model has a token limit, also called its context window: the maximum number of tokens it can read and produce in one go, with both the input and the response counting toward it. When people say "the model ran out of context," they mean the token limit was hit.
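To make the splitting concrete, here is a toy tokenizer sketch. Real models use learned subword vocabularies (such as byte-pair encoding), so their actual splits differ; this version just separates punctuation and breaks long words into fixed-size chunks, and the 4-character chunk size is an arbitrary choice for illustration.

```python
import re

def toy_tokenize(text, max_chunk=4):
    # Toy approximation of subword tokenization: split on word/punctuation
    # boundaries, then break long words into fixed-size chunks. Real
    # tokenizers use a learned vocabulary, so actual splits will differ.
    tokens = []
    for piece in re.findall(r"\w+|[^\w\s]", text):
        for i in range(0, len(piece), max_chunk):
            tokens.append(piece[i:i + max_chunk])
    return tokens

print(toy_tokenize("Unbelievable!"))  # → ['Unbe', 'liev', 'able', '!']
print(toy_tokenize("Hi"))            # → ['Hi']
```

Even this crude version shows the key point: a long word becomes several tokens while a short one stays a single token, which is why token counts and word counts diverge.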

Real-world analogy

Tokens are like Lego bricks. Language gets broken down into individual bricks before the AI processes it, and the model has a box that can only hold so many bricks at once. Once the box is full, older bricks fall out.
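The "full box" behavior in the analogy can be sketched with a fixed-size queue. This models the common chat-app strategy of dropping the oldest content when the window fills; note that some systems refuse the request or truncate differently instead, and the limit of 5 tokens here is purely illustrative.

```python
from collections import deque

# A context window acts like a box that holds only so many bricks:
# deque(maxlen=5) evicts the oldest token when a new one arrives.
window = deque(maxlen=5)
for token in ["the", "cat", "sat", "on", "the", "mat"]:
    window.append(token)

print(list(window))  # → ['cat', 'sat', 'on', 'the', 'mat']
```

The first "the" has fallen out of the box: that is what losing earlier context looks like once the token limit is reached.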

Common misconception

Tokens are not words. On average, 1 token is roughly 0.75 words in English, but this varies wildly by language. Hindi, Tamil, and other non-Latin languages often use more tokens per word — which means they cost more and hit limits faster.
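The 0.75 words-per-token figure gives a quick back-of-the-envelope estimator. This is only a rule of thumb for English text; for Hindi, Tamil, and other non-Latin-script languages the real count is often noticeably higher, so treat the result as a lower bound there.

```python
def estimate_tokens(text, words_per_token=0.75):
    # Rule of thumb for English: ~0.75 words per token, i.e. about
    # 1.33 tokens per word. Underestimates for non-Latin scripts.
    word_count = len(text.split())
    return round(word_count / words_per_token)

print(estimate_tokens("Tokens are not the same thing as words"))  # → 11
```

So an 8-word English sentence costs roughly 11 tokens, not 8 — a gap that compounds quickly in long prompts.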