Embeddings
A way of converting words or concepts into numbers so that AI systems can measure their meaning and the relationships between them.
An embedding is a list of numbers — called a vector — that represents a word, sentence, or piece of content. Words with similar meanings get similar vectors, which means AI can mathematically measure how related two concepts are.
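That mathematical measurement is often done with cosine similarity, which scores how aligned two vectors are. The sketch below uses tiny made-up 3-dimensional vectors purely for illustration; real embedding models produce vectors with hundreds or thousands of dimensions.

```python
import math

# Toy 3-dimensional embeddings (invented values for illustration;
# real models learn vectors with hundreds or thousands of dimensions).
dog   = [0.9, 0.1, 0.2]
puppy = [0.8, 0.2, 0.3]
car   = [0.1, 0.9, 0.0]

def cosine_similarity(a, b):
    """How aligned two vectors are: 1.0 = same direction, near 0 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(dog, puppy))  # high: related meanings
print(cosine_similarity(dog, car))    # much lower: unrelated meanings
```

The exact numbers don't matter; what matters is that related words score higher than unrelated ones, which is the property applications like search and recommendation rely on.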
Imagine placing every word in the English language on a giant map. Words that mean similar things are placed close together — "king" near "queen", "dog" near "puppy". Embeddings are the coordinates of each word on that map. The closer the coordinates, the more related the meaning.
Embeddings aren't just about synonyms. They capture complex relationships — "Paris is to France as Tokyo is to Japan" is a relationship embeddings can represent mathematically. They encode meaning, not just spelling similarity.
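Relationships like that can show up as vector arithmetic: subtracting "France" from "Paris" leaves something like a "capital of" direction, which can then be added to "Japan". A minimal sketch with hand-built toy vectors, where each dimension is an invented feature chosen so the arithmetic works out:

```python
# Toy embeddings where dimension 0 means "French", dimension 1 means
# "Japanese", and dimension 2 means "is a capital city" (invented
# values for illustration; learned embeddings aren't this tidy).
vectors = {
    "France": [1.0, 0.0, 0.0],
    "Paris":  [1.0, 0.0, 1.0],
    "Japan":  [0.0, 1.0, 0.0],
    "Tokyo":  [0.0, 1.0, 1.0],
}

# "Paris is to France as ? is to Japan"  ->  Paris - France + Japan
result = [p - f + j for p, f, j in
          zip(vectors["Paris"], vectors["France"], vectors["Japan"])]

def distance(a, b):
    """Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# The word whose vector lies closest to the computed point.
closest = min(vectors, key=lambda w: distance(vectors[w], result))
print(closest)  # prints "Tokyo"
```

Real embedding spaces behave similarly in spirit, though the analogies emerge from training on text rather than from hand-picked dimensions.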