AI Fundamentals
Tokenization
The process of splitting raw text or code into discrete units (tokens) before feeding them to a model. How a tokenizer splits code affects costs, context limits, and how well the model handles identifiers and symbols.
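To make the cost and context-limit point concrete, here is a minimal sketch of splitting code into token-like units. This is not a real learned tokenizer (production models use BPE-style tokenizers trained on data); it is a crude regex stand-in that shows how a single identifier can expand into several tokens, each of which counts against the context window and billing.

```python
import re

def naive_tokenize(text):
    # Crude stand-in for a subword tokenizer: splits on camelCase
    # humps, runs of letters, digits, and individual symbols.
    # Real tokenizers (e.g. BPE) learn their splits from data.
    pattern = r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+|\S"
    return re.findall(pattern, text)

code = "getUserById(user_id=42)"
tokens = naive_tokenize(code)
print(tokens)
# ['get', 'User', 'By', 'Id', '(', 'user', '_', 'id', '=', '42', ')']
print(len(tokens))  # 11 — one short call already uses 11 token slots
```

Even this toy splitter shows why long, descriptive identifiers and symbol-heavy code consume more of the model's context than their character count alone suggests.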