From the course: Introduction to Prompt Engineering for Generative AI

Tokens vs. words

- [Instructor] In generative AI, you'll often hear the word tokens mentioned. Now, what does this mean? We can think of a token as a small unit of text that can easily be understood by a large language model. What do I mean by that, and how is that different from a word? Well, if you think about the word everyday, you can break it into two tokens: every and day. Breaking the word down this way helps the model process the input. The word joyful can likewise be split into two tokens: joy and ful, F-U-L. So one word can be made up of multiple tokens: some words are a single token, and some are more. For example, the two words I'd like are made up of three tokens: I, 'd, and like, sort of like saying, "I would like." Now, different models use different mechanisms to split inputs into tokens. This step is known as tokenization, and the tokenization method often has a real effect on the results of the models.
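To make this concrete, here is a minimal sketch of inspecting token splits, assuming the open-source tiktoken library and its cl100k_base encoding (both are assumptions on my part; the course doesn't name a tokenizer, and different models use different encodings, so the exact splits and counts you see may differ):

```python
# A minimal sketch of tokenization using the tiktoken library
# (pip install tiktoken). Exact splits vary by model and encoding;
# cl100k_base is just one common encoding, chosen here for illustration.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

for text in ["everyday", "joyful", "I'd like"]:
    token_ids = enc.encode(text)                   # text -> integer token IDs
    pieces = [enc.decode([t]) for t in token_ids]  # each ID -> its text piece
    print(f"{text!r}: {len(token_ids)} token(s) -> {pieces}")
```

Running this prints the token count and the individual pieces for each phrase, which is a quick way to see how a given tokenizer splits compound words like everyday or contractions like I'd.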
