Tokenization

Tokenization is the process of breaking a sequence of text into smaller units, called tokens, which may be words, phrases, or even individual characters.
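
For illustration, here is a minimal Python sketch of two simple strategies: word-level splitting (keeping punctuation as separate tokens) and character-level splitting. The function names and the regular expression are illustrative assumptions, not taken from any particular library.

    import re

    def word_tokenize(text: str) -> list[str]:
        # Illustrative word-level tokenizer: runs of word characters
        # become tokens, and each punctuation mark is its own token.
        return re.findall(r"\w+|[^\w\s]", text)

    def char_tokenize(text: str) -> list[str]:
        # Character-level tokenizer: every character is its own token.
        return list(text)

    sentence = "Tokenization breaks text into tokens."
    print(word_tokenize(sentence))
    # ['Tokenization', 'breaks', 'text', 'into', 'tokens', '.']
    print(char_tokenize("token"))
    # ['t', 'o', 'k', 'e', 'n']

Real tokenizers used in practice are usually more involved (handling subwords, casing, and special symbols), but the sketch shows the basic idea of mapping text to a list of tokens.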
