What is a token?

Within our system, a "token" is the basic unit used to measure the length of a piece of text. Tokens cover not just whole words but also word fragments, special characters, and spaces. Every token is counted, and both input and output tokens contribute to the token usage limit on your account.
According to OpenAI: "A helpful rule of thumb is that one token generally corresponds to ~4 characters of text for common English text."
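If you want to check token counts yourself, the sketch below uses OpenAI's open-source tiktoken library. The choice of encoding (cl100k_base) and the sample string are assumptions for illustration only; the model behind your account may use a different tokenizer, so treat the result as an estimate rather than an exact billing figure.

```python
# Minimal sketch of counting tokens with OpenAI's tiktoken library.
# Assumption: "cl100k_base" is used purely for illustration; the tokenizer
# your account's model actually uses may differ.
import tiktoken


def count_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    """Return the number of tokens tiktoken produces for the given text."""
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))


if __name__ == "__main__":
    sample = "Tokens include whole words, word fragments, punctuation, and spaces."
    n = count_tokens(sample)
    # Compare against the ~4-characters-per-token rule of thumb.
    print(f"{n} tokens for {len(sample)} characters "
          f"(~{len(sample) / n:.1f} characters per token)")
```

Running this prints the token count alongside the character count, which makes it easy to see how closely the rule of thumb holds for your own text.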