Estimate OpenAI token usage for chat completions, including functions, with this Python utility!
This package is based on hmarr's openai-chat-tokens. As of September 2023, there is no official documentation from OpenAI on how to accurately predict the number of tokens consumed by functions. This package solves that: use it to get a precise estimate of the token count for chat completions and better manage your OpenAI API usage. Most often it is accurate down to the exact token.
## Install the Package via pip

```shell
pip install openai-function-tokens
```
## Import the Estimation Function

```python
from openai_function_tokens import estimate_tokens
```
To use the estimator, call the `estimate_tokens` function:

```python
estimate_tokens(messages, functions=None, function_call=None)
```
Pass in the `messages`, and optionally `functions` and `function_call`, to receive a precise token count.
Credit to hmarr for the original TypeScript tool. For a better understanding of the token-counting logic, check out his blog post.
- Function Calling
- How to call functions with chat models
- How to use functions with a knowledge base
- JSON Schema documentation
- Counting tokens (only messages)
Feedback, suggestions, and contributions are highly appreciated. Help make this tool even better!