🪙 Token usage tracking 🪙 We've made it easier to access response metadata returned by LLM providers with a new field on AIMessages! Useful information like token usage for supported models will appear there. Python 🐍: python.langchain.com/docs/modules/m… JS ☕️: js.langchain.com/docs/modules/m…
@LangChainAI Thank you! Been looking for something like this for months
@LangChainAI This is going to be super helpful. Until now we had to use specific callbacks to get this data; exposing it in the response will make things a lot easier 😊🙏🏻🙏🏻