I am trying to instrument an AutoGen app. I can add @traceable to some of my own functions, but I can't wrap the OpenAI client, so what is the expected way to instrument an AutoGen app? With my current approach I am not capturing the LLM calls, token counts, etc. Is there a way to do this?
Hi @francisco.junior, We have an OpenTelemetry integration you can use to trace AutoGen and other non-LangChain applications: Trace with OpenTelemetry | 🦜️🛠️ LangSmith
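Roughly, that integration works by pointing a standard OTLP exporter at the LangSmith collector endpoint via environment variables. A hedged sketch (double-check the docs for the exact endpoint path and header names; the project header below is an assumption):

```shell
# Send OTLP traces to LangSmith (endpoint/header names per the LangSmith
# OpenTelemetry docs; verify against the current documentation).
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.smith.langchain.com/otel"
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your-langsmith-api-key>,Langsmith-Project=<your-project>"
```

With these set, any OpenTelemetry-instrumented code in the app (including an OpenAI instrumentor, which is what captures LLM calls and token usage) exports its spans to your LangSmith project.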
Hi Angus,
I will try this approach. I got some traces flowing, but they are not keeping the parent-child structure: I get isolated traces instead of a single call tree.
If I use @traceable, it does not capture the LLM calls made by the agents in AutoGen. That is why I was looking for some code samples.
Thanks,
Francisco