LangSmith tracing - custom

Hi,
I have a "weird" case: I would like to track/trace the token count in the LangSmith UI, but not collect details such as the content of the request, i.e. the inputs and outputs.
Is that possible to do with LangSmith?

Hi! LangSmith allows masking inputs and outputs. You can see the documentation here.

Thank you for the reply,
I'm running a setup similar to the one in the example, but in that case both inputs and outputs are hidden (which is the desired behavior), and the token count is hidden as well, which isn't desired.
How do I fix that?

import openai
from langsmith import Client
from langsmith.wrappers import wrap_openai

openai_client = wrap_openai(openai.Client())
langsmith_client = Client(
  hide_inputs=lambda inputs: {}, hide_outputs=lambda outputs: {}
)

# The trace produced will have its metadata present, but the inputs will be hidden
openai_client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"},
  ],
  langsmith_extra={"client": langsmith_client},
)

# The trace produced will have visible inputs and outputs (default client, no masking)
openai_client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"},
  ],
)

You can track OpenAI's usage counts by sending the usage_metadata field in the metadata; see the documentation here!
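As an alternative, since `hide_inputs` and `hide_outputs` are arbitrary callables, you can mask selectively instead of returning an empty dict: drop the message content but keep the usage block. This is a minimal sketch; it assumes the run outputs passed to the callback are a dict containing a `usage` key with the token counts (as in OpenAI-style responses) — check the actual payload shape in your traces before relying on these key names.

```python
# Selective-masking callbacks for LangSmith's Client(hide_inputs=..., hide_outputs=...).
# Assumption: outputs are a dict with a "usage" key holding token counts.

def hide_inputs(inputs: dict) -> dict:
    # Drop all input details (prompts, messages) entirely.
    return {}

def hide_outputs(outputs: dict) -> dict:
    # Keep only the token-usage block; drop the generated content.
    usage = outputs.get("usage")
    return {"usage": usage} if usage is not None else {}
```

You would then wire these in exactly as in the snippet above, e.g. `Client(hide_inputs=hide_inputs, hide_outputs=hide_outputs)`, so the trace retains the usage counts while the content stays hidden.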