Utilising @langchain/google-gauth via LangSmith

I’m currently trying out LangSmith for prompt management, and I want to use it with my React agent, but I’m running into issues.

My LangGraph state contains the necessary variables, and this normally works fine when I pass in my ChatPromptTemplate. However, I can’t figure out how to use a hub runnable with createReactAgent:

const MyAgent = createReactAgent({
  llm: async (_state, runtime) => {
    const { promptId } = runtime.context || {};
    const prompt = await hub.pull<Runnable>(promptId);
    return prompt;
  },
  tools: [
    myTool
  ],
  stateSchema: StateAnnotation,
  contextSchema: ContextAnnotation,
  name: 'MyAgent',
});

I am very interested in moving to a dedicated prompt management platform like LangSmith, but this is a blocker for me at the moment, as I need to be able to dynamically configure both the prompt and the model configuration.

EDIT:

Ah, after testing again today I realise I forgot to pass { includeModel: true } to hub.pull.
Now I have a separate error (so I’m moving and renaming the topic).
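For anyone landing here with the same problem, this is roughly what the working llm function looks like. A sketch, not verified end to end: it assumes hub.pull comes from langchain/hub, that the prompt stored in LangSmith has a model bound to it, and that promptId is always present in the runtime context.

```typescript
import * as hub from "langchain/hub";
import type { Runnable } from "@langchain/core/runnables";

// Sketch: pull the prompt *and* its bound model from LangSmith.
// includeModel tells the hub to also deserialise the model
// configuration saved with the prompt, so the returned runnable
// can stand in for the agent's llm.
const llm = async (
  _state: unknown,
  runtime: { context?: { promptId?: string } },
): Promise<Runnable> => {
  const { promptId } = runtime.context ?? {};
  if (!promptId) {
    throw new Error("promptId missing from runtime context");
  }
  return hub.pull<Runnable>(promptId, { includeModel: true });
};
```

Without includeModel, hub.pull only returns the prompt template, which is why the agent had nothing to invoke as a model.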

Previously I was using @langchain/google-gauth, which correctly reads the application default credentials from my system environment. However, it appears that LangSmith instead deserialises the model with @langchain/google-vertexai, which errors with the following:

Error: Missing key "GOOGLE_VERTEX_AI_WEB_CREDENTIALS" for $.kwargs.last.kwargs.bound.kwargs.credentials in load(secretsMap={})
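One workaround I’ve been experimenting with (an assumption on my part, not something I’ve confirmed in the docs): since the error says @langchain/google-vertexai wants its credentials JSON in the GOOGLE_VERTEX_AI_WEB_CREDENTIALS environment variable rather than via application default credentials, you can populate that variable from a service-account key file yourself. The path below is a placeholder; point it at whatever key file your ADC setup actually uses.

```shell
# Hypothetical workaround: hand @langchain/google-vertexai the same
# service-account JSON that google-gauth was picking up via ADC.
# Adjust the path to your actual key file.
export GOOGLE_VERTEX_AI_WEB_CREDENTIALS="$(cat "$HOME/.config/gcloud/application_default_credentials.json")"
```

Note that an ADC file generated from a user login isn’t identical to a service-account key, so depending on your setup you may need to export a dedicated service-account key instead.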