How to send sensitive data like auth tokens across nodes without getting stored in the checkpoint

I have a graph with two nodes. One node needs to make an API call to fetch data, and the request headers contain the user's token. What is the right pattern to pass the headers to that node without them getting stored in the checkpoint?

I can’t use state or config because they are stored in the checkpoint. I looked into runtime context, but apparently that is also accessed through config within the nodes, which means it will be checkpointed too.

def fetch_forum_data(state: ForumState, config: RunnableConfig) -> Dict[str, Any]:
    topic = state["topic"]

    # call the API using the headers

    mock_api_response = [
        {"user": "user1", "content": f"I really like {topic} because it's interesting.", "upvotes": 10},
    ]
    return {"raw_posts": mock_api_response}

def process_data(state: ForumState) -> Dict[str, Any]:
    raw_posts = state["raw_posts"]
    relevant_posts = [
        f"{p['user']}: {p['content']}"
        for p in raw_posts
        if p["upvotes"] > 3
    ]
    combined_content = "\n".join(relevant_posts)
    return {"processed_content": combined_content}

def create_forum_workflow():
    workflow = StateGraph(ForumState)
    workflow.add_node("fetcher", fetch_forum_data)
    workflow.add_node("processor", process_data)

    workflow.set_entry_point("fetcher")
    workflow.add_edge("fetcher", "processor")
    workflow.add_edge("processor", END)
    app = workflow.compile()
    return app

hi @ninja18

have you tried static context? It is designed exactly for user tokens (headers) and similar per-run data: Context overview - Docs by LangChain

How to apply this to your example

  1. Define a context schema for what you want to pass (e.g., headers).
  2. Adjust node signatures to receive runtime: Runtime[ContextSchema] and read headers from runtime.context.
  3. Invoke/stream with context={"headers": {...}}. Do not put headers in state or config.
from dataclasses import dataclass
from typing import Any, Dict, TypedDict
from langgraph.graph import StateGraph, END
from langgraph.runtime import Runtime

# Your state
class ForumState(TypedDict):
    topic: str
    raw_posts: list[dict]
    processed_content: str

@dataclass
class ContextSchema:
    headers: Dict[str, str]

def fetch_forum_data(state: ForumState, runtime: Runtime[ContextSchema]) -> Dict[str, Any]:
    topic = state["topic"]
    headers = runtime.context.headers  # e.g., {"Authorization": "Bearer <token>"}
    # Make your API call using headers (omitted)
    mock_api_response = [
        {"user": "user1", "content": f"I really like {topic} because it's interesting.", "upvotes": 10},
    ]
    return {"raw_posts": mock_api_response}

def process_data(state: ForumState) -> Dict[str, Any]:
    raw_posts = state["raw_posts"]
    relevant_posts = [
        f"{p['user']}: {p['content']}"
        for p in raw_posts
        if p["upvotes"] > 3
    ]
    combined_content = "\n".join(relevant_posts)
    return {"processed_content": combined_content}

def create_forum_workflow():
    workflow = StateGraph(ForumState)
    workflow.add_node("fetcher", fetch_forum_data)
    workflow.add_node("processor", process_data)
    workflow.set_entry_point("fetcher")
    workflow.add_edge("fetcher", "processor")
    workflow.add_edge("processor", END)
    return workflow.compile()

app = create_forum_workflow()

# Invocation: pass headers in runtime context (not in state or config)
result = app.invoke(
    {"topic": "LangGraph"},
    context={"headers": {"Authorization": "Bearer YOUR_AUTH_TOKEN"}}
)
Why this works

  • State is persisted by checkpointers; anything placed there will be saved.
  • Config is also part of checkpoints (“config associated with this checkpoint”), so secrets in config can be captured in the snapshot.
  • The runtime context is per-run input passed via context= and accessed via Runtime; it is not part of the persisted checkpointed state.
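To make that distinction concrete, here is a toy illustration in plain Python (not LangGraph's actual internals, just the shape of the idea): the "checkpointer" records every state update, while per-run context is threaded through the call and never reaches the saved snapshots.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class ToyCheckpointer:
    # Persists state snapshots, loosely analogous to a LangGraph checkpointer.
    snapshots: List[Dict[str, Any]] = field(default_factory=list)

    def save(self, state: Dict[str, Any]) -> None:
        self.snapshots.append(dict(state))

def fetch_node(state: Dict[str, Any], context: Dict[str, Any]) -> Dict[str, Any]:
    headers = context["headers"]  # read and used for the API call...
    assert "Authorization" in headers  # ...but never written into state
    return {"raw_posts": [{"user": "user1", "upvotes": 10}]}

def run(state: Dict[str, Any], context: Dict[str, Any], cp: ToyCheckpointer) -> Dict[str, Any]:
    state = {**state, **fetch_node(state, context)}
    cp.save(state)  # only state reaches the checkpointer
    return state

cp = ToyCheckpointer()
final = run({"topic": "LangGraph"}, {"headers": {"Authorization": "Bearer t"}}, cp)
assert "headers" not in cp.snapshots[0]  # the token was never persisted
assert "raw_posts" in cp.snapshots[0]    # normal state is persisted
```

The point of the sketch: context is a function argument of the run, not a channel of the state, so nothing the checkpointer serializes ever contains it.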

If you have a hard requirement that nothing sensitive ever lands in a checkpoint, even by accident, also consider enabling encrypted checkpoint serialization as defense in depth (this encrypts whatever is persisted, but does not change what is or isn't persisted): Persistence - Docs by LangChain


I’d probably mark it as an UntrackedValue

from typing import TypedDict, Annotated
from langgraph.channels.untracked_value import UntrackedValue

class MyState(TypedDict):
    super_secret: Annotated[str, UntrackedValue]

Though obviously the value would not survive across executions, human-in-the-loop interrupts, etc., since it is never written to the checkpoint.
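A rough sketch of why (plain Python, with a hypothetical `UNTRACKED` marker standing in for the `UntrackedValue` annotation): an untracked channel is simply never written to the snapshot, so resuming from a checkpoint restores the state without it.

```python
from typing import Any, Dict

UNTRACKED = {"super_secret"}  # hypothetical marker set, mimicking UntrackedValue

def snapshot(state: Dict[str, Any]) -> Dict[str, Any]:
    # Untracked channels are dropped from the persisted snapshot.
    return {k: v for k, v in state.items() if k not in UNTRACKED}

state = {"topic": "LangGraph", "super_secret": "token-123"}
restored = dict(snapshot(state))  # what you'd get back when resuming
assert "super_secret" not in restored  # the secret did not survive the round trip
assert restored["topic"] == "LangGraph"
```

That trade-off is the whole feature: the secret stays out of storage, but anything that resumes from storage has to re-supply it.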