Langchain pass current timestamp to agent prompt

I have a web app that receives a user's request in a controller endpoint. The controller then constructs the message and sends it to the deep agent:

async for step in current_app.agent.astream(
    {"messages": [{"role": "user", "content": message}], "timestamp": datetime.now()},
    stream_mode="values",  # Stream all state values after each step.
    config=config,  # Required by the checkpointer.
):
    result.append(step["messages"][-1])

The problem I am facing is that `edit_file` is unreliable (see `edit_file` crashes · Issue #728 · langchain-ai/deepagents · GitHub), so I am thinking of creating a new file whose path ends with the current request timestamp. How should I structure my prompt so that the agent can get the timestamp? Is this even possible? For example, I am thinking of the following prompt:

WORKFLOW_INSTRUCTIONS = """You are a helpful question-answering assistant. For context, the current timestamp is {timestamp}.

**Save the request**: Use write_file() to save the user's question to `/user_questions_{timestamp:%d-%m-%Y_%H-%M-%S}.md`
**Write Report**: Write final report to `/final_answer_{timestamp:%d-%m-%Y_%H-%M-%S}.md`

The timestamp has to stay consistent for the full turn of the request, since the agent might decide to edit the file later in its logic and the corresponding `/final_answer_{timestamp:%d-%m-%Y_%H-%M-%S}.md` has to match."""
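For reference, Python's `str.format` applies datetime format specs directly, so a placeholder like `{timestamp:%d-%m-%Y_%H-%M-%S}` works as long as you pass an actual `datetime` object. A minimal sketch (the template string here is illustrative):

```python
from datetime import datetime

# Illustrative template using the same format spec as the prompt above.
template = "/user_questions_{timestamp:%d-%m-%Y_%H-%M-%S}.md"

ts = datetime(2024, 1, 2, 3, 4, 5)
path = template.format(timestamp=ts)
print(path)  # /user_questions_02-01-2024_03-04-05.md
```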

You can write your own FILESYSTEM_SYSTEM_PROMPT and pass it to the middleware. Perhaps something like this:

CUSTOM_FILESYSTEM_PROMPT = """
## Filesystem Tools: `ls`, `read_file`, `write_file`, `edit_file`, `glob`, `grep`

You have access to a filesystem which you can interact with using these tools.
All file paths must start with a /.

- ls: list files in a directory (requires absolute path)
- read_file: read a file from the filesystem
- write_file: write to a file in the filesystem
  - When using `write_file`, make sure `file_path` always is `/final_answer_<current_time>`
- edit_file: edit a file in the filesystem
  - When using `edit_file`, make sure `file_path` always is `/final_answer_<current_time>`
- glob: find files matching a pattern (e.g., "**/*.py")
- grep: search for text within files

## Execute Tool: `execute`

You have access to an `execute` tool for running shell commands in a sandboxed environment.
Use this tool to run commands, scripts, tests, builds, and other shell operations.

- execute: run a shell command in the sandbox (returns output and exit code)

Current-Time: {timestamp}
"""

And then when starting the middleware, you can pass the timestamp that you get from your controller:

FilesystemMiddleware(system_prompt=CUSTOM_FILESYSTEM_PROMPT.format(timestamp=timestamp))

Note: I placed `Current-Time` at the end of the prompt so the static prefix benefits from prompt caching.

If the app is not restarted, the time is not updated.

Bad idea. Not only is the timestamp not dynamic at runtime, it also suggests something like: to enter your house through the front door, you first have to go to the back door to get the key. Not intuitive!

You are right to point out that a timestamp passed this way is static and will not be updated at runtime, as @khteh also noted.

You’re already passing timestamp in your state:

{"messages": [{"role": "user", "content": message}], "timestamp": datetime.now()}

Therefore I have three more suggestions that are better suited:

Approach 1: Include timestamp in the user message itself

The simplest approach: format the timestamp directly into the message content:

from datetime import datetime

timestamp = datetime.now()
formatted_ts = timestamp.strftime("%d-%m-%Y_%H-%M-%S")

user_message = f"""[Request Timestamp: {formatted_ts}]

{message}

Note: When saving files, use this exact timestamp suffix: _{formatted_ts}"""

async for step in current_app.agent.astream(
    {"messages": [{"role": "user", "content": user_message}]},
    stream_mode="values",
    config=config,
):
    result.append(step["messages"][-1])

Approach 2: Use a custom state schema with timestamp field

If you’re using LangGraph, you can define a custom state that includes the timestamp and have your nodes access it:

from datetime import datetime
from typing import Annotated, TypedDict

from langgraph.graph.message import add_messages

class AgentState(TypedDict):
    messages: Annotated[list, add_messages]
    timestamp: str  # Pre-formatted timestamp string

# When invoking:
timestamp = datetime.now().strftime("%d-%m-%Y_%H-%M-%S")

async for step in current_app.agent.astream(
    {
        "messages": [{"role": "user", "content": message}],
        "timestamp": timestamp
    },
    stream_mode="values",
    config=config,
):
    result.append(step["messages"][-1])

Then, in your agent's system prompt or instructions, state that the timestamp is available in the state (just like my earlier suggestion, but this time the value is dynamic per request).
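For instance, a node or dynamic prompt builder can read the timestamp from state. A minimal sketch, assuming the `AgentState` shape above (plain dicts are used so it runs standalone; `build_system_prompt` is a hypothetical helper, not a deepagents API):

```python
# Sketch: build the system instructions from the per-request timestamp
# stored in the graph state. `build_system_prompt` is illustrative.
def build_system_prompt(state: dict) -> str:
    ts = state["timestamp"]
    return (
        "You are a helpful question-answering assistant.\n"
        f"Save the user's question to /user_questions_{ts}.md and "
        f"write the final report to /final_answer_{ts}.md."
    )

state = {"messages": [], "timestamp": "01-01-2024_00-00-00"}
print(build_system_prompt(state))
```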

Approach 3: Inject timestamp via configurable fields

You can pass runtime values through the config’s configurable field:

from datetime import datetime

timestamp = datetime.now().strftime("%d-%m-%Y_%H-%M-%S")

config = {
    "configurable": {
        "thread_id": thread_id,
        "request_timestamp": timestamp
    }
}

async for step in current_app.agent.astream(
    {"messages": [{"role": "user", "content": message}]},
    stream_mode="values",
    config=config,
):
    result.append(step["messages"][-1])

Then your nodes can access config["configurable"]["request_timestamp"].
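For example, a node can pull the timestamp out of the config it receives at runtime. The `save_question` node and its return shape are illustrative, not part of the deepagents API:

```python
# Sketch: a LangGraph-style node that reads the per-request timestamp
# from the runnable config. `save_question` is a hypothetical node.
def save_question(state: dict, config: dict) -> dict:
    ts = config["configurable"]["request_timestamp"]
    file_path = f"/user_questions_{ts}.md"
    # ... here you would call write_file(file_path, ...) ...
    return {"saved_to": file_path}

config = {
    "configurable": {
        "thread_id": "thread-1",
        "request_timestamp": "01-01-2024_00-00-00",
    }
}
print(save_question({}, config))  # {'saved_to': '/user_questions_01-01-2024_00-00-00.md'}
```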

Recommended Prompt Structure

WORKFLOW_INSTRUCTIONS = """You are a helpful question-answering assistant.

**Important**: Each request has a unique timestamp identifier provided at the start of the user message in the format [Request Timestamp: DD-MM-YYYY_HH-MM-SS]. You MUST use this exact timestamp for ALL file operations in this request to ensure consistency.

**File Naming Convention**:
- Save user questions to: `/user_questions_{timestamp}.md`
- Write final reports to: `/final_answer_{timestamp}.md`

The timestamp MUST remain consistent across all file operations within a single request turn."""

The key insight is that you want the timestamp to be dynamic per request but static within a single request's execution.

I hope this helps @feng-1985 @khteh


What you propose is only at the HMI level. How do you integrate the timestamp information into the system prompt so that the agent uses the filesystem tools to write the request/response to a file with the timestamp suffix? Take a look at this: deepagents-quickstarts/deep_research/research_agent/prompts.py at main · langchain-ai/deepagents-quickstarts · GitHub


@khteh The tools from FilesystemMiddleware (`write_file` and `edit_file`) will respect your naming convention as long as you make it explicit in the workflow instructions: the LLM reads the timestamp from context and passes it as the `file_path` argument when calling those tools.
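Concretely, when the instructions are explicit, the model's tool call carries the timestamp in its arguments. The shape below is an illustration of that idea, not an actual API object:

```python
# Illustrative shape of the tool call the model emits after reading
# "[Request Timestamp: 05-01-2024_10-30-00]" from the user message.
tool_call = {
    "name": "write_file",
    "args": {
        "file_path": "/user_questions_05-01-2024_10-30-00.md",
        "content": "The user's question goes here.",
    },
}
print(tool_call["args"]["file_path"])
```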

Step 1: Create a single long-lived agent at startup

from datetime import datetime
from deepagents import create_deep_agent

WORKFLOW_INSTRUCTIONS = """You are a helpful question-answering assistant.

**File Naming Convention**: Each request includes a timestamp in the format
[Request Timestamp: DD-MM-YYYY_HH-MM-SS] at the start of the user message.
You MUST use this exact timestamp for ALL file operations in this request.

- Save user questions to: `/user_questions_{timestamp}.md`
- Write final reports to: `/final_answer_{timestamp}.md`

CRITICAL: Extract the timestamp from [Request Timestamp: ...] and use it
verbatim for every `write_file` and `edit_file` call. Do not generate a
new timestamp."""

agent = create_deep_agent(
    model="anthropic:claude-sonnet-4-6",
    system_prompt=WORKFLOW_INSTRUCTIONS,
    backend=your_backend,
    checkpointer=your_checkpointer,
)

Step 2: Inject the concrete timestamp per request in your controller

from uuid import uuid4

@app.route("/chat", methods=["POST"])
async def chat():
    message = request.json["message"]
    thread_id = request.json.get("thread_id", str(uuid4()))
    timestamp = datetime.now().strftime("%d-%m-%Y_%H-%M-%S")

    # Prepend the concrete timestamp so the agent sees it in context
    user_message = f"[Request Timestamp: {timestamp}]\n\n{message}"

    config = {"configurable": {"thread_id": thread_id}}
    result = []

    async for step in agent.astream(
        {"messages": [{"role": "user", "content": user_message}]},
        stream_mode="values",
        config=config,
    ):
        result.append(step["messages"][-1])

    return result