Unable to parse docstring from OpenAPI schema

When running langgraph dev, I get this error:

2026-04-18T12:49:31.592176Z [warning  ] Unable to parse docstring from OpenAPI schema for route /assistants/{assistant_id} (delete_assistant): mapping values are not allowed here
in "", line 3, column 17:
Query params:
^

Using as description [langgraph_api.utils] api_variant=local_dev docstring='Delete an assistant by ID.\n\n    Query params:\n        delete_threads: If "true", delete all threads where\n            metadata.assistant_id matches this assistant.\n    ' langgraph_api_version=0.8.0 thread_name=MainThread
Traceback (most recent call last):
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/langgraph_api/utils/__init__.py", line 198, in get_schema
parsed = self.parse_docstring(endpoint.func)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/starlette/schemas.py", line 113, in parse_docstring
parsed = yaml.safe_load(docstring)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/__init__.py", line 125, in safe_load
return load(stream, SafeLoader)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/__init__.py", line 81, in load
return loader.get_single_data()
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/constructor.py", line 49, in get_single_data
node = self.get_single_node()
^^^^^^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/composer.py", line 36, in get_single_node
document = self.compose_document()
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/composer.py", line 58, in compose_document
self.get_event()
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/parser.py", line 118, in get_event
self.current_event = self.state()
^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/parser.py", line 193, in parse_document_end
token = self.peek_token()
^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/scanner.py", line 129, in peek_token
self.fetch_more_tokens()
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/scanner.py", line 223, in fetch_more_tokens
return self.fetch_value()
^^^^^^^^^^^^^^^^^^
File "/Users/alex/projects/workspace/2026-3006-project/src/.venv/lib/python3.12/site-packages/yaml/scanner.py", line 577, in fetch_value
raise ScannerError(None, None,
yaml.scanner.ScannerError: mapping values are not allowed here
in "", line 3, column 17:
Query params:
^
2026-04-18T12:49:31.597092Z [info     ] Starting In-Memory runtime with langgraph-api=0.8.0 and in-memory runtime=0.27.4 [langgraph_runtime_inmem.lifespan] api_variant=local_dev langgraph_api_version=0.8.0 langgraph_runtime_inmem_version=0.27.4 thread_name=ThreadPoolExecutor-1_0 version=0.8.0
2026-04-18T12:49:31.731444Z

I traced it back to these lines in the file langgraph_api/api/assistants.py:501:

Query params:
        delete_threads: If "true", delete all threads where
            metadata.assistant_id matches this assistant.
  

If I remove the colons after "Query params" and "delete_threads", it works. It looks like Starlette or PyYAML is choking while trying to parse this (as far as I can tell from the error message).
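The failure is reproducible with PyYAML alone. A minimal sketch, with the docstring text reconstructed from the warning log above:

```python
import yaml

# Docstring text reconstructed from the warning log above.
docstring = (
    "Delete an assistant by ID.\n"
    "\n"
    "    Query params:\n"
    '        delete_threads: If "true", delete all threads where\n'
    "            metadata.assistant_id matches this assistant.\n"
)

error = None
try:
    yaml.safe_load(docstring)
except yaml.error.YAMLError as exc:
    error = exc

# The first line is a plain scalar, so the indented "Query params:"
# mapping on line 3 is illegal YAML.
print(error)  # mapping values are not allowed here ...
```

So the docstring itself is invalid YAML; removing the colons just makes the whole thing a plain scalar again.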

I tried to reproduce it with a fresh project (using langgraph new) but that one worked fine.

hi @alexk

what I can see in the source code:

  • langgraph-api catches the YAML ScannerError in utils/__init__.py and falls back to using the raw docstring as the OpenAPI description
  • root cause: Starlette’s BaseSchemaGenerator.parse_docstring() runs yaml.safe_load() on the whole docstring (after any --- split). delete_assistant in langgraph_api/api/assistants.py mixes a plain-scalar first line with a YAML-mapping-looking Query params: block and has no --- separator, which PyYAML rejects
  • not project-dependent: I verified the identical buggy docstring exists in langgraph-api 0.6.39, 0.7.100, and 0.8.0. The schema is built in update_openapi_spec at server startup, so every langgraph dev hits it. The “fresh project doesn’t show it” observation is almost certainly log filtering, not a code difference.
  • fix options:

Ranked from least to most invasive.

1. Ignore it. The spec works, the endpoint works, Studio works. This is the honest recommendation.

2. Filter the warning. Add a structlog/logging filter that drops records matching the "Unable to parse docstring from OpenAPI schema" event. For example, as a logging.Filter attached in your entrypoint module that langgraph dev imports:

import logging

class _DropOpenAPIDocstringWarn(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        return "Unable to parse docstring from OpenAPI schema" not in record.getMessage()

# Attach to the logger that emits the warning. A logger-level filter only
# runs on records created by that logger, so adding it to the root logger
# would not drop records propagated up from langgraph_api.utils.
logging.getLogger("langgraph_api.utils").addFilter(_DropOpenAPIDocstringWarn())

Because langgraph-api uses structlog.stdlib.get_logger (see langgraph_api/utils/__init__.py), stdlib filters work.
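One caveat: a filter attached to a Logger only runs on records created by that logger, not on records propagated from child loggers, so attach it at the emitting logger (or at a handler) rather than the root. A quick stdlib-only sanity check (the logger name here just mimics the real one):

```python
import io
import logging

class _DropOpenAPIDocstringWarn(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        return "Unable to parse docstring from OpenAPI schema" not in record.getMessage()

buf = io.StringIO()
log = logging.getLogger("langgraph_api.utils")  # stand-in for the real logger
log.addHandler(logging.StreamHandler(buf))
log.propagate = False
log.addFilter(_DropOpenAPIDocstringWarn())

log.warning("Unable to parse docstring from OpenAPI schema for route /assistants/{assistant_id}")
log.warning("some unrelated warning")

print(buf.getvalue())  # only the unrelated warning survives
```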

3. Monkey-patch the docstring before server startup. This gets rid of the warning and the fallback - the route will have a correctly parsed OpenAPI description. Put it in a module langgraph dev imports before routes are registered (e.g. the top of your graph module):

from langgraph_api.api import assistants

if assistants.delete_assistant.__doc__ and "---" not in assistants.delete_assistant.__doc__:
    assistants.delete_assistant.__doc__ = (
        "Delete an assistant by ID.\n"
        "\n"
        "---\n"  # Starlette splits on '---' and parses only what follows as YAML.
        "description: |\n"
        "  Delete an assistant by ID. `delete_threads=true` also deletes all threads\n"
        "  whose metadata.assistant_id matches this assistant.\n"
    )

The --- separator is exactly the mechanism Starlette’s parse_docstring provides for mixing prose and OpenAPI YAML - adding it is the right long-term fix, and langgraph-api should probably do this upstream.
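To sanity-check the patched docstring without spinning up a server, here is a PyYAML-only sketch that mirrors Starlette's split-on---- behavior (the real logic lives in starlette/schemas.py; this mirror is only for illustration):

```python
import yaml

def parse_docstring(docstring: str) -> dict:
    # Mirrors Starlette's BaseSchemaGenerator.parse_docstring: everything
    # before the last '---' is treated as prose; only the final section
    # is parsed as YAML, and non-mapping results are discarded.
    parsed = yaml.safe_load(docstring.split("---")[-1])
    return parsed if isinstance(parsed, dict) else {}

patched = (
    "Delete an assistant by ID.\n"
    "\n"
    "---\n"
    "description: |\n"
    "  Delete an assistant by ID. `delete_threads=true` also deletes all\n"
    "  threads whose metadata.assistant_id matches this assistant.\n"
)

schema = parse_docstring(patched)
print(schema["description"])
```

The prose before the --- is ignored by the schema generator, so the first line stays a human-readable summary while the YAML section becomes the OpenAPI description.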

4. File / track the upstream bug. The proper fix belongs in langgraph-api: either add --- markers to these docstrings or stop using YAML-in-docstrings entirely (e.g. use docstring-parser or a decorator like FastAPI’s response model hooks). If you open an issue, link this thread and point at langgraph_api/api/assistants.py::delete_assistant.

Thanks a lot for confirming this, I was wondering if it was my specific setup, because when I run langgraph new and create a new deep agent app, it starts just fine with exactly the same package versions as the app that's causing trouble.

I just fixed it locally; I guess I'll not submit a fix for now as I don't really know how to do it correctly. You don't happen to know where I can submit this as an issue? I couldn't find a dedicated langgraph-api repo on GitHub yet.