ChatOpenAI with HF Inference API endpoint no longer working!

Hello,

I’ve been using the ChatOpenAI class to access Hugging Face models, but it seems to have stopped working. Has anyone run into the same issue?

Many thanks,
Alexandre

import os

from pydantic import SecretStr
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="meta-llama/Llama-3.3-70B-Instruct",
    api_key=SecretStr(os.environ['HUGGINGFACEHUB_API_TOKEN']),
    base_url="https://api-inference.huggingface.co/models/meta-llama/Llama-3.3-70B-Instruct/v1/"
)

llm.invoke(
    [
        HumanMessage("Hello"),
    ]
)

Error

NotFoundError('Not Found')

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 765, in generate
    self._generate_with_cache(
  File "/usr/local/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1000, in _generate_with_cache
    for chunk in self._stream(messages, stop=stop, **kwargs):
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 668, in _stream
    response = self.client.create(**payload)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 929, in create
    return self._post(
           ^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1276, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 949, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1057, in _request
    raise self._make_status_error_from_response(err.response) from None

openai.NotFoundError: Not Found

The issue wasn’t with ChatOpenAI but with the HF inference endpoint, which has changed to “https://router.huggingface.co/v1”. It’s working again.
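For anyone hitting the same NotFoundError, here is the snippet updated to point at the new router endpoint. This is a minimal sketch assuming the same model is served through the router and that the token in HUGGINGFACEHUB_API_TOKEN is valid for it:

import os

from pydantic import SecretStr
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

# Same configuration as before; only base_url changes to the new
# OpenAI-compatible Hugging Face router, which no longer takes a
# per-model path in the URL (the model is selected via `model`).
llm = ChatOpenAI(
    model="meta-llama/Llama-3.3-70B-Instruct",
    api_key=SecretStr(os.environ["HUGGINGFACEHUB_API_TOKEN"]),
    base_url="https://router.huggingface.co/v1",
)

response = llm.invoke([HumanMessage("Hello")])
print(response.content)

Note that with the old per-model URLs, a removed or renamed endpoint surfaced as the 404 above; the router URL stays stable regardless of which model you request.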