I was trying to test GPT-5 in the Playground when the request failed with a 400 error because top_p is an unsupported parameter for this model.
Error message:

```
BadRequestError('Error code: 400 - {\'error\': {\'message\': "Unsupported parameter: \'top_p\' is not supported with this model.", \'type\': \'invalid_request_error\', \'param\': \'top_p\', \'code\': \'unsupported_parameter\'}}')
Traceback (most recent call last):
  File "/usr/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2383, in _atransform_stream_with_config
  File "/usr/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3446, in _atransform
  File "/usr/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5721, in atransform
  File "/usr/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1578, in atransform
  File "/usr/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 645, in astream
  File "/usr/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 2822, in _astream
  File "/usr/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 1370, in _astream
  File "/usr/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 2603, in create
  File "/usr/lib/python3.11/site-packages/openai/_base_client.py", line 1794, in post
  File "/usr/lib/python3.11/site-packages/openai/_base_client.py", line 1594, in request
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'top_p' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'top_p', 'code': 'unsupported_parameter'}}
```
I saw another forum post describing a similar error caused by temperature being unsupported by GPT-5. Is there a way to disable all of GPT-5's unsupported parameters so that the model can be used in LangSmith?
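In case it clarifies what I'm asking for: a plain-Python sketch of the kind of workaround I'm imagining, where the sampling parameters are stripped from the request before it is sent. The `strip_unsupported` helper and the exact set of rejected parameters are hypothetical; the set below is based only on the top_p error above and the temperature report from the other forum post.

```python
def strip_unsupported(params: dict, unsupported: set) -> dict:
    """Return a copy of params with keys the target model rejects removed."""
    return {k: v for k, v in params.items() if k not in unsupported}

# Hypothetical request payload; only top_p and temperature are known-rejected
# from the error above and the other forum post.
request = {"model": "gpt-5", "top_p": 0.9, "temperature": 0.7, "max_tokens": 256}
cleaned = strip_unsupported(request, {"top_p", "temperature"})
print(cleaned)  # {'model': 'gpt-5', 'max_tokens': 256}
```

I've also seen a `disabled_params` option mentioned for langchain_openai's `ChatOpenAI` (e.g. `disabled_params={"top_p": None}`), but I haven't confirmed whether it covers this case or is available in the version LangSmith uses.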
