I’m trying to call the embeddings API via Azure AI Foundry, but the request fails with the following error:
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File ".../__main__.py", line 14, in <module>
    result = azure.embeddings.create(
  File ".../.venv/lib/python3.10/site-packages/openai/resources/embeddings.py", line 128, in create
    return self._post(
  File ".../.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1239, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File ".../.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1034, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Unsupported data type
To Reproduce
from openai import AzureOpenAI

LLM_ENDPOINT = "https://xxx.openai.azure.com/openai/deployments/o3-mini/chat/completions?api-version=2025-02-01-preview"
LLM_API_VERSION = "2025-02-01-preview"
LLM_API_KEY = "..."

azure = AzureOpenAI(
    azure_endpoint=LLM_ENDPOINT,
    api_key=LLM_API_KEY,
    api_version=LLM_API_VERSION,
)

result = azure.embeddings.create(
    model="o3-mini",
    input="test",
    encoding_format="float",
)
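Two things in the repro stand out (both are my reading of the failure, since the server only reports "Unsupported data type"): `azure_endpoint` is being given the full chat-completions URL rather than the resource base URL, and `o3-mini` is a chat/reasoning deployment, not an embeddings model, so it cannot serve `embeddings.create`. A minimal sketch of deriving the base endpoint the SDK expects from the URL in the report:

```python
from urllib.parse import urlparse

# Full chat-completions URL from the report (resource name redacted).
LLM_ENDPOINT = "https://xxx.openai.azure.com/openai/deployments/o3-mini/chat/completions?api-version=2025-02-01-preview"

# AzureOpenAI expects only the resource base URL in azure_endpoint;
# the SDK appends the /openai/deployments/<model>/<operation> path itself.
parsed = urlparse(LLM_ENDPOINT)
base_endpoint = f"{parsed.scheme}://{parsed.netloc}"   # "https://xxx.openai.azure.com"

# The deployment name embedded in the URL, for reference.
deployment = parsed.path.split("/deployments/")[1].split("/")[0]  # "o3-mini"
```

With `base_endpoint` passed as `azure_endpoint`, the `model` argument of `embeddings.create` should then name a deployment of an embeddings model in the same resource (for example a `text-embedding-3-small` deployment; the exact name is whatever was chosen in Azure AI Foundry, which I'm assuming here), not `o3-mini`.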
OS
Pop!_OS 22.04
Python version
Python v3.10
Library version
openai v1.76.2