The code I had been using to generate images with DALL-E 3 via the API no longer works. I also tried the example code from the documentation, which did not work either.
All systems are operational according to OpenAI’s status page. DALL-E 2 works fine, but DALL-E 3 does not. GPT-3.5-Turbo also works fine.
Code (from the documentation; the only additions are the API key, debug logging, and a print statement):
import os
import logging
logging.basicConfig(level=logging.DEBUG)
from openai import OpenAI
client = OpenAI(api_key=os.environ["OAI_KEY"])
response = client.images.generate(
    model="dall-e-3",
    prompt="a white siamese cat",
    size="1024x1024",
    quality="standard",
    n=1,
)
image_url = response.data[0].url
print(image_url)
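In case it helps with debugging, here is a variant I can run with an explicit client timeout, automatic retries disabled, and the connection error caught and printed. This is only a sketch; the 120-second timeout and max_retries=0 are guesses on my part, not from the documentation:
import os
import openai
from openai import OpenAI

# Same request, but with an explicit client timeout and automatic retries
# disabled, so a single failure surfaces immediately instead of being retried.
client = OpenAI(
    api_key=os.environ["OAI_KEY"],
    timeout=120.0,   # the 120-second value is a guess
    max_retries=0,
)

try:
    response = client.images.generate(
        model="dall-e-3",
        prompt="a white siamese cat",
        size="1024x1024",
        quality="standard",
        n=1,
    )
    print(response.data[0].url)
except openai.APIConnectionError as exc:
    # Print the underlying httpx/httpcore exception (e.g. the server disconnect).
    print("Connection error:", repr(exc.__cause__))
except openai.APIStatusError as exc:
    print("API error:", exc.status_code, exc.response.text)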
Console output (from the original script above):
DEBUG:httpx:load_ssl_context verify=True cert=None trust_env=True http2=False
DEBUG:httpx:load_verify_locations cafile='C:\\Users\\Aimjock\\Desktop\\Aimjock Stuff\\Code test newwww\\.code-testing\\Lib\\site-packages\\certifi\\cacert.pem'
DEBUG:openai._base_client:Request options: {'method': 'post', 'url': '/images/generations', 'files': None, 'json_data': {'prompt': 'a white siamese cat', 'model': 'dall-e-3', 'n': 1, 'quality': 'standard', 'size': '1024x1024'}}
DEBUG:httpcore.connection:connect_tcp.started host='api.openai.com' port=443 local_address=None timeout=5.0 socket_options=None
DEBUG:httpcore.connection:connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x000001F735755730>
DEBUG:httpcore.connection:start_tls.started ssl_context=<ssl.SSLContext object at 0x000001F7368E0150> server_hostname='api.openai.com' timeout=5.0
DEBUG:httpcore.connection:start_tls.complete return_value=<httpcore._backends.sync.SyncStream object at 0x000001F736925670>
DEBUG:httpcore.http11:send_request_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_headers.complete
DEBUG:httpcore.http11:send_request_body.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_body.complete
DEBUG:httpcore.http11:receive_response_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:receive_response_headers.failed exception=RemoteProtocolError('Server disconnected without sending a response.')
DEBUG:httpcore.http11:response_closed.started
DEBUG:httpcore.http11:response_closed.complete
INFO:openai._base_client:Retrying request to /images/generations in 0.780869 seconds
DEBUG:openai._base_client:Request options: {'method': 'post', 'url': '/images/generations', 'files': None, 'json_data': {'prompt': 'a white siamese cat', 'model': 'dall-e-3', 'n': 1, 'quality': 'standard', 'size': '1024x1024'}}
DEBUG:httpcore.connection:connect_tcp.started host='api.openai.com' port=443 local_address=None timeout=5.0 socket_options=None
DEBUG:httpcore.connection:connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x000001F7369270E0>
DEBUG:httpcore.connection:start_tls.started ssl_context=<ssl.SSLContext object at 0x000001F7368E0150> server_hostname='api.openai.com' timeout=5.0
DEBUG:httpcore.connection:start_tls.complete return_value=<httpcore._backends.sync.SyncStream object at 0x000001F736927080>
DEBUG:httpcore.http11:send_request_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_headers.complete
DEBUG:httpcore.http11:send_request_body.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_body.complete
DEBUG:httpcore.http11:receive_response_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:receive_response_headers.failed exception=KeyboardInterrupt()
DEBUG:httpcore.http11:response_closed.started
DEBUG:httpcore.http11:response_closed.complete
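To rule out the SDK itself, the same request can also be sent directly with httpx. Again just a sketch: the endpoint and JSON body mirror what the debug log shows the SDK posting, and the 120-second timeout is a guess:
import os
import httpx

# Same request sent directly with httpx, bypassing the openai package,
# to check whether the disconnect also happens outside the SDK.
resp = httpx.post(
    "https://api.openai.com/v1/images/generations",
    headers={
        "Authorization": f"Bearer {os.environ['OAI_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": "dall-e-3",
        "prompt": "a white siamese cat",
        "n": 1,
        "quality": "standard",
        "size": "1024x1024",
    },
    timeout=120.0,  # generous timeout; the value is a guess
)
print(resp.status_code)
print(resp.text)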