Question About Speed of GPT-4 Turbo With Images

Hi! We are sending images to GPT-4 along with a short 0–50 word prompt and asking for a ~30 word response. This is taking around 4 seconds. Is that consistent with what you have been experiencing? Is there any way to speed it up? Does GPT-4 Turbo do this faster? Or does anyone know of alternative models I could use? Thanks in advance!

Send an image URL instead of streaming the base64 encoding of the image.
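
For reference, a minimal sketch of what the URL form looks like with the Python `openai` client; the model name, image URL, and token limit below are placeholders:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-turbo",   # illustrative; use whichever vision-capable model you're on
    max_tokens=60,         # roughly a ~30 word answer
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in about 30 words."},
                # Pass a URL: OpenAI fetches the image itself, so the request body stays small.
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```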

Okay cool! Can I ask why that is faster? Our server receives the image as base64 from an edge device before sending, so do you think it would be faster to create a URL for the image on the spot and send that, rather than just streaming the base64? Something like the sketch below?
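
A hedged sketch of what "making a URL on the spot" could look like: decode the base64 you already have, push it to object storage, and hand OpenAI a short-lived presigned URL. The bucket name, key pattern, and expiry here are hypothetical; any host OpenAI can reach would do:

```python
import base64
import uuid

import boto3

s3 = boto3.client("s3")
BUCKET = "my-inference-images"  # hypothetical bucket

def make_temp_url(image_b64: str) -> str:
    """Store one frame and return a presigned URL OpenAI can fetch for a few minutes."""
    key = f"frames/{uuid.uuid4()}.jpg"
    s3.put_object(Bucket=BUCKET, Key=key, Body=base64.b64decode(image_b64))
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": key},
        ExpiresIn=300,
    )
```

Note that this adds an upload hop on your side plus a fetch by OpenAI on theirs, which is exactly the trade-off the replies below get into.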

My bash script API wrapper also sends the base64 rather than relying on yet another server being online and fast enough for OpenAI to fetch the image from.
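
For comparison, here is the base64 route sketched in Python rather than bash: the image is embedded in the request itself as a data URL, so no extra image host has to be up. The model name and file path are placeholders:

```python
import base64

from openai import OpenAI

client = OpenAI()

with open("frame.jpg", "rb") as f:  # hypothetical local image
    b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4-turbo",
    max_tokens=60,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in about 30 words."},
                # Same request shape as the URL version, just with a data URL instead.
                {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```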

Sending a URL instead of the base64 would only be faster if you have limited upload bandwidth or a slow base64 encoder (base64 adds roughly a third to the payload size). Otherwise, I don’t think sending the URL is any “faster” than sending the base64.

And it doesn’t sound like limited upload bandwidth is your issue, either.
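
For a sense of scale, the base64 overhead is easy to check; it inflates the raw bytes by about a third, which only matters on a slow upstream link:

```python
import base64
import os

raw = os.urandom(500_000)        # stand-in for a ~500 KB JPEG
encoded = base64.b64encode(raw)
print(len(encoded) / len(raw))   # ≈ 1.33
```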