GPT-3.5 Turbo Assistant Trimming DALL-E URLs

Working on a project using the new Assistants API to call a function that generates images via DALL-E. I have found that if I use GPT-4, the URL is returned by the assistant without a problem, but if I use GPT-3.5 it always trims everything after the file extension, breaking the URL.
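For context, the function being called looks roughly like this (a simplified sketch using the official `openai` Python SDK; the function name is just illustrative):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_image(prompt: str) -> str:
    """Generate an image with DALL-E and return its temporary hosted URL."""
    result = client.images.generate(
        model="dall-e-3",
        prompt=prompt,
        n=1,
        size="1024x1024",
    )
    # The hosted URL carries a long signed query string after the file
    # extension -- that trailing part is what GPT-3.5 keeps trimming off.
    return result.data[0].url
```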

Does anyone have ideas for a workaround? It seems excessive to use GPT-4 tokens just to get the URL back intact.

I tried a few different prompts and function descriptions telling it not to trim the URLs, but so far it doesn't seem to make a difference.

Welcome to the forum.

What have you tried so far? It might be a bug; I'll ask around. More details would be helpful. Thanks!

What you send back to the AI as the function return value can be something like: "DALL-E image successfully displayed in user interface, with download link for user".

Then actually do what you told the AI you did…
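In code, that looks roughly like this (Assistants API via the Python SDK; `generate_image`, the `args["prompt"]` argument name, and `show_in_ui` are placeholder pieces you would replace with your own):

```python
import json
from openai import OpenAI

client = OpenAI()

# `thread` and `run` come from the usual Assistants run-polling loop;
# when run.status == "requires_action", handle the tool call yourself:
tool_call = run.required_action.submit_tool_outputs.tool_calls[0]
args = json.loads(tool_call.function.arguments)

image_url = generate_image(args["prompt"])  # your own DALL-E call
show_in_ui(image_url)                       # placeholder: display/link the image yourself

client.beta.threads.runs.submit_tool_outputs(
    thread_id=thread.id,
    run_id=run.id,
    tool_outputs=[{
        "tool_call_id": tool_call.id,
        # No URL is handed back to GPT-3.5 at all:
        "output": "DALL-E image successfully displayed in user interface, "
                  "with download link for user",
    }],
)
```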


Yes, good idea. This worked as a workaround. It would be nice to handle it natively as with GPT-4, but for now this does the trick. I just had to pass some extra parameters in so my code could message the user directly instead.

Edit: If I provide that response, it still generates responses with random fake image URLs, which is interesting. I might even have to drop/ignore its response entirely so it doesn't confuse the user…


You can append to my return value example something like: "response to user 'I generated an image for you, hope you like it!'"


It was still hallucinating URLs or mount paths with that addition, but based on your suggestion I found that if I use:

‘DALL-E image successfully generated. Response to user “I generated an image for you, I hope you like it! {url}”’

it will respond with the suggested message and the full URL.
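In other words, the tool output I submit now is built like this (sketch; `image_url`, `thread`, `run`, and `tool_call` as in the run-handling code above):

```python
output = (
    "DALL-E image successfully generated. "
    f'Response to user "I generated an image for you, I hope you like it! {image_url}"'
)

client.beta.threads.runs.submit_tool_outputs(
    thread_id=thread.id,
    run_id=run.id,
    tool_outputs=[{"tool_call_id": tool_call.id, "output": output}],
)
```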


Excellent! I suspect that once you get it repeating back tokens from its context, it becomes a more reliable parrot to then continue outputting the provided URL.


How are you able to get a DALL-E image link from Assistants? My understanding was that it wasn’t supported in the Assistants API.

The answer is buried deep in the original question:

An assistant can make a function call back to your code. You write the specification for a function the AI can use, and you can satisfy that call however you want. The only thing that isn't possible is giving the language AI things that aren't language.
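A minimal sketch of such a function specification (Python SDK; the name and schema here are just illustrative, not a fixed API):

```python
from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    model="gpt-3.5-turbo",
    instructions="You can generate images for the user via the generate_image tool.",
    tools=[{
        "type": "function",
        "function": {
            "name": "generate_image",
            "description": "Generate an image with DALL-E from a text prompt.",
            "parameters": {
                "type": "object",
                "properties": {
                    "prompt": {
                        "type": "string",
                        "description": "Description of the image to generate.",
                    },
                },
                "required": ["prompt"],
            },
        },
    }],
)
```

When the run pauses with `requires_action`, your code calls DALL-E (or anything else) and submits whatever text you like as the tool output.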


Ah, that makes sense! I will look more into how to use functions. Haven't really tried it yet. Thank you very much! 🙂