Suggestion - support swapping models during a conversation

Currently I need to pick whether I want GPT-4 or GPT-3.5-turbo. For many cases 3.5 is fine, but sometimes the additional reasoning power of 4 is what makes all the difference.

I’d like to be able to have a conversation where I can:

  • Swap the model for future responses
  • Regenerate a response with a different model

The API is identical for the two models, so it feels like a UI along these lines should be fine:

[set to gpt-3.5-turbo]

me: prompt

gpt response

[swap model over to 4]

me: prompt

gpt response

[swap model back to 3.5]

me: prompt

gpt response

[regenerate with 4]
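To make the "identical API" point concrete, here's a rough sketch of what a swap amounts to on the client side. It assumes the OpenAI Python SDK; the `ask` helper and the example prompts are made up, but the idea is that the `messages` history is model-agnostic and only the `model` string changes between requests:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# One shared conversation history; the same list is valid for either model.
messages = [
    {"role": "user", "content": "Draft a short summary of this clause..."},
]

def ask(model: str) -> str:
    """Send the current history to whichever model is currently selected."""
    response = client.chat.completions.create(model=model, messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    return reply

# Start on the cheaper, faster model.
ask("gpt-3.5-turbo")

# "Regenerate with a different model": drop the last assistant turn and resend.
messages.pop()
ask("gpt-4")

# "Swap the model for future responses": just pass a different model string
# on the next turn; the history carries over untouched.
messages.append({"role": "user", "content": "Now walk through the edge cases."})
ask("gpt-3.5-turbo")
```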

I think this would be a reasonable change and could lower the load on GPT-4, since people like me could use gpt-3.5 for the speed and only swap "up" when necessary. Right now I often pick GPT-4 as my default just because I can't rerun a whole conversation once I find out that gpt-3.5 doesn't quite get it.
