How do I use new 16k Context model?

Hi! I’m not a coder and I’m struggling to find a way to use the new models in an environment similar to the Playground. Could someone share a dead-simple tool or way to interact with these models, chat-style?

You can try my tool: https://promptknit.com. It supports the new 16k models. Create a new prompt with the type Chat, and there you go. Let me know if you have any problems. Knit is free for everyone.


Welcome to the forum!

Simply change `model = 'gpt-3.5-turbo'` to `model = 'gpt-3.5-turbo-16k'`
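For anyone scripting this outside the Playground, here is a minimal sketch of what that change looks like. The request body is identical to the one for `gpt-3.5-turbo`; only the model name changes, and the 16k variant allows a larger combined prompt + reply. (The helper function and the `max_tokens` value are my own illustration, not an official snippet; pass the resulting dict to your OpenAI client's chat completions call.)

```python
import json

def build_chat_request(messages, max_tokens=4000):
    """Build the JSON body for a chat completions request (illustrative helper)."""
    return {
        "model": "gpt-3.5-turbo-16k",  # was "gpt-3.5-turbo"
        "messages": messages,
        # max_tokens caps the reply length; prompt + reply together
        # must fit within the model's ~16k-token context window.
        "max_tokens": max_tokens,
    }

body = build_chat_request([{"role": "user", "content": "Hello!"}])
print(json.dumps(body, indent=2))
```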

I do, but the Maximum length is still 2048. I think it’s limited in the Playground.

That’s the reply length, and yes, the Playground caps that setting at 2k. A few things there could use an update for the latest models: a function entry block, for example, plus a few other tweaks. A finish_reason flag display would also be nice :smiley:
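Until the Playground displays it, you can read `finish_reason` yourself from the API response. A hedged sketch, assuming the standard chat completions response shape (the `example_response` below is a hand-made stand-in, not real API output): `"stop"` means the model finished naturally, while `"length"` means the reply was cut off by the token limit.

```python
def check_finish(response: dict) -> str:
    """Return the finish_reason of the first choice and warn on truncation."""
    choice = response["choices"][0]
    reason = choice["finish_reason"]
    if reason == "length":
        # The reply hit max_tokens: raise it or shorten the prompt.
        print("Reply was truncated by the token limit.")
    return reason

# Hand-made example response for illustration only.
example_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hi!"},
         "finish_reason": "stop"}
    ]
}
print(check_finish(example_response))
```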


@myroslav.galavai @Foxalabs Knit supports longer reply lengths (it’s hard to get a response over 2000 tokens, though; it depends on your prompt) and displays function calls.

The finish_reason flag is also interesting; I think I’ll support it.
