GPT-4 API Alternatives Available?

As many of us eagerly wait for access to the GPT-4 API, our projects are on hold. In my case, I really, really need the higher token limits of the new model, which got me thinking: “Is there an alternative?”

I keep hearing about all these ChatGPT alternatives, but I can’t seem to narrow down which of them have APIs, and which, if any, have token limits higher than the current 4K on gpt-3.5-turbo.

I am curious if anyone has looked into this and has any suggestions. I believe GPT-4 will still be the best option, but if I had something else to work with while I wait for access, that would be great!
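For context, I’m measuring my prompts with OpenAI’s tiktoken library, so I know I’m genuinely bumping into the 4K ceiling. A minimal sketch of what I’m doing (remember the limit covers the prompt and the completion combined):

```python
import tiktoken

# Get the tokenizer used by gpt-3.5-turbo (cl100k_base).
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

prompt = "How many tokens does this prompt use?"
tokens = enc.encode(prompt)

# The 4K limit covers the prompt *and* the completion combined.
print(f"{len(tokens)} tokens used out of ~4096")
```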


AFAIK, Vicuna-13B has been rated the highest-performing of them all, except that GPT-4 still remains king of the hill, apparently miles ahead of Vicuna.

But alas, I have no idea how many tokens Vicuna can handle.

I was also looking into that, but it won’t be long before we have other options.

It appears that Vicuna has a 2K token limit; that’s what I found by googling. When I tried it on Hugging Face, I hit an even smaller limit of 1K tokens.
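If you want to check for yourself rather than trusting search results, the trained context window usually shows up in the model config on Hugging Face. A minimal sketch using transformers (the repo id below is just an assumption; swap in whichever Vicuna weights you’re actually using):

```python
from transformers import AutoConfig

# Repo id is an assumption -- use whichever Vicuna weights you have.
config = AutoConfig.from_pretrained("lmsys/vicuna-13b-delta-v1.1")

# LLaMA-based models report their trained context window here.
print(config.max_position_embeddings)  # expect 2048 for LLaMA/Vicuna
```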

There is a brand-new (as of yesterday) model on Hugging Face:
MPT-7B-Instruct

Haven’t tried it yet, but it claims to be able to handle up to 65K tokens at run time, even though it was trained with a 2K token window.

Per the model card, it has been modified from a standard transformer in a few ways, most notably by using ALiBi (Attention with Linear Biases) in place of standard positional embeddings.
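A minimal sketch of loading it with a longer window, based on my reading of the model card (the max_seq_len field and the trust_remote_code requirement come from the card; double-check against the current version before relying on this):

```python
import transformers

# MPT ships custom model code, so trust_remote_code is required.
config = transformers.AutoConfig.from_pretrained(
    "mosaicml/mpt-7b-instruct", trust_remote_code=True
)
# Trained with a 2K window; ALiBi lets you raise it at load time.
config.max_seq_len = 4096  # the card claims up to ~65K is possible

model = transformers.AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b-instruct", config=config, trust_remote_code=True
)
```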


BTW, ALiBi is the technique it uses to handle longer context at run time. Based on a quick scan of the paper abstract, it looks like performance will likely degrade quickly as you go beyond the training context size. YMMV.
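For the curious, the core idea is simple: instead of positional embeddings, each attention head adds a fixed linear penalty to attention scores based on how far back the key is, with per-head slopes forming a geometric sequence. A rough numpy sketch of the bias computation as I understand it from the paper (illustrative only, not MPT’s actual implementation):

```python
import numpy as np

def alibi_bias(num_heads: int, seq_len: int) -> np.ndarray:
    """Per-head linear distance penalties added to attention scores."""
    # Paper's slopes for a power-of-two head count: 2^(-8/n), 2^(-16/n), ...
    slopes = np.array([2 ** (-8 * (i + 1) / num_heads) for i in range(num_heads)])
    # Signed offset of each key position relative to each query position.
    positions = np.arange(seq_len)
    distance = positions[None, :] - positions[:, None]  # key - query
    distance = np.minimum(distance, 0)                  # causal: only look backward
    # bias[h, q, k] = slope_h * (k - q): more negative for farther-back keys
    return slopes[:, None, None] * distance[None, :, :]

bias = alibi_bias(num_heads=8, seq_len=5)
print(bias[0])  # head 0: zeros on the diagonal, increasingly negative to the left
```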

It’s good and free, and you don’t have to beg for access like you do with OpenAI.