As many of us eagerly wait for access to the GPT-4 API, our projects are on hold. In my case, I really need the higher token limits of the new model, which got me thinking: "Is there an alternative?"
I keep hearing about all these ChatGPT alternatives, but I can't seem to narrow down which of them have APIs, and which, if any, have token limits higher than the current 4K on gpt-3.5-turbo.
I am curious whether anyone has looked into this and has any suggestions. I believe GPT-4 will still be the best option, but having something else to work with while I wait for access would be great!
btw, ALiBi (Attention with Linear Biases) is the technique it uses to handle longer context at run time. Based on a quick scan of the paper's abstract, performance will likely degrade quickly as you go beyond the training context size, YMMV.
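For anyone curious what ALiBi actually does, here's a minimal PyTorch sketch based on my reading of the paper (Press et al., 2021). The function names are mine, not from any library, and this simple slope formula assumes the head count is a power of two:

```python
import torch

def alibi_slopes(num_heads: int) -> torch.Tensor:
    # Head-specific slopes from the ALiBi paper: a geometric sequence
    # starting at 2^(-8/num_heads). Assumes num_heads is a power of two.
    start = 2 ** (-8.0 / num_heads)
    return torch.tensor([start ** (i + 1) for i in range(num_heads)])

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    # Static, non-learned bias added to attention scores before softmax:
    # each head penalizes query-key pairs in proportion to their distance,
    # replacing learned position embeddings. Shape: (heads, L, L).
    slopes = alibi_slopes(num_heads)            # (H,)
    pos = torch.arange(seq_len)
    distance = pos[None, :] - pos[:, None]      # [i, j] = j - i (<= 0 in causal part)
    distance = distance.tril()                  # future positions stay 0; the causal
                                                # mask handles them anyway
    return slopes[:, None, None] * distance[None, :, :]

# Usage sketch: given raw scores of shape (batch, heads, L, L),
#   scores = q @ k.transpose(-1, -2) / head_dim ** 0.5
#   scores = scores + alibi_bias(num_heads, L)
# then apply the causal mask and softmax as usual.
```

The key point is that the bias depends only on query-key distance, never on absolute position, so the model can accept sequences longer than anything it saw in training. But nothing guarantees the attention pattern stays well-behaved out there, which matches the degradation the abstract warns about.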