Hey! I know that OpenAI imposes a limit on the number of output tokens per request.
But Copy.ai has been generating long blog sections, blog paragraphs, and full articles that are well above the normal output token count.
How is that achievable? Can we ask OpenAI's support team to increase our output token limit?
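One common workaround (and quite possibly what tools like Copy.ai do, though that's an assumption, not something they've confirmed) is to chain several limited completions together: each request gets the tail of the text generated so far as context and asks the model to continue. A minimal sketch in Python, where `call_model` is a hypothetical stand-in for a single API call, stubbed out here so the example is self-contained:

```python
def call_model(prompt: str, max_tokens: int) -> str:
    """Placeholder for one real completion request (capped at max_tokens).
    Stubbed with canned filler text so this sketch runs without an API key."""
    return " lorem ipsum" * (max_tokens // 2)

def generate_long_text(topic: str, target_words: int, per_request_cap: int = 50) -> str:
    """Stitch several capped completions into one long article.
    Each request only sees the tail of the text so far, which also keeps
    the prompt itself under the model's input limit."""
    article = ""
    while len(article.split()) < target_words:
        context = " ".join(article.split()[-200:])  # last ~200 words as context
        prompt = f"Continue this article about {topic}:\n{context}"
        chunk = call_model(prompt, max_tokens=per_request_cap)
        if not chunk.strip():
            break  # model produced nothing more to add
        article += chunk
    return article.strip()

long_article = generate_long_text("remote work", target_words=300)
print(len(long_article.split()))  # → 300, six 50-word chunks stitched together
```

So no single request ever exceeds the per-request limit; the length comes from the orchestration layer, not from a raised cap.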
Hey! Thank you for the response.
Well, I was referring to the use-case guidelines that OpenAI has published for going live. Don't you think we need to follow those guidelines?
Don't assume that all of these AI companies are exclusively using OpenAI's technology at this point. There are multiple AI providers in the market that are less restrictive, and you will find that some of these companies use other technology alongside OpenAI's.
I didn't know that. That explains how Copy.ai can provide an Instagram caption generator when that use case was clearly not allowed under OpenAI's guidelines.