If a developer follows COPPA guidelines, could they use the OpenAI Chat APIs to build an app for kids aged 5-10 years old?
Kids would be allowed access only with parental consent, the way PinwheelGPT did it?
Giving young children unsupervised, direct access to chat with GPT is probably a bad idea, since it will hallucinate facts and may convince a very impressionable child of a bunch of incorrect information (which could be very dangerous if it's something chemical that they might try themselves).
Is that what you're mainly thinking about, direct access to the chatbot?
A buddy for kids: someone they could learn from, ask all sorts of questions, make up stories with, etc.
It would be better than a general Google search. There may be hallucinations, but they would not be exposed to explicit visual content.
If you 100% control the input, so they can't type anything they want but instead select from choices, it might be alright (see the sketch below). With direct access, though, there is a possibility that the bot could convince the child to do something dangerous, or feed them misinformation that could be dangerous, e.g. if the child asks how to behave in traffic, or whether you can mix soap with soda to make it drinkable, etc.
Up to about age 14-15 or so, I think, you have a very, very impressionable person who doesn't have the critical thinking skills or experience to make these determinations~
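For example, a choice-only flow could look something like this minimal Python sketch. The prompt templates, the topic list, and the model name are all my own assumptions; the only OpenAI call used is the standard chat completions endpoint.

```python
# Minimal sketch of a "choice-only" interface: the child never types free text;
# they pick from vetted menu items, and the app fills in a pre-written prompt.
# Templates, topics, and model name are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Pre-written, parent-reviewable prompt templates keyed by a menu choice.
CHOICES = {
    "animal_fact": "Tell a 5-10 year old one short, friendly fact about {topic}.",
    "bedtime_story": "Tell a gentle two-paragraph bedtime story about {topic}.",
}

ALLOWED_TOPICS = {"dinosaurs", "space", "the ocean"}  # also a fixed menu


def ask(choice_key: str, topic: str) -> str:
    # Reject anything outside the fixed menus instead of passing it to the model.
    if choice_key not in CHOICES or topic not in ALLOWED_TOPICS:
        raise ValueError("Not an allowed selection")
    prompt = CHOICES[choice_key].format(topic=topic)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; use whatever you actually deploy
        messages=[
            {
                "role": "system",
                "content": "You are a kind assistant for young children. "
                           "Keep answers short, factual, and age-appropriate.",
            },
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content


print(ask("animal_fact", "dinosaurs"))
```

The point of the design is that the model never sees arbitrary child input, only prompts an adult has reviewed in advance.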
I also believe it’s in their terms of service that their services should not be used by someone under the age of 13 (?)
Regarding the terms, I was looking for more clarification, as they worked with pinwheel.com/gpt, which offers a similar solution.
Was that an exception?
Maybe I’m misinterpreting the terms of use?
“You must be at least 13 years old to use the Services. If you are under 18 you must have your parent or legal guardian’s permission to use the Services. If you use the Services on behalf of another person or entity, you must have the authority to accept the Terms on their behalf.” Or do they perhaps have an agreement/exception?
If you put a lot of engineering into double-checking every part of the conversation with a really well-prompted “safety agent” (rough sketch below), and keeping the parents fully informed, you can probably make a good product for younger people… but I would personally be very scared of the liabilities while we still have so little control over these models~
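As a rough illustration of that safety-agent idea, here is a hedged Python sketch that screens every candidate reply twice (the Moderations endpoint plus a second, strictly prompted reviewer model) before the child sees it, and logs everything for the parent. The reviewer prompt, model choice, and fallback message are my own assumptions, not anything OpenAI prescribes.

```python
# Sketch of a two-gate "safety agent": moderation endpoint + reviewer model,
# with a parent-visible log of every candidate reply. All prompts and
# thresholds here are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()

REVIEWER_PROMPT = (
    "You review text intended for children aged 5-10. "
    "Reply with exactly SAFE or UNSAFE. Mark UNSAFE if it contains instructions "
    "that could cause physical harm, adult themes, or confident claims that "
    "look factually wrong."
)


def screen_reply(candidate: str) -> bool:
    # Gate 1: OpenAI's moderation endpoint.
    mod = client.moderations.create(input=candidate)
    if mod.results[0].flagged:
        return False
    # Gate 2: a separately prompted reviewer model.
    verdict = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed; use a model you trust for review
        messages=[
            {"role": "system", "content": REVIEWER_PROMPT},
            {"role": "user", "content": candidate},
        ],
    )
    return verdict.choices[0].message.content.strip().upper().startswith("SAFE")


def deliver_to_child(candidate: str, parent_log: list[str]) -> str:
    parent_log.append(candidate)  # keep parents fully informed, e.g. a daily digest
    if screen_reply(candidate):
        return candidate
    return "Let's ask a grown-up about that one together!"
```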
Using ChatGPT, the service, is not recommended for children under 13.
Using the API, is something you the developer do, not the child. If you the developer are building an application, it is up to you the developer to support an appropriate age group. If, as part of your application, you happen to use OpenAI for inference, you can do that, but if it blows up in your face, that’s on you, not on OpenAI.
This is no different from buying, say, wheat flour on the open market: maybe you use it to make coffee pastries, or maybe you use it to make infant formula; it's really up to you to put the appropriate safety protocols in place for your target demographic.
That being said, does OpenAI “recommend” using their services for children under 13? Not that I know. Proceed with caution.
Thank you @eslof.github @jwatte for the insights.
I was under the impression that you had to propagate these terms of use if you're using the OpenAI API to power a chatbot, e.g. if I let people under 13 use it through my application that uses the GPT API, would I then be breaking the terms of use?
The reason for the 13-year-old requirement is to fit in with legal standards for the Children’s Online Privacy Protection Act. I understand hospitals and other children’s websites have ways to work within the act without breaking the law, but it is always safer to just stick with 13.
The best thing to do would be to reach out to OpenAI directly; this will require discussion between yourselves, OpenAI, and the relevant governmental agencies.
You can use the support bot on help.openai.com (bottom right corner).
As this topic has a selected solution, it has been closed.