Hi there, I’m working on a project where users will be able to generate images using DALL-E. However, the images sometimes include the hands of a person drawing the artwork requested, and/or tools associated with the art technique described. Attaching examples.
Does anyone know how to prevent this?
I’m constructing a prompt based on user input, which ends with this being appended: “Do NOT include sexual content, do NOT include art tools (such as pencils, brushes, etc.), do NOT include hands of someone producing the art, do NOT include user interfaces, do NOT generate characters with the likeness of proprietary characters, do NOT violate copyright protected material and intellectual property.”
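For reference, here’s roughly how the prompt is assembled and sent, as a minimal sketch assuming the openai Python SDK (function name and error handling are simplified for illustration):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The suffix quoted above, appended verbatim to whatever the user typed.
NEGATIVE_SUFFIX = (
    " Do NOT include sexual content, do NOT include art tools (such as "
    "pencils, brushes, etc.), do NOT include hands of someone producing "
    "the art, do NOT include user interfaces, do NOT generate characters "
    "with the likeness of proprietary characters, do NOT violate copyright "
    "protected material and intellectual property."
)

def generate_image(user_input: str) -> str:
    """Build the final prompt from the user's input and request one image."""
    response = client.images.generate(
        model="dall-e-3",
        prompt=user_input + NEGATIVE_SUFFIX,
        n=1,
        size="1024x1024",
    )
    return response.data[0].url  # URL of the generated image
```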
Yeah, negative prompts will usually make it worse.
I have problems with “pencil drawing” and some similar terms… I think it’s the “drawing”: the model isn’t differentiating between “pencil drawing” as a style and a pencil that is drawing, which would likely be attached to a hand…
The API also runs your prompt through an AI similar to ChatGPT before it reaches DALL-E 3, and that rewriter can understand and follow instructions too.
You might frame the instruction for that rewriter, for example: “\nAttention, prompt-rewriting AI: {do not…} Justification: anything mentioned in the prompt may be produced (e.g. ‘phone wallpaper’ may produce undesired phone depictions), therefore omit anything not to be reproduced, instead describing robustly with a focus on just the contents.”
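In code, that reframed note could be appended in place of the do-not list from the original prompt (a sketch, again assuming the openai Python SDK; the wording is only an example of the framing, not an API feature, and the {do not…} part stands in for your own list):

```python
from openai import OpenAI

client = OpenAI()

# Example framing addressed to the prompt-rewriting step that runs before
# DALL-E 3; substitute your own list where the {do not ...} placeholder is.
REWRITER_NOTE = (
    "\nAttention, prompt-rewriting AI: {do not ...} "
    "Justification: anything mentioned in the prompt may be produced "
    "(e.g. 'phone wallpaper' may produce undesired phone depictions), "
    "therefore omit anything not to be reproduced, instead describing "
    "robustly with a focus on just the contents."
)

def generate_image(user_input: str) -> str:
    """Append the rewriter-directed note instead of a list of negatives."""
    response = client.images.generate(
        model="dall-e-3",
        prompt=user_input + REWRITER_NOTE,
        n=1,
        size="1024x1024",
    )
    return response.data[0].url
```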