DALL-E keeps generating the hands of someone drawing and/or art tools

Hi there, I’m working on a project where users will be able to generate images using DALL-E. However, the images sometimes include the hands of a person drawing the artwork requested, and/or tools associated with the art technique described. Attaching examples.

Does anyone know how to prevent this?

I’m constructing a prompt based on user input, which ends with this being appended: “Do NOT include sexual content, do NOT include art tools (such as pencils, brushes, etc.), do NOT include hands of someone producing the art, do NOT include user interfaces, do NOT generate characters with the likeness of proprietary characters, do NOT violate copyright-protected material or intellectual property.”



Do not do negative prompting with DALL-E.
Specify what the generation should include instead of what it shouldn’t. For example, rather than “do not include hands or pencils”, describe the result positively: “a finished pencil-style illustration, shown as flat artwork filling the entire frame.”

The built-in content filter on DALL-E is already fairly robust about not responding with explicit graphic content.

Also make sure to use the user param.
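For example, a minimal sketch using the openai Python SDK (v1.x assumed; the prompt text, size, and user ID here are placeholders, not part of the original post):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Describe only what SHOULD appear; no "do not include ..." clauses.
prompt = (
    "A finished pencil-style illustration of a lighthouse on a cliff at dusk, "
    "presented as flat artwork that fills the entire frame"
)

result = client.images.generate(
    model="dall-e-3",
    prompt=prompt,
    size="1024x1024",
    user="end-user-1234",  # stable identifier for your end user, used for abuse monitoring
)

print(result.data[0].url)
```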


Yeah, negative prompts usually make it worse.

I have problems with “pencil drawing” and some similar prompts… I think it’s the word “drawing”… the model isn’t differentiating between a “pencil drawing” as a style and a pencil that is drawing, which is likely connected to a hand…

Do you have a sample of your full prompt?


Or quite creative.

The API also puts an AI similar to ChatGPT in front of DALL-E 3 to rewrite your prompt. It can understand and follow instructions as well.

You might frame that instruction as “\nattention prompt rewriting AI: {do not…} Justification: anything mentioned in a prompt may be produced (e.g. “phone wallpaper” may produce undesired phone depictions), therefore omit anything not to be depicted, instead describing robustly with a focus on just the contents”, for example.
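A sketch of one way to apply that framing when building the final prompt from user input; the exact wording and the build_prompt helper are illustrative, not a documented API:

```python
# Note addressed to the prompt-rewriting model that sits in front of
# DALL-E 3 in the API (the wording below is just one possible phrasing).
REWRITER_NOTE = (
    "\nattention prompt rewriting AI: do not mention art tools, an artist's hands, "
    "or user interfaces. Justification: anything mentioned in a prompt may be "
    "produced, therefore omit anything not to be depicted and instead describe "
    "the desired contents robustly."
)

def build_prompt(user_input: str) -> str:
    """Combine the user's description with the note to the rewriter."""
    return user_input + REWRITER_NOTE

print(build_prompt("A watercolor-style painting of a koi pond at sunrise"))
```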

