So, GPT-4 not only devised the experiment and wrote the code for the interface, it also successfully controlled a cursor in a GUI to accomplish a goal (clicking the yellow square). For anyone interested in intelligent automation, that's pretty impressive. GPT-4 can also draw in a paint program using cardinal directions and pixel counts, with commands like "Click and drag to the right 10 pixels."
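For anyone wanting to try this, the glue between the model and the GUI can be quite simple. Here is a minimal sketch (my own hypothetical parser, not the setup from the experiment) that turns a command in the "direction + pixel count" style into an (dx, dy) cursor delta, which you could then feed to an automation library:

```python
import re

# Unit vectors for the four cardinal directions (screen coordinates:
# y grows downward, so "up" is negative y).
DIRECTIONS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

def parse_drag(command: str) -> tuple[int, int]:
    """Parse a command like 'Click and drag to the right 10 pixels'
    into a (dx, dy) pixel delta."""
    m = re.search(r"(up|down|left|right)\s+(\d+)\s*pixels?", command.lower())
    if not m:
        raise ValueError(f"unrecognized command: {command!r}")
    ux, uy = DIRECTIONS[m.group(1)]
    n = int(m.group(2))
    return (ux * n, uy * n)

print(parse_drag("Click and drag to the right 10 pixels"))  # (10, 0)
print(parse_drag("drag up 25 pixels"))                      # (0, -25)
```

The resulting delta could be passed to something like PyAutoGUI's relative drag functions to actually move the mouse; the parser above is just one plausible way to bridge the model's text output and the GUI.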