GPT-4 Is Outsmarting Itself with Scripts

I have noticed that GPT-4 sometimes fails at tasks because it tries to write scripts to do work that actually requires its own intelligence.

For example, sorting messy data requires intelligent judgment calls that are hard to encode in a script.

GPT-4 ends up stumped because it is not smart enough to write a script intelligent enough to sort the messy data, even though it is smart enough to sort the messy data itself.
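To make the failure mode concrete, here is a hypothetical sketch (not actual GPT-4 output) of why a generic sort script breaks on messy data: a one-line `sorted()` call chokes on mixed types, and any key function the model writes only handles the cases it anticipated in advance.

```python
# A hypothetical messy dataset: numbers stored as strings, real ints,
# free text, and missing values all mixed together.
messy = ["10", 2, "apple", None, "3"]

# The naive script fails outright: Python 3 refuses to compare str and int.
try:
    sorted(messy)
except TypeError as e:
    print("naive sort failed:", e)

def messy_key(x):
    """A handcrafted key: numeric-looking values first (by value),
    everything else after (by string form). It works on this sample
    but silently misorders any messy case the author didn't foresee
    (dates, currency symbols, '1,000'-style separators, ...)."""
    try:
        return (0, float(x))          # "10" -> (0, 10.0)
    except (TypeError, ValueError):
        return (1, str(x))            # None -> (1, "None")

print(sorted(messy, key=messy_key))   # [2, '3', '10', None, 'apple']
```

The script only succeeds to the extent that its author enumerated the mess in advance, which is exactly the intelligence the model was trying to delegate away.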

I end up having to tell it explicitly NOT to write a script, but to just solve the problem directly.

In essence, its ability to write scripts to solve problems sometimes causes it to outsmart itself.

It needs the ability to step back when a script fails and try another way, or to consider whether perfecting a script would really be more time-effective than just solving the problem "manually".