It does not struggle, you struggle. It is a tool you are using, and it is doing exactly what you're telling it to do. Tools take time to learn, and that's fine. Blaming the tools is counterproductive.
If the code is well documented, at a high level and with inline comments, and if your instructions are clear, it'll figure it out. If it makes a mistake, it's up to you to work out where the communication broke down and how to communicate more clearly and consistently.
"My Toyota Corolla struggles to drive up icy hills."
"It doesn't struggle, you struggle." ???
It's fine to critique your own tools and their strengths and weaknesses. Claiming that any and all failures of AI are an operator skill issue is counterproductive.
But as a heart surgeon, why would you ever consider using a spoon for the job? AI/LLMs are just a tool. Your professional experience should tell you if it is the right tool. This is where industry experience comes in.
As a heart surgeon with a phobia of sharp things, I've found spoons to be great for surgery. If you find them unproductive, it's probably a skill issue on your part.
A tool is something I can tightly control. Something that may or may not work today, might stop working tomorrow when the model gets updated without notice to anyone, and whose output I have to carefully scrutinize anyway is not a tool. It's a toy.