I don't know if the article intended to stoke worry, but I actually embrace this as an opportunity! Users being able to make their own software for their own needs is a worthy and beautiful goal. It's as if people could only eat at restaurants and never cook for themselves.
And of course with iteration and feedback loops people can definitely learn how to specify what they want in a fairly precise way.
However, the easier programming gets, the less people who don't know it need to do it, because it's more likely somebody has already solved their problem.
If the thing you want it to do isn’t too unusual, LLMs can guide you into doing something vaguely sensible.
This is the one area of AI I’m actually pretty positive about. Computing largely passed people by, and computers have not lived up to their potential. I could see this approach making computers more useful for a large number of people.
Especially semi-technical people whose specialty isn't programming.
Even with GPT-4, the most useful use cases are generating code that's too repetitive and daunting to type out, and refactoring code that interns wrote.
And making it write documentation.
What I love most is having it draft project specs from users' requirements and generate user stories.
Or letting it write LeetCode-style problems that are too boring to do manually.
The actual coding is still best done by experienced engineers' biological brains.