In reply to the discussion: AI: Something Big Is Happening
Oneironaut (6,251 posts)
The term "programmer" is really broad. It covers everything from someone writing simple HTML/JavaScript to complex work that requires deep knowledge of a field, like niche low-level systems programming. Ultimately, programming is a continual process. If you can just prompt AI for code and it works, that's fine - you're probably building a simple app that hopefully won't need much maintenance. But you also need to know how to support that app when something goes wrong, and supporting it requires essentially all the skills you would have needed to develop it in the first place.
This is where vibe coding fails. Maintaining software is as complex as building it in the first place, if not more so. I can't imagine the nightmare of slapping a bunch of AI-generated code that no one understands into a product and expecting a stable end result. That sounds absolutely terrible, and like a lot more work in the long run.
Part of the problem, too, is that in a lot of circumstances, to actually get back the code they need, someone has to have enough experience to know what to ask for (in your case, the person who would check the code for accuracy). Time can be saved that way, but expertise is still needed.
I use LLMs like ChatGPT all the time, but I see their limitations. To me they're still glorified search engines: I use them to find things I struggle to find on regular search engines - mostly scraped from StackOverflow, I think. The explanations they give for solving problems are consistently mediocre - I feel like they could mislead you if you're counting on a correct explanation. They do have decent accuracy, but we have a long way to go.
I'm definitely open to changing my opinion in the future, though, as AI grows and matures.
