What's going to happen is that LLMs will eventually make fewer mistakes, but people will put up with more bugs in almost all situations and build everything with robustness in mind rather than correctness, leaving everything noticeably worse. But it will all be cheaper, so there you go.