I think bad design is really just a symptom of common management strategies for software projects. From what I've seen in industry, non-technical managers tend to throw cheap, inexperienced developers at problems and get a senior dev to bandage everything together until it works. When I look at OOP/webdev/enterprise best-practice advice, it looks more like people-wrangling than engineering: make sure the juniors have plenty of busy work so they don't compete too aggressively and destabilize the team. It's awful as an engineering practice, but it's stable at an organizational level. Unfortunately, the people who created the kool-aid started drinking it, and now many solo and open source devs do too.
Good design comes from having a small team of half-decent developers. It isn't that hard as long as the scale of the project can be kept in one person's head. Once you're working on a team with dozens of people, every piece of code begins to depend on every other piece of code, and you can no longer fix the redundancies, or even recognize that they exist. This is why we have ugly, bloated pieces of shit like Chrome/Firefox, Windows/Linux/Mac, OpenGL/DirectX and so on.
Early software in the Amiga days could achieve the vast majority of what we do today, with the main limitations being the throughput of the hardware itself, not the software. There were some suboptimal design choices, of course, but nothing that couldn't have been ironed out with time. Think of C transforming into C++. C was imperfect, but it became very popular because it had just enough features to be useful. Most of those features were guaranteed by the standard, and simple enough that anyone could work with a standards-compliant subset of the language. Then people moved to C++, which quickly grew the standard library, adding more and more features. Now there are multiple incompatible ways to allocate memory, crazy overloading and name-mangling behaviors, obscure minutiae and so on. In a sane world, the C standard would have been amended with something like namespaces/modules and then frozen. A new language standard could have been forked from the original, focused on fixing design flaws even where that broke backwards compatibility. Instead we now have dozens of shitty, popular, poorly standardized languages. If you want to write a piece of software and still compile and run it a decade from now, your only choices are basically C99 or C++98.
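To make the "incompatible ways to allocate memory" point concrete, here's a minimal sketch in standard C++ (the Widget type is just a placeholder for illustration): malloc/free, new/delete, and new[]/delete[] each require their own matching release call, and mixing them is undefined behavior.

#include <cstdlib>   // std::malloc, std::free

struct Widget {
    int id;
};

int main() {
    // C-style allocation: no constructor runs, must be released with free().
    Widget* a = static_cast<Widget*>(std::malloc(sizeof(Widget)));
    std::free(a);            // OK
    // delete a;             // undefined behavior: delete on malloc'd memory

    // C++ scalar allocation: constructor runs, must be released with delete.
    Widget* b = new Widget{1};
    delete b;                // OK
    // std::free(b);         // undefined behavior: free on new'd memory

    // C++ array allocation: must be released with delete[], not delete.
    Widget* c = new Widget[4];
    delete[] c;              // OK
    // delete c;             // undefined behavior: wrong delete form

    return 0;
}

And that's before you add std::allocator, operator new overloads, or smart pointers, each with their own ownership rules.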
The endless churn of shitware built from other shitware is perpetuated by an industry where developers have an incentive to create future work for themselves. If they actually did good design, the entire Silicon Valley tech bubble would collapse. That's the real reason why OSes and languages need to constantly grow and update.
Related talk by Casey Muratori that gives some pretty decent examples backed by quantitative data:
https://www.youtube.com/watch?v=kZRE7HIO3vk