Even when viewing the subject in the most objective way possible, it is clear that software, as a product, generally suffers from low quality.
Take, for example, a house built from scratch. Usually the house will function as intended: it will stand for many years, the roof will withstand heavy weather, the doors and windows will do their job, and the foundations will not collapse even when the house is fully occupied. Sure, minor problems do occur, like a leaking faucet or a bad paint job, but these are not critical.
Software, on the other hand, is much more prone to poor quality: unexpected crashes, erroneous behavior, miscellaneous bugs, and so on. Sure, there are many software projects and products that are high quality and very reliable, but plenty of software does not fall into that category. Consider how paradigms like TDD have been rising in popularity in recent years, precisely because quality is such a concern.
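To give a sense of what a paradigm like TDD looks like in practice, here is a minimal sketch in Python (the parse_price function and its behavior are purely hypothetical): the test is written first, fails, and only then is the code written to make it pass.

    import unittest

    # In TDD, a test like this is written *before* the production code:
    # it fails first ("red"), then the minimal implementation is added
    # to make it pass ("green"), and only then is the code refactored.

    def parse_price(text):
        """Hypothetical function under test: parse '$19.99' into a float."""
        return float(text.strip().lstrip("$"))

    class TestParsePrice(unittest.TestCase):
        def test_plain_number(self):
            self.assertEqual(parse_price("19.99"), 19.99)

        def test_dollar_sign_and_whitespace(self):
            self.assertEqual(parse_price(" $19.99 "), 19.99)

    if __name__ == "__main__":
        unittest.main()

The point is not this particular example, but that the whole workflow exists to compensate for how easily defects slip into software.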
Why is this? Why do people have to fear that their software will not work or will crash? (Do you walk into a house fearing its foundations will collapse?) Why is software, subjectively, so full of bugs?
Possible reasons:
Modern software engineering has existed for only a few decades, a short period compared to other forms of engineering and production.
Software is very complicated, built from layers upon layers of complexity, and integrating them all is not trivial.
Software development is relatively easy to get into; anyone can write a simple program on their PC, which lets amateur software leak into the market.
Tight budgets and timeframes do not allow for thorough, high-quality development and extensive testing.
How do you explain this issue, and do you see software quality advancing in the near future?