I wonder why programming jobs haven't "dried up" yet as software has matured. For example, I am a developer myself, which means I do care about software (I'm not the type of user who needs a computer mainly to browse the Internet), and yet I wouldn't mind if I never received another update on my Ubuntu machine.
I find that it provides everything I need, and while the updates deliver various bug fixes and improvements, I wouldn't mind using it in its current state for the rest of my life. In two years of using Ubuntu, I have never run into a serious bug or problem.
Another example is Windows: almost half of its users still run XP, which is practically ancient, yet they find it satisfies all their needs (and I agree with them).
I could give many more examples, but by now you understand my point and my question. While new "trends" appear all the time (like a new mobile OS) that run on new platforms and require some fresh development work, the majority of software effort still goes into what I consider "completed projects", or at least projects in a state mature enough to be considered complete.
Do you have an explanation? I can't think of the right tags for this question; please edit it however you find most appropriate.