Why are transaction monitors on the decline? Or are they?
- by mrkafk
http://www.itjobswatch.co.uk/jobs/uk/cics.do
http://www.itjobswatch.co.uk/jobs/uk/tuxedo.do
Look at the demand for programmers (the % of job ads in which the keyword appears), in the first graph under the table. It seems that demand for CICS and Tuxedo has fallen from roughly 2.5% and 1% respectively to almost zero.
To me this seems bizarre: we now have more networked, internet-enabled machines than ever before, and most of them are talking to some kind of database. So you would expect that use of products whose developers have spent the last 20-30 years distributing, coordinating, and optimizing transactions would be on the rise. Yet it appears it isn't.
I can see a few possible causes, but I can't tell whether any of them are true:
- We forgot that concurrency and distribution are really hard, and we're redoing it all ourselves, in Java, badly.
- Erlang killed them all.
- Projects have changed character: most business software has already been built, and we're all doing internet services now, using stuff like Node.js, Erlang, and Haskell. (I've used RabbitMQ, which is written in Erlang, but it was a small, specialized side project.)
- BigData is the emphasis now, and BigData doesn't need transactions very much (?).
None of those explanations seems particularly convincing to me, which is why I'm looking for a better one. Anyone?