Low-hanging fruit where "a sufficiently smart compiler" is needed to get us back to Moore's Law?

Posted by jamie on Programmers
Published on 2012-03-20T21:19:12Z

Paul Graham argues that:

It would be great if a startup could give us something of the old Moore's Law back, by writing software that could make a large number of CPUs look to the developer like one very fast CPU. ... The most ambitious is to try to do it automatically: to write a compiler that will parallelize our code for us. There's a name for this compiler, the sufficiently smart compiler, and it is a byword for impossibility. But is it really impossible?

Can someone provide a concrete example where a parallelizing compiler would solve a pain point? Web apps don't appear to be a problem: just run a bunch of Node processes. Real-time raytracing isn't a problem: the programmers are writing multi-threaded, SIMD assembly language quite happily (indeed, some might complain if we made it easier!). The holy grail is to be able to accelerate any program, be it MySQL, Garage Band, or Quicken. I'm looking for a middle ground: is there a real-world problem that you have experienced where a "smart-enough" compiler would have provided a real benefit, i.e., one that someone would pay for?
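To make the question concrete, here is the transformation done by hand (a minimal Python sketch; the function names are mine, not from any real compiler): a loop whose iterations are independent can be fanned out across workers. The "sufficiently smart compiler" would have to prove that independence and apply the rewrite automatically.

```python
# Hand-parallelized version of an embarrassingly parallel loop -- the
# rewrite a "sufficiently smart compiler" would have to discover and
# apply on its own, after proving the iterations are independent.
from concurrent.futures import ThreadPoolExecutor

def work(x):
    # Stand-in for an expensive, side-effect-free computation; the lack
    # of side effects is exactly what the compiler would need to prove.
    return x * x

def serial(xs):
    # The loop as the programmer wrote it.
    return [work(x) for x in xs]

def parallel(xs):
    # The loop as the compiler would rewrite it. Threads keep the
    # sketch simple; real CPU parallelism in Python would need
    # processes because of the GIL.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(work, xs))
```

The hard part is not the mechanical rewrite, which is a dozen lines, but the dependence analysis that licenses it: if `work` touched shared state, the two versions would no longer be equivalent.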

© Programmers or respective owner
