Can knowing C actually hurt the code you write in higher level languages?

Posted by Jurily on Stack Overflow, 2010-05-21

The question seems settled, beaten to death even. Smart people have said smart things on the subject. To be a really good programmer, you need to know C.

Or do you?

I was enlightened twice this week. The first made me realize that my assumptions reach no further than the knowledge behind them, and given the complexity of the software running on my machine, that knowledge is almost non-existent. But what really drove it home was this Slashdot comment:

The end result is that I notice the many naive ways in which traditional C "bare metal" programmers assume that higher level languages are implemented. They make bad "optimization" decisions in projects they influence, because they have no idea how a compiler works or how different a good runtime system may be from the naive macro-assembler model they understand.
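
To make that concrete for myself, here is the kind of throwaway micro-benchmark I mean (a hedged sketch, not from the comment; the function names, counts and CPython focus are my own assumptions): the hand-rolled, C-style loop my instincts call "fast" loses to the idiomatic built-in, because the built-in loops inside the runtime's C code while my loop pays bytecode-dispatch overhead on every iteration.

    # Hypothetical micro-benchmark (CPython): the "bare metal" instinct says a
    # tight while-loop with explicit index arithmetic should be fast, but the
    # per-iteration interpreter overhead dominates, and sum(range(n)), which
    # loops in C inside the runtime, wins easily.
    import timeit

    N = 1_000_000

    def c_style(n=N):
        # what I would write in C: manual loop, manual accumulator
        total = 0
        i = 0
        while i < n:
            total += i
            i += 1
        return total

    def idiomatic(n=N):
        # let the runtime do the looping
        return sum(range(n))

    if __name__ == "__main__":
        print("c_style:  ", timeit.timeit(c_style, number=10))
        print("idiomatic:", timeit.timeit(idiomatic, number=10))

In CPython the idiomatic version is typically several times faster, which is the opposite of what my C-trained gut predicts.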

Then it hit me: C is just one more abstraction, like all the others. Even the CPU itself is only an abstraction! I've just never seen it break, because I don't have the tools to measure it.

I'm confused. Has my mind been mutilated beyond recovery, as Dijkstra said of BASIC? Am I living in a constant state of premature optimization? Is there hope for me, now that I've realized I know nothing about anything? Is there anything to know, even? And why is it so fascinating that everything I've written in the last five years might have been fundamentally wrong?

To sum it up: is there any value in knowing more than the API docs tell me?

EDIT: Made CW. Of course this also means you must now post examples of the interpreter/runtime optimizing better than we do :)
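
To start with one of my own (a small, hedged illustration, CPython-specific and version-dependent): the bytecode compiler already folds constant arithmetic, so pre-computing "magic numbers" by hand, a habit I carried straight over from C, buys nothing.

    # Illustrative, CPython-specific: the peephole optimizer folds constant
    # expressions at compile time, so the readable form costs nothing at runtime.
    import dis

    def seconds_per_day():
        return 24 * 60 * 60   # left readable on purpose

    dis.dis(seconds_per_day)
    # The disassembly shows the constant 86400 loaded (or returned) directly;
    # the multiplication happened when the function was compiled, not when it runs.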
