
No, just have some basic knowledge about how your compiler works and make decisions accordingly. Every major compiler lets the programmer specify exactly the tradeoffs you want. C/C++ simply don't hold your hand by default and let you get the best (reasonably possible) performance if you so desire. That's exactly the way it should be.


No, just use a compiler that doesn't actively attempt to maliciously sabotage you. C has perfectly well defined semantics for (eg) null pointer dereference: read or write memory location zero, and consequently probably crash. Not "silently rewrite the pointer to nonzero-but-still-pointing-at-address-zero-somehow and proceed to bypass subsequent sanity checks for no damn reason".


Perfect example of the mismatch between (some) programmers' expectations and what language standards and/or compilers require/implement.

> C has perfectly well defined semantics for (eg) null pointer dereference: read or write memory location zero, and consequently probably crash.

C does not have "perfectly well-defined semantics" for null pointer dereference: it's undefined behavior. Sure, the null pointer happens to be 0 on the vast majority of architectures most programmers work with, but apparently non-zero null pointers are still used these days (at least in the amdgcn LLVM target, from what I understand after a quick glance at the linked page), so it's not even an "all sane modern machines" thing [0].

In any case, I'd love to see some good benchmarks showing the effect of the more "controversial" optimizations. Compiler writers say they're important, people who don't like these optimizations tend to say that they don't make a significant difference. I don't really feel comfortable taking a side until I at least see what these optimizations enable (or fail to enable). I lean towards agreeing with the compiler writers, but perhaps that's because I haven't been bitten by one of these bugs yet...

    [0]: https://reviews.llvm.org/D26196


The compiler writers are right in the sense that for every one of those optimizations you can have a microbenchmark that will benefit hugely. The opponents are right too, in the sense that for most codebases the more niche optimizations don't matter at all. The right answer, as always, is to take a particular case you actually care about and benchmark yourself. There are no benchmarks that would be generally representative, there is no meaningful "average case".

Personally, I mostly use C/C++ for fairly high performance numerical code and happen to benefit greatly from all the "unsafe" stuff, including optimizations only enabled by compilers taking advantage of undefined behavior. I'm therefore naturally inclined to strongly oppose any attempts to eliminate undefined behavior from the language standards. At the same time, however, I fully recognize that most people would probably benefit from a safer set of compiler defaults (as well as actually reading the compiler manuals once in a while) or even using languages with more restrictive semantics. Ultimately, there is no free lunch and performance doesn't come without its costs.


> C has perfectly well defined semantics for (eg) null pointer dereference: read or write memory location zero, and consequently probably crash.

... no, C explicitly has no well-defined semantics here, since it's undefined behaviour. You may believe that C does due to habit, but that's not the case.


No, C explicitly does not define semantics; it still has them, because the underlying hardware and ABI provides them, whether the C standard likes it or not. That's the point: if the compiler isn't actively sabotaging you, you end up with the semantics of the machine code you're compiling to, and can benefit from obvious sanity checks like not mapping anything at address zero.


There is a difference between implementation defined and undefined. Dereferencing a null pointer is undefined and cannot be performed in a well-formed program. Implementation specified behavior (e.g. sizeof(void *)) is dependent on the hardware and ABI.


That is, those semantics you state as defined don't translate into portable C code, not even between compiler versions on the same platform.


> it still has them, because the underlying hardware and ABI provides them,

when you program in C you program against the C abstract machine, not against a particular hardware.


If you are dereferencing a null pointer, you're doing something undefined. Your contract with the compiler was that it would produce valid code as long as you stayed within the confines of C. Thus you cannot blame the compiler for automatically assuming that you're not doing anything wrong, because that's explicitly the instructions it was given.


You can view it as malicious sabotage, or you can view it as a potentially useful optimization (with obvious security caveats when used incorrectly). Either way, my point is that every major compiler lets you make the tradeoffs that are right for your particular usecase. That's a good thing.



