0, 0L, 1 - 1, (((1 - 1))), (void *)0 and (void *)(1 - 1) are null pointer constants
...
if one operand is a null pointer constant, the result has the type of the other operand;
This is quite possibly one of the most idiotic semantics I've ever seen specified for a language. Why should the type of an expression depend on the partially evaluated result of one of its subexpressions? This by itself is enough to throw any sane reduction semantics out the window (an important mechanism in language design, whereby semantics are defined by specifying how to transform any program into an equivalent but strictly smaller one), though I'm sure there are plenty more such constructs in C.
Ridiculous semantics like this are not only the definition of C99; they're the definition of "designed by committee."
It's even simpler. A lot of legacy code out there uses "0" (not NULL or even (void *)0, just 0) in place of NULL. So it's very important for backwards compatibility that "test ? (double *)x : 0" or "test ? 0 : (int *)x" return double * or int *, respectively.
I'm not sure why the definition of a null pointer has to be so liberal, but I suspect there's a backwards compatibility constraint in there.
My mistake. I could have sworn that only C++ defined 0 to be equivalent to a null pointer. I guess my confusion stemmed from the fact that the NULL preprocessor macro in C++ is actually defined to be just "0" (or a magical-seeming "__null" in recent versions of g++), whereas in any modern C implementation I've seen, it's "((void *)0)".
Though there are cases in which passing an unadorned "0" as a null pointer can be wrong; there's a null-terminated varargs function example here:
I don't doubt the C99 committee did a fantastic job maintaining backward compatibility with extant code-bases which use ill- or un-defined semantics of older standards. But I don't think that the success of the committee at doing its job necessarily redeems C99 as a semantically sane language -- indeed, I would argue that the need for design-by-committee is indicative of the hefty baggage of backward-compatibility issues which necessitates such convoluted semantics and boxes the C language into a linguistic dead-end.
I have always wondered why languages can link to code written in other languages, yet can't link to an older version of the same language. That is: new code should use new syntax and semantics, while old code should still be compilable and linkable. They should be treated as two different languages. Backward compatibility shouldn't mean that you have to write new code according to the older ways.
Nowadays we shouldn't be dealing with pitfalls and complexities that resulted from thirty-year-old design decisions. New versions of a language should reduce the complexity of the language instead of increasing it (for the same semantics).
I'm very happy to have left my job as a C++ programmer: I no longer have to deal with such madness. And I'd bet that many members of the C++ committee hold Ph.D.s and beyond, showing once more that the more you study, the more you are in danger of losing contact with reality and practicality.
C99 can link to programs written in ANSI C just fine. The issue is trying to compile ANSI C code on a compiler written for C99.
This will always be an issue when dealing with a language as entrenched as C: Most of your user-base will have oodles of code written under the previous standard. If you break backwards-compatibility on a level that's going to take them multiple man-months just to get working again, most aren't going to bother and will stay with the older standard instead. Your new standard stagnates and dies.
There IS a way to do this, but it's not nearly as simple as you seem to believe. Take a look at what's happening with Python 2 & 3. It's a difficult, albeit possible, transition, and I'm not sure how often exactly one can reasonably expect to pull it off.
AFAIK, Python folks are experiencing something different: they have two incompatible languages which can't talk to each other, which is why they have to port code from Python 2 to 3. OTOH, no matter how poorly I think of Perl, it seems Perl folks are doing fine regarding Perl 6's backward compatibility with Perl 5 (version 6 will have a compatibility mode). Hey! If I'm not mistaken, ISE Eiffel will happily compile and link both Eiffel and C++ code.
I agree that old code is sacred and I agree that handling backward compatibility is not an easy task, but carrying decades-old baggage isn't a solution either, if you ask me. I remember one day, while coding in C++, realizing how much baggage I was carrying in my mind at every step of writing code.
Of course programmers will not switch to the new standard if it does not provide enough advantages, but then history shows that people are willing to even switch languages if their current one becomes an unmanageable mess.
As PG said in an essay of his, different versions of a language are different languages. Then just treat them as such.
Keep in mind that "the ugliest C feature" was committed in 2000, and the article was written in 2005. Things were a mite different in 2000, so this was probably more necessary.
Even if this isn't necessary today, it's a pretty interesting brain exercise.
How often do you really need to make sure you get the float versions rather than simply using the double versions everywhere and dealing with the minor speed hit? The tgmath stuff is really for scientific computing. For simple trigonometry on UI elements and such, there's no noticeable advantage to using e.g. cosf rather than plain cos and letting conversions happen.
He also isn't quite up to speed with Fortran overloading. It is fairly easy to define interfaces that can call different functions based on input type, but it still isn't as nice as C++.