Re: Thou shalt have no other gods before the ANSI C standard
Date: 8 Feb 2005 06:51:43 -0800
Douglas A. Gwyn wrote:
> email@example.com wrote:
> > D. J. Bernstein wrote:
> > >Douglas A. Gwyn wrote:
> > > >The fact is, using calloc to initialize non-integer values is
> > > >*wrong*
> > > I'll readily agree that the all-0-bytes representation of
> > > NULL isn't guaranteed by the C standard. I'll also agree that
> > > there was, once upon a time, an obscure C implementation that,
> > > for silly reasons, used a different representation of NULL.
> > Oh, it's worse than that -- there apparently exist exactly two
> > platforms that could use two different representations for a
> > pointer depending on its type. Because of these two platforms
> > (neither of which has a C99-compliant compiler installed on it
> > -- and never will) the standard says you can't memcpy pointers,
> > or assume the size of one is the same as the size of another,
> > etc.
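The calloc objection above is easy to demonstrate. A minimal sketch (the function name is mine, not from the thread): `calloc` guarantees all-zero *bytes*, while the standard only ties the *assignment* of `NULL` or `0` to the platform's actual null-pointer representation, so the strictly portable form is an explicit loop.

```c
#include <stdlib.h>

/* Allocates n char* slots and nulls each one by assignment.  Unlike
   calloc(), which only zeroes bytes, the assignment below lets the
   compiler emit whatever bit pattern this platform uses for NULL. */
char **alloc_null_ptrs(size_t n) {
    char **p = malloc(n * sizeof *p);
    if (p != NULL)
        for (size_t i = 0; i < n; i++)
            p[i] = NULL;
    return p;
}
```

On every mainstream ABI the loop and a plain `calloc` produce identical memory, which is exactly the "de facto standard" being argued about.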
> When you don't know what you're talking about, it is
> better to remain silent. Now we have yet another error
> that if let stand would mislead more programmers.
> The C standard does allow memcpy of pointer objects.
> The C standard does guarantee that all pointers to the
> same type have the same size.
Read carefully. What you said and what I said are not contradictory.
The problem is for pointers of *different types*. It's important for
supporting a kind of anonymous context-handle type.
> Representations of null pointer values that use bit
> patterns other than all-zero-bits can be quite useful
> on many platforms, especially those where address zero
> might contain useful program data.
Yeah, like the 6502. Do you suppose someone's hard at work making a
C99 compiler for it?
> >>... Taking advantage of these guarantees is
> >>convenient for the programmer and doesn't sacrifice any real-world
> One wonders why DJB didn't include ASCII encoding.
Because ASCII is not useful to people who don't speak English? In a
very strong sense, ASCII *CANNOT* become a pervasive standard because
of this. Unicode has a better chance, but digging a little deeper
shows that even Unicode seems to have screwed the pooch for CJK.
> Perhaps he realized that too many people understand
> the general issue when it is applied specifically
> to character encodings.
His point is about de facto standards. ASCII is not a relevant de
facto standard for most of the world.
> > It is physically impossible for some people to parse and
> > comprehend that statement. I'm not kidding, the very concept
> > of what you just said is just far too abstract of an idea for
> > these people to understand. It is beyond their being.
> It is rare that a program *has* to make use of such
> characteristics of the architecture
This is utter nonsense. I have become a very portability-conscious
programmer, and find it nearly impossible to write any serious program
without leaning heavily on *most* of those assumptions. What is the
fastest way to find the least power of two greater than a given
integer? What is the fastest way to scan an int to see if it contains
a 0 byte? What is the fastest way to count the number of bits in a
given int? How the hell can I do any of these in a way that is both
portable and fast?
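The three questions above have well-known bit-twiddling answers, sketched here under exactly the assumptions in dispute (32-bit unsigned arithmetic; these are the classic folklore routines, not anything from the thread):

```c
#include <stdint.h>

/* Smallest power of two >= x (x must be nonzero and <= 2^31). */
uint32_t next_pow2(uint32_t x) {
    x--;                        /* so exact powers map to themselves */
    x |= x >> 1;  x |= x >> 2;  x |= x >> 4;
    x |= x >> 8;  x |= x >> 16; /* smear the top set bit downward    */
    return x + 1;
}

/* Nonzero iff some byte of x is zero (the classic SWAR test). */
uint32_t has_zero_byte(uint32_t x) {
    return (x - 0x01010101u) & ~x & 0x80808080u;
}

/* Number of set bits, via parallel bit-summing. */
unsigned popcount32(uint32_t x) {
    x = x - ((x >> 1) & 0x55555555u);                 /* 2-bit sums */
    x = (x & 0x33333333u) + ((x >> 2) & 0x33333333u); /* 4-bit sums */
    x = (x + (x >> 4)) & 0x0F0F0F0Fu;                 /* 8-bit sums */
    return (x * 0x01010101u) >> 24;                   /* total in top byte */
}
```

None of these has a portable formulation that comes close in speed on hardware that violates the stated assumptions.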
There is very little point in killing myself to make my code "truly
portable" because I just have no realistic access to any machine that
doesn't follow the 32-bit + two's-complement + IEEE-754 assumptions.
So I can't test my code on such platforms anyway.
> [...] , and unnecessarily
> building such assumptions into your code prevents it
> from working when ported to e.g. an embedded processor.
Ah, the embedded environment -- the last bastion of obsolete
processors. It of course didn't occur to you that in 10 years,
multi-core 32+ bit system-on-a-chip processors will be more pervasive
and *cheaper* than the 8-bit microcontroller chips that are still in
marginal use today? BTW, do you know of any embedded environments with
a compliant C99 compiler targeting them?
> For many applications, that would be a very poor
> economic decision.
New compiler development for such platforms is not a poor economic
decision?
> > 2s complement, IEEE-754, 32bit ints is a defacto standard that has
> > more momentum than any language specification.
> What nonsense. The 64-bit desktop is right around the corner,
As DJB points out, the new 64-bit desktop processors continue to rely
on the assumptions he listed (along with uniform pointer sizes, signed
right shift, two's complement, etc.). I have personally verified this
for PPC-64 and SPARC, and it's well known to be true for the AMD64
Linux ABI. Even Alpha and Itanium stuck with 32 bits for int.
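The claim is easy to spot-check on any given machine. A hedged sketch (the predicate and its name are mine): the three properties can be probed directly, and the function returns 1 on all the LP64 ABIs mentioned above.

```c
/* Returns 1 iff this platform matches the de facto profile under
   discussion: 32-bit int, two's complement, and sign-extending
   (arithmetic) right shift on negative values.  The latter two are
   implementation-defined per the C standard, which is the whole
   point of the argument. */
int matches_de_facto_profile(void) {
    int int_is_32_bits   = (sizeof(int) == 4);
    int twos_complement  = ((-1 & 3) == 3);   /* -1 is all-ones bits  */
    int arithmetic_shift = ((-4 >> 1) == -2); /* shift keeps the sign */
    return int_is_32_bits && twos_complement && arithmetic_shift;
}
```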
> [...] hexadecimal floating-point is coming,
Excuse me? What are you talking about? Any deviation from IEEE-754 is
summarily shot down by basically everyone who does serious FP work.
Sun tried to deviate from the full semantics of IEEE-754 *slightly* in
Java, and was forced to back off by the backlash. So I don't know
what this "hexadecimal floating-point" thing is that you are talking
about, but it's got about as much chance of getting seriously adopted
as Atari has of making a comeback by entering the super-computer
market.
> [...] and
> there are numerous small systems with 16-bit words.
> At least one DSP uses sign-magnitude representation.
And development on this DSP uses a C compiler for more than just glue
code for hordes of real assembly-language inner loops?
> At least one commercial system with a C compiler uses
> ones-complement representation.
That's a hell of a lot of leverage that one commercial compiler has on
the standard.
> > That's why these important
> > functional attributes continue to exist even though the latest
> > ISO C99 standard (which is 4+ years old now) has not been
> > adopted by any major vendor. The deafening silence with which
> > the C99 standard has been met by the industry is testimony to
> > what the *REAL* standard is (POSIX + the various system ABIs +
> > the "32bit" features you listed above).
> Why would "those functional attributes" cease to
> exist? What you failed to notice was that those
> aren't the only kinds of computers that matter.
They are the only kinds of computers on which new C compilers are being
developed and ported to. The other kinds of computers have plenty of
older standards that are good enough for them. What you have to answer
is why the *NEW* C standards continue to pay deference to platforms on
which no new compiler development is taking place.
> C99 has been gradually implemented by most major compiler vendors.
You misspelled "partially". It's been 4+ years, and gcc was extremely
close a long time ago, yet they halted. That is reason/incentive
enough for any competitor to prove that they are "better" than gcc (an
important marketing point, considering gcc is "free" and targets nearly
everything of relevance out there) by going ahead and implementing full
C99. Yet nobody has -- of course, no end users are exactly demanding
C99 compliance either.
> [...] Since it basically adds features to C89 without
> significantly changing features specified by C89, phasing
> in C99 conformance is a reasonable approach.
No, it just breaks important compiler extensions and presents very
serious compatibility issues with C++. Personally, I think Duke Nukem
Forever is going to ship before a major vendor ships a C99 compiler.
> Your idea of a "standard" is no more no less than
> "what happens to be the environment I'm familiar with."
I'm very familiar with one 8-bit platform, one 16-bit platform, and
one "36-bit" platform (probably one you've never heard of -- we didn't
have a real C compiler for it) and have used two 64-bit platforms
(though I wouldn't call myself expert in either.) That isn't the
point. The point is that every one of those platforms (besides the
64-bit ones) is either completely obsolete or C compiler development is
non-existent on them.
I am unaware of any new "classical CPU" hardware design that isn't
"32-bit ints + two's complement + IEEE-754". It framed the Java
standard, it's basically crept into graphics ASIC standards (in fact I
was quite shocked to learn that they have decided to support *FULL*
IEEE-754, since it costs more transistors to handle denormals, which
are usually unimportant for graphics), and it has 100% of the
workstation and desktop market. Even your precious "embedded space" is
basically being taken over by PowerPC and XScale/StrongARM, and custom
processors are typically just variants on MIPS designs.
-- Paul Hsieh http://www.pobox.com/~qed/ http://bstring.sf.net/