In a message dated 2/27/01 11:04:59 AM, [log in to unmask] writes:

>
>This is a current subject of discussion.  It got a bit of time at the
>most recent J3 meeting.  I believe that the previous feeling was that
>systems that could support arrays that large would typically make
>default integers be 64 bits.  That certainly seems like the "cleanest"
>solution to me.  Otherwise you are going to end up having to
>explicitly specify kinds all over the place; I'd think that would
>get out of hand.
>
>
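To make the quoted concern concrete, something like the following is what
"explicitly specifying kinds all over the place" ends up meaning; the kind
constant I8 and the program are just my own illustration, not anything
proposed at the meeting:

   ! Sketch: indexing an array of more than 2**31 elements while the
   ! default INTEGER kind stays 32 bits.
   program big_array
     implicit none
     ! SELECTED_INT_KIND(18) requests at least 18 decimal digits,
     ! which in practice means a 64-bit integer kind.
     integer, parameter :: i8 = selected_int_kind(18)
     integer(i8)        :: i, n
     real, allocatable  :: a(:)

     n = 3000000000_i8       ! does not fit in a 32-bit default integer
     allocate(a(n))          ! needs roughly 12 GB to actually run
     do i = 1_i8, n
       a(i) = real(i)
     end do
     print *, a(n)
   end program big_array
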
I thought the problem was not that they assumed that default INTEGERs were 32
bits, but rather that users assumed that they were the same size as default
REALs. This assumption is standard conforming, but unfortunately the vendors
found it awkward to make default REALs 64 bits. Note that the vendors'
problems occur because 64-bit REALs imply 128-bit DOUBLEs. That can be solved,
but it almost always leads to one of the following: a very expensive processor
design; a very inefficient implementation of DOUBLE; REALs and DOUBLEs with
the same precision (which users find surprising); or REALs and DOUBLEs that
occupy twice as much storage as the number of bits they actually use (which
users find surprising and inconvenient).
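
To make the size relationships concrete, here is a minimal sketch (the program
and variable names are my own illustration): default INTEGER and default REAL
each occupy one numeric storage unit and DOUBLE PRECISION occupies two, which
is what makes the users' assumption conforming whatever the width of the unit.

   program storage_units
     implicit none
     integer          :: i(4)   ! 4 numeric storage units
     real             :: r(4)   ! 4 numeric storage units
     double precision :: d(2)   ! 2 * 2 = 4 numeric storage units
     ! Conforming: all three overlay the same storage sequence.
     equivalence (i, r), (r, d)

     ! Inquiry intrinsics report the decimal precision of default REAL
     ! and DOUBLE PRECISION, the relationship the vendors must preserve.
     print *, 'default REAL precision    :', precision(0.0)
     print *, 'DOUBLE PRECISION precision:', precision(0.0d0)
   end program storage_units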