Interesting, but maybe I'm missing something in the proposal. It seems
like a scheme for having many different sizes of variables. That could
be fine for scalars, but scalars take up almost no memory; nearly all
of the memory consumed by variables is in arrays. If the proposal is to
let each element of an array potentially have a different memory size,
the disruption to the entire ecosystem (languages, compilers, hardware)
would be enormous. If, instead, the size is fixed for all elements, then
the largest size needed by any element would have to be used for the
whole array, and the gain seems limited.
Cheers,
Bill
On 4/26/13 2:41 AM, Keith Bierman wrote:
>
> On Thu, Apr 25, 2013 at 5:16 PM, COMP-FORTRAN-90 automatic digest system
> <[log in to unmask]> wrote:
>
> In the end, whatever precision/range you request is going to be mapped
> to 32-bit, 64-bit, or 128-bit. That's today's reality, and I don't see
> it changing,
>
>
> Well, John Gustafson has a somewhat appealing proposal (and with the
> right leverage he might make it happen ;>) which would change the status quo
>
> http://sites.ieee.org/scv-cs/files/2013/03/Right-SizingPrecision1.pdf
>
>
> Keith Bierman
> [log in to unmask]
>
--
Bill Long [log in to unmask]
Fortran Technical Support & voice: 651-605-9024
Bioinformatics Software Development fax: 651-605-9142
Cray Inc./Cray Plaza, Suite 210/380 Jackson St./St. Paul, MN 55101