On Tue, 11 Jul 2006, Tim Jenness wrote:
> On Tue, 11 Jul 2006, David Berry wrote:
>
> > Having said I'll take a look at using the NDF scaled array facilities in
> > CONVERT, I'm not sure what's required. In terms of ndf2fits, I guess there
> > is no need for any change since it already creates BSCALE and BZERO
> > keywords with appropriate values. For fits2ndf, I'm presuming what is
> > needed is a new adam parameter indicating if the resulting NDF should be
> > stored in scaled form or not, although this is a bit messy in view of the
> > existence of the FMTCNV parameter.
> >
> > Have I missed anything?
>
> This work all started because we couldn't read one of those big HI
> survey images when it was uncompressed. Although we worked around it
> by allowing the output type to be specified, it made sense for disk
> space reasons to leave the data alone and just write scaled FITS files
> to scaled NDFs (this only works if the scaled NDF can internally
> record the original target type; otherwise that HI image would still
> uncompress to _DOUBLE and would still not fit in GAIA).
So are you saying you want some way for applications to be able to get at
the scaled values? At the moment, the changes I have made only provide
access to the unscaled values (that is, the ARY library automatically
converts the stored integers back into the original floating point values
when the array is mapped). I thought this was what we decided at York.
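(For illustration only, a minimal sketch of that conversion in Python: the stored integers are mapped back to physical values using a scale and zero offset, the same convention as the FITS BSCALE/BZERO keywords. The function names here are hypothetical, not part of ARY.)

```python
# Sketch of the unscaling a library like ARY performs when a scaled
# array is mapped: physical = BZERO + BSCALE * stored.
# The FITS convention uses the same relation for BSCALE/BZERO.

def unscale(stored, bscale, bzero):
    """Convert scaled integer values back to physical floats."""
    return [bzero + bscale * s for s in stored]

def scale(physical, bscale, bzero):
    """Quantise physical values to integers for scaled storage."""
    return [round((p - bzero) / bscale) for p in physical]

# Round-trip example: floats packed into integers and recovered.
values = [1.5, 2.25, -0.75]
bscale, bzero = 0.25, 0.0
stored = scale(values, bscale, bzero)        # [6, 9, -3]
recovered = unscale(stored, bscale, bzero)   # [1.5, 2.25, -0.75]
assert recovered == values
```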
> Secondly, we need a way of creating scaled NDFs, and there are only
> two ways: either from a scaled FITS file or via a kappa command.
I'm working on a kappa command at the moment.
> I'm still worried that CONVERT will duplicate a lot of the scaling code
> inside ndf2fits/fits2ndf that is also used in ARY.
I've not really looked into it yet.
David