On Tue, 11 Jul 2006, David Berry wrote:
> Having said I'll take a look at using the NDF scaled array facilities in
> CONVERT, I'm not sure what's required. In terms of ndf2fits, I guess there
> is no need for any change since it already creates BSCALE and BZERO
> keywords with appropriate values. For fits2ndf, I'm presuming what is
> needed is a new adam parameter indicating if the resulting NDF should be
> stored in scaled form or not, although this is a bit messy in view of the
> existence of the FMTCNV parameter.
>
> Have I missed anything?
This work all started because we couldn't read one of those big HI
survey images once it was uncompressed. Although we worked around that
by allowing the output type to be specified, it made sense for disk
space to leave the data alone and just write scaled FITS files to
scaled NDFs. (This only works if the scaled NDF can internally record
what the original target type is; otherwise that HI image would still
uncompress to _DOUBLE and would still not fit in GAIA.)
Secondly, we need a way of creating scaled NDFs, and there are only two:
either from a scaled FITS file or via a KAPPA command.
I'm still worried that CONVERT will end up duplicating a lot of the
scaling code inside ndf2fits/fits2ndf that is also present in ARY.
--
Tim Jenness
JAC software
http://www.jach.hawaii.edu/~timj