On 13/12/2007, Tim Jenness <[log in to unmask]> wrote:
> Well, the FTS team have already written code using JNIHDS to read a
> standard SCUBA-2 file (including JCMSTATE extensions). They have to write
> the answer out as
>
> NDF
>    .DATA_ARRAY
>    .MORE
>       .FITS
>       .FTS2DR
>          SOMEINFO <NDF>
>             DATA_ARRAY
>          SOMEMOREINFO <NDF>
>             DATA_ARRAY
>    .PROVENANCE
>
> and they are willing to do that manually with HDS calls.
>
> The other thing I was wanting though was either writing an AXIS component
> for the main data array or writing the .WCS as an AST frameset.
>
> Which do we think would be easier if we were writing from scratch?
It depends on what form the WCS information is available in. If you
have an array of WCS values for each pixel, then it would almost
certainly be easier to write an AXIS structure. If you already have a
FrameSet, then a WCS component is probably easier.
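For what it's worth, a rough sketch of the AXIS route (the linear
scale/offset mapping below is just a stand-in for whatever per-pixel
values FTS actually has, not anything SCUBA-2 specific): an AXIS
structure only has to store a coordinate centre for each pixel along
an axis, so if you can evaluate the world coordinate of each pixel you
can tabulate the centres directly.

```python
# Minimal sketch of the AXIS route: tabulate a coordinate centre for
# every pixel along one axis.  The linear scale/offset stands in for
# whatever per-pixel WCS values the instrument actually provides.

def axis_centres(npix, scale, offset):
    """Return world-coordinate centres for pixels 1..npix.

    NDF pixel indices are 1-based; the centre of pixel i sits at
    i - 0.5 on the pixel axis.
    """
    return [offset + scale * (i - 0.5) for i in range(1, npix + 1)]

# e.g. a 4-pixel axis with 0.1-unit pixels starting at 10.0
print(axis_centres(4, 0.1, 10.0))
```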
Of course, the *proper* way to do it is WCS, but it sounds like
cutting corners is the order of the day here.
> I'm
> assuming that serializing the FrameSet and writing it to .WCS <_CHAR>
> would be easier (but there would be none of the sanity checking that is
> done by ndfPtwcs and I don't know what constraints there are on the shape
> of the _CHAR array).
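On the shape question: I don't know the exact constraints offhand
either, but the basic job is just chunking the FrameSet dump into
fixed-length records. A sketch of that idea (the 32-character record
length and the '+' continuation marker are assumptions for
illustration only, not the NDF library's actual on-disk format):

```python
# Illustrative sketch (not the NDF library's real writer): split a
# serialized FrameSet dump into fixed-width character records, the
# kind of shape a .WCS <_CHAR*N> array would need.  Record width and
# the '+' continuation flag are assumed for illustration.

RECLEN = 32

def chunk_lines(dump, reclen=RECLEN):
    """Split each line of `dump` into fixed-width records, marking
    continuation records with a leading '+'."""
    records = []
    for line in dump.splitlines():
        first = True
        while True:
            head, line = line[:reclen - 1], line[reclen - 1:]
            records.append((" " if first else "+") + head)
            first = False
            if not line:
                break
    return records

dump = "Begin FrameSet\n   Nframe = 2\nEnd FrameSet"
for rec in chunk_lines(dump):
    print(repr(rec.ljust(RECLEN)))
```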
>
> I'm happy to take advice on this. There is no requirement for variance,
> quality, or pixel origins. If there were a kappa routine to allow me to update
> provenance then we wouldn't need to write that either (the .PROVENANCE in
> the output file could be a DAT_COPY of the provenance from the input file).
So you want a kappa app that allows a specified group of NDFs to be
registered as direct parents of another NDF?
David