> > Are you saying we should propagate them from the reference NDF, or
> > that we shouldn't propagate them at all? I'm a bit twitchy that we
> > seem to be modifying standard kappa behaviour (which is to propagate
> > extensions from the primary input NDF).
> In the particular case of wcsmosaic, I'm saying that it's doubtful that
> the extensions from the first input file are at all relevant in the mosaic
> output (if there is more than one input file). SMURF extensions are just
> an example.
My view is that KAPPA should not be corrupted for specialist requirements.
Bending its own conventions will only confuse some users (and
programmers), and could lead to nasty surprises.
In the WCSMOSAIC Implementation Notes, we should describe the behaviour
more explicitly, so that if you don't want to inherit from the primary NDF,
you can run ERASE or SETEXT or whatever to remove the bits you don't want.
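As a sketch of that workaround (illustrative only: the NDF name `mosaic` and the SMURF extension are examples, and this assumes KAPPA has been initialised in the usual Starlink way):

```shell
# Set up KAPPA commands in the current shell (standard Starlink login assumed).
kappa

# Remove an extension inherited from the primary input NDF that is not
# wanted in the mosaic output; here the SMURF extension is just an example.
# OK=TRUE suppresses the confirmation prompt.
erase object=mosaic.more.smurf ok=true
```

SETEXT could be used instead for finer-grained edits within an extension; the point is that the clean-up stays in the user's hands rather than changing KAPPA's propagation conventions.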
> Noting that until yesterday, extensions would have been propagated from
> the REF image (also highly dubious).
That was a bug IMHO. I would have fixed it myself had it not turned up
as I was leaving for the weekend.
> Extension propagation makes most sense for a single input going to a
> single output.
Yes, but it can make sense in more complex cases. If there is more
than one input, say in ADD, you select the primary via IN1. In KAPPA I
don't want a Figaro or IRAF situation emerging, where the conventions
vary from application to application. These conventions arose during
the NDF design discussions.
> > NDFs were used as inputs)? I can think of at least two alternative
> > approaches: 1) use PROVREM after wcsmosaic to remove the unwanted
> > reference provenance (this would need a slight change to provrem), or
That keeps the generality with other OO tasks that let you remove cruft.
> > 2) before running wcsmosaic, set a flag in the provenance extension of
> > the reference NDF that tells NDG not to include the NDF in the output
> > provenance (this would need a new task to control the setting of the
> > flag, or it could be done within provmod).
Presumably you would then have to remember to reset the flag
afterwards. Option 1) looks better to me.
> > The old model was that kappa was as general purpose as possible, and
> > all the tuning to handle details of
> > instrument/telescope/observatory/wavelength/science/etc went in the
> > pipe-line.
As far as I'm concerned it's still the current model.
If you want other specialist stuff, bundle it up in a package or script
that knows the special requirements of the data.
> > Now that the SSC is being maintained specifically to
> > support JAC operations, I can see there could be an argument to change
> > this.
It's not just JAC and its users running Starlink software, even though
development is funded for JAC operations.
> In the pipeline case we can always check the output provenance to see
> whether it makes sense and handle it on a case by case basis.
That's the way to do it.