> Hmmm... but in these days in which Gb of RAM and ~Tb of hard disk are
> commonplace, 20 Mb doesn't sound like a lot to handle.
The concern was, and still is, that not everyone has that much storage
(say, when working on the road with a laptop), and that the software
shouldn't be bloated unnecessarily.
It's not just the extra 20 MB on its own: the mapped arrays take up a
lot of memory too. Perhaps increasing the limit to 1000 would still be
in the noise, even for small data volumes.
Static arrays mean somebody has to draw the line, and different
programmers or users will disagree about where the appropriate
threshold lies. If we could make CCDPACK allocate storage dynamically,
then anyone with the resources for their data volume and number of
datasets would be fine, without imposing large overheads on those with
much smaller capacities.
> Is there a workaround to live with this limit for now?
Assuming you don't have the source...
> I thought that
> perhaps taking groups of 400 images and always including one common image
> to all the groups would do the trick but for some reason it doesn't work.
Please can you elaborate on that last point. In what way doesn't it
work? Does the common image overlap each subset?
Without knowing the spatial distribution it's hard to be precise, but
for a roughly rectangular mosaic of N by M NDFs I'd divide the NDFs
into approximately sqrt(N) by sqrt(M) overlapping regions, make a
smaller mosaic from each, and then register those sqrt(N) by sqrt(M)
tiles to form the final mosaic.
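To make the tiling idea concrete, here is a minimal Python sketch of the
partitioning step only (the function name and overlap parameter are my
own invention, not part of CCDPACK): it splits an N x M grid of frame
positions into roughly sqrt(N) by sqrt(M) rectangular tiles, each
extended by one column/row so neighbouring tiles share frames for the
later registration step.

```python
# Hypothetical sketch, not CCDPACK code: split an n_cols x m_rows grid
# of frames into ~sqrt-sized overlapping tiles so each sub-mosaic stays
# well under a fixed frame limit (e.g. 400).
from math import ceil, isqrt

def tile_grid(n_cols, m_rows, overlap=1):
    """Return a list of tiles, each a list of (col, row) frame indices."""
    tiles_x = max(1, isqrt(n_cols))      # ~sqrt(N) tiles across
    tiles_y = max(1, isqrt(m_rows))      # ~sqrt(M) tiles down
    step_x = ceil(n_cols / tiles_x)
    step_y = ceil(m_rows / tiles_y)
    tiles = []
    for tx in range(0, n_cols, step_x):
        for ty in range(0, m_rows, step_y):
            # extend each tile by `overlap` columns/rows so adjacent
            # sub-mosaics share frames and can be registered together
            cols = range(max(0, tx - overlap),
                         min(n_cols, tx + step_x + overlap))
            rows = range(max(0, ty - overlap),
                         min(m_rows, ty + step_y + overlap))
            tiles.append([(c, r) for c in cols for r in rows])
    return tiles

# Example: a 20 x 20 mosaic (400 frames) becomes 16 tiles of at most
# 7 x 7 = 49 frames each, and every frame lands in at least one tile.
tiles = tile_grid(20, 20, overlap=1)
covered = {frame for t in tiles for frame in t}
assert covered == {(c, r) for c in range(20) for r in range(20)}
```

Each tile would then be mosaicked on its own, and the tile mosaics
registered to each other via the frames in the shared overlap strips.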
Peter may have a better suggestion for efficiency and minimising
artefacts matching the overlap regions.
Malcolm
----
Starlink User Support list
For list configuration, including subscribing to and unsubscribing from the list, see
https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=STARLINK