Hi Bill,
> It seems it would be good to have available if we are all going to
> put our images on our web-servers.
It's probably easier either to just bzip2 the images (which works reasonably well but is somewhat slow) or to use one of the imgCIF "jiffy" programs, which will correctly retain the header information. The pack subroutines you refer to deal only with the pixel array that makes up the image; they do not address the problem of safe storage of the header.
The plus of imgCIF is that some data reduction packages can reduce the data straight from the compressed files. The minus is that it is not obvious that other programs can do this, or that all of the settings in CBFlib (e.g. the choice of compression) are treated equivalently.
The plus of bzip2 is that it leaves the image essentially unharmed, and you don't need any "special" programs. The minus is that you have to unpack before processing. bzip2 typically gives something like a factor of 3 reduction in size, with compression taking about 1 s per image.
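For example (a minimal sketch - the filename is hypothetical; -k keeps the original alongside the compressed copy):

  bzip2 -k image_001.img       # produces image_001.img.bz2
  bunzip2 image_001.img.bz2    # unpack again before processing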
Two things, though, if people are going to do this for real:
(1) ensure the result is in manageable chunks - a 5 GB tarball is no use to anyone, since many programs can't download more than 2^32 bytes, but equally no one wants to download 720 images individually
(2) compute md5sums for the files so that end users can check they have downloaded them correctly
I guess any script-smith could put together a couple of commands to prepare the data for www download... it's not difficult really.
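Something along these lines would do (a sketch only - the 90-image chunk size, the *.img pattern and the wedge_ prefix are illustrative assumptions):

  # split the image list into chunks of 90, tar+bzip2 each chunk, then checksum
  ls *.img | split -l 90 - wedge_          # filename lists: wedge_aa, wedge_ab, ...
  for w in wedge_??; do
      tar cjf "$w.tar.bz2" -T "$w"         # one bzip2-compressed tarball per chunk
  done
  md5sum wedge_*.tar.bz2 > MD5SUMS         # end users verify with: md5sum -c MD5SUMS

Pick the chunk size so each tarball stays below the 2^32-byte limit; md5sum -c then gives the user a one-line integrity check.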
Cheers,
Graeme