Some comments:
>At the Huntington Archive, where we do almost exclusively images rather
>than text documents, we have found file size a particularly difficult
>problem which we are still discussing. Archival storage of images has
>for us been at the maximum "usable" resolution, not the maximum
>possible resolution. With scanners that go up to 14,000dpi and true 5,600
>dpi, hundred-GB files are possible, but who wants all that data from a
>single sheet of paper or film?
If you have a large object and want high-quality reproduction - especially
of details - you sometimes have to go beyond 300dpi - and file sizes
certainly grow with object size...
>Web images are always transferred across the web at 72dpi no matter what
>the original scanning resolution was. If you save them in TIFF, they can
>be tiled in MrSID or with JTIP. This allows a user to see an overview
>section at about 1024 by 768 screen size but at 72dpi. By selecting
>subsections, the viewing applet then selects the appropriate "detail" tile
>to bring up. So files for strictly web use should be sized to the exact
>screen size that is necessary, and at 72dpi. This is always a much smaller
>file than the original scan, which we archive at the maximum size in TIFF.
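To make the tiling idea concrete, here is a rough Python sketch of an image
pyramid - the tile size, screen size and function names are illustrative
only, not how MrSID or JTIP actually store their data:

```python
# Rough sketch of a tiled image pyramid: keep halving the full-resolution
# scan until the overview fits a ~1024x768 screen; detail tiles at each
# level are cut to a fixed tile size. All numbers are illustrative.
import math

def pyramid_levels(width, height, screen=(1024, 768)):
    """Number of 2x downsampling steps until the image fits the screen."""
    factor = max(width / screen[0], height / screen[1])
    return max(0, math.ceil(math.log2(factor)))

def tile_grid(width, height, tile=256):
    """How many fixed-size tiles cover the full-resolution image."""
    return (math.ceil(width / tile), math.ceil(height / tile))

# A 600dpi A4 scan is roughly 4960 x 7015 pixels:
print(pyramid_levels(4960, 7015))   # 4 halvings to reach screen size
print(tile_grid(4960, 7015))        # (20, 28) tiles of 256x256
```

The viewer then only fetches the few 256x256 tiles covering the region the
user zooms into, rather than the whole scan.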
Not sure all the technicalities are correct here... files accessed from the
internet are displayed one picture element per screen pixel. If your screen
is 85dpi you see 85 samples per screen inch, etc. If the original is
sampled at 300dpi you will get the whole image but see it 3-4 times bigger
than the original... and strain your browser!
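The arithmetic behind that "3-4 times bigger" is simple - a quick sketch
(the 85ppi screen is just the example figure from above; real screens vary):

```python
# Displayed size when each scan sample maps to one screen pixel:
# the image grows by the ratio of scan dpi to screen ppi.
def on_screen_inches(original_inches, scan_dpi, screen_ppi=85):
    """Physical size on screen of an original shown 1 sample = 1 pixel."""
    return original_inches * scan_dpi / screen_ppi

# A 1-inch detail scanned at 300dpi appears ~3.5x larger on an 85ppi screen:
print(round(on_screen_inches(1.0, 300), 2))  # 3.53
```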
>
>>I would add that particularly when dealing with greyscale or colour images,
>>files of that size become particularly difficult to handle from a storage
>>and manipulation standpoint. A 600 dpi colour image can be as large as 100
>>mb (for a letter/A4 page) in uncompressed format -- most image editing
>>applications require 3 to 4 times the size of any image for system memory to
>>work with the image; hence 300 mb to 400 mb of RAM just for the application
>>(never mind the system itself).
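Those figures check out - a short sketch of the arithmetic (A4 dimensions
and 24-bit colour assumed; the 3-4x working-memory factor is the poster's
rule of thumb, not a fixed property of any application):

```python
# Uncompressed size of a scan, plus the 3-4x editing RAM rule of thumb.
def uncompressed_mb(width_in, height_in, dpi, bytes_per_pixel=3):
    """Uncompressed image size in MB (24-bit RGB by default)."""
    pixels = (width_in * dpi) * (height_in * dpi)
    return pixels * bytes_per_pixel / 1e6

size = uncompressed_mb(8.27, 11.69, 600)      # A4 page at 600dpi, colour
print(round(size))                            # ~104 MB
print(round(size * 3), round(size * 4))       # ~313-418 MB working RAM
```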
If you have to edit and process large images, try our free image-processing
package vips:
http://azul.ecs.soton.ac.uk/~vips
It handles images up to 2GB each within any amount of RAM - even 32MB - but
preferably under Unix/Linux (or NT with a Unix simulator: cygwin).
That's what we use in the National Gallery and at several sites which make
800MB images regularly.
>>Based on our experience, images approximately 1000 pixels high in typical
>>situations provide an image that is quite readable on the web. However,
>>this will depend on your source material. A better rule of thumb (as opposed
>>to overall resolution) is to look at the resultant height of the characters
>>on screen.
Agree - it is annoying to have to scroll around slightly bigger pics...
>
Hope this helps too...?
Yours,
Kirk Martinez
University of Southampton. http://www.ecs.soton.ac.uk/~km
Multimedia Research Group
Dept. of Electronics and Computer Science
University of Southampton, UK
01703 594491
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%