On 17/02/10 18:33, Daniel O'Donovan wrote:
> On 17 Feb 2010, at 17:25, Justin Lecher wrote:
>
>>> It's inconvenient, but I don't think that there will be an easy way to handle 2 GB files in 32 bit machines.
>>
>> I never used but I know that there are a couple of archive programs like
>> 7zip which have LFS.
>
>
> I'm currently playing about with some large 4D experiments (although only 835MB max) and performance can be quite bad at times (I have a 64 bit machine and 4GB of RAM). We're trying to think of a smart way to handle large data sets - Wayne uses 'Contour Files' currently. A clever way to compress the spectra in memory, or to filter and store only the useful bits would be great. If you have any ideas, do let us know!
>
> Dan
>
> Daniel O'Donovan
> [log in to unmask]
>
>
>
In other projects I used CARA combined with plain NMRPipe ft3 data and
that was really horrible. Now, with Analysis and Azara-formatted data,
performance is really fine.
I definitely saw a performance increase switching from
32 bit/2 GB RAM/onboard graphics to 64 bit/4 GB/NV8800, but the bottleneck is
reading from disc. If you need different blocks of data, it can
take some time.
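As an illustrative sketch only (not something from this thread, and the
`load_block` reader and block shape are made-up stand-ins for a real
spectrum file reader), repeated disc reads of the same blocks could be
softened with a small in-memory LRU cache:

```python
from collections import OrderedDict
import numpy as np

class BlockCache:
    """Tiny LRU cache for spectrum data blocks read from disc.

    `loader` is any callable mapping a block index to an ndarray;
    here it is a hypothetical stand-in for the real block reader.
    """
    def __init__(self, loader, max_blocks=64):
        self.loader = loader
        self.max_blocks = max_blocks
        self._cache = OrderedDict()

    def get(self, block_id):
        if block_id in self._cache:
            # recently used: move to the end so it is evicted last
            self._cache.move_to_end(block_id)
            return self._cache[block_id]
        block = self.loader(block_id)
        self._cache[block_id] = block
        if len(self._cache) > self.max_blocks:
            # evict the least recently used block
            self._cache.popitem(last=False)
        return block

# hypothetical loader: pretend each block is 32**3 float32 points
def load_block(block_id):
    return np.zeros((32, 32, 32), dtype=np.float32)

cache = BlockCache(load_block, max_blocks=8)
b = cache.get(0)   # first access reads from "disc"
b2 = cache.get(0)  # second access is served from memory
```

Whether this pays off depends on the access pattern; if successive
operations touch mostly disjoint blocks, the cache only adds overhead.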
--
Justin Lecher
Institute for Neuroscience and Biophysics
ISB 3 - Institute for structural biochemistry
Research Centre Juelich GmbH,
52425 Juelich, Germany
phone: +49 2461 61 5385