D'oh. Correction: that should read (2^31 - 1), not (2^32 - 1), throughout the message below.
Dan
On 17 Feb 2010, at 16:54, Daniel O'Donovan wrote:
> Hi Justin,
>
> I'm afraid I'm still not clear on what sort of system and install you have. If you could give us that, and also the trace from your error we might know more.
>
> But, if I'm getting this right (correct me), you want to open a large spectrum file ( > 2GB) on a 32 bit computer.
>
> (Computer science bit)
>
> When your OS opens the spectrum file, it reads in the file's contents and notes its size for future reference. On a 32 bit computer we're limited to signed 32 bit integers, that is integers with a maximum value of (2^31 - 1) = 2,147,483,647 - just under 2 GiB. If a file is larger than this, then any byte offset near the end of the file overflows a 32 bit integer and causes a crash. A 2 GB file is over this limit, but as you noticed a 1.9 GB file is not (no crash).
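Dan's arithmetic (using the corrected (2^31 - 1) limit from the note at the top of the thread) can be checked in a few lines of Python; the 1.9 and 2 GiB sizes are just the figures from the thread, not values read from Analysis itself:

```python
# Signed 32-bit offset limit, as corrected at the top of the thread.
INT32_MAX = 2**31 - 1   # 2,147,483,647 bytes, just under 2 GiB
GiB = 1024**3

for size in (int(1.9 * GiB), 2 * GiB):
    fits = size <= INT32_MAX
    print(f"{size / GiB:.1f} GiB file: offset fits in a signed 32-bit int? {fits}")
```

A 1.9 GiB file fits; a 2 GiB file is exactly one byte past the limit, which matches the crash/no-crash behaviour described above.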
>
> You *can* get around this limitation by using 'long long' integers, that is 2x32 bit words stuck together, giving us a maximum of (2^63 - 1) - much larger than we would ever need (in this decade)! The Linux kernel does have large file support for 32 bit systems, but plumbing it through everywhere is very ugly. The obvious thing would be to switch to a 64 bit native machine, whose native integers run up to 2^63 - 1.
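The difference between a 32 bit and a 64 bit offset field can be seen with Python's standard struct module (a small illustration, not code from memops): packing a 2 GiB byte offset into a signed 32 bit field fails, while a signed 64 bit field takes it easily.

```python
import struct

offset = 2 * 1024**3  # a byte offset just past the signed 32-bit limit

try:
    struct.pack('<i', offset)       # 'i' = signed 32-bit: overflows
except struct.error as err:
    print("32 bit field:", err)

packed = struct.pack('<q', offset)  # 'q' = signed 64-bit: fine
print("64 bit field:", packed.hex())
```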
>
> Now the crash that you're getting is because of this 32 bit limitation. However, this crash could occur in
> 1) the OS itself - in which case a large file patch to your kernel *could* work but I wouldn't want to be the one trying it!
> 2) in our memops C code - we could rewrite it to use 64 bit 'long long' offsets, but this would be at the (minimal) expense of efficiency for small spectrum files (and a lot of expense for Wayne), or
> 3) in Python itself, in which case you would have to email the Python developers and explain explicitly what you want them to do.
>
> Possible ways around this:
>
> 1) Slice your spectrum so that we have two smaller, more manageable files (but then two spectra, and so two peak lists etc. etc.)
> 2) Re-process your data at a lower resolution, or chop out empty parts of the spectrum (again reducing the file to < 2 GB)
> 3) Get a 64 bit computer.
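Option 1 could be scripted along these lines (a hypothetical sketch - 'split_file' and its chunk size are illustrative names, not part of Analysis, and a real spectrum would need to be cut on block boundaries rather than at arbitrary byte offsets):

```python
import os
import tempfile

def split_file(path, max_bytes):
    """Write consecutive slices of 'path', each at most max_bytes long."""
    parts = []
    with open(path, 'rb') as src:
        index = 0
        while True:
            chunk = src.read(max_bytes)
            if not chunk:
                break
            part_path = f"{path}.part{index}"
            with open(part_path, 'wb') as dst:
                dst.write(chunk)
            parts.append(part_path)
            index += 1
    return parts

# Tiny demo file; a real run would use a limit just under 2 GiB.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, 'spec.bin')
    with open(src, 'wb') as fh:
        fh.write(b'x' * 10)
    print([os.path.getsize(p) for p in split_file(src, 4)])  # → [4, 4, 2]
```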
>
> It's inconvenient, but I don't think there will be an easy way to handle 2 GB files on 32 bit machines.
>
> But then again, if this isn't the case, we can only help if you send your machine details AND the stack trace from the error.
>
> Dan
>
>
> On 17 Feb 2010, at 16:26, Justin Lecher wrote:
>
>> Thanks for the answer Dan,
>>
>> In principle I agree, from what I've read on the net. But what I also found
>> was support for large files on 32 bit platforms. Isn't that possible with
>> Python, or is it due to the format in which the NMR data is stored?
>>
>> Regarding the RAM usage, I have never had problems with Analysis, even when
>> I have more than 10 open windows and a large number of spectra visible. More
>> restrictive is the I/O when moving to another position.
>>
>> justin
>>
>>
>> --
>> Justin Lecher
>> Institute for Neuroscience and Biophysics
>> ISB 3 - Institute for structural biochemistry
>> Research Centre Juelich GmbH,
>> 52425 Juelich,Germany
>> phone: +49 2461 61 5385
>>
>>
>
> Daniel O'Donovan
> [log in to unmask]
>
>
>
Daniel O'Donovan
[log in to unmask]