On Mon, 30 Oct 2006, Tim Jenness wrote:
> On Mon, 30 Oct 2006, David Berry wrote:
>
> > > Another data point, fresh build of my 64bit system and it works
> > > fine. I'm wondering now whether it's a compiler optimization issue?
> > > I used g95 (2 weeks old) and gcc 4.1.1 with -g -O2. The 32-bit Hilo
> > > system is old gcc3/g77 combination with -g -O2.
> >
> > Got round to looking at this at long last. Unfortunately I can't get
> > the problem to appear. This is using FC4 on i386 with g95 (gcc 4.0.1).
> > Are there any g77 machines at RAL?
> >
>
> Ok. I'm inclined to stop worrying about it for the moment and just note the
> bizarre problem. I've tried turning off optimization in ARY and NDF to no
> avail. (maybe I should try a fresh build with just -g).
Re-opening this one, I've run into something that looks similar. If I
LINPLOT a spectrum of mine I get a core dump that gives:
#0 0x0000002a99816c34 in ary_bound_ (iary=0x2a979773a0, ndimx=0x2a957aff0c,
lbnd=0x7fbfffc190, ubnd=0x7fbfffc110, ndim=0x522c18, status=0x7fbfffdb7c)
at ary_bound.f:150
150 NDIM = ACB_NDIM( IACB )
which is the same line that Tim reported. Tracking this error back
through the stack reveals the problem on line 554 of kpg1_asget.f:
        CALL NDF_BOUND( INDF, NDF__MXDIM, LBND, UBND, NDIM, STATUS )
NDIM is declared as a PARAMETER (i.e. a constant) in linplot.f, yet it
is passed here as an output argument and gets modified by the call.
DISPLAY has the same problem (NDIM declared as a PARAMETER), so I think
the fix should also solve that issue, though I cannot reproduce the
problem myself.
David,
I've changed kpg1_asget to fix this, but you should check I've not broken
anything.
Cheers,
Peter.