I don't think there is a universally agreed-upon answer. In my opinion
several things led to the decision to drop Interval Arithmetic.
In no particular order:
The general magnitude of the task: just lots of stuff to do in a
relatively short time.
The shifting target problem. There were several different opinions
about what IA should be. Especially for questions about how
optimization could/should/would work. Questions like is X*X the same
as X**2? Or is X=Y; Z=X*Y the same as Z=Y*Y? It's not that people
didn't have answers to these questions, it's that different
people had different answers.
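The X*X versus X**2 question is the classic "dependency problem" of interval arithmetic. A minimal sketch in Python, using a toy Interval class (purely illustrative, not any proposed Fortran binding):

```python
class Interval:
    """Toy closed interval [lo, hi]; illustrative only."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __mul__(self, other):
        # Plain interval multiplication treats the two operands as
        # independent, so X*X cannot "see" that both factors agree.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __pow__(self, n):
        # Squaring knows both factors are the same variable, so the
        # result can never go negative.
        assert n == 2
        lo2, hi2 = self.lo ** 2, self.hi ** 2
        if self.lo <= 0 <= self.hi:
            return Interval(0, max(lo2, hi2))
        return Interval(min(lo2, hi2), max(lo2, hi2))

x = Interval(-1, 2)
print((x * x).lo, (x * x).hi)    # -2 4 : wider than necessary
print((x ** 2).lo, (x ** 2).hi)  # 0 4  : the tight range
```

An optimizer that rewrites X*X as X**2 (or vice versa) silently changes the computed bounds, which is exactly why people disagreed about what transformations should be allowed.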
Should we allow things like X = 2*Y? Or how about X = 3.14*Y,
where the mixed modes can completely destroy the concept of
interval arithmetic? Same question for X = 3.14. Should this
be disallowed, or allowed as just one more dumb thing a
programmer can do?
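The danger in the mixed-mode case is that a literal like 3.14 has already been rounded to binary before the interval machinery ever sees it. A short Python sketch, using exact rational arithmetic to expose the rounding:

```python
from fractions import Fraction

# The literal 3.14, compiled as an ordinary real/double, is silently
# rounded to the nearest binary value, which is not the decimal 3.14.
# A degenerate interval built from that value therefore fails to
# contain the number the programmer actually wrote.
stored = Fraction(3.14)     # exact value of the binary double
written = Fraction("3.14")  # exact value of the decimal text
print(stored == written)               # False
print(float(abs(stored - written)))    # tiny, but nonzero
```

So allowing X = 3.14 as an interval assignment quietly breaks the containment guarantee that is the whole point of IA.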
The need for lots of new operators to handle relations for intervals
that partially or completely overlap, are contained in one another, are
degenerate, etc., without obvious syntax like "<".
Uncertainty about how routines like MAX would work.
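A few of the relational and MAX questions can be made concrete in Python. The predicate names here are hypothetical illustrations, not proposed Fortran syntax; intervals are plain (lo, hi) tuples:

```python
# "a < b" splits into at least two distinct meanings for intervals:
def certainly_lt(a, b):
    # every point of a is below every point of b
    return a[1] < b[0]

def possibly_lt(a, b):
    # some point of a is below some point of b
    return a[0] < b[1]

def contains(a, b):
    # b lies entirely inside a
    return a[0] <= b[0] and b[1] <= a[1]

def interval_max(a, b):
    # One plausible reading of MAX: the tightest interval enclosing
    # max(x, y) for all x in a and y in b.
    return (max(a[0], b[0]), max(a[1], b[1]))

a, b = (1.0, 3.0), (2.0, 4.0)   # overlapping intervals
print(certainly_lt(a, b))   # False
print(possibly_lt(a, b))    # True
print(interval_max(a, b))   # (2.0, 4.0)
```

With overlapping operands, "certainly less" and "possibly less" give different answers, so a single "<" operator is genuinely ambiguous.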
No good way to specify constants. Fortran already defines 3.14
as a real constant independent of any context. So how do you
set X = 3.14 +- .01 without introducing conversion inaccuracies
by treating 3.14 as a real? There are two obvious ways. One is a
new piece of syntax, say X = {3.14, .01}, where the {} pair means that
the constants are to be converted in an interval sense, not in a
"real" sense and then converted to interval. The other is
to invent some sort of conversion function that takes, for example,
a character string argument: X = make_interval("3.14, .01").
Neither seemed like a great idea. But we sort of did the latter
when we allowed user-defined overloading of the structure
constructor. If you want to implement IA as a user-defined derived
type, you can easily(?) overload the structure constructor to do
perfect conversion.
This is also a problem for I/O, especially for input where the form
of the number might imply what type it is. What should READ *, X
do if the user enters 3.14?
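The string-conversion approach can be sketched in Python. The name make_interval is taken from the text above but this parsing scheme is only an assumption about how such a function might work; it parses the decimal text exactly, then rounds the endpoints outward so the result is guaranteed to contain the written value:

```python
import math
from fractions import Fraction

def make_interval(text):
    """Parse "center, halfwidth" decimal text into a containing
    float interval (lo, hi). Hypothetical sketch, not std Fortran."""
    center_s, pm_s = (s.strip() for s in text.split(","))
    center = Fraction(center_s)   # exact decimal value
    pm = Fraction(pm_s)
    lo, hi = center - pm, center + pm
    flo, fhi = float(lo), float(hi)   # round-to-nearest conversions
    # Nudge outward if round-to-nearest pulled an endpoint inward,
    # so the exact decimal bounds stay inside [flo, fhi].
    if Fraction(flo) > lo:
        flo = math.nextafter(flo, -math.inf)
    if Fraction(fhi) < hi:
        fhi = math.nextafter(fhi, math.inf)
    return (flo, fhi)

lo, hi = make_interval("3.14, .01")
print(lo, hi)   # a float interval enclosing [3.13, 3.15] exactly
```

READ could in principle do the same exact parse on input text, which is why the form of the number the user types matters: the reader, not the compiler's real-constant conversion, has to decide the rounding. (math.nextafter needs Python 3.9+.)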
No good way to specify expected results. This is mostly a red herring,
since Fortran in general just says everything is processor dependent.
But, to me, the standard needed to try to specify something more
specific than [-INF,+INF] (or even [-NAN,+NAN] ;-} ) for the
expected results.
Politics. Some people thought computers were being called on to
essentially make more and more important decisions (does any human
being really really completely understand all of the calculations
that go into a nuclear reactor safety code?). Others noted that
none of their customers ever asked for anything remotely like IA.
Others thought that OOP was more important and we had limited
resources.
Those are just my guesses; I wasn't at the meeting where IA was
eliminated, so I can freely guess without being burdened by a lot
of useless facts ;-) . Personally, I think the first two in the
list are the main ones.
Finally, it's obvious that there are answers to all of the above
questions/concerns: Sun has implemented IA.
Dick Hendrickson
Aleksandar Donev wrote:
>
> Hello,
>
> Does someone know what happened with proposals/ideas to add interval
> arithmetic support to Fortran? I have been looking at interval
> arithmetic-based global optimization codes/papers and read that someone
> was working on a proposal to add this to F200x, but that apparently did
> not work out?
>
> Thanks for any info,
> Aleksandar
>
> --
> __________________________________
> Aleksandar Donev
> Complex Materials Theory Group (http://cherrypit.princeton.edu/)
> Princeton Materials Institute & Program in Applied and Computational Mathematics
> @ Princeton University
> Address:
> 419 Bowen Hall, 70 Prospect Avenue
> Princeton University
> Princeton, NJ 08540-5211
> E-mail: [log in to unmask]
> WWW: http://atom.princeton.edu/donev
> Phone: (609) 258-2775
> Fax: (609) 258-6878
> __________________________________