On Nov 22, 2004, at 9:20 PM, Keith Bierman wrote:
> Richard E Maine wrote:
>> Admittedly, computers have gotten fast enough that it has gotten hard
>> for me to time the original version by hand, so these days I override
>> its normal algorithm and tell it to use one that's bad enough that I
>> can ask for 500 iterations and it will actually do that many instead
>> of
>> stopping after 4 or 5 because it has converged. :-)
>
> so it spends all its time computing near zero? If so, it sounds like
> it's going to be pretty distorted, in that few applications spend
> nearly all of their time in underflow ;>
Nah. That's more like what would have happened if I used the usual
algorithms and did something like turning off all convergence criteria
and other sanity checks to try to force 500 iterations. And I'd have to
actually make code mods to turn off some of the checks, because some
are basic enough that they aren't normally disabled.
But if I tell it to use a naive steepest descent (which is one of the
options, albeit a rarely used one), it is still wandering around
without having yet gotten very near to convergence at 500 iterations.
Not an unusual amount of underflow. Should be pretty representative of
realistic computation time per iteration - just not very representative
of the number of iterations needed.
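The behavior described above - naive steepest descent still wandering after 500 iterations on a problem a good method solves in a handful - is easy to reproduce. Here is a minimal sketch (not Richard's actual code; the Rosenbrock test function and the backtracking line search are my own illustrative choices) showing steepest descent failing to meet a gradient-norm convergence criterion within 500 iterations:

```python
import numpy as np

def rosenbrock(x):
    # Classic ill-conditioned test function; steepest descent is
    # notoriously slow on its curved, narrow valley.
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def steepest_descent(f, grad, x0, max_iters=500, tol=1e-8):
    """Naive steepest descent with a simple backtracking line search.

    Returns (x, iterations_used, converged_flag)."""
    x = np.asarray(x0, dtype=float)
    for i in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # the convergence criterion
            return x, i, True
        step = 1.0
        # Halve the step until the objective actually decreases.
        while f(x - step * g) >= f(x) and step > 1e-16:
            step *= 0.5
        x = x - step * g
    return x, max_iters, False

x, niters, converged = steepest_descent(rosenbrock, rosenbrock_grad,
                                        [-1.2, 1.0])
```

Each iteration does a normal amount of honest floating-point work (function and gradient evaluations, a line search), so per-iteration timing stays representative; only the iteration count is inflated, which is exactly the point.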
--
Richard Maine | Good judgment comes from experience;
              | experience comes from bad judgment.
| -- Mark Twain
|