Hi,
> If the amount of data in code block is not terribly large, and the same
> data is used in code block 2, then it is possible that the data still
> resides in cache and code block 2 gets executed for almost free.
Well, code block 2 uses exactly the same data as block 1, and the data does
fit in cache. Basically, these are tree operations (with lots of
indirections). The first block traverses a path from a node up to the root
of the tree, and the second updates the values at the nodes along that path
if part 1 determines that an update is necessary. So maybe you are right,
that could be it. I am still puzzled that the total execution time is
marginally faster. It may be an algorithmic detail (since the update in
part 2 changes the tree that part 1 operates on), although from what I know
this should not be the case.
Thanks,
Aleksandar, who hates cache-penalties that make algorithm design so much
more difficult than with pen and paper...
_____________________________________________
Aleksandar Donev
http://www.pa.msu.edu/~donev/
[log in to unmask]
(517) 432-6770
Department of Physics and Astronomy
Michigan State University
East Lansing, MI 48824-1116
_____________________________________________