The actual argument that corresponds to an assumed-shape dummy argument
need not have unit stride along its first dimension.
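For concreteness, here is a minimal sketch (the names `stride_demo` and
`col_sums` are my own, just for illustration) of how an actual argument with
non-unit first-dimension stride arises: passing an array section whose first
subscript has a stride greater than one.

```fortran
program stride_demo
  implicit none
  real :: x(100,100), colsums(100)
  call random_number(x)
  ! Unit stride in the first dimension: whole array, columns contiguous.
  call col_sums(x, colsums)
  ! Non-unit stride in the first dimension: x(1:99:2,:) selects every
  ! other row, so the descriptor's first-dimension stride is 2.  What the
  ! compiler does here -- index through the descriptor, or copy -- is
  ! exactly the question below.
  call col_sums(x(1:99:2,:), colsums)
contains
  subroutine col_sums(a, s)
    real, intent(in)  :: a(:,:)   ! assumed-shape dummy argument
    real, intent(out) :: s(:)
    integer :: j
    do j = 1, size(a, 2)
      s(j) = sum(a(:, j))
    end do
  end subroutine col_sums
end program stride_demo
```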
Do you know what your compilers do in the presence of assumed-shape dummy
arguments?
1. Do they generate a subprogram that always looks at a descriptor for an
assumed-shape dummy argument, even when calculating subscripts along the
first dimension (i.e., they don't assume unit stride in the first
dimension of assumed-shape dummy arguments), and leave the actual
argument as-is, or do they
2. generate a subprogram that assumes assumed-shape dummy arguments have
unit stride in the first dimension, and take a copy of the actual
argument when it doesn't have unit stride in the first dimension,
or do they
3. generate a subprogram that assumes assumed-shape dummy arguments
have unit stride in the first dimension, and always take a copy of
the actual argument, or do they
4. generate a copy of the dummy argument within the subprogram when
it doesn't have unit stride, and leave the actual argument as-is,
or do they
5. generate a copy of the dummy argument within the subprogram always,
and leave the actual argument as-is,
6. or do they do something else I haven't mentioned?
The question underlying these questions is "If I use assumed-shape dummy
arguments, can I count on performance (nearly) as good as for assumed-size
dummy arguments in the case that the corresponding actual argument has unit
stride in its first dimension?"
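For comparison, an assumed-size version of the same routine (again, the names
are mine, purely illustrative) must take the leading dimension explicitly, and
the compiler may assume the first dimension is contiguous; passing a strided
section to it then forces copy-in/copy-out at the call site.

```fortran
! Assumed-size counterpart: contiguity of the first dimension is
! guaranteed by the language, so no descriptor lookup is needed in the
! innermost loop -- at the price of passing lda and n explicitly.
subroutine col_sums_as(a, lda, m, n, s)
  implicit none
  integer, intent(in) :: lda, m, n
  real, intent(in)  :: a(lda, *)   ! assumed-size dummy argument
  real, intent(out) :: s(n)
  integer :: j
  do j = 1, n
    s(j) = sum(a(1:m, j))
  end do
end subroutine col_sums_as
```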
Thanks in advance and best regards from
Van Snyder