Dear Andreas,
sorry for the delay.
> I am currently analyzing a parametric activation experiment in PET with the
> polynomial regression technique as described by Buchel et al. (Neuroimage
> 4:60-66). Using a first- and second-order expansion, the results seem to
> show that almost every voxel that has a significant linear relationship to
> the parameter also has a significant second-order contribution. That is
> confirmed by plotting the activation.
Be sure that you have mean-corrected the regressor before squaring it. By
subtracting the mean you make sure that the 1st- and 2nd-order expansions are
almost orthogonal.
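As a small numerical sketch of this point (Python/NumPy here purely for illustration; the parameter values are hypothetical), squaring a raw parameter gives a regressor highly correlated with the linear one, whereas squaring the mean-corrected parameter gives a nearly orthogonal regressor:

```python
import numpy as np

# Hypothetical parameter values, one per scan (illustration only)
p = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])

# Without centering, linear and quadratic terms are highly correlated
r_raw = np.corrcoef(p, p ** 2)[0, 1]

# Mean-correct first, then square: the two expansions are (here exactly)
# orthogonal, because the centered values are symmetric about zero
pc = p - p.mean()
r_centered = np.corrcoef(pc, pc ** 2)[0, 1]

print(r_raw)       # close to 1
print(r_centered)  # close to 0
```

With real, asymmetric parameter values the centered correlation will not be exactly zero, but it is typically far smaller than the uncentered one.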
> (1) To get a clearer picture of this, I'd like to formally test for voxels
> that show a significant first-order but no significant second-order effect
> and I'd appreciate advice on how to implement this.
In SPM99 this can be done with an F-contrast. If the design matrix columns are
ordered box-car (0th order), linear (1st order), square (2nd order), the
F-contrast testing for a significant contribution of the linear regressor
would be:
-1 2 -1
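To make the mechanics concrete, here is a hedged sketch (Python/NumPy, not SPM; the parameter values, the simulated voxel data, and the effect size are all invented for illustration) of assembling that design matrix and evaluating a single-row F-contrast on it with the standard GLM formula:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, one per scan (illustration only)
p = np.linspace(1.0, 10.0, 20)
pc = p - p.mean()  # mean-correct before squaring, as above

# Design matrix columns in the order described:
# box-car (0th order), linear (1st order), square (2nd order)
X = np.column_stack([np.ones_like(pc), pc, pc ** 2])

# Simulated voxel data with a genuine linear effect plus noise
y = 0.5 * pc + rng.normal(0.0, 1.0, size=pc.size)

# Ordinary least-squares fit
beta, res, rank, _ = np.linalg.lstsq(X, y, rcond=None)
df = y.size - rank
sigma2 = float(res[0]) / df  # residual variance estimate

# Single-row F-contrast from the message, same column order
c = np.array([-1.0, 2.0, -1.0])
F = (c @ beta) ** 2 / (c @ np.linalg.inv(X.T @ X) @ c) / sigma2
print(F)  # large when the weighted combination of effects is present
```

For a one-row contrast this F value is just the square of the corresponding t statistic; SPM computes the same quantity internally when you enter the contrast weights.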
> (2) Secondly, it seems that the nonlinearity seen in the experiment
> represents a sort of ceiling effect. In the above-mentioned paper it is
> suggested to try cosine basis functions which may be more sensitive to this
> kind of phenomenon. How do I construct these regressors from my parameter
> values?
Instead of specifying a linear function, simply expand the parameter with a
cosine function:
f(x) = cos(x);
In more general terms, a Fourier set might be better.
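One way to build such regressors from the parameter values (again a Python/NumPy sketch with hypothetical values; rescaling the parameter so that one half-cycle of the cosine spans its range is my assumption, not a prescription from the message) is:

```python
import numpy as np

# Hypothetical parameter values (illustration only)
p = np.array([5.0, 10.0, 15.0, 20.0, 25.0])

# Assumed scaling: map the parameter range onto [0, pi] so the
# first cosine regressor covers one half-cycle over the range
x = (p - p.min()) / (p.max() - p.min()) * np.pi

# First-order cosine regressor; adding higher harmonics
# (cos 2x, cos 3x, ...) gives a Fourier set
regressors = np.column_stack([np.cos(k * x) for k in (1, 2)])

# Mean-correct each regressor, as with the polynomial expansion
regressors -= regressors.mean(axis=0)
print(regressors.shape)  # one column per basis function
```

A saturating (ceiling-type) response then loads mainly on the low-order cosine terms, which is why this basis can be more sensitive to that kind of nonlinearity.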
-Christian
--
Dr. Christian Buechel
Neurologische Universitaetsklinik, Haus B
Universitaets-Krankenhaus Eppendorf
Martinistr. 52
D-20246 Hamburg
Germany
Tel.: +49-40-42803-4726
Fax.: +49-40-42803-5086
email:[log in to unmask]
www.uke.uni-hamburg.de/kliniken/neurologie/