Dear Erick,
I tested the current internal version on a dataset with random phase,
and the results seem to be as good as with fixed phase, so there
does not appear to be a problem in practice, not just in theory. I'm
not sure about the public version you have, though.
Best,
Vladimir
On Fri, Mar 5, 2010 at 5:43 PM, Karl Friston
<[log in to unmask]> wrote:
> Dear Erik,
>
>> We have been working hard on understanding
>> more about MSP, taking 30 papers, the Statistical Parametric Mapping
>> book, and the code.
>
> We will have to invite you to lecture on our SPM Courses :)
>
>
>> We made dozens of simulations, took data from 4 projects, and debugged
>> the processing line-by-line several times to understand what is going
>> on. I hope the level of our questions has risen a bit.
>
> As Vladimir noted, we are pursuing the same strategy here: trying to
> spot errors and make the code more efficient and robust. This is a
> slow process and depends on the sort of work you are doing. For
> example, Gareth Barnes has just discovered a serious instability in
> the spatial normalization (or realignment) of multiple subjects.
> Having said this, there is consensus that things are starting to
> converge, with the number of successes starting to exceed the number
> of failures!
>
>
>> For now, there are two questions (selected among many...):
>>
>> 1) In spm_eeg_invert, at the SSR and SST calculation, the variance is
>> calculated across temporal modes, then summed across spatial modes.
>> Question: what sense does this make?
>
> The variance explained is over time and channels (i.e., temporal and
> spatial modes, after projection of the data). It would be simple to
> derive the percent variance explained (over channels/spatial modes)
> as a function of time (and vice versa), but this would not be such a
> useful summary of the accuracy (fit).
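For concreteness, the quantity described above can be sketched in a few lines of NumPy. This is my own illustration of the idea, not the SPM code: `Y` and `Yhat` are placeholder data and a placeholder model prediction.

```python
import numpy as np

rng = np.random.default_rng(1)
Y = rng.standard_normal((100, 32))               # data: time bins x channels
Yhat = Y + 0.1 * rng.standard_normal(Y.shape)    # a hypothetical model prediction
R = Y - Yhat                                     # residuals

# percent variance explained over time AND channels (one summary number)
pve = 100 * (1 - (R ** 2).sum() / (Y ** 2).sum())

# the same quantity as a function of time (summing over channels only)
pve_t = 100 * (1 - (R ** 2).sum(axis=1) / (Y ** 2).sum(axis=1))

print(round(pve, 1))   # close to 100 for this small residual
```

Summing the squared residuals and squared data over both dimensions gives the single summary number; restricting the sums to one dimension gives the per-time (or per-channel) profile.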
>
> Note that SPM inverts time-series, not single time-bins. In other
> words, contrary to conventional approaches, it explains the variance
> in the data over time (not just over space). This means that if you
> give it a single time-bin, you would get nothing, because there are
> no temporal variations in the signal.
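The single-time-bin point can be seen numerically with a toy sketch (again my own illustration, not SPM code):

```python
import numpy as np

Y1 = np.array([[1.0, -0.5, 2.0]])   # a single time bin across 3 channels
Yn = np.random.default_rng(2).standard_normal((50, 3))  # 50 time bins

# variance over time is identically zero for one bin, so there is
# nothing for a time-series inversion to explain
print(Y1.var(axis=0))                # [0. 0. 0.]
print((Yn.var(axis=0) > 0).all())    # True for a genuine time-series
```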
>
>
>> ** Note that we get a NaN if, for any reason, you have only one temporal
>> mode: unlikely, but happens, for instance, if you run
>> simmegdip_for_faces with 1 fT/sqrt(Hz) noise; and I would be suspicious
>> of it anyway even for 2-3 temporal modes (in this case, sum(UY) yielded
>> 1.09, which I suppose shouldn't be larger than 1).
>
> This sounds like a bug. It is not impossible to get one temporal
> mode. The number of temporal modes corresponds to the number of
> linearly separable contributions to the signal; for example, if you
> had two dipoles with distinct time courses, you would get two
> temporal modes. I have never seen NaNs unless there has been a
> scaling artefact (with numerical overflow). Could you reproduce this
> with a current version of the code?
>
>
>> 2) We never had good results with induced responses in real data. This
>> was not tested as extensively as ERPs, because in the simulations they
>> worked similarly. However, we were trying to understand how this PCA
>> approach could even work for non-phase-locked data, and noticed very
>> recently that we overlooked a detail: in the simulations, the phase is
>> always the same! Just randomize the phase in simmegdip_for_faces and see
>> what happens. I am sending a png showing the average across trials in
>> one simulation, for two datasets (in/out of phase). Question: how can
>> PCA work with induced responses?
>
> This should be no problem. There will be two temporal modes,
> corresponding to the characteristic frequency, that are pi/2 out of
> phase with each other. You can see this numerically with the
> following:
>
>>> % a 128x8 matrix of period-8 sinusoids, with phase shuffled
>>> Y = spm_phase_shuffle(sin(2*pi*(1:128)'/8)*ones(1,8));
>>> [U,S] = spm_svd(Y);
>>> plot(U)   % the two temporal modes (a sine and a cosine)
>
> In short, a sinusoidal signal with randomly distributed phase can be
> explained completely by a mixture of two temporal modes (a sine and a
> cosine function).
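The same point can be checked outside MATLAB. Here is a small NumPy sketch (my own illustration, not SPM code) showing that trials sharing one frequency but drawn with random phases have exactly two non-negligible singular values:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(128)

# eight "trials" of the same period-8 sinusoid, each with a random phase
Y = np.column_stack(
    [np.sin(2 * np.pi * t / 8 + rng.uniform(0, 2 * np.pi)) for _ in range(8)]
)

# sin(wt + phi) = cos(phi)*sin(wt) + sin(phi)*cos(wt), so every column is
# a linear combination of the same two functions: the matrix has rank 2
s = np.linalg.svd(Y, compute_uv=False)
print((s > 1e-8).sum())
```

The number printed is the numerical rank, which matches the two-mode (sine plus cosine) explanation above.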
>
> I hope this helps :)
>
> With very best wishes,
>
> Karl
>