You shouldn't specify such a small explicit mask at the first level.
Instead, fit a model for the entire brain and use small volume correction
later on if you want to restrict your inference to a reduced search space.
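[Editorial note: as a quick sanity check before first-level model specification, one can count the in-mask voxels of an explicit mask — the thread below mentions a 54-voxel mask triggering the error. A minimal sketch with numpy; loading the actual image (e.g. with nibabel or spm_read_vols) is left aside, and the 100-voxel cut-off is purely illustrative, not an SPM default.]

```python
# Sketch: count the voxels in a binary explicit mask before using it
# in a first-level model. The cut-off below is illustrative, not SPM's.
import numpy as np

def count_mask_voxels(mask: np.ndarray) -> int:
    """Return the number of non-zero (in-mask) voxels."""
    return int(np.count_nonzero(mask))

# Example: a 3-voxel mask inside a 4x4x4 volume
mask = np.zeros((4, 4, 4))
mask[1, 1, 1] = mask[1, 1, 2] = mask[2, 2, 2] = 1
n = count_mask_voxels(mask)
print(n)  # 3
if n < 100:  # arbitrary illustrative threshold; e.g. 54 voxels triggered the error
    print("warning: very small explicit mask; consider whole-brain model + SVC")
```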
On 26/04/17 13:18, [log in to unmask] wrote:
> p.s. the error appears to be related to the number of scans as well - it
> is more likely to appear when there are few scans (<100).
> On Wed, Apr 26, 2017 at 2:06 PM, <[log in to unmask]
> <mailto:[log in to unmask]>> wrote:
> Dear Guillaume,
> I am having the same error but it does not seem to relate to the
> design (I tested this using several alternatives).
> Rather, it always occurs when I specify a small explicit mask
> (e.g. 54 voxels) in the 1st-level fMRI model specification.
> I hope this helps to pin down its source.
> Best regards,
> On Mon, Apr 24, 2017 at 12:25 PM, Guillaume Flandin
> <[log in to unmask] <mailto:[log in to unmask]>> wrote:
> Dear Angelique,
> This usually means there is something odd with the data or the
> model you
> specified, or both. What does your design matrix look like?
> With a mixed design, the simplest would be to specify one-sample or
> two-sample t-tests for each question you have (main effects,
> interactions).
> Best regards,
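[Editorial note: the suggestion above can be carried out via the summary-statistic route: for the within-subject factor, compute per-subject (post − pre) difference images, then a two-sample t-test between groups on those differences tests the group × time interaction. The sketch below uses simulated per-subject scalar values standing in for one voxel; it is not SPM code, and all numbers are made up.]

```python
# Sketch of the "simplest" route for a 2x2 mixed design: reduce the
# within-subject factor to per-subject (post - pre) differences, then
# compare the groups with a two-sample t-test on those differences.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 8  # subjects per group, as in the original post

# simulated pre/post measurements per subject (one voxel's worth of data)
pre_exp, post_exp = rng.normal(0, 1, n), rng.normal(1.0, 1, n)  # experimental
pre_ctl, post_ctl = rng.normal(0, 1, n), rng.normal(0.2, 1, n)  # control

diff_exp = post_exp - pre_exp  # within-subject change, experimental group
diff_ctl = post_ctl - pre_ctl  # within-subject change, control group

# Two-sample t-test on the differences = test of the group x time interaction
t, p = stats.ttest_ind(diff_exp, diff_ctl)
print(f"t = {t:.2f}, p = {p:.3f}")
```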
> On 22/04/17 12:16, Angelique Van Ombergen wrote:
> > Hi all,
> > I’m running a 2×2 ANOVA in SPM (with one within-group factor
> (pre and
> > post scan) and one between-group factor (experimental and
> control group)
> > and n=8 in each group).
> > However, when running the model estimation, I get the
> following error:
> > Chunk 1/1 :
> > Failed 'Model estimation'
> > Error using spm_est_non_sphericity (line 207)
> > Please check your data: There are no significant voxels.
> > In file
> > (v6827), function "spm_est_non_sphericity" at line 207.
> > In file "/Users/angeliquevanombergen/spm12/spm_spm.m" (v6842),
> > "spm_spm" at line 431.
> > In file
> > (v5809), function "spm_run_fmri_est" at line 33.
> > The following modules did not run:
> > Failed: Model estimation
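[Editorial note on the error quoted above: SPM estimates the non-sphericity (covariance components, via ReML) only from voxels whose overall F statistic exceeds an uncorrected threshold, p < 0.001 by default. With a tiny explicit mask or very few scans, no voxel may pass, and estimation aborts with exactly this message. The sketch below illustrates that selection step schematically; the function name and F values are illustrative, not SPM's actual code.]

```python
# Schematic of the check that fails in spm_est_non_sphericity: covariance
# components are pooled only over voxels whose overall F exceeds an
# uncorrected threshold (p < 0.001 by default in SPM). If no voxel
# passes, estimation cannot proceed. F values here are made up.
import numpy as np
from scipy import stats

def select_pooling_voxels(F, df1, df2, p_thresh=0.001):
    """Return indices of voxels whose F exceeds the uncorrected
    p < p_thresh cut-off (the pool used for non-sphericity estimation)."""
    F_crit = stats.f.ppf(1 - p_thresh, df1, df2)
    return np.flatnonzero(F > F_crit)

F = np.array([0.5, 1.2, 2.0])  # e.g. a tiny mask with only weak effects
sel = select_pooling_voxels(F, df1=2, df2=20)
if sel.size == 0:
    # mirrors: "Please check your data: There are no significant voxels."
    print("no voxels exceed the threshold; non-sphericity cannot be estimated")
```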
> > I am just wondering whether this is an actual error or just
> the fact
> > that there’s nothing significant (as it basically says).
> However, I’ve
> > never had this error before, so I’m not sure.
> > When I run a paired t-test between pre and post for each group
> > separately, I do find significant differences though.
> > Thanks for the advice!
> > Best wishes,
> > Angelique
> > drs. Angelique Van Ombergen
> > *Antwerp University Research centre for Equilibrium and
> Aerospace (AUREA) *
> > *FWO research fellow*
> > *University of Antwerp*
> > _______________________________
> > Antwerp University Hospital (UZA)
> > Wilrijkstraat 10 (route 71, B904)
> > 2650 Edegem (Antwerp)
> > Belgium
> > lab phone: +32 (0)3 821 33 07
> > mobile phone: +32 (0)4 73 237 820
> > web: www.uantwerpen.be/aurea or
> > www.uantwerpen.be/angelique-vanombergen
> > blog: www.neuropolisblog.com
Guillaume Flandin, PhD
Wellcome Trust Centre for Neuroimaging
University College London
12 Queen Square
London WC1N 3BG