Dear list,
In Possum there are two ways of specifying the noise level: you can add
thermal noise with a given SNR, or you can specify a value for sigma
directly. Can someone explain the relationship between SNR, sigma, mean
intensity, etc.?
In my case, with a specified SNR of 20, I get a sigma of
0.01311808984375 with a medintensity of 33.582310. For an SNR of 10, I
get a sigma of 0.02623617968750 with the same medintensity of 33.582310.
When I calculate the standard deviation of the image, I get 5.01, so
clearly sigma != stdev. What is the difference between them? How are
they related? And, most importantly, how are they calculated?
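To make the pattern in those numbers concrete, here is a small sketch
(just arithmetic on the values quoted above, not anything from the Possum
source): sigma appears to scale as 1/SNR, and medintensity divided by
(sigma * SNR) comes out to exactly 128 in both runs, which might be a
matrix dimension, though that is only an observation on my part.

```python
# Values copied from the two Possum runs described above.
snr_sigma = {20: 0.01311808984375, 10: 0.02623617968750}
med_intensity = 33.582310

# sigma * SNR is the same constant in both runs, so sigma ~ 1/SNR.
products = {snr: sigma * snr for snr, sigma in snr_sigma.items()}
print(products)  # both ~0.2623618

# Curiously, medintensity / (sigma * SNR) == 128 exactly for both runs.
# (Possibly an image-matrix dimension? Pure speculation on my part.)
for snr, prod in products.items():
    print(snr, med_intensity / prod)
```

If that factor of 128 really is a matrix dimension, it would explain why
sigma is so much smaller than the image standard deviation, but I would
like confirmation from someone who knows the code.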
The literature is very vague on this issue. For example, one article I
found gives SNR = (mean intensity during activation - mean intensity at
rest) / standard error of the noise. Is this correct? If so, is it
defined at the voxel, ROI, or whole-image level? Also, what is the
relation between the standard deviation I calculate with fslstats, the
standard error of the noise (whatever that is), and the sigma specified
in Possum?
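For clarity, this is the definition from that article as I read it, with
made-up example numbers (all names here are my own, hypothetical, and
the article does not say at which level the means are taken):

```python
def snr_from_contrast(mean_activation, mean_rest, se_noise):
    """SNR as quoted from the article: contrast over noise standard error.

    Whether the means are per voxel, per ROI, or whole-image is exactly
    what I am unsure about.
    """
    return (mean_activation - mean_rest) / se_noise

# Illustrative numbers only:
print(snr_from_contrast(110.0, 100.0, 2.0))  # -> 5.0
```

Is this the definition Possum uses internally, or does it define SNR
relative to the mean signal rather than an activation contrast?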
Many thanks,
--
Bjorn Roelstraete
Ghent University
Department of Data-analysis
H. Dunantlaan 1, B-9000 Gent, Belgium
Tel: 32-9-2646434
E-mail: [log in to unmask]