Dear Christian,
Thank you so much, the code works!
I'm so glad :)
 
Am I correct that you calculated the mutual information value as H(A)+H(B)-H(A,B),
which gives the code
H  = H/(sum(H(:))+eps);              % normalise joint histogram to probabilities
s1 = sum(H,1);                       % marginal distribution of one image
s2 = sum(H,2);                       % marginal distribution of the other image
H  = H.*log2((H+eps)./(s2*s1+eps));  % pointwise mutual-information contributions
mi = sum(H(:));
fprintf('Mutual information: %3.4f\n',mi);
 
if I want to add the value of
1. the joint entropy H(A,B) = -sum(p(a,b)*log2(p(a,b)))
and
2. the normalised mutual information = (H(A)+H(B))/H(A,B)
 
What code should it be?
Could you please help again?
 
Thank you very much,
Panatsada
 
Dear Panatsada,

Try the attached function, which provides either the histogram of one image or the
joint histogram of two images.
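For the joint entropy and normalised mutual information you asked about, here is a minimal sketch of how all three quantities can be computed from a joint histogram. It uses the same eps-regularised convention as your MATLAB snippet, but is written in Python/NumPy for illustration; the helper name `mi_measures` is my own, not part of the attached function.

```python
import numpy as np

def mi_measures(H):
    """Given a 2-D joint histogram H, return (mi, joint_entropy, nmi).

    mi          : H(A) + H(B) - H(A,B)
    joint_entropy: H(A,B) = -sum p(a,b) log2 p(a,b)
    nmi         : (H(A) + H(B)) / H(A,B)
    """
    eps = np.finfo(float).eps
    P = np.asarray(H, dtype=float)
    P = P / (P.sum() + eps)                 # normalise to a joint probability
    s1 = P.sum(axis=0, keepdims=True)       # marginal of one image
    s2 = P.sum(axis=1, keepdims=True)       # marginal of the other image
    # pointwise MI contributions, summed (same as the MATLAB snippet)
    mi = np.sum(P * np.log2((P + eps) / (s2 @ s1 + eps)))
    h_joint = -np.sum(P * np.log2(P + eps))     # joint entropy H(A,B)
    h1 = -np.sum(s1 * np.log2(s1 + eps))        # marginal entropy
    h2 = -np.sum(s2 * np.log2(s2 + eps))        # marginal entropy
    nmi = (h1 + h2) / (h_joint + eps)           # normalised MI
    return mi, h_joint, nmi
```

For perfectly registered identical images the joint histogram is diagonal and NMI reaches its maximum of 2; for independent images MI goes to 0 and NMI to 1.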

Regards,

Christian

--
____________________________________________________________________________

Christian Gaser, Ph.D.
Assistant Professor of Computational Neuroscience
Department of Psychiatry
Friedrich-Schiller-University of Jena
Jahnstrasse 3, D-07743 Jena, Germany
Tel: ++49-3641-934752 Fax: ++49-3641-934755
e-mail: [log in to unmask]
http://dbm.neuro.uni-jena.de

On Thu, 4 Mar 2010 18:24:00 +0000, Panatsada Awikunprasert
<[log in to unmask]> wrote:

>Dear SPMers,
>Can anyone please tell me how to create a joint histogram?
>I want to measure how good the registration (clinical data) is. Can anyone tell me,
>please?
>Thank you very much,
>Panatsada
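[For readers without the attached function: the joint histogram asked about above simply bins the intensity pairs of two equally sized images. A minimal NumPy sketch, assuming both inputs share the same shape (the helper name `joint_histogram` is my own):

```python
import numpy as np

def joint_histogram(a, b, bins=64):
    """2-D histogram of corresponding intensity pairs from two images.

    Each entry H[i, j] counts how often an intensity in bin i of image `a`
    co-occurs with an intensity in bin j of image `b` at the same voxel.
    """
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    H, _, _ = np.histogram2d(a, b, bins=bins)
    return H
```

The better the registration, the more mass concentrates along a thin curve in this histogram, which is exactly what the mutual-information measures above quantify.]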

