Hi Peter,
The first workspace does indeed work; it seems I did something wrong while creating the second workspace, which ended up as 128x128x1x50 (to be converted to 128x128x50) instead of 128x128x6x50, while squeeze still solves it for the first workspace.
The script now indeed runs correctly on my computer too. I think I will next try to do all this data reorganization within SPM, instead of going back and forth between manipulating .nii files in FSL and Matlab, to avoid overlooking mistakes like this (or have Matlab call FSL functions directly, which I'm aware is possible).
While there is now indeed a matrix that can be exported with spm_write_vol, as in the original T2star script, unfortunately the final output values are quite different (systematically lower) from those of our original homemade software. This is strange, since makeT2star again produced an almost identical output and in fact seemed easier to optimize to run faster. I will now try to troubleshoot why this is the case; obviously this is much easier now that I can visualize the output and compare which changes bring the values closer to our old data and the literature, so thanks again for dedicating your time to helping me with this.
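For reference while troubleshooting systematically shifted values: the model being fitted is a mono-exponential decay, S(TE) = S0 * exp(-TE/T2). A minimal sketch in Python/NumPy of a log-linear fit against the same 12–600 ms echo times (this is illustrative only, not the actual T2Map code):

```python
import numpy as np

# Synthetic single-voxel decay with a known T2 (ms), matching the 12..600 ms echo spacing
tes = np.arange(12, 601, 12, dtype=float)   # 50 echo times
t2_true, s0_true = 80.0, 1000.0
signal = s0_true * np.exp(-tes / t2_true)

# Log-linear least-squares fit: ln S = ln S0 - TE / T2
slope, intercept = np.polyfit(tes, np.log(signal), 1)
t2_fit = -1.0 / slope
s0_fit = np.exp(intercept)
```

Comparing a fit like this against T2Map's output on a few hand-picked voxels can show whether the discrepancy comes from the fitting itself or from thresholding/masking.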
Best wishes, Javier
>Good morning,
>Interestingly, on my computer
>>> load('XYZ\javierspm\workspacejavier.mat')
>>> indata = squeeze(indata(:,:,1,:));
>>> T2Map(indata, tes, 0, maxT2, display);
>checks out, while
>>> load('XYZ\javierspm2\matlabworkspace.mat')
>>> indata = squeeze(indata(:,:,1,:));
>>> T2Map(indata, tes, 0, maxT2, display);
>does not. So you might want to double-check your inputs (indata).
>Additionally, the former does not display anything but blue for minimum = 10.
>Good luck!
>Kind regards,
>Peter
On 06.04.2021 at 10:02, Javier Arcos Hódar wrote:
> Hi,
>
> that's among the first things I checked: lowering minimum to 0.1, and the problem persists (now that I have tried 0, which I originally judged inadequate, the result is the same, full of 0s and NaNs)
>
> I've also found that too-low values for the minimum-based exclusion are problematic with the T2star script. When I exported the data to RStudio, a software I'm more comfortable with, a minimum of 2.5 already lowered the voxel-value correlation between the output of our old software and the script from comfortably above .90 to .635, so fitting the curve while accepting those tiny values seems to greatly reduce the accuracy of the output. So even if that worked it could lead to other problems, but it doesn't seem to be the real cause.
>
> Thanks for the suggestion anyway; as I said, I will try other data acquisitions, fewer echo times, other scripts for this purpose now that I can get spm_read_vols to work properly, etc., to see if I can narrow down the problem before asking further here on the mailing list.
>
>
> On 05-04-2021 17:07, Peter Stöhrmann wrote:
>> Hello,
>>
>> To me, this appears to be related to the variable "minimum", which is
>> used for thresholding (T2Map.m, line 50). Please try using a lower
>> threshold, e.g. 0 just to verify whether that's the problem.
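To illustrate the suspected mechanism (a hedged sketch in Python/NumPy, since T2Map's exact masking line isn't quoted here): if every voxel value sits below "minimum", nothing gets fitted and the map comes back all NaN, which matches the symptom.

```python
import numpy as np

rng = np.random.default_rng(0)
indata = rng.random((4, 4, 10))          # toy X x Y x TE data, values in [0, 1)
minimum = 10                             # threshold far above every voxel value
excluded = indata[:, :, 0] < minimum     # True everywhere for this toy data
t2map = np.where(excluded, np.nan, 0.0)  # nothing gets fitted: map is all NaN
```

Dropping the threshold to 0 is a quick way to rule this mechanism in or out.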
>>
>> Kind regards,
>>
>> Peter
>>
>> On 05.04.2021 at 13:23, Javier Arcos Hódar wrote:
>>> Greetings,
>>>
>>> thanks for the advice regarding the for loop, and regarding the squeeze function, which was indeed the last step needed for the script to run. Unfortunately, I now find the output matrix to be, apparently, nothing but NaNs and 0s. I will explore whether this also happens with our other files, even though that would be a bit strange (the number of slices shouldn't matter, the mask should work, and makeT2star works on our data), or search for alternatives now that I know what my problem with importing a .nii as a matrix was.
>>>
>>> Thanks for your help
>>>
>>>
>>> On 05-04-2021 12:58, Peter Stöhrmann wrote:
>>>> Hello,
>>>>
>>>> Currently your variable "indata" is still 4-dimensional, i.e.
>>>> 128x128x1x50 (you can check the dimensions using either size(indata)
>>>> or whos(indata)).
>>>>
>>>> In order to remove any dimensions of size 1 you could use squeeze(indata).
>>>>
>>>> The IMHO easiest way to use all the slices separately, though, would
>>>> be to load the masked 4D file just as you did before and then use a
>>>> for-loop to reformat and feed the data into the function, similar to
>>>> how it is done in 'makeT2star.m', or, if you will, something along
>>>> the lines of
>>>>
>>>> nSlices = size(indata, 3); % nSlices = 6
>>>>
>>>> for ii = 1 : nSlices
>>>>     outputInProperFormatForFurtherUse = T2Map(squeeze(indata(:,:,ii,:)), tes, minimum, maxT2, display);
>>>> end
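The same per-slice idea, sketched in NumPy for illustration; t2map_slice is a hypothetical stand-in (a plain log-linear fit), not the actual T2Map internals:

```python
import numpy as np

def t2map_slice(slice_data, tes):
    # Hypothetical stand-in for T2Map on one X x Y x TE slice:
    # log-linear mono-exponential fit, ln S = ln S0 - TE / T2.
    x, y, n_te = slice_data.shape
    logs = np.log(slice_data.reshape(-1, n_te))
    slopes = np.polyfit(tes, logs.T, 1)[0]       # one slope per voxel
    return (-1.0 / slopes).reshape(x, y)

tes = np.arange(12, 601, 12, dtype=float)        # 50 echo times
# Synthetic 128x128x6x50 volume decaying with T2 = 80 ms everywhere
indata = np.ones((128, 128, 6, 1)) * np.exp(-tes / 80.0)

n_slices = indata.shape[2]                       # 6
out = np.empty((128, 128, n_slices))
for ii in range(n_slices):
    # Integer indexing already drops the singleton dim (MATLAB needs squeeze)
    out[:, :, ii] = t2map_slice(indata[:, :, ii, :], tes)
```

Note that each iteration's output is stored into its own slice of `out`; in the MATLAB loop above, the output variable would be overwritten each time unless it is likewise indexed or collected.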
>>>>
>>>> Kind regards,
>>>>
>>>> Peter
>>>>
>>>> On 05.04.2021 at 09:45, Javier Arcos Hódar wrote:
>>>>> Greetings,
>>>>>
>>>>> sorry for the delay, but I had issues with a Windows update interacting with my VirtualBox, and then it was Easter, so I could not access my work computer.
>>>>>
>>>>> First off, apologies, because the cause of the error does indeed seem to be something that was in the comments of the original script: I got so focused on trying to repurpose the other script for our objectives that I forgot about that part during my brief attempt with T2Map alone. I tried to fix this by using the FSL function fslsplit (again, perhaps clumsily; I will also find out later which SPM functions do this) to get six 128x128x50 images, where the third dimension holds all the echo times for each slice, meaning I would have to run the script six times and then merge the files.
>>>>>
>>>>> This is OK, doable, but the problem remains: when I import using spm_vol (.nii file) to get the header and spm_read_vols (.nii file, mask), even though the nii file is now clearly a 3D file, it is still imported as a 4-D double, returning the same error. What I find in the SPM wiki about turning 4D data into 3D seems to refer to different issues (fMRI data, for example), so I'd be glad if you could point out whether SPM offers any simple way to solve this, or whether the problem is that spm_read_vols will always give a 4D matrix and I should therefore import the image as a matrix in some other way.
>>>>>
>>>>> An issue particular to our sort of acquisition, unfortunately, is that our data is so big that I can't inspect the matrix object imported into the Matlab workspace, because it has too many elements. I will try something like importing just 10 echo times to see if the solution is obvious then.
>>>>>
>>>>> Adding another link to the workspace as it currently is: https://www.mediafire.com/file/oejpgk095szcugz/javierspm2.rar/file. I hope this isn't too basic of a question.
>>>>>
>>>>> Thanks in advance
>>>>>
>>>>>
>>>>> On 31-03-2021 11:44, Peter Stöhrmann wrote:
>>>>>> Dear Javier Arcos Hódar,
>>>>>>
>>>>>> The problem seems to be your "indata" matrix, which is 128x128x6x50.
>>>>>>
>>>>>> The function T2Map seems to expect "indata" to be 3-dimensional only,
>>>>>> though (T2Map.m: "indata is assumed to be dimensioned as X * Y * te -
>>>>>> i.e., same slice, multiple te")
>>>>>>
>>>>>> This causes unexpected behavior in line 72 (ydata = inData(r,c,:);),
>>>>>> resulting in ydata being a 1x1x300 matrix, which cannot be reshaped
>>>>>> into an N-by-1 vector where N=6.
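A NumPy analog of this mismatch (the intermediate shapes differ slightly from MATLAB's 1x1x300, but the element count is the same 6 * 50 = 300):

```python
import numpy as np

indata = np.ones((128, 128, 6, 50))  # 4-D volume like the 128x128x6x50 nii
ydata = indata[0, 0]                 # one voxel's trailing dims, like inData(r,c,:)
n_elements = ydata.size              # 6 * 50 = 300 elements, not 6
try:
    ydata.reshape(6, 1)              # same mismatch as MATLAB's reshape(ydata, N, 1)
    raised = False
except ValueError:
    raised = True                    # 300 elements cannot become a 6x1 vector
```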
>>>>>>
>>>>>> Hope this helped!
>>>>>>
>>>>>> Kind regards,
>>>>>>
>>>>>> Peter
>>>>>> On 30.03.2021 at 20:26, Javier Arcos Hódar wrote:
>>>>>>> Greetings SPM experts,
>>>>>>>
>>>>>>> I'm Javier Arcos Hódar, a research assistant based at the Alberto Sols Institute for Biomedical Research in Madrid. I've been tasked with investigating alternatives to our homemade Matlab software for generating parametric maps of MRI data, in case other software offers unnoticed advantages or some update ends up leaving us unable to use our own.
>>>>>>>
>>>>>>> One of the tasks I'm trying to make work within the SPM framework is making parametric T1, T2 and T2 star maps. I find myself in the strange situation where I manage to make the makeT2star script (here: https://github.com/npnl/T2-Maps) work, while T2Map, a function makeT2star needs in order to run, gives me an error when attempted in isolation.
>>>>>>>
>>>>>>> Attached at the end is a zip file with my 128x128x6x50 nii file (it's preclinical rodent data) and the mask I had to make in FSL, even though I'd like to learn to make masks within SPM too sooner rather than later. I'm not at all sure the last echo times add much information, to be honest, but I'd like to test this in detail before discussing with my group whether to change the way acquisition is done. In any case, it's just to note that from echo 36 onwards there are barely any 1 values in the mask, due to how low the voxel values get.
>>>>>>>
>>>>>>> In any case, my procedure is simply:
>>>>>>> spm_vol newspm_23_subscan_0.nii <- to get the header
>>>>>>> display = 1, maxT2 = 250, minimum = 10, tes = [12 24 36 48 60 72 84 96 108 120 132 144 156 168 180 192 204 216 228 240 252 264 276 288 300 312 324 336 348 360 372 384 396 408 420 432 444 456 468 480 492 504 516 528 540 552 564 576 588 600] for the script-specific variables
>>>>>>>
>>>>>>> and then
>>>>>>>
>>>>>>> indata = spm_read_vols (header, 'maskT2.nii')
>>>>>>> (correct me if I'm mistaken, but this should import the image as a matrix)
>>>>>>>
>>>>>>> To finally attempt
>>>>>>>
>>>>>>> T2Map (indata, tes, minimum, maxT2, display)
>>>>>>>
>>>>>>> which instead of working returns:
>>>>>>>
>>>>>>> Error using reshape
>>>>>>> To RESHAPE the number of elements must not change.
>>>>>>>
>>>>>>> Error in T2Map (line 73)
>>>>>>> ydata = reshape(ydata,N,1);
>>>>>>>
>>>>>>> I'm not sure whether something is wrong with my .nii image or mask, or whether I've misused the spm_read_vols function.
>>>>>>>
>>>>>>> If necessary one can download the data, the mask, the script and my matlab workspace when I get the error at https://www.mediafire.com/file/t2un8o1xz90wjnj/javierspm.rar/file
>>>>>>>
>>>>>>> I hope it isn't asking too much for a specific script that uses SPM functions instead of SPM proper, but maybe the error is something easy to recognize for those experienced with regular SPM use.
>>>>>>>
>>>>>>> Thanks in advance
>>>>>>>