For CIFTI reading and writing you won't need that.
Peace,
Matt.
On 1/27/16, 4:24 PM, "FSL - FMRIB's Software Library on behalf of Michael
F.W. Dreyfuss" <[log in to unmask] on behalf of [log in to unmask]>
wrote:
>Thank you. Lastly, I notice that the gifti README suggests:
>
>"Note that the handling of gzipped data requires either Java (dzip) or a
>MEX file (miniz), so don't start MATLAB with the -nojvm option unless you
>compiled the zstream.c MEX file"
>
>Yet the settings.sh script for fix sets
>FSL_FIX_MLOPTS="-nojvm -nodisplay -nodesktop -nosplash"
>
>which is then used as the set of options MATLAB is run with for fix.
>
>So should I change that to not have the -nojvm option? I'm not sure if I
>have this MEX file (or what it is, or where it would be, or how to find
>out).
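For reference, one way to check for that MEX file is to look for a compiled zstream binary inside the GIFTI toolbox. This is only a sketch: the `@gifti/private` location is an assumption about the toolbox layout, so adjust `GIFTI_DIR` to wherever your copy actually lives.

```shell
# Sketch: check whether the GIFTI toolbox has a compiled zstream MEX file.
# The @gifti/private subdirectory is an assumption about the toolbox layout.
GIFTI_DIR=/home/caseylab_curate/software/gifti-1.5
if ls "${GIFTI_DIR}"/@gifti/private/zstream.mex* >/dev/null 2>&1; then
    echo "zstream MEX found; running MATLAB with -nojvm should be safe"
else
    echo "no zstream MEX found; keep Java enabled (drop -nojvm) or compile it"
fi
```

If no MEX file turns up, dropping `-nojvm` from FSL_FIX_MLOPTS is the simpler fix, since compiling the MEX file requires a working mex setup in MATLAB.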
>
>Thanks,
>Michael
>________________________________________
>From: FSL - FMRIB's Software Library <[log in to unmask]> on behalf of
>Michael F.W. Dreyfuss <[log in to unmask]>
>Sent: Wednesday, January 27, 2016 4:22 PM
>To: [log in to unmask]
>Subject: Re: [FSL] hcp_fix
>
>Thanks, I do have that installed, but we changed some things on our
>server, so it wasn't being found anymore. To add it to the matlab path
>for fix to run, can I simply add the path of gifti to this line in
>hcp_fix:
>
>ML_PATHS="addpath('${FSL_FIXDIR}');addpath('${FSL_MATLAB_PATH}');addpath('/home/caseylab_curate/software/gifti-1.5');"
>
>Or better: export GIFTI_PATH=/home/caseylab_curate/software/gifti-1.5 in
>my own environment, and then
>
>if [ -z "${GIFTI_PATH}" ]; then
> echo "GIFTI path not set for MATLAB to read CIFTI files!" >&2
> usage
>fi
>
>ML_PATHS="addpath('${FSL_FIXDIR}');addpath('${FSL_MATLAB_PATH}');addpath('${GIFTI_PATH}');"
>
>so that it's consistent with the rest of the script. How does that sound?
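Concretely, the environment-variable version might look like this. It's a sketch: GIFTI_PATH is the proposed variable name (not something fix itself defines), and exit 1 stands in for the script's usage function.

```shell
# Sketch of the environment-variable approach. GIFTI_PATH is the proposed
# variable name, not something fix itself defines; export it in the shell
# startup file (e.g. ~/.bashrc) rather than appending it to $PATH.
export GIFTI_PATH=/home/caseylab_curate/software/gifti-1.5

# In hcp_fix, fail early if the variable is unset (exit 1 stands in for
# the script's own usage function):
if [ -z "${GIFTI_PATH}" ]; then
    echo "GIFTI path not set for MATLAB to read CIFTI files!" >&2
    exit 1
fi
ML_PATHS="addpath('${FSL_FIXDIR}');addpath('${FSL_MATLAB_PATH}');addpath('${GIFTI_PATH}');"
```

Note that an environment variable has to be exported (not added to $PATH, which is only searched for executables) for hcp_fix to see it.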
>
>Also, I noticed that gifti is now at version 1.6. Is it worth updating
>from 1.5?
>
>Thanks a lot,
>Michael
>
>
>________________________________________
>From: FSL - FMRIB's Software Library <[log in to unmask]> on behalf of
>Matt Glasser <[log in to unmask]>
>Sent: Wednesday, January 27, 2016 12:52 PM
>To: [log in to unmask]
>Subject: Re: [FSL] hcp_fix
>
>Hi Michael,
>
>Do you have the CIFTI reader/writer matlab scripts installed? You can get
>them from here:
>
>https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ
>
>Peace,
>
>Matt.
>
>On 1/27/16, 9:39 AM, "FSL - FMRIB's Software Library on behalf of Michael
>F.W. Dreyfuss" <[log in to unmask] on behalf of [log in to unmask]>
>wrote:
>
>>Hi,
>>
>>I've been trying to set up hcp_fix, but have not been getting the proper
>>output. In running hcp_fix, I get the output volume file of
>>${run}_hp2000.nii.gz, and the ${run}_hp2000.ica directory is produced,
>>but I am not getting the files ${run}_Atlas_hp2000_clean.dtseries.nii or
>>${run}_hp2000_clean.nii.gz to use in analysis. Everything seems to be
>>running fine, and I cannot find where in hcp_fix or in fix itself these
>>files are produced, only where they are renamed and moved. Do you have
>>any suggestions about what may be going on here and how I can get the
>>output I need? That would be very helpful.
>>
>>For reference, this is the output printed to the terminal:
>>
>>
>>[mid2018@node142 Results]$ $FSL_FIXDIR/hcp_fix $run/${run}.nii.gz 2000
>>processing FMRI file TGNG3 with highpass 2000
>>running highpass
>>running MELODIC
>>running FIX
>>FIX Feature extraction for Melodic output directory: TGNG3_hp2000.ica
>> create edge masks
>> run FAST
>> registration of standard space masks
>> extract features
>>FIX Classifying components in Melodic directory: TGNG3_hp2000.ica using
>>training file:
>>/home/caseylab_curate/software/fix1.06/training_files/HCP_hp2000.RData
>>and threshold 10
>>FIX Applying cleanup using cleanup file:
>>TGNG3_hp2000.ica/fix4melview_HCP_hp2000_thr10.txt and motion cleanup set
>>to 1
>>
>>
>>Thank you very much,
>>Michael