I found the issue. The fslinfo wrapper script occasionally includes the
"filename" field of the NIfTI header in its output, because the egrep
pattern 'l.*m[ai][xn]' also matches the filename string whenever the
NIfTI file name happens to contain a matching substring.
The resulting unpredictable number of fields breaks the parsing done by
fix_1a_extract_features.m in the feature-extraction step. Piping the
output through an extra grep -v "filename" in the fslinfo wrapper
script appears to fix the issue.
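For illustration, here is a minimal sketch of the problem and the fix. The fslinfo output below is fabricated (the field names are real fslinfo fields, but the file path is a made-up example chosen so that its name matches the pattern):

```shell
# Simulated fslinfo output: cal_max and cal_min are the fields the
# pattern is meant to select, but the "filename" line also matches
# 'l.*m[ai][xn]' because of the "l...max" substring in the path.
printf 'cal_max 255.0\ncal_min 0.0\nfilename /data/sub01_lomax.nii.gz\n' \
  | egrep 'l.*m[ai][xn]'
# -> all three lines pass the filter

# Proposed fix: explicitly exclude the filename field.
printf 'cal_max 255.0\ncal_min 0.0\nfilename /data/sub01_lomax.nii.gz\n' \
  | egrep 'l.*m[ai][xn]' | grep -v "filename"
# -> only the cal_max and cal_min lines remain
```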
Sourena
On 5/15/17, Sourena Soheili <[log in to unmask]> wrote:
> Hi,
>
> I am trying to train a FIX classifier on my functional data. The R
> code halts with the following error at the SVM stage (I'm on kernlab
> version 0.9.25):
>
>> lin.svm.weights <- calcLinSVMWeights(hcp.data)
> Setting default kernel parameters
> line search fails 1.502636e-05 -2.359475 -0.0003564997 1.228997e-09
> 1.366259e-14 -4.561056e-11 -4.926764e-18Warning message:
> In .local(x, ...) : Variable(s) `' constant. Cannot scale data.
>
>> thr.svm <- quantile(w.svm, q.thr)
> Error in quantile.default(w.svm, q.thr) :
> missing values and NaN's not allowed if 'na.rm' is FALSE
> Calls: quantile -> quantile.default
> Execution halted
>
> Thanks,
> Sourena
>