Oh, so you don't need to shut down X? i.e., if I'm already logged into a graphical session, I don't have to either run init 3 or press Ctrl+Alt+Backspace? That's very handy.

Thanks,
Chris
________________________________________
From: FSL - FMRIB's Software Library [[log in to unmask]] on behalf of Moises Hernandez Fernandez [[log in to unmask]]
Sent: Friday, October 25, 2013 5:36 AM
To: [log in to unmask]
Subject: Re: [FSL] GPU bedpostx

Hi Chris,

You can do it like that, or switch directly to the first text terminal with <Ctrl><Alt><F1>.
But with the second option, you should not switch back to your desktop session while bedpostx is running. If you do, the GPU memory will mix data from the screen refresh and the CUDA process.

The timeout for updating the screen (when you are running a graphical process, such as a desktop) is 5 seconds, and bedpostx takes longer than 5 seconds.
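The workflow described above can be sketched as a short shell session. This is illustrative only (it assumes a SysV-style init as on CentOS 6, and the subject path is a placeholder); the disruptive commands are guarded behind an environment variable because they kill the X session:

```shell
# Illustrative workflow for freeing the GPU from X before a long CUDA job.
# Guarded so the disruptive commands only run when explicitly requested;
# /path/to/subjectdir is a placeholder, not a real path from this thread.
if [ "${RUN_FOR_REAL:-no}" = "yes" ]; then
    sudo init 3                       # leave the graphical runlevel; X stops
    bedpostx_gpu /path/to/subjectdir  # long CUDA job on the now-free GPU
    sudo init 5                       # restore the graphical runlevel
    msg="switched runlevels"
else
    msg="dry run: set RUN_FOR_REAL=yes to actually switch runlevels"
fi
echo "$msg"
```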

Cheers,
Moises.


On 24 October 2013 19:23, Chris Watson <[log in to unmask]> wrote:
Hi Moises,
So does this mean that (assuming only 1 GPU) you have to change to runlevel 3 (or 2), and only then can you use bedpostx_gpu?

Chris

On 10/24/2013 07:31 AM, Moises Hernandez Fernandez wrote:
Hi,

bedpostx_gpu needs a GPU that is free of any graphics task. This is because a GPU driving a display has to refresh the screen, so you cannot use that GPU for CUDA tasks that take a long time.

In Linux you can run bedpostx_gpu in command-line mode (then the GPU is free of any graphics task), or, if you have 2 GPUs, you can use one for graphics (it does not need to be NVIDIA) and the other for bedpostx_gpu (this one needs to be NVIDIA).
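For the two-GPU case, one common way to keep the CUDA job off the display GPU is the standard CUDA_VISIBLE_DEVICES environment variable. This is a sketch, not from the thread: the device index 0 is an assumption (list yours with `nvidia-smi -L`), and the bedpostx_gpu call is left commented out:

```shell
# Sketch: restrict the CUDA runtime to the NVIDIA compute card so the job
# cannot land on the GPU driving the display. Index 0 is an assumption.
export CUDA_VISIBLE_DEVICES=0     # CUDA sees only device 0
# bedpostx_gpu /path/to/subjectdir
echo "CUDA restricted to device $CUDA_VISIBLE_DEVICES"
```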

We have never tried it in VirtualBox on Windows, but I suggest installing Linux on that machine and keeping both operating systems (dual boot).

In OSX we found a problem. New Apple laptops normally come with 2 GPUs: a non-NVIDIA GPU and an NVIDIA GPU. When you use the non-NVIDIA GPU for graphics, the NVIDIA GPU is not accessible, so you cannot use it. You need to select the NVIDIA GPU, but then you cannot run long processes such as bedpostx_gpu. So at the moment we do not support an OSX version; I am looking for a solution.

Cheers,
Moises.


On 24 October 2013 10:02, Saad Jbabdi <[log in to unmask]> wrote:
Hi Andreas - not sure, you need to ask Moises.

Cheers
Saad.


On 24 Oct 2013, at 09:29, Andreas Bartsch <[log in to unmask]> wrote:

Hi Saad,

bedpostx_gpu is not available for OSX, right?
Cheers,
Andreas

From: Saad Jbabdi <[log in to unmask]>
Reply to: FSL - FMRIB's Software Library <[log in to unmask]>
Date: Wednesday, 23 October 2013 21:37
To: <[log in to unmask]>
Subject: Re: [FSL] GPU bedpostx

Hi Amir

Have you looked at the instructions here: http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FslInstallation#Running_bedpostX_on_a_GPU_or_GPU_cluster
?

Cheers
Saad.

On 23 Oct 2013, at 20:23, Amir Zolal <[log in to unmask]> wrote:

Dear all,

I am trying to configure my computer to use bedpostx in GPU mode. I am running a laptop with an NVIDIA GeForce 630M, which supports CUDA and has compute capability over 2. I installed CentOS 6, succeeded in configuring Bumblebee to support Optimus graphics-card switching, and installed the CUDA toolkit from the NVIDIA website.

How do I test whether bedpostx actually uses the GPU for the computation?

The output in the terminal after around 15 minutes is:
subjectdir is /home/zolal/DTItest/101915/T1w/Diffusion
Making bedpostx directory structure
Queuing preprocessing stages
Queuing parallel processing stage
0 slices processed
6 slices processed

At that rate, the 145 slices would take around 6 hours to process (which is not what I was expecting from a GPU). top shows 100% CPU use by xfibres, which is odd because I have a 4-core CPU, and when I ran bedpostx in VirtualBox (Linux guest in a Windows host) before installing CentOS, I saw 25% use. Also, the computer runs very smoothly with no lag at all, so the 100% CPU use is strange.

Update: 9 slices processed after 20 minutes.

OK, so finally the question, stated clearly: how do I know whether bedpostx uses the GPU for the calculation? What is the expected output?
What are the required environment variables?
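One way to check this (not stated in the thread; it assumes the NVIDIA driver and its nvidia-smi tool are installed) is to watch the GPU's utilisation and process list while the job runs. A CPU-only run shows xfibres pinning the CPU with no corresponding GPU process, while a GPU run should show a CUDA process in nvidia-smi's list:

```shell
# Rough check: nvidia-smi reports per-GPU utilisation and the processes
# using each GPU. While a GPU run is active, a CUDA process should appear
# in that list. Guarded so the script also works where the tool is absent.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi || true        # inspect utilisation and the process list
    status="driver tools present"
else
    status="nvidia-smi not found (NVIDIA driver not installed?)"
fi
echo "$status"
```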

I am sorry, I couldn't find it on the FSL wiki.

Thanks,

Amir Zolal

--
Saad Jbabdi
University of Oxford, FMRIB Centre

JR Hospital, Headington, OX3 9DU, UK
(+44)1865-222466 (fax 717)
www.ndcn.ox.ac.uk/team/researchers/saad-jbabdi

