SPM Archives (SPM@JISCMAIL.AC.UK), September 2019

Subject: Re: SPM Digest - 23 Sep 2019 to 24 Sep 2019 (#2019-259)
From: BrainSymposium <[log in to unmask]>
Reply-To: BrainSymposium <[log in to unmask]>
Date: Wed, 25 Sep 2019 18:20:32 +0000
Content-Type: text/plain
Parts/Attachments: text/plain (1 lines)

Please remove this email from your list.

Thank you.



-----Original Message-----

From: SPM (Statistical Parametric Mapping) <[log in to unmask]> On Behalf Of SPM automatic digest system

Sent: Tuesday, September 24, 2019 6:00 PM

To: [log in to unmask]

Subject: SPM Digest - 23 Sep 2019 to 24 Sep 2019 (#2019-259)



There are 5 messages totaling 2563 lines in this issue.



Topics of the day:



  1. DPABISurf V1.2 Is Released

  2. Effective connectivity analysis (DCM), second level analysis

  3. Postdoc position in Computational Cognitive Neuroscience (Max Planck

     Institute CBS, Leipzig, Germany)

  4. PhD positions in Computational Cognitive Neuroscience (Max Planck

     Institute CBS, Leipzig, Germany)

  5. co-registration problem with HCP data (both 3T and 7T)



----------------------------------------------------------------------



Date:    Tue, 24 Sep 2019 09:12:55 +0800

From:    YAN Chao-Gan <[log in to unmask]>

Subject: DPABISurf V1.2 Is Released



Dear Colleagues,



We are pleased to announce the release of DPABISurf V1.2!



DPABISurf is a surface-based resting-state fMRI data analysis toolbox evolved from DPABI/DPARSF and is as easy to use as DPABI/DPARSF. DPABISurf is based on fMRIPrep 1.5.0 (Esteban et al., 2018) (RRID:SCR_016216), FreeSurfer 6.0.1 (Dale et al., 1999) (RRID:SCR_001847), ANTs 2.2.0 (Avants et al., 2008) (RRID:SCR_004757), FSL 5.0.9 (Jenkinson et al., 2002) (RRID:SCR_002823), AFNI 20160207 (Cox, 1996) (RRID:SCR_005927), SPM12 (Ashburner, 2012) (RRID:SCR_007037), PALM alpha112 (Winkler et al., 2016), GNU Parallel (Tange, 2011), MATLAB (The MathWorks Inc., Natick, MA, US) (RRID:SCR_001622), Docker (https://docker.com) (RRID:SCR_016445), and DPABI V4.2 (Yan et al., 2016) (RRID:SCR_010501). DPABISurf provides a user-friendly graphical user interface (GUI) for pipelined surface-based preprocessing, statistical analysis and results viewing, while requiring no programming/scripting skills from the user.



<http://www.rfmri.org/dpabi>



The DPABISurf pipeline first converts the user-specified data into BIDS format (Gorgolewski et al., 2016), and then calls the fMRIPrep 1.5.0 Docker image, which integrates FreeSurfer, ANTs, FSL and AFNI, to preprocess the structural and functional MRI data. With fMRIPrep, the data are processed into FreeSurfer fsaverage5 surface space and MNI volume space. DPABISurf further performs nuisance covariate regression (including ICA-AROMA) on the surface-based data (volume-based data are processed as well), and then calculates the commonly used R-fMRI metrics: amplitude of low-frequency fluctuation (ALFF) (Zang et al., 2007), fractional ALFF (Zou et al., 2008), regional homogeneity (Zang et al., 2004), degree centrality (Zuo and Xing, 2014), and seed-based functional connectivity. DPABISurf also performs surface-based smoothing by calling FreeSurfer's mri_surf2surf command.
These metrics then enter surface-based statistical analyses within DPABISurf, which can perform surface-based permutation tests with TFCE by integrating PALM. Finally, the corrected results can be viewed in the convenient surface viewer DPABISurf_VIEW, which is derived from spm_mesh_render.m.
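As an illustration of the ALFF and fALFF metrics listed above, the sketch below shows one way they can be computed in MATLAB for a single demeaned time series. This is only a minimal sketch with hypothetical variables, not DPABISurf code, and scaling and band conventions differ between implementations.

    % ALFF/fALFF sketch for one time series (illustration only, not DPABISurf code)
    TR   = 2;                           % repetition time in seconds (assumed)
    ts   = randn(200, 1);               % hypothetical demeaned time series (200 volumes)
    nVol = numel(ts);

    amp  = abs(fft(ts)) / nVol;         % amplitude spectrum
    freq = (0:nVol-1)' / (nVol * TR);   % frequency axis in Hz

    lowBand  = freq >= 0.01 & freq <= 0.08;      % conventional low-frequency band
    fullBand = freq > 0 & freq <= 1 / (2 * TR);  % up to the Nyquist frequency

    ALFF  = mean(amp(lowBand));                     % amplitude of low-frequency fluctuation
    fALFF = sum(amp(lowBand)) / sum(amp(fullBand)); % fractional ALFF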






DPABISurf is designed so that surface-based data analysis requires minimal manual operation and almost no programming/scripting experience. We anticipate that this open-source toolbox will assist novices and expert users alike, and will continue to support advances in R-fMRI methodology and its application to clinical and translational studies.



DPABISurf is open-source and distributed under the GNU/GPL, available with DPABI at http://www.rfmri.org/dpabi. It supports Windows 10 Pro, macOS and Linux. You can run it with or without MATLAB.



1. With MATLAB.
1.1. Go to http://www.rfmri.org/dpabi and download DPABI.
1.2. Add the DPABI folder, with subfolders, to MATLAB's path.
1.3. Type 'dpabi' at the MATLAB prompt, then follow the instructions behind the "Install" button in DPABISurf.
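A minimal MATLAB sketch of steps 1.2-1.3 (the install location below is hypothetical):

    % Add DPABI (with subfolders) to the MATLAB path and launch the GUI
    addpath(genpath('/home/user/DPABI'));   % hypothetical install location
    savepath;                               % optional: keep the path for future sessions
    dpabi;                                  % opens the DPABI launcher; then use DPABISurf -> Install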

2. Without MATLAB.
2.1. Install Docker.
2.2. In a terminal: docker pull cgyan/dpabi
2.3. In a terminal:

    docker run -d --rm \
      -v /My/FreeSurferLicense/Path/license.txt:/opt/freesurfer/license.txt \
      -v /My/Data/Path:/data \
      -p 5925:5925 \
      cgyan/dpabi x11vnc -forever -shared -usepw -create -rfbport 5925

/My/FreeSurferLicense/Path/license.txt: the path where you stored the FreeSurfer license obtained from https://surfer.nmr.mgh.harvard.edu/registration.html.
/My/Data/Path: the directory where your data are stored; inside the container it is mounted as /data.
2.4. Open VNC Viewer and connect to localhost:5925; the password is 'dpabi'.
2.5. In the terminal inside the VNC Viewer, type "bash", and then run:

    /opt/DPABI/DPABI_StandAlone/run_DPABI_StandAlone.sh ${MCRPath}





Now please enjoy the StandAlone version of DPABISurf with GUI!





If you don't want to run the GUI, you can also call the compiled version of DPABISurf_run, e.g.:

    docker run -it --rm \
      -v /My/FreeSurferLicense/Path/license.txt:/opt/freesurfer/license.txt \
      -v /My/Data/Path:/data \
      cgyan/dpabi /bin/bash /opt/DPABI/DPABI_StandAlone/run_DPABISurf_run_StandAlone.sh ${MCRPath} /data/DPABISurf_Cfg.mat

New features of DPABISurf_V1.2_190919 within DPABI_V4.2_190919 (download at http://rfmri.org/dpabi; please also update the Docker image with: docker pull cgyan/dpabi):

1. DPABISurf_V1.2_190919 updated.
1.1. A quality control module has been added to DPABISurf. Users can now check surface reconstruction, EPI-to-T1 registration and T1-to-MNI registration for all subjects in a single HTML file per step (based on fMRIPrep 1.5.0). For volume-based analysis, users can also generate a group mask for DPABISurf and exclude subjects by thresholding coverage and head motion.
1.2. DPABISurf now also outputs sulcal depth and volume in fsaverage and fsaverage5 space for statistical analysis.
1.3. The DPABISurf results organizer no longer copies redundant files. In addition, the fMRIPrep and FreeSurfer outputs are backed up, excluding the T1 image, which may contain identifying information such as the face.
2. DPABI_VIEW has a new function, "Surface View with DPABISurf_VIEW".
This function converts volume files to the fsaverage surface using FreeSurfer's mri_vol2surf command, and then displays the results by calling DPABISurf_VIEW to generate a surface-based picture.

Tips:

1) On Linux or macOS, please start MATLAB from a terminal so that DPABI can reach Docker (e.g., Linux: matlab; Mac: open /Applications/MATLAB_R2018a.app/).
2) Before running DPABISurf_Pipeline, you can test the Docker environment with DPABI -> DPABISurf -> Utilities -> Volume-Surface Projector. If a file can be successfully projected to the surface, the software is set up correctly.





References:



- Ashburner, J. (2012). SPM: a history. Neuroimage, 62(2), 791-800, doi:10.1016/j.neuroimage.2011.10.025.
- Avants, B.B., Epstein, C.L., Grossman, M., Gee, J.C. (2008). Symmetric diffeomorphic image registration with cross-correlation: evaluating automated labeling of elderly and neurodegenerative brain. Med Image Anal, 12(1), 26-41, doi:10.1016/j.media.2007.06.004.
- Cox, R.W. (1996). AFNI: software for analysis and visualization of functional magnetic resonance neuroimages. Comput Biomed Res, 29(3), 162-173.
- Dale, A.M., Fischl, B., Sereno, M.I. (1999). Cortical surface-based analysis. I. Segmentation and surface reconstruction. Neuroimage, 9(2), 179-194, doi:10.1006/nimg.1998.0395.
- Esteban, O., Markiewicz, C.J., Blair, R.W., Moodie, C.A., Isik, A.I., Erramuzpe, A., Kent, J.D., Goncalves, M., DuPre, E., Snyder, M., Oya, H., Ghosh, S.S., Wright, J., Durnez, J., Poldrack, R.A., Gorgolewski, K.J. (2018). fMRIPrep: a robust preprocessing pipeline for functional MRI. Nat Methods, doi:10.1038/s41592-018-0235-4.
- Gorgolewski, K.J., Auer, T., Calhoun, V.D., Craddock, R.C., Das, S., Duff, E.P., Flandin, G., Ghosh, S.S., Glatard, T., Halchenko, Y.O., Handwerker, D.A., Hanke, M., Keator, D., Li, X., Michael, Z., Maumet, C., Nichols, B.N., Nichols, T.E., Pellman, J., Poline, J.B., Rokem, A., Schaefer, G., Sochat, V., Triplett, W., Turner, J.A., Varoquaux, G., Poldrack, R.A. (2016). The brain imaging data structure, a format for organizing and describing outputs of neuroimaging experiments. Sci Data, 3, 160044, doi:10.1038/sdata.2016.44.
- Jenkinson, M., Bannister, P., Brady, M., Smith, S. (2002). Improved optimization for the robust and accurate linear registration and motion correction of brain images. Neuroimage, 17(2), 825-841.
- Tange, O. (2011). GNU Parallel: the command-line power tool. The USENIX Magazine, 36(1), 42-47.
- Winkler, A.M., Ridgway, G.R., Douaud, G., Nichols, T.E., Smith, S.M. (2016). Faster permutation inference in brain imaging. Neuroimage, 141, 502-516, doi:10.1016/j.neuroimage.2016.05.068.
- Yan, C.G., Wang, X.D., Zuo, X.N., Zang, Y.F. (2016). DPABI: Data Processing & Analysis for (Resting-State) Brain Imaging. Neuroinformatics, 14(3), 339-351, doi:10.1007/s12021-016-9299-4.
- Zang, Y., Jiang, T., Lu, Y., He, Y., Tian, L. (2004). Regional homogeneity approach to fMRI data analysis. Neuroimage, 22(1), 394-400, doi:10.1016/j.neuroimage.2003.12.030.
- Zang, Y.F., He, Y., Zhu, C.Z., Cao, Q.J., Sui, M.Q., Liang, M., Tian, L.X., Jiang, T.Z., Wang, Y.F. (2007). Altered baseline brain activity in children with ADHD revealed by resting-state functional MRI. Brain Dev, 29(2), 83-91, doi:10.1016/j.braindev.2006.07.002.
- Zou, Q.-H., Zhu, C.-Z., Yang, Y., Zuo, X.-N., Long, X.-Y., Cao, Q.-J., Wang, Y.-F., Zang, Y.-F. (2008). An improved approach to detection of amplitude of low-frequency fluctuation (ALFF) for resting-state fMRI: Fractional ALFF. Journal of Neuroscience Methods, 172(1), 137-141, doi:10.1016/j.jneumeth.2008.04.012.
- Zuo, X.-N., Xing, X.-X. (2014). Test-retest reliabilities of resting-state FMRI measurements in human brain functional connectomics: A systems neuroscience perspective. Neuroscience & Biobehavioral Reviews, 45, 100-118, doi:10.1016/j.neubiorev.2014.05.009.



Best,



Chao-Gan



--

Chao-Gan YAN, Ph.D.

Professor, Principal Investigator

Director, International Big-Data Center for Depression Research
Deputy Director, Magnetic Resonance Imaging Research Center
Institute of Psychology, Chinese Academy of Sciences

16 Lincui Road, Chaoyang District, Beijing 100101, China

-
Initiator of DPABI <http://rfmri.org/DPABI>, DPARSF <http://rfmri.org/DPARSF>, PRN <http://rfmri.org/PRN> and The R-fMRI Network <http://rfmri.org> (RFMRI.ORG)
http://rfmri.org/yan
http://scholar.google.com/citations?user=lJQ9B58AAAAJ



------------------------------



Date:    Tue, 24 Sep 2019 12:29:24 +0100

From:    Tali Weiss <[log in to unmask]>

Subject: Re: Effective connectivity analysis (DCM), second level analysis



Thank you for your quick answer!



Sorry, I was not clear regarding the parametric modulation. I would like to run DCM on a task (I use emotional films as events).

DCM: fit timeseries

I attach my SPM.mat. All my analyses are on the parametrically modulated regressor (I demean the weights). In the DCM input specification I set: include film (the unmodulated regressor): NO; include the parametric modulation regressor: YES.

Is this correct?
The question is whether region A affects region B during emotionally arousing films.
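As a terminology aid only, here is one common way the DCM connectivity matrices could be laid out for a question like this, assuming two regions (A, B) with the unmodulated film regressor as the driving input and the parametric arousal modulator as the modulatory input. This is a minimal sketch of the matrix conventions, not the poster's actual model and not an answer to the question.

    % Illustrative DCM A/B/C matrices for two regions (A = 1, B = 2)
    % Inputs: u1 = film onsets (unmodulated), u2 = parametric arousal modulation
    a = [1 0;          % fixed connections: row = target region, column = source region
         1 1];         % self-connections plus A -> B
    b = zeros(2, 2, 2);
    b(2, 1, 2) = 1;    % input 2 (arousal) modulates the A -> B connection
    c = [1 0;          % driving input: film onsets enter region A only
         0 0];
    disp(a), disp(b), disp(c)   % these matrices would go into the DCM specification in SPM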





------------------------------



Date:    Tue, 24 Sep 2019 15:52:00 +0200

From:    Martin Hebart <[log in to unmask]>

Subject: Postdoc position in Computational Cognitive Neuroscience (Max Planck Institute CBS, Leipzig, Germany)



The independent research group Vision and Computational Cognition (led by Dr. Martin Hebart) at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, invites applications for a postdoctoral researcher.



*Our research group *seeks to understand how humans recognize and categorize visually-perceived objects. Towards this goal, we use a wide range of behavioral and neuroimaging methods, including visual psychophysics, online crowdsourcing, functional MRI and magnetoencephalography (MEG). The emphasis lies on identifying and characterizing reproducible and interpretable behavioral and brain activity patterns that serve as a basis for our understanding of mental and neural representations of objects.



*The Max Planck Institute* is equipped with excellent neuroimaging research facilities (several Siemens 3T and 7T MRI scanners, 306-channel MEG, EEG, virtual reality, eye-tracking) and offers dedicated support staff. Leipzig has been called “Germany’s new cultural hot spot“ (The Guardian) and is located just a little over an hour south of Berlin.



*The successful applicant* will have a strong interest in the analysis of large-scale neuroimaging datasets using encoding models and machine-learning methods. Candidates are expected to have considerable experience in the analysis of neuroimaging data, strong programming skills in Python or Matlab, and a solid understanding of inferential statistics.

Experience with machine-learning methods and related tools (e.g. Scikit-Learn, PyTorch/TensorFlow) and with data/code sharing platforms (e.g. GitHub, Docker) is desirable. Candidates are expected to have a PhD in psychology, cognitive science, neuroscience, computer science, or a related field.



The expected starting date is flexible but no earlier than February 1st, 2020. The position is funded for three years, with the possibility of an extension for an additional year.



For the full job advertisement including more information about the group, qualifications, required documents and application procedures, please go to

http://www.cbs.mpg.de/vacancies (subject heading: *“PD 15/19”*). All applications received by *November 15th, 2019* will be considered.



In case of questions, please contact Martin Hebart at [log in to unmask]



------------------------------



Date:    Tue, 24 Sep 2019 15:52:11 +0200

From:    Martin Hebart <[log in to unmask]>

Subject: PhD positions in Computational Cognitive Neuroscience (Max Planck Institute CBS, Leipzig, Germany)



The independent research group Vision and Computational Cognition (led by Dr. Martin Hebart) at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, invites applications for PhD students.



*Our research group* seeks to understand how humans recognize and categorize visually-perceived objects. Towards this goal, we use a wide range of behavioral and neuroimaging methods, including visual psychophysics, online crowdsourcing, functional MRI and magnetoencephalography (MEG). The emphasis lies on identifying and characterizing reproducible and interpretable behavioral and brain activity patterns that serve as a basis for our understanding of mental and neural representations of objects.



*The Max Planck Institute* is equipped with excellent neuroimaging research facilities (several Siemens 3T and 7T MRI scanners, 306-channel MEG, EEG, virtual reality, eye-tracking) and offers dedicated support staff. Leipzig has been called “Germany’s new cultural hot spot“ (The Guardian) and is located just a little over an hour south of Berlin.



*The successful applicant* will have a passion for visual and cognitive neuroscience, good analytical and quantitative skills, and the ability to work both independently and as part of a team. Candidates are expected to have experience in the analysis of behavioral or neuroimaging data (e.g. fMRI, MEG, or EEG) and good programming skills in Python, Matlab, or R. Experience with machine-learning methods (e.g. Scikit-Learn, PyTorch, TensorFlow), data/code sharing platforms (e.g. GitHub, Docker), or basic web programming (HTML, JavaScript) is desirable. Scientific publications are not required; however, applicants are expected to demonstrate scientific writing skills.



The expected starting date is flexible but no earlier than February 1st, 2020. The position is funded for three years, with the possibility of an extension for an additional year.



For the full job advertisement including more information about the group, qualifications, required documents and application procedures, please go to

http://www.cbs.mpg.de/vacancies (subject heading: *“PhD 14/19”*). All applications received by *November 15th, 2019* will be considered.



In case of questions, please contact Martin Hebart at [log in to unmask] <[log in to unmask]>.



------------------------------



Date:    Tue, 24 Sep 2019 23:01:35 +0300

From:    Vadim Axel <[log in to unmask]>

Subject: Re: co-registration problem with HCP data (both 3T and 7T)



Thank you very much for the answer. I will try to use field maps.

I was surprised because I have never had a similar problem with my own data, and I would expect the HCP data to be of good quality.
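For reference, the SPM12 Coregister (Estimate & Reslice) step described in the original question below can also be scripted. The following is a minimal matlabbatch sketch, assuming SPM12 is on the MATLAB path; the file names are hypothetical.

    % Minimal sketch of SPM12 "Coregister: Estimate & Reslice" (hypothetical file names)
    spm('defaults', 'fmri');
    spm_jobman('initcfg');

    matlabbatch{1}.spm.spatial.coreg.estwrite.ref    = {'/data/sub-100610/T1w.nii,1'};      % fixed image (T1)
    matlabbatch{1}.spm.spatial.coreg.estwrite.source = {'/data/sub-100610/meanEPI.nii,1'};  % moved image (mean EPI)
    matlabbatch{1}.spm.spatial.coreg.estwrite.other  = {''};                                % other images to carry along
    matlabbatch{1}.spm.spatial.coreg.estwrite.eoptions.cost_fun = 'nmi';   % normalised mutual information
    matlabbatch{1}.spm.spatial.coreg.estwrite.eoptions.sep      = [4 2];
    matlabbatch{1}.spm.spatial.coreg.estwrite.eoptions.fwhm     = [7 7];
    matlabbatch{1}.spm.spatial.coreg.estwrite.roptions.interp   = 4;       % 4th-degree B-spline resampling

    spm_jobman('run', matlabbatch);

As John's answer below points out, this rigid-body step alone cannot correct EPI distortions, which is why fieldmap-based correction is suggested.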



On Mon, Sep 23, 2019 at 12:59 PM Ashburner, John <[log in to unmask]>

wrote:



> Coregistration only estimates the six parameters of a rigid-body 

> transformation. Distortions in fMRI mean that this is often not enough 

> to achieve good alignment.  You could try using some form of field 

> mapping to try to correct the distortions.  For DWI, the 

> blip-up/blip-down approach to fixing distortions works well, but this tends not to be used for fMRI.

>

> Best regards,

> -John

>

> ------------------------------

> *From:* SPM (Statistical Parametric Mapping) <[log in to unmask]> on 

> behalf of Vadim Axel <[log in to unmask]>

> *Sent:* 21 September 2019 15:01

> *To:* [log in to unmask] <[log in to unmask]>

> *Subject:* [SPM] co-registration problem with HCP data (both 3T and 

> 7T)

>

> Dear list,

>

> I tried to preprocess the raw HCP fMRI data in SPM. For some reason, 

> the co-registration between T1 and EPI is imprecise (both 3T and 7T EPI data).

> I attach the screenshot (the EPI is displaced relative to T1 in a 

> vertical axis). I tried several HCP subjects, but I still get a 

> problem. I tried to reorient first one of the images to make two 

> modalities closer, but the problem persists. I use SPM12 Coregister 

> (Estimate & Reslice option). Any idea?  Here is the data if someone would like to take a look:

> https://we.tl/t-hUo4IQ16rV

> This is subject 100610.

>

> Many thanks,

> Vadim

>



------------------------------



End of SPM Digest - 23 Sep 2019 to 24 Sep 2019 (#2019-259)

**********************************************************
