-----Original Message-----
From: SPM (Statistical Parametric Mapping) <[log in to unmask]> On Behalf Of SPM automatic digest system
Sent: Tuesday, September 24, 2019 6:00 PM
To: [log in to unmask]
Subject: SPM Digest - 23 Sep 2019 to 24 Sep 2019 (#2019-259)
There are 5 messages totaling 2563 lines in this issue.
Topics of the day:
1. DPABISurf V1.2 Is Released
2. Effective connectivity analysis (DCM), second level analysis
3. Postdoc position in Computational Cognitive Neuroscience (Max Planck
Institute CBS, Leipzig, Germany)
4. PhD positions in Computational Cognitive Neuroscience (Max Planck
Institute CBS, Leipzig, Germany)
5. co-registration problem with HCP data (both 3T and 7T)
----------------------------------------------------------------------
Date: Tue, 24 Sep 2019 09:12:55 +0800
From: YAN Chao-Gan <[log in to unmask]>
Subject: DPABISurf V1.2 Is Released
Dear Colleagues,
We are pleased to announce the release of DPABISurf V1.2!
DPABISurf is a surface-based resting-state fMRI data analysis toolbox evolved from DPABI/DPARSF, and is as easy to use as DPABI/DPARSF. DPABISurf is based on fMRIPrep 1.5.0 (Esteban et al., 2018)(RRID:SCR_016216), FreeSurfer 6.0.1 (Dale et al., 1999)(RRID:SCR_001847), ANTs 2.2.0 (Avants et al., 2008)(RRID:SCR_004757), FSL 5.0.9 (Jenkinson et al., 2002)(RRID:SCR_002823), AFNI 20160207 (Cox, 1996)(RRID:SCR_005927), SPM12 (Ashburner, 2012)(RRID:SCR_007037), PALM alpha112 (Winkler et al., 2016), GNU Parallel (Tange, 2011), MATLAB (The MathWorks Inc., Natick, MA, US)(RRID:SCR_001622), Docker (https://docker.com)(RRID:SCR_016445), and DPABI V4.2 (Yan et al., 2016)(RRID:SCR_010501). DPABISurf provides a user-friendly graphical user interface (GUI) for pipeline surface-based preprocessing, statistical analysis and results viewing, and requires no programming/scripting skills from the user.
The DPABISurf pipeline first converts the user-specified data into BIDS format (Gorgolewski et al., 2016), and then calls the fMRIPrep 1.5.0 docker to preprocess the structural and functional MRI data, integrating FreeSurfer, ANTs, FSL and AFNI. With fMRIPrep, the data are processed into FreeSurfer fsaverage5 surface space and MNI volume space. DPABISurf further performs nuisance covariate regression (including ICA-AROMA) on the surface-based data (volume-based data are processed as well), and then calculates the commonly used R-fMRI metrics: amplitude of low frequency fluctuation (ALFF) (Zang et al., 2007), fractional ALFF (Zou et al., 2008), regional homogeneity (Zang et al., 2004), degree centrality (Zuo and Xing, 2014), and seed-based functional connectivity. DPABISurf also performs surface-based smoothing by calling FreeSurfer's mri_surf2surf command.
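For readers new to these metrics, a toy illustration of how ALFF and fALFF are defined (this is NOT DPABISurf's actual code, and the function name and parameters are made up for illustration): ALFF is the mean FFT amplitude of the time series within the low-frequency band (typically 0.01-0.08 Hz), and fALFF is the ratio of the amplitude in that band to the amplitude over the full frequency range (Zou et al., 2008).

```python
# Toy illustration of ALFF/fALFF on a synthetic time series.
# NOT DPABISurf's implementation; names and defaults are illustrative.
import numpy as np

def alff_falff(ts, tr, band=(0.01, 0.08)):
    """ALFF: mean FFT amplitude within the low-frequency band.
    fALFF: amplitude in the band divided by amplitude over all frequencies."""
    n = len(ts)
    freqs = np.fft.rfftfreq(n, d=tr)               # frequencies in Hz
    amp = np.abs(np.fft.rfft(ts - ts.mean())) / n  # amplitude spectrum
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    alff = amp[in_band].mean()
    falff = amp[in_band].sum() / amp[1:].sum()     # skip the DC term
    return alff, falff

rng = np.random.default_rng(0)
ts = rng.standard_normal(240)          # 240 volumes of synthetic data
alff, falff = alff_falff(ts, tr=2.0)   # TR = 2 s
print(alff, falff)
```

In real data these maps are computed per vertex (or voxel) and then standardized across the brain before group statistics.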
These processed metrics then enter surface-based statistical analyses within DPABISurf, which can perform surface-based permutation tests with TFCE by integrating PALM. Finally, the corrected results can be viewed with the convenient surface viewer DPABISurf_VIEW, which is derived from spm_mesh_render.m.
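As a rough sketch of the permutation idea behind PALM (a one-sample sign-flip test with a max-statistic null for FWE correction; TFCE and surface geometry are omitted, and all names here are illustrative, not PALM's API):

```python
# Toy sign-flip permutation test (one-sample), illustrating the idea
# behind PALM's permutation inference; NOT PALM itself.
import numpy as np

def signflip_pvals(data, n_perm=1000, seed=0):
    """data: (subjects, vertices). Returns FWE-corrected p-values from
    the distribution of the maximum |t| across sign flips."""
    rng = np.random.default_rng(seed)
    n_sub = data.shape[0]
    t_obs = data.mean(0) / (data.std(0, ddof=1) / np.sqrt(n_sub))
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        flips = rng.choice([-1.0, 1.0], size=(n_sub, 1))  # flip each subject
        d = data * flips
        t = d.mean(0) / (d.std(0, ddof=1) / np.sqrt(n_sub))
        max_null[i] = np.abs(t).max()                      # max over vertices
    return (1 + (max_null[:, None] >= np.abs(t_obs)).sum(0)) / (1 + n_perm)

rng = np.random.default_rng(1)
data = rng.standard_normal((20, 50))   # 20 subjects x 50 vertices
data[:, 0] += 1.5                      # one vertex with a true effect
p = signflip_pvals(data)
print(p[0])
```

PALM adds TFCE, spatial statistics on the mesh, and exchangeability handling on top of this basic scheme.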
DPABISurf is designed to make surface-based data analysis require minimal manual operation and almost no programming/scripting experience. We anticipate this open-source toolbox will assist novice and expert users alike, and will continue to support advancing R-fMRI methodology and its application to clinical translational studies.
DPABISurf is open-source and distributed under the GNU/GPL, available with DPABI at http://www.rfmri.org/dpabi. It supports Windows 10 Pro, macOS and Linux operating systems. You can run it with or without MATLAB.
1. With MATLAB.
1.1. Please go to http://www.rfmri.org/dpabi to download DPABI.
1.2. Add DPABI (with subfolders) to MATLAB's path settings.
1.3. Input 'dpabi' in the MATLAB command window, and then follow the instructions of the "Install" button in DPABISurf.
2. Without MATLAB.
2.1. Install Docker.
2.2. Terminal: docker pull cgyan/dpabi
2.3. Terminal: docker run -d --rm -v /My/FreeSurferLicense/Path/license.txt:/opt/freesurfer/license.txt -v /My/Data/Path:/data -p 5925:5925 cgyan/dpabi x11vnc -forever -shared -usepw -create -rfbport 5925
/My/FreeSurferLicense/Path/license.txt: Where you stored the FreeSurfer license obtained from https://surfer.nmr.mgh.harvard.edu/registration.html.
/My/Data/Path: This is where you stored your data. In Docker, the path is /data.
2.4. Open VNC Viewer and connect to localhost:5925; the password is 'dpabi'.
2.5. In the terminal within the VNC Viewer, input "bash", and then input:
/opt/DPABI/DPABI_StandAlone/run_DPABI_StandAlone.sh ${MCRPath}
Now please enjoy the StandAlone version of DPABISurf with GUI!
If you don't want to run with the GUI, you can also call the compiled version of DPABISurf_run. E.g.: docker run -it --rm -v /My/FreeSurferLicense/Path/license.txt:/opt/freesurfer/license.txt -v /My/Data/Path:/data cgyan/dpabi /bin/bash /opt/DPABI/DPABI_StandAlone/run_DPABISurf_run_StandAlone.sh ${MCRPath} /data/DPABISurf_Cfg.mat
New features of DPABISurf_V1.2_190919 within DPABI_V4.2_190919 (download at http://rfmri.org/dpabi; please also update the docker image by: docker pull cgyan/dpabi):
1. DPABISurf_V1.2_190919 updated.
1.1. A quality control module was added to DPABISurf. Users can now quality-control surface reconstruction, EPI-to-T1 registration and T1-to-MNI registration for all subjects in one HTML file each (based on fMRIPrep 1.5.0). For volume-based analysis, users can also generate a group mask for DPABISurf, and exclude subjects by thresholding coverage and head motion.
1.2. DPABISurf now also outputs sulcus depth and volume in fsaverage and fsaverage5 spaces for statistical analysis.
1.3. In the results organizer of DPABISurf, redundant files are no longer organized. In addition, the fMRIPrep and FreeSurfer files are backed up, excluding the T1 image, which may carry private information such as the face.
2. DPABI_VIEW now has a new function, "Surface View with DPABISurf_VIEW". This function converts files to the fsaverage surface using FreeSurfer's mri_vol2surf command, and then displays the results by calling DPABISurf_VIEW to generate a surface-based picture.
Tips:
1) For Linux or macOS, please start MATLAB from the terminal so that DPABI can reach docker (e.g., Linux: matlab; Mac: open /Applications/MATLAB_R2018a.app/).
2) Before running DPABISurf_Pipeline, you can test the docker environment by running DPABI->DPABISurf->Utilities->Volume-Surface Projector. If the file can be successfully projected to the surface, the software is set up correctly.
References:
- Ashburner, J. (2012). SPM: a history. *Neuroimage*, 62(2), 791-800,
doi:10.1016/j.neuroimage.2011.10.025.
- Avants, B.B., Epstein, C.L., Grossman, M., Gee, J.C. (2008). Symmetric
diffeomorphic image registration with cross-correlation: evaluating
automated labeling of elderly and neurodegenerative brain. *Med Image
Anal*, 12(1), 26-41, doi:10.1016/j.media.2007.06.004.
- Cox, R.W. (1996). AFNI: software for analysis and visualization of
functional magnetic resonance neuroimages. *Comput Biomed Res*, 29(3),
162-173.
- Dale, A.M., Fischl, B., Sereno, M.I. (1999). Cortical surface-based
analysis. I. Segmentation and surface reconstruction. *Neuroimage*,
9(2), 179-194, doi:10.1006/nimg.1998.0395.
- Esteban, O., Markiewicz, C.J., Blair, R.W., Moodie, C.A., Isik, A.I.,
Erramuzpe, A., Kent, J.D., Goncalves, M., DuPre, E., Snyder, M., Oya, H.,
Ghosh, S.S., Wright, J., Durnez, J., Poldrack, R.A., Gorgolewski, K.J.
(2018). fMRIPrep: a robust preprocessing pipeline for functional MRI. *Nat
Methods*, doi:10.1038/s41592-018-0235-4.
- Gorgolewski, K.J., Auer, T., Calhoun, V.D., Craddock, R.C., Das, S.,
Duff, E.P., Flandin, G., Ghosh, S.S., Glatard, T., Halchenko, Y.O.,
Handwerker, D.A., Hanke, M., Keator, D., Li, X., Michael, Z., Maumet, C.,
Nichols, B.N., Nichols, T.E., Pellman, J., Poline, J.B., Rokem, A.,
Schaefer, G., Sochat, V., Triplett, W., Turner, J.A., Varoquaux, G.,
Poldrack, R.A. (2016). The brain imaging data structure, a format for
organizing and describing outputs of neuroimaging experiments. *Sci Data*,
3, 160044, doi:10.1038/sdata.2016.44.
- Jenkinson, M., Bannister, P., Brady, M., Smith, S. (2002). Improved
optimization for the robust and accurate linear registration and motion
correction of brain images. *Neuroimage*, 17(2), 825-841.
- Tange, O. (2011). Gnu parallel-the command-line power tool. *The
USENIX Magazine*, 36(1), 42-47.
- Winkler, A.M., Ridgway, G.R., Douaud, G., Nichols, T.E., Smith, S.M.
(2016). Faster permutation inference in brain imaging. *Neuroimage*,
141, 502-516, doi:10.1016/j.neuroimage.2016.05.068.
- Yan, C.G., Wang, X.D., Zuo, X.N., Zang, Y.F. (2016). DPABI: Data
Processing & Analysis for (Resting-State) Brain Imaging.
*Neuroinformatics*, 14(3), 339-351, doi:10.1007/s12021-016-9299-4.
- Zang, Y., Jiang, T., Lu, Y., He, Y., Tian, L. (2004). Regional
homogeneity approach to fMRI data analysis. *Neuroimage*, 22(1),
394-400, doi:10.1016/j.neuroimage.2003.12.030.
- Zang, Y.F., He, Y., Zhu, C.Z., Cao, Q.J., Sui, M.Q., Liang, M., Tian,
L.X., Jiang, T.Z., Wang, Y.F. (2007). Altered baseline brain activity in
children with ADHD revealed by resting-state functional MRI. *Brain Dev*,
29(2), 83-91, doi:10.1016/j.braindev.2006.07.002.
- Zou, Q.-H., Zhu, C.-Z., Yang, Y., Zuo, X.-N., Long, X.-Y., Cao, Q.-J.,
Wang, Y.-F., Zang, Y.-F. (2008). An improved approach to detection of
amplitude of low-frequency fluctuation (ALFF) for resting-state fMRI:
Fractional ALFF. *Journal of Neuroscience Methods*, 172(1), 137-141,
doi:10.1016/j.jneumeth.2008.04.012.
- Zuo, X.-N., Xing, X.-X. (2014). Test-retest reliabilities of
resting-state FMRI measurements in human brain functional connectomics: A
systems neuroscience perspective. *Neuroscience & Biobehavioral Reviews*,
45, 100-118, doi:10.1016/j.neubiorev.2014.05.009.
Best,
Chao-Gan
--
Chao-Gan YAN, Ph.D.
Professor, Principal Investigator
Director, International Big-Data Center for Depression Research
Deputy Director, Magnetic Resonance Imaging Research Center
Institute of Psychology, Chinese Academy of Sciences
16 Lincui Road, Chaoyang District, Beijing 100101, China
-
Initiator of DPABI <http://rfmri.org/DPABI>, DPARSF <http://rfmri.org/DPARSF>, PRN <http://rfmri.org/PRN> and The R-fMRI Network <http://rfmri.org> (RFMRI.ORG)
http://rfmri.org/yan
http://scholar.google.com/citations?user=lJQ9B58AAAAJ
------------------------------
Date: Tue, 24 Sep 2019 12:29:24 +0100
From: Tali Weiss <[log in to unmask]>
Subject: Re: Effective connectivity analysis (DCM), second level analysis
Thank you for your quick answer!
Sorry, I was not clear regarding the parametric modulation. I would like to run DCM on a task (I use emotional films as events).
DCM: fit timeseries
I attach SPM.m. All my analyses are on the parametrically modulated regressor (I demean the weights). In the DCM I did:
- include film (the "unmodulated" regressor): NO
- include the regressor of parametric modulation: YES
Is this correct?
The question is whether region A affects region B during emotionally arousing films.
------------------------------
Date: Tue, 24 Sep 2019 15:52:00 +0200
From: Martin Hebart <[log in to unmask]>
Subject: Postdoc position in Computational Cognitive Neuroscience (Max Planck Institute CBS, Leipzig, Germany)
The independent research group Vision and Computational Cognition (led by Dr. Martin Hebart) at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, invites applications for a postdoctoral researcher.
*Our research group *seeks to understand how humans recognize and categorize visually-perceived objects. Towards this goal, we use a wide range of behavioral and neuroimaging methods, including visual psychophysics, online crowdsourcing, functional MRI and magnetoencephalography (MEG). The emphasis lies on identifying and characterizing reproducible and interpretable behavioral and brain activity patterns that serve as a basis for our understanding of mental and neural representations of objects.
*The Max Planck Institute* is equipped with excellent neuroimaging research facilities (several Siemens 3T and 7T MRI scanners, 306-channel MEG, EEG, virtual reality, eye-tracking) and offers dedicated support staff. Leipzig has been called “Germany’s new cultural hot spot“ (The Guardian) and is located just a little over an hour south of Berlin.
*The successful applicant* will have a strong interest in the analysis of large-scale neuroimaging datasets using encoding models and machine-learning methods. Candidates are expected to have considerable experience in the analysis of neuroimaging data, strong programming skills in Python or Matlab, and a solid understanding of inferential statistics.
Experience with machine-learning methods and related tools (e.g. Scikit-Learn, PyTorch/TensorFlow) and data/code sharing platforms (e.g. GitHub, Docker) is desirable. Candidates are expected to have a PhD in psychology, cognitive science, neuroscience, computer science, or a related field.
The expected starting date is flexible but no earlier than February 1st, 2020. The position is funded for three years, with the possibility of an extension for an additional year.
For the full job advertisement including more information about the group, qualifications, required documents and application procedures, please go to
http://www.cbs.mpg.de/vacancies (subject heading: *“PD 15/19”*). All applications received by *November 15th, 2019* will be considered.
In case of questions, please contact Martin Hebart at [log in to unmask]
------------------------------
Date: Tue, 24 Sep 2019 15:52:11 +0200
From: Martin Hebart <[log in to unmask]>
Subject: PhD positions in Computational Cognitive Neuroscience (Max Planck Institute CBS, Leipzig, Germany)
The independent research group Vision and Computational Cognition (led by Dr. Martin Hebart) at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, invites applications for PhD students.
*Our research group* seeks to understand how humans recognize and categorize visually-perceived objects. Towards this goal, we use a wide range of behavioral and neuroimaging methods, including visual psychophysics, online crowdsourcing, functional MRI and magnetoencephalography (MEG). The emphasis lies on identifying and characterizing reproducible and interpretable behavioral and brain activity patterns that serve as a basis for our understanding of mental and neural representations of objects.
*The Max Planck Institute* is equipped with excellent neuroimaging research facilities (several Siemens 3T and 7T MRI scanners, 306-channel MEG, EEG, virtual reality, eye-tracking) and offers dedicated support staff. Leipzig has been called “Germany’s new cultural hot spot“ (The Guardian) and is located just a little over an hour south of Berlin.
*The successful applicant* will have a passion for visual and cognitive neuroscience, good analytical and quantitative skills, and the ability to work both independently and as part of a team. Candidates are expected to have experience in the analysis of behavioral or neuroimaging data (e.g. fMRI, MEG, or EEG) and good programming skills in Python, Matlab, or R. Experience with machine-learning methods (e.g. Scikit-Learn, PyTorch, TensorFlow), data/code sharing platforms (e.g. GitHub, Docker), or basic web programming (HTML, JavaScript) is desirable. Scientific publications are not required; however, applicants are expected to demonstrate scientific writing skills.
The expected starting date is flexible but no earlier than February 1st, 2020. The position is funded for three years, with the possibility of an extension for an additional year.
For the full job advertisement including more information about the group, qualifications, required documents and application procedures, please go to
http://www.cbs.mpg.de/vacancies (subject heading: *“PhD 14/19”*). All applications received by *November 15th, 2019* will be considered.
In case of questions, please contact Martin Hebart at [log in to unmask] <[log in to unmask]>.
------------------------------
Date: Tue, 24 Sep 2019 23:01:35 +0300
From: Vadim Axel <[log in to unmask]>
Subject: Re: co-registration problem with HCP data (both 3T and 7T)
Thank you very much for the answer. I will try to use field maps.
I was surprised because I have never had a similar problem with my own data, and I would expect the HCP data to be of good quality.
On Mon, Sep 23, 2019 at 12:59 PM Ashburner, John <[log in to unmask]>
wrote:
> Coregistration only estimates the six parameters of a rigid-body
> transformation. Distortions in fMRI mean that this is often not enough
> to achieve good alignment. You could try using some form of field
> mapping to try to correct the distortions. For DWI, the
> blip-up/blip-down approach to fixing distortions works well, but this tends not to be used for fMRI.
>
> Best regards,
> -John
>
> ------------------------------
> *From:* SPM (Statistical Parametric Mapping) <[log in to unmask]> on
> behalf of Vadim Axel <[log in to unmask]>
> *Sent:* 21 September 2019 15:01
> *To:* [log in to unmask] <[log in to unmask]>
> *Subject:* [SPM] co-registration problem with HCP data (both 3T and
> 7T)
>
> Dear list,
>
> I tried to preprocess the raw HCP fMRI data in SPM. For some reason,
> the co-registration between T1 and EPI is imprecise (both 3T and 7T EPI data).
> I attach the screenshot (the EPI is displaced relative to T1 in a
> vertical axis). I tried several HCP subjects, but I still get a
> problem. I tried to reorient first one of the images to make two
> modalities closer, but the problem persists. I use SPM12 Coregister
> (Estimate & Reslice option). Any idea? Here is the data if someone would like to take a look:
> https://we.tl/t-hUo4IQ16rV
> This is subject 100610.
>
> Many thanks,
> Vadim
>
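John's point that coregistration estimates only six parameters can be made concrete: three translations and three rotations compose into a single 4x4 rigid-body matrix, which preserves distances and therefore cannot absorb the nonlinear EPI distortions that field mapping corrects. A minimal sketch (illustrative only; this is not SPM's spm_matrix, though the idea is the same):

```python
# Sketch of the six-parameter rigid-body transform used in coregistration:
# 3 translations + 3 rotations composed into a 4x4 homogeneous matrix.
# Illustrative; not SPM's spm_matrix.
import numpy as np

def rigid_body(tx, ty, tz, pitch, roll, yaw):
    """Return a 4x4 matrix from 3 translations (mm) and 3 rotations
    (radians) about the x, y and z axes."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(roll), np.sin(roll)
    cz, sz = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    M = np.eye(4)
    M[:3, :3] = Rz @ Ry @ Rx          # combined rotation
    M[:3, 3] = [tx, ty, tz]           # translation
    return M

M = rigid_body(2.0, -1.0, 0.5, 0.01, 0.02, 0.03)
# Rigid-body: the rotation part is orthonormal, so distances are preserved;
# voxel-wise stretching from B0 inhomogeneity cannot be modeled this way.
print(M)
```

This is why adding a field map (or, for DWI, blip-up/blip-down correction) is needed on top of coregistration when distortions are present.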
------------------------------
End of SPM Digest - 23 Sep 2019 to 24 Sep 2019 (#2019-259)
**********************************************************