I have preprocessed my resting-state data using fMRIPrep. If I understand correctly, the BOLD images are co-registered to the MNI152NLin2009cAsym template. Now I want to extract some ROIs. Are there any atlases available in this space?
I have tried the AAL2 atlas, but the ROI masks and BOLD images do not match.
Does anybody have ideas on how to solve this problem?
The voxel size and FOV may not match (because fMRIPrep keeps the resolution of the input images), but the locations should overlap nicely when you open the data and the atlas in Mango or FSLeyes.
To extract time series from ROIs, please follow this nilearn tutorial: http://nilearn.github.io/auto_examples/03_connectivity/plot_signal_extraction.html#sphx-glr-auto-examples-03-connectivity-plot-signal-extraction-py
Thank you for your reply to my question.
I have calculated tSNR values for specific ROIs using the Harvard-Oxford atlas. I preprocessed my data with fMRIPrep first. Then I co-registered the skull-stripped T1w image to the space of the atlas, and then co-registered the functional data using the same transformation matrix. After this it is possible to calculate ROI-specific tSNR values with a simple function from nipype.
Could you maybe take a look at my code? Do you think this is a good way to do it?
Thank you very much!
Your task can be done in a much simpler way:
- Calculate tSNR directly on the _bold_space-MNI152NLin2009cAsym_preproc.nii.gz file created by fMRIPrep. It is already in the right space, so there is no need to perform any coregistration.
- Use the NiftiLabelsMasker class to extract average values per parcel from the tSNR map created in step 1.
The shape of my
_bold_space-MNI152NLin2009cAsym_preproc.nii.gz images from the fMRIPrep output is 97x115x75.
The shape of the Harvard-Oxford atlas images imported from nilearn datasets is 91x109x91
(also available at https://neurovault.org/images/1699/).
I have uploaded one example visual report and a "_task-rest_run-01_bold_space-MNI152NLin2009cAsym_preproc.nii.gz" of my own brain to:
Sorry, but I do not understand how they can be in the same space if their shapes are different?
Thank you for the link, now I understand how it works with MNI space.
Here are two screenshots:
first: tsnr_file with the Harvard-Oxford atlas resampled with nilearn,
as described here: http://nilearn.github.io/auto_examples/04_manipulating_images/plot_resample_to_template.html#sphx-glr-auto-examples-04-manipulating-images-plot-resample-to-template-py
source_img = HarvardOxford-cortl-maxprob-thr25-2mm.nii.gz
template = sub-01_task-rest_run-01_bold_space-MNI152NLin2009cAsym_preproc.nii.gz
second: re-registered tsnr_file (as described above, with FLIRT using the T1 registration matrix) with the unresampled Harvard-Oxford atlas
Do you have any idea why the resampled image looks so "pixelized"?
My best guess is that you have not set the interpolation to nearest when downsampling the atlas with nilearn.
Yes, that was the problem! Thank you!
@Dmitriy_Desser could you please specify which nipype version you used? I tried to run tSNR_ROI_cort.ipynb but got the following error:
190912-14:31:41,392 nipype.workflow ERROR:
Could not import plugin module: nipype.pipeline.plugins
Any help would be appreciated!
I used nipype version 1.1.9 the last time I ran this script. I once had a similar error message caused by graphviz. Could you maybe disable the line tsnr_wf.write_graph(graph2use='colored', format='svg', simple_form=True) and try again?
Hi @Dmitriy_Desser, thanks for your response!
I am using nipype version 1.2.2. Following your recommendation I installed graphviz and then got this error while running the last code line, tsnr_wf.run(): ImportError: Could not import plugin module: nipype.pipeline.plugins. I did some research and found it might be due to Python 3, so I created a new Python 2.7 environment and installed everything again. Now I am able to run tsnr_wf.write_graph and generate the graph, but I still get the same error message about the nipype.pipeline plugins on tsnr_wf.run(). Any ideas about what it could be? I ran all of this on Anaconda, Windows 10 Enterprise.
I then ran the same workflow on macOS (Anaconda, Python 2.7 environment) and got the following error: TypeError: can't pickle PyCapsule objects. And with Python 3, the same nipype error as described above.
Thanks in advance!
@Dmitriy_Desser also, while running apply_mask = Node(fsl.ApplyMask(), name="apply_mask") I got the following warning:
190913-14:10:23,642 nipype.interface WARNING:
FSLOUTPUTTYPE environment variable is not set. Setting FSLOUTPUTTYPE=NIFTI
Ah ok, I see. Well, you need to install FSL first and add it to the $PATH variable. FSL only works on Linux and macOS, so you can only run the script on your Mac…
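For completeness, a typical FSL environment setup on macOS/Linux looks like the following (the FSLDIR shown is a common default install location; adjust it to your own installation):

```shell
# Add to ~/.bash_profile or ~/.zshrc; adjust FSLDIR to your install path.
export FSLDIR=/usr/local/fsl
export PATH=${FSLDIR}/bin:${PATH}
export FSLOUTPUTTYPE=NIFTI_GZ   # also silences the FSLOUTPUTTYPE warning above
source ${FSLDIR}/etc/fslconf/fsl.sh
```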
@Dmitriy_Desser you're right! I managed to solve this problem by installing FSL on my Mac (sorry for the naive questions, it's my first time using Nipype!). Now I get the following error while trying to run the workflow:
TypeError: can't pickle _thread._local objects
As far as I could dig into this, it seems to be an error with deepcopy(). You can see a similar issue here: https://github.com/neuropycon/ephypype/issues/5. Any ideas on how I can solve this? Maybe change the workflow dict? Thanks so much!
Due to the limit of 3 replies for new users, I edited my previous post with my reply:
Hi @Dmitriy_Desser, yes my data is in a BIDS format and I am using the following FMRIPREP outputs (of course I changed their names according to BIDS):
- anatomical: sub-01_space-MNI152NLin2009cAsym_desc-preproc_T1w.nii.gz
- functional: sub-01_task-rest_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz
Here you can find the code: https://github.com/mariecpereira/Random/blob/master/tSNR_ROI_cort_mydata.ipynb
Thanks a lot!!
Not sure, but I think the script is not able to find the data. Do you have a valid BIDS dataset that has already been preprocessed with fMRIPrep?
Could you maybe post a screenshot of your data structure?
I just wanted to mention two developments regarding this issue:
fMRIPrep now allows you to generate derivatives exactly matching templates other than MNI152NLin2009cAsym, using the new --output-spaces argument. For instance, --output-spaces MNI152NLin6Asym:res-2 will give you preprocessed fMRI on the grid of the FSL MNI template (i.e., 2 mm isotropic resolution and a 91x109x91 matrix size).
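As a concrete (hypothetical) invocation, requesting both the FSL grid and the fMRIPrep default might look like this (all paths and the participant label are placeholders):

```shell
# Run fMRIPrep requesting outputs on the FSL MNI grid (2 mm) in addition
# to the default template; paths and label are placeholders.
fmriprep /data/bids /data/derivatives participant \
    --participant-label 01 \
    --output-spaces MNI152NLin6Asym:res-2 MNI152NLin2009cAsym
```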
TemplateFlow now includes the Harvard-Oxford atlas resampled to the default output space of fMRIPrep (MNI152NLin2009cAsym).
Cool, so the resampling step is no longer needed for this Jupyter notebook.
Great, it is good to know! Thanks!