nothing happens. How should the WatershedSkullStrip() API be used so that a segmented image is output?
Old question:
I have installed nilearn on my Ubuntu 16.04 LTS. I am trying to implement some basic preprocessing steps to analyse ABIDE datasets, currently with the DPARSF pipeline. I am wondering if nilearn has a way to segment the brain images into grey and white matter and compare the densities of these regions?
Hi @mgxd, thanks so much for replying.
I did run the command, but it gives me ‘command mri_watershed not found’. Does the FreeSurfer software need to be installed for this to work?
Yes, for the most part Nipype serves as a command line wrapper for these commands, but the underlying software that does the calculations will need to be installed on the system.
@oesteban @mgxd thanks for the suggestion. I am trying to analyse ABIDE datasets and I do not think it is available in the BIDS format. On my Ubuntu machine (16.04 LTS, 700 GB disk, 15 GB memory), I have already installed nilearn and nipype; FreeSurfer looks like it needs >4 GB of space. Eventually my goal is to do some machine learning on the dataset. I installed C-PAC, but I’ve found it hard to get it configured and set up on a standalone machine.
I think fMRIPrep needs Docker installed (OpenNeuro processes only BIDS-format datasets), but I am not sure whether my Ubuntu version supports Docker.
Could I do this installation on Google Cloud or AWS? I’d really appreciate suggestions on what combination of software would serve the above task well.
You can also pip install fmriprep, but this way you will need to make sure that all the dependencies are met and functional (AFNI, ANTs, FreeSurfer, FSL, C3D, etc.).
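For reference, the bare-metal route looks roughly like this (the dataset and output paths are placeholders, and the neuroimaging tools listed above still have to be installed separately):

```shell
# Install fMRIPrep itself into the current Python environment
pip install fmriprep

# Participant-level run on a BIDS dataset (hypothetical paths)
fmriprep /data/bids /data/derivatives participant --fs-license-file license.txt
```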
that was very helpful. thanks!
I ended up using DataLad and installed FSL for the segmentation. Using the FAST routine, is it possible to segment multiple images in parallel? The GUI seems to support just one file at a time, and each run takes 10–20 minutes. Is there a tool for parallel segmentation of anatomical images?
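One simple workaround, with no extra tooling, is to call the `fast` command-line program (rather than the GUI) over several files concurrently. A sketch, assuming FSL's `fast` is on the PATH and the file list is hypothetical:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def segment(image, cmd="fast"):
    """Run one segmentation command on one image; returns the exit code."""
    return subprocess.run([cmd, image]).returncode

def segment_all(images, workers=4, cmd="fast"):
    """Segment several anatomical images concurrently.

    Threads suffice here: each worker just waits on an external process,
    so there is no Python-level CPU contention.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda img: segment(img, cmd), images))

# e.g. segment_all(["sub-01_T1w.nii.gz", "sub-02_T1w.nii.gz"])  # hypothetical files
```

Nipype offers the same idea with provenance tracking via a `MapNode` around its FSL `FAST` interface, and pipelines like fMRIPrep and C-PAC parallelize this step internally.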
Thank you once again!
Just for complete disclosure – all the kudos for ABIDE in BIDS should go to the INDI folks, who released it in BIDS – we have just “exposed” it from their S3 (fcp-indi) bucket via DataLad. If you know of or have more interesting datasets in S3, let us know – it is quite easy to provide a git-annex repo for an S3 bucket (unless it needs to be partitioned further into sub-datasets, etc.).
Unfortunately, when I run the BIDS validator tool, the dataset does not appear to be BIDS-compliant, mainly because the TR is missing. I can see that there is a resources file with the TR for the different acquisitions, and it also seems to be in the NIfTI header, so I am puzzled.