Performing a full GLM analysis with FSL on BOLD images preprocessed by fMRIPrep, without re-registering the data to MNI space

Hi everyone,
Has anyone tried analyzing data preprocessed by fMRIPrep (we used version 1.0.0-rc2) with FSL?
We need to run first-level, second-level, and group-level analyses, and we couldn’t get FSL’s FEAT to do so without registering the data to MNI space a second time (we used a linear 3-DOF registration to minimize the impact, but we would still like to find a way to avoid registering the data twice).
Using randomise is not a solution, because we still have to run the second-level analysis, for which FSL requires the registration, even when randomise is used for the group analysis (and we also want to be able to use FLAME, not just randomise).

Thank you very much in advance!
Rotem


To clarify, we’ve tried tricking FSL by running the first level as usual and then replacing the .mat files with an identity matrix and setting the reference brain to the original brain, but from what I understand this doesn’t always work. Of course we could pull the analysis apart into the 20 or so separate commands found in the log file and run them ourselves, but I’m afraid that would leave a lot of room for error. That said, this is the route we’ll likely end up taking unless anybody has any tips. I figured somebody must have done this before…

Hi, I haven’t done this with the output of fMRIPrep, but I have run into the same registration issue with the HCP data (which is already registered to MNI space). There, the recommended solution is to symlink an identity matrix and the standard brain into the reg directory:

ln -s $FSLDIR/etc/flirtsch/ident.mat $feat/reg/example_func2standard.mat
ln -s $FSLDIR/etc/flirtsch/ident.mat $feat/reg/standard2example_func.mat
ln -s $FSLDIR/data/standard/MNI152_T1_2mm.nii.gz $feat/reg/standard.nii.gz

https://wiki.humanconnectome.org/display/PublicData/Advice+for+FEAT+Analysis+of+HCP+task+fMRI+data
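
If you have many first-level runs, the same three links can be scripted. A minimal sketch (the function name and the `sub-*/run-*.feat` layout are assumptions, not from the HCP docs; adapt the glob to your own directory naming):

```shell
# Hypothetical helper: apply the identity-matrix trick to one
# first-level FEAT directory. -f makes it safe to re-run.
link_identity_reg() {
    feat="$1"
    mkdir -p "$feat/reg"
    ln -sf "$FSLDIR/etc/flirtsch/ident.mat" "$feat/reg/example_func2standard.mat"
    ln -sf "$FSLDIR/etc/flirtsch/ident.mat" "$feat/reg/standard2example_func.mat"
    ln -sf "$FSLDIR/data/standard/MNI152_T1_2mm.nii.gz" "$feat/reg/standard.nii.gz"
}

# Loop over all first-level directories (glob pattern is an assumption)
for d in sub-*/run-*.feat; do
    [ -d "$d" ] || continue   # skip if the glob matched nothing
    link_identity_reg "$d"
done
```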

This seems to work ok for us, and I suspect a similar solution could work with other datasets that are already registered.

Best,
David


Thank you very much!

Thanks David, super helpful! We needed to adapt your code slightly for fMRIPrep. Because fMRIPrep retains the voxel sizes of the original BOLD data, linking the 2 mm MNI152 template fails, and the last line you suggested has to be replaced by
ln -s $feat/mean_func.nii.gz $feat/reg/standard.nii.gz
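
Putting the two posts together, the full fMRIPrep variant would look something like this (a sketch; the function name is made up, and `$feat` should be an absolute path to your first-level FEAT directory so the links resolve):

```shell
# Sketch: identity transforms plus the run's own mean_func as the
# "standard" image, since the data are already in MNI space at the
# native voxel size. Call with an absolute path.
fmriprep_fake_reg() {
    feat="$1"
    mkdir -p "$feat/reg"
    ln -sf "$FSLDIR/etc/flirtsch/ident.mat" "$feat/reg/example_func2standard.mat"
    ln -sf "$FSLDIR/etc/flirtsch/ident.mat" "$feat/reg/standard2example_func.mat"
    ln -sf "$feat/mean_func.nii.gz" "$feat/reg/standard.nii.gz"
}
```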

Again, thanks!

Jeanette


Glad it helped, and good to know about fMRIPrep! I look forward to moving over to that and other BIDS Apps soon.

Best,
David

@Jeanette_Mumford made a video describing this: https://www.youtube.com/watch?time_continue=7&v=U3tG7JMEf7M


Hi Experts

When using FSL to run a GLM analysis on fMRIPrep-preprocessed data, what is the correct way to enter the motion parameters from confounds.tsv into FEAT?
For instance, if using the translation and rotation regressors, should we put each column into its own one-column text file and, for each motion-regressor EV, select Custom (1 entry per volume), set Convolution to None, and disable temporal derivatives?
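
In case it helps, extracting one such column into a one-value-per-volume text file can be done with awk. A sketch (the function and file names are made up; column names differ by fMRIPrep version, e.g. X/Y/Z/RotX/RotY/RotZ in older releases vs trans_x/trans_y/trans_z/rot_x/rot_y/rot_z in newer ones, so check your header row):

```shell
# Sketch: pull one named column out of an fMRIPrep confounds TSV into a
# one-column stream suitable for a FEAT "Custom (1 entry per volume)" EV.
# Exits non-zero if the column is not found in the header.
extract_confound() {   # usage: extract_confound confounds.tsv COLUMN
    awk -F'\t' -v col="$2" '
        NR == 1 { for (i = 1; i <= NF; i++) if ($i == col) c = i
                  if (!c) exit 1
                  next }
        { print $c }
    ' "$1"
}

# e.g. extract_confound sub-01_task-rest_bold_confounds.tsv X > motion_X.txt
```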

Thank you very much!

Chen-Chia

Hi All,

To follow up on the previous comment, I am also a little confused about how to incorporate the confound regressors produced by fMRIPrep into the first-level FEAT analysis. I assume the combination of confound regressors I ultimately include will need to go into one or more separate text files, but I am unsure what the format of those files should be. I would be very grateful for any guidance you can provide.

Many thanks in advance!
Monica

See Confounds from fmriprep: which one would you use for GLM?
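
On the format question: FEAT’s “Add additional confound EVs” option takes a single plain-text matrix with one row per volume and one whitespace-separated column per regressor, with no header. A hedged awk sketch for pulling a set of named columns out of the fMRIPrep TSV (the function name and example columns are illustrative; replacing “n/a” cells, such as the first row of FramewiseDisplacement, with 0 is one common choice, not the only one):

```shell
# Sketch: write several named confound columns into one headerless,
# whitespace-delimited matrix for FEAT's "Add additional confound EVs".
# Exits non-zero if any requested column is missing from the header.
select_confounds() {   # usage: select_confounds confounds.tsv col1,col2,...
    awk -F'\t' -v cols="$2" '
        NR == 1 {
            n = split(cols, want, ",")
            for (i = 1; i <= NF; i++) idx[$i] = i
            for (j = 1; j <= n; j++) if (!(want[j] in idx)) exit 1
            next
        }
        {
            for (j = 1; j <= n; j++) {
                v = $(idx[want[j]])
                if (v == "n/a") v = 0   # one common choice for undefined rows
                printf "%s%s", v, (j < n ? " " : "\n")
            }
        }
    ' "$1"
}

# e.g. select_confounds sub-01_task-rest_bold_confounds.tsv X,Y,Z,RotX,RotY,RotZ > confound_evs.txt
```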

Please check our preprint (https://doi.org/10.1101/694364) and the accompanying code, where you can see how to reuse some of the confounds: https://github.com/poldracklab/ds003-post-fMRIPrep-analysis (everything is done via Nipype, so expect a steep learning curve).