FSL FEAT: error in execution

Hi,

I have a question regarding this answer.

I was following the steps in FSL, but in the “Stats” tab, when I select “Add additional confounds EVs”, what type of .txt file should I add here (the scanner-to-T1w mode-image file, the T1w-to-scanner mode-image file, or a different file from the func folder of the fMRIPrep output) if I used fMRIPrep for preprocessing?

I would really appreciate it if you could provide me some advice regarding this question.

Thank you in advance for your help.

Best regards,

Rubina Chandnani

Hi @Rubina,

For this you’d want to load a .txt file that contains a subset of columns from each functional BOLD confounds file (e.g. motion parameters, the high-pass cosine columns that fMRIPrep provides, etc.). The confounds files are produced by fMRIPrep during preprocessing, but be sure not to upload ALL of the confounds; you’ll only want the .txt file to contain a fraction of them. Which ones you choose will depend on the type of analysis you’re planning to conduct.
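For instance, a small pandas sketch of turning one run’s confounds table into a FEAT-ready .txt file (the values below are stand-in data rather than a real run; in practice you would load the run’s desc-confounds_timeseries.tsv with pd.read_csv):

```python
import pandas as pd

# Stand-in for one run's fMRIPrep confounds table; in practice:
#   confounds = pd.read_csv("sub-XX_..._desc-confounds_timeseries.tsv", sep="\t")
confounds = pd.DataFrame({
    "trans_x": [0.01, 0.02], "trans_y": [0.0, 0.01], "trans_z": [0.0, 0.0],
    "rot_x": [0.001, 0.0], "rot_y": [0.0, 0.0], "rot_z": [0.0, 0.001],
    "cosine00": [0.1, -0.1],
    "global_signal": [100.0, 101.0],  # example of a column we do NOT keep
})

# Keep only a subset: the 6 motion parameters plus fMRIPrep's
# cosine (high-pass drift) regressors
keep = [c for c in confounds.columns
        if c in ("trans_x", "trans_y", "trans_z", "rot_x", "rot_y", "rot_z")
        or c.startswith("cosine")]

# FEAT expects a plain, header-less, whitespace-delimited text file
confounds[keep].fillna(0).to_csv("confounds_run-1.txt",
                                 sep="\t", header=False, index=False)
```

The header=False and index=False arguments matter: FEAT expects bare columns of numbers, one row per volume.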

Hi @dlevitas,

Thank you for your response.

I decided not to upload the .txt file to keep things simple; however, I have a question related to the first-level and higher-level analyses.

I ran a first-level and higher-level analysis following the steps in this link: Mumford Brain Stats — FEAT registration workaround since I preprocessed the data using fMRIPrep.

However, after running the higher-level analysis, the reg summaries were missing. I received a warning message that reads:

“WARNING:: Inconsistent orientations for individual images when attempting to merge.
Merge will use voxel-based orientation which is probably incorrect - PLEASE CHECK!”

I’ve also attached the log file describing the errors I’m receiving. Higher-level analysis FSL log report.txt (35.4 KB)

I would really appreciate your guidance and/or advice on how to fix these errors.

Many thanks and best regards,

Rubina Chandnani

Unfortunately, I’m not well-versed with FSL warning & error messages and I haven’t come across this specific message before. My suggestion would be to reach out to the FSL team on the FSL message board.

I would also recommend ensuring that the FEAT registration workaround was done correctly (the end of the video explains how you can check your work).

Hi Rubina,

I have also used the FEAT registration workaround. I did not encounter this issue at the higher-level analysis, and her steps worked for me. I second dlevitas in making sure the workaround was correctly executed after the first level.

Thank you for your response, @aftonnelson

I just read this post FSL scripting: error in execution of FSL FEAT - #8 by dlevitas and realised that I have to run the first-level analysis and feed the fMRIPrep data into the second-level analysis. Where do I save the .fsf file, since I have 3 runs?

Thank you in advance for your help.

Best,

Rubina

If you’re using the FSL FEAT graphical user interface, there is a “Save” button at the bottom that will create an .fsf file based on the information you’ve provided. I’d recommend not saving until you’ve finished entering information for the first-level analysis.

An issue with FEAT is automation: during the first level you must run FEAT on each run separately, and the .fsf files can be a bit complicated. You can check out this tutorial by Andy Jahn, which focuses on how to solve this issue. If instead you’d prefer an easier (but longer) solution, you can perform first-level FEAT on each subject’s run by hand.
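As a rough sketch of the templating idea behind that kind of automation (the file names and the one-line .fsf excerpt here are hypothetical; a real design file saved from the FEAT GUI has many more lines): save one run’s design, swap the run-specific strings, and write one design file per run.

```python
from pathlib import Path

# Hypothetical one-line excerpt of a saved FEAT design file; in practice
# you would read the full template with Path("design_run-1.fsf").read_text()
template = 'set feat_files(1) "/data/sub-01/func/sub-01_task-x_run-1_bold.nii.gz"\n'

# Generate a design file for each remaining run by string substitution
for run in ("run-2", "run-3"):
    Path(f"design_{run}.fsf").write_text(template.replace("run-1", run))
```

Each generated design can then be executed non-interactively (e.g. `feat design_run-2.fsf`), which avoids re-entering everything in the GUI per run.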

I performed the first-level analysis with FSL, creating 3 .fsf files, one for each run. After running the higher-level analysis, I checked, and the images in stats/cope#.nii.gz and reg_standard/stats/cope#.nii.gz are not the same.

I followed the steps in the “Mumford Brain Stats — FEAT registration workaround” and used the exact commands; however, I can’t see what I missed here.

Is there something I should’ve done that I’m not seeing?

When you pre-processed your data with fMRIPrep, which --output-spaces did you use, and did you specify --use-aroma by any chance?

MNI152NLin2009cAsym was used as the output space, and I didn’t specify --use-aroma.

I’m not entirely sure whether this will help, but you could try re-running fMRIPrep with --output-spaces MNI152NLin6Asym. The MNI152NLin6Asym output space is identical to FSL’s MNI152 standard space, whereas the one you used is slightly different. This may solve the issue you’re having when trying to analyze the data in FSL FEAT.
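For what it’s worth, the mismatch FSL warns about usually comes down to the image grid shape and the voxel-to-world affine. A minimal numpy sketch of such a check (the first affine is the standard 2 mm MNI152 sform; the second is a made-up stand-in, and in practice you would read real affines with nibabel via nib.load(path).affine):

```python
import numpy as np

# Standard FSL MNI152 2 mm voxel-to-world affine (91 x 109 x 91 grid)
mni152_affine = np.array([[-2., 0., 0.,   90.],
                          [ 0., 2., 0., -126.],
                          [ 0., 0., 2.,  -72.],
                          [ 0., 0., 0.,    1.]])

# Hypothetical affine of a cope image in a slightly different space
cope_affine = np.array([[-2., 0., 0.,   96.],
                        [ 0., 2., 0., -132.],
                        [ 0., 0., 2.,  -78.],
                        [ 0., 0., 0.,    1.]])

def same_space(aff_a, aff_b, shape_a, shape_b):
    """Images share a space when grid shape and affine both match."""
    return shape_a == shape_b and np.allclose(aff_a, aff_b)

print(same_space(mni152_affine, cope_affine, (91, 109, 91), (91, 109, 91)))
```

If this check comes back False for a cope image versus the template, FSL’s merge will fall back to the voxel-based orientation it warns about.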

Thank you for the suggestion, but I have already preprocessed the data using fMRIPrep. Is there no way around this issue?

There’s nothing preventing you from re-processing your data with fMRIPrep other than time constraints. To save time, you can reuse the same working directory from before (the -w flag). If you incorporated FreeSurfer in your fMRIPrep pipeline, you can reuse that output by specifying its path with --fs-subjects-dir.

Again, this suggestion is simply to ensure that your fMRIPrep functional output is in the same space as FSL’s MNI152 standard space, which may be the cause of your issues.

Okay, I will re-process the data with fMRIPrep.

I have two follow-up questions:

  1. Would this command be correct to help me save time?

fmriprep-docker -w /mnt/c/Users/rubin/BIDS /mnt/c/Users/rubin/derivatives participant --participant-label sub-C0120061 --fs-license-file /mnt/c/Users/rubin/fs-license/license.txt --low-mem --output-spaces MNI152NLin6Asym

  2. Once I have pre-processed the data with fMRIPrep, when I run FSL, do I have to change the standard space to MNI152NLin6Asym?

I really appreciate your help in pointing me in the right direction.

If that’s the same working directory path that you used before then you should be good.

Your fMRIPrep output will now be in MNI152NLin6Asym space; I don’t think you need to change anything from FSL FEAT’s perspective, other than making sure you’re feeding in the correct data.

After reading through dlevitas’ responses… I think our situations differ in that I did specify --use-aroma in fMRIPrep, which I guess is why I did not have this issue (although I cannot say for sure). In my case, when I specified ICA-AROMA, it created outputs in both the MNI152NLin6Asym and MNI152NLin2009cAsym spaces.

@aftonnelson, when you specify --use-aroma, fMRIPrep will automatically add MNI152NLin6Asym to --output-spaces even if it wasn’t explicitly specified. My hunch is that the error @Rubina is having occurred because she was feeding FEAT data that wasn’t in FSL’s MNI152 standard space.

Following up from the point you made,

  1. I re-processed the data to be in MNI152NLin6Asym space.
  2. I set up all the variables and saved them as an .fsf file.
  3. When I run the first-level analysis: in the ‘Data’ tab, I include the preprocessed BOLD file in MNI152NLin6Asym space from fMRIPrep; in ‘Pre-Stats’, I set motion correction to ‘None’ and turn on BET brain extraction and the high-pass filter; in the ‘Stats’ tab, I choose ‘Full model setup’, set up the EVs and contrasts, and hit ‘Go’.

Is there something else I should add and/or remove that I’m not doing?

These are fine; you can also turn off slice timing correction, as this is done by fMRIPrep.

On the “Registration” tab, be sure to follow the instructions from the FEAT workaround video posted earlier in this thread.

This seems correct. If you haven’t, you can also view the design matrix as a visual check to ensure that everything has been set up properly.

On the “Stats” tab, I would strongly recommend selecting “Add additional confounds EVs” and adding a text file where each column is a confound you would like regressed from the data. In your fMRIPrep output, each functional BOLD acquisition has a corresponding desc-confounds_timeseries.tsv file with many confounds to choose from; you’ll want to select a subset of these to include in your design matrix. In particular, since you’ve turned off motion correction in FEAT, you’ll want to include the motion confounds from the desc-confounds_timeseries.tsv files. For an example, see this post.
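For illustration, one possible way to pull just the motion columns out of a confounds table with pandas (stand-in data below; note that fMRIPrep’s derivative columns start with an n/a value that FEAT cannot ingest, hence the fillna):

```python
import pandas as pd

# Stand-in confounds table; in practice: pd.read_csv(tsv_path, sep="\t")
confounds = pd.DataFrame({
    "trans_x": [0.1, 0.2], "rot_z": [0.0, 0.1],
    "trans_x_derivative1": [float("nan"), 0.1],
    "framewise_displacement": [float("nan"), 0.05],
})

# One common choice: all translation/rotation columns (this regex also
# catches the derivative/power expansions fMRIPrep adds, if present)
motion = confounds.filter(regex="^(trans|rot)")

# The first row of derivative columns is NaN; replace it before export
motion = motion.fillna(0)
motion.to_csv("motion_confounds.txt", sep="\t", header=False, index=False)
```

The resulting motion_confounds.txt is the kind of file the “Add additional confounds EVs” button expects: one column per regressor, one row per volume, no header.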

@dlevitas, thank you so much for your constructive feedback.

I took a look at the post, but I’m not sure which regressors to add and what to modify in this specific code:

motion_confounds = [x for x in confounds_file.data if 'trans' in x or 'rot' in x]