Using fMRIPrep motion confounds in FEAT's Stats first level


I used fMRIPrep to preprocess my data and was planning on using FEAT for further analysis. I want to use the six head-motion parameters (trans_x/y/z and rot_x/y/z) as regressors, but I'm not sure how to include them in FEAT. Would I have six text files, one per regressor, and input each as a “Custom (1 entry per volume)” regressor along with my stimulus EV text file? Any help would be appreciated!



Hi Ash,

For this, you would need to create a .txt file (preferably tab-separated) per run per subject, containing all these regressors. In your fMRIPrep output directory you should have a *desc-confounds_timeseries.tsv file for each subject’s run. To generate the .txt file with the confounds you want, you would do something like this (in this case, Python):

import pandas as pd

# Load file
confounds_file = pd.read_csv('fmriprep-20.2.0/sub-01/func/sub-01_task-bart_run-1_desc-confounds_timeseries.tsv', sep='\t')

# Extract head motion confounds; the substring match also picks up
# their temporal derivatives and quadratic terms
motion_confounds = [x for x in confounds_file.columns if 'trans' in x or 'rot' in x]
extracted_confounds = confounds_file[motion_confounds]

# Save confounds as a .txt file; FEAT expects plain numbers, so drop the
# header and replace the n/a entries (first row of the derivative columns) with 0
extracted_confounds.fillna(0).to_csv('sub-01_run-1_FEATconfounds.txt', sep='\t', index=False, header=False)

In the first-level FEAT, in the “Stats” section, you’ll select “Add additional confound EVs”, click on the box saying “Select confound EVs text file(s)”, and select the confounds .txt file that was just generated. Once you’ve done that and gone through your “Full model setup”, you can click the “View design” button at the bottom to see the design matrix, which should include your confounds.

You would then need to repeat these steps for each subject run.
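To avoid doing that by hand for every run, the steps above can be wrapped in a small loop over all the confound files fMRIPrep produced. This is just a sketch: the function name `write_feat_confounds` and the `_FEATconfounds.txt` output naming are made up for illustration, and it assumes the standard fMRIPrep output layout shown above.

```python
from pathlib import Path

import pandas as pd


def write_feat_confounds(fmriprep_dir):
    """Convert every fMRIPrep confounds .tsv under fmriprep_dir into a
    header-less, motion-only .txt file that FEAT can read.
    (Hypothetical helper for illustration.)"""
    written = []
    pattern = 'sub-*/func/*_desc-confounds_timeseries.tsv'
    for tsv in sorted(Path(fmriprep_dir).glob(pattern)):
        confounds = pd.read_csv(tsv, sep='\t')
        # Substring match keeps the motion parameters plus their
        # derivatives and quadratic expansions
        motion = [c for c in confounds.columns if 'trans' in c or 'rot' in c]
        # Derivative columns start with n/a; FEAT needs numbers, so use 0
        out = confounds[motion].fillna(0)
        out_path = tsv.with_name(
            tsv.name.replace('_desc-confounds_timeseries.tsv',
                             '_FEATconfounds.txt'))
        out.to_csv(out_path, sep='\t', index=False, header=False)
        written.append(out_path)
    return written
```

Calling `write_feat_confounds('fmriprep-20.2.0')` would then write one FEAT-ready confounds file next to each run’s .tsv.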

Your reference to “Custom 1 entry per volume” pertains to the timing information of your EV(s), which is different from the confounds EVs that we just included in the model.
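To make the distinction concrete, a “Custom (1 entry per volume)” EV file is a single column with one value per TR describing your stimulus, rather than a multi-column confounds matrix. A minimal sketch (the file name and the number of volumes are made-up values for illustration):

```python
import numpy as np

n_volumes = 200  # assumed number of TRs in the run

# A "Custom (1 entry per volume)" EV: one value per volume, single column
ev = np.zeros(n_volumes)
ev[10:20] = 1.0  # e.g. a stimulation block spanning volumes 10-19
np.savetxt('sub-01_run-1_stimulusEV.txt', ev, fmt='%.4f')
```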

Hope this helps


I appreciate the detailed step-by-step. Yes, this helps and it makes sense, thank you Dan!