First-level analysis via nistats

Hi everyone!

I’m sorry, this is a very general question, but I’m absolutely stuck and would appreciate any help.
I’m trying to run a first-level analysis using nistats, but I keep getting only zeros and NaNs in the R-squared map, the residuals, and the z-maps after computing contrasts.
Is there a mistake in my code that I can’t see?

    import numpy as np
    from nistats.first_level_model import FirstLevelModel
    from nistats.reporting import plot_design_matrix
    from nilearn import plotting

    glm = FirstLevelModel(
        t_r=t_r,
        hrf_model='glover',
        drift_model='cosine',
        #high_pass=0.01,
        high_pass=0.008,
        mask_img=fmri_mask,  # nifti image
        smoothing_fwhm=3.5,
        noise_model='ar1',
        standardize=False,
        minimize_memory=False)

    # fit the first-level model
    # events and confounds_clean are data frames
    glm = glm.fit(fmri_img, events=events, confounds=confounds_clean)

    design_matrix = glm.design_matrices_[0]
    #plot
    ax = plot_design_matrix(design_matrix.iloc[:,0:10])
    ax.get_images()[0].set_clim(0, 0.2)

    # R-squared: glm.r_square is a list; take the first element, because we only have one run
    r2_img = glm.r_square[0]
    plotting.plot_stat_map(r2_img, threshold=0.2)

    resids = glm.residuals[0]

    n_columns = len(design_matrix.columns)
    # 10 contrasts
    contrasts = {
        '0back-1back': np.pad([1, -1, 0, 0, 0], (0, n_columns - 5)),
        '0back-2back': np.pad([1, 0, -1, 0, 0], (0, n_columns - 5)),
        '0back-3back': np.pad([1, 0, 0, -1, 0], (0, n_columns - 5)),
        '0back-fix':   np.pad([1, 0, 0, 0, -1], (0, n_columns - 5)),
        '1back-2back': np.pad([0, 1, -1, 0, 0], (0, n_columns - 5)),
        '1back-4back': np.pad([0, 1, 0, -1, 0], (0, n_columns - 5)),
        '1back-fix':   np.pad([0, 1, 0, 0, -1], (0, n_columns - 5)),
        '2back-4back': np.pad([0, 0, 1, -1, 0], (0, n_columns - 5)),
        '2back-fix':   np.pad([0, 0, 1, 0, -1], (0, n_columns - 5)),
        '4back-fix':   np.pad([0, 0, 0, 1, -1], (0, n_columns - 5)),
    }
    for index, (contrast_id, contrast_val) in enumerate(contrasts.items()):
        print('Contrast %2i out of %i: %s' % (index + 1, len(contrasts), contrast_id))
        # estimate the contrasts
        # note that the model implicitly computes a fixed effect across the two sessions
        z_map = glm.compute_contrast(contrast_val, output_type='z_score')
        plotting.plot_stat_map(z_map)

I would appreciate any help or suggestions!

Ahoi hoi @alado,

thank you very much for your interesting post.
Would it be possible to share a bit more information about the
experimental design and input data, maybe also including some graphical
output (e.g., the design matrix)? You mentioned that R-squared, residuals, and z-maps are 0 or NaN after computing contrasts; however, your code appears to assess R-squared and residuals before computing contrasts. Could you elaborate? Sorry if I misunderstood something. Also: do you get any warnings (e.g., related to rank deficiency of the design matrix)? Did you check your contrasts via plot_contrast_matrix? Did you evaluate whether your adaptations of this example are appropriate?
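Besides plot_contrast_matrix, a minimal numeric sanity check on a padded contrast vector can catch shape or sign mistakes early (a sketch with a hypothetical `n_columns`, mirroring the `np.pad` calls in your snippet):

```python
import numpy as np

# Hypothetical design-matrix width: 5 conditions + confounds/drift terms.
n_columns = 12

# Same construction as in your snippet: pad the 5-element contrast
# with zeros so it matches the number of design-matrix columns.
contrast = np.pad([1, -1, 0, 0, 0], (0, n_columns - 5))

print(contrast.shape)  # should be (n_columns,)
print(contrast.sum())  # a balanced subtraction contrast sums to 0
```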
With that information, it’ll be easier to figure out what might go wrong and help you to solve the problem.

Cheers, Peer

@PeerHerholz, thanks a lot for your reply!
In the first figure, you can see the first 10 columns of the design matrix. Columns 1 to 5 are my trial types (from the n-back task); the rest are confounds.

Figure 2020-11-08 180745
In the second figure, the entire design matrix (a bit messy on the x-axis!):
Figure 2020-11-08 180943

These are the warnings I get when I try to fit the GLM with events and confounds.


R-squared and residuals contain zeros before computing contrasts as well.
I tried these examples:
https://nistats.github.io/auto_examples/02_first_level_models/plot_fiac_analysis.html#sphx-glr-auto-examples-02-first-level-models-plot-fiac-analysis-py
https://github.com/lukassnoek/nilearn-stats-tutorial/blob/master/nilearn_stats.ipynb

and they seem to work when I just run the examples, but I get the same problem with zeros when I use my own data. Indeed, I assess R-squared and residuals before computing contrasts, and they are already just matrices filled with zeros and NaNs. So at least for me, it seems that something goes wrong even before I compute contrasts.
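To put a number on "filled with zeros and NaNs", here is a quick diagnostic sketch on a toy array (with real data you would use the voxel array of the map, e.g. `r2_img.get_fdata()`):

```python
import numpy as np

# Toy stand-in for the voxel data of an R-squared map.
data = np.array([[0.0, np.nan],
                 [0.0, 0.0]])

n_nan = np.isnan(data).sum()
n_zero = (data == 0).sum()          # NaN compares False, so only true zeros count
frac_bad = (n_nan + n_zero) / data.size

print(n_nan, n_zero, frac_bad)      # 1 3 1.0 -> every voxel is zero or NaN
```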

Now I’m trying to use first_level_models_from_bids from this tutorial to see if it works better:
https://nistats.github.io/auto_examples/05_complete_examples/plot_bids_analysis.html#sphx-glr-auto-examples-05-complete-examples-plot-bids-analysis-py
but I can’t tell you whether it worked, because I ran into different problems there 🙂

Regarding the data: I have a BIDS-compliant dataset preprocessed with fMRIPrep.
I used the confounds.tsv from fMRIPrep (I replaced inf and NaN values with 0), an events.tsv with the columns ‘onset’, ‘duration’, ‘trial_type’, ‘RT’, and ‘accuracy’ (I also tried removing some columns from the data frame before fitting the model, but it didn’t change the result), the mask from fMRIPrep, and the fMRI data from fMRIPrep (in MNI space).
Here are the data: https://drive.google.com/drive/folders/11pO_mDZcNXLbbEtDK1tuZMMjH_mcsXLc?usp=sharing
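For reference, the inf/NaN replacement described above can be done in one pass with pandas (a sketch on a toy frame with made-up column names, not my actual confounds file):

```python
import numpy as np
import pandas as pd

# Toy confounds frame with the kinds of values such files can contain
# (NaN in the first row of derivative columns, inf from division).
confounds = pd.DataFrame({
    'trans_x': [0.1, 0.2, 0.15],
    'trans_x_derivative1': [np.nan, 0.1, -0.05],
    'framewise_displacement': [np.nan, np.inf, 0.3],
})

# Replace +/-inf with NaN, then fill every NaN with 0.
confounds_clean = confounds.replace([np.inf, -np.inf], np.nan).fillna(0)

print(confounds_clean.isna().sum().sum())         # 0 remaining NaNs
print(np.isfinite(confounds_clean.values).all())  # True
```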


Hi, as far as I can see, the design matrix has a huge number of columns, hence it is likely rank-deficient.
Can you share it in some way so that we can check it?
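One way to check this yourself is to compute the numerical rank of the design matrix (a sketch with a toy matrix; with your model you would pass `glm.design_matrices_[0].values`):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "design matrix": 100 scans, 5 independent columns...
X = rng.normal(size=(100, 5))
# ...plus a 6th column that is a linear combination of two others,
# which makes the matrix rank-deficient.
X = np.column_stack([X, X[:, 0] + X[:, 1]])

rank = np.linalg.matrix_rank(X)
print(rank, X.shape[1])   # rank (5) is smaller than the number of columns (6)
print(rank < X.shape[1])  # True -> rank-deficient
```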
Best,


Just to jump in –

If you’re using the fMRIPrep-generated confounds file, I’d strongly recommend against using it as-is. You’ll want to sub-select columns from the _desc-confounds_timeseries.tsv file, as discussed here.
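To illustrate the sub-selection, a minimal sketch on a toy frame (the column names below follow fMRIPrep's naming, but check your own file for what it actually contains):

```python
import numpy as np
import pandas as pd

# Toy stand-in for the fMRIPrep confounds table (real files have 200+ columns).
confounds = pd.DataFrame(
    np.random.default_rng(0).normal(size=(10, 5)),
    columns=['trans_x', 'rot_x', 'csf', 'white_matter', 'a_comp_cor_00'],
)

# Keep only a small, deliberate subset instead of passing everything as-is.
keep = ['trans_x', 'rot_x', 'csf', 'white_matter']
confounds_sub = confounds[keep]

print(confounds_sub.shape[1])  # 4 columns retained
```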

I’m not sure if this is what you meant re: removing columns, but it’s definitely something to check!

Hi!
Thanks a lot for your reply! I really appreciate that people want to help me!
I’ve uploaded my file with confounds and confounds_clean.csv here:
https://drive.google.com/drive/folders/11pO_mDZcNXLbbEtDK1tuZMMjH_mcsXLc?usp=sharing

Indeed, it has 267 columns. I’ll try with fewer columns and post my result here.

So it looks like you were right!
When I tried to fit the model with a smaller design matrix (96 confounds), it worked perfectly well!
Thanks a lot for your help and your time!