I was fitting a simple second-level model and computing a contrast to run a one-sample t-test, following the tutorials. Instead of directly passing in a list of first-level model objects, I passed in a pandas DataFrame with three columns, subject_label, map_name, and effects_map_path, following the documentation here.
Here’s my code (contrast_df is the pandas dataframe I passed to the model):
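For reference, the DataFrame looks roughly like this (subject labels and file paths below are placeholders, not my real data):

```python
import pandas as pd

# Placeholder subject IDs and effect-map paths, just to show the shape.
contrast_df = pd.DataFrame({
    "subject_label": ["sub-01", "sub-02", "sub-03"],
    "map_name": ["effects_of_interest"] * 3,
    "effects_map_path": [
        "sub-01_effects.nii.gz",
        "sub-02_effects.nii.gz",
        "sub-03_effects.nii.gz",
    ],
})

# This DataFrame is then passed as second_level_input, e.g.:
# model = SecondLevelModel().fit(contrast_df)
print(list(contrast_df.columns))
```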
The fit command runs, but the compute_contrast command throws an error:
line 467, in compute_contrast
File "/miniconda3/envs/popcorn/lib/python3.11/site-packages/nilearn/glm/second_level/second_level.py", line 127, in _check_first_level_contrast
if isinstance(second_level_input, FirstLevelModel):
File "/miniconda3/envs/popcorn/lib/python3.11/site-packages/pandas/core/frame.py", line 3804, in __getitem__
indexer = self.columns.get_loc(key)
File "/miniconda3/envs/popcorn/lib/python3.11/site-packages/pandas/core/indexes/base.py", line 3805, in get_loc
raise KeyError(key) from err
Looking into the code, the issue seems to be that because my second_level_input is a pandas DataFrame, the code tries to index into a column of the DataFrame called ‘0’, which obviously does not exist. The same isinstance(second_level_input, FirstLevelModel) line also appears in _check_second_level_input, which is called during fit, and that runs fine for me. In _check_second_level_input the code handles the different possible types of second_level_input, so I wonder whether doing the same in _check_first_level_contrast would get rid of the issue?
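To illustrate the kind of dispatch I mean (with a stand-in class, not nilearn's actual code):

```python
import pandas as pd

class FirstLevelModel:
    """Stand-in for nilearn's FirstLevelModel, for illustration only."""
    pass

def check_input(second_level_input):
    """Sketch: branch on the input type before indexing into it."""
    if isinstance(second_level_input, pd.DataFrame):
        return "dataframe"
    if isinstance(second_level_input, list) and all(
        isinstance(m, FirstLevelModel) for m in second_level_input
    ):
        return "list of first-level models"
    if isinstance(second_level_input, FirstLevelModel):
        return "first-level model"
    return "other"

print(check_input(pd.DataFrame({"subject_label": []})))  # dataframe
```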
I am completely new to Nilearn, so any input is appreciated! Thanks.
You need to specify a contrast_def variable that defines your contrast. For example, if you want the difference between column 1 and column 2 while controlling for column 3, it would be [1, -1, 0].
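In terms of the fitted model, the contrast weights just pick out a linear combination of the betas. A quick numpy illustration (the beta values are made up):

```python
import numpy as np

beta = np.array([2.0, 0.5, 1.3])      # hypothetical fitted effects for columns 1-3
contrast_def = np.array([1, -1, 0])   # column 1 minus column 2, ignoring column 3

effect = contrast_def @ beta
print(effect)  # 1.5
```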
Hi @Steven, thanks so much for your quick reply and for pointing me to the tutorial! In my case I simply want to test whether column 1 of my design matrix predicts the fMRI signal, so if I'm not mistaken, my contrast_def = [1, 0, 0, …, 0], where the length of the array equals the number of columns of the design matrix. I previously specified this contrast in my first-level model but did not pass the first_level_contrast argument to second_level_model.compute_contrast. My apologies. But oddly, now that I added this array as the first_level_contrast, it threw an error:
File "/miniconda3/envs/popcorn/lib/python3.11/site-packages/nilearn/glm/model.py", line 191, in Tcontrast
raise ValueError("t contrasts should be length P=%d, "
ValueError: t contrasts should be length P=102, but this is length 16
16 is the number of columns in my first-level design matrix, and according to the docs for compute_contrast, that should be the shape of my first_level_contrast argument. Please let me know what I missed here. Thank you!
Thank you for your reply Steven! Sorry to bother you again.
My first-level design matrix has 16 columns: 1 regressor of interest, 14 nuisance motion regressors, and 1 intercept column. To test whether the regressor of interest (the first column of the design matrix) correlates with the fMRI signal, I used this contrast definition at the first level, which yields an array of shape (16,) with 1 at the zeroth index and 0 everywhere else:
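In numpy terms it is something like this (a sketch, not my exact code):

```python
import numpy as np

n_regressors = 16              # columns in my first-level design matrix
contrast = np.zeros(n_regressors)
contrast[0] = 1                # weight only the regressor of interest

print(contrast.shape)  # (16,)
```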
I did not specify the second_level_contrast argument in SecondLevelModel.compute_contrast, because the docs say:
The default (None) is accepted if the design matrix has a single column, in which case the only possible contrast array((1)) is applied
I guess I might have misunderstood the documentation. I thought the second_level_contrast argument of SecondLevelModel.compute_contrast specifies the second-level contrast definition, while the first_level_contrast argument of SecondLevelModel.compute_contrast is identical to the first-level contrast definition from when I called compute_contrast on the first-level model (i.e., your screenshot)?
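So for a one-sample t-test, my understanding of that documentation passage is that the second-level design matrix is just a single intercept column, in which case [1] is the only possible contrast. A sketch with a made-up subject count:

```python
import numpy as np
import pandas as pd

n_subjects = 30  # hypothetical, not my actual sample size

# One-sample t-test: second-level design matrix is a single intercept column.
design_matrix = pd.DataFrame({"intercept": np.ones(n_subjects)})

# With one column, the only possible second-level contrast is [1],
# which is why the default (None) is accepted there.
second_level_contrast = np.array([1])
print(design_matrix.shape)  # (30, 1)
```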