'NiftiMasker' object has no attribute 'high_variance_confounds'

Summary of what happened:

I was running the MACM sample code with nimare, and when it reaches

mkda = MKDAChi2(kernel__r=2)  # should be 10
results = mkda.fit(dset_sel, dset_unsel)
corr = FWECorrector(method="montecarlo", n_iters=10)  # should be 10000
cres = corr.transform(results)

it returns the error: 'NiftiMasker' object has no attribute 'high_variance_confounds'. What went wrong here?

Hi @Nan_Wang,

What version of nilearn are you using?


thank you for your response!
nilearn’s version is: 0.10.0
nimare’s version is 0.0.12

Hmm, that is strange. I am unable to reproduce the error with a simple setup. I'm on 0.10.2, which should only be a little different from your version and is probably not related to the bug you're getting.

import nilearn.maskers
masker = nilearn.maskers.NiftiMasker()


Just in case, can you try updating to 0.10.3 (the current stable release) and trying again?

So far I have tried 0.10.0, 0.10.1, 0.10.2, and 0.10.3, and none of them worked. Quite strange.

All give exactly the same error. I can't really figure out what went wrong, because the code is the same as the MACM tutorial, except that I replaced the dataset file with: dset_file = "neurosynth_dataset.pkl.gz". The rest is the same. :frowning:

Can you post a minimalist example producing the issue? Thanks in advance.

Thank you bthirion!

By a minimalist example, do you mean the exact line of code that produces this error?
Then it's this line of code:
cres = corr.transform(results)

And the rest of the code can be seen here (all the same except that the dataset has been replaced):

Hi @Nan_Wang,

I wonder if this file (neurosynth_dataset.pkl.gz) was created with the same nilearn and nimare versions you are loading it with. Could you try creating a new Neurosynth Dataset object to see if that solves the issue?


I'm also struggling a bit with the example. Can you give a script that does not rely on external data, or provide the data along with the script?