How to use nimare.decode.continuous

Dear all,

I recently wanted to use Neurosynth's meta-analytic decoding, but the server appears to be down and I cannot see the results of the cognitive decoding. I therefore turned to NiMARE and tried to use the equivalent functions. However, fitting the model is time-consuming and I am still waiting for the results.

My question is whether the code I am using is correct. For example, I am not sure whether anything needs to be passed to CorrelationDecoder(), or whether the defaults are fine. I followed Neurosynth's approach and want to decode unthresholded z-maps. Here is my code:

import nibabel as nib
import os
from nimare.extract import fetch_neurosynth
from nimare.io import convert_neurosynth_to_dataset
from nimare.decode import continuous

atom = nib.load('xxx')
ex = nib.load('xxx')
vis = nib.load('xxx')

out_dir = os.path.abspath('xxx')  
os.makedirs(out_dir, exist_ok=True)

files = fetch_neurosynth(
    data_dir=out_dir,
    version="7",
    overwrite=False,
    source="abstract",
    vocab="terms",
)

neurosynth_db = files[0]
neurosynth_dset = convert_neurosynth_to_dataset(
    coordinates_file=neurosynth_db["coordinates"],
    metadata_file=neurosynth_db["metadata"],
    annotations_files=neurosynth_db["features"],
)
neurosynth_dset.save(os.path.join(out_dir, "neurosynth_dataset.pkl.gz"))

decoder = continuous.CorrelationDecoder()
decoder.fit(neurosynth_dset)

decoded_df1 = decoder.transform(atom)
decoded_df2 = decoder.transform(ex)
decoded_df3 = decoder.transform(vis)
decoded_df1 = decoded_df1.sort_values(by='correlation', ascending=False)
decoded_df2 = decoded_df2.sort_values(by='correlation', ascending=False)
decoded_df3 = decoded_df3.sort_values(by='correlation', ascending=False)

Hi there,

Your code looks good to me. Yes, training the decoder takes a while; I suggest increasing the number of CPUs with the n_cores argument of CorrelationDecoder. Once training finishes, you can save the decoder object for future use, as you did with the Neurosynth dataset object:

decoder.save(os.path.join(out_dir, "neurosynth_decoder.pkl.gz"))
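In case it helps to see what the decoder is doing under the hood, here is a small self-contained sketch of the correlation-decoding idea (plain NumPy with made-up toy data; none of these arrays come from NiMARE): correlate the voxel values of your unthresholded z-map with each term's meta-analytic map, then rank terms by Pearson correlation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "meta-analytic maps": one flattened voxel vector per term.
term_maps = {
    "memory": rng.normal(size=1000),
    "vision": rng.normal(size=1000),
    "language": rng.normal(size=1000),
}

# Toy input map, constructed to resemble the "vision" map.
input_map = term_maps["vision"] + 0.5 * rng.normal(size=1000)

# Pearson correlation between the input map and each term map.
results = {
    term: float(np.corrcoef(input_map, tmap)[0, 1])
    for term, tmap in term_maps.items()
}

# Rank terms by correlation, highest first
# (this is what your sort_values calls do on the real output).
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
```

Here "vision" comes out on top, since the input map was built from it; with real data the top-ranked terms are the ones whose meta-analytic maps best match your z-map.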

Alternatively, you can download the precomputed meta-analytic maps from OSF and load them into the decoder with the load_imgs method, which is much faster than fitting from scratch.

decoder = CorrelationDecoder()
decoder.load_imgs(path_to_images, mask=neurosynth_dset.masker)
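If load_imgs also accepts a mapping from feature names to image files (I am going from memory here, so please double-check the NiMARE docs; the filename pattern below is an assumption for illustration), you could build that mapping from the downloaded folder yourself:

```python
import os
from glob import glob

def build_feature_imgs(img_dir):
    """Map feature names to NIfTI files in a downloaded maps folder.

    Assumes one '<feature>.nii.gz' file per feature; adjust the pattern
    to whatever naming the downloaded maps actually use.
    """
    feature_imgs = {}
    for path in sorted(glob(os.path.join(img_dir, "*.nii.gz"))):
        feature = os.path.basename(path)[: -len(".nii.gz")]
        feature_imgs[feature] = path
    return feature_imgs
```

Then something like decoder.load_imgs(build_feature_imgs(path_to_images), mask=neurosynth_dset.masker) should work; passing the folder path directly, as in the snippet above, may also be supported.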

Thank you very much for your kind help; I will try what you have suggested.