Conjunctions with nipype

Hey,

I have two main effects at the group level (calculated using the nipype SPM interface) and I’d like to create a conjunction map (using the conjunction null rather than the global null). Does anyone have suggestions on how to do that in nipype?

Thanks

Hey! I would like to revive this thread as I have the same question. Does anyone have an answer yet?

I ended up writing a function and wrapping it in a separate workflow. I’m afraid I’m not sure how to post nicely formatted code here, but the function is:

def conj_stat_maps(map1, map2, out_file=None):
    """
    Runs a conjunction on stat maps (returning a map with the min t-stat)

    Creates a conjunction of two statistical maps (typically the result of
    EstimateContrast or Threshold from the spm interface). 
    
    Args:
        map1 (str): filename of the first stat map in the conjunction
        map2 (str): filename of the second stat map in the conjunction
        
    Optional:
        out_file (str): output filename. If None (default), creates
            'conj_map.nii' in current directory
        
    Returns:
        out_file (str): output filename (absolute path)
    
    """
    
    # Imports all required packages so the func can be run as a nipype
    # Function Node
    import nilearn.image as nli
    import os.path
    
    if out_file is None:
        out_file = 'conj_map.nii'

    conj_tmap = nli.math_img('np.minimum(img1*(img1>0), img2*(img2>0)) + '
                             'np.maximum(img1*(img1<0), img2*(img2<0))',
                             img1=map1, img2=map2)
    
    conj_tmap.to_filename(out_file)
    return os.path.abspath(out_file)
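For context, the `math_img` expression above implements a minimum-statistic conjunction: where both maps are positive each voxel keeps the smaller t-value, where both are negative it keeps the value closer to zero, and where the signs disagree the result is zero. The same logic in plain NumPy (a standalone sketch for illustration, not part of the workflow itself):

```python
import numpy as np

def conjunction(a, b):
    """Signed minimum-statistic conjunction of two t-maps:
    positive voxels keep the smaller positive value, negative voxels
    keep the value closer to zero, sign mismatches become 0."""
    return (np.minimum(a * (a > 0), b * (b > 0))
            + np.maximum(a * (a < 0), b * (b < 0)))

a = np.array([3.0, -2.0, 1.5, -4.0])
b = np.array([2.0, -3.0, -1.0, 0.5])
print(conjunction(a, b))  # [ 2. -2.  0.  0.]
```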

And then it’s wrapped as a workflow here:

import os
import sys

from nipype import Node, Workflow
from nipype.interfaces.io import SelectFiles, DataSink
from nipype.interfaces.utility import Function


def conj_maps(nipype_params, l2_subdir, con_list, map_name, extent_threshold=0,
              matlab_path='', wf_name=None):
    """
    Creates nipype workflow for conjunction of statistical maps

    Args:
        nipype_params (dict): dict with the following fields:
            wd (str): path to nipype work dir
            sinkdir (str): path to datasink base dir
        l2_subdir (str): path to directory with level2 stat map outputs 
            (relative to nipype sinkdir)
        con_list (list of str): list of two contrasts to use for conjunction
            (e.g. ['con_0001', 'con_0002'])
        map_name (str): stat map name, will typically be spmT_0001.nii for
            unthresholded map and spmT_0001_thr.nii for thresholded
        
    Optional:
        extent_threshold (int): min number of voxels for cluster thresholding
            after conjunction. If 0 (default), won't run cluster thresholding.
            Should only be used when thresholded maps are provided.
        matlab_path (str): path to system matlab. Only relevant if
            extent_threshold>0
        wf_name (str): the name to use for the workflow. If None (default), 
            the workflow name will be the function name
            
    Returns:
        nipype workflow for creating a conjunction of level2 maps
    
    """
    
    # Checks whether the nipype work directory exists and creates it
    # if it doesn't
    if not os.path.exists(nipype_params['wd']):
        os.mkdir(nipype_params['wd'])
    
    # If no workflow name was given, use the function name
    if wf_name is None:
        wf_name = sys._getframe().f_code.co_name

    con1 = con_list[0]
    con2 = con_list[1]
    
    # SelectFiles node to pick up the two stat maps
    selectfiles = Node(SelectFiles({'map1': l2_subdir + f'/{con1}/' + map_name,
                                    'map2': l2_subdir + f'/{con2}/' + map_name},
                                   base_directory=nipype_params['sinkdir'],
                                   sort_filelist=True),
                       name='selectfiles')
        
    conjmaps = Node(Function(input_names=['map1', 'map2'],
                             output_names=['out_file'],
                             function=conj_stat_maps),
                    name='conjmaps')

    datasink = Node(DataSink(base_directory=nipype_params['sinkdir'],
                             container=''),
                    name='datasink')
    
    # Initiation of the 2nd-level analysis workflow
    wf = Workflow(name=wf_name)
    wf.base_dir = nipype_params['wd']
    
    # Connect up the 2nd-level analysis components
    wf.connect([(selectfiles, conjmaps, [('map1', 'map1'), ('map2', 'map2')]),
                (conjmaps, datasink,
                 [('out_file', 'group.' + wf_name + '.@conj')])])
        
    # If a cluster threshold was provided, adds a node for cluster thresholding
    if extent_threshold > 0:
        clusterthresh = Node(niu.ClusterThresh(matlab_path=matlab_path,
                                               extent_threshold=extent_threshold),
                             name='clusterthresh')
        wf.connect([(conjmaps, clusterthresh, [('out_file', 'in_file')]),
                    (clusterthresh, datasink,
                     [('out_file', 'group.' + wf_name + '.@conj_thr'),
                      ('cluster_label_file',
                       'group.' + wf_name + '.@conj_label')])])
        
    return wf

At some point this will all be on github, but I hope it helps for now


Thank you for your reply, that should do it!

Hello @ayab,

one question: which interface are you using for the ClusterThresh function? In your code, you are specifying the node as “niu.ClusterThresh”. Which module are you importing as “niu” in your case?

Sorry, I forgot to add that:

# Set of three classes used to wrap matlab code to set a cluster threshold
# for a stat map. Based on the Threshold class, but assumes the map has
# already been thresholded at the voxel level, and adds cluster thresholding
# based only on cluster extent

import os
from string import Template

from nipype.interfaces.base import (BaseInterface, BaseInterfaceInputSpec,
                                    TraitedSpec, File, traits)
from nipype.interfaces.matlab import MatlabCommand
class ClusterThreshInputSpec(BaseInterfaceInputSpec):
    in_file = File(exists=True, copyfile=True, mandatory=True)
    extent_threshold = traits.Int(mandatory=True)
    matlab_path = traits.Str('', usedefault=True)
    out_fname = traits.Str('conj_map_thr.nii', usedefault=True)

class ClusterThreshOutputSpec(TraitedSpec):
    out_file = File(exists=True)
    cluster_label_file = File()

class ClusterThresh(BaseInterface):
    """Extracts clusters based on extent threshold from thresholded map

    Intended to be run as a Node after conjunction of thresholded maps (that
    already have voxel-level thresholding; assumes nan in all voxels that
    do not pass threshold)

    Examples
    --------
    >>> clusterThresh = ClusterThresh()
    >>> clusterThresh.inputs.in_file = '/in_path/conj_map.nii'
    >>> clusterThresh.inputs.extent_threshold = 10
    >>> clusterThresh.run()
    outputs: out_file (path to output .nii file with cluster-thresholded map)
    """

    input_spec = ClusterThreshInputSpec
    output_spec = ClusterThreshOutputSpec

    def _run_interface(self, runtime):

        # Sets output files (in the same dir as the input file)
        out_file = os.path.join(os.path.dirname(self.inputs.in_file),
                                self.inputs.out_fname)
        cluster_label_file = os.path.join(
            os.path.dirname(self.inputs.in_file), 'cluster_labels.nii')

        d = dict(in_file=self.inputs.in_file, out_file=out_file,
                 cluster_label_file=cluster_label_file,
                 extent_threshold=self.inputs.extent_threshold,
                 matlab_path=self.inputs.matlab_path)

        # Matlab code that runs the cluster thresholding, based on the spm
        # Threshold interface
        script = Template("""in_file='$in_file';
            out_file = '$out_file';
            cluster_label_file = '$cluster_label_file';
            matlab_path = '$matlab_path';
            extent_threshold = $extent_threshold;

            % If a matlab SPM path is given - adds it to the path
            if (~isempty(matlab_path))
                addpath(matlab_path);
            end

            stat_map_vol = spm_vol(in_file);
            [stat_map_data, stat_map_XYZmm] = ...
                spm_read_vols(stat_map_vol);

            Z = stat_map_data(:)';
            [x,y,z] = ind2sub(size(stat_map_data), ...
                (1:numel(stat_map_data))');
            XYZ = cat(1, x', y', z');

            XYZth = XYZ(:, ~isnan(Z));
            Zth = Z(~isnan(Z));

            max_size = 0;
            max_size_index = 0;
            th_nclusters = 0;
            nclusters = 0;
            if isempty(XYZth)
                thresholded_XYZ = [];
                thresholded_Z = [];
            else
                voxel_labels = spm_clusters(XYZth);
                nclusters = max(voxel_labels);

                thresholded_XYZ = [];
                thresholded_Z = [];
                new_vox_labels = [];

                for i = 1:nclusters
                    cluster_size = sum(voxel_labels==i);
                    if cluster_size > extent_threshold
                        thresholded_XYZ = cat(2, thresholded_XYZ, ...
                            XYZth(:,voxel_labels == i));
                        thresholded_Z = cat(2, thresholded_Z, ...
                            Zth(voxel_labels == i));
                        th_nclusters = th_nclusters + 1;
                        new_vox_labels = cat(2, new_vox_labels, ...
                            th_nclusters*ones(1,cluster_size));
                    end
                end
            end

            if isempty(thresholded_XYZ)
                thresholded_Z = [0];
                thresholded_XYZ = [1 1 1]';
                th_nclusters = 0;
            end

            spm_write_filtered(thresholded_Z, thresholded_XYZ, ...
                stat_map_vol.dim', stat_map_vol.mat, ...
                'thresholded map', out_file);

            if (th_nclusters>0)
                spm_write_filtered(new_vox_labels, thresholded_XYZ, ...
                    stat_map_vol.dim', stat_map_vol.mat, ...
                    'cluster_labels map', cluster_label_file);
            end

            exit;
            """).substitute(d)

        mlab = MatlabCommand(script=script, mfile=True)
        result = mlab.run()
        return result.runtime

    def _list_outputs(self):
        outputs = self._outputs().get()
        outputs['out_file'] = os.path.join(os.path.dirname(
            self.inputs.in_file), self.inputs.out_fname)
        outputs['cluster_label_file'] = os.path.join(os.path.dirname(
            self.inputs.in_file), 'cluster_labels.nii')
        return outputs

Just remove the niu prefix. I import a file of my own utility functions as niu, and I didn’t notice that something else from that file was needed for this to run as well. The ClusterThresh node is like SPM’s Threshold interface, but it assumes the map is already thresholded at the voxel level and only thresholds by cluster size (note that this is simple thresholding: you have to provide it with a specific minimal cluster size).
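To make the cluster-size logic concrete, here is a rough NumPy/SciPy equivalent of what the Matlab script does. This is only a sketch: `scipy.ndimage.label` with its default connectivity is not identical to SPM’s `spm_clusters` connectivity, so cluster memberships can differ at cluster edges, but the extent-thresholding idea is the same (strict `>` comparison, as in the Matlab code).

```python
import numpy as np
from scipy import ndimage

def cluster_threshold(data, extent_threshold):
    """Keep only connected clusters of non-NaN voxels whose size
    strictly exceeds extent_threshold; everything else becomes 0."""
    mask = ~np.isnan(data)
    labels, n_clusters = ndimage.label(mask)
    out = np.where(mask, data, 0.0)
    for i in range(1, n_clusters + 1):
        if (labels == i).sum() <= extent_threshold:
            out[labels == i] = 0.0
    return out

# Toy 2D example: a 3-voxel cluster survives a threshold of 2,
# an isolated single voxel does not
data = np.full((4, 4), np.nan)
data[0, 0:3] = 2.0   # cluster of size 3
data[3, 3] = 5.0     # cluster of size 1
thr = cluster_threshold(data, 2)
```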