Fixed-effects workflow for fmriprep outputs?

Hi all,

I’m writing a fixed-effects workflow that uses outputs from fmriprep. I started working from this example by @ChrisGorgolewski and rewrote it as a Nipype workflow to make it easier to iterate over subjects and runs. As far as I could tell, the original example models a single run per session per subject; I’m having trouble adapting this code to model multiple runs per subject.

I’ve put my work so far in this repository. I’ve gotten as far as running model estimation for individual runs with FILM, but I’m not sure how to feed the outputs of FILM (filmgls) to a fixed-effects workflow (fixedfx). I built the first-level workflow so that it iterates over different runs from the start, so I think the next step is to merge the copes, varcopes, etc. across all runs of a single subject before feeding them into the fixed effects workflow.

I tried using a JoinNode (mergemodel) between filmgls and fixedfx to do this. Here’s the relevant bit from block #44:

             (mask, mergesource, [('out_file', 'mask')]),
             (filmgls, mergesource, [('copes', 'copes'),
                                     ('varcopes', 'varcopes'),
                                     ('dof_file', 'dof_file')]),
             (mergesource, mergemodel, [('mask', 'mask'),
                                        ('copes', 'copes'),
                                        ('varcopes', 'varcopes'),
                                        ('dof_file', 'dof_file')]),
             (mergemodel, fixedfx, [(('mask', pickfirst), 'flameo.mask_file'),
                                    (('copes', sort_copes), 'inputspec.copes'),
                                    ('dof_file', 'inputspec.dof_files'),
                                    (('varcopes', sort_copes), 'inputspec.varcopes'),
                                    (('copes', num_copes), 'l2model.num_copes')]),

However, I got this error:

	 [Node] Setting-up "l1.fixedfx.varcopemerge" in     "/scratch/users/nvelez/swist_cache/l1_model/task-tomloc_model-localizer_sub-04/l1/fixedfx/_run_2/varcopemerge".
ValueError                                Traceback (most recent call last)
<ipython-input-45-5b9f69c9be41> in <module>()
----> 1

/src/nipype/nipype/pipeline/engine/ in run(self, plugin, plugin_args, updatehash)
    593         if str2bool(self.config['execution']['create_report']):
    594             self._write_report_info(self.base_dir,, execgraph)
--> 595, updatehash=updatehash, config=self.config)
    596         datestr = datetime.utcnow().strftime('%Y%m%dT%H%M%S')
    597         if str2bool(self.config['execution']['write_provenance']):

/src/nipype/nipype/pipeline/plugins/ in run(self, graph, config, updatehash)
     42                 if self._status_callback:
     43                     self._status_callback(node, 'start')
---> 44       
     45                 if self._status_callback:
     46                     self._status_callback(node, 'end')

/src/nipype/nipype/pipeline/engine/ in run(self, updatehash)
    415         # Check hash, check whether run should be enforced
    416'[Node] Setting-up "%s" in "%s".', self.fullname, outdir)
--> 417         cached, updated = self.is_cached()
    419         # If the node is cached, check on pklz files and finish

/src/nipype/nipype/pipeline/engine/ in is_cached(self, rm_outdated)
    302         # Update hash
--> 303         hashed_inputs, hashvalue = self._get_hashval()
    305         # The output folder does not exist: not cached

/src/nipype/nipype/pipeline/engine/ in _get_hashval(self)
   1052             return self._hashed_inputs, self._hashvalue
-> 1054         self._check_iterfield()
   1055         hashinputs = deepcopy(self._interface.inputs)
   1056         for name in self.iterfield:

/src/nipype/nipype/pipeline/engine/ in _check_iterfield(self)
   1218             if not isdefined(getattr(self.inputs, iterfield)):
   1219                 raise ValueError(("Input %s was not set but it is listed "
-> 1220                                   "in iterfields.") % iterfield)
   1221         if len(self.iterfield) > 1:
   1222             first_len = len(

ValueError: Input in_files was not set but it is listed in iterfields.

Any suggestions on how to debug this would be much appreciated! For reference, I’ve included a diagram of the workflow at the end of the notebook.



Looking at your code, my best guess is that the function sort_copes returns an empty list. One way to debug this would be to add an assertion checking that the length of the list is not zero.
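For example, if your sort_copes looks anything like the one in the standard Nipype FSL examples (a sketch, since I haven't seen your exact helper), an assertion at the top will fail loudly instead of silently passing an empty list downstream:

```python
def sort_copes(files):
    """Regroup a list of per-run file lists by contrast.

    files: [[run1_cope1, run1_cope2, ...], [run2_cope1, run2_cope2, ...], ...]
    Returns: [[run1_cope1, run2_cope1, ...], [run1_cope2, run2_cope2, ...], ...]
    """
    # Fail early if an upstream node delivered nothing.
    assert len(files) > 0, "sort_copes received an empty list"
    numelements = len(files[0])
    outfiles = []
    for i in range(numelements):
        outfiles.insert(i, [])
        for j, elements in enumerate(files):
            outfiles[i].append(elements[i])
    return outfiles
```

If the assertion fires, the problem is upstream of the JoinNode rather than in the sorting itself.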


Probably not relevant for you anymore, but I ran into a similar issue using JoinNode (and hence found this old thread). Part of the issue is in the joinsource of the JoinNode: it should be joinsource='inputnode' in this case, not joinsource='mergesource' (the syntax here is not accurate in a couple of ways). I think your script will run with these changes (code in context):

def sort_filmgls_output(copes_grouped_by_run, varcopes_grouped_by_run):
    def reshape_lists(files_grouped_by_run):
        import numpy as np
        if not isinstance(files_grouped_by_run, list):
            files = [files_grouped_by_run]
        else:
            files = files_grouped_by_run

        # every run should produce the same number of (var)copes
        assert all(len(x) == len(files[0]) for x in files)
        n_files = len(files[0])

        all_files = np.array(files).flatten()
        files_grouped_by_contrast = all_files.reshape(
            int(len(all_files) / n_files), n_files).T.tolist()
        return files_grouped_by_contrast

    copes_grouped_by_contrast = reshape_lists(copes_grouped_by_run)
    varcopes_grouped_by_contrast = reshape_lists(varcopes_grouped_by_run)

    return copes_grouped_by_contrast, varcopes_grouped_by_contrast
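To make the regrouping concrete, here is a standalone sanity check of the same reshape logic with made-up file names (two runs, two contrasts each):

```python
import numpy as np

# Hypothetical FILM outputs: one list of cope files per run.
copes_by_run = [['run1_cope1.nii', 'run1_cope2.nii'],
                ['run2_cope1.nii', 'run2_cope2.nii']]

n_files = len(copes_by_run[0])          # contrasts per run
flat = np.array(copes_by_run).flatten()
# Reshape to (runs, contrasts), then transpose so each inner
# list holds all runs of one contrast -- what flameo expects.
by_contrast = flat.reshape(len(flat) // n_files, n_files).T.tolist()
print(by_contrast)
# [['run1_cope1.nii', 'run2_cope1.nii'], ['run1_cope2.nii', 'run2_cope2.nii']]
```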

pass_run_data = pe.Node(
    niu.IdentityInterface(fields=['mask', 'dof_file', 'copes', 'varcopes']),
    name='pass_run_data')

join_run_data = pe.JoinNode(
    niu.IdentityInterface(fields=['masks', 'dof_files', 'copes', 'varcopes']),
    joinsource='inputnode',
    joinfield=['masks', 'dof_files', 'copes', 'varcopes'],
    name='join_run_data')

group_by_contrast = pe.Node(
    niu.Function(input_names=['copes_grouped_by_run',
                              'varcopes_grouped_by_run'],
                 output_names=['copes_grouped_by_contrast',
                               'varcopes_grouped_by_contrast'],
                 function=sort_filmgls_output),
    name='group_by_contrast')
    (mask, pass_run_data, [('out_file', 'mask')]),
    (filmgls, pass_run_data, [
        ('copes', 'copes'),
        ('varcopes', 'varcopes'),
        ('dof_file', 'dof_file') ]),
    (pass_run_data, join_run_data, [
        ('mask', 'masks'),
        ('dof_file', 'dof_files'),
        ('copes', 'copes'),
        ('varcopes', 'varcopes') ]),
    (join_run_data, group_by_contrast, [
        ('copes', 'copes_grouped_by_run'),
        ('varcopes', 'varcopes_grouped_by_run')]),
    (group_by_contrast, fixed_fx, [
        ('copes_grouped_by_contrast', 'inputspec.copes'),
        ('varcopes_grouped_by_contrast', 'inputspec.varcopes')]),
    (join_run_data, fixed_fx, [
        (('masks', pickfirst), 'flameo.mask_file'),
        ('dof_files', 'inputspec.dof_files'),
        (('copes', num_copes), 'l2model.num_copes')]), ### number of runs
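The connections above also assume the small helper functions from the standard Nipype FSL examples. In case they aren't already defined in your script, a minimal sketch:

```python
def pickfirst(files):
    """Return the first file from a joined list (all run masks are
    identical in template space, so any one will do for flameo)."""
    if isinstance(files, list):
        return files[0]
    return files

def num_copes(files):
    """After the join, `files` is one list of copes per run, so its
    length is the number of runs -- what l2model needs."""
    return len(files)
```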


Wow, thank you so much!! I was actually just looking at this script again this week. :slight_smile: I’ll try it and let you know how it goes.


Hi @dae, quick side question:
You mention that the syntax here is not accurate in a couple of ways. In what ways, and where, do you mean? It would be great if you could open either an issue or a PR directly to help us improve the tutorial.


@dae: Just to follow up, your changes did work—thank you! I’m just running into a separate problem now with FLAMEO. :slight_smile: