Grabbing output from DTI workflow

I’m using the create_bedpostx_pipeline() workflow to process some DTI data, and right now all of the outputs get piped to an outputnode, as defined in the workflow script.

What’s the cleanest way to have all of the necessary outputs from bedpostX copied to an output directory of my choice?

It appears the possible out_fields are defined below…

Outputs::

    outputnode wraps all XFibres outputs

out_fields = ['dyads', 'dyads_disp',
              'thsamples', 'phsamples', 'fsamples',
              'mean_thsamples', 'mean_phsamples', 'mean_fsamples']

That’s based on my reading of the workflow definition.
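For reference, this is roughly how I’m building the workflow and checking which fields the outputnode actually exposes (just a sketch; the import path is what works in my nipype version and may differ in yours):

from nipype.workflows.dmri.fsl.dti import create_bedpostx_pipeline

# build the bedpostX workflow and list the fields its outputnode exposes
bpwf = create_bedpostx_pipeline()
print(bpwf.get_node('outputnode').outputs)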

What’s the cleanest way to point all of these files to a datasink directory? Do I have to connect each of the potential out_fields individually, one connection per field?

i.e. something akin to:

(bpwf, datasink, [('outputnode.dyads', 'bedpostX.@dyads'),
                  ('outputnode.thsamples', 'bedpostX.@thsamples'),
                  etc, etc, etc...])
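
Or could I just loop over the field names instead of writing every connection out by hand? Something like the sketch below is what I had in mind (assuming bpwf is the bedpostX workflow, datasink is a DataSink node, and both sit inside an enclosing workflow I’m calling meta_wf here; all of those names are just mine):

from nipype import Node, Workflow
from nipype.interfaces.io import DataSink
from nipype.workflows.dmri.fsl.dti import create_bedpostx_pipeline

bpwf = create_bedpostx_pipeline()

datasink = Node(DataSink(), name='datasink')
datasink.inputs.base_directory = '/path/to/my/output/dir'  # wherever I want the files copied

meta_wf = Workflow(name='grab_bedpostx_outputs')

# same list as in the workflow source
out_fields = ['dyads', 'dyads_disp',
              'thsamples', 'phsamples', 'fsamples',
              'mean_thsamples', 'mean_phsamples', 'mean_fsamples']

# one connection per field, each landing under bedpostX/ in the sink
for field in out_fields:
    meta_wf.connect(bpwf, 'outputnode.%s' % field,
                    datasink, 'bedpostX.@%s' % field)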

I basically want to grab all the necessary files so I can run probtrackx on them…
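
For context, the plan afterwards is roughly this (just a sketch of how I think the bedpostX outputs would feed nipype's fsl.ProbTrackX2 interface; the seed/mask paths and the exact input names are my assumptions, and meta_wf/bpwf are from the sketch above):

from nipype import Node
from nipype.interfaces import fsl

pbx = Node(fsl.ProbTrackX2(), name='probtrackx')
pbx.inputs.seed = '/path/to/seed_mask.nii.gz'         # my seed ROI
pbx.inputs.mask = '/path/to/nodif_brain_mask.nii.gz'  # brain mask used for bedpostX

# feed the merged samples from bedpostX straight into probtrackx
meta_wf.connect([(bpwf, pbx, [('outputnode.thsamples', 'thsamples'),
                              ('outputnode.phsamples', 'phsamples'),
                              ('outputnode.fsamples', 'fsamples')])])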