Dear all,
I am writing a nipype workflow to be run in parallel. I have the following folder structure:
base_dir
|_ folder1
| |_ t1w.nii.gz
|_ another_folder
| |_ t1w.nii.gz
.....
|_ last_folder
| |_ t1w.nii.gz
DataGrabber does an excellent job and takes all the t1w.nii.gz files as input:
import nipype
from nipype.interfaces import fsl

grabber = nipype.Node(interface=nipype.DataGrabber(infields=['arg'], outfields=['out_file']), name='grabber')
grabber.inputs.base_directory = base_dir
grabber.inputs.sort_filelist = False
grabber.inputs.template = '*/%s.nii.gz'
grabber.inputs.arg = 't1w'
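As a sanity check (independent of nipype): filling arg='t1w' into the template '*/%s.nii.gz' makes DataGrabber match the same files as globbing base_dir for */t1w.nii.gz. The folder names below are just stand-ins for my real ones:

```python
import glob
import os
import tempfile

# Build a throwaway copy of the folder layout (stand-in for the real base_dir).
base_dir = tempfile.mkdtemp()
for sub in ('folder1', 'another_folder', 'last_folder'):
    os.makedirs(os.path.join(base_dir, sub))
    open(os.path.join(base_dir, sub, 't1w.nii.gz'), 'w').close()

# DataGrabber substitutes 'arg' into the template, so it effectively globs this:
matches = sorted(glob.glob(os.path.join(base_dir, '*/%s.nii.gz' % 't1w')))
print(matches)  # one t1w.nii.gz per subject folder
```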
Then I set up a DataSink:
sink = nipype.Node(interface=nipype.DataSink(),name='sink')
sink.inputs.base_directory = base_dir
and write the remaining parts of the pipeline:
# Neck removal with FSL robustfov
neck_remove = nipype.MapNode(interface=fsl.RobustFOV(), name='neck_remove', iterfield=['in_file'])
neck_remove.inputs.out_roi = 't1w_fov.nii.gz'

workflow = nipype.Workflow(name='workflow')
workflow.connect([(grabber, neck_remove, [('out_file', 'in_file')]),
                  (neck_remove, sink, [('out_roi', '@in_file')])])
workflow.run(plugin='MultiProc', plugin_args={'n_procs': 5})
This works nicely and puts the outputs in this structure:
base_dir
|_ neck_remove0
| |_ t1w_fov.nii.gz
|_ neck_remove1
| |_ t1w_fov.nii.gz
....
|_ folder1
| |_ t1w.nii.gz
|_ another_folder
| |_ t1w.nii.gz
.....
Is it possible to force DataSink to put the output files in the same directory as the respective inputs? The structure I had in mind was:
base_dir
|_ folder1
| |_ t1w.nii.gz
| |_ t1w_fov.nii.gz
|_ another_folder
| |_ t1w.nii.gz
| |_ t1w_fov.nii.gz
.....
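In plain Python terms, the path mapping I am after is the following (the absolute path is made up for illustration):

```python
import os

def desired_output_path(in_file):
    """For base_dir/<sub>/t1w.nii.gz, return base_dir/<sub>/t1w_fov.nii.gz,
    i.e. the output should land next to its own input."""
    return os.path.join(os.path.dirname(in_file), 't1w_fov.nii.gz')

print(desired_output_path('/data/base_dir/folder1/t1w.nii.gz'))
# -> /data/base_dir/folder1/t1w_fov.nii.gz
```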
Thank you very much!