Nipype: clean working directory while workflow is running

nipype
fmri

#1

Hello,

I need trial-wise beta estimates for my analysis, so I have set up runs and trials as iterables. For each subject, run, and trial I compute a GLM and save the beta images. This creates a huge amount of data in the working directory, because each node saves its data (e.g. the functional .nii files for each run) in a separate folder.

I know there is the experimental config option to delete node folders, but it crashes my workflow: SPM looks for a file that is no longer there (my guess is that the file gets deleted while it is actually still needed).
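For reference, this is the option I mean, as I understand it from the config docs (set in `nipype.cfg` or per workflow; it is marked experimental):

```ini
[execution]
remove_node_directories = true
```

The same thing can also be set in Python via `workflow.config['execution']['remove_node_directories'] = True`.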

As a workaround, I tried connecting a self-written function to the output of the datasink. This way I know that the workflow for a specific subject, run, and trial has finished; the function then deletes the node folders for that subject, run, and trial. However, the workflow still errors out because some nodes try to access files that have already been deleted.
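A minimal sketch of such a cleanup function (the folder-name pattern is an assumption based on nipype's default iterable naming; adjust it to your own working-directory layout):

```python
import shutil
from pathlib import Path


def remove_trial_dirs(work_dir, subject, run, trial):
    """Delete all node folders under ``work_dir`` that belong to one
    (subject, run, trial) combination.

    Assumes iterable values are encoded in folder names like
    ``_subject_01_run_2_trial_5`` -- check how your working directory
    actually names them before using this pattern.
    """
    pattern = f"_subject_{subject}_run_{run}_trial_{trial}"
    for folder in Path(work_dir).rglob(pattern):
        if folder.is_dir():
            shutil.rmtree(folder, ignore_errors=True)
```

The function can be wrapped in a `nipype.Function` node and connected after the datasink.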

I came here to ask if anybody else has tried to clean up the working directory “on the fly” and might be able to give me some advice. Thanks!


#2

Does no one have an idea?


#3

Could you please file an issue on nipype’s GitHub page with the specific crash error message that you received? It’s possible that we can fix the experimental option more easily than adding an extra script.

One way to add a script would be through the status_callback option of a plugin. This would allow you to monitor and remove things as necessary, but I think it would be easier to let the workflow engine manage these things.
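A hedged sketch of such a callback (the `(node, status)` signature, the `'end'` status string, and `node.output_dir()` are my reading of the plugin interface; verify them against your nipype version):

```python
import shutil


def cleanup_callback(node, status):
    """Remove a finished node's folder from the working directory.

    Assumes the execution plugin invokes the callback as
    ``callback(node, status)`` with ``status == 'end'`` on normal
    completion, and that ``node.output_dir()`` returns the node's
    working folder -- verify both before relying on this.
    """
    if status == 'end':
        shutil.rmtree(node.output_dir(), ignore_errors=True)


# Passed to the workflow roughly like:
# wf.run(plugin='MultiProc',
#        plugin_args={'status_callback': cleanup_callback})
```

Note this has the same hazard as the experimental option: a folder must not be removed while a downstream node still needs files from it.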

PS: nipype also automatically removes unneeded outputs by default to reduce disk-space requirements.