ICA-AROMA error with fMRIPrep: JoinTSVColumns requires a value for input 'join_file'

fmriprep

#1

Hello,
I am using fMRIPrep (version 1.0.8) to preprocess functional data (resting + task).
The command I used is below:

singularity run $ImagePath /scratch/users/xiaoqian/aTBS/source /scratch/users/xiaoqian/aTBS/processed participant --participant-label $SUBID --omp-nthreads 8 --ignore fieldmaps --ignore slicetiming --longitudinal --fs-license-file /home/xiaoqian/scripts/freesurfer/license.txt --no-freesurfer --low-mem --use-aroma -w /scratch/users/xiaoqian/work --ignore-aroma-denoising-errors

Then for some tasks, I got the error in the title: `JoinTSVColumns requires a value for input 'join_file'`.

Since I have already added the `--ignore-aroma-denoising-errors` option, I have no clue why this happened. Any suggestions would be helpful.

Thanks,
Xiaoqian


#2

A few things to check:

  1. Do you get an error without `--ignore-aroma-denoising-errors`? If so, what is it?
  2. Could you try passing `--cleanenv` to singularity (just after `run`)?
  3. Could you try the latest version of FMRIPREP (1.0.11)?
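To be concrete, here is a sketch of the original call with `--cleanenv` inserted just after `run` (all paths and flags copied from the command in the first post):

```shell
singularity run --cleanenv $ImagePath \
    /scratch/users/xiaoqian/aTBS/source \
    /scratch/users/xiaoqian/aTBS/processed \
    participant --participant-label $SUBID \
    --omp-nthreads 8 --ignore fieldmaps --ignore slicetiming \
    --longitudinal --fs-license-file /home/xiaoqian/scripts/freesurfer/license.txt \
    --no-freesurfer --low-mem --use-aroma \
    -w /scratch/users/xiaoqian/work --ignore-aroma-denoising-errors
```

`--cleanenv` prevents host environment variables (e.g. a host `PYTHONPATH` or `TEMPLATEFLOW_HOME`) from leaking into the container, which is a common cause of otherwise unexplainable failures.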

#3

Thanks for the quick reply.
> 1. Do you get an error without `--ignore-aroma-denoising-errors`? If so, what is it?

I actually tried that, and the errors are the same.

> 2. Could you try passing `--cleanenv` to singularity (just after `run`)?
> 3. Could you try the latest version of FMRIPREP (1.0.11)?

I will try the last two and let you know how it goes.

Best~
Xiaoqian


#4

Hi @Helen, could you try the latest release of FMRIPREP (1.0.14)? That one should fix this issue.


#5

Thanks, Oesteban.
Trying now.
Another thing worth mentioning: while using version 1.0.11, when I increased the number of nodes to 16 (and memory to 64 GB), the error went away. My dataset is longitudinal with two time points (for each subject, 1 structural + 2 functional scans on two different days). Does this mean the error is actually related to memory requirements, and that I have to make sure I allocate at least 32 GB of memory for each time point?

Best,
Xiaoqian


#6

It is very likely that this error is related to memory requirements. If one of the tasks that the node depends on was killed (e.g., by the system's out-of-memory killer), Python is unaware of it and nipype can carry on without noticing. When the dependent node is then submitted, it finds that the expected input file does not exist and crashes.
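One way to reduce the chance of the OOM killer silently taking out a child process is to tell fMRIPrep how much memory its scheduler may use, leaving some headroom below the job allocation. A sketch, assuming the `--mem_mb` option available in the 1.0.x command line (the 60000 value is an illustrative headroom choice for a 64 GB allocation, not a recommendation from this thread):

```shell
# With a 64 GB allocation, cap nipype's resource scheduler a bit below it
# so concurrently running nodes cannot collectively exceed the limit.
singularity run --cleanenv $ImagePath \
    /scratch/users/xiaoqian/aTBS/source \
    /scratch/users/xiaoqian/aTBS/processed \
    participant --participant-label $SUBID \
    --mem_mb 60000 --omp-nthreads 8 --low-mem --use-aroma \
    -w /scratch/users/xiaoqian/work
```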

Glad you worked it out!