Issues with home directory in fMRIPrep?

Summary of what happened:

I want my home directory to be local under scratch and NOT at home on my HPC. My base home directory is /home/06953/js678/

Command used (and if a helper script was used, a link to the helper script or the command generated):

I’ve written the following command to change my home directory…

unset PYTHONPATH; singularity run \
    -B /scratch/06953/jes6785/JORGE_TEST_DATA/:/scratch/06953/js678/JORGE_TEST_DATA/ \
    -B /scratch/06953/js678/home-fmriprep/:/home/fmriprep \
    --home /home/fmriprep \
    --cleanenv \
    /work/06953/js678/Containers/fmriprep_23.0.2.sif \
    /scratch/06953/js678/JORGE_TEST_DATA/ \
    /scratch/06953/js678/JORGE_TEST_DATA/derivatives/fmriprep-v23.0.2/ \
    participant \
    --participant-label sub-a001 \
    -w /scratch/06953/jes6785/working_dir/ \
    --fs-license-file /scratch/06953/js678/JORGE_TEST_DATA/code/license.txt \
    -v --skip_bids_validation \
    --bids-filter-file /scratch/06953/js678/JORGE_TEST_DATA/code/ses-01_bf.json \
    --anat-derivatives /scratch/06953/js678/JORGE_TEST_DATA/derivatives/freesurfer/

Version:

23.0.2

Environment (Docker, Singularity, custom installation):

Singularity

Data formatted according to a validatable standard? Please provide the output of the validator:

Relevant log outputs (up to 20 lines):

Every time it gets to the autorecon stage, it errors with the following message…

nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node autorecon1.

Cmdline:
        recon-all -autorecon1 -i /scratch/06953/jes6785/JORGE_TEST_DATA/sub-a001/ses-01/anat/sub-a001_ses-01_rec-norm_T1w.nii.gz -noskullstrip -noT2pial -noFLAIRpial -cw256 -hires -openmp 8 -subjid sub-a001 -sd /scratch/06953/jes6785/JORGE_TEST_DATA/derivatives/fmriprep-v23.0.2/sourcedata/freesurfer -expert /scratch/06953/jes6785/working_dir/fmriprep_23_0_wf/single_subject_a001_wf/anat_preproc_wf/surface_recon_wf/autorecon1/expert.opts
Stdout:

Stderr:
        **/home/fmriprep/06953/js678/working_dir/fmriprep_23_0_wf/single_subject_a001_wf/anat_preproc_wf/surface_recon_wf/autorecon1**: No such file or directory.
Traceback:
        RuntimeError: subprocess exited with code 1.

Screenshots / relevant information:

I don’t understand why it is going to this folder location (bolded above) when I specified home to be /home/fmriprep/; it appears to be inserting “06953/js678”, which is actually my base user directory, into the path. Does anyone know why this might be happening and how I can rectify it?

When I remove the home directory change from the command, the preprocessing seems to work, but it eventually fails, as my HPC doesn’t seem to like moving across directories when running fMRIPrep.

Hi @AustinBipolar,

I have relabeled your post as Software Support and reorganized it according to the post template, which provides useful information. Please use this category/template for future software-related issues.

There are a few problems here we should address before we continue:

This is a bit confusing to me. Are we talking about two different computing systems here? Containers cannot work across systems unless one is mounted on the other; is that the case? And what is the motivation for changing home directories?

Please review this page for reusing precomputed anatomical information. In short, not only is this anatomical fast track not recommended, but it is also not correct to point it at FreeSurfer outputs (it should instead point to another fMRIPrep or other BIDS-valid anatomical output).

This isn’t correct for doing what you are trying to do. Here is the description of the --home argument from singularity run --help:

-H, --home string a home directory specification. spec can either be a src path or src:dest pair. src is the source path of the home directory outside the container and dest overrides the home directory within the container. (default "/root")

So, unless you specify a src:dest style of argument, Singularity expects the path to exist outside the container. Since you are mounting a drive and renaming it as /home/fmriprep, I am going to guess that it doesn’t exist outside the container.
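If so, a minimal sketch of making the src:dest form work (paths taken from your command; the image name is a placeholder, and the host-side directory must exist before the container starts):

# Create the host-side home directory first, then map it to /home/fmriprep inside the container
mkdir -p /scratch/06953/js678/home-fmriprep
singularity run --home /scratch/06953/js678/home-fmriprep:/home/fmriprep <image.sif> [fmriprep arguments]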

This is a bit redundant; since you are not renaming the mount here, you can leave it as just -B /scratch/06953/jes6785/JORGE_TEST_DATA/.
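For reference, a single-path bind and the equivalent src:dest pair behave the same:

# These two bind specs are equivalent; a bare path is mounted at the same location inside the container:
-B /scratch/06953/jes6785/JORGE_TEST_DATA/
-B /scratch/06953/jes6785/JORGE_TEST_DATA/:/scratch/06953/jes6785/JORGE_TEST_DATA/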

Looking at the original bind strings, it looks like this working_dir subdirectory isn’t mounted.
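One way to address that (assuming the working directory lives under the same scratch tree) is to bind it explicitly, or to bind the whole /scratch/06953/jes6785 tree as in the command below:

# Explicit bind so the working directory is visible inside the container:
-B /scratch/06953/jes6785/working_dir/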

Are your data BIDS valid? What is the output from the BIDS validator? Anything special about the filter file that could be relevant here?
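If you have the command-line bids-validator installed on the host, a quick check looks something like:

# Validate the dataset root before running fMRIPrep (bids-validator is assumed to be installed):
bids-validator /scratch/06953/js678/JORGE_TEST_DATA/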

In short, I would update your container to the most recent version (23.1.3 at this time; a sketch of pulling it follows the command below) and try the following:

unset PYTHONPATH; singularity run \
    -e --containall \
    -B /scratch/06953/jes6785 \
    --home /scratch/06953/js678/home-fmriprep/:/home/fmriprep \
    /work/06953/js678/Containers/fmriprep_23.1.3.sif \
    /scratch/06953/js678/JORGE_TEST_DATA/ \
    /scratch/06953/js678/JORGE_TEST_DATA/derivatives/fmriprep-v23.1.3/ \
    participant \
    --participant-label sub-a001 \
    -w /scratch/06953/jes6785/working_dir/ \
    --fs-license-file /scratch/06953/js678/JORGE_TEST_DATA/code/license.txt \
    -v --bids-filter-file /scratch/06953/js678/JORGE_TEST_DATA/code/ses-01_bf.json
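For the container update, a sketch of pulling the newer image (assuming Docker Hub access on the node where you build):

# Build a SIF from the official nipreps image on Docker Hub:
singularity build /work/06953/js678/Containers/fmriprep_23.1.3.sif docker://nipreps/fmriprep:23.1.3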

Best,
Steven