Hi there,
I have been using fMRIPrep on local clusters without problems and am now expanding to an HPC system on campus.
The BIDS structure for a single subject has been created with heudiconv and validated.
I am trying to run the following command on the anatomical images of a single subject:
singularity run --cleanenv -B /crash-work/fhopp/bids_nii/:/data /usr/local/fmriprep-1.5.0.simg \
    /data /data/derivatives/fmriprep \
    participant \
    --participant-label 01 -w /crash-work/fhopp/ --nthreads 16 --omp-nthreads 16 \
    --mem_mb 10000 --anat-only --notrack --fs-no-reconall \
    --fs-license-file /crash-work/fhopp/.fs_license.txt
Curiously, it appears that fMRIPrep starts and can read /data, as I get the following output:
This dataset appears to be BIDS compatible.
Summary:                 Available Tasks:                       Available Modalities:
98 Files, 3.03GB         TODO: full task name for exp           T1w
1 - Subject              TODO: full task name for physiobase    T2w
1 - Session              TODO: full task name for practise      dwi
                                                                bold
                                                                events
                                                                fieldmap
If you have any questions, please post on https://neurostars.org/tags/bids.
Making sure the input data is BIDS compliant (warnings can be ignored in most cases).
191107-18:10:11,472 nipype.workflow IMPORTANT:
Running fMRIPREP version 1.5.0:
* BIDS dataset path: /data.
* Participant list: ['01'].
* Run identifier: 20191107-181009_5e7b185d-ec35-4ce8-92d9-6c810ba165c4.
191107-18:10:20,531 nipype.workflow IMPORTANT:
Works derived from this fMRIPrep execution should include the following boilerplate:
Results included in this manuscript come from preprocessing
performed using *fMRIPrep* 1.5.0
(@fmriprep1; @fmriprep2; RRID:SCR_016216),
which is based on *Nipype* 1.2.2
(@nipype1; @nipype2; RRID:SCR_002502).
However, I then get the following errors:
Captured warning (<class 'UserWarning'>): [Errno 2] No such file or directory. joblib will operate in serial mode
/usr/local/miniconda/lib/python3.7/site-packages/sklearn/externals/joblib/_multiprocessing_helpers.py:28: UserWarning: [Errno 2] No such file or directory. joblib will operate in serial mode
warnings.warn('%s. joblib will operate in serial mode' % (e,))
fMRIPrep failed: [Errno 2] No such file or directory
Traceback (most recent call last):
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 136, in __init__
mp_context = mp.context.get_context(
AttributeError: module 'multiprocessing.context' has no attribute 'get_context'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/miniconda/lib/python3.7/site-packages/fmriprep/cli/run.py", line 407, in main
fmriprep_wf.run(**plugin_settings)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/engine/workflows.py", line 583, in run
runner = plugin_mod(plugin_args=plugin_args)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 144, in __init__
self.pool = ProcessPoolExecutor(max_workers=self.processors)
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/process.py", line 542, in __init__
pending_work_items=self._pending_work_items)
File "/usr/local/miniconda/lib/python3.7/concurrent/futures/process.py", line 158, in __init__
super().__init__(max_size, ctx=ctx)
File "/usr/local/miniconda/lib/python3.7/multiprocessing/queues.py", line 42, in __init__
self._rlock = ctx.Lock()
File "/usr/local/miniconda/lib/python3.7/multiprocessing/context.py", line 67, in Lock
return Lock(ctx=self.get_context())
File "/usr/local/miniconda/lib/python3.7/multiprocessing/synchronize.py", line 162, in __init__
SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
File "/usr/local/miniconda/lib/python3.7/multiprocessing/synchronize.py", line 59, in __init__
unlink_now)
FileNotFoundError: [Errno 2] No such file or directory
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/miniconda/bin/fmriprep", line 10, in <module>
sys.exit(main())
File "/usr/local/miniconda/lib/python3.7/site-packages/fmriprep/cli/run.py", line 483, in main
subject_list, output_dir, work_dir, run_uuid, packagename='fmriprep')
File "/usr/local/miniconda/lib/python3.7/site-packages/niworkflows/reports/core.py", line 431, in generate_reports
for subject_label in subject_list
File "/usr/local/miniconda/lib/python3.7/site-packages/niworkflows/reports/core.py", line 431, in <listcomp>
for subject_label in subject_list
File "/usr/local/miniconda/lib/python3.7/site-packages/niworkflows/reports/core.py", line 418, in run_reports
subject_id=subject_label, packagename=packagename)
File "/usr/local/miniconda/lib/python3.7/site-packages/niworkflows/reports/core.py", line 267, in __init__
self._load_config(Path(config))
File "/usr/local/miniconda/lib/python3.7/site-packages/niworkflows/reports/core.py", line 285, in _load_config
self.index(settings['sections'])
File "/usr/local/miniconda/lib/python3.7/site-packages/niworkflows/reports/core.py", line 294, in index
self.layout = BIDSLayout(self.root, config='figures', validate=False)
File "/usr/local/miniconda/lib/python3.7/site-packages/bids/layout/layout.py", line 185, in __init__
self._validate_root()
File "/usr/local/miniconda/lib/python3.7/site-packages/bids/layout/layout.py", line 318, in _validate_root
raise ValueError("BIDS root does not exist: %s" % self.root)
ValueError: BIDS root does not exist: /crash-work/fhopp/reportlets/fmriprep/sub-01
I am not sure about the root cause of this problem (I am very new to Singularity), but I wonder whether it is an issue with the bind mounts (-B) or something else. Why does fMRIPrep report that the BIDS root does not exist when it could read its contents earlier? I should add that the IT person for the HPC created the image, so for now I cannot provide details on how the Singularity image was built, but I will reach out to them if that information is relevant.
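In case it clarifies what I mean by a binding problem: this is the variant I was thinking of trying next (just a guess on my part, assuming the issue is that the work directory I pass with -w is a host path that is not mounted inside the container; /work below is simply a container-side mount point name I made up):

singularity run --cleanenv \
    -B /crash-work/fhopp/bids_nii/:/data \
    -B /crash-work/fhopp/:/work \
    /usr/local/fmriprep-1.5.0.simg \
    /data /data/derivatives/fmriprep \
    participant \
    --participant-label 01 -w /work --nthreads 16 --omp-nthreads 16 \
    --mem_mb 10000 --anat-only --notrack --fs-no-reconall \
    --fs-license-file /work/.fs_license.txt

Does that direction make sense, or is the error pointing at something else entirely?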
Thanks in advance!!