Hey everyone,
I’m currently running mriqc on a BIDS dataset. To do so, I’m using the poldracklab/mriqc Docker container, invoked as follows:
sudo docker run -it --rm -v $bidsdir:/data:ro -v $output_dir:/out poldracklab/mriqc:latest /data /out --verbose-reports participant --participant_label 07
For some participants everything works fine. For others, however, the command fails while processing the functional images.
The error seems to happen during generation of the HTML report and looks like this:
2018-02-06 16:12:04,692 niworkflows:INFO Successfully created report (/usr/local/src/mriqc/work/workflow_enumerator/funcMRIQC/SpatialNormalization/_in_file_..data..sub-07..func..sub-07_task-test_run-02_bold.nii.gz/EPI2MNI/report.svg)

Fatal Python error: Segmentation fault

Current thread 0x00007fbd4ccca700 (most recent call first):
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/image.py", line 411 in _make_image
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/image.py", line 719 in make_image
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/image.py", line 495 in draw
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/artist.py", line 63 in draw_wrapper
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/image.py", line 147 in flush_images
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/image.py", line 163 in _draw_list_compositing_images
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/axes/_base.py", line 2409 in draw
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/artist.py", line 63 in draw_wrapper
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/image.py", line 139 in _draw_list_compositing_images
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/figure.py", line 1143 in draw
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/artist.py", line 63 in draw_wrapper
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/backends/backend_svg.py", line 1248 in _print_svg
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/backends/backend_svg.py", line 1212 in print_svg
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/backend_bases.py", line 2192 in print_figure
  File "/usr/local/miniconda/lib/python3.6/site-packages/matplotlib/figure.py", line 1572 in savefig
  File "<string>", line 34 in _big_plot
  File "/usr/local/miniconda/lib/python3.6/site-packages/niworkflows/nipype/interfaces/utility/wrappers.py", line 137 in _run_interface
  File "/usr/local/miniconda/lib/python3.6/site-packages/niworkflows/nipype/interfaces/base/core.py", line 485 in run
  File "/usr/local/miniconda/lib/python3.6/site-packages/niworkflows/nipype/pipeline/engine/nodes.py", line 596 in _run_command
  File "/usr/local/miniconda/lib/python3.6/site-packages/niworkflows/nipype/pipeline/engine/nodes.py", line 520 in _run_interface
  File "/usr/local/miniconda/lib/python3.6/site-packages/niworkflows/nipype/pipeline/engine/nodes.py", line 443 in run
  File "/usr/local/miniconda/lib/python3.6/site-packages/niworkflows/nipype/pipeline/plugins/multiproc.py", line 62 in run_node
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/pool.py", line 119 in worker
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/process.py", line 93 in run
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/process.py", line 249 in _bootstrap
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/popen_fork.py", line 74 in _launch
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/popen_fork.py", line 20 in __init__
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/context.py", line 277 in _Popen
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/context.py", line 223 in _Popen
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/process.py", line 105 in start
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/pool.py", line 233 in _repopulate_pool
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/pool.py", line 240 in _maintain_pool
  File "/usr/local/miniconda/lib/python3.6/multiprocessing/pool.py", line 366 in _handle_workers
  File "/usr/local/miniconda/lib/python3.6/threading.py", line 864 in run
  File "/usr/local/miniconda/lib/python3.6/threading.py", line 916 in _bootstrap_inner
  File "/usr/local/miniconda/lib/python3.6/threading.py", line 884 in _bootstrap
Then the Docker container keeps running, doing nothing, until I kill the process. The corresponding .json file in mriqc/derivatives/ (e.g. sub-07_task-test_run-01_bold.json) looks fine and normal.
Weirdly, the failure isn’t random but participant-specific: sub-01, sub-03, sub-04, sub-05, and sub-06 work every time, while the remaining participants fail every time, no matter what.
Comparing the .json files of participants that worked and those that didn’t, I see no fundamental differences.
The same holds for the underlying raw data: multiband EPI, TR = 0.512 s, 790 volumes per run, two runs per participant.
The structural pipeline works completely fine for every participant.
Does anyone have an idea what the problem might be?
Best, Peer