Hello,
I received the following error almost at the end of the pipeline.
Any idea why this could be?
Thanks
BL
Applying LTAtransformInterp (resample_type 1)
writing to /out/freesurfer/sub-001/mri/T2.prenorm.mgz...
mri_normalize -sigma 0.5 -nonmax_suppress 0 -min_dist 1 -aseg /out/freesurfer/sub-001/mri/aseg.presurf.mgz -surface /out/freesurfer/sub-001/surf/rh.white identity.nofile -surface /out/freesurfer/sub-001/surf/lh.white identity.nofile /out/freesurfer/sub-001/mri/T2.prenorm.mgz /out/freesurfer/sub-001/mri/T2.norm.mgz
mghRead(/out/freesurfer/sub-001/mri/T2.prenorm.mgz): could not read 409600 bytes at slice 231
using Gaussian smoothing of bias field, sigma=0.500
disabling nonmaximum suppression
retaining points that are at least 1.000mm from the boundary
using segmentation for initial intensity normalization
reading from /out/freesurfer/sub-001/mri/T2.prenorm.mgz...
No such file or directory
mri_normalize: could not open source file /out/freesurfer/sub-001/mri/T2.prenorm.mgz
No such file or directory
Linux f3b30f91ee80 3.16.0-30-generic #40~14.04.1-Ubuntu SMP Thu Jan 15 17:43:14 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
recon-all -s sub-001 exited with ERRORS at Tue Jun 5 19:09:49 UTC 2018
To report a problem, see http://surfer.nmr.mgh.harvard.edu/fswiki/BugReporting
Hi Chris,
I had to kill the preprocessing midway and re-ran it without deleting the precomputed files, which failed. After cleaning out the precomputed files/folders, it went well; no more errors.
(Using Docker, version 1.0.15.) Thanks
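For reference, the cleanup step described above can be sketched as follows. This is a minimal sketch; `OUT` and `SUBJECT` are placeholders for your output directory and subject label, not paths taken from the logs.

```shell
# Hedged sketch of the cleanup step: remove one subject's precomputed
# FreeSurfer and fMRIPrep outputs so the pipeline starts fresh instead of
# resuming from a partially written file (e.g. a truncated T2.prenorm.mgz).
# OUT and SUBJECT are placeholder values; adjust them to your setup.
OUT="${OUT:-data/out}"
SUBJECT="${SUBJECT:-sub-001}"
rm -rf "${OUT}/freesurfer/${SUBJECT}" "${OUT}/fmriprep/${SUBJECT}"
echo "Removed precomputed outputs for ${SUBJECT} under ${OUT}"
```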
Hi Chris,
I am still having the same issue (with a new subject's data). This time, even deleting the precomputed files (/…/freesurfer/subject/ and fmriprep/subject/) did not work.
I am using Docker (on Linux) with the command line: fmriprep-docker --fs-license-file /usr/…/.txt data/in/ data/out participant --participant-label 110 --low-mem
I did check the T2; it looks good in MRIcron. I got the same error before, but that subject's preprocessing went well after deleting the precomputed FreeSurfer and fMRIPrep data and re-running the pipeline. This time, deleting the precomputed data did not work either. I am having the same issue with a couple more subjects today. I can email you the logs (which ones?) if that helps in figuring out the issue.
Yes, it fails at almost the same place, except it points to a different slice number; this time it says slice 210.
mghRead(/out/freesurfer/sub-110/mri/T2.prenorm.mgz): could not read 409600 bytes at slice 210
I have copied a few lines from the terminal output just before the error message; see below.
Thanks
BL
j_ras = (-0.0590446, 0.994429, 0.0873195)
k_ras = (-0.0100862, -0.088062, 0.996064)
INFO: Reading transformation from file /out/freesurfer/sub-110/mri/transforms/T2raw.lta...
Reading transform with LTAreadEx()
reading template info from volume /out/freesurfer/sub-110/mri/orig.mgz...
INFO: Applying transformation from file /out/freesurfer/sub-110/mri/transforms/T2raw.lta...
---------------------------------
INFO: Transform Matrix (linear_ras_to_ras)
0.99998 -0.00482 0.00409 0.00893;
0.00486 0.99992 -0.01136 0.37669;
-0.00403 0.01138 0.99993 -0.02971;
0.00000 0.00000 0.00000 1.00000;
---------------------------------
Applying LTAtransformInterp (resample_type 1)
writing to /out/freesurfer/sub-110/mri/T2.prenorm.mgz...
mri_normalize -sigma 0.5 -nonmax_suppress 0 -min_dist 1 -aseg /out/freesurfer/sub-110/mri/aseg.presurf.mgz -surface /out/freesurfer/sub-110/surf/rh.white identity.nofile -surface /out/freesurfer/sub-110/surf/lh.white identity.nofile /out/freesurfer/sub-110/mri/T2.prenorm.mgz /out/freesurfer/sub-110/mri/T2.norm.mgz
mghRead(/out/freesurfer/sub-110/mri/T2.prenorm.mgz): could not read 409600 bytes at slice 210
using Gaussian smoothing of bias field, sigma=0.500
disabling nonmaximum suppression
retaining points that are at least 1.000mm from the boundary
using segmentation for initial intensity normalization
reading from /out/freesurfer/sub-110/mri/T2.prenorm.mgz...
No such file or directory
mri_normalize: could not open source file /out/freesurfer/sub-110/mri/T2.prenorm.mgz
No such file or directory
Linux 39d43690994f 3.16.0-30-generic #40~14.04.1-Ubuntu SMP Thu Jan 15 17:43:14 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
recon-all -s sub-110 exited with ERRORS at Tue Aug 7 02:37:24 UTC 2018
For more details, see the log file /out/freesurfer/sub-110/scripts/recon-all.log
To report a problem, see http://surfer.nmr.mgh.harvard.edu/fswiki/BugReporting
Standard error:
Return code: 1
Sentry is attempting to send 1 pending error messages
Waiting up to 10 seconds
Press Ctrl-C to quit
fMRIPrep: Please report errors to https://github.com/poldracklab/fmriprep/issues
Just an update on my last email:
I just ran another subject and got the same error, this time with "cannot allocate memory" (below).
Command line: fmriprep-docker --fs-license-file /usr/…/.txt data/in data/out participant --participant-label 112 --low-mem --mem-mb 32000
Thanks
BL
Applying LTAtransformInterp (resample_type 1)
writing to /out/freesurfer/sub-112/mri/T2.prenorm.mgz...
mri_normalize -sigma 0.5 -nonmax_suppress 0 -min_dist 1 -aseg /out/freesurfer/sub-112/mri/aseg.presurf.mgz -surface /out/freesurfer/sub-112/surf/rh.white identity.nofile -surface /out/freesurfer/sub-112/surf/lh.white identity.nofile /out/freesurfer/sub-112/mri/T2.prenorm.mgz /out/freesurfer/sub-112/mri/T2.norm.mgz
znzTAGskip: tag=1077952576, failed to calloc 1077952512 bytes!
using Gaussian smoothing of bias field, sigma=0.500
disabling nonmaximum suppression
retaining points that are at least 1.000mm from the boundary
using segmentation for initial intensity normalization
reading from /out/freesurfer/sub-112/mri/T2.prenorm.mgz...
Cannot allocate memory
Linux 58f5f98badef 3.16.0-30-generic #40~14.04.1-Ubuntu SMP Thu Jan 15 17:43:14 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
recon-all -s sub-112 exited with ERRORS at Tue Aug 7 11:56:22 UTC 2018
For more details, see the log file /out/freesurfer/sub-112/scripts/recon-all.log
To report a problem, see http://surfer.nmr.mgh.harvard.edu/fswiki/BugReporting
Standard error:
Return code: 1
Sentry is attempting to send 1 pending error messages
Waiting up to 10 seconds
Press Ctrl-C to quit
fMRIPrep: Please report errors to https://github.com/poldracklab/fmriprep/issues
I have added "--mem-mb 32000" to the command line to allocate a maximum of 32 GB to Docker (as shown below). Please let me know if this needs to be changed or does not work this way.
Thank you so much
BL
#==================================================
fmriprep-docker --fs-license-file /usr/…/.txt data_in data_out participant --participant-label 112 --mem-mb 32000
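One thing worth noting, as a sketch rather than a definitive diagnosis: --mem-mb tells fMRIPrep how much memory it may plan to use, but it does not raise any Docker limit. On a Linux host, containers see the host's memory unless explicitly capped, so a quick sanity check is whether the host can actually provide the requested 32 GB (this assumes /proc/meminfo is available, i.e. a Linux host):

```shell
# Hedged sanity check: compare the host's physical memory against the
# 32000 MB requested via --mem-mb. If the host has less, recon-all can
# still hit "Cannot allocate memory" regardless of the fMRIPrep setting.
mem_total_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
mem_total_mb=$((mem_total_kb / 1024))
echo "Host MemTotal: ${mem_total_mb} MB"
if [ "${mem_total_mb}" -lt 32000 ]; then
    echo "WARNING: less physical memory than the 32000 MB requested via --mem-mb"
fi
```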
This conversation seems divorced from the original context… But we can certainly try that. I would test by running inside the container without mounting your output directory, to make sure that it’s not the mount that’s the problem.
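A minimal sketch of that test follows. The image tag is an assumption based on "version 1.0.15", and the paths are placeholders; the snippet only prints the command so you can inspect it before running it on a machine with Docker.

```shell
# Hedged sketch: open a shell inside the fMRIPrep image with the input
# mounted read-only but WITHOUT mounting the output directory, so outputs
# go to the container filesystem. If the mghRead truncation disappears,
# the mounted output volume is the likely culprit.
# The image tag below is an assumption based on "version 1.0.15".
img="poldracklab/fmriprep:1.0.15"
cmd="docker run -it --rm -v ${PWD}/data/in:/data:ro --entrypoint bash ${img}"
echo "Try: ${cmd}"
# Inside the container, point the pipeline's output at a container-local
# path such as /tmp/out and see whether the error still occurs.
```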