Heudiconv Embedder Error

heudiconv

#1

Update: the bogus slice timing error was due to negative slice times in the CSA header, not an issue with heudiconv. I will leave this up in case someone else processing ABCD data runs into it.

I still get the embedder error below with all subjects, though.

I’ve been getting the error below when trying to convert Siemens DICOMs. I tried to find where exactly it originates, but I’m not sure. Can it be safely ignored? It doesn’t actually look like I have TRs < 0 when I look at the slice times from the DICOM header, which is what appears to trigger the warning here.

INFO: Converting /data/picsl/mackey_group/public_data/ABCD/test_bids/heudiconv/sub-NDARINV14R3MHDX/run005 (383 DICOMs) -> /data/picsl/mackey_group/public_data/ABCD/test_bids/heudiconv/sub-NDARINV14R3MHDX . Converter: dcm2niix . Output types: ('nii.gz',)
190204-16:49:54,884 nipype.workflow INFO:
	 [Node] Setting-up "convert" in "/tmp/dcm2niixdcazm1tk/convert".
INFO: [Node] Setting-up "convert" in "/tmp/dcm2niixdcazm1tk/convert".
190204-16:49:55,59 nipype.workflow INFO:
	 [Node] Running "convert" ("nipype.interfaces.dcm2nii.Dcm2niix"), a CommandLine Interface with command:
dcm2niix -b n -z y -x n -t n -m n -f sub-NDARINV14R3MHDX -o . -s n -v n /tmp/dcm2niixdcazm1tk/convert
INFO: [Node] Running "convert" ("nipype.interfaces.dcm2nii.Dcm2niix"), a CommandLine Interface with command:
dcm2niix -b n -z y -x n -t n -m n -f sub-NDARINV14R3MHDX -o . -s n -v n /tmp/dcm2niixdcazm1tk/convert
190204-16:49:57,825 nipype.interface INFO:
	 stdout 2019-02-04T16:49:57.825764:Compression will be faster with 'pigz' installed
INFO: stdout 2019-02-04T16:49:57.825764:Compression will be faster with 'pigz' installed
190204-16:49:57,826 nipype.interface INFO:
	 stdout 2019-02-04T16:49:57.825764:Chris Rorden's dcm2niiX version v1.0.20181114  GCC7.3.0 (64-bit Linux)
INFO: stdout 2019-02-04T16:49:57.825764:Chris Rorden's dcm2niiX version v1.0.20181114  GCC7.3.0 (64-bit Linux)
190204-16:49:57,826 nipype.interface INFO:
	 stdout 2019-02-04T16:49:57.825764:Found 383 DICOM file(s)
INFO: stdout 2019-02-04T16:49:57.825764:Found 383 DICOM file(s)
190204-16:49:57,826 nipype.interface INFO:
	 stdout 2019-02-04T16:49:57.825764:slices stacked despite varying acquisition numbers (if this is not desired recompile with 'mySegmentByAcq')
INFO: stdout 2019-02-04T16:49:57.825764:slices stacked despite varying acquisition numbers (if this is not desired recompile with 'mySegmentByAcq')
190204-16:49:57,826 nipype.interface INFO:
	 stdout 2019-02-04T16:49:57.825764:Warning: Siemens MoCo? Bogus slice timing (range -400..320, TR=800ms)
INFO: stdout 2019-02-04T16:49:57.825764:Warning: Siemens MoCo? Bogus slice timing (range -400..320, TR=800ms)

And yes, I’m going to install pigz as well.
Slice times using dicom_hdr -slice_times:

```
-- Siemens timing (60 entries): 0.0 86399600.0 80.0 86399680.0 160.0 86399760.0 240.0 86399840.0 320.0 86399920.0 0.0 86399600.0 80.0 86399680.0 160.0 86399760.0 240.0 86399840.0 320.0 86399920.0 0.0 86399600.0 80.0 86399680.0 160.0 86399760.0 240.0 86399840.0 320.0 86399920.0 0.0 86399600.0 80.0 86399680.0 160.0 86399760.0 240.0 86399840.0 320.0 86399920.0 0.0 86399600.0 80.0 86399680.0 160.0 86399760.0 240.0 86399840.0 320.0 86399920.0 0.0 86399600.0 80.0 86399680.0 160.0 86399760.0 240.0 86399840.0 320.0 86399920.0
```

#2

I’m also getting the error below from the embedder, and I haven’t figured out whether it’s worrisome or not.

	 [Node] Error on "embedder" (/tmp/embedmetaz0iyilwm/embedder)
WARNING: [Node] Error on "embedder" (/tmp/embedmetaz0iyilwm/embedder)
ERROR: Embedding failed: 'odict_values' object does not support indexing

@mgxd
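
For what it’s worth, that `'odict_values' object does not support indexing` message is the standard Python 3 error you get when code tries to index the view returned by `OrderedDict.values()` directly. A minimal reproduction (the dictionary contents here are made up for illustration, not taken from heudiconv’s internals):

```python
from collections import OrderedDict

# Hypothetical metadata dict, just to demonstrate the error.
meta = OrderedDict(RepetitionTime=0.8, EchoTime=0.03)

# In Python 3, .values() returns a view object, which cannot be indexed:
try:
    meta.values()[0]
except TypeError as err:
    print(err)  # e.g. 'odict_values' object does not support indexing

# The usual fix is to materialize the view as a list first:
first = list(meta.values())[0]
print(first)
```

If that is what is happening inside the embedder, it would be a bug in the converter’s code path rather than anything wrong with your data.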


#3

I’m not very familiar with any of this, but your slice timing has values like “86399600.0”, while the warning reports a range of -400 through 320. Maybe those values being around 86 million are throwing it off. I followed what he did here: https://www.youtube.com/watch?v=O1kZAuR7E00

I was on a Siemens machine and it worked out. I’m not sure why your slice timing looks like that, though.


#4

Yes, see my update above ^ @Justin_Smith1. The data has something wrong with the slice times: there are negative values in the CSA header (the -400..320 range referenced in the bogus slice timing warning, which is outside the acceptable range). These negative times overflow when read as unsigned, which produces the huge values seen with dicom_hdr.
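
As a sanity check on the overflow explanation: the huge dicom_hdr values are exactly what you get if negative slice times (in milliseconds) wrap around modulo one day. This is a sketch of the arithmetic, not something confirmed against the CSA header spec:

```python
# One day in milliseconds; DICOM time-of-day fields wrap at this boundary.
DAY_MS = 24 * 60 * 60 * 1000  # 86,400,000

# Negative slice times implied by the "range -400..320" warning.
negative_times_ms = [-400, -320, -240, -160, -80]

# Python's % returns a non-negative result, mimicking the wraparound.
wrapped = [t % DAY_MS for t in negative_times_ms]
print(wrapped)  # [86399600, 86399680, 86399760, 86399840, 86399920]
```

Those wrapped values match the 86399600.0 through 86399920.0 entries in the dicom_hdr output above exactly, which supports the negative-slice-times reading.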

However, I am still getting the embedder error with other subjects whose slice times are correct!


#5

Hi @Ursula_Tooley

This problem is being addressed in https://github.com/nipy/heudiconv/pull/306


#6

Update from heudiconv: if you hit the embedding issue, your output should look very similar to having used the --minmeta flag (just the dcm2niix sidecar JSON), so you do not need to rerun.

Awesome, thanks @mgxd! It’s not immediately clear to me from reading through the issue: does this affect my converted NIfTIs and .json sidecars (i.e., do I have to reconvert?), and if so, when should I pull a new version of heudiconv? My files were passing BIDS validation, so I wasn’t too worried, but I can also rerun with --minmeta if that will provide additional reassurance that things are okay.