"ImageTypeText" is missing from sidecars when using dcm2bids

Hi all,
when I use the dcm2niix in MRIcroGL, I get some extra info called “ImageTypeText” that I don’t get when using dcm2bids. (This is not the “ImageType” - both dcm2bids and dcm2niix give that. I’m specifically looking for “ImageTypeText”.) We are using a Siemens VIDA which, when taking a T1 or T2, puts out the raw original and a “normalized” image, and the “ImageTypeText” lets us know which version is which. Can I configure the sidecar output (with the config.json file or somehow) to show this info? It’s minor, but valuable info.

Hi @Wade_Weber and welcome to Neurostars!

You can make additional JSON sidecar changes through the dcm2bids config file. For example, the following adds "TaskName": "rest" to the sidecar of a functional file.

  "descriptions": [
    {
      "dataType": "func",
      "modalityLabel": "bold",
      "customLabels": "task-rest",
      "criteria": {
        "SeriesDescription": "Axial EPI-FMRI (Interleaved I to S)*"
      },
      "sidecarChanges": {
        "TaskName": "rest"
      }
    }
  ]

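Applied to your case: since sidecarChanges values are static, you can't copy ImageTypeText into the sidecar dynamically this way, but you can split the original and normalized series into separate descriptions so each output is labeled. A hedged sketch, not a drop-in config — the SeriesDescription pattern and ImageType values here are assumptions (Siemens normalized reconstructions typically include "NORM" in ImageType, and rec- is the BIDS entity for reconstruction variants); check your own sidecars for the actual values:

```json
{
  "descriptions": [
    {
      "dataType": "anat",
      "modalityLabel": "T1w",
      "customLabels": "rec-orig",
      "criteria": {
        "SeriesDescription": "t1_mprage*",
        "ImageType": ["ORIGINAL", "PRIMARY", "M", "ND"]
      }
    },
    {
      "dataType": "anat",
      "modalityLabel": "T1w",
      "customLabels": "rec-norm",
      "criteria": {
        "SeriesDescription": "t1_mprage*",
        "ImageType": ["ORIGINAL", "PRIMARY", "M", "ND", "NORM"]
      }
    }
  ]
}
```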
However, if this field needs to be pulled dynamically from the files, then that is a different story. I don’t know whether that field is specific to MRIcroGL or simply to the dcm2niix version it bundles. You can make sure the dcm2niix that dcm2bids invokes is the same version as the one in MRIcroGL.


Please see dcm2niix issue 236. In brief, Siemens used to store private text in the public ImageType (0008,0008) tag; with XA30, Siemens is beginning to store these details in the private ImageTypeText (0021,1175) tag. Confusingly, the same XA30 device can store the information in either the public or the private tag depending on whether the user chooses Enhanced or Classic DICOMs. The Siemens Research Collaboration Manager associated with your center can help you understand this transition.

The Siemens XA data has evolved dramatically from XA10 through XA30 thanks to community engagement. This means we get much richer datasets. However, due to this evolution it is crucial that you always use the latest stable release of dcm2niix. Here is a BIDS sidecar from an XA30 sequence in which dcm2niix extracted the newly minted private ImageTypeText tag. The final line of the sidecar shows that the recent stable release was used: "ConversionSoftwareVersion": "v1.0.20220717".
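If you want to verify after conversion which sidecars carry the new field, here is a small stdlib-only sketch. The function names are my own (not part of dcm2bids or dcm2niix), and the "NORM" heuristic is an assumption based on how Siemens typically flags normalized reconstructions:

```python
import json
from pathlib import Path

def classify_sidecar(sidecar: dict) -> str:
    """Return 'normalized' or 'original' for one BIDS sidecar dict,
    preferring the new private ImageTypeText field when present and
    falling back to the public ImageType field otherwise."""
    image_type = sidecar.get("ImageTypeText") or sidecar.get("ImageType") or []
    # Siemens normalized reconstructions typically include "NORM" (assumption)
    return "normalized" if "NORM" in image_type else "original"

def report(bids_dir: str) -> dict:
    """Map each sidecar path under bids_dir to its classification."""
    return {
        str(p): classify_sidecar(json.loads(p.read_text()))
        for p in Path(bids_dir).rglob("*.json")
    }
```

Running report() on your BIDS directory then shows at a glance which files dcm2niix tagged as normalized.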

Hey Chris,

Thanks for the help. We’re using the Enhanced DICOM output from the VIDA. This is my first attempt at using dcm2bids - if they aren’t using the most up-to-date version of dcm2niix, is there a way for dcm2bids to use an “external” version of dcm2niix?

Hi @Wade_Weber,

dcm2bids will take whatever version of dcm2niix you have in your path. You can download/install the latest version of dcm2niix without any issue.
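To check which dcm2niix binary dcm2bids would pick up on your system (useful on a cluster where several copies may be installed), a quick stdlib sketch — resolve_dcm2niix is a name of my own, not part of either tool:

```python
import shutil
from typing import Optional

def resolve_dcm2niix(path: Optional[str] = None) -> Optional[str]:
    """Return the dcm2niix executable found on PATH (or on the given
    search path), i.e. the one dcm2bids would invoke, or None."""
    return shutil.which("dcm2niix", path=path)

if __name__ == "__main__":
    exe = resolve_dcm2niix()
    print(exe or "dcm2niix not found on PATH")
```

If this prints an unexpected location, adjust your PATH so the freshly installed dcm2niix comes first.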


Maybe you want to ask the developers of Dcm2Bids directly. They do say dcm2niix is mandatory, but any version can work; how well depends on your config file.

I do strongly encourage you to develop a relationship with the Siemens Research Collaboration Manager associated with your site. They can provide insight regarding how to get the most from your instrument and are your advocate for lobbying Siemens to ease research usage of these new instruments.

Ooooh… I was about to post that I have the most up-to-date dcm2niix on my computer, but I’m doing the work on a different cluster… Awesome, thanks. I’ll contact them to update dcm2niix.

Thanks to you, too, @neurolabusc!


Hi @abore,

I’m changing my answer a bit - I am actually working with member AustinBipoler (you helped her last September with a different issue with the dcm2bids container we’re using). So we aren’t using the stand-alone version; we’re using the dcm2bids container, and I’m sure the dcm2niix version is pinned inside it. Do you know who I can contact to get the version used inside the container updated to the most recent one?


Working on it. I’m the maintainer of dcm2bids.

It should be available. I created a new tag called: 2.1.9_latest_dcm2niix
Tell me if it fixes your issues.


Oh, awesome! Thanks, I’ll give it a try and let you know.

FANTASTIC! That worked perfectly. Thanks so much @abore.