Registration and Transformation to QSIPrep output

Dear Neurostar community,

I am trying to register and transform a parcellation from FreeSurfer (the Destrieux atlas, i.e. aparc.a2009s+aseg.mgz in the FreeSurfer mri output folder) to the preprocessed diffusion MRI scans.

I have been using ANTsPy. I think I was successful in achieving my goal, but I have the impression I am not using a standard strategy.

What strategy would you advise here?

I have looked into a few different approaches:

  1. I tried using the .h5 files output from qsiprep, though I wasn’t successful at finding code that works with ANTsPy. Any suggestions?
transformFile = '/derivatives/qsiprep/sub-x/anat/sub-x_from-MNI152NLPC_mode-image_xfm.h5'
  2. I used the middle volume of the diffusion scan as the reference and registered/transformed the FreeSurfer parcellation to it, using this code:
import ants
import numpy as np

movDir = '/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'
targetDir = '/derivatives/qsiprep/sub-x/ses-04/dwi/sub-x_ses-04_acq-HCPdir99_space-ACPC_desc-preproc_dwi.nii.gz'


movImg = ants.image_read(movDir)
targImg = ants.image_read(targetDir)

##1: We need the images to have the same dimensionality in order to perform the registration.
# Given that the diffusion data are 4D, extract a single 3D volume to serve as the reference.
# Get the number of volumes in the 4th dimension
_, _, _, num_volumes = targImg.shape 

# Find the middle index of the 4th dimension
middle_index = num_volumes // 2

# Convert the 4D ANTs image to a NumPy array for slicing
targ_data = targImg.numpy()

# Extract the 3D middle volume
middle_volume_data = targ_data[:, :, :, middle_index]

midTargImg = ants.from_numpy(middle_volume_data, 
                            spacing=targImg.spacing[:3],
                            origin=targImg.origin[:3],  # Use the first 3 elements of the origin
                            direction=targImg.direction[:3, :3])  # Extract the 3x3 direction matrix

##2 Run the registration
regAnat2Diff = ants.registration( movImg, midTargImg, 'SyN', reg_iterations = [100,100,20] )
mytx = regAnat2Diff['invtransforms']

##3 Apply the transformation
atlaswarpedimage = ants.apply_transforms( fixed = midTargImg, 
                                       moving = movImg , 
                                       transformlist = mytx, 
                                       interpolator  = 'nearestNeighbor', 
                                       whichtoinvert = [True,False])

ants.plot( midTargImg, atlaswarpedimage, overlay_alpha = 0.5 )

  3. To avoid the debatable choice of the middle diffusion volume, I thought about using the anatomical scan output from qsiprep instead, which should be a 3D scan in the same space as the diffusion scan. Do you think this approach is better? On my end the output looked worse than approach #2.
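On approach #2: rather than the middle volume, a common and more principled choice is a b=0 volume, identified from the .bval sidecar that qsiprep writes next to the preprocessed dwi (the sidecar path below is a hypothetical example). A minimal numpy sketch of the selection logic:

```python
import numpy as np

def pick_b0_index(bvals, thresh=50):
    """Return the index of the first volume whose b-value is below thresh,
    i.e. a b=0 (non-diffusion-weighted) volume."""
    bvals = np.asarray(bvals, dtype=float)
    b0s = np.flatnonzero(bvals < thresh)
    if b0s.size == 0:
        raise ValueError("No b=0 volume found in this series")
    return int(b0s[0])

# In practice the b-values would come from the sidecar, e.g.
#   bvals = np.loadtxt('sub-x_ses-04_..._desc-preproc_dwi.bval')
# hypothetical values here:
bvals = [0, 1000, 1000, 0, 2000, 2000]
idx = pick_b0_index(bvals)  # -> 0
# Then, instead of the middle volume above:
#   b0_volume_data = targ_data[:, :, :, idx]
```

b=0 volumes have roughly T2-weighted contrast, so they tend to register to anatomy more reliably than a strongly diffusion-weighted volume.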

Thanks in advance for the help!

Hi @ldaumail,

The .h5 file should work directly with apply_transforms: ants.registration.apply_transforms — ANTsPy dev (latest) documentation

I’d rather use the qsiprep anatomical as the registration target.

I believe the genericLabel interpolator is preferred for parcellations in ANTs.
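As a toy numpy illustration (not ANTs code) of why a label-aware interpolator matters: intensity-style interpolation averages label IDs and can produce labels that exist in neither neighbor:

```python
import numpy as np

# Two neighboring voxels of a parcellation: label 11 next to label 50
labels = np.array([11.0, 50.0])

# Linear/BSpline-style interpolation halfway between them averages the IDs,
# inventing a value that is not a real label in the atlas
interpolated = labels.mean()  # 30.5

# A label interpolator (nearestNeighbor, genericLabel) instead picks one of
# the existing labels
picked = labels[0]

assert interpolated not in labels  # invented, meaningless label value
assert picked in labels            # valid atlas label
```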

The following is similar to how QSIRecon aligns the FreeSurfer brain to QSIPrep output. Note that you have to convert the FreeSurfer brain to NIfTI format first.

# Register FreeSurfer brain to QSIPrep T1w
antsRegistration --collapse-output-transforms 1 \
    --dimensionality 3 --float 0 \
    --initial-moving-transform [ ${QSIPREP_T1}, ${FS_BRAIN_NII}, 1 ] \
    --initialize-transforms-per-stage 0 --interpolation BSpline \
    --output [ ${OUTDIR}/transform, ${OUTDIR}/transform_Warped.nii.gz ] \
    --transform Rigid[ 0.1 ] \
    --metric Mattes[ ${QSIPREP_T1}, ${FS_BRAIN_NII}, 1, 32, Random, 0.25 ] \
    --convergence [ 1000x500x250x100, 1e-06, 10 ] \
    --smoothing-sigmas 3.0x2.0x1.0x0.0mm --shrink-factors 8x4x2x1 \
    --use-histogram-matching 0 \
    --masks [ ${QSIPREP_T1_MASK}, NULL ] \
    --winsorize-image-intensities [ 0.002, 0.998 ] \
    --write-composite-transform 0

Best,
Steven


Hi Steven,

Thanks for your reply.

I am a bit confused by the bash code you posted. I can’t find any of those parameters in the Python version.

Please see attached my outputs when I use either the anatomical from the qsiprep output or the .h5 file. It looks like both transformations yielded poor outcomes. The worst is with the .h5 file (either direction yields similar output).

movDir = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'
targetDir = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-preproc_T1w.nii.gz'

movImg = ants.image_read(movDir)
targImg = ants.image_read(targetDir)

#3 bis: Apply the transform using the transform file output from qsiprep
mytx = ['/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/qsiprep/sub-x/anat/sub-x_from-ACPC_to-MNI152NLin2009cAsym_mode-image_xfm.h5']
atlaswarpedimage = ants.apply_transforms( fixed = targImg, 
                                       moving = movImg , 
                                       transformlist = mytx, 
                                       interpolator  = 'genericLabel', 
                                       whichtoinvert = [False])
ants.plot( targImg, atlaswarpedimage, overlay_alpha = 0.5 )

Best,
-Loïc


The bash code is for the ANTs CLI (Anatomy of an antsRegistration call · ANTsX/ANTs Wiki · GitHub), and its parameters map onto the Python version: Registration — ANTsPy dev (latest) documentation. Which parameters are you confused about?

The .h5 you used is from ACPC (preprocessed) to MNI. FreeSurfer is not in MNI space, so that xfm is not relevant here.
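A quick way to sanity-check a transform's direction is to parse the BIDS from-/to- entities in its filename. The helper below is hypothetical (not part of qsiprep or ANTsPy), just to make the convention explicit:

```python
import re

def xfm_direction(filename):
    """Parse the from-/to- entities of a BIDS transform filename and
    return (source_space, target_space); either may be None if absent."""
    src = re.search(r'from-([a-zA-Z0-9]+)', filename)
    dst = re.search(r'to-([a-zA-Z0-9]+)', filename)
    return (src.group(1) if src else None,
            dst.group(1) if dst else None)

src, dst = xfm_direction('sub-x_from-ACPC_to-MNI152NLin2009cAsym_mode-image_xfm.h5')
# -> ('ACPC', 'MNI152NLin2009cAsym'): this maps the preprocessed (ACPC) T1w
# to MNI, so it cannot bring a native-space FreeSurfer parcellation into
# ACPC space.
```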

Best,
Steven

I am not finding those:

Also, you recommended genericLabel for the interpolator, but I see BSpline in the code you shared.

Is --convergence a parameter that corresponds to reg_iterations in Python?

Is the bash code you shared one single function that corresponds to both ants.registration and ants.apply_transforms?

Could you write the bash code in Python so I can see the equivalent of each parameter?

Thanks for your patience,

Best,
-Loïc

The bash code is for registering the fs brain and qsiprep t1 together. That same transform would then be applied to the atlas with the label interpolation.

Thanks, I’ll try this out then. I wanted to build an exclusively Python-coded pipeline, however. Is it possible to do this with ANTsPy?

import ants

# Load images
qsiprep_t1 = ants.image_read("${QSIPREP_T1}")
fs_brain_nii = ants.image_read("${FS_BRAIN_NII}")
qsiprep_t1_mask = ants.image_read("${QSIPREP_T1_MASK}")

# Perform registration
reg = ants.registration(
    fixed=qsiprep_t1,
    moving=fs_brain_nii,
    type_of_transform="Rigid",  # Corresponds to --transform Rigid[ 0.1 ]
    # ANTsPy initializes with a center-of-mass transform by default,
    # which matches --initial-moving-transform [ fixed, moving, 1 ]
    mask=qsiprep_t1_mask,  # Corresponds to --masks [ ${QSIPREP_T1_MASK}, NULL ]
    reg_iterations=(1000, 500, 250, 100),  # Corresponds to --convergence [ 1000x500x250x100, 1e-06, 10 ]
    verbose=True
)

# Save outputs
ants.image_write(reg["warpedmovout"], "${OUTDIR}/transform_Warped.nii.gz")
# reg["fwdtransforms"] is a list of transform file paths, not ANTsTransform
# objects, so copy the file instead of calling ants.write_transform on it
import shutil
shutil.copyfile(reg["fwdtransforms"][0], "${OUTDIR}/transform_0GenericAffine.mat")

Hi Steven,

Thanks for the help. I am still having issues however.

Here is the code I use:

import ants
import numpy as np
import nibabel as nib
import os

WORK_DIR = '/work'

MOV_FSBRAIN_MGZ = '/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'
FIXED_QSIPREPT1 = '/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-preproc_T1w.nii.gz'
QSIPREP_T1_MASK = '/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-brain_mask.nii.gz'

## We need to convert fs brain .mgz to nifti format for registration
# Load the .mgz file
mgz_img = nib.load(MOV_FSBRAIN_MGZ)

# Save as NIfTI format
MOV_FSBRAIN_NII = os.path.join(WORK_DIR, "sub-x/aparc.a2009s+aseg.nii")
nib.save(mgz_img, MOV_FSBRAIN_NII)

# movImg = ants.image_read(movDir)
# targImg = ants.image_read(targetDir)

##2 Run the registration
qsiprep_t1 = ants.image_read(FIXED_QSIPREPT1)
fs_brain_nii = ants.image_read(MOV_FSBRAIN_NII)
qsiprep_t1_mask = ants.image_read(QSIPREP_T1_MASK)

# Perform registration
reg = ants.registration(
    fixed=qsiprep_t1,
    moving=fs_brain_nii,
    type_of_transform='SyN',  # changed from 'Rigid'
    mask=qsiprep_t1_mask,  # Corresponds to --masks [ ${QSIPREP_T1_MASK} NULL ]
    reg_iterations=(1000, 500, 250, 100),  # Corresponds to --convergence [ 1000x500x250x100, 1e-06, 10 ]
    verbose=True
)

# Save outputs
ants.image_write(reg["warpedmovout"], os.path.join(WORK_DIR, "sub-x/transform_Warped.nii.gz"))
#ants.write_transform(reg["fwdtransforms"], os.path.join(WORK_DIR, "sub-x/transform"))

mytx = reg['invtransforms']

##3 Apply the transformation
fsbrain_warped = ants.apply_transforms( fixed = qsiprep_t1, 
                                       moving = fs_brain_nii , 
                                       transformlist = mytx,                                       
                                       interpolator  = 'genericLabel', 
                                       whichtoinvert = [True, False])

ants.plot( qsiprep_t1, fsbrain_warped, overlay_alpha = 0.5 )

Here is the output:

I’m also unable to save the transform:

ants.write_transform(reg["fwdtransforms"], os.path.join(WORK_DIR, "sub-x/transform"))

I am also unable to use it in place of invtransforms.

Thanks,
Best,
-Loïc

Hi @ldaumail,

Why use SyN if you are registering the same brain to itself?

Why are you inverting the xfm here, if the applying step is fs->qsiprep, just like the registration step?

An error message would help.

Best,
Steven

I’m not sure I get this; a rigid registration would only translate/rotate the existing scans. The fs brain is in a different space from the qsiprep output anatomical, isn’t it?

If I run a rigid registration, this is the output I am getting:

When trying to save the fwdtransforms, this is the error I am getting:
Exception: Only ANTsTransform instances can be written to file. Check that you are not passing in a filepath to a saved transform.

Yes, but the brain is still unwarped relative to native space; it is just not rotated to ACPC orientation, so a rigid transform is enough.

Also, it looks like you are registering the aparc+aseg, not the brain output from FS.

Would you be able to say what is wrong in the following code, then?

import ants
import numpy as np
import nibabel as nib
import os

WORK_DIR = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/work'

MOV_FSBRAIN_MGZ = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'
FIXED_QSIPREPT1 = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-preproc_T1w.nii.gz'
QSIPREP_T1_MASK = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-brain_mask.nii.gz'

## We need to convert fs brain .mgz to nifti format for registration
# Load the .mgz file
mgz_img = nib.load(MOV_FSBRAIN_MGZ)

# Save as NIfTI format
MOV_FSBRAIN_NII = os.path.join(WORK_DIR, "sub-x/aparc.a2009s+aseg.nii")
nib.save(mgz_img, MOV_FSBRAIN_NII)

# movImg = ants.image_read(movDir)
# targImg = ants.image_read(targetDir)

##2 Run the registration
# regAnat2Diff = ants.registration( movImg, targImg, 'SyN', initial_transform = [targImg, movImg, 1], aff_metric = mattes,reg_iterations = [100,100,20] )

qsiprep_t1 = ants.image_read(FIXED_QSIPREPT1)
fs_brain_nii = ants.image_read(MOV_FSBRAIN_NII)
qsiprep_t1_mask = ants.image_read(QSIPREP_T1_MASK)

# Perform registration
reg = ants.registration(
    fixed=qsiprep_t1,
    moving=fs_brain_nii,
    type_of_transform='Rigid',  # Corresponds to --transform Rigid[ 0.1 ]
    mask=qsiprep_t1_mask,  # Corresponds to --masks [ ${QSIPREP_T1_MASK} NULL ]
    reg_iterations=(1000, 500, 250, 100),  # Corresponds to --convergence [ 1000x500x250x100, 1e-06, 10 ]
    verbose=True
)

# Save outputs
ants.image_write(reg["warpedmovout"], os.path.join(WORK_DIR, "sub-x/transform_Warped.nii.gz"))
#ants.write_transform(reg["fwdtransforms"], os.path.join(WORK_DIR, "sub-x/transform"))


mytx = reg['invtransforms']

##3 Apply the transformation
fsbrain_warped = ants.apply_transforms( fixed = qsiprep_t1, 
                                       moving = fs_brain_nii , 
                                       transformlist = mytx,                                       
                                       interpolator  = 'genericLabel', 
                                       ) #whichtoinvert = [True, False]

ants.plot( qsiprep_t1, fsbrain_warped, overlay_alpha = 0.5 )

Why is the qsiprep T1 not perfectly matching the warped fs brain?

You are registering the parcellation instead of the T1 brain from FS. If that brain is not masked, you should supply a mask for it (I forget whether it is masked, but I think it is).

Have you checked whether those two files (the .mgz and the .nii) are equivalent? I usually trust mri_convert from FreeSurfer to do that conversion.

Why are you loading the inverse transform if the direction of transformation (fs → qsiprep) is the same in the registration?

Concerning your second point: I am indeed trying to resample the parcellation, as I am planning to use some ROIs from this atlas parcellation to isolate diffusion MRI ROIs.

Yes but you need to register brain-to-brain, and then apply that registration to the atlas.


Ok thank you very much!

This seems to work now.

This is the code I used:

import ants
import numpy as np
import os
import os.path as op
import sys



utils = '/code/utils'
sys.path.append(op.expanduser(utils))

WORK_DIR = '/work'

FIXED_QSIPREPT1 = '/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-preproc_T1w.nii.gz'
QSIPREP_T1_MASK = '/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-brain_mask.nii.gz'

MOV_FSBRAIN_MGZ = '/derivatives/freesurfer/sub-x/mri/brain.mgz'
MOV_FSPARC_MGZ = '/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'

## We need to convert fs brain .mgz to nifti format for registration
# Save directory with NIfTI format
MOV_FSBRAIN_NII = os.path.join(WORK_DIR, "sub-x/brain.nii")
# Run mri_convert
freesurferCommand = f'mri_convert {MOV_FSBRAIN_MGZ} {MOV_FSBRAIN_NII}'
os.system(f'bash {utils}/callFreesurferFunction.sh -s "{freesurferCommand}"')

##2 Run the registration
# regAnat2Diff = ants.registration( movImg, targImg, 'SyN', initial_transform = [targImg, movImg, 1], aff_metric = mattes,reg_iterations = [100,100,20] )

qsiprep_t1 = ants.image_read(FIXED_QSIPREPT1)
fs_brain_nii = ants.image_read(MOV_FSBRAIN_NII)
qsiprep_t1_mask = ants.image_read(QSIPREP_T1_MASK)

# Perform registration
reg = ants.registration(
    fixed=qsiprep_t1,
    moving=fs_brain_nii,
    type_of_transform='Rigid',  # Corresponds to --transform Rigid[ 0.1 ]
    mask=qsiprep_t1_mask,  # Corresponds to --masks [ ${QSIPREP_T1_MASK} NULL ]
    reg_iterations=(1000, 500, 250, 100),  # Corresponds to --convergence [ 1000x500x250x100, 1e-06, 10 ]
    verbose=True
)

# Save outputs
ants.image_write(reg["warpedmovout"], os.path.join(WORK_DIR, "sub-x/transform_Warped.nii.gz"))
# ants.write_transform(reg["fwdtransforms"], os.path.join(WORK_DIR, "sub-x/fwdtransform"))


mytx = reg['fwdtransforms']

##3 Apply the transformation to parcellation
# Convert parcellation to nifti
MOV_FSPARC_NII = os.path.join(WORK_DIR, "sub-x/aparc.a2009s+aseg.nii")
# Run mri_convert
freesurferCommand = f'mri_convert {MOV_FSPARC_MGZ} {MOV_FSPARC_NII}'
os.system(f'bash {utils}/callFreesurferFunction.sh -s "{freesurferCommand}"')

fs_parc_nii = ants.image_read(MOV_FSPARC_NII)
fsparc_warped = ants.apply_transforms( fixed = qsiprep_t1, 
                                       moving = fs_parc_nii , 
                                       transformlist = mytx,                                       
                                       interpolator  = 'genericLabel', 
                                       ) #whichtoinvert = [True, False]

ants.plot( qsiprep_t1, fsparc_warped, overlay_alpha = 0.5 )


Still unable to save the fwd transforms, but this is not a big deal for now.
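For the record, the save failure happens because reg['fwdtransforms'] is a list of file paths (ANTsPy writes the fitted transforms to a temp directory), not ANTsTransform objects, so ants.write_transform rejects it. Copying the files is enough to persist them; a sketch, with output names chosen here for illustration:

```python
import os
import shutil

def save_transform_files(transform_paths, outdir, prefix="transform"):
    """Copy ANTsPy's temporary transform files (e.g. *GenericAffine.mat,
    *Warp.nii.gz) to a permanent location and return the new paths."""
    os.makedirs(outdir, exist_ok=True)
    saved = []
    for i, path in enumerate(transform_paths):
        # keep the original extension so ANTs tools can still read the file
        ext = ".nii.gz" if path.endswith(".nii.gz") else os.path.splitext(path)[1]
        dest = os.path.join(outdir, f"{prefix}_{i}{ext}")
        shutil.copyfile(path, dest)
        saved.append(dest)
    return saved

# e.g.: save_transform_files(reg["fwdtransforms"], os.path.join(WORK_DIR, "sub-x"))
```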

Thanks a lot for your time, I truly appreciate the help!
