I am trying to register and transform a FreeSurfer parcellation (Destrieux atlas, i.e. aparc.a2009s+aseg.mgz in the FreeSurfer mri output folder) onto the preprocessed diffusion MRI scans.
I have been using antspy. I think I achieved my goal, but I have the impression I am not using a standard strategy.
What strategy would you advise to go for here?
I tried looking into different ways:
1. I tried using the .h5 transform files output by qsiprep, but I wasn't able to find code that works with antspy. Any suggestions?
2. I used the middle volume of the diffusion scan as a reference and registered/transformed the FreeSurfer parcellation to it, using this code:
import ants
import numpy as np
movDir = '/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'
targetDir = '/derivatives/qsiprep/sub-x/ses-04/dwi/sub-x_ses-04_acq-HCPdir99_space-ACPC_desc-preproc_dwi.nii.gz'
movImg = ants.image_read(movDir)
targImg = ants.image_read(targetDir)
##1: We need the images to have the same dimensionality in order to perform the registration.
# Given that the diffusion data is 4D, extract a single 3D volume to use as the registration target.
# Get the number of volumes in the 4th dimension
_, _, _, num_volumes = targImg.shape
# Find the middle index of the 4th dimension
middle_index = num_volumes // 2
# Convert the 4D ANTs image to a NumPy array for slicing
targ_data = targImg.numpy()
# Extract the 3D middle volume
middle_volume_data = targ_data[:, :, :, middle_index]
midTargImg = ants.from_numpy(middle_volume_data,
                             spacing=targImg.spacing[:3],
                             origin=targImg.origin[:3],            # first 3 elements of the origin
                             direction=targImg.direction[:3, :3])  # 3x3 direction matrix
##2 Run the registration
# Note: fixed = the parcellation, moving = the middle diffusion volume
regAnat2Diff = ants.registration(fixed=movImg, moving=midTargImg,
                                 type_of_transform='SyN', reg_iterations=[100, 100, 20])
mytx = regAnat2Diff['invtransforms']  # transforms that map the fixed (parcellation) to the moving (diffusion) image
##3 Apply the transformation
atlaswarpedimage = ants.apply_transforms( fixed = midTargImg,
moving = movImg ,
transformlist = mytx,
interpolator = 'nearestNeighbor',
whichtoinvert = [True,False])
ants.plot( midTargImg, atlaswarpedimage, overlay_alpha = 0.5 )
To avoid the debatable approach of taking the middle volume of the diffusion scan, I thought about using the anatomical scan output by qsiprep instead, which should be a 3D scan in the same space as the diffusion data. Do you think this approach is better? On my end the output looked worse than the middle-volume approach (#2 above).
I am a bit confused by the bash code you posted. I can’t find any of those parameters in the python version.
Please see my attached outputs when I use either the anatomical from the qsiprep output or the .h5 file. It looks like both transformations yielded poor outcomes. The worst one is with the .h5 file (either direction yields similar outputs).
movDir = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'
targetDir = '/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-preproc_T1w.nii.gz'
movImg = ants.image_read(movDir)
targImg = ants.image_read(targetDir)
##3 bis: Apply the transform using the .h5 transform file output by qsiprep (no registration step run here)
mytx = ['/Users/ldaumail3/Documents/research/ampb_mt_tractometry_analysis/ampb/derivatives/qsiprep/sub-x/anat/sub-x_from-ACPC_to-MNI152NLin2009cAsym_mode-image_xfm.h5']
atlaswarpedimage = ants.apply_transforms( fixed = targImg,
moving = movImg ,
transformlist = mytx,
interpolator = 'genericLabel',
whichtoinvert = [False])
ants.plot( targImg, atlaswarpedimage, overlay_alpha = 0.5 )
The bash code is for registering the FS brain and the qsiprep T1 together. That same transform would then be applied to the atlas with label interpolation.
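In antspy the same idea would look roughly like the sketch below. Treat it as a sketch, not a recipe: I'm assuming brain.mgz as the FS brain image, the qsiprep T1w path from your earlier snippet, and a plain rigid transform.

import ants

# Assumed paths, following the same layout as in your snippets above
fsBrainDir = '/derivatives/freesurfer/sub-x/mri/brain.mgz'    # FS T1 brain (assumption: use the skull-stripped brain)
parcDir    = '/derivatives/freesurfer/sub-x/mri/aparc.a2009s+aseg.mgz'
t1Dir      = '/derivatives/qsiprep/sub-x/anat/sub-x_space-ACPC_desc-preproc_T1w.nii.gz'

fsBrain = ants.image_read(fsBrainDir)
parc    = ants.image_read(parcDir)
qsiT1   = ants.image_read(t1Dir)

# Rigid registration of the FS brain onto the qsiprep T1 (same subject, so rigid should suffice)
reg = ants.registration(fixed=qsiT1, moving=fsBrain, type_of_transform='Rigid')

# Apply that same transform to the parcellation, with a label-preserving interpolator
parcInT1 = ants.apply_transforms(fixed=qsiT1,
                                 moving=parc,
                                 transformlist=reg['fwdtransforms'],
                                 interpolator='genericLabel')

ants.plot(qsiT1, parcInT1, overlay_alpha=0.5)

If the qsiprep T1w really is on the same grid/space as the preprocessed DWI (as you said above), the resampled parcellation should then line up with the diffusion data as well.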
I'm not sure I get this: a rigid registration would only translate/rotate the existing scans, and the FS brain is in a different space from the qsiprep output anatomical, isn't it?
If I run a rigid registration, this is the output I am getting:
When trying to save the fwdtransforms, this is the error I am getting:
Exception: Only ANTsTransform instances can be written to file. Check that you are not passing in a filepath to a saved transform.
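For what it's worth, the entries in fwdtransforms are already paths to files that antspy wrote to disk, which is why ants.write_transform (which expects an ANTsTransform object) complains. A minimal sketch of two ways to keep them, assuming reg is the output of the rigid ants.registration call:

import shutil
import ants

out_dir = '/some/output/dir'   # placeholder path

# Option 1: the transforms are already files on disk, so just copy them somewhere permanent
for tx_path in reg['fwdtransforms']:
    shutil.copy(tx_path, out_dir)

# Option 2 (for an affine/rigid .mat entry): load it as an ANTsTransform, then write it back out
tx = ants.read_transform(reg['fwdtransforms'][0])   # assumes this entry is the .mat from the rigid registration
ants.write_transform(tx, out_dir + '/fs_to_qsiprep_rigid.mat')

Alternatively, ants.registration accepts an outprefix argument so the transform files are written with a prefix of your choosing in the first place.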
You are registering the parcellation instead of the T1 brain from FS. If that brain is not masked, you should supply a mask for it (I forget whether it is masked, but I think it is).
Have you checked whether those two files (the .mgz and the .nii) are equivalent? I usually trust mri_convert from FreeSurfer to do that conversion.
Why are you loading the inverse transform if the direction of transformation (fs → qsiprep) is the same in the registration?
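For reference (a sketch of the antspy convention, not a prescription): with ants.registration(fixed=A, moving=B), reg['fwdtransforms'] is what you pass to ants.apply_transforms to resample B into A's space, and reg['invtransforms'] plus the whichtoinvert flags go the other way. So if the registration is run with the diffusion reference as fixed and the FS image as moving, the forward transforms are the ones that bring the FS data across:

# Registering in the fs -> qsiprep/diffusion direction (variable names reused from the earlier
# snippet just to illustrate the direction convention) ...
reg = ants.registration(fixed=midTargImg, moving=movImg, type_of_transform='SyN')

# ... means the *forward* transforms resample the FS image into the diffusion space,
# with no whichtoinvert flags needed
warped = ants.apply_transforms(fixed=midTargImg,
                               moving=movImg,
                               transformlist=reg['fwdtransforms'],
                               interpolator='nearestNeighbor')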
Concerning your second point, I am indeed trying to resample the parcellation, as I am planning to use some ROIs from this atlas parcellation to isolate diffusion MRI ROIs.
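Concretely, once the parcellation is resampled into diffusion space, my plan is roughly the following; the label value is just a placeholder, the real Destrieux IDs would come from FreeSurfer's FreeSurferColorLUT.txt:

import ants

label_id = 1234   # placeholder label value; look up the actual ROI ID in FreeSurferColorLUT.txt

# Binary mask of that ROI in diffusion space
# (atlaswarpedimage is the transformed parcellation from the earlier apply_transforms call)
roi_mask = ants.threshold_image(atlaswarpedimage, low_thresh=label_id, high_thresh=label_id)
ants.image_write(roi_mask, 'roi_mask.nii.gz')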