Warping functional ROIs from MNI space to wmFOD template space

Hi MRtrix Community,

I'm writing to ask for suggestions on the best steps to ‘move’ (or warp) functional ROIs (6 mm spheres) from MNI space to FOD template space. These ROIs were obtained with a functional localizer (fMRI) and will serve as seeds to generate ‘tracks of interest’ from a whole-brain SIFT-filtered track file.

This issue was partially covered in a couple of old posts, but the sparse description of the steps left me even more confused (sorry! :S).

My general understanding is:

  1. Take the first volume of the FOD template (-coord 3 0), convert it to nii.gz (mrconvert), and register it to an MNI template, e.g. MNI152_T1_1mm_brain.nii.gz (mrregister with the nl_warp option to generate MNI2template.mif and template2MNI.mif). Is this correct? Because when I try to do that I get an error: “Linear registration failed, transformation parameters are NaN.”
  2. Use these warps (specifically MNI2template.mif) to move the functional ROI from MNI space to FOD template space (…which command do you suggest here?…).

It would be kind of you to tell me what is wrong and which other steps I am missing.



With recent versions of MRtrix, prior to throwing this error, you should have seen a warning message that the images do not overlap; registration failed to the point that the lowest-cost solution is one in which the images do not overlap at all. Anatomical and DWI-based images are difficult to register with mrregister as they have different contrasts and intensity ranges, and we currently only support the sum-of-squares metric.

For now, I’d register the anatomical (T1) and DWI-based (average b=0 or first ODF volume) images with ANTs. Assuming you want to use nonlinear registration with antsRegistration, the steps to align the moving image input.nii (your T1) with the fixed reference image reference.nii (your average b=0 or first volume of the ODF image) are currently described in the wiki. (For linear registration, see for instance here.)

For example:

First, get the average b=0 image from your preprocessed DWI image (prior to dwi2fod) with dwiextract -bzero and mrmath mean, and save it as a .nii file. Then register it to the T1 image:
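A minimal sketch of that extraction/averaging step (the input filename dwi_preproc.mif is a placeholder; reference.nii matches the name used in the registration call below):

```shell
# Extract all b=0 volumes and average them along the volume axis (axis 3);
# '-' pipes the image between MRtrix commands without an intermediate file
dwiextract -bzero dwi_preproc.mif - | mrmath - mean -axis 3 mean_b0.mif
# Convert to NIfTI so ANTs can use it as the fixed reference image
mrconvert mean_b0.mif reference.nii
```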

antsRegistration --verbose 1 --dimensionality 3 --float 0 \
  --output [ants,antsWarped.nii.gz,antsInverseWarped.nii.gz] \
  --interpolation Linear --use-histogram-matching 1 \
  --winsorize-image-intensities [0.005,0.995] \
  --transform Rigid[0.1] \
  --metric CC[reference.nii,input.nii,1,4,Regular,0.1] \
  --convergence [1000x500x250x100,1e-6,10] \
  --shrink-factors 8x4x2x1 --smoothing-sigmas 3x2x1x0vox \
  --transform Affine[0.1] \
  --metric CC[reference.nii,input.nii,1,4,Regular,0.2] \
  --convergence [1000x500x250x100,1e-6,10] \
  --shrink-factors 8x4x2x1 --smoothing-sigmas 3x2x1x0vox \
  --transform SyN[0.1,3,0] \
  --metric CC[reference.nii,input.nii,1,4] \
  --convergence [100x70x50x20,1e-6,10] \
  --shrink-factors 4x2x2x1 --smoothing-sigmas 2x2x1x0vox \
  -x [reference_mask.nii.gz,input_mask.nii.gz]
  1. Generate an identity (deformation field) warp using the image you wish to warp (the “source” or “moving” image):
warpinit input.mif identity_warp[].nii
  2. Transform this identity warp using the registration program that was used to generate the warp.

For example, if you are using the ANTs registration package:

for i in {0..2}; do
    antsApplyTransforms -d 3 -e 0 -i identity_warp${i}.nii -o mrtrix_warp${i}.nii -r reference.nii -t ants1Warp.nii.gz -t ants0GenericAffine.mat --default-value 2147483647
done
  3. Correct the warp

The resulting three 3D images mrtrix_warp?.nii hold the voxel-specific corresponding (x, y, z) scanner-space locations. However, locations in the target space for which no corresponding location exists within the moving space need to be handled explicitly. Neglecting this can result in many voxels of the output mrtrix_warp pointing to the location defined by the out-of-bounds value, such as the origin (0.0, 0.0, 0.0).

Out-of-bounds locations should ideally be marked by the transformation command with a unique special value that prevents using that part of the warp. Manually convert the out-of-bounds markers to vectors of NaNs with the tool warpcorrect:

warpcorrect mrtrix_warp[].nii mrtrix_warp_corrected.mif
  4. Warp the image
mrtransform input.nii -warp mrtrix_warp_corrected.mif warped_input_image.mif

You can check that the conversion went as expected by comparison with ANTs’ output:

mrview warped_input_image.mif -overlay.load antsWarped.nii.gz -overlay.opacity 0.3
  5. Apply the warp to your ROI (with regridding):
mrtransform roi_input.nii -warp mrtrix_warp_corrected.mif -interp nearest roi_reference.mif

Note that the resulting image has the same voxel grid as the reference image (DWI). You might want to increase the resolution of the ROI for tracking. You can upsample the warp prior to applying it with mrresize -scale and use linear interpolation to preserve spatial resolution.
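As a sketch (the scale factor 2 and filenames are placeholders; for a binary ROI you may want to re-threshold the linearly interpolated result, which is a choice of mine, not prescribed above):

```shell
# Upsample the corrected warp (here by a factor of 2) before applying it
mrresize mrtrix_warp_corrected.mif -scale 2 mrtrix_warp_upsampled.mif
# Warp the ROI with linear interpolation onto the finer grid
mrtransform roi_input.nii -warp mrtrix_warp_upsampled.mif -interp linear roi_highres.mif
# Re-binarise the ROI (the 0.5 threshold is an arbitrary example value)
mrthreshold roi_highres.mif -abs 0.5 roi_highres_bin.mif
```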

  6. Check everything worked: mrview fod.mif -roi.load roi_reference.mif -roi.opacity 0.3



Thanks @maxpietsch ! =)

Your reply was very detailed and clear. And, most importantly, it seems to work perfectly!



Dear experts,

It seems like this was exactly the answer I was looking for, as I’m trying to register MNI space to the wmfod_template so that I can bring my ROI in MNI space into the wmfod_template space accordingly.

However, it seems like in the most recent version of MRtrix the antsRegistration command no longer exists? Is that correct? If yes, what would be the way to do the same as described above with the latest release?

Thanks a lot in advance!!

Best wishes,

antsRegistration is a command provided by the ANTs package. Make sure it is installed and in the PATH.

Dear @maxpietsch,

Thank you for your quick reply, and I’m sorry the answer was rather straightforward. I just overlooked the possibility of the command coming from another package.

Yet, I’m still facing some issues. When I run the antsRegistration command I get the following warning: “No valid points were found during metric evaluation. For image metrics, verify that the images overlap appropriately. For instance, you can align the image centers by translation.”

Indeed, my MNI image and the wmfod_template do not overlap completely. Yet whatever registration tool I try, I cannot get the two images to overlap.

Any idea what’s going wrong? I can upload the two images I’m trying to register, if necessary.

That’s an issue with antsRegistration or your usage of that tool, so I can’t really provide much help here.

You could align images within MRtrix based on their centre of mass or spatial extent (mrregister $A $B -rigid_init_translation mass -type rigid -rigid_niter 0 -rigid rigid.txt -rigid_scale 1) and apply it to the image header (using mrtransform -linear). Note that you’d then need to either combine the transformations (the linear one and the one obtained from ANTs and converted to MRtrix format) or apply both transformations consecutively to all images of interest. I’d advise against that, and there is probably a straightforward way to do this within ANTs that does not require jumping through hoops.
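Spelled out as commands (image names are placeholders; the mrregister call is the one quoted above):

```shell
# Estimate a centre-of-mass translation only: rigid type, zero optimisation iterations
mrregister mni_t1.nii template_image.mif -rigid_init_translation mass \
    -type rigid -rigid_niter 0 -rigid rigid.txt -rigid_scale 1
# Write that linear transform into the header of the moving image
mrtransform mni_t1.nii -linear rigid.txt mni_t1_aligned.nii
```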

Make sure you initialise the transformation (--initial-moving-transform), and if that does not work, I’d suggest you have a look at this page, search for example usages of this command, and if that still does not work, head over to the ANTs issue tracker.

Sometimes the antsRegistration command fails at finding an initial translation and rotation; you can do this as a separate step before the registration. ANTs provides two commands to calculate an initial affine transformation: antsAI and antsAffineInitializer. These commands do almost the same thing but differ in the parameters you can tweak.
Then, you can use the output inside the antsRegistration call by specifying --initial-moving-transform [ output_antsAI.mat ].
That is what has worked for me.
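A hedged sketch of what that could look like (flag spellings follow typical antsAI usage and the filenames are placeholders, so check antsAI --help on your installation):

```shell
# Estimate an initial rigid transform aligning input.nii to reference.nii
antsAI -d 3 -m Mattes[reference.nii,input.nii,32,Regular,0.25] \
    -t Rigid[0.1] -o output_antsAI.mat -v 1
# Then add to the antsRegistration call:
#   --initial-moving-transform [ output_antsAI.mat ]
```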


Dear @diagiraldo and @maxpietsch,

Based on the valuable input from both of you I managed to register my ROI in MNI space to the population template!

Thanks a lot.



Dear MRtrix team,
First, thanks so much for the clear instructions!
I have a question regarding the nature of these transformations, since I don’t have a deep understanding of these warp and transformation functions.

I am warping a DWI b=0 image to an MNI T1 template. I have succeeded thanks to maxpietsch and his guide. Nonetheless, my question concerns the FODs. Do I need a new processing step to modify the FODs with respect to the identity warp?

My understanding is that the FODs are already warped, given that the underlying image, the DWI b=0, is already warped, but I’m not sure.


Hi Alex,

To calculate FODs you’d need more than the b=0 signal so I am assuming that you used the dMRI data to extract the b=0 image and to calculate the FODs. If your FOD image was generated from the image that contained the b=0 image, it would share the same space. Hence you’d need to apply the same warp to bring it to MNI space as was used to bring the b=0 image to that space.
Note that nonlinear transformation of raw dMRI data results in artefacted data; do not warp the dMRI data and then calculate the FODs in the new space. Rather, calculate the FODs in dMRI native space and warp the FODs with appropriate reorientation (mrtransform -reorient_fod yes).
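For example (filenames are placeholders; the warp is the one generated for the b=0 image):

```shell
# Warp FODs computed in native dMRI space, reorienting the lobes under the warp
mrtransform fod.mif -warp mrtrix_warp_corrected.mif -reorient_fod yes fod_warped.mif
```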

You don’t need to recreate the warp as it contains the mapping from one space to another and is not tied to a particular image; it can be applied to either image.

The wiki entry Registration, transformation, regridding: concepts might help.



Hello people,
sorry for coming back to this question, but I didn’t fully get the pipeline.
It seems to me that it works at the single-subject level, but I do not understand at which stage the wmfod_template comes into the loop.

Or maybe I got the input wrong. Can I select MNI152_T1_1mm_Brain.nii.gz as input.nii, and wmfod_template.nii (converted from the .mif produced by the population_template command) as reference.nii?

Thanks to all of you for these useful tips!

Welcome Giuseppe!

There might be some residual confusion from the contents of prior posts. While this sort of registration & transformation of data is frequently done for subject-specific data in conjunction with a template, in order to draw data associated with a particular template (e.g. a parcellation) into the space of the individual participant, it’s also entirely applicable between different templates, to draw that associated data into the space of an otherwise unrelated template.

You won’t be able to reasonably use the entire WM FOD template image in such an application, since the MNI template does not itself contain FOD data, and therefore the orientation-dependent contrast provided by FOD data can’t be meaningfully used. You can however derive scalar data from the FODs and use that as 3D image data for registration. The simplest option is to extract the first volume, corresponding to the l=0 term.
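For instance (the output filename is a placeholder):

```shell
# Extract the first volume (the l=0 term) of the FOD template and
# drop the now-singleton 4th axis so the output is a plain 3D image
mrconvert wmfod_template.mif -coord 3 0 -axes 0,1,2 wmfod_template_l0.nii
```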

As far as “input” vs. “reference”, if the registration is truly symmetric, then it should be possible to arrive at the same result either way; the only difference is whether you use the “forward” or “inverse” warp to transform the data from MNI space to that of your own template.

