Registration of structural and diffusion weighted data

Hi Superclear,

Maybe try the skull-stripped version of the T1 image; it looks like the frontal CSF in the b0 image has been registered to the skull here.
This post can also be of interest to you: Distortion correction using T1
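If it helps, a skull-stripped T1 can be generated with FSL's bet; the filenames below are placeholders, and -f may need tuning per subject:

```shell
# Skull-strip the T1 before registration; a smaller -f keeps more brain.
# -R re-runs bet with a robust centre-of-gravity estimate.
bet t1.nii.gz t1_brain.nii.gz -f 0.5 -R
```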

Cheers,

Thibo

Hi @SuperClear,

yes, in the end it worked.
Are both of your images 3D, or is one 3D and the DWI 4D?
You can use the command provided by @maxpietsch here: ants reg.

You could use this to get a 3D DWI before using the above-mentioned command:

ExtractSliceFromImage 4 anonym_b0.nii anonym_b0_volume0.nii.gz 3 0

then it should work.
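As an alternative to ExtractSliceFromImage, the first volume can also be pulled out with FSL's fslroi or MRtrix3's mrconvert (filenames are placeholders, and this assumes the b0 is volume 0):

```shell
# FSL: extract 1 volume starting at volume index 0 of the 4D series.
fslroi dwi.nii.gz b0_vol0.nii.gz 0 1

# MRtrix3 equivalent: take index 0 along axis 3 (the volume axis),
# and drop the now-singleton fourth axis to get a true 3D image.
mrconvert dwi.nii.gz -coord 3 0 -axes 0,1,2 b0_vol0.nii.gz
```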

With epi_reg I also got huge data sets, so I didn’t continue using it.

For registration from T1 to MNI152 I use the betted T1 (skull-stripped via FSL bet) and register it with:

flirt [options] -in <inputvol> -ref <refvol> -omat <outputmatrix>

and then use the generated .mat file for other data sets from the same acquisition:
flirt [options] -in <inputvol> -ref <refvol> -applyxfm -init <matrix> -out <outputvol>
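Filled in with hypothetical filenames, the two steps might look like:

```shell
# Step 1: register the betted T1 to the MNI152 template and save the affine.
flirt -in t1_brain.nii.gz -ref MNI152_T1_1mm_brain.nii.gz \
      -dof 12 -omat t1_to_mni.mat -out t1_in_mni.nii.gz

# Step 2: apply the saved affine to another image from the same acquisition.
flirt -in other_image.nii.gz -ref MNI152_T1_1mm_brain.nii.gz \
      -applyxfm -init t1_to_mni.mat -out other_in_mni.nii.gz
```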

It seems to me that your images are not in Talairach space, since for that, the anterior and posterior commissures should lie on a horizontal line.
But I think one of the experts should comment on those processes.
Still, I hope that helped a bit.

Best regards, Lucius

Hi, Lucius,

Sweet, thanks for the help!

I registered the T1 to MNI152 using both linear and non-linear registration.
I find that the non-linear registration matches MNI152 better.

The registration commands:
flirt -in t1_skulled.nii.gz -ref MNI152_T1_1mm_brain.nii.gz -omat t1_skulled_to_MNI152.mat
and then use the generated .mat as the --aff input to fnirt:
fnirt --in=t1_skulled.nii.gz --ref=MNI152_T1_1mm_brain.nii.gz --aff=t1_skulled_to_MNI152.mat --iout=t1_skulled_in_MNI152_fnirt_out.nii.gz
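If you later want to apply the same non-linear warp to other images, you can additionally ask fnirt for the warp coefficient field via --cout and feed it to applywarp; the extra filenames here are hypothetical:

```shell
# Same fnirt call, but also saving the warp field for reuse.
fnirt --in=t1_skulled.nii.gz --ref=MNI152_T1_1mm_brain.nii.gz \
      --aff=t1_skulled_to_MNI152.mat \
      --cout=t1_to_MNI152_warpcoef.nii.gz \
      --iout=t1_skulled_in_MNI152_fnirt_out.nii.gz

# Apply the stored warp to another image co-registered with the T1.
applywarp --in=other_image.nii.gz --ref=MNI152_T1_1mm_brain.nii.gz \
          --warp=t1_to_MNI152_warpcoef.nii.gz --out=other_in_MNI152.nii.gz
```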

I will try ants registration later.

Thanks, Chaoqing

Hi, Thibo,

Thanks for your advice!

Yeah, you’re right, I didn’t do any distortion correction on the diffusion data.
In the sagittal view of the registered DTI and T1 images shown here, the alignment is off.

I will then try to correct the distortion as you suggested, and then do registration. Hope it will work.

Thanks,
Chaoqing

It seems to work, but the output file is extremely large (original DTI: 61.8 MB, output DTI: 519.4 MB), which puts a lot of pressure on fiber tracking.

With epi_reg I also got huge data sets, so I didn’t continue using it.

Sounds like the script is automatically re-gridding the input DWI to T1 image space; this is similar to the default operation of flirt. Unfortunately it doesn’t look like there’s a way to get around this: With flirt we can request the affine matrix and then apply it to just the header transformation, but epi_reg is non-linear. You could simply resample the resulting DWI series back to a lower resolution again, but you’d be performing two interpolations sequentially, which is generally not advised.
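For the linear (flirt) case specifically, the re-gridding can be avoided in MRtrix3 by converting the FSL affine and applying it to the image header only, leaving the voxel data (and hence the file size) untouched; filenames below are placeholders:

```shell
# Convert the flirt affine into MRtrix3 convention (this needs the
# original input and reference images that flirt was run on).
transformconvert b0_to_t1_flirt.mat b0.nii.gz t1.nii.gz \
                 flirt_import b0_to_t1_mrtrix.txt

# Without -template, mrtransform only updates the header transform:
# no interpolation or re-gridding is performed.
mrtransform dwi.mif -linear b0_to_t1_mrtrix.txt dwi_in_t1_space.mif
```

As noted above, no such trick exists for the non-linear epi_reg output.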

Hi Kerstin,

Thanks for the very informative steps you shared. I’m wondering why you would use mri/nu.mgz and not mri/T1.mgz instead? Or simply use brainmask.mgz, which is already skull-stripped?

@Kerstin, @Michiko,

I am personally using norm.mgz, since it is the final FreeSurfer product of intensity normalization and skull stripping (which is not the case for brainmask.mgz), before the intensity filtering of subcortical structures (which produces brain.mgz). Are there reasons not to use norm.mgz?

Antonin

Hi @Antonin_Skoch,

Thanks for raising this point - do you know how much that step of intensity normalization would affect registration?

It would mainly depend on the cost function used in the registration. Its effect will probably be minor. I would expect a much more substantial effect on segmentation.

Hi Kerstin,

Thank you for this approach. I have been using this method with good success. I would also like to ask: if the same transformations were applied to the WM_bin image used in the bbr registration step, could it be used as a mask to supply to ACT? (Considering that I don’t have the reverse phase-encoded data.)

Regards,
Archith

Hi, Archith,

The suggested approach is to use the 5ttgen fsl command on the T1 structural image to generate the proper input for ACT.
However, applying distortion correction to the DWI is highly recommended, since otherwise you will not get sufficient spatial correspondence between the DWI and the structural image, especially in frontal areas.
See https://mrtrix.readthedocs.io/en/latest/quantitative_structural_connectivity/act.html
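A minimal sketch of that step, assuming a CSD-based pipeline (the FOD and output filenames are hypothetical):

```shell
# Generate the five-tissue-type (5TT) image required by ACT from the T1.
5ttgen fsl T1.nii.gz 5tt.mif

# Tractography then uses it via -act:
tckgen wm_fod.mif tracks.tck -act 5tt.mif -select 100000 \
       -seed_dynamic wm_fod.mif
```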

Antonin

Hi Antonin,

Thank you very much for the clarification.

Regards,
Archith

Why do you use the inverse transform instead of transforming the T1 image to DWI space directly? Would that be worse?

Hi @LiuYuchen,

When you use the bbr cost function (note the -cost bbr in the flirt call), you need a WM mask in the reference image; this is why, with this metric, you register the b0 to the T1w and then invert the resulting transform. I hope this helps.
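A sketch of that sequence with hypothetical filenames (the WM segmentation here comes from FAST; any binary WM mask would do):

```shell
# Segment the betted T1; FAST's pve_2 map is the WM partial volume.
fast -o t1_seg t1_brain.nii.gz
fslmaths t1_seg_pve_2.nii.gz -thr 0.5 -bin wm_mask.nii.gz

# Register b0 -> T1w with the bbr cost function and the WM mask.
flirt -in b0.nii.gz -ref t1_brain.nii.gz -dof 6 -cost bbr \
      -wmseg wm_mask.nii.gz -omat b0_to_t1.mat

# Invert the affine to obtain the T1w -> b0 transform.
convert_xfm -omat t1_to_b0.mat -inverse b0_to_t1.mat
```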

Best regards,

Manuel

Could you please recommend a tracking pipeline for data without distortion correction?

Could you please recommend a tracking pipeline for data without distortion correction?

As far as tractography is concerned, this is just tracking without ACT, which is no different from what people were doing before ACT existed.

Given you asked about a pipeline, it depends on exactly what you are trying to achieve; there’s more than one possible experimental pipeline that utilises tractography. But if you’re talking specifically about connectome construction (reasonably likely given the thread is about co-localisation of anatomical information), historically I’ve found the process so ill-posed without ACT that I’ve not bothered putting the effort into trying to find something that even somewhat works. Doing a three-tissue decomposition would help a great deal, and you would need to use a much larger radial search distance when assigning streamlines to parcels.
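For the connectome step, that larger radial search would be set in tck2connectome; the filenames and the 4 mm radius below are purely illustrative:

```shell
# Assign streamline endpoints to parcels within a 4 mm radial search.
tck2connectome tracks.tck nodes_parcellation.mif connectome.csv \
               -assignment_radial_search 4
```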

I’m regularly asked about, and have myself asked about, performing distortion correction in the absence of reversed phase-encoding data. Unfortunately, I’ve both been underwhelmed by the solutions I’ve tried and never received a sufficiently confident recommendation of a solution that I could verify and pass on.