Dwidenoise with large kernels

preprocessing

#1

Greetings,

I have multi-shell HARDI data from b = 1000 to 10,000 s/mm^2 with 64 directions per b-value shell. Along with the b0 images (n = 11), there are roughly 750 volumes in total. It doesn’t happen consistently, but sometimes when I use a kernel size such as 11x11x11 or 9x9x9, I get noise maps with voxel patches of zero noise. If I drop down to, say, 7x7x7, the issue is resolved. However, since the kernel size should be larger than the total number of volumes being denoised, I wonder whether a smaller kernel will denoise the data suboptimally.
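For reference, those kernel extents compare to the volume count as follows (a quick check in Python; 750 is the approximate volume count mentioned above):

```python
# Number of voxels per kernel (patch size M) vs. number of volumes (N ~ 750)
# for the kernel extents mentioned above.
n_volumes = 750

for extent in (7, 9, 11):
    patch = extent ** 3
    relation = ">" if patch > n_volumes else "<"
    print(f"{extent}x{extent}x{extent}: {patch} voxels {relation} {n_volumes} volumes")
```

So 7x7x7 (343) and 9x9x9 (729) both yield fewer kernel elements than volumes, while 11x11x11 (1331) exceeds them.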

Is this a known issue with large datasets such as this one?

Best regards,

Hunter Moss


#2

Hi Hunter,

First of all, there is no reason for the kernel to be larger than the number of volumes in the dataset. In either case, the method will construct an M x N matrix (no. of DWIs x patch size) and test its SVD spectrum against the Marchenko-Pastur (MP) distribution. However, it is a known issue that the M < N and M > N cases sometimes behave differently.
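As an illustration of that spectrum test, here is a minimal self-contained sketch (not MRtrix3's actual implementation; it assumes the noise level sigma is known, whereas dwidenoise estimates it from the eigenvalue spectrum itself) that hard-thresholds the singular values of a patch matrix at the Marchenko-Pastur noise edge:

```python
import numpy as np

rng = np.random.default_rng(0)

def mp_hard_threshold(X, sigma):
    """Keep only singular values above the Marchenko-Pastur noise bulk.
    For an M x N matrix of iid N(0, sigma^2) noise, the largest singular
    value concentrates near sigma * (sqrt(M) + sqrt(N)); anything below
    that edge is treated as noise and discarded."""
    M, N = X.shape
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # small safety margin for finite-size fluctuations of the noise edge
    edge = 1.02 * sigma * (np.sqrt(M) + np.sqrt(N))
    keep = s > edge
    return (U[:, keep] * s[keep]) @ Vt[keep], int(keep.sum())

# Synthetic patch matrix: M voxels x N volumes (e.g. a 7x7x7 kernel, 750
# volumes), with a rank-3 signal buried in unit-variance Gaussian noise.
M, N, sigma = 343, 750, 1.0
signal = 5.0 * rng.standard_normal((M, 3)) @ rng.standard_normal((3, N))
noisy = signal + sigma * rng.standard_normal((M, N))
denoised, rank = mp_hard_threshold(noisy, sigma)
print("components kept:", rank)
print("error reduced:", np.linalg.norm(denoised - signal) < np.linalg.norm(noisy - signal))
```

The sketch recovers a handful of signal components and reduces the error relative to the noisy input; the real estimator additionally has to decide where the noise bulk ends without knowing sigma, which is where the M < N and M > N regimes can behave differently.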

We have a code branch that tries to address this, but it has not yet been merged to master. If you want to give it a go, you can git checkout denoising_updates and run ./build bin/dwidenoise.

Best regards

Daan


#3

Hi Daan,

Thanks for the reply. I understand that it is not strictly necessary, but the DWI denoising documentation recommends that, for a maximal SNR boost, the number of kernel elements should be larger than the number of DWI volumes. Is this no longer recommended?

-Hunter