Hi all and happy holidays,
I am following the steps in the dwiintensitynorm script to normalize the HCP data (~860 subjects). The script is designed mostly for small groups and does not make use of a computing cluster, so I am trying to parallelize the work myself. So far I have computed FA for all subjects and created a cleanup script to remove the typical rim of high FA outside the brain, but I am stuck at population_template. I tried this script on a subset of 10 subjects and it took 1-2 hours. I have also read on this list of cases where 40 subjects took 3 days of computation. This means the script will take “forever” on 860 subjects.
Can you give me some advice on how to tackle this problem? I have thought of various solutions:
- Find a way to parallelize the exact computations inside the population_template script in a cluster environment. But I am not sure exactly what computations are being run.
- Use the 5TT map and the respective FA map of each subject to select voxels with high FA within white matter. My guess is that this should still give a good estimate of the signal that dwiintensitynorm is trying to find.
- Use alternative methods for building the common template. ANTs has scripts for building population templates, although that may take a while too. Another idea I had is to make an average FA map, register all subjects to that map using a quick linear or greedy algorithm, then repeat this procedure a couple of times to get a good map in 1-2 days.
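The second option could be prototyped along these lines; the FA and tissue-fraction thresholds and the target intensity are assumptions for illustration, and `norm_factor` is a hypothetical helper, not part of MRtrix:

```python
# Sketch of option 2: derive a per-subject normalisation factor from
# the median b=0 signal in voxels that are confidently WM (5TT WM
# fraction) and have high FA. Thresholds and target are assumptions.
import numpy as np

def norm_factor(b0, fa, wm_fraction, fa_thr=0.7, wm_thr=0.95, target=1000.0):
    mask = (fa > fa_thr) & (wm_fraction > wm_thr)
    return target / np.median(b0[mask])

# Toy arrays standing in for real image volumes:
rng = np.random.default_rng(0)
b0 = rng.uniform(500, 1500, size=(10, 10, 10))
fa = rng.uniform(0, 1, size=(10, 10, 10))
wm = rng.uniform(0, 1, size=(10, 10, 10))
factor = norm_factor(b0, fa, wm)
```

Because the median is robust, the result should be fairly insensitive to the exact voxel selection, which is the main question with this approach.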
If I had to choose, option 2 is the most straightforward, but I am not sure what the implications are of selecting slightly different voxels for each subject. The idea is to take the median value anyway…
Also, for an eventual fixel-based analysis the population template must come from FODs. In the documentation you advise creating it from a subsample of ~40 subjects. Does this advice still stand for the HCP dataset? Again, I wish these maps were already computed and made publicly available; it would be a great contribution to open science.
Thank you for any help.
Dorian