Fixel Analysis Computer Question

Dear Experts,

Can someone please advise on the computing power required for the statistical analysis step for fiber density (step 16 of the fixel analysis)? I am running an analysis with ~100 subjects and am unable to do so on a 4-core computer with 16GB of memory without crashing the machine. The memory load required for this step seems a bit unreasonable. I’ve even run into memory issues trying to run this on a supercomputer cluster. I’ve been stuck at the last step of this analysis for over a month now, and I guess the next thing to look into is possibly upgrading the lab computers. Are people running these analyses on 64GB+ machines, and if so, what is the expected timeframe for this analysis?

Thank you in advance,

J

Hi James,

The primary computing requirement for Connectivity-based Fixel Enhancement (fixelcfestats) is the RAM necessary for storing the fixel-fixel connectivity matrix. In our own diffusion analyses (2.5mm acquired, 1.25mm template), this requires more than half of the memory available on our 128GB machines, so 64GB would not be adequate. The memory usage cannot be reduced any further in the software implementation, and that connectivity matrix is fundamental to the operation of the algorithm.

The solution, if you do not have access to adequate computing resources, is to reduce the resolution of the template image, since this reduces the total number of fixels within the brain mask and hence the size of that connectivity matrix. I just checked and this isn’t mentioned in the documentation, so I’ll make sure it’s added at some point.
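As a rough sketch of what I mean (the file names here are just placeholders, and on older MRtrix3 versions mrresize fills the role of mrgrid):

```
# Regrid the population FOD template to a coarser voxel size, so that the fixel
# mask, and hence the fixel-fixel connectivity matrix, is smaller
mrgrid wmfod_template.mif regrid -voxel 2.0 wmfod_template_2mm.mif
# The fixel segmentation (fod2fixel) and subsequent fixel-space steps would
# then need to be re-run against this down-sampled template
```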

The number of subjects will not significantly influence either the memory requirements or processing time (since the statistical enhancement is more expensive in both time and memory than calculation of the initial t-statistics). Additionally, the number of cores on the system will only influence the processing time required, and should not affect whether or not the command is able to complete successfully (unless some external force terminates it due to a long processing time).

Cheers
Rob


The upsampling of the data (before CSD) may give slightly better contrast for the FOD images, and having access to these at a higher spatial resolution typically helps group-wise template building and registration of subjects to the template.
But indeed, once all of that is done, the template may happily be down-sampled again to a resolution that helps reduce memory usage when building that immense connectivity matrix. Even at the lower resolution (after all these steps), you’ll still potentially have a sharper template and spatially better-matched subjects than if that intermediate upsampling had not been done.


Just a few comments to add to what’s already been said:

Yes, it does need a lot of RAM, but I can assure you that the memory load is entirely reasonable considering what’s going on. A lot of effort went into keeping these requirements to a minimum, and while there may be ways to reduce them further, they’re unlikely to make a dramatic difference. The fundamental problem is that we need the fixel-fixel connectivity matrix to be computed, and as @rsmith mentioned, it’s large. On top of that, it needs to be resident in RAM, since the streamlines are randomly located: there’s no way to predict which part of the matrix will need to be updated at any point during processing, so we need to have all of it available in RAM. Any alternative would involve a lot of random disk reads/writes, and processing would slow to a crawl.

What is lacking, though, is a more overt warning about these RAM requirements in the documentation. There is one mention of this as a note at the end of the fixel mask computation step, but nothing in the most relevant section, or upfront in the introduction. I think we could definitely do something about this…

Thank you all for the help and thorough answers.

James

Hi Rob,

Sometimes I want to run FBA on my own machine, just for testing, and I wonder how much RAM I need if I keep template resolution at 2.5mm (human brain).
16GB? 24GB? 32GB? More?

Best,
Oren

If we assume that RAM scales with the square of the number of fixels (the size of the fixel-fixel connectivity matrix, ignoring differences in sparsity as a function of resolution), and that the number of fixels scales with the inverse cube of the voxel size, then the RAM requirements at 2.5mm should be ~1.5% of those at 1.25mm. The reality will be greater than that, but I’d naively expect it to be well below 16GB.
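Spelling out that arithmetic, and taking the >64GB figure quoted earlier for our 1.25mm template as the reference point:

RAM(2.5mm) / RAM(1.25mm) ≈ ((1.25 / 2.5)^3)^2 = (1/8)^2 = 1/64 ≈ 1.6%

which would put a 2.5mm human-brain template very roughly in the 1 to 2 GB range, i.e. comfortably within a 16GB machine.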

Thanks Rob.
A related question: does anybody have experience with running FBA with some of the required memory being swapped (i.e. having less physical memory than the size of the connectivity matrix), but with the swap on an SSD (or Intel’s Optane)? Is it still very slow, or is performance reasonable?

Oren

Does it make sense to reduce RAM requirements by having a fixel mask? Let’s say you are interested in only one particular (tractography-defined) network. Would that help?

Yes: the code was intentionally written so that, if a fixel mask is utilised, only the data corresponding to connectivity between fixels within the mask is stored, so memory usage can indeed be reduced that way.
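For illustration, something along these lines (the file names are placeholders, and the exact positional arguments of fixelcfestats differ between MRtrix3 versions, so check fixelcfestats -help for the version you are running):

```
# Restrict the statistical analysis to a fixel mask, e.g. one derived from a
# tractography-defined network of interest; only connectivity between fixels
# inside that mask is then stored, reducing the RAM footprint
fixelcfestats fd_smooth/ files.txt design_matrix.txt contrast_matrix.txt \
    tracks_2_million_sift.tck stats_fd/ -mask network_fixel_mask.mif
```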