I’m using the connectomestats command to perform TFNBS on a simple regression, with 27 observations and 1000 nodes (hence a 1000×1000 connectivity matrix for each observation). The calculation appears to be quite slow on my machine: it has been stuck at “[ 2%] Running permutations…” for about 30 minutes already. I wonder if there are any system requirements for such an analysis, or whether there’s an upper limit on the data size that connectomestats can handle?

I’m not quite sure why the connectomestats command would take so long. The TFNBS statistical enhancement step is not particularly computationally demanding. Given you only have 27 inputs, I don’t think it’s the same issue as was reported for fixelcfestats elsewhere.

There’s a test you can do yourself that will at least go some way toward figuring out where the problem is. Re-run connectomestats, but select the “none” algorithm instead of tfnbs. If that runs very quickly, then it’s the TFNBS enhancement that’s taking a long time; if it’s also very slow, then the issue is independent of TFNBS, and perhaps something is causing the GLM fit itself to take a long time.
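As a rough sketch of that comparison (the file names here are placeholders; substitute whatever you used in your original invocation):

```shell
# Original run, with TFNBS enhancement:
connectomestats input.txt tfnbs design.txt contrast.txt tfnbs_

# Diagnostic run: same GLM and permutation testing, but no enhancement.
# Comparing the runtimes isolates the cost of the enhancement step.
connectomestats input.txt none design.txt contrast.txt none_
```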

I would also suggest manually checking the output files that the command will already have generated at that point. One way that TFCE-style enhancement can take a long time is if very large test statistic values are generated: the integration ends up counting from 0 up to some large number in 0.1 increments (exponential integration steps would be better, but I digress). Such a problem would typically be evident in the output for the unshuffled data.
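A small illustrative sketch (not MRtrix3 code) of why an extreme maximum test statistic slows things down: the enhancement integrates over thresholds 0.1, 0.2, … up to the maximum statistic, so the number of integration steps per permutation grows linearly with that maximum.

```python
import numpy as np

def num_integration_steps(stats: np.ndarray, dh: float = 0.1) -> int:
    """Number of 0.1-wide threshold increments the integration must
    traverse to reach the largest test statistic in the matrix."""
    return int(np.ceil(float(stats.max()) / dh))

# A well-behaved t-statistic matrix might peak around t = 5:
healthy = np.array([[0.5, 2.3],
                    [5.0, 1.1]])
print(num_integration_steps(healthy))      # 50 steps

# A degenerate fit producing one extreme value makes the same loop crawl:
degenerate = np.array([[0.5, 2.3],
                       [5000.0, 1.1]])
print(num_integration_steps(degenerate))   # 50000 steps
```

So a quick sanity check on the unshuffled test statistic output (e.g. its maximum value) can reveal whether this is the bottleneck.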