using MRtrix version 3.0.2 – can you confirm that nothing else has happened to that file since it was produced? It hasn’t been loaded into MATLAB or anything like that?
Assuming this is indeed a straight output of that command, can I ask you to try again with our latest release (3.0.3), just in case there was a bug in 3.0.2 that we might have fixed since? If that still happens with the most up to date code, we’ll need to investigate and figure out why this might be occurring…
Hi @jdtournier, I’ve uninstalled and then reinstalled MRtrix, and get the same error… I’m wondering whether I need to start again from the beginning of the pipeline after the reinstallation, though? I went back one step (fixelconnectivity fixel_mask/ tracks_2_million_sift.tck matrix/) and got the same error.
Also, how do I confirm the version number of my newly installed MRtrix?
Ok, that’s not good. We’ll need to investigate to figure out what the problem is. Any chance you could share the input data you’re providing to the command so we can replicate on our side?
No, that shouldn’t be necessary. Micro version updates (e.g. 3.0.2 to 3.0.3) should only consist of bug fixes. New features or anything that changes behaviour should only be introduced in minor releases (i.e. the next one will be 3.1.0).
Any MRtrix command with the -version option, e.g. fixelconnectivity -version.
Yes, happy to share the input data. There’s quite a lot, but would it work to just share some of the downstream data, like the ‘template’ folder? Let me know how to send it your way.
And yes, using -version I can confirm I’m now on 3.0.3, and I’m still getting the error.
OK, I’ve had a look at the data you sent, and I can’t reproduce the issue. When I ran the command, this was the output:
$ fixelconnectivity fixel_mask/ tracks_2_million_sift.tck matrix/
fixelconnectivity: [100%] preloading data for "fixel_mask/directions.mif"
fixelconnectivity: [100%] computing fixel-fixel connectivity matrix
fixelconnectivity: [100%] Normalising and writing fixel-fixel connectivity matrix to directory "matrix/"
(using MRtrix 3.0.3 compiled with Eigen 3.3.9).
The output fixel data files were as expected:
$ mrinfo matrix/fixels.mif
Image name: "matrix/fixels.mif"
Dimensions: 1940251728 x 1 x 1
Voxel size: 1 x 1 x 1
Data strides: [ 1 2 3 ]
Data type: unsigned 32 bit integer (little endian)
Intensity scaling: offset = 0, multiplier = 1
Transform: 1 0 0 0
0 1 0 0
0 0 1 0
command_history: /home/donald/exp/mrtrix3/bin/fixelconnectivity fixel_mask/ tracks_2_million_sift.tck matrix/ (version=3.0.3)
However, looking at the code, I think the issue is that the command crashed before completing. When the output images are first created, the dim: entry is indeed left blank, and is only overwritten with the correct value at the end of processing, once the final number of fixel-fixel connections stored in the output is actually known. If for any reason the command doesn’t make it to the end of that function, the output file is left in an unusable state, just as you’re seeing.
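The failure mode just described can be sketched in a few lines of C++ (illustrative names only, not the actual MRtrix source):

```cpp
#include <string>

// Hypothetical sketch of the pattern described above (illustrative names,
// not the actual MRtrix source): the header's "dim:" entry starts out
// blank, and is only filled in once the final connection count is known.
std::string make_header(long long n_connections = -1) {
  // n_connections < 0 signals "not yet known": emit the blank placeholder.
  const std::string dim =
      (n_connections < 0) ? "" : std::to_string(n_connections) + ",1,1";
  // A crash before the finalising rewrite leaves "dim: " empty on disk,
  // producing exactly the unreadable file reported above.
  return "dim: " + dim + "\ndatatype: UInt32LE\n";
}
```

If the process dies between writing the placeholder and the final rewrite, the blank dim: entry is what remains on disk.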
My best guess for why the command might fail on your system is an out-of-memory error. This command used essentially all of the 32 GB of RAM on my system to complete. I wouldn’t be surprised if it failed on any system with less than that…
OK, I guess the first thing to verify is whether the command did indeed crash. Could you try running it again with the -debug option and record exactly what is shown in the terminal?
Another option would be to check the size of the output file. On my system, this is what I get:
$ ls -l matrix/
-rw-r--r-- 1 donald donald 7761007248 Jan 10 13:08 fixels.mif
-rw-r--r-- 1 donald donald 27833880 Jan 10 13:08 index.mif
-rw-r--r-- 1 donald donald 7761007252 Jan 10 13:08 values.mif
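As a side note, those sizes are consistent with the header shown earlier: 1,940,251,728 entries of an unsigned 32-bit integer is 7,761,006,912 bytes, with the small remainder presumably the text header. The arithmetic as a quick sanity-check function (hypothetical, not part of MRtrix):

```cpp
#include <cstdint>

// Expected payload size of fixels.mif: one 32-bit unsigned integer per
// fixel-fixel connection (text header overhead not included).
std::uint64_t expected_bytes(std::uint64_t n_entries,
                             std::uint64_t bytes_per_entry = 4) {
  return n_entries * bytes_per_entry;
}
// 1940251728 entries * 4 bytes = 7761006912 bytes, close to the
// 7761007248 bytes listed above for fixels.mif.
```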
Another possibility for the crash is running out of storage space on the destination device – it does require ~15 GB after all. A quick way to double-check this is to run the following command from within the folder destined to contain your outputs:
df -h .
Make sure the value in the Avail column is sufficient to store all of this.
Thank you for all of the suggestions! I was running the entire analysis on an external drive with 2 TB of storage, which I believe is more than enough for all my files combined, but that drive was still the problem! When I moved everything over to my desktop and ran the analysis there, it worked! Thanks a lot!
Normalisation of the smoothing matrix by the sum of smoothing kernel weights iterates by value rather than by reference, so the smoothing matrix data remain un-normalised. As a result, input data are scaled during smoothing, the scaling factor being the sum of smoothing matrix weights for each individual fixel. Since this scaling is identical for all subjects, the results of statistical inference should not be affected. It will, however, influence the interpretation of any direct use of the beta coefficients, which will all be correspondingly scaled by this per-fixel factor.
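The by-value pitfall described here can be reproduced in a few lines (a generic C++ illustration, not the actual MRtrix code):

```cpp
#include <vector>

// Buggy version: the range-for copies each element, so the division
// below modifies the copy and the stored weights are left untouched.
void normalise_by_value(std::vector<double>& weights, double sum) {
  for (double w : weights)   // 'w' is a copy: this write is lost
    w /= sum;
}

// Fixed version: iterating by reference makes 'w' alias the stored
// element, so the normalisation actually takes effect.
void normalise_by_reference(std::vector<double>& weights, double sum) {
  for (double& w : weights)  // 'w' refers to the element in place
    w /= sum;
}
```

With the buggy version, the weights come out unchanged, which is exactly why the smoothed data end up scaled by the per-fixel sum of weights.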