Dwipreproc strategy for mismatched AP-PA PE dir volumes

Dear All,

These are rookie questions about optimizing dwipreproc output.
I have two sets acquired sequentially within the same scan session (each set contains 10 b=0 images acquired first in the respective set, then 60 DWI b=700 volumes; 1st set (70 vols) AP, 2nd set (70 vols) PA phase encode direction, on a Siemens Trio).
The expected dwipreproc output in this case is a 70-vol mif file (‘result.mif’). However, I get a 140-vol result mif. What happened? I inspected the gradient file, grad.b (generated by dwipreproc with dwgrad), which I attach at the end of the post, and I noted that for one (1!) DICOM/volume, the AP (vol no 11) and PA (vol no 81) entries do not have identical gradients (see the end of this post, mismatch highlighted).
I verified this is the case with another MRtrix tool:
dcminfo -a -csa 11.dcm
and I got

[CSA] PhaseEncodingDirectionPositive: 0            
[CSA] NumberOfImagesInMosaic: 64           
[CSA] DiffusionGradientDirection: -0.99996662 0.00577977 0.00577977 

and
dcminfo -a -csa 81.dcm

[CSA] PhaseEncodingDirectionPositive: 1            
[CSA] NumberOfImagesInMosaic: 64           
[CSA] DiffusionGradientDirection: -0.99996662 -0.00577977 0.00577977 
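For reference, the two reported directions differ only in the sign of the y component; a quick calculation (an illustrative Python snippet, not an MRtrix tool) shows the angular difference between them is well under a degree:

```python
import math

# The two DiffusionGradientDirection vectors reported by dcminfo;
# they differ only in the sign of the y component.
v_ap = (-0.99996662,  0.00577977, 0.00577977)
v_pa = (-0.99996662, -0.00577977, 0.00577977)

dot = sum(a * b for a, b in zip(v_ap, v_pa))
norm = math.sqrt(sum(a * a for a in v_ap)) * math.sqrt(sum(b * b for b in v_pa))
angle_deg = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
print(f"angular difference: {angle_deg:.2f} degrees")  # roughly 0.66 degrees
```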
  1. I wanted to make sure this is a hardcoded header issue and not a reading error on the MRtrix side. If it’s a header issue, it could be a Siemens error in the gradient table prior to scanning, or an error in writing the header after scanning. Have you seen these kinds of mismatches? If so, which is the likely cause?

  2. Due to this mismatch (and the resulting number of matched image pairs being less than half the total number of volumes), dwipreproc does not recombine any volume pairs, and as such the full complement (140 vols) of the AP-PA volumes is present in the output.
    As a strategy (with an eye to the subsequent tractography/connectome steps), should I (a) leave it as is (140 unrecombined volumes), or (b) throw out the two volumes with mismatched gradients and obtain a recombined dwipreproc output with 68 vols?

Thank you,

Octavian

0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0.9999665937 **0.005779769848** 0.005779769848 700.0000368
0.5875926931 0.3829750433 -0.7127867446 700.0002071
-0.8965316409 0.02854692621 -0.4420589213 700.000186
-0.7297876840999999 -0.4174306552 -0.5414439808 700.0000497
0.1536670776 0.9661813907 0.2070747437 699.9998396
-0.1157393372 0.7788450582999999 0.6164452781999999 699.9999132
-0.4077204677 0.4213358983 -0.8100864652 699.9999392
-0.2101675197 0.7588345748999999 -0.6164411583 699.9999357
0.09406538893999999 0.9714638658 0.2177375944 700.0001645999999
-0.1338656833 0.03835077523 -0.9902571367 700.0001743
0.1414193641 -0.7561122484 0.6389795233 700.0000583999999
0.6349663947999999 0.7495585392 0.1870285373 699.9999454
-0.07897827875000001 0.9735814178 -0.2142467137 699.9998449
0.1053177552 0.02634857879 0.9940894944000001 700.0000642
-0.2985741736 0.3936984116 -0.8693992314 700.0000299
-0.2061008557 0.8080201324 0.5519292553 699.9999612
0.330717313 0.04518756768 -0.9426474116 700.0000719
-0.8023465825 0.5941747345 -0.05653623948 700.000013
-0.3362168467 -0.7549717477 0.563006121 700.0000970999999
0.5904156618 0.7653146424 0.2563256608 699.9999956
-0.08771089305 0.4718639226 0.8772976904999999 700.0001109
-0.5275911142 0.4379350937 0.727915153 700.0001481
-0.3670493312 0.9283692136 0.05835573337 699.9999191000001
-0.310691844 0.9285377328 0.2031951203 700.0003426
0.6850654949 0.4457889702 0.5761574973 700.0000308
-0.02947340731 0.9686296317 -0.2467548475 700.0001276
0.3364008703 0.7499107038 0.5696212697 700.0001237
0.5762407491 0.02994827839 0.8167311061 700.0000752
0.2700293981 0.7533021311 0.5996832692 700.0001652
-0.935704635 -0.1912383069 0.2964536152 700.0000224
-0.8889692825 0.3163268711 0.3311660088 700.0002165
-0.8863728578 -0.03891974464 -0.4613332964 700.000193
-0.5446834922 0.592151255 0.5938659651 699.9999172
-0.3514880656 0.02077607915 0.9359618017 700.0000573
0.4139362044 0.9025600041 -0.1185000327 700.0000867
-0.7348035679 -0.008301839411 0.6782291619 700.0000993
-0.7375225303 -0.6167724619 0.2750495364 700.0000183
0.256068723 0.4135049094 0.8737519666 699.9997652
0.9173711924 0.3788423623 0.1221006136 699.9998437
-0.2561567026 -0.7709836178 0.5830677532 700.0000403
0.5895746762 0.4405645371 -0.6769819716 699.999914
-0.8920502727 -0.1924893606 0.4088938212 699.9999958
-0.6804012780000001 0.1484654387 0.7176434452 700.000107
-0.8800430634 0.3162582104 0.3542667793 700.0000423
-0.6197260936 0.7022418021 0.3504226311 700.0001953
-0.8948303882999999 -0.4363450894 -0.09424191772 700.0000338999999
0.1627313891 0.4241828577 -0.8908352251 700.0000077
-0.5060559555 0.7822293085000001 -0.363352004 699.9999847
-0.10541933 -0.4534475631 0.8850265942 699.9998669
-0.8818681135999999 0.4412505868 -0.1661521888 700.0000101000001
-0.5996046607 0.03527683298 -0.7995184776000001 699.9998816999999
-0.6934537158 0.7024538956 0.1602512667 700.0000286
0.3922631547 0.9029053048 -0.1757601432 700.0000545
-0.7614106176 0.4093902526 -0.5026464886000001 700.0000596
-0.2101465957 0.40228334 0.8910704364000001 699.9998955999999
0.3756599196 0.3857284298 0.8426702815 699.9999643
-0.7343735380999999 -0.6164237728 0.2841077944 700.0001751999999
-0.9866537536 0.1519988129 -0.05840146725 700.0000658
-0.5268015073 0.8112047227 -0.2538248802 700.0002197
0.6632603418 -0.4789704279 0.5750417795 700.0001229
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0.9999665937 **-0.005779769848** 0.005779769848 700.0000368
0.5875926931 0.3829750433 -0.7127867446 700.0002071
-0.8965316409 0.02854692621 -0.4420589213 700.000186
-0.7297876840999999 -0.4174306552 -0.5414439808 700.0000497
0.1536670776 0.9661813907 0.2070747437 699.9998396
-0.1157393372 0.7788450582999999 0.6164452781999999 699.9999132
-0.4077204677 0.4213358983 -0.8100864652 699.9999392
-0.2101675197 0.7588345748999999 -0.6164411583 699.9999357
0.09406538893999999 0.9714638658 0.2177375944 700.0001645999999
-0.1338656833 0.03835077523 -0.9902571367 700.0001743
0.1414193641 -0.7561122484 0.6389795233 700.0000583999999
0.6349663947999999 0.7495585392 0.1870285373 699.9999454
-0.07897827875000001 0.9735814178 -0.2142467137 699.9998449
0.1053177552 0.02634857879 0.9940894944000001 700.0000642
-0.2985741736 0.3936984116 -0.8693992314 700.0000299
-0.2061008557 0.8080201324 0.5519292553 699.9999612
0.330717313 0.04518756768 -0.9426474116 700.0000719
-0.8023465825 0.5941747345 -0.05653623948 700.000013
-0.3362168467 -0.7549717477 0.563006121 700.0000970999999
0.5904156618 0.7653146424 0.2563256608 699.9999956
-0.08771089305 0.4718639226 0.8772976904999999 700.0001109
-0.5275911142 0.4379350937 0.727915153 700.0001481
-0.3670493312 0.9283692136 0.05835573337 699.9999191000001
-0.310691844 0.9285377328 0.2031951203 700.0003426
0.6850654949 0.4457889702 0.5761574973 700.0000308
-0.02947340731 0.9686296317 -0.2467548475 700.0001276
0.3364008703 0.7499107038 0.5696212697 700.0001237
0.5762407491 0.02994827839 0.8167311061 700.0000752
0.2700293981 0.7533021311 0.5996832692 700.0001652
-0.935704635 -0.1912383069 0.2964536152 700.0000224
-0.8889692825 0.3163268711 0.3311660088 700.0002165
-0.8863728578 -0.03891974464 -0.4613332964 700.000193
-0.5446834922 0.592151255 0.5938659651 699.9999172
-0.3514880656 0.02077607915 0.9359618017 700.0000573
0.4139362044 0.9025600041 -0.1185000327 700.0000867
-0.7348035679 -0.008301839411 0.6782291619 700.0000993
-0.7375225303 -0.6167724619 0.2750495364 700.0000183
0.256068723 0.4135049094 0.8737519666 699.9997652
0.9173711924 0.3788423623 0.1221006136 699.9998437
-0.2561567026 -0.7709836178 0.5830677532 700.0000403
0.5895746762 0.4405645371 -0.6769819716 699.999914
-0.8920502727 -0.1924893606 0.4088938212 699.9999958
-0.6804012780000001 0.1484654387 0.7176434452 700.000107
-0.8800430634 0.3162582104 0.3542667793 700.0000423
-0.6197260936 0.7022418021 0.3504226311 700.0001953
-0.8948303882999999 -0.4363450894 -0.09424191772 700.0000338999999
0.1627313891 0.4241828577 -0.8908352251 700.0000077
-0.5060559555 0.7822293085000001 -0.363352004 699.9999847
-0.10541933 -0.4534475631 0.8850265942 699.9998669
-0.8818681135999999 0.4412505868 -0.1661521888 700.0000101000001
-0.5996046607 0.03527683298 -0.7995184776000001 699.9998816999999
-0.6934537158 0.7024538956 0.1602512667 700.0000286
0.3922631547 0.9029053048 -0.1757601432 700.0000545
-0.7614106176 0.4093902526 -0.5026464886000001 700.0000596
-0.2101465957 0.40228334 0.8910704364000001 699.9998955999999
0.3756599196 0.3857284298 0.8426702815 699.9999643
-0.7343735380999999 -0.6164237728 0.2841077944 700.0001751999999
-0.9866537536 0.1519988129 -0.05840146725 700.0000658
-0.5268015073 0.8112047227 -0.2538248802 700.0002197
0.6632603418 -0.4789704279 0.5750417795 700.0001229

Hi Octavian,

I’m assuming you’re using the -rpe_header option in this instance?

Because the current implementation tests for exact equivalence between gradient directions with zero tolerance when deciding whether or not volumes should be combined, such a small difference will indeed result in dwipreproc not combining any volumes. In the changes I have proposed to deal with your other reported dwipreproc issues, I do adjust this to permit some numerical tolerance between gradient directions; but there is inevitably a balance between the size of that tolerance and the severity of gradient table errors it would silently accept.
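To illustrate what such a tolerance-based comparison might look like (a hypothetical sketch only, not dwipreproc’s actual code), one could match directions by the angle between them, ignoring sign since diffusion gradients are antipodally symmetric:

```python
import math

def directions_match(v1, v2, tol_degrees=1.0):
    """True if the angle between two gradient directions (sign ignored,
    since b-vectors are antipodally symmetric) is within tol_degrees."""
    dot = abs(sum(a * b for a, b in zip(v1, v2)))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
    return angle <= tol_degrees

# The mismatched pair from this dataset, about 0.66 degrees apart:
ap = (-0.99996662,  0.00577977, 0.00577977)
pa = (-0.99996662, -0.00577977, 0.00577977)
print(directions_match(ap, pa))                   # matches within a 1-degree tolerance
print(directions_match(ap, pa, tol_degrees=0.0))  # exact comparison fails
```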

I do think that I may have seen such an encoding issue in the past, but unfortunately I don’t recall specifics; others may have had more experience. But the fact that it’s the first non-b=0 volume would suggest to me that it’s an encoding problem rather than a decoding problem.

My advice would be to manually edit the gradient table and correct the discrepancy before running dwipreproc. The difference in the two vector orientations is really very small. Doing the volume combination should provide benefits over retaining all 140 volumes, due to the differential weighting of volumes within each pair in areas of signal stretching / compression.
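A minimal sketch of such a manual correction (assuming, as in your attached table, that the mismatched pair is volume 11 of the AP set and volume 81 overall, i.e. zero-based rows 10 and 80) is simply to copy the AP row over the PA row before re-running dwipreproc:

```python
def fix_grad_table(rows, src_row, dst_row):
    """Return a copy of the gradient table with row dst_row overwritten by row src_row."""
    fixed = list(rows)
    fixed[dst_row] = fixed[src_row]
    return fixed

# Tiny synthetic demonstration (not the full 140-row table):
table = [
    "0 0 0 0",
    "0.9999665937 0.005779769848 0.005779769848 700.0000368",   # AP entry
    "0 0 0 0",
    "0.9999665937 -0.005779769848 0.005779769848 700.0000368",  # mismatched PA entry
]
fixed = fix_grad_table(table, src_row=1, dst_row=3)
print(fixed[3])  # now identical to the AP entry
```

For the real table, one would read the lines of grad.b, apply fix_grad_table(rows, src_row=10, dst_row=80), write the result to a new file, and pass it back in via the gradient import option (-grad).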

As an aside: One consequence of acquiring all volumes with one phase encode direction, followed by another phase encode direction, is that once you correct for subject rotation between the matched volume pairs, the diffusion sensitisation gradient orientations relative to the anatomy could be quite inconsistent between the two volumes within each pair; yet we combine the volumes anyway. These subject rotations are probably of greater magnitude than the discrepancy between those two gradient vectors…

Rob