If the standard filtered-backprojection algorithm with a filter of the form g(f) = |f| h(f) is applied to noisy projections, all of which have the noise power spectral density (NPSD) Sp(f), then the resulting computed tomographic (CT) reconstruction has a two-dimensional NPSD of the form S(f) ~ |f| |h(f)|^2 Sp(f). For proper reconstruction, h(f) must approach a non-zero constant as f -> 0. Provided Sp(f) is constant, i.e. white projection noise, the CT noise at low frequencies is suppressed by the |f| factor. This low-frequency suppression results in a long-range negative spatial correlation of the CT noise. If white noise is spatially averaged over a circle of diameter d, the variance of the averaged values behaves as var ~ d^{-2}. For CT noise, the variance drops faster than d^{-2}. Simple signal-to-noise-ratio considerations suggest that the dependence of the minimum detectable contrast on the diameter of the circle to be detected could be significantly different in the presence of CT noise than in that of white noise. Simulated reconstructions of a suitable detectability pattern demonstrate that these differences may not exist unless the image is spatially smoothed before observation. It is pointed out that the pixel width used in the image display should be 1/3 to 1/2 the width of the point-spread function to avoid discrete binning problems.
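The variance scaling described above can be illustrated numerically. The following sketch (not from the paper; field size, disk diameters, and trial counts are arbitrary assumptions) compares white noise against a synthetic noise field whose power spectrum is shaped to be proportional to |f|, mimicking the low-frequency suppression of ramp-filtered FBP noise. Averaging each field over a centered disk and doubling the disk diameter should roughly quarter the variance for white noise, while the CT-like field shows a faster drop:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 128  # field size in pixels (assumed, for illustration only)

# Frequency-magnitude grid used to shape the noise power spectrum.
f = np.fft.fftfreq(N)
fmag = np.hypot(*np.meshgrid(f, f))

def disk_mask(d):
    """Boolean mask of a centered disk of diameter d pixels."""
    y, x = np.mgrid[:N, :N] - N // 2
    return x**2 + y**2 <= (d / 2) ** 2

def mean_variance(d, ct_like, trials=1500):
    """Variance of the disk-averaged value over independent noise fields.

    ct_like=False -> white noise (flat NPSD);
    ct_like=True  -> NPSD proportional to |f|, as for ramp-filtered
                     FBP noise with white projection noise.
    """
    mask = disk_mask(d)
    means = np.empty(trials)
    for i in range(trials):
        field = rng.standard_normal((N, N))
        if ct_like:
            # Multiply the spectrum by sqrt(|f|) so the power spectrum ~ |f|.
            field = np.fft.ifft2(np.fft.fft2(field) * np.sqrt(fmag)).real
        means[i] = field[mask].mean()
    return means.var()

for ct in (False, True):
    v8, v16 = mean_variance(8, ct), mean_variance(16, ct)
    label = "CT-like" if ct else "white"
    print(f"{label}: var(d=16)/var(d=8) = {v16 / v8:.3f}")
```

For white noise the ratio comes out near (1/2)^2 = 0.25, consistent with var ~ d^{-2}; the CT-like field gives a noticeably smaller ratio, reflecting the long-range negative correlations induced by the suppressed low frequencies.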
Keywords: noise power spectrum, CT reconstruction, noise variance, detectability, long-range negative spatial correlation
Get full paper (pdf, 9.4 MB)
Send e-mail to author at kmh@hansonhub.com