Spectral optimization of iodine-enhanced CT: Quantifying the effect of tube voltage on image quality and radiation exposure determined at an anthropomorphic phantom

posted by Thomas Henzler, MD | Sep 21, 2017


Purpose: To provide an experimental basis for the spectral optimization of iodine-enhanced CT through a quantitative analysis of image quality and radiation dose characteristics, measured consistently for a wide variety of scan settings on an anthropomorphic phantom.


Methods: CT imaging and thermoluminescent dosimetry were performed on an anthropomorphic whole-body phantom with iodine inserts at different tube voltages (U, 70–140 kV) and tube current–time products (Q, 60–300 mAs). For all U–Q combinations, the iodine contrast (C), the noise level (N), and from these the contrast-to-noise ratio (CNR) of the reconstructed CT images were determined and parameterized as functions of U, Q, or the measured absorbed dose (D). Finally, two characteristic curves were derived that give the relative increase in CNR at constant D and the relative decrease in D at constant CNR when lowering U.
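The CNR determination described above can be sketched in a few lines. The ROI statistics used here are illustrative placeholders, not measurements from the study:

```python
# Illustrative sketch of the CNR calculation described above.
# C: iodine contrast (mean HU in the iodine insert minus mean HU in background)
# N: noise level (standard deviation of HU values in a homogeneous background ROI)
# All numeric values below are placeholders, not data from the study.

def contrast_to_noise_ratio(mean_iodine_hu, mean_background_hu, background_sd):
    contrast = mean_iodine_hu - mean_background_hu  # iodine contrast C
    noise = background_sd                           # noise level N
    return contrast / noise                         # CNR = C / N

# Example with placeholder ROI statistics:
cnr = contrast_to_noise_ratio(mean_iodine_hu=250.0,
                              mean_background_hu=50.0,
                              background_sd=20.0)
print(round(cnr, 1))  # 10.0
```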


Results: Lowering U affects the measured CNR only slightly but markedly reduces D. For example, reducing U from 120 kV to 70 kV increases the CNR at constant D by a factor of nearly 1.8 or, alternatively, reduces D at constant CNR by a factor of nearly 5.
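As a rough plausibility check on these numbers, here is a minimal sketch of the trade-off between a CNR gain and a dose reduction, assuming idealized quantum-limited noise (N ∝ 1/√D, so CNR ∝ √D at fixed voltage). This scaling law is an assumption of the example, not the study's fitted parameterization:

```python
# Idealized trade-off when lowering tube voltage, ASSUMING quantum-limited
# noise: N scales as 1/sqrt(D), so CNR scales as sqrt(D) at fixed voltage.
# This scaling is an assumption of the sketch, not the study's fitted curves.

def dose_reduction_at_constant_cnr(cnr_gain_at_constant_dose):
    """If lowering U multiplies CNR by a factor g at constant dose, then
    under the sqrt(D) model the dose at the lower voltage can instead be
    reduced by a factor of g**2 while restoring the original CNR."""
    return cnr_gain_at_constant_dose ** 2

g = 1.8  # example CNR gain at constant dose (cf. 120 kV -> 70 kV above)
print(round(dose_reduction_at_constant_cnr(g), 2))  # 3.24
```

Note that this idealized model predicts a dose-reduction factor of only about 3.2 for g = 1.8, whereas the measured characteristic curves report a factor of nearly 5; the fitted curves evidently capture voltage-dependent effects that the simple √D scaling ignores.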


Conclusion: Spectral optimization by lowering U is an effective approach to attaining the CNR necessary for a specific diagnostic task while at the same time reducing radiation exposure as far as practically achievable. The characteristic curves derived in this study from extensive measurements on a reference ‘person’ offer CT users an easy-to-use aid for selecting an appropriate tube voltage in various clinical scenarios.


Authors: Brix G, Lechel U, Sudarski S, Trumm C, Henzler T.

Full text available at: Phys Med. 2016 Aug;32(8):999-1006.
