How QCA Works

Topics on this page:

  X-Ray Image Formation, Edge Detection, Least-Cost Optimization, Lesion Measurements, Catheter Calibration, Validation


X-Ray Image Formation

First we must understand a bit about the formation of the coronary artery image from X-rays. 

Figure A represents the cross-section of the contrast-filled lumen of an idealized coronary artery. 

Figure B is a plot of the idealized X-ray absorption across the lumen or the density of the X-ray image of the lumen (they are approximately the same thing). The density is depicted both by the height of the curve and by the darkness of the area under the curve. 

The transmission of X-rays falls off exponentially with the thickness of the absorbing material, so absorption is greatest where the path through the contrast is longest. As a result, the center of the lumen absorbs far more X-rays than the edges. In fact, the very edge of the lumen absorbs no X-rays at all! From this we can draw a somewhat startling conclusion: we can never see the true edges of a coronary artery lumen in an X-ray image!
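The geometry behind this can be sketched in a few lines of Python. This is only an illustration of the Beer-Lambert relationship for a circular lumen cross-section, not code from any QCA program; the radius and attenuation coefficient are made-up values.

```python
import numpy as np

def chord_length(x, r):
    """Path length of an X-ray through a contrast-filled cylindrical
    lumen of radius r, at lateral offset x from the lumen center."""
    return 2.0 * np.sqrt(np.maximum(r**2 - x**2, 0.0))

def absorbed_fraction(x, r, mu=0.5):
    """Beer-Lambert law: transmitted intensity falls off as
    exp(-mu * thickness), so this is the fraction absorbed.
    mu is a made-up linear attenuation coefficient (per mm)."""
    return 1.0 - np.exp(-mu * chord_length(x, r))

r = 1.5  # illustrative lumen radius, mm
for x in (0.0, 1.0, 1.4, 1.5):
    print(f"offset {x:.2f} mm -> fraction absorbed {absorbed_fraction(x, r):.3f}")
```

At the very edge (x = r) the chord length is zero and nothing at all is absorbed, which is the startling conclusion above in numerical form.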

In the real world, the formation of X-ray images is affected by several factors other than just the absorption of the X-rays.

One of the complicating factors is blur introduced by the finite focal-spot of the X-ray tube and the optics of the image intensifier and camera. This is depicted in Figure C by the spreading of the base of the density curve.

The other complicating factor is "noise" in the image. Noise is caused by many things; among them are quantum mottle, intervening structures in the body, the X-ray detector and the image viewing device. The level of image noise is depicted by the dotted area at the bottom of Figure C.

A structure is only visible when its density rises both above the noise and above the perceptual threshold of the viewer. The perceptual threshold is affected by such factors as the settings of the viewing device (e.g., brightness) and the visual acuity of the viewer. The perceptual threshold is depicted by the horizontal gray line in Figure C. Only the portion of the density curve above the horizontal gray line is visible to the viewer.

The true diameter of the lumen is labeled D0 and is depicted by the red lines drawn through figures A, B and C. As you can see in Figure C, the perceived diameter of the lumen is somewhat smaller than the true diameter and will vary as the noise level and perceptual threshold go up and down.

Probably the most important benefit of digital detection of the lumen edges is the reduction of the effects of noise and unpredictable perceptual threshold on the measured size of the lumen, resulting in consistent and predictable measurements.


Edge Detection

Digital edge detection in QCA is performed by calculating the derivatives (or gradients) of the density curve across the lumen and finding their peak values. Figure D shows the magnified image of an actual contrast-filled coronary artery lumen. The green line represents a cross-section of the lumen edge that has been sampled and analyzed. Figure E shows the resulting density curve in black, the first derivative of the density curve in green and the second derivative (the derivative of the derivative) in blue. These curves are actual computations from QCAPlus using the sample from Figure D.

The first derivative of the density curve is proportional to the rate of change of the curve. As you can see in Figure E, the peak of the first derivative occurs fairly far within the lumen. The diameter resulting from using the peak of the first derivative in figures A, B, C is labeled D1 and is depicted by the green lines.

The second derivative is proportional to the rate of change of the first derivative. As you can see in Figure E, the peak of the second derivative occurs at the outer edge of the lumen. Because of the blurring of the X-ray image, the density curve is widened and the peak of the second derivative actually occurs outside the true edge of the lumen. The diameter resulting from using the peak of the second derivative in figures A, B, C is labeled D2 and is depicted by the blue lines.

The true edge of the lumen lies somewhere between the peaks of the first and second derivatives. Various techniques have been used to deal with this problem. The very first QCA program used only the first derivative and calculated diameters that were a bit too small. One later program uses the first derivative and attempts to correct for the underestimation; another uses a point somewhere between the first and second derivatives. However, the most successful technique has been to calculate the (weighted) sum of the first and second derivatives and use the peak of the resulting curve. This sum is shown as the red curve in Figure E. As you can see, the peak of the sum falls neatly between the first and second derivative peaks. The location of the peak of the sum of the derivatives is shown as the red dot in Figure D and clearly is very near to the visible edge of the lumen.
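The behavior of the three curves can be reproduced with a small numerical experiment. The sigmoid edge model and the equal weighting below are assumptions made purely for illustration; QCAPlus's actual sampled density data and derivative weighting are not shown here.

```python
import numpy as np

# Model one blurred lumen edge as a sigmoid density profile
# (an assumed stand-in for a sampled cross-section like Figure E).
x = np.linspace(-5.0, 5.0, 1001)              # position across the edge
density = 1.0 / (1.0 + np.exp(-x / 0.8))      # 0.8 ~ blur width

d1 = np.gradient(density, x)                  # first derivative
d2 = np.gradient(d1, x)                       # second derivative

x_d1 = x[np.argmax(d1)]                       # peak well inside the lumen
x_d2 = x[np.argmax(d2)]                       # positive peak, outside the edge

# Weighted sum of the derivatives; equal weights are an assumption,
# real QCA programs tune this weighting.
combined = 0.5 * d1 + 0.5 * d2
x_edge = x[np.argmax(combined)]

print(f"D2 peak {x_d2:+.2f} < edge estimate {x_edge:+.2f} < D1 peak {x_d1:+.2f}")
```

Running this shows the ordering described in the text: the second-derivative peak sits outside the edge, the first-derivative peak sits inside the lumen, and the peak of the sum lands between them.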


Least-Cost Optimization

All would now be fine if the X-ray images were not noisy. However, coronary angiography images are plagued with a high level of quantum mottle together with other image artifacts. (The "texturing" of the image in Figures F and H is quantum mottle.) Derivatives are "noise amplifiers", so the effect of image noise is to cause many false peaks in the derivative curves, some of which can be higher than the peaks generated by the lumen edge. 

I use several tools to deal with image noise. Before sampling the density values of an image, I first "blur" the image. I have chosen a blurring factor which will smooth the quantum mottle as much as possible without excessively smoothing the edges of the arteries. I also heavily smooth the derivative curves. The effects of all of this smoothing can be seen in the density and derivative curves in Figure E.
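The kind of smoothing described here can be sketched as a 1-D Gaussian convolution. The blurring factor actually chosen for QCAPlus is not stated here, so the sigma value below is an arbitrary stand-in, and the noisy edge is simulated rather than sampled from an image.

```python
import numpy as np

def gaussian_kernel(sigma):
    """1-D Gaussian kernel, truncated at 3*sigma, normalized to sum to 1."""
    radius = int(3 * sigma)
    t = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (t / sigma) ** 2)
    return k / k.sum()

def smooth(curve, sigma):
    """Smooth a sampled density (or derivative) curve by convolution."""
    return np.convolve(curve, gaussian_kernel(sigma), mode="same")

# A clean edge plus mottle-like noise (both invented for illustration).
rng = np.random.default_rng(0)
edge = 1.0 / (1.0 + np.exp(-np.linspace(-5, 5, 201)))
noisy = edge + rng.normal(0.0, 0.15, edge.size)
cleaned = smooth(noisy, sigma=4.0)

# Smoothing pulls the curve back toward the true edge profile.
print(np.abs(noisy - edge).mean(), np.abs(cleaned - edge).mean())
```

The same trade-off the text describes is visible in the sigma parameter: a larger sigma suppresses more mottle but also washes out the edge itself.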

Unfortunately, the smoothing is not always sufficient to provide accurate edge detection. Sometimes there are still too many false peaks remaining. The effect of the false peaks can be seen in Figure F. Derivatives have been calculated at multiple points along the edges of the lumen and the peaks of the sums have been plotted as red dots. As you can see, the result is quite chaotic. (I have to admit that I had to work pretty hard to "cook up" this example. Smoothing works quite well most of the time. I finally found an image of a catheter which was quite noisy and quite low in contrast. The image is so bad that it is difficult to visually perceive the edges of the catheter.)

How do I deal with false peaks in the derivative curve? First we observe that the "true" peak is usually present in the curve; it is just smaller than the biggest peak. Second, we know that adjacent points along the edge of the lumen have to be quite near to each other (they are connected). They don't jump "in-and-out" as they do in Figure F.

So I don't always choose the biggest peak at each point along the edge of the lumen. Instead, I choose a path that is connected and goes through or near most of the biggest peaks most of the time. This technique is called "optimization" and is usually accomplished by some form of dynamic programming. In the detection of coronary artery edges, the best optimization strategy seems to consist of assigning a cost to "missing" the biggest peak at each point along the contour (the more you miss the peak, the bigger the cost). I then search for an optimal path consisting of connected points that have the least total cost. The results of such "least-cost" searches through the points in Figure F are shown as the red contours in Figure G. You will note that it is quite smooth and does a "pretty good" job of following the visible edge of the catheter. (You may argue that catheters are not "lumpy" and you are correct. However, the edge detector does not know that this object is a catheter and it can only work with what it "sees" in the image. This illustrates an important point: Good QCA requires good images!)
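A least-cost search of this kind can be sketched with a small dynamic program. The cost matrix, the one-column step limit, and the penalty values below are all invented for illustration; they are not QCAPlus's actual cost function.

```python
import numpy as np

def least_cost_path(cost, max_step=1):
    """Dynamic-programming search for the connected path (one column
    index per scanline) with the smallest total cost.  cost[i, j] is
    the penalty for placing the edge at column j on scanline i, e.g.
    how far that candidate falls below the biggest derivative peak."""
    n, m = cost.shape
    total = cost.copy()
    back = np.zeros((n, m), dtype=int)
    for i in range(1, n):
        for j in range(m):
            # Connectivity constraint: move at most max_step columns
            # between adjacent scanlines.
            lo, hi = max(0, j - max_step), min(m, j + max_step + 1)
            k = lo + int(np.argmin(total[i - 1, lo:hi]))
            back[i, j] = k
            total[i, j] = cost[i, j] + total[i - 1, k]
    # Trace the optimal path back from the cheapest end point.
    path = [int(np.argmin(total[-1]))]
    for i in range(n - 1, 0, -1):
        path.append(int(back[i, path[-1]]))
    return path[::-1]

# Toy example: the zero-cost cell at row 2, column 0 is a "false
# peak"; the connected least-cost path ignores it and stays smooth.
cost = np.array([
    [9, 0, 9, 9],
    [9, 1, 0, 9],
    [0, 9, 1, 9],
    [9, 9, 0, 9],
], dtype=float)
print(least_cost_path(cost))
```

Notice that the path skips the cheapest single cell on the third scanline because reaching it (and leaving it) would be expensive; exactly the behavior that keeps the red contours in Figure G from jumping in and out.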


Lesion Measurements

Now that I have determined the edges of the artery lumen, we will want to calculate some dimensions. If we are dealing with a lesion, we usually want at least the minimum luminal diameter (MLD), the percent stenosis and perhaps the length of the lesion. Because I am dealing with a two-dimensional projection of the artery, I must replace the diameters with the distances between the two lumen edges I have detected.

But how do I measure distances between two wiggly lines? The consensus among QCA software designers is that a centerline should be constructed between the two edges. Diameters are then measured by constructing lines that are perpendicular to the centerline, determining where the diameter line intersects each edge and then calculating the distance between the two points of intersection.
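A rough version of this measurement can be sketched as follows. To keep the example short, it uses a nearest-point distance from the centerline to each edge instead of true perpendicular-line intersection; the two agree where the edges are locally straight, and the toy vessel geometry is invented.

```python
import numpy as np

def diameters(top, bottom):
    """Approximate lumen diameters from two detected edge contours,
    each an (N, 2) array of (x, y) points.  The centerline is taken
    as the midpoint of paired edge points; the diameter at each
    centerline point is the distance to the nearest point on each
    edge, summed.  This is a simplification of the perpendicular-
    intersection construction described in the text."""
    center = 0.5 * (top + bottom)
    def nearest_dist(p, contour):
        return np.min(np.linalg.norm(contour - p, axis=1))
    return np.array([nearest_dist(c, top) + nearest_dist(c, bottom)
                     for c in center])

# Toy vessel: edges 3 units apart, narrowing to 1 unit mid-vessel.
x = np.linspace(0.0, 10.0, 51)
half = np.where((x > 4.1) & (x < 5.9), 0.5, 1.5)   # "lesion" region
top = np.column_stack([x, +half])
bottom = np.column_stack([x, -half])

d = diameters(top, bottom)
print(d.min(), d.max())   # smallest and largest measured diameters
```

Walking along the centerline and collecting one diameter per point, as here, is exactly the set of measurements the lesion calculations below operate on.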

The construction of a good centerline is a "black art" and calculation methods usually are carefully guarded secrets among QCA developers.

Now that we know how to calculate diameters, we can "walk" along the centerline, calculating diameters as we go. 

Then I calculate the desired measurements from the resulting set of diameters. Calculating the MLD is easy - it's the smallest diameter. 

To calculate the length of the lesion, I must find where the lesion begins and ends. Visually, this is a subjective process; we observe "shoulders" at each end of the lesion and place the beginning and end of the lesion somewhere on these shoulders. To calculate the percent stenosis, we need a "normal" or "reference" diameter with which to compare the MLD. Visually determining the reference diameter is also highly subjective.

QCA programs attempt to make these subjective visual processes objective by using a variety of statistical and searching techniques. This is also a "black art" and the details of the methods are usually secret.
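As a deliberately naive stand-in for those secret methods, here is how the basic numbers fall out once diameters have been sampled along the centerline. The median-based reference diameter and the fixed shoulder threshold below are invented for illustration and are certainly not what any commercial QCA package does.

```python
import numpy as np

def lesion_measurements(diams, positions, shoulder_frac=0.9):
    """Derive lesion measurements from diameters sampled along the
    centerline.  MLD is simply the smallest diameter.  The reference
    diameter (median of all diameters) and the lesion boundaries
    (contiguous run below shoulder_frac * reference around the MLD)
    use deliberately simple rules chosen only for this sketch."""
    i_mld = int(np.argmin(diams))
    mld = diams[i_mld]
    reference = float(np.median(diams))
    below = diams < shoulder_frac * reference
    # Walk outward from the MLD to find the contiguous lesion run.
    start = i_mld
    while start > 0 and below[start - 1]:
        start -= 1
    end = i_mld
    while end < len(diams) - 1 and below[end + 1]:
        end += 1
    length = positions[end] - positions[start]
    stenosis = 100.0 * (1.0 - mld / reference)
    return mld, length, stenosis

pos = np.linspace(0.0, 20.0, 101)                   # mm along centerline
diam = np.where((pos > 8.1) & (pos < 11.9), 1.0, 3.0)  # 3 mm vessel, 1 mm lesion
mld, length, pct = lesion_measurements(diam, pos)
print(mld, round(length, 2), round(pct, 1))
```

For this toy vessel the MLD is 1.0 mm and the percent stenosis works out to 100 × (1 − 1/3) ≈ 67 %.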

An example of diameter measurements automatically calculated by QCAPlus is shown in Figure H. The diameter labeled "a" is one of the two automatically-determined ends of the lesion; the other end of the lesion is marked by the other solid line. The diameter labeled "b" is the MLD. The dotted diameter labeled "c" is the reference for percent stenosis. (Usually QCAPlus will calculate reference diameters proximal and distal to the lesion and average them. In this example, only a reference diameter distal to the lesion was calculated by QCAPlus. There is a major side branch immediately proximal to the lesion and averaging proximal and distal reference diameters would lead to the calculation of an incorrect percent stenosis.)

Catheter Calibration

The diameter of a catheter is typically used to provide a dimensional calibration factor for coronary artery measurements. The image of the catheter should be taken from the same imaging run as the image of the lesion because X-ray tube and image intensifier distances have a significant effect on the calibration factor and can vary from run-to-run. 

During QCA, the catheter is subjected to the same edge-detection and diameter measurement processes as the lesion. The "known size" of the catheter is then used to compute the absolute dimensions of the lesion.

I wrote "known size" in quotes because this is a controversial issue. Most QCA programs use the physical diameter of the catheter as measured with a micrometer. At Stanford, we developed the belief (and demonstrated it in a published paper) that using the radiographic diameter of the catheter is preferable. We used QCAPlus to carefully measure the diameters of catheters, using a radiopaque grid for calibration (the section on validation, below, describes the technique). We found that the radiographic diameters of catheters were significantly different from their physical diameters. Moreover, catheters of the same physical dimensions but of different compositions and construction were found to have significantly different radiographic diameters. As a result, we preferred to use QCAPlus to calibrate each type of catheter with a grid and use this radiographic diameter when calculating the calibration factor.
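Whichever "known size" is used, the arithmetic of the calibration itself is simple. All numbers below are made up for illustration; they are not real catheter specifications or QCAPlus measurements.

```python
# A minimal sketch of catheter calibration (illustrative numbers only).
catheter_diameter_mm = 1.87      # "known size": radiographic or physical
catheter_diameter_px = 22.4      # same catheter, measured by edge detection

cal_factor = catheter_diameter_mm / catheter_diameter_px   # mm per pixel

mld_px = 14.1                    # lesion MLD measured in pixels
mld_mm = mld_px * cal_factor
print(f"{cal_factor:.4f} mm/pixel -> MLD = {mld_mm:.2f} mm")
```

This also makes clear why the calibration image must come from the same run as the lesion image: the mm-per-pixel factor changes whenever the tube and intensifier geometry changes.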


Validation

Hopefully you now have a pretty good idea of how QCA programs measure coronary artery dimensions. However, at least one crucial question remains: how do you know that the plethora of precise numbers spewed out by your QCA software has any basis in reality?

To find out, one or more validations must have been performed. I believe that the validation should have been performed in a university research environment and published in a refereed journal. This gives much more credibility to the results. The wise consumer of QCA software will want to know the results of such validations. 

Because it is virtually impossible to know the dimensions of in-vivo coronary arteries, an in-vitro validation using artificially constructed coronary artery phantoms is usually substituted. 

A set of phantoms is constructed by making precise cylindrical holes of known diameters in blocks of Lucite plastic. The diameters are chosen to cover the range of coronary artery diameters that are typically analyzed by QCA. The holes are filled with the same contrast material used in coronary angiography and angiographic images are recorded in a cath lab. An angiographic image of such a phantom is shown in Figure I. This phantom has a cylindrical hole with two different diameters to simulate a lesion. (The lumen is in the center of the image - the bright objects above and below the lumen are screws that hold the phantom together.)

If the simulation is done correctly, a density phantom and a scattering phantom are interposed to simulate the X-ray absorption and scattering of the human torso and angiography is performed at a variety of realistic kV and mA settings. A radiopaque grid is recorded at the same level as the phantom and is used as a "gold standard". The 1.0 cm lines of such a grid can be seen in Figure I.

The resulting phantom images are then subjected to edge detection and analysis by the QCA program, using the grid for calibration. The calculated diameters should closely match the true diameters of the phantom lumens over the range of anatomically realistic dimensions.
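The comparison amounts to simple error statistics over the set of phantom lumens. The diameters below are fabricated purely to show the arithmetic; they are not the Stanford validation data.

```python
import numpy as np

# Hypothetical phantom data, invented only to illustrate the
# calculation: true drilled diameters versus QCA-reported diameters.
true_mm     = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])
measured_mm = np.array([0.58, 1.06, 1.43, 2.05, 2.95, 4.08, 4.93])

errors = np.abs(measured_mm - true_mm)
print(f"mean error {errors.mean():.3f} mm, SD {errors.std(ddof=1):.3f} mm")
```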

To provide a frame of reference for evaluating QCA results, a published in-vitro phantom validation of QCAPlus done at Stanford showed an average diameter measurement error of 0.069 mm with a standard deviation of 0.066 mm. 

Copyright 2010 (Sanders Data Systems). All rights reserved.