HARP (algorithm)

HARP
Developer: Image Analysis and Communications Laboratory
Operating System: Linux, Mac OS X, Windows
Genre: Cardiac Motion Tracking
Website: HARP Overview (Software Download)

The harmonic phase (HARP) algorithm[1] is a medical image analysis technique capable of extracting and processing motion information from tagged magnetic resonance image (MRI) sequences. It was initially developed by N. F. Osman and J. L. Prince at the Image Analysis and Communications Laboratory at Johns Hopkins University. The method uses spectral peaks in the Fourier domain of tagged MRI, calculating the phase images of their inverse Fourier transforms, which are called harmonic phase (HARP) images. The motion of material points through time is then tracked, under the assumption that the HARP value of a fixed material point is time-invariant. The method is fast and accurate, and has become one of the most widely used tagged MRI analysis methods in medical image processing.

Background

In cardiac magnetic resonance imaging, tagging techniques[2] [3] [4] [5] make it possible to capture and store the motion information of the myocardium in vivo. MR tagging uses a special pulse sequence to create temporary features, called tags, in the myocardium. Tags deform together with the myocardium as the heart beats and are captured by MR imaging. Analysis of the motion of the tag features in many images taken from different orientations and at different times can be used to track material points in the myocardium.[6] [7] Tagged MRI is widely used to develop and refine models of normal and abnormal myocardial motion[8] [9] [10] [11] to better understand the correlation of coronary artery disease with myocardial motion abnormalities and the effects of treatment after myocardial infarction. However, hampered by long imaging and post-processing times,[12] tagged MRI was slow to enter routine clinical use until the HARP algorithm was developed and published in 1999.[13]

Description

HARP processing

A tagged MRI showing motion of a human heart is shown in the image (a). The effect of tagging can be described as a multiplication of the underlying image by a sinusoid tag pattern having a certain fundamental frequency, causing an amplitude modulation of the underlying image and replicating its Fourier transform into the pattern shown in (b).

HARP processing uses a bandpass filter to isolate one of the spectral peaks. For example, the circle drawn in (b) is the -3 dB isocontour of the bandpass filter used to process this data. Selection of the filters for optimal performance is discussed in this paper.[14] The inverse Fourier transform of the filtered image yields a complex harmonic image

$I_k(\mathbf{y},t)$ at image coordinates $\mathbf{y} = [y_1, y_2]^T$ and time $t$:

$$I_k(\mathbf{y},t) = D_k(\mathbf{y},t)\, e^{j\phi_k(\mathbf{y},t)}$$

where $D_k$ is called the harmonic magnitude image and $\phi_k$ is called the harmonic phase image. The harmonic magnitude image in (c), extracted from (a) using the filter in (b), shows the geometry of the heart, and the harmonic phase image in (d) contains the motion of the myocardium in the horizontal direction. In practice, tagged images from two directions (horizontal and vertical, i.e., $k = 1$ and $2$) are processed to provide a 2D motion map in the image plane. Notice that the harmonic phase images are computed by taking the inverse tangent of the imaginary part divided by the real part of $I_k(\mathbf{y},t)$, so the range of this computation is only $[-\pi, +\pi)$. In other words, (d) shows only the wrapped value of the actual phase. We denote this principal value by $a_k(\mathbf{y},t)$; it is mathematically related to the true phase by:

$$a_k(\mathbf{y},t) = \mathrm{mod}\left(\phi_k(\mathbf{y},t) + \pi,\, 2\pi\right) - \pi$$
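As a quick numerical check of this relation (the sample phase values below are arbitrary), the principal value produced by the mod formula matches the angle of $e^{j\phi}$, which is exactly how $a_k$ is obtained from the real and imaginary parts of the complex image in practice:

```python
import numpy as np

# Arbitrary sample values of the true (unwrapped) phase phi_k, in radians
phi = np.array([-4.0, -1.0, 0.5, 3.5, 7.0])

# Principal value: a_k = mod(phi_k + pi, 2*pi) - pi, confined to [-pi, +pi)
a = np.mod(phi + np.pi, 2 * np.pi) - np.pi

# The same values come from the four-quadrant arctangent of e^{j*phi},
# which is how a_k is computed from I_k's imaginary and real parts
assert np.allclose(a, np.angle(np.exp(1j * phi)))
```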

Either $\phi_k$ or $a_k$ might be called a harmonic phase (HARP) image, but only $a_k$ can be directly calculated and visualized. It is the basis for HARP tracking.
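The processing steps above can be sketched in NumPy on a synthetic tagged image. The image size, tag frequency, and filter radius below are illustrative assumptions, not values from the original papers:

```python
import numpy as np

# --- Synthetic tagged image: underlying image modulated by a sinusoid tag ---
N = 128                                           # image size (assumed)
y1, y2 = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
underlying = np.exp(-((y1 - N/2)**2 + (y2 - N/2)**2) / (2 * 30.0**2))  # toy "anatomy"
w = 2 * np.pi * 8 / N                             # tag fundamental frequency (assumed)
tagged = underlying * (1 + np.cos(w * y2))        # amplitude modulation by the tag

# --- Bandpass filter isolating one spectral peak in the Fourier domain ---
F = np.fft.fftshift(np.fft.fft2(tagged))
u1, u2 = np.meshgrid(np.arange(N) - N // 2, np.arange(N) - N // 2, indexing="ij")
peak = (0, 8)                                     # location of the chosen harmonic peak
radius = 5                                        # filter radius (assumed)
H = ((u1 - peak[0])**2 + (u2 - peak[1])**2) <= radius**2

# --- Inverse transform of the filtered spectrum: complex harmonic image I_k ---
I_k = np.fft.ifft2(np.fft.ifftshift(F * H))
D_k = np.abs(I_k)      # harmonic magnitude image (shows the "anatomy")
a_k = np.angle(I_k)    # wrapped harmonic phase image, in [-pi, +pi)
```

Here `np.angle` performs the inverse-tangent computation, so `a_k` is the wrapped phase; a second filter placed around the vertical-direction peak would give the $k = 2$ image needed for a full 2D motion map.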

HARP tracking

For a fixed material point with a HARP value, only one of the points sharing the same HARP value in a later time frame is the correct match. If the apparent motion is small from one image to the next, it is likely that the nearest of these points is the correct point. The tracking result is very accurate in this case.

Consider a material point located at $\mathbf{y}_m$ at time $t_m$. If $\mathbf{y}_{m+1}$ is the apparent position of this point at time $t_{m+1}$, we have:

$$\phi_k(\mathbf{y}_{m+1}, t_{m+1}) = \phi_k(\mathbf{y}_m, t_m)$$

The Newton–Raphson iterative method is used to find the solution:

$$\mathbf{y}^{(n+1)} = \mathbf{y}^{(n)} - \left[\nabla\phi_k(\mathbf{y}^{(n)}, t_{m+1})\right]^{-1} \left[\phi_k(\mathbf{y}^{(n)}, t_{m+1}) - \phi_k(\mathbf{y}_m, t_m)\right]$$

In practice, since $\phi_k$ is not available, $a_k$ is used in its place. The update equation can be rewritten, after a few derivations, to account for the "wrapping" relation between $\phi_k$ and $a_k$.
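A 1D sketch of this tracking iteration, with the phase difference wrapped into $[-\pi, +\pi)$, is shown below. The tag frequency, displacement, and grid are invented for illustration, and the real method works in 2D with two HARP images; the global `np.unwrap` used for the gradient is a shortcut that is valid for this smooth synthetic example:

```python
import numpy as np

def wrap(x):
    """Wrap an angle (difference) into [-pi, +pi), as in a_k = mod(phi_k + pi, 2pi) - pi."""
    return np.mod(x + np.pi, 2 * np.pi) - np.pi

# Synthetic 1D example (all values assumed): between t_m and t_{m+1} the tag
# pattern phi(y) = w*y moves by d pixels, so phi(y, t_{m+1}) = w*(y - d).
w, d = 2 * np.pi / 16, 3.0
y_grid = np.arange(128, dtype=float)
a_next = wrap(w * (y_grid - d))      # wrapped HARP image a_k at time t_{m+1}

def harp_track(y_m, a_m, a_img, y_grid, iters=10):
    """Newton-Raphson: y <- y - [grad phi]^(-1) (phi(y, t_{m+1}) - phi(y_m, t_m)),
    with a_k in place of phi_k and the phase difference wrapped into [-pi, pi)."""
    phi_img = np.unwrap(a_img)               # unwrapped copy, used for the gradient
    grad = np.gradient(phi_img, y_grid)
    y = y_m
    for _ in range(iters):
        a = np.interp(y, y_grid, phi_img)    # phase at the current estimate
        y = y - wrap(a - a_m) / np.interp(y, y_grid, grad)
    return y

y_m = 41.0
a_m = wrap(w * y_m)                  # HARP value of the material point at t_m
y_next = harp_track(y_m, a_m, a_next, y_grid)
# expected: y_next close to y_m + d = 44.0
```

Wrapping the difference `a - a_m` is what lets the iteration use the measurable $a_k$ in place of the unavailable $\phi_k$, provided the true motion between frames is small enough that the wrapped difference equals the true phase difference.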

The result of HARP tracking of one frame of cardiac MRI is shown in the figure. It is obtained by combining the motions computed from the horizontal and vertical directions, yielding a 2D vector field that shows the motion of every material point on the myocardium at this time frame.

The entire HARP algorithm takes only a few minutes to perform on a normal computer, and the motion tracking result is accurate (with a typical error of about $\pm 1$ pixel). As a result, it is now widely adopted by the medical image analysis community as a standard processing technique for tagged MRI.


Notes and References

  1. Osman, N.F.; McVeigh, E.R.; Prince, J.L. (2000). "Imaging Heart Motion Using Harmonic Phase MRI". IEEE Transactions on Medical Imaging 19(3): 186–202. doi:10.1109/42.845177.
  2. Zerhouni, E.A.; Parish, D.M.; Rogers, W.J.; Yang, A.; Shapiro, E.P. (1988). "Human heart: tagging with MR imaging—a method for noninvasive assessment of myocardial motion". Radiology 169(1): 59–63. doi:10.1148/radiology.169.1.3420283.
  3. Axel, L.; Dougherty, L. (1989). "MR imaging of motion with spatial modulation of magnetization". Radiology 171(3): 841–845. doi:10.1148/radiology.171.3.2717762.
  4. McVeigh, E.R.; Atalar, E. (1992). "Cardiac tagging with breath-hold cine MRI". Magnetic Resonance in Medicine 28(2): 318–327. doi:10.1002/mrm.1910280214.
  5. Fischer, S.E.; McKinnon, G.C.; Maier, S.E.; Boesiger, P. (1993). "Improved myocardial tagging contrast". Magnetic Resonance in Medicine 30(2): 191–200. doi:10.1002/mrm.1910300207.
  6. McVeigh, E.R. (1998). "Regional myocardial function". Cardiology Clinics 16(2): 189–206. doi:10.1016/s0733-8651(05)70008-4.
  7. McVeigh, E.R. (1996). "MRI of myocardial function: motion tracking techniques". Magnetic Resonance Imaging 14(2): 137–150. doi:10.1016/0730-725x(95)02009-i.
  8. Young, A.A.; Axel, L. (1992). "Three-dimensional motion and deformation of the heart wall: estimation with spatial modulation of magnetization—a model-based approach". Radiology 185(2): 241–247. doi:10.1148/radiology.185.1.1523316.
  9. Moore, C.; O'Dell, W.; McVeigh, E.R.; Zerhouni, E. (1992). "Calculation of three-dimensional left ventricular strains from biplanar tagged MR images". Journal of Magnetic Resonance Imaging 2(2): 165–175. doi:10.1002/jmri.1880020209.
  10. Clark, N.R.; Reichek, N.; Bergey, P.; Hoffman, E.A.; Brownson, D.; Palmon, L.; Axel, L. (1991). "Circumferential myocardial shortening in the normal human left ventricle". Circulation 84(1): 67–74. doi:10.1161/01.cir.84.1.67.
  11. McVeigh, E.R.; Zerhouni, E.A. (1991). "Noninvasive measurements of transmural gradients in myocardial strain with MR imaging". Radiology 180(3): 677–683. doi:10.1148/radiology.180.3.1871278.
  12. Budinger, T.F.; Berson, A.; McVeigh, E.R.; Pettigrew, R.I.; Pohost, G.M.; Watson, J.T.; Wickline, S.A. (1998). "Cardiac MR imaging: report of a working group sponsored by the National Heart, Lung, and Blood Institute". Radiology 208(3): 573–576. doi:10.1148/radiology.208.3.9722831.
  13. Osman, N.F.; Kerwin, W.S.; McVeigh, E.R.; Prince, J.L. (1999). "Cardiac Motion Tracking Using CINE Harmonic Phase (HARP) Magnetic Resonance Imaging". Magnetic Resonance in Medicine 42(6): 1048–1060. doi:10.1002/(sici)1522-2594(199912)42:6<1048::aid-mrm9>3.3.co;2-d.
  14. Osman, N.F.; Prince, J.L. (1998). "Motion estimation from tagged MR images using angle images". 704–708.