Fluoroscopy
Synonyms: fluorography, cinefluorography, photofluorography
ICD-10: B?1
Fluoroscopy, informally referred to as "fluoro", is an imaging technique that uses X-rays to obtain real-time moving images of the interior of an object. In its primary application of medical imaging, a fluoroscope allows a surgeon to see the internal structure and function of a patient, so that the pumping action of the heart or the motion of swallowing, for example, can be watched. This is useful for both diagnosis and therapy and occurs in general radiology, interventional radiology, and image-guided surgery.
In its simplest form, a fluoroscope consists of an X-ray source and a fluorescent screen, between which a patient is placed. However, since the 1950s most fluoroscopes have included X-ray image intensifiers and cameras as well, to improve the image's visibility and make it available on a remote display screen. For many decades, fluoroscopy tended to produce live pictures that were not recorded, but since the 1960s, as technology improved, recording and playback became the norm.
Fluoroscopy is similar to radiography and X-ray computed tomography (X-ray CT) in that it generates images using X-rays. The original difference was that radiography fixed still images on film, whereas fluoroscopy provided live moving pictures that were not stored. However, modern radiography, CT, and fluoroscopy all use digital imaging with image analysis software and data storage and retrieval. Compared with other X-ray imaging modalities, the fluoroscopic source typically projects from below the patient, producing horizontally mirrored images, and in keeping with historical displays the grayscale remains inverted: radiodense objects such as bones appear dark, whereas on radiographs they would traditionally appear bright.
Although visible light can be seen by the naked eye (and thus forms images that people can look at), it does not penetrate most objects (only translucent or transparent ones). In contrast, X-rays can penetrate a wider variety of objects (such as the human body), but they are invisible to the naked eye. To take advantage of the penetration for image-forming purposes, one must somehow convert the X-rays' intensity variations (which correspond to material contrast and thus image contrast) into a form that is visible. Classic film-based radiography achieves this by the variable chemical changes that the X-rays induce in the film, and classic fluoroscopy achieves it by fluorescence, in which certain materials convert X-ray energy (or other parts of the spectrum) into visible light. This use of fluorescent materials to make a viewing scope is how fluoroscopy got its name.
As the X-rays pass through the patient, they are attenuated by varying amounts as they pass through or are scattered by the different tissues of the body, casting an X-ray shadow of the radiopaque tissues (such as bone tissue) on the fluorescent screen. Images on the screen are produced as the unattenuated or mildly attenuated X-rays from radiolucent tissues interact with atoms in the screen through the photoelectric effect, giving their energy to the electrons. While much of the energy given to the electrons is dissipated as heat, a fraction of it is given off as visible light.
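This attenuation follows the Beer-Lambert law: the transmitted intensity is I = I0·exp(−μx), where μ is the tissue's linear attenuation coefficient and x the thickness traversed. The minimal Python sketch below illustrates why bone casts a deeper shadow than soft tissue; the coefficient values are rough illustrative assumptions, not clinical data.

```python
import math

# Beer-Lambert law: transmitted intensity I = I0 * exp(-mu * x), where
# mu is the linear attenuation coefficient (1/cm) and x the thickness (cm).
# The coefficients below are order-of-magnitude illustrative values at a
# typical diagnostic energy, assumed for demonstration only.
TISSUES = {
    "soft tissue": 0.2,   # 1/cm (assumed illustrative value)
    "bone":        0.5,   # 1/cm (assumed illustrative value)
}

def transmitted_fraction(mu_per_cm: float, thickness_cm: float) -> float:
    """Fraction of incident X-ray intensity reaching the screen."""
    return math.exp(-mu_per_cm * thickness_cm)

for name, mu in TISSUES.items():
    frac = transmitted_fraction(mu, thickness_cm=5.0)
    print(f"{name:11s}: {frac:.1%} of X-rays transmitted through 5 cm")
# Bone transmits far less, so it casts a darker shadow on the
# fluorescent screen (less light is produced behind it).
```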
Early radiologists would adapt their eyes to view the dim fluoroscopic images by sitting in darkened rooms, or by wearing red adaptation goggles. After the development of X-ray image intensifiers, the images were bright enough to see without goggles under normal ambient light.[1] Image intensifiers remain in use to this day (2023); many new models still acquire the image with an image intensifier (II), which remains popular because of its lower cost compared with flat panel detectors. There has been much debate about whether the II or the flat panel detector is more sensitive to X-rays, and thus permits a lower X-ray dose; the answer depends greatly on the type of technology and panel being used.
Nowadays, in all forms of digital X-ray imaging (radiography, fluoroscopy, and CT) the conversion of X-ray energy into visible light can be achieved by the same types of electronic sensors, such as flat panel detectors, which convert the X-ray energy into electrical signals: small bursts of electric current that convey information that a computer can analyze, store, and output as images. As fluorescence is a special case of luminescence, digital X-ray imaging is conceptually similar to digital gamma ray imaging (scintigraphy, SPECT, and PET) in that in both of these imaging mode families, the information conveyed by the variable attenuation of invisible electromagnetic radiation as it passes through tissues with various radiodensities is converted by an electronic sensor into an electric signal that is processed by a computer and output as a visible-light image.
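As a rough illustration of the digital end of this chain, the sketch below maps simulated flat-panel detector signals to 8-bit display gray levels with a simple window/level operation; the array values and window settings are invented for demonstration and do not come from any particular device.

```python
import numpy as np

# Simulated flat-panel detector readout: each element stands for the
# electrical signal produced by the X-ray energy absorbed in one pixel.
# The values are invented purely for illustration.
signal = np.array([[120, 480, 950],
                   [110, 460, 900],
                   [130, 500, 980]], dtype=float)

def window_level(img: np.ndarray, level: float, width: float) -> np.ndarray:
    """Map raw detector signal to 8-bit display gray levels."""
    lo, hi = level - width / 2, level + width / 2
    clipped = np.clip(img, lo, hi)
    return ((clipped - lo) / (hi - lo) * 255).astype(np.uint8)

display = window_level(signal, level=500, width=900)
print(display)
# Here more transmitted X-rays map to a brighter pixel, matching the
# traditional fluoroscopic display in which bones appear dark; the
# radiographic convention inverts this grayscale.
```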
Fluoroscopy has become an important tool in medical imaging for producing moving pictures during surgery and other procedures.
Fluoroscopy is used in various types of surgical procedure, such as orthopaedic surgery and podiatric surgery. In both of these, it is used to guide fracture reduction and is used in certain procedures that involve extensive hardware.[2]
In urology, fluoroscopy is used in retrograde pyelography and micturating cystourethrography to detect various abnormalities related to the urinary system.[3]
In cardiology, fluoroscopy is used for diagnostic angiography, percutaneous coronary interventions, and the implantation of cardiac devices (pacemakers, implantable cardioverter defibrillators, and cardiac resynchronization devices).[4]
Fluoroscopy can be used to examine the digestive system using a substance that is opaque to X-rays (usually barium sulfate or gastrografin), which is introduced into the digestive system either by swallowing or as an enema. This is normally as part of a double-contrast technique, using positive and negative contrast. Barium sulfate coats the walls of the digestive tract (positive contrast), which allows the shape of the digestive tract to be outlined as white or clear on an X-ray. Air may then be introduced (negative contrast), which looks black on the film. The barium meal is an example of a contrast agent swallowed to examine the upper digestive tract. While soluble barium compounds are very toxic, the insoluble barium sulfate is nontoxic because its low solubility prevents the body from absorbing it. Investigations of the gastrointestinal tract include barium enemas, defecating proctograms, barium meals and swallows, and enteroclysis.[5]
Fluoroscopy is also used in airport security scanners to check for hidden weapons or bombs. These machines use lower doses of radiation than medical fluoroscopy.[8] Medical applications require higher doses because they are more demanding about tissue contrast, and for the same reason they sometimes require contrast media.
Fluoroscopy's origins and radiography's origins can both be traced back to 8 November 1895, when Wilhelm Röntgen (anglicized as Roentgen) noticed a barium platinocyanide screen fluorescing as a result of being exposed to what he would later call X-rays (the algebraic variable x signifying "unknown"). Within months of this discovery, the first crude fluoroscopes were created. These experimental fluoroscopes were simply thin cardboard screens that had been coated on the inside with a layer of fluorescent metal salt and attached to a funnel-shaped cardboard eyeshade, which excluded room light and had a viewing eyepiece that the user held up to his eye. The fluoroscopic image obtained in this way was quite faint. Even when finally improved and commercially introduced for diagnostic imaging, the limited light produced by the fluorescent screens of the earliest commercial scopes required that a radiologist first sit for a period in the darkened room where the imaging procedure was to be performed, to accustom his eyes and increase their sensitivity to the faint image. The placement of the radiologist behind the screen also resulted in significant dosing of the radiologist.
In the late 1890s, Thomas Edison began investigating materials for their ability to fluoresce when X-rayed, and by the turn of the century he had invented a fluoroscope with sufficient image intensity to be commercialized. Edison had quickly discovered that calcium tungstate screens produced brighter images. He abandoned his research in 1903, however, because of the health hazards that accompanied the use of these early devices. Clarence Dally, a glassblower who made lab equipment and tubes at Edison's laboratory, was repeatedly exposed, developed radiation poisoning, and later died of an aggressive cancer. Edison himself damaged an eye in testing these early fluoroscopes.[9]
During this infant commercial development, many incorrectly predicted that the moving images of fluoroscopy would completely replace roentgenographs (radiographic still-image films), but the then-superior diagnostic quality of the roentgenograph, together with the already mentioned safety advantage of a lower radiation dose through shorter exposure, prevented this from occurring. Another factor was that plain films inherently offered recording of the image in a simple and inexpensive way, whereas recording and playback of fluoroscopy remained a more complex and expensive proposition for decades to come (discussed in detail below).
Red adaptation goggles were developed by Wilhelm Trendelenburg in 1916 to address the problem of dark adaptation of the eyes, previously studied by Antoine Béclère. The red light passed by the goggles' filters allowed the physician's eyes to become dark-adapted before the procedure, while still letting through enough light for him to function normally.
See main article: Shoe-fitting fluoroscope. More trivial uses of the technology emerged in the early 1920s, including a shoe-fitting fluoroscope that was used at shoe stores and department stores.[10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20] [21] [22] [23] Concerns regarding the impact of frequent or poorly controlled use were expressed in the late 1940s and 1950s. Issues raised by doctors and health professionals included the potential for burns to the skin, damage to bone, and abnormal development of the feet.[24] [25] [26] [27] [28] These concerns led to the development of new guidelines,[29] [30] [31] regulations[32] [33] [34] and ultimately the practice's end by the early 1960s.[35] [36] [37] [38] [39] [40] [41] Shoe salesmen and industry representatives sometimes defended their use, claiming that there was no evidence of harm and that the devices prevented harm to the feet caused by poorly fitted shoes.[42]
Fluoroscopy was discontinued in shoe-fitting because the radiation exposure risk outweighed the trivial benefit.
Analog electronics revolutionized fluoroscopy. The development of the X-ray image intensifier by Westinghouse in the late 1940s[43] in combination with closed circuit TV cameras of the 1950s allowed for brighter pictures and better radiation protection. The red adaptation goggles became obsolete as image intensifiers allowed the light produced by the fluorescent screen to be amplified and made visible in a lighted room. The addition of the camera enabled viewing of the image on a monitor, allowing a radiologist to view the images in a separate room away from the risk of radiation exposure. The commercialization of video tape recorders beginning in 1956 allowed the TV images to be recorded and played back at will.
Digital electronics were applied to fluoroscopy beginning in the early 1960s, when Frederick G. Weighart[44] [45] and James F. McNulty[46] (1929–2014) at Automation Industries, Inc., then in El Segundo, California, produced on a fluoroscope the world's first image to be digitally generated in real time, while developing a later-commercialized portable apparatus for the onboard nondestructive testing of naval aircraft. Square wave signals were detected on a fluorescent screen to create the image.
From the late 1980s onward, digital imaging technology was reintroduced to fluoroscopy after the development of improved detector systems. Modern improvements in screen phosphors, digital image processing, image analysis, and flat panel detectors have allowed for increased image quality while minimizing the radiation dose to the patient. Modern fluoroscopes use caesium iodide (CsI) screens and produce noise-limited images, ensuring that the minimum radiation dose is used while images of acceptable quality are still obtained.
Many names exist in the medical literature for moving pictures taken with X-rays. They include fluoroscopy, fluorography, cinefluorography, photofluorography, fluororadiography, kymography (electrokymography, roentgenkymography), cineradiography (cine), videofluorography, and videofluoroscopy. Today, the word "fluoroscopy" is widely understood to be a hypernym of all the aforementioned terms, which explains why it is the most commonly used and why the others are declining in usage. The profusion of names is an idiomatic artifact of technological change, as follows:
As soon as X-rays (and their application of seeing inside the body) were discovered in the 1890s, both looking and recording were pursued. Both live moving images and recorded still images were available from the beginning with simple equipment; thus, both "looking with a fluorescent screen" (fluoro- + -scopy) and "recording/engraving with radiation" (radio- + -graphy) were immediately named with Neo-Latin words—both words are attested since 1896.
The quest for recorded moving images, though, was a more complex challenge. In the 1890s, moving pictures of any kind (whether taken with visible light or with invisible radiation) were emerging technologies. Because the word "photography" (literally "recording/engraving with light") was long since established as connoting a still-image medium, the word "cinematography" (literally "recording/engraving movement") was coined for the new medium of visible-light moving pictures. Soon, several new words were coined for achieving moving radiographic pictures. This was often done either by filming a simple fluoroscopic screen with a movie camera (variously called fluorography, cinefluorography, photofluorography, or fluororadiography) or by taking serial radiographs rapidly to serve as the frames in a movie (cineradiography). Either way, the resulting film reel could be displayed by a movie projector. Another group of techniques included various kinds of kymography, whose common theme was capturing recordings in a series of moments, with a concept similar to movie film, although not necessarily with movie-type playback; rather, the sequential images would be compared frame by frame (a distinction comparable to tile mode versus cine mode in today's CT terminology). Thus, electrokymography and roentgenkymography were among the early ways to record images from a simple fluoroscopic screen.
Television also was under early development during these decades (1890s–1920s), but even after commercial TV began widespread adoption after World War II, it remained a live-only medium for a time. In the mid-1950s, a commercialized ability to capture the moving pictures of television onto magnetic tape (with a video tape recorder) was developed. This soon led to the addition of the "video-" prefix to the words fluorography and fluoroscopy, with the words videofluorography and videofluoroscopy attested since 1960. In the 1970s, videotape moved from TV studios and medical imaging into the consumer market with home video via VHS and Betamax, and those formats were also incorporated into medical video equipment.
Thus, over time the cameras and recording media for fluoroscopic imaging have progressed: The original kind of fluoroscopy, and the common kind for its first half-century of existence, simply used none, because for most diagnosis and treatment, they were not essential. For those investigations that needed to be transmitted or recorded (such as for training or research), movie cameras using film (such as 16 mm film) were the medium. In the 1950s, analog electronic video cameras (at first only producing live output, but later using video tape recorders) appeared. Since the 1990s, digital video cameras, flat panel detectors, and storage of data to local servers or (more recently) secure cloud servers have been used. Late-model fluoroscopes all use digital image processing and image analysis software, which not only helps to produce optimal image clarity and contrast, but also allows that result with a minimal radiation dose (because signal processing can take tiny inputs from low radiation doses and amplify them while to some extent also differentiating signal from noise).
Whereas the word "cine" in general usage refers to cinema (that is, a movie) or to certain film formats (cine film) for recording such a movie, in medical usage it refers to cineradiography or, in recent decades, to any digital imaging mode that produces cine-like moving images (for example, newer CT and MRI systems can put out to either cine mode or tile mode). Cineradiography records 30-frame/second fluoroscopic images of internal organs such as the heart taken during injection of contrast dye to better visualize regions of stenosis, or to record motility in the body's gastrointestinal tract. The predigital technology is being replaced with digital imaging systems. Some of these decrease the frame rate, but also decrease the absorbed dose of radiation to the patient. As they improve, frame rates will likely increase.
Today, owing to technological convergence, the word "fluoroscopy" is widely understood to be a hypernym of all the earlier names for moving pictures taken with X-rays, both live and recorded. Also owing to technological convergence, radiography, CT, and fluoroscopy are now all digital imaging modes using X-rays with image-analysis software and easy data storage and retrieval. Just as movies, TV, and web videos are to a substantive extent no longer separate technologies, but only variations on common underlying digital themes, so, too, are the X-ray imaging modes, and indeed, the term "X-ray imaging" is the ultimate hypernym that unites all of them, even subsuming both fluoroscopy and four-dimensional CT (4DCT), which is the newest form of moving pictures taken with X-rays. Many decades may pass before the earlier hyponyms fall into disuse, not least because the day when 4DCT displaces all earlier forms of moving X-ray imaging may yet be distant.
The use of X-rays, a form of ionizing radiation, requires the potential risks from a procedure to be carefully balanced with the benefits of the procedure to the patient. Because the patient must be exposed to a continuous source of X-rays instead of a momentary pulse, a fluoroscopy procedure generally subjects a patient to a higher absorbed dose of radiation than an ordinary (still) radiograph. Only important applications such as health care, bodily safety, food safety, nondestructive testing, and scientific research meet the risk-benefit threshold for use. In the first half of the 20th century, shoe-fitting fluoroscopes were used in shoe stores, but their use was discontinued because it is no longer considered acceptable to use radiation exposure, however small the dose, for nonessential purposes. Much research has been directed toward reducing radiation exposure, and recent advances in fluoroscopy technology, such as digital image processing and flat panel detectors, have resulted in much lower radiation doses than in former procedures.
Because fluoroscopy involves the use of X-rays, a form of ionizing radiation, fluoroscopic procedures pose a potential for increasing the patient's risk of radiation-induced cancer. In addition to the cancer risk and other stochastic radiation effects, deterministic radiation effects have also been observed, ranging from mild erythema, equivalent to a sunburn, to more serious burns.[47] Radiation doses to the patient depend greatly on both the size of the patient and the length of the procedure, with typical skin dose rates quoted as 20–50 mGy/min.[48] Exposure times vary depending on the procedure being performed, ranging from minutes to hours.[48]
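Putting those figures together, cumulative skin dose is roughly the dose rate multiplied by the beam-on time. The sketch below works the arithmetic for the quoted 20–50 mGy/min range; it is a deliberate simplification that ignores, for example, moving the beam entry site to spread the dose over different patches of skin.

```python
# Rough cumulative skin dose: dose ~ dose_rate * beam-on time.
# This simplification assumes the beam stays on one skin patch.
for rate_mgy_per_min in (20, 50):      # quoted typical dose-rate range
    for minutes in (10, 60):           # illustrative beam-on times
        dose_mgy = rate_mgy_per_min * minutes
        print(f"{rate_mgy_per_min} mGy/min x {minutes} min = {dose_mgy} mGy "
              f"({dose_mgy / 1000:.1f} Gy)")
# An hour at 50 mGy/min delivers ~3 Gy to the same skin patch, which is
# in the range where deterministic effects such as erythema can appear.
```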
A study of radiation-induced skin injuries was performed in 1994 by the U.S. Food and Drug Administration (FDA)[49] [50] followed by an advisory to minimize further fluoroscopy-induced injuries.[51] The problem of radiation injuries due to fluoroscopy has been further addressed in review articles in 2000[52] and 2010.[53]
While deterministic radiation effects are a possibility, radiation burns are not typical in standard fluoroscopic procedures. Most procedures sufficiently long in duration to produce radiation burns are part of necessary life-saving operations.
X-ray image intensifiers generally have radiation-reducing systems such as pulsed rather than constant radiation, along with "last image hold", which "freezes" the screen and makes it available for examination without exposing the patient to unnecessary radiation.[54]
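To a first approximation, the dose saving from pulsed operation scales with the duty cycle (the fraction of time the beam is actually on). The short sketch below illustrates this; the pulse rate, pulse width, and continuous-mode dose rate are illustrative assumptions, and real systems may also adjust the dose per pulse.

```python
# Pulsed fluoroscopy: the beam is on only during short pulses rather
# than continuously, so dose scales roughly with the duty cycle
# (assuming the same tube output per unit of beam-on time).
continuous_rate_mgy_per_min = 30.0   # assumed continuous-mode skin dose rate

def pulsed_dose_rate(pulses_per_s: float, pulse_width_ms: float) -> float:
    duty_cycle = pulses_per_s * (pulse_width_ms / 1000.0)  # fraction of time on
    return continuous_rate_mgy_per_min * duty_cycle

for pps in (30, 15, 7.5):            # illustrative pulse rates
    rate = pulsed_dose_rate(pps, pulse_width_ms=10.0)
    print(f"{pps:>4} pulses/s -> ~{rate:.1f} mGy/min")
# Halving the pulse rate roughly halves the dose at the cost of a
# choppier image; "last image hold" then allows examination of the
# frozen frame with no exposure at all.
```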
Image intensifiers have been introduced that increase the brightness of the screen, so that the patient can be exposed to a lower dose of X-rays.[55] While this reduces the risk from ionizing radiation, it does not remove it entirely.
See main article: X-ray image intensifier.
The invention of X-ray image intensifiers in the 1950s allowed the image on the screen to be visible under normal lighting conditions, and provided the option of recording the images with a conventional camera. Subsequent improvements included the coupling of, at first, video cameras, and later, digital cameras using image sensors such as charge-coupled devices or active pixel sensors to permit recording of moving images and electronic storage of still images.[56]
Modern image intensifiers no longer use a separate fluorescent screen. Instead, a caesium iodide phosphor is deposited directly on the photocathode of the intensifier tube. On a typical general-purpose system, the output image is approximately 10⁵ times brighter than the input image. This brightness gain comprises a flux gain (amplification of photon number) and minification gain (concentration of photons from a large input screen onto a small output screen), each of about 100. This level of gain is sufficient that quantum noise, due to the limited number of X-ray photons, is a significant factor limiting image quality.
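The gain arithmetic can be made concrete: minification gain is the ratio of input to output screen areas, (d_in/d_out)², and multiplies with the flux gain. The sketch below uses illustrative, assumed diameters; note that with both factors at exactly 100 the product is 10⁴, and totals nearer the ~10⁵ quoted above follow when the flux gain is several hundred.

```python
# Brightness gain of an X-ray image intensifier:
#   total gain = flux gain * minification gain
#   minification gain = (input diameter / output diameter)^2,
# because the same light is concentrated onto a smaller output screen.
# The diameters below are illustrative assumptions; actual tubes vary.
flux_gain = 100            # photon-number amplification (~100 per the text)
d_input_cm = 25.0          # assumed input screen diameter
d_output_cm = 2.5          # assumed output screen diameter

minification_gain = (d_input_cm / d_output_cm) ** 2   # = 100
total_gain = flux_gain * minification_gain
print(f"minification gain = {minification_gain:.0f}")
print(f"total brightness gain ~ {total_gain:.0f}x")   # 10000 with these values
```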
Within the XRII, five main components make up the intensifier: the input phosphor, the photocathode, the electrostatic focusing electrodes, the accelerating anode, and the output phosphor.
Image intensifiers are available with input diameters up to 45 cm, and a resolution of around two to three line pairs/mm.
See main article: Flat panel detector. The introduction of flat-panel detectors allows for the replacement of the image intensifier in fluoroscope design. Flat-panel detectors offer increased sensitivity to X-rays, so have the potential to reduce patient radiation dose. Temporal resolution is also improved over image intensifiers, reducing motion blurring. Contrast ratio is also improved over image intensifiers; flat-panel detectors are linear over a very wide latitude, whereas image intensifiers have a maximum contrast ratio of about 35:1. Spatial resolution is roughly equal, although an image intensifier operating in magnification mode may be slightly better than a flat panel.
Flat-panel detectors are considerably more expensive to purchase and repair than image intensifiers, so their adoption is primarily in specialties that require high-speed imaging, e.g., vascular imaging and cardiac catheterization.
A number of substances have been used as radiocontrast agents, including silver, bismuth, caesium, thorium, tin, zirconium, tantalum, tungsten, and lanthanide compounds. The use of thoria (thorium dioxide) as an agent was rapidly stopped, as thorium causes liver cancer.[59]
Most modern injected radiographic positive contrast media are iodine-based. Iodinated contrast comes in two forms: ionic and nonionic compounds. Nonionic contrast is significantly more expensive than ionic (about three to five times the cost), but nonionic contrast tends to be safer for the patient, causing fewer allergic reactions and fewer uncomfortable side effects such as hot sensations or flushing. Most imaging centers now use nonionic contrast exclusively, finding that the benefits to patients outweigh the expense.
Negative radiographic contrast agents are air and carbon dioxide (CO2). The latter is easily absorbed by the body and causes less spasm. It can also be injected into the blood, where air cannot be used because of the risk of an air embolism.
In addition to spatial blurring factors that plague all X-ray imaging devices, caused by such things as Lubberts effect, K-fluorescence reabsorption, and electron range, fluoroscopic systems also experience temporal blurring due to system latency. This temporal blurring has the effect of averaging frames together. While this helps reduce noise in images with stationary objects, it creates motion blurring for moving objects. Temporal blurring also complicates measurements of system performance for fluoroscopic systems.
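A common simple model of this behavior is a recursive (first-order) temporal filter that blends each new frame into a running display image. The Python sketch below, with an assumed weighting factor and synthetic frames, shows how the same averaging that suppresses noise also leaves a trail behind a moving object.

```python
import numpy as np

rng = np.random.default_rng(0)

# Recursive temporal filter, a simple model of fluoroscopic system lag:
#   display = k * new_frame + (1 - k) * previous_display
# Smaller k means more averaging: less noise, but more motion blur.
k = 0.3  # assumed weighting factor, chosen for illustration

def run_filter(frames, k):
    display = frames[0].astype(float)
    for frame in frames[1:]:
        display = k * frame + (1 - k) * display
    return display

# Synthetic sequence: a bright object moves one pixel per frame
# across a noisy background.
frames = []
for t in range(8):
    frame = rng.normal(100, 10, size=(1, 12))  # noisy background
    frame[0, t + 2] += 200                     # moving bright object
    frames.append(frame)

out = run_filter(frames, k)
print(np.round(out).astype(int))
# The background noise is smoothed, but the object leaves a decaying
# trail behind its current position: motion blur from frame averaging.
```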