CT dose optimization and its global impact on stakeholders
September 21, 2016
By Dr. Geoffrey West
CT dose optimization seems to have become the topic du jour in hospitals and imaging centers, at accreditation bodies and in regulators’ offices. Never before have radiologists, technologists, administrators, medical physicists, equipment manufacturers, software makers and so many other related personnel been so focused on ensuring an optimal combination of radiation dose and image quality in CT scanning. While the reasons for dose optimization in CT, and in other modalities as well, are by now well-known to the reader, what may not be as well-known are the impacts of this effort on stakeholders around the country and world. The CT dose optimization effort has affected, and been affected by, all of these groups of individuals and bodies, and some of those effects will be discussed in this article for the purpose of illustrating the current state of CT dose optimization in the U.S.
The evolution of CT dose optimization efforts, recommendations and requirements
It is helpful, when evaluating the current state of CT dose optimization in the U.S., to look back at the evolution of attention and efforts in this area. While the need to maintain radiation doses as low as practicable, given the diagnostic need, has been recognized for many decades, a truly sophisticated treatment of the issue is relatively recent, occurring only in the past decade.
Beginning in 2008, there was a marked increase in popular press articles concerning medical radiation exposures (and overexposures) to patients. While the overexposure cases described in these articles were extreme outliers, and some of the scientific details in other articles were not settled matters of fact, these articles and others like them did rightly focus attention on the issue of CT radiation overexposure. They also led the radiology industry to face certain uncomfortable facts, including: many or most health care facilities did not have robust knowledge, organization and administrative control over their CT protocols; many or most facilities had no idea whether their CT protocols appropriately balanced dose and image quality; and many or most facilities had no idea how their patient CT doses compared to external benchmarks or national averages. Lastly, these articles captured the attention and concern of the public, and with them, elected representatives, regulators and accreditation bodies, all of whom responded in due course.
In 2010, the Image Wisely campaign was launched by the American College of Radiology (ACR), the Radiological Society of North America (RSNA), the American Association of Physicists in Medicine (AAPM) and the American Society of Radiologic Technologists (ASRT) to address concerns about the surge of public exposure to ionizing radiation from medical imaging, and to lower the amount of radiation used in adult imaging and eliminate unnecessary procedures.
This campaign followed and augmented the Image Gently campaign, directed specifically at pediatric radiation protection, which had launched in 2007. Also in 2010, the Food and Drug Administration (FDA) released an “Initiative to Reduce Unnecessary Radiation Exposure from Medical Imaging,” which targeted high-dose imaging modalities, including CT. And California passed SB1237, which required reporting of CT overexposures, recording of patient dose data in the radiology report and accreditation for all CT facilities.
In 2011, the Joint Commission (JC) issued Sentinel Event Alert, Issue 47, “Radiation Risks of Diagnostic Imaging.” In 2012, the AAPM Working Group on Standardization of CT Nomenclature and Protocols (now the Alliance for Quality Computed Tomography, AQCT) began publishing reference CT protocols. In December 2013, the JC released substantially enhanced medical imaging standards in pre-publication form. These standards, after public comment and revision, were implemented on July 1, 2015. In 2013, the National Electrical Manufacturers Association (NEMA) published its XR-29-2013 standard, which by subsequent law (enacted in 2014) became required for full CT reimbursement starting Jan. 1, 2016, with further reimbursement cuts on Jan. 1, 2017.
These milestones and their resulting requirement or guidance documents, in the aggregate, have created an industry (i.e., technical, regulatory and accreditation) “framework” for CT dose optimization activities in U.S. health care facilities. While I expect that framework to be augmented and refined further in the coming years, this current structure has already substantially advanced radiation safety in CT departments nationwide and is continuing to push advancement even today, as described in the following sections of this article.
Impact on health care facilities
Health care facilities performing CT imaging in the U.S. have responded to the public concern, industry recommendations and regulatory and accreditation requirements in a variety of ways. While the approaches are numerous and varied, some common elements of successful programs have emerged. For example, most facilities with successful CT dose optimization programs started these programs by assembling one or more radiology administrators, CT technologists, radiologists and medical physicists into some type of dose optimization planning committee. This committee discusses, deliberates, and through various actions, guides the creation and implementation of a dose optimization program that meets the specific needs of the facility or facilities it serves.
The specific actions resulting from this committee (or for facilities without a committee, from those individuals or groups guiding the CT dose optimization effort) can usually be grouped into the following five categories: new or upgraded CT imaging equipment; dose tracking and analysis software; administrative enhancements; staff training; and/or improved and expanded CT policies and procedures. I will address each of these categories individually, discussing the offerings that have developed and their appropriate application in an imaging department.
New or upgraded CT imaging equipment
Original equipment manufacturers (OEMs) have directed significant resources during the past decade on dose reduction and optimization improvements to their CT scanner hardware and software platforms. The ultimate result is more dose-efficient scanning generally, and more numerous and sophisticated dose-saving options on CT scanners.
These advancements, while welcome from a radiation safety standpoint, in some cases increase the complexity of scanners, necessitating increased CT technologist training, CT protocol review and customization in order to fully realize the benefits of the new technology. As such, while new or upgraded CT imaging equipment may permit a facility to potentially reach lower dose-to-image quality ratios, and may address the aforementioned NEMA requirements for facilities with older equipment, equipment improvements alone cannot produce optimized CT doses, nor can they address the vast majority of new requirements and imperatives in the current dose optimization framework. In some cases, new or upgraded equipment is neither necessary nor particularly helpful in reducing patient doses.
Frequently, better staff training to properly take advantage of CT dose reduction features on existing scanners, combined with protocol optimization, may contribute substantially more to dose reduction efforts than scanner upgrades.
Dose tracking and analysis software
Myriad dose tracking and analysis software systems have been introduced during the past decade. These programs have grown in capabilities, sophistication and complexity, and come in all different forms and at all different price points. While CT patient dose analysis (including identification of high or low outliers, comparison of patient doses over time, comparison of patient doses between scanners, comparison of patient doses between technologists, etc.) does not require dose tracking and analysis software, we have found in our dose optimization consulting practice at West Physics that for all but very small facilities with very limited CT patient volume, it is extremely difficult and time-consuming to perform dose tracking and analysis manually without the benefit of at least basic software. As such, in most quarters of the industry, such software is rapidly becoming a de rigueur part of a highly-functioning CT department’s toolset.
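To make the kind of analysis described above concrete, the sketch below shows one very simple first-pass outlier screen of the sort a dose tracking program might perform. The dose values, protocol and threshold are hypothetical placeholders for illustration only; real programs would pull exam-level dose data from a tracking system or structured dose reports, and a medical physicist would choose the screening method and thresholds.

```python
import statistics

# Hypothetical CTDIvol values (mGy) recorded for a single adult head CT
# protocol. These numbers are illustrative placeholders, not real data.
ctdi_values = [52.1, 48.7, 55.3, 50.9, 49.4, 88.2, 51.6, 47.8, 53.0, 50.2]

def flag_outliers(doses, z_threshold=2.0):
    """Flag exams whose dose deviates from the protocol mean by more than
    z_threshold sample standard deviations (a simple first-pass screen)."""
    mean = statistics.mean(doses)
    stdev = statistics.stdev(doses)
    return [(i, d) for i, d in enumerate(doses)
            if abs(d - mean) / stdev > z_threshold]

for index, dose in flag_outliers(ctdi_values):
    print(f"Exam {index}: CTDIvol {dose} mGy flagged for review")
```

A screen like this only surfaces candidates; each flagged exam still needs human review, since a high dose may be clinically justified (e.g., a large patient or a repeat acquisition).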
While many facilities have now installed dose tracking and analysis software, these same facilities have found that this software can’t just be installed and then left to run unattended. To provide value (i.e., to help optimize doses), the software requires someone with the appropriate education, training and expertise to initially configure it; to update it as equipment, personnel, practices and needs change; and to analyze, report, interpret and act upon the data it produces. Many facilities have found that their diagnostic medical physicist is the best person to perform this function, as this person has both radiation and medical imaging equipment expertise, understands the dose values output by CT scanners and is usually well-trained in data analysis and statistics.
The diagnostic medical physicist is also not typically responsible for time-sensitive, patient-centered tasks throughout the day in the same manner as a technologist or radiologist, and thus is able to more easily structure the longer periods of work time that these types of analytical activities require. Medical physicists are not necessarily qualified or prepared to help manage a successful CT dose optimization program just by virtue of being physicists.
Facilities are learning that the designated physicist needs to have additional training and experience, specifically in CT dose optimization, and detailed knowledge of the dose reduction features on the facility’s CT scanners to ensure success. Facilities that utilize contracted physicists have found that there are fundamental differences between the consultative work required by a physicist as part of the CT dose optimization program, and the episodic work that most contracted physicists normally do (e.g., periodic equipment testing, shielding plans, fetal dose calculations, etc.).
If the physicist or the physics firm is not set up for, and not experienced with, client engagements involving frequent and ongoing communication, this can lead to problems. The most successful facilities discuss this issue openly with their current medical physics provider and accurately evaluate their provider’s ability to deliver these much more sophisticated and interactive services. Facilities with in-house physics support generally do not face this problem.
However, all facilities, regardless of whether they use in-house or contracted medical physicists, need to plan for the considerable increase in time that proper configuration and oversight of dose tracking and analysis software entails.
Administrative enhancements

Facilities pursuing an effective CT dose optimization program have found, or are finding, that numerous administrative actions and programmatic enhancements are needed for success. Among the myriad action items, facilities assign various individual responsibilities relative to dose optimization (including who will be responsible for patient dose analysis and reporting); establish a standing CT dose optimization committee to review and approve protocols (and establish a meeting schedule); arrange IT resources for any dose tracking and analysis software support; update HR policies and procedures to reflect additional staff training and documentation responsibilities; develop and implement mechanisms for external benchmarking of patient doses; and decide upon and institutionalize appropriate methods and frequencies of communicating dose improvements to internal and external stakeholders.
This is only a very small sample of the full gamut of administrative enhancements and changes that facilities have been, or will be, making as CT dose optimization programs are initiated and evolve. A full treatment of the administrative changes involved in CT dose optimization programs is beyond the scope of this article, but suffice it to say that administrative actions are a critical part of the overall success of these programs.
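As one small illustration of the external-benchmarking action mentioned above, a facility might compare its median protocol doses against published reference levels. The sketch below uses hypothetical protocol names, dose values and benchmark numbers purely as placeholders; an actual program would use current published diagnostic reference levels and registry comparisons selected by its physicist.

```python
import statistics

# Hypothetical facility dose data and benchmark values. All numbers are
# placeholders, not published reference levels.
facility_doses = {
    "adult_head": [52.1, 48.7, 55.3, 50.9, 49.4],      # CTDIvol, mGy
    "adult_abdomen": [14.2, 17.8, 15.1, 16.4, 13.9],   # CTDIvol, mGy
}
benchmark_drl = {"adult_head": 57.0, "adult_abdomen": 17.0}  # hypothetical

for protocol, doses in facility_doses.items():
    median = statistics.median(doses)
    status = ("within benchmark" if median <= benchmark_drl[protocol]
              else "exceeds benchmark")
    print(f"{protocol}: median CTDIvol {median} mGy ({status})")
```

Even this trivial comparison surfaces the key design questions the committee must settle: which dose metric to track, which external benchmark to adopt and how often the comparison is refreshed and reported.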
Staff training

There are a variety of staff training opportunities that arise as part of CT dose optimization efforts. Dose reduction training for technologists and radiologists is a particularly important and necessary part of any successful CT dose optimization effort. Facilities have begun finding sources for this training, and instituting the requirements and tracking processes to ensure that it is accomplished. Radiology administrators are also seeking higher-level, dose program-related training at industry meetings, through their existing facility vendors or individually online. The medical physics community has significantly expanded its own internal training resources (and increasingly, its expectations for competence) in the area of CT dose optimization over the past decade. Training in this area is still evolving rapidly, and knowledge of CT dose-related technical topics varies substantially from facility to facility, and even from individual to individual within a facility.
More quality content for technologists and radiologists in this area is emerging every year, and some consulting medical physics firms (including my own) now offer robust CT dose optimization training along with their other CT dose optimization services.
Improved and expanded CT policies and procedures
Putting clear, specific and reasonable policies and procedures in place to codify the CT dose reduction program is an essential part of the successful CT dose optimization program. This action is a critically important one that creates a lasting structure at the facility or facilities, which enables the CT dose optimization program to continue working into the future despite staff turnover. Facilities executing successful CT dose initiatives spend time reviewing their CT policies and procedures, and put a high level of attention into the details of workflow, process, roles and responsibilities.
In some cases, facilities have found that bringing in a third-party operations/technical consultant can be helpful in this particular task, as an outsider may have seen things that work well in similar facilities, or be able to identify pitfalls from previous experience. No matter how the institution approaches this process, it tends to take time and some degree of iteration and experimentation before the process is complete. In the minority of facilities that have completed this action, it has typically (and, I think, rightly) been done near the end of the implementation phase of the dose optimization program, as creation of high-quality policies and procedures is more likely when staff are fully trained and familiar with the processes, and many details of the program have already been worked out.
The state of CT dose optimization in the U.S. has evolved substantially in the past decade, prompted by greater public and regulatory concern as well as by enhanced health care industry awareness. Administrators, technologists, radiologists and medical physicists are all more cognizant of the need to reduce patient doses while maintaining acceptable image quality. CT equipment manufacturers have improved their hardware and software to enhance dose efficiency and radiation safety. Software vendors have created sophisticated patient dose tracking and analysis products and services.
And many health care facilities have started to build the administrative and technical framework necessary for a successful and ongoing CT dose optimization program. We are still in the early days of this effort. Many CT imaging facilities still have no formal CT dose optimization program, or their program is not yet fully mature, in the sense that not all of the following are yet true: all CT staff are properly trained on dose reduction techniques; CT policies and procedures define and describe the operation of the CT dose optimization program; all CT protocols have been dose-optimized; appropriate CT protocol controls are in place; patient CT doses are analyzed to identify outliers, spot trends and compare to external benchmarks; and patient CT doses are reported to stakeholders appropriately and fed back into the dose optimization program, resulting in continual improvement to protocols and processes.
Much more work remains to be done industry-wide in this effort, but if the progress of the past decade is any indication, it will not be long before we look back on the days before dose optimization in the same way that we now look back on the days before routine quality assurance.
About the author: Geoffrey West, Ph.D., DABR, CHP, is the founder, president and chief medical physicist of West Physics, a world leader in providing medical physics and health physics consulting and testing services.