Patient Care and Research

Case Study 04

SRS Heterogeneity Correction and Commissioning


The RO-HAC discussed this event with the RO-ILS practice and appreciates their time and willingness to share more information in a confidential and protected environment. This excerpted and non-identifiable account of the event is being shared with the practice’s permission in order to educate the radiation oncology community.

Case Example


A qualified medical physicist commissioned a replacement intracranial stereotactic radiosurgery (SRS) program that included new hardware and software. The practice assumed that heterogeneity corrections were accounted for in the new treatment planning system (TPS) and included the base plate and accessories when contouring to account for their density in calculations. The result was a long physical pathlength but a relatively short effective pathlength used in the dose calculations. However, the new planning software did not apply heterogeneity corrections, as indicated in the vendor-supplied software manual. This incorrect assumption regarding heterogeneity correction resulted in a ~10% difference between the delivered dose and the planned dose for patients treated before the miscalculation was discovered. While the treatments were within acceptable national SRS guidelines, this difference was more than intended.
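The distinction between physical and effective (radiological) pathlength can be sketched numerically. The following is a minimal illustration with hypothetical segment lengths and relative electron densities, not the practice's actual geometry or the TPS vendor's algorithm:

```python
# Illustrative sketch: physical vs. effective (radiological) pathlength.
# Segment lengths (cm) and relative electron densities are hypothetical.

def effective_pathlength(segments):
    """Sum of physical length x relative electron density for each segment."""
    return sum(length * density for length, density in segments)

# Hypothetical beam path: dense accessory material, a low-density gap, then tissue.
segments = [
    (2.0, 1.2),    # baseplate/accessory material, denser than water
    (3.0, 0.001),  # air gap within the immobilization assembly
    (8.0, 1.0),    # water-equivalent tissue
]

physical = sum(length for length, _ in segments)   # 13.0 cm
effective = effective_pathlength(segments)         # ~10.4 cm

# With heterogeneity corrections OFF, the TPS treats the entire physical path
# as water-equivalent, so low-density regions contribute full attenuation.
print(physical, round(effective, 3))
```

Because the low-density gap contributes almost nothing to the effective pathlength, the effective path is substantially shorter than the physical path, which is the situation the practice assumed the TPS was modeling.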


The practice purchased a new linac and, in the process, decided to upgrade their existing SRS program. The immobilization system for this new SRS program included a substantial base plate and accessories, and so the physicist decided to contour them so they were included in the dose calculations. The physicist assumed that heterogeneity corrections were applied in the new SRS planning software because it had been in their previous SRS TPS from a different vendor. The vendor manual for the new software specifically stated that heterogeneity corrections are not used and not to include the baseplate and accessories in the contours. However, this information was unknown to the commissioning physicist.

Plan information was displayed in a different system, which indicated that doses were calculated according to the standard algorithm. Multiple staff members assumed this meant heterogeneity corrections were included, since the standard algorithm for this system includes heterogeneity corrections.

An independent monitor unit check program from a different vendor failed to identify this error. This software assigns water density to any contour outside of the designated patient surface and therefore reproduced the original error. While this feature of the second-check software is listed in the manual, it was not understood by some staff.

Independent end-to-end testing was performed using a phantom (i.e., without the baseplate and accessories) and so it did not identify the dose calculation error.

An additional, separate error occurred during commissioning. The absolute dose calibration parameters and output factors entered by the physicist were also incorrect.

A different physicist, aware that the independent monitor unit check software treats material outside the designated patient surface as bolus material, specifically removed the baseplate and accessories contours when performing a second check. This led to the discovery of the dose calculation error. An independent audit was initiated, and the heterogeneity calculation error was identified first. Based on that error alone, the practice estimated that the difference between planned and delivered dose for patients treated before discovery of the error was ~15%. However, when factoring in the second error of incorrect calibration parameters and output factors, the dose deviation was ~10%. In addition, arcs were being used instead of static beams, which further reduced the magnitude of the dose calculation errors.

The practice has since re-commissioned the SRS program using a phantom with external, independent end-to-end testing.

Contributing Factors/Root Causes

  • Expectation Bias:
    • Multiple staff assumed heterogeneity corrections were being utilized.
  • Inadequate Training and/or Competency Verification:
    • Staff did not recognize and follow information in the vendor’s manual.
    • The practice did not receive any vendor training.
    • The practice was unaware of formal training sessions for commissioning this software.
  • Unclear Software Notification:
    • TPS provides general information (e.g., dose calculations done per default).
    • Heterogeneity correction information was included only in the section called “algorithm limitations” and was not displayed prominently to the user.
    • Independent monitor unit check software by default assigned water density along the pathlength, mimicking the error.
  • Workload and Administrative Pressure:
    • The practice made multiple new equipment acquisitions in rapid succession. Prior to the SRS system change, resources were prioritized toward another service line, which limited the practice’s ability to upgrade equipment for several years.
    • The SRS program was commissioned during a busy time for the practice, and additional staff resources were not supplied to handle the additional workload.
    • There was administrative interest in expedited commissioning.
  • Inadequate Supervision:
    • Lack of supervision of the single physicist commissioning the new SRS program due to a transition in leadership.
    • Lack of peer review or independent verification by a second physicist during commissioning.
  • Attribution Error: 
    • The practice was well-established with long-standing SRS experience.
    • The commissioning physicist was senior with many years leading the SRS program.

Lessons Learned for the Radiation Oncology Community at Large

Commissioning of treatment planning software is standard work for all radiation oncology programs. Errors during the commissioning process result in systematic errors that affect many patients, and the clinical consequences can be severe.

  • Acceptance and Commissioning:
    • All practices should review existing guidance on commissioning and acceptance testing, including 2019 Safety is No Accident, AAPM Task Group (TG) 106 on Accelerator Beam Data Commissioning, and AAPM-RSS Medical Physics Practice Guideline (MPPG) 9.a. for SRS-SBRT.
      • Within the MPPG 9.a., it explicitly references tissue heterogeneities for SRS systems. A responsibility of the medical physicist is to “[p]erform acceptance testing and commissioning of the SRS‐SBRT system, including validation of the treatment planning system accuracy with small fields and tissue heterogeneities (if relevant to the scope of SRS‐SBRT services offered), [and] accuracy of targeting through end‐to‐end (E2E) testing…”
    • Adequate resources (time and staff) should be provided when commissioning.
  • Independent Verification Procedures:
    • Peer review processes should include an independent verification (e.g., a second physicist should verify values from commissioning data to detect errors). AAPM TG-106 states: “…Check on the report and collected data. Have a qualified medical physicist perform an independent audit of the collected data and subsequent report.”
    • External, independent end-to-end testing should be performed with standardized procedures and all equipment.
  • New Equipment:
    • Whenever new equipment and/or processes are implemented, the entire team’s focus on error pathways should be heightened. Extra caution should be taken, and staff should spend time in the planning stages examining how all the various components and stakeholders of the process will interact, particularly when different equipment and systems are integrated. Testing various scenarios may be helpful.
  • Training:
    • Administrators should provide financial means and clinic coverage to support attendance at vendor supplied training.
    • New staff should also be eligible to receive specialized training (e.g., vendor provided training) to ensure that normalization of deviance is not transmitted between staff.
    • Staff competency should be assessed regularly and include knowledge of the use, functionalities, and potential weaknesses of equipment.
  • Learning and Safety Culture:
    • Practices should cultivate a high-quality safety culture, as demonstrated by this practice. Safety culture underlies a practice’s ability to make improvements because it affects what and how staff learn and how practices adapt.
    • Regardless of experience, anyone can make an error. Additionally, a valuable indicator of a practice’s safety culture is the comfort level of staff to speak up and be listened to. In this case, a junior physicist identified an error made by more senior staff. This demonstrates the presence of a low power distance index in this practice, which is highly desirable. A low power distance index means that feedback is accepted between members of differing power levels within the practice.
    • Search satisficing bias, which is the tendency to stop looking once one explanation for an error is found, was not present in this case. This practice should be commended because they did not stop their investigation of this event until all sources of error were examined. It is crucial that practices conduct thorough investigations upon discovery of an error, as this practice has done.

Lessons Learned for the Vendors

Heterogeneity corrections are now standardly accounted for in radiation dose calculations for many clinical systems. If a vendor’s software does not include heterogeneity corrections, this information should be prominently disclosed to the user. This may be done through mandatory attendance at a vendor training course or on-site vendor presence during the commissioning process. When designing and selling equipment, vendors must think extensively about how software and equipment will be utilized in the clinic and about potential pitfalls during implementation. In this instance, it is not surprising that the practice wanted to account for the additional structure of the baseplate and accessories given its thickness. Extra instructions from the vendor regarding heterogeneity in this instance would seem prudent.

Optimal software design is also very important. Vendors have the unique opportunity to elevate practitioners’ strengths and diminish their weaknesses by supporting them with equipment and software that is designed to act as a complementary safeguard. This may include functionalities such as:
  • An alert that appears on screen during plan review indicating heterogeneity correction information.
  • Rather than stating that practice defaults are being utilized, the software could state explicitly which setting is in use (e.g., heterogeneity corrections off OR heterogeneity corrections on).
  • It would be helpful for software from the same vendor to contain consistent features. For example, the same alert indicating atypical heterogeneity correction settings could be shown across the vendor’s various software products.


From the description above, it is clear that multiple causal factors were at work simultaneously to allow this error to reach patients. These include expectation bias, issues with software design and the presentation of information, and failure on the part of multiple parties to identify an emerging problem. Many practices face challenges in allocating sufficient financial and staff resources, maintaining compliance with standard protocols (e.g., reviewing manuals), and honoring experience and expertise without exuding overconfidence. In this case, these challenges manifested and contributed to the errors.
Correspondingly, there are numerous lessons to be learned by the radiation oncology community from this event. RO-ILS is extremely grateful for the participants who submitted this event for their detailed investigation, willingness to talk with RO-HAC and Clarity PSO, and for their collaboration in writing this case study.