Early detection of potential errors during patient treatment planning

Abstract Purpose Data errors caught late in treatment planning require time to correct, resulting in delays of up to 1 week. In this work, we identify causes of data errors in treatment planning and develop a software tool that detects them early in the planning workflow. Methods Two categories of errors were studied: data transfer errors and TPS errors. Using root cause analysis, the causes of these errors were determined. This information was incorporated into a software tool which uses ODBC-SQL services to access our clinic's TPS Postgres database and Mosaiq MSSQL databases. The tool also uses a read-only FTP service to scan the TPS Unix file system for errors. Detected errors are reviewed by a physicist. Once confirmed, clinicians are notified to correct the error and educated to prevent similar errors in the future. A time-cost analysis was performed to estimate the time savings of implementing this software clinically. Results The main errors identified were incorrect patient entry, missing image slice, and incorrect DICOM tag for data transfer errors, and incorrect CT-density table application, incorrect image used as reference CT, and secondary image imported to the incorrect patient for TPS errors. The software has been running automatically since 2015. In 2016, 84 errors were detected, with the most frequent being incorrect patient entry (35), incorrect CT-density table (17), and missing image slice (16). After clinical interventions to our planning workflow, the number of errors in 2017 decreased to 44. The time savings in 2016 with the software are estimated to be 795 h, attributed to catching errors early and eliminating the need to replan cases. Conclusions The new QA software detects errors during planning, improving the accuracy and efficiency of the planning process. This important QA tool focused our efforts on the data communication processes in our planning workflow that need the most improvement.


| INTRODUCTION
Physics pretreatment plan review has been shown to be one of the most effective ways to reduce errors in radiotherapy. 1 During this review, the physicist independently evaluates the plan quality based on clinical goals and verifies that all of the technical details of the plan are correct. This process has become more challenging for the physicist to perform in the modern radiotherapy era for a number of reasons. First, the number of treatment methods being offered, along with their complexity, continues to grow. 2 In fact, a 2009 study estimated that progressing from consult to treatment delivery for an external beam radiotherapy patient requires approximately 270 different steps, of which around 100 are attributed to the treatment planning process. 3 Secondly, the number of computer systems required in a standard radiotherapy treatment has also grown. In a 2013 article, Moore et al. 4 note that there are at least 11 different "software functionalities" involved in a standard radiation therapy clinic. This means that in order to complete a pretreatment physics review, the physicist must navigate through multiple electronic workspaces and check a significant number of parameters, often manually, which can be quite complicated.
The increasing demands on physics resources to perform independent plan review have been recognized by the radiation oncology community. In response, several investigators have developed and implemented software programs which automate portions of the physics pretreatment check. These programs check items such as patient setup and prescription information, beam parameters, dose computation settings, optimization parameters, and dosimetric goals. [5][6][7][8][9][10][11][12][13][14][15][16][17] However, one area of the treatment planning process that has not been the focus of these software programs is checking the integrity of data communication processes between various software and hardware platforms. For example, a patient may be CT simulated and at the time of simulation, the therapist may accidentally mistype the patient medical record number (MRN) into the CT software. Images are acquired and imported into the treatment planning system and a plan is created and approved.
During export of the DICOM plan and CT information, the dosimetrist discovers the error in MRN because the plan and CT information cannot be transferred to the correct patient in the record and verify system. This means that the patient cannot be treated unless the error is fixed, requiring that several steps in the planning process be repeated. While this type of data error may not result in mistreatment of the patient, it is costly in terms of the efficiency of plan creation, since it takes significant time to find the error and repeat the necessary planning steps, resulting in delayed start times for patients.
In this work, we investigate the causes of various data errors that can occur during the treatment planning process and develop a software tool that can automatically detect them in advance. The impact of this software on the accuracy and efficiency of treatment plan creation at our center is also presented.

2.B | Identification of data errors
Between 2012 and 2015, data errors that were caught clinically in our department, either by the dosimetrist at the time of planning, the physicist at the time of pretreatment review, or the therapist during treatment, were tabulated. A review of the data errors found that they could be classified into two major categories: (a) data transfer errors and (b) user errors in the treatment planning system. For each of these data errors, a root cause analysis (RCA) was performed. This procedure is a well-established technique to manage errors in healthcare and involves starting at the clinical incident where the error was discovered and tracing backwards through each step of the treatment workflow until a root cause is identified. 2,18,19 The RCA was based on a review of pertinent files and databases related to each of the data errors.

2.C | Data error tracking software development and implementation
Once the root causes of the data error subset were determined, a software tool to automatically monitor related files and databases for potential errors was developed. In our clinical setting, we use a Unix-based treatment planning system and a Windows-based record and verify system. As such, a Windows-based software tool could be easily implemented without the need to install additional software on the treatment planning system servers. The software tool was developed using the C/C++ language on a Windows platform. It uses the PostgreSQL open database connectivity (ODBC) driver to access our centralized Pinnacle treatment planning system patient Postgres database, and the SQL Server ODBC driver to access our two Windows-based Mosaiq MSSQL record and verify system databases. The software uses a Windows file transfer protocol (FTP) service (mainly, the CInternetSession::GetFtpConnection function) to scan the treatment planning system's internal files. To ensure that the software had read-only access to these files and could not accidentally modify any plan data, a special interface was developed in C/C++. Details regarding which file type (image file, TPS internal file, etc.) as well as what information within each file is needed for the software to check for a given data error are provided in Table 1.
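As an illustration of the read-only scanning approach, the sketch below shows how a passive FTP connection to the treatment planning system file server might be opened and a single file pulled locally for inspection using the MFC CInternetSession::GetFtpConnection function named above. The server name, account, and paths are hypothetical placeholders; the tool's actual interface and error handling are not reproduced here.

```cpp
// Minimal sketch (MFC, Windows): open a read-only FTP session to the TPS file
// server and copy one plan file to a local scratch location for inspection.
// Server name, credentials, and paths are hypothetical placeholders.
#include <afxinet.h>

bool FetchFileForInspection(const CString& remotePath, const CString& localPath)
{
    CInternetSession session(_T("DataErrorScanner"));
    CFtpConnection* pFtp = NULL;
    bool ok = false;
    try
    {
        // Passive-mode connection using an account that has read permission only,
        // so the scanner cannot modify any plan data on the TPS file system.
        pFtp = session.GetFtpConnection(_T("tps-fileserver"),
                                        _T("readonly_user"), _T("********"),
                                        21 /* port */, TRUE /* passive */);

        // GetFile copies the remote file locally; binary transfer avoids any
        // newline translation of the TPS internal files.
        ok = (pFtp->GetFile(remotePath, localPath, FALSE /* fail if exists */,
                            FILE_ATTRIBUTE_NORMAL, FTP_TRANSFER_TYPE_BINARY) != 0);
    }
    catch (CInternetException* pEx)
    {
        pEx->Delete();   // connection or transfer failed; report and move on
    }
    if (pFtp) { pFtp->Close(); delete pFtp; }
    return ok;
}
```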
The software program was created by focusing on one data error at a time, for example, data error X. During the development phase, patients from the RCA that had data error X were used to test the software. If a false negative was encountered, then the software was modified and the process repeated until data error X could reliably be detected without false negatives. Once this was completed, verification was performed by creating data error X for a new set of patients and running the software. If false negatives were encountered, the software was modified further and testing was performed again. This iterative process of modifying and testing the software was repeated until it was confirmed that the software could correctly identify data error X without false negatives.
The data error tracking tool was implemented in our clinic at the end of 2015 and has been fully operational since the beginning of 2016. Since then there have been no false negatives discovered by the software. All false positives have been tied to research or quality assurance (QA) patient entries created in the Pinnacle database without a corresponding entry in the record and verify system databases.
To minimize the false positives detected by the software for research or QA cases, some rules have been built into the software to ignore an entry if it has "QA" or "TEST" in the name and staff members have been notified to include these keywords when creating their test patients.
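A minimal sketch of this naming filter is shown below; the keyword list and case-insensitive matching come from the rule described above, while the function name and the naive substring match are illustrative only.

```cpp
// Sketch: skip database entries whose patient name contains "QA" or "TEST",
// which by departmental convention marks research/QA test patients.
// Note: a plain substring match is used here purely for illustration.
#include <string>
#include <algorithm>

static bool IsTestPatientEntry(std::string name)
{
    std::transform(name.begin(), name.end(), name.begin(), ::toupper);
    return name.find("QA") != std::string::npos ||
           name.find("TEST") != std::string::npos;
}
```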
Currently, the software tool is run every evening and scans all database files modified since the previous night's scan.
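A sketch of how such an incremental scan might be performed over the same FTP interface is given below, using CFtpFileFind to list a directory and comparing each file's last-write time against the time of the previous scan. The directory path is a placeholder, and an open read-only connection (as in the earlier connection sketch) is assumed.

```cpp
// Sketch: list a remote TPS directory over FTP and collect files that have been
// modified since the previous nightly scan. Assumes an open read-only
// CFtpConnection; the directory path below is a hypothetical placeholder.
#include <afxinet.h>
#include <vector>

std::vector<CString> FilesModifiedSince(CFtpConnection* pFtp, const CTime& lastScan)
{
    std::vector<CString> changed;
    CFtpFileFind finder(pFtp);
    BOOL more = finder.FindFile(_T("/pinnacle/Patients/*"));
    while (more)
    {
        more = finder.FindNextFile();
        if (finder.IsDots())
            continue;
        CTime lastWrite;
        if (finder.GetLastWriteTime(lastWrite) && lastWrite > lastScan)
            changed.push_back(finder.GetFilePath());   // queue for error checks
    }
    finder.Close();
    return changed;
}
```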
Whenever an error is detected, a report is generated locally on the computer that runs the software, and a review request with a short summary of the error (without patient identification information) is emailed to a physicist. The physicist then accesses the full data error report from the local computer and investigates the error. A sample data error report from the local computer is shown in Fig. 1 for a patient whose MRN on the reference DICOM CT is mismatched with the MRN in the Mosaiq database. Once the physicist confirms the error is true, the responsible clinicians (in this case, the CT technicians and the dosimetrist) are notified to correct the error in order to prevent it from propagating further in the treatment planning chain.
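The core of that particular check is a comparison between the patient identifier stored in the reference DICOM CT and the MRN held in the Mosaiq database. A simplified sketch is shown below, assuming both values have already been read from the image header and the record and verify database; the normalization rules (trimming and leading-zero handling) are illustrative and not the tool's exact logic.

```cpp
// Sketch: flag an MRN mismatch between the reference DICOM CT (PatientID tag)
// and the record and verify system. Both values are assumed to have been
// retrieved already; the normalization shown here is illustrative only.
#include <string>

static std::string NormalizeMrn(std::string mrn)
{
    // Strip surrounding whitespace and leading zeros before comparing.
    const auto first = mrn.find_first_not_of(" \t");
    const auto last  = mrn.find_last_not_of(" \t");
    mrn = (first == std::string::npos) ? "" : mrn.substr(first, last - first + 1);
    const auto nz = mrn.find_first_not_of('0');
    return (nz == std::string::npos) ? "0" : mrn.substr(nz);
}

bool MrnMismatch(const std::string& dicomPatientId, const std::string& mosaiqMrn)
{
    return NormalizeMrn(dicomPatientId) != NormalizeMrn(mosaiqMrn);
}
```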

2.D | Time-cost analysis of data errors
In our experience, data errors detected in the later stages of the planning process have required time by the physicist to analyze and correct, which can significantly delay the start of treatment for patients. In an effort to characterize the improvements in clinical efficiency as a result of implementing this software tool, a time-cost analysis was performed.

TABLE 1 Summary of data errors and root causes identified through the root cause analysis procedure. Details on the files scanned and the information needed to check for a given error are summarized in the rightmost column.

3.A | Root cause analysis of data errors
A summary of the data errors and their root causes, as identified through the root cause analysis process, is also presented in Table 1.

3.A.1 | Data transfer errors
There were three main data transfer errors identified in our study.
The first was an image slice missing from an imaging dataset. This error can be caused by (a) a DICOM transfer error between the CT console and the treatment planning system, (b) an error in the reconstruction of the 4D average image in the CT software, or (c) a Pinnacle treatment planning system import error for MR images.
Specifically, oblique MR images must be converted by the dosimetrist prior to being imported into the treatment planning system. In our clinic, this conversion is done using the MIM software platform for fusion and image registration (MIM Software Inc., Cleveland, OH, USA), which converts the MRI to the planning CT imaging coordinate system, after which the images can be imported into the treatment planning system. If this conversion is not done, the display of the images in the treatment planning system can be incorrect. 20 If the display is incorrect, the treatment planning system records a variable slice thickness for each MR image in the dataset. Since the image could be misrepresented in the planning system, this scenario was classified under the category of image slice missing from a dataset. The clinical significance of this type of error is that contours for target volumes and/or normal tissue structures located in the region of missing imaging data are incomplete or misrepresented. These incomplete contours can cause a failure to export the structure set from the planning system to the record and verify system as well as the on-board imaging system at the end of the treatment planning process. Depending on the exact cause, this error can be remediated by (a) re-exporting the CT from the CT console, (b) reconstructing the 4D average image a second time and re-exporting the images, or (c) properly performing the MRI conversion for oblique images in the MIM software, all followed by reimporting the images into the treatment planning system. If contours and planning were completed prior to the error being found, then both are copied from the previous image set to the new image set and the dose is recalculated. Once corrected, the clinician will need to review the images, the target and organ-at-risk contours, and the recalculated plan. Depending on where the missing imaging slice is located (i.e., in a target volume or a critical organ at risk), replanning of the case may be necessary.
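One way such a check can be expressed is to test whether the slice positions in an imported dataset are uniformly spaced. The sketch below is a simplified version of that idea, operating on slice z-positions assumed to have been extracted already; the tolerance value is arbitrary and chosen for illustration only.

```cpp
// Sketch: detect a missing slice or variable slice thickness by checking that
// consecutive slice positions (e.g., DICOM ImagePositionPatient z-values,
// already extracted and supplied here) are uniformly spaced.
#include <vector>
#include <cmath>
#include <algorithm>

bool HasUniformSliceSpacing(std::vector<double> zPositions, double tolMm = 0.01)
{
    if (zPositions.size() < 3)
        return true;                       // nothing meaningful to compare
    std::sort(zPositions.begin(), zPositions.end());
    const double expected = zPositions[1] - zPositions[0];
    for (size_t i = 2; i < zPositions.size(); ++i)
    {
        const double spacing = zPositions[i] - zPositions[i - 1];
        if (std::fabs(spacing - expected) > tolMm)
            return false;                  // gap or variable thickness detected
    }
    return true;
}
```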
The second data transfer error identified was wrong patient information entry, caused by incorrect manual entry of patient information either in the CT simulation software or in the treatment planning system. The third data transfer error was an incorrect DICOM tag, caused by a software bug in the CT simulation software for two of our CT simulators, which was creating two consecutive periods in a DICOM tag of the reconstructed average image of 4DCT datasets. With this incorrect DICOM tag in place, the record and verify system is able to receive and store the reference image, but the reference CT cannot be transferred from the record and verify system to the on-board imaging system.
To fix this problem, the image tag must be corrected and the image set must be reimported into the treatment planning system and then resent to the record and verify system. In this case, the erroneous reference image in the record and verify system cannot be deleted by clinical users. To remove the image, help from vendor technical support is required, which from our experience can take multiple days to complete.
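Detecting this bug before the image reaches the record and verify system reduces to checking the affected tag's value for two consecutive periods, and the correction applied by a pre-import script (see the Discussion) amounts to collapsing them. A minimal sketch of both is shown below; the function names, and the choice of which tag to inspect, are illustrative rather than taken from the clinical tool.

```cpp
// Sketch: detect and correct the CT-simulator bug that produced two consecutive
// periods in a DICOM tag of the reconstructed 4DCT average image. The caller
// supplies the already-extracted tag value; which tag is affected is site and
// scanner specific and is not reproduced here.
#include <string>

bool HasConsecutivePeriods(const std::string& tagValue)
{
    return tagValue.find("..") != std::string::npos;
}

std::string CollapseConsecutivePeriods(std::string tagValue)
{
    std::string::size_type pos;
    while ((pos = tagValue.find("..")) != std::string::npos)
        tagValue.erase(pos, 1);            // e.g., "1..23" becomes "1.23"
    return tagValue;
}
```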

3.A.3 | Other errors
The last category in Table 1 specifies other errors that were investigated. These errors did not occur often in the clinic but were easy to incorporate into the software tool for monitoring. For example, performance errors within our record and verify system, specifically failures of the Mosaiq Work Queue Element (WQE) Processor service, were monitored. This service is responsible for data conversion within the record and verify system and runs on a record and verify system server that does not reside in our department. When this service stops working, DICOM data transfer to the record and verify system becomes backlogged: data are received by the record and verify system's import filter but cannot be assigned for further processing. For instance, DRR images are sent from Pinnacle to the record and verify system but are unable to be assigned to the corresponding patient. Additionally, this backlog of patient data results in slow performance of image review activities in the record and verify system (e.g., CBCT image review). To catch this error, our software tool continually monitors the WQE service activity to see if there is an unexpectedly high number of incomplete jobs in the record and verify system database. Once the error is detected, the hospital information technology department must be contacted to reset this service on the server computers.
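A sketch of how such a database-side check might look through the SQL Server ODBC driver is given below. The DSN, table, and column names (a "WorkQueueElement" table with a pending status value) are hypothetical placeholders, since the actual Mosaiq schema is vendor specific; the intent is only to show the pattern of connecting, counting pending jobs, and returning a count for comparison against a backlog threshold.

```cpp
// Sketch (ODBC API): count pending record-and-verify work-queue jobs to flag a
// backlog. "MosaiqDB" DSN, the "WorkQueueElement" table, and the status value
// are hypothetical placeholders -- the real Mosaiq schema is vendor specific.
#include <windows.h>
#include <sql.h>
#include <sqlext.h>

long CountIncompleteWqeJobs()
{
    SQLHENV env = NULL; SQLHDBC dbc = NULL; SQLHSTMT stmt = NULL;
    SQLLEN ind = 0;
    SQLINTEGER count = -1;   // -1 signals "query failed"

    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);

    if (SQL_SUCCEEDED(SQLDriverConnect(dbc, NULL,
            (SQLCHAR*)"DSN=MosaiqDB;UID=reader;PWD=********;", SQL_NTS,
            NULL, 0, NULL, SQL_DRIVER_NOPROMPT)))
    {
        SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
        if (SQL_SUCCEEDED(SQLExecDirect(stmt,
                (SQLCHAR*)"SELECT COUNT(*) FROM WorkQueueElement WHERE Status = 'PENDING'",
                SQL_NTS)) && SQL_SUCCEEDED(SQLFetch(stmt)))
        {
            SQLGetData(stmt, 1, SQL_C_SLONG, &count, 0, &ind);
        }
        SQLFreeHandle(SQL_HANDLE_STMT, stmt);
        SQLDisconnect(dbc);
    }
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return count;   // caller compares against an expected backlog threshold
}
```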

3.B | Frequency of data errors detected
The data error tracking tool was implemented in our clinic at the end of 2015 and was fully operational for the 2016 calendar year. In that year, 79 data errors and 5 performance errors with the record and verify system were detected by our tool. Using the number of simulations performed in 2016 as a surrogate for the total number of patient plans generated in that year, this corresponds to a data error frequency rate in our clinic of ~2.3%. The breakdown of errors by type and frequency is presented in Table 2, which shows that the most frequent data errors detected were: wrong patient identification entered (35 occurrences), incorrect CT-density curve applied (17 occurrences), and image slice missing from dataset (16 occurrences).

| DISCUSSION
The increased complexity of creating and delivering a radiation therapy treatment plan today has resulted in the medical physics community taking a step back and re-evaluating how quality management is executed in radiation oncology. Historically, quality management has been based on device-specific quality assurance measures, where every device involved in patient treatment is tested at various frequencies and held to specific tolerances. This approach is time intensive for the medical physicist, who already faces limited resources. Furthermore, the problem with this approach, as highlighted in the recent TG-100 report, is that it fails to catch errors that are tied to problems with the clinical process itself. 2 In this work, we took a closer look at data errors in the treatment planning process at our institution. Through performing a root cause analysis of data errors that occurred in our clinic, we gained a better understanding of the interactions between human users and individual devices in our treatment planning workflow and their impact on treatment outcomes. Interestingly, a significant proportion of the data errors that were identified were tied to manual errors by humans (35 of 79 errors in 2016 and 36 of 42 errors in 2017).
These range from typographical errors of patient demographics in the CT simulation software or treatment planning system to incorrect manual selection of the CT dataset that is to be used for planning. The dosimetric consequences to the patient for some of these errors such as incorrect CT used for planning can be significant.
However, in general, the main consequence of these errors as shown in Table 1 is an inability to treat the patient due to rejection of DICOM data by a treatment device/software that comes later in the treatment chain. This means patient start times are delayed and inefficiencies in the treatment planning process are introduced due to the time it takes to find the data error and repeat treatment planning steps a second time.
As a result of the understanding gained from this work, some interventions were introduced into our clinical workflow in an attempt to reduce the data error rate. For example, once it was understood that two of our CT simulators had a software bug that was creating an incorrect DICOM tag for 4DCT images, a script was written to fix the DICOM tag. This script is now run prior to importing any 4DCT average image from those two simulators into the treatment planning system. This intervention reduced the frequency of this data error from seven occurrences in 2016 to zero in 2017. Similarly, once it was understood that CT scanner model information was missing in the DICOM tag file for 4DCT average images, a script was written in Pinnacle, run by the dosimetrist at the start of planning, to check whether the correct CT-density curve has been applied.

TABLE 3 Maximum time estimates to correct a given data error with and without the software tool implemented. One day was considered to be equivalent to an 8-h work day.

For the missing image slice error, the worst-case scenario is one where that slice is located at the edge of the gross target volume for an IMRT plan. In this case, planning target volumes as well as associated optimization structures need to be regenerated and the case must then be reoptimized in order to ensure adequate target coverage.
Similarly, for the data error where an incorrect image is in the secondary image list, the time needed to fix this error could be less than an hour if the physician can review the patient contours right away and the changes to contours are insignificant. However, if patient contours need to be changed, the case must then be replanned, and a day is needed to correct the error entirely. The time to correct a DICOM tag data error without the software was estimated to be 1 week in Table 3. This is based on the worst-case scenario where the erroneous reference CT has been imported into the record and verify system, requiring a vendor support ticket be placed for removal. Technically, this situation could be fixed faster by exporting the reference CT directly from the treatment planning system to XVI, bypassing the record and verify system entirely. However, in our experience this method results in bypassing the numerical verification process of treatment isocenter information. Therefore, our institutional policy is to remove the erroneous CT from Mosaiq and then resend the DICOM CT data through the typical data export workflow for treatment plans (i.e., Pinnacle to Mosaiq to XVI).
Conservatively, using the time estimates in Table 3, the time savings in 2016 from catching these data errors early with the software tool is estimated to be 795 h. We feel that the work presented here highlights an area of the treatment planning process that has not been focused on in the past and for which errors can not only result in mistreatment of the patient but also significantly compromise clinical efficiency. This loss in efficiency is due to a lack of integrity in data communication processes between various software and hardware platforms in the treatment planning chain, an area of which the radiation oncology community is still gaining a better understanding. While users cannot apply the software tool developed in this work directly to their own clinics, they can follow the framework outlined here to examine their own treatment planning process further and potentially develop their own data error tracking tool. Since the software developed is based on the C/C++ programming language, the programming tools needed to replicate this type of software are widely available. Getting access to the necessary databases to perform scanning may require users to work with their treatment planning and record and verify system vendors. If software development resources are not available, clinics could still use the framework presented here to track these types of data errors and identify weaknesses in data integrity. Once identified, process improvements could be implemented to minimize data errors in their treatment planning workflow.

| CONCLUSIONS
In summary, data errors occur in the treatment planning process and can significantly impact treatment plan creation, both in terms of the quality of the plans created and in delays to treatment start times for patients. The most frequent data error remaining after the interventions implemented in 2017 was manual data entry error. Therefore, our clinic is planning on changing the treatment planning workflow to minimize manual data entries by humans through use of the DICOM Modality Worklist. Another important aspect of this software tool is the simplicity of its design and its ability to run without installing additional software on the treatment planning system servers. This allows us to easily and independently adapt the software as new errors are identified, without vendor technical support. As such, this planning error tracking tool will continue to serve an important quality assurance role within our clinic.

CONFLICT OF INTEREST
The authors have no other relevant conflicts of interest to disclose.