Authors
G.H. Andersen, Peter R. Jermain, B. Crawford, Collen Foote, David M. McClatchy
Abstract
Background
Performing patient-specific quality assurance (QA) using delivery log file analysis may improve workflow efficiency and error detection relative to phantom-based or portal dosimetry measurements.

Purpose
Because few commercial log file QA systems are available, cross-vendor integration with existing LINAC, Record and Verify (R&V), and Treatment Planning System (TPS) infrastructures may pose unexpected challenges. We report on the reliability and compatibility of a commercial log file QA system with both native and cross-vendor LINACs.

Methods
Mobius3D (Varian) was employed as the principal method of IMRT QA for 1708 treatment plans created in RayStation (RaySearch) and delivered on both TrueBeam (Varian) and Agility (Elekta) LINACs over the course of a year. ArcCheck (Sun Nuclear) was employed as a secondary measurement for a subset of 107 plans that failed initial log file QA according to institutional protocol. Timestamps were retrieved from the Mobius3D web interface at various stages of the log file QA process to evaluate the system's efficiency. Finally, contemporaneously matched sets of head and neck treatments planned for and delivered on TrueBeam (n = 100) versus Agility (n = 100) LINACs were compared.

Results
Across all 12 treatment sites, the Mobius3D computed pre-delivery Plan Check γ (3% dose difference, 3 mm distance-to-agreement criteria) was μ = 95.17%, σ = 6.12% (μ = mean, σ = standard deviation) for Agility and μ = 99.33%, σ = 1.67% for TrueBeam (p < 0.0001), and γ differences between Plan Check and post-delivery QA Check were μ = −0.44%, σ = 0.73% for Agility and μ = −0.02%, σ = 0.26% for TrueBeam (p < 0.0001). While γ distributions differed little among treatment sites for TrueBeam plans, Agility head and neck and sarcoma plans had lower overall γ and greater γ differences than other sites. Of the 107 plans measured with ArcCheck, 102 yielded γ > 95% and the remaining five yielded 92% < γ < 95%. The time taken to scan log files into Mobius3D was 21.0 ± 10.6 min for Agility and 12.1 ± 9.1 min for TrueBeam (p < 0.0001), whereas the time taken to process QA checks was comparable: 9.7 ± 8.0 min for Agility and 9.5 ± 8.9 min for TrueBeam (p > 0.7). For the matched sets of head and neck treatments, the Plan Check γ distributions were μ = 92.29%, σ = 4.17% for Agility and μ = 99.15%, σ = 0.70% for TrueBeam (p < 0.0001), while γ differences between Plan Check and post-delivery QA Check were μ = −0.44%, σ = 0.73% for Agility and μ = −0.02%, σ = 0.26% for TrueBeam (p < 0.0001).

Conclusions
Mobius3D plan-based and log file-based dosimetry agreed more closely for Varian-based treatment plans than for Elekta-based plans, although secondary physical 3D dosimetry measurements of the failing Elekta plans showed acceptable agreement. Additionally, Mobius3D took about twice as long to scan log files in the Elekta format. While our patient-specific QA workflow was greatly expedited by Mobius3D, cross-vendor plans suffered from some degree of incompatibility.
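For readers unfamiliar with the γ pass-rate metric reported above, the following is a minimal, hypothetical one-dimensional sketch of a global γ analysis under 3% dose difference / 3 mm distance-to-agreement criteria. It is illustrative only; Mobius3D's actual algorithm operates on full 3D dose grids and its implementation details are vendor-specific. All function and variable names here are our own.

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions, dd=0.03, dta=3.0):
    """Global 1-D gamma analysis (default 3% / 3 mm), illustrative only.

    ref_dose, eval_dose: dose profiles sampled on the same 1-D grid
    positions: spatial coordinates of the grid points (mm)
    dd: dose-difference criterion as a fraction of the global max dose
    dta: distance-to-agreement criterion (mm)
    Returns the percentage of reference points with gamma <= 1.
    """
    dmax = ref_dose.max()  # global normalization of the dose criterion
    gammas = np.empty_like(ref_dose)
    for i, (r, d) in enumerate(zip(positions, ref_dose)):
        # For each reference point, gamma is the minimum over all evaluated
        # points of the combined (scaled) spatial and dose discrepancy.
        dist_term = ((positions - r) / dta) ** 2
        dose_term = ((eval_dose - d) / (dd * dmax)) ** 2
        gammas[i] = np.sqrt(dist_term + dose_term).min()
    return 100.0 * np.mean(gammas <= 1.0)

# Toy example: a Gaussian "profile" delivered with a 1 mm spatial shift,
# which should pass comfortably under 3%/3 mm criteria.
x = np.arange(0, 100, 1.0)              # 1 mm grid (mm)
ref = np.exp(-((x - 50) / 10) ** 2)     # planned profile
ev = np.exp(-((x - 51) / 10) ** 2)      # delivered profile, shifted 1 mm
print(gamma_pass_rate(ref, ev, x))      # → 100.0
```

A plan "passes" at an institutional threshold (e.g., γ > 95% of points, as in the protocol above) rather than requiring every point to satisfy γ ≤ 1.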