Learning statistical correlation for fast prostate registration in image-guided radiotherapy

Yonghong Shi, Shu Liao, Dinggang Shen

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)


Purpose: In adaptive radiation therapy of prostate cancer, fast and accurate registration between the planning image and the treatment images of a patient is essential. With the authors' recently developed deformable surface model, prostate boundaries in each treatment image can be rapidly segmented, and their correspondences (or relative deformations) to the prostate boundaries in the planning image are established automatically. However, the dense correspondences in the nonboundary regions, which are especially important for transforming the treatment plan designed in the planning image space to each treatment image space, remain unresolved. This paper presents a novel approach that learns the statistical correlation between deformations of the prostate boundary and of the nonboundary regions, in order to rapidly estimate deformations of the nonboundary regions once the deformations of the prostate boundary in a new treatment image are given. Methods: The main contributions of the proposed method lie in the following aspects. First, the statistical deformation correlation is learned from both the current patient and other training patients, and is further updated adaptively during the course of radiotherapy. Specifically, in the initial treatment stage, when only a few treatment images have been collected from the current patient, the statistical deformation correlation is learned mainly from other training patients. As more treatment images are collected from the current patient, patient-specific information plays an increasingly important role in learning a patient-specific statistical deformation correlation that effectively reflects the prostate deformation of the current patient during treatment. Eventually, once a sufficient number of treatment images have been acquired from the current patient, only the patient-specific statistical deformation correlation is used to estimate dense correspondences.
Second, the statistical deformation correlation is learned with a multiple linear regression (MLR) model, namely a ridge regression (RR) model, which achieves better prediction accuracy than other MLR models such as canonical correlation analysis (CCA) and principal component regression (PCR). Results: To demonstrate the performance of the proposed method, we first evaluate its registration accuracy by comparing the deformation field predicted by our method with the deformation field estimated by thin plate spline (TPS) based correspondence interpolation on 306 serial prostate CT images of 24 patients. The average prediction error on voxels within 5 mm of the prostate boundary is 0.38 mm for our RR-based correlation model, with a corresponding maximum error of 2.89 mm. We then compare the speed of deformation interpolation across methods. For a large region of interest (ROI) of size 512 × 512 × 61, our method takes 24.41 seconds to interpolate the dense deformation field while the TPS method needs 6.7 minutes; for a small ROI (surrounding the prostate) of size 112 × 110 × 93, our method takes 1.80 seconds while the TPS method needs 25 seconds. Conclusions: Experimental results show that the proposed method achieves much faster registration with accuracy comparable to the TPS-based correspondence (or deformation) interpolation approach.
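The ridge regression prediction step at the core of the method can be sketched as follows. This is an illustrative toy example, not the authors' implementation: the array shapes, the regularization weight `alpha`, and the synthetic data are all assumptions, standing in for vectorized boundary and nonboundary deformation fields.

```python
import numpy as np

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression, W = (X'X + alpha*I)^{-1} X'Y,
    mapping boundary deformation vectors (rows of X) to
    nonboundary deformation vectors (rows of Y)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

# Toy training set: n samples, each pairing a boundary deformation
# (length p) with a nonboundary deformation (length q). Real fields
# would be vectorized 3D displacements; these sizes are assumptions.
rng = np.random.default_rng(0)
n, p, q = 40, 20, 30
X = rng.standard_normal((n, p))                      # boundary deformations
W_true = rng.standard_normal((p, q))
Y = X @ W_true + 0.01 * rng.standard_normal((n, q))  # nonboundary deformations

W = fit_ridge(X, Y, alpha=0.1)

# Predict the dense (nonboundary) deformation for a new treatment
# image from its segmented boundary deformation alone.
x_new = rng.standard_normal(p)
y_pred = x_new @ W
```

The adaptive patient-specific updating described in the Methods could, under this sketch, be emulated by composing the rows of `X` and `Y` from other training patients early in the course of treatment and from the current patient's accumulated treatment images later on.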

Original language: English
Pages (from-to): 5980-5991
Number of pages: 12
Journal: Medical Physics
Issue number: 11
Publication status: Published - Nov 2011


Keywords

  • adaptive radiation therapy
  • fast registration
  • multiple linear regression
  • patient-specific correlation
  • predictive correlation model

ASJC Scopus subject areas

  • Biophysics
  • Radiology, Nuclear Medicine and Imaging

