OSTI.GOV — U.S. Department of Energy
Office of Scientific and Technical Information

Title: Prostate volume contouring: A 3D analysis of segmentation using 3DTRUS, CT, and MR

Abstract

Purpose: This study evaluated the reproducibility and modality differences of prostate contouring after brachytherapy implant using three-dimensional (3D) transrectal ultrasound (3DTRUS), T2-weighted magnetic resonance (MR), and computed tomography (CT) imaging. Methods and Materials: Seven blinded observers contoured 10 patients' prostates, 30 days postimplant, on 3DTRUS, MR, and CT images to assess interobserver variability. Randomized images were contoured twice by each observer. We analyzed length and volume measurements and performed a 3D analysis of intra- and intermodality variation. Results: Average volume ratios were 1.16 for CT/MR, 0.90 for 3DTRUS/MR, and 1.30 for CT/3DTRUS. Overall contouring variability was largest for CT and similar for MR and 3DTRUS. The greatest variability of CT contours occurred at the posterior and anterior portions of the midgland. On MR, overall variability was smaller, with a maximum in the anterior region. On 3DTRUS, high variability occurred in anterior regions of the apex and base, whereas the prostate-rectum interface had the smallest variability. The shape of the prostate on MR was rounder, with the base and apex of similar size, whereas CT contours had broad, flat bases narrowing toward the apex. The average percent of surface area that was significantly different (95% confidence interval) was 4.1% for CT/MR, 10.7% for 3DTRUS/MR, and 6.3% for CT/3DTRUS. The larger variability of CT measurements made significant differences more difficult to detect. Conclusions: Contouring of the prostate on CT, MR, and 3DTRUS results in systematic differences between modalities in both the location and the variability of the defined prostate boundary. MR and 3DTRUS display the smallest variability and the closest correspondence.
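The cross-modality comparison above reduces to per-patient volume ratios averaged over the cohort. A minimal sketch of that calculation, using hypothetical volumes rather than the study's data:

```python
# Per-patient volume ratios averaged over the cohort, as in the abstract.
# The volumes below are hypothetical examples, NOT the study's data.

def mean_volume_ratio(vols_a, vols_b):
    """Mean of the per-patient volume ratios A/B."""
    assert len(vols_a) == len(vols_b)
    return sum(a / b for a, b in zip(vols_a, vols_b)) / len(vols_a)

# Hypothetical prostate volumes (cm^3) for three patients on each modality
ct   = [38.0, 42.5, 35.1]
mr   = [33.0, 36.0, 30.5]
trus = [29.5, 33.2, 27.8]

print(round(mean_volume_ratio(ct, mr), 2))     # 1.16 for these example numbers
print(round(mean_volume_ratio(trus, mr), 2))   # 0.91
print(round(mean_volume_ratio(ct, trus), 2))   # 1.28
```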

Authors:
 Smith, Wendy L. [1]; Lewis, Craig [2]; Bauman, Glenn [3]; Rodrigues, George [4]; D'Souza, David [5]; Ash, Robert [5]; Ho, Derek [6]; Venkatesan, Varagur [5]; Downey, Donal [7]; Fenster, Aaron [8]
  1. Dept. of Medical Physics, Tom Baker Cancer Centre, and Departments of Oncology and Physics and Astronomy, Univ. of Calgary, Calgary, Alberta (Canada). E-mail: wendy.smith@cancerboard.ab.ca
  2. Dept. of Medical Physics, London Regional Cancer Program, London, Ontario (Canada); Dept. of Radiation Oncology, London Regional Cancer Program, London, Ontario (Canada)
  3. Dept. of Radiation Oncology, London Regional Cancer Program, London, Ontario (Canada); Dept. of Oncology, Schulich School of Medicine, Univ. of Western Ontario, London, Ontario (Canada); Dept. of Medical Biophysics, Schulich School of Medicine, Univ. of Western Ontario, London, Ontario (Canada)
  4. Dept. of Radiation Oncology, London Regional Cancer Program, London, Ontario (Canada); Dept. of Oncology, Schulich School of Medicine, Univ. of Western Ontario, London, Ontario (Canada); Dept. of Epidemiology and Biostatistics, Schulich School of Medicine, Univ. of Western Ontario, London, Ontario (Canada)
  5. Dept. of Radiation Oncology, London Regional Cancer Program, London, Ontario (Canada); Dept. of Oncology, Schulich School of Medicine, Univ. of Western Ontario, London, Ontario (Canada)
  6. Dept. of Diagnostic Radiology, Schulich School of Medicine, Univ. of Western Ontario, London, Ontario (Canada)
  7. Imaging Research Labs., Robarts Research Inst., London, Ontario (Canada)
  8. Univ. of Western Ontario, London, Ontario (Canada)
Publication Date:
2007-03-15
OSTI Identifier:
20944784
Resource Type:
Journal Article
Resource Relation:
Journal Name: International Journal of Radiation Oncology, Biology and Physics; Journal Volume: 67; Journal Issue: 4; Other Information: DOI: 10.1016/j.ijrobp.2006.11.027; PII: S0360-3016(06)03511-5; Copyright (c) 2007 Elsevier Science B.V., Amsterdam, Netherlands, All rights reserved; Country of input: International Atomic Energy Agency (IAEA)
Country of Publication:
United States
Language:
English
Subject:
62 RADIOLOGY AND NUCLEAR MEDICINE; BRACHYTHERAPY; COMPUTERIZED TOMOGRAPHY; IMAGES; MAGNETIC RESONANCE; PATIENTS; PROSTATE; RADIATION SOURCE IMPLANTS; RECTUM; SURFACE AREA

Citation Formats

Smith, Wendy L., Lewis, Craig, Bauman, Glenn, Rodrigues, George, D'Souza, David, Ash, Robert, Ho, Derek, Venkatesan, Varagur, Downey, Donal, and Fenster, Aaron. Prostate volume contouring: A 3D analysis of segmentation using 3DTRUS, CT, and MR. United States: N. p., 2007. Web. doi:10.1016/j.ijrobp.2006.11.027.
Smith, Wendy L., Lewis, Craig, Bauman, Glenn, Rodrigues, George, D'Souza, David, Ash, Robert, Ho, Derek, Venkatesan, Varagur, Downey, Donal, & Fenster, Aaron. Prostate volume contouring: A 3D analysis of segmentation using 3DTRUS, CT, and MR. United States. doi:10.1016/j.ijrobp.2006.11.027.
Smith, Wendy L., Lewis, Craig, Bauman, Glenn, Rodrigues, George, D'Souza, David, Ash, Robert, Ho, Derek, Venkatesan, Varagur, Downey, Donal, and Fenster, Aaron. 2007. "Prostate volume contouring: A 3D analysis of segmentation using 3DTRUS, CT, and MR". United States. doi:10.1016/j.ijrobp.2006.11.027.
@article{osti_20944784,
title = {Prostate volume contouring: A 3D analysis of segmentation using 3DTRUS, CT, and MR},
author = {Smith, Wendy L. and Lewis, Craig and Bauman, Glenn and Rodrigues, George and D'Souza, David and Ash, Robert and Ho, Derek and Venkatesan, Varagur and Downey, Donal and Fenster, Aaron},
abstractNote = {Purpose: This study evaluated the reproducibility and modality differences of prostate contouring after brachytherapy implant using three-dimensional (3D) transrectal ultrasound (3DTRUS), T2-weighted magnetic resonance (MR), and computed tomography (CT) imaging. Methods and Materials: Seven blinded observers contoured 10 patients' prostates, 30 days postimplant, on 3DTRUS, MR, and CT images to assess interobserver variability. Randomized images were contoured twice by each observer. We analyzed length and volume measurements and performed a 3D analysis of intra- and intermodality variation. Results: Average volume ratios were 1.16 for CT/MR, 0.90 for 3DTRUS/MR, and 1.30 for CT/3DTRUS. Overall contouring variability was largest for CT and similar for MR and 3DTRUS. The greatest variability of CT contours occurred at the posterior and anterior portions of the midgland. On MR, overall variability was smaller, with a maximum in the anterior region. On 3DTRUS, high variability occurred in anterior regions of the apex and base, whereas the prostate-rectum interface had the smallest variability. The shape of the prostate on MR was rounder, with the base and apex of similar size, whereas CT contours had broad, flat bases narrowing toward the apex. The average percent of surface area that was significantly different (95% confidence interval) for CT/MR was 4.1%; 3DTRUS/MR, 10.7%; and CT/3DTRUS, 6.3%. The larger variability of CT measurements made significant differences more difficult to detect. Conclusions: The contouring of prostates on CT, MR, and 3DTRUS results in systematic differences in the locations of and variability in prostate boundary definition between modalities. MR and 3DTRUS display the smallest variability and the closest correspondence.},
doi = {10.1016/j.ijrobp.2006.11.027},
journal = {International Journal of Radiation Oncology, Biology and Physics},
number = 4,
volume = 67,
place = {United States},
year = {2007},
month = {March}
}
  • The authors have developed a semiautomatic system for segmentation of a diverse set of lesions in head and neck CT scans. The system takes as input an approximate bounding box, and uses a multistage level set to perform the final segmentation. A data set consisting of 69 lesions marked on 33 scans from 23 patients was used to evaluate the performance of the system. The contours from automatic segmentation were compared to both 2D and 3D gold standard contours manually drawn by three experienced radiologists. Three performance metric measures were used for the comparison. In addition, a radiologist provided quality ratings on a 1 to 10 scale for all of the automatic segmentations. For this pilot study, the authors observed that the differences between the automatic and gold standard contours were larger than the interobserver differences. However, the system performed comparably to the radiologists, achieving an average area intersection ratio of 85.4% compared to an average of 91.2% between two radiologists. The average absolute area error was 21.1% compared to 10.8%, and the average 2D distance was 1.38 mm compared to 0.84 mm between the radiologists. In addition, the quality rating data showed that, despite the very lax assumptions made on the lesion characteristics in designing the system, the automatic contours approximated many of the lesions very well.
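The area-overlap metrics quoted above can be computed directly from binary contour masks. A sketch with plausible formulations (the paper's exact definitions may differ); the masks here are synthetic squares, not lesion data:

```python
import numpy as np

# Plausible formulations of the two area metrics; synthetic masks only.

def area_intersection_ratio(a, b):
    """Intersection area over the mean of the two contour areas, in percent."""
    inter = np.logical_and(a, b).sum()
    return 100.0 * inter / (0.5 * (a.sum() + b.sum()))

def absolute_area_error(a, ref):
    """Absolute area difference relative to the reference area, in percent."""
    return 100.0 * abs(int(a.sum()) - int(ref.sum())) / ref.sum()

a = np.zeros((10, 10), bool); a[2:8, 2:8] = True   # 36-pixel square
b = np.zeros((10, 10), bool); b[3:9, 2:8] = True   # same size, shifted one row

print(round(area_intersection_ratio(a, b), 1))   # 83.3 (30 of 36 pixels overlap)
print(absolute_area_error(a, b))                 # 0.0 (areas are equal)
```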
  • Purpose: Accurate segmentation and volume estimation of the prostate gland in magnetic resonance (MR) and computed tomography (CT) images are necessary steps in diagnosis, treatment, and monitoring of prostate cancer. This paper presents an algorithm for the prostate gland volume estimation based on the semiautomated segmentation of individual slices in T2-weighted MR and CT image sequences. Methods: The proposed Inter-Slice Bidirectional Registration-based Segmentation (iBRS) algorithm relies on interslice image registration of volume data to segment the prostate gland without the use of an anatomical atlas. It requires the user to mark only three slices in a given volume dataset, i.e., the first, middle, and last slices. Next, the proposed algorithm uses a registration algorithm to autosegment the remaining slices. We conducted comprehensive experiments to measure the performance of the proposed algorithm using three registration methods (i.e., rigid, affine, and nonrigid techniques). Results: The results with the proposed technique were compared with manual marking using prostate MR and CT images from 117 patients. Manual marking was performed by an expert user for all 117 patients. The median accuracies for individual slices measured using the Dice similarity coefficient (DSC) were 92% and 91% for MR and CT images, respectively. The iBRS algorithm was also evaluated regarding user variability, which confirmed that the algorithm was robust to interuser variability when marking the prostate gland. Conclusions: The proposed algorithm exploits the interslice data redundancy of the images in a volume dataset of MR and CT images and eliminates the need for an atlas, minimizing the computational cost while producing highly accurate results which are robust to interuser variability.
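The Dice similarity coefficient (DSC) used to score contours in the abstract above is 2|A∩B| / (|A| + |B|). A minimal sketch on synthetic masks:

```python
import numpy as np

# Dice similarity coefficient, DSC = 2|A∩B| / (|A| + |B|), on synthetic masks.
def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two 30x30 squares offset by two rows: 840 of 900 pixels overlap
auto   = np.zeros((64, 64), bool); auto[10:40, 10:40] = True
manual = np.zeros((64, 64), bool); manual[12:42, 10:40] = True
print(round(dice(auto, manual), 3))   # 0.933
```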
  • Purpose: To evaluate the accuracy of measuring volumes using three-dimensional ultrasound (3D US), and to verify the feasibility of the replacement of CT-MR fusion images with CT-3D US in radiotherapy treatment planning. Methods: Phantoms, consisting of water, contrast agent, and agarose, were manufactured. The volume was measured using 3D US, CT, and MR devices. A CT-3D US and MR-3D US image fusion software was developed using the Insight Toolkit library in order to acquire three-dimensional fusion images. The quality of the image fusion was evaluated using metric value and fusion images. Results: Volume measurement, using 3D US, shows a 2.8 ± 1.5% error, 4.4 ± 3.0% error for CT, and 3.1 ± 2.0% error for MR. The results imply that volume measurement using the 3D US devices has a similar accuracy level to that of CT and MR. Three-dimensional image fusion of CT-3D US and MR-3D US was successfully performed using phantom images. Moreover, MR-3D US image fusion was performed using human bladder images. Conclusions: 3D US could be used in the volume measurement of human bladders and prostates. CT-3D US image fusion could be used in monitoring the target position in each fraction of external beam radiation therapy. Moreover, the feasibility of replacing CT-MR image fusion with CT-3D US in radiotherapy treatment planning was verified.
  • Purpose: Automatic prostate segmentation from MR images is an important task in various clinical applications such as prostate cancer staging and MR-guided radiotherapy planning. However, the large appearance and shape variations of the prostate in MR images make the segmentation problem difficult to solve. Traditional Active Shape/Appearance Model (ASM/AAM) has limited accuracy on this problem, since its basic assumption, i.e., both shape and appearance of the targeted organ follow Gaussian distributions, is invalid in prostate MR images. To this end, the authors propose a sparse dictionary learning method to model the image appearance in a nonparametric fashion and further integrate the appearance model into a deformable segmentation framework for prostate MR segmentation. Methods: To drive the deformable model for prostate segmentation, the authors propose nonparametric appearance and shape models. The nonparametric appearance model is based on a novel dictionary learning method, namely distributed discriminative dictionary (DDD) learning, which is able to capture fine distinctions in image appearance. To increase the differential power of traditional dictionary-based classification methods, the authors' DDD learning approach takes three strategies. First, two dictionaries for prostate and nonprostate tissues are built, respectively, using the discriminative features obtained from minimum redundancy maximum relevance feature selection. Second, linear discriminant analysis is employed as a linear classifier to boost the optimal separation between prostate and nonprostate tissues, based on the representation residuals from sparse representation. Third, to enhance the robustness of the authors' classification method, multiple local dictionaries are learned for local regions along the prostate boundary (each with small appearance variations), instead of learning one global classifier for the entire prostate. These discriminative dictionaries are located on different patches of the prostate surface and trained to adaptively capture the appearance in different prostate zones, thus achieving better local tissue differentiation. For each local region, multiple classifiers are trained based on the randomly selected samples and finally assembled by a specific fusion method. In addition to this nonparametric appearance model, a prostate shape model is learned from the shape statistics using a novel approach, sparse shape composition, which can model non-Gaussian distributions of shape variation and regularize the 3D mesh deformation by constraining it within the observed shape subspace. Results: The proposed method has been evaluated on two datasets consisting of T2-weighted MR prostate images. For the first (internal) dataset, the classification effectiveness of the authors' improved dictionary learning has been validated by comparing it with three other variants of traditional dictionary learning methods. The experimental results show that the authors' method yields a Dice Ratio of 89.1% compared to the manual segmentation, which is more accurate than the three state-of-the-art MR prostate segmentation methods under comparison. For the second dataset, the MICCAI 2012 challenge dataset, the authors' proposed method yields a Dice Ratio of 87.4%, which also achieves better segmentation accuracy than other methods under comparison. Conclusions: A new magnetic resonance image prostate segmentation method is proposed based on the combination of deformable model and dictionary learning methods, which achieves more accurate segmentation performance on prostate T2 MR images.
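The residual-based classification at the heart of the dictionary approach above can be sketched in a few lines. This toy version substitutes least-squares coding for sparse coding and PCA-derived atoms for learned dictionary atoms; all data and names are illustrative, not the paper's method or data:

```python
import numpy as np

# Toy sketch of residual-based classification with two class dictionaries.
# Simplifications vs. the paper: least-squares coding stands in for sparse
# coding, and PCA atoms stand in for learned atoms. Data are synthetic.
rng = np.random.default_rng(0)

def fit_dictionary(X, n_atoms):
    """Top right-singular vectors of the centered samples as dictionary atoms."""
    _, _, vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return vt[:n_atoms].T                      # shape (n_features, n_atoms)

def residual(x, D):
    """Reconstruction residual of x under dictionary D (least-squares code)."""
    code, *_ = np.linalg.lstsq(D, x, rcond=None)
    return np.linalg.norm(x - D @ code)

def classify(x, D_prostate, D_background):
    """Assign x to the class whose dictionary reconstructs it best."""
    return ("prostate" if residual(x, D_prostate) < residual(x, D_background)
            else "background")

# Synthetic features: each class lives on its own 3D subspace of R^20
basis_p = rng.normal(size=(20, 3))
basis_b = rng.normal(size=(20, 3))
X_p = (basis_p @ rng.normal(size=(3, 200))).T   # 200 "prostate" samples
X_b = (basis_b @ rng.normal(size=(3, 200))).T   # 200 "background" samples

D_p = fit_dictionary(X_p, 3)
D_b = fit_dictionary(X_b, 3)

x_new = basis_p @ rng.normal(size=3)            # unseen prostate-like feature
print(classify(x_new, D_p, D_b))                # prostate
```

The design point this illustrates is that each class dictionary reconstructs its own class with small residual, so the residual gap itself is the classifier; the paper sharpens that gap with discriminative feature selection and LDA on the residuals.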