OSTI.GOV | U.S. Department of Energy, Office of Scientific and Technical Information

Title: SU-C-207B-07: Deep Convolutional Neural Network Image Matching for Ultrasound Guidance in Radiotherapy

Abstract

Purpose: Robust matching of ultrasound images is a challenging problem, as images of the same anatomy often present non-trivial differences. This poses an obstacle for ultrasound guidance in radiotherapy. Our objective is to overcome this obstacle by designing and evaluating an image-block matching framework based on a two-channel deep convolutional neural network.

Methods: We extend to 3D an algorithmic structure previously introduced for 2D image feature learning [1]. To obtain the similarity between two 3D image blocks A and B, the blocks are divided into 2D patches Ai and Bi, and the similarity is calculated as the average of the similarity scores of the corresponding patches Ai and Bi. The neural network was trained on public non-medical image pairs and subsequently evaluated on ultrasound image blocks for the following scenarios: (S1) identical image blocks with/without shifts (A and A-shift-x); (S2) unrelated random block pairs; (S3) ground-truth-registration matched pairs from different ultrasound images with/without shifts (A-i and A-reg-i-shift-x).

Results: For S1, the similarity scores of A and A-shift-x were 32.63, 18.38, 12.95, 9.23, 2.15, and 0.43 for x ranging from 0 mm to 10 mm in 2 mm increments. For S2, the average similarity score of unrelated block pairs was −1.15. For S3, the average similarity score of ground-truth-registration matched blocks A-i and A-reg-i-shift-0 (1 ≤ i ≤ 5) was 12.37. After translating A-reg-i-shift-0 by 2 mm, 4 mm, 6 mm, 8 mm, and 10 mm, the average similarity scores of A-i and A-reg-i-shift-x were 11.04, 8.42, 4.56, 2.27, and 0.29, respectively.

Conclusion: The proposed method correctly assigns the highest similarity to corresponding 3D ultrasound image blocks despite differences in image content, and thus can form the basis for ultrasound image registration and tracking.

[1] Zagoruyko, S. and Komodakis, N., "Learning to compare image patches via convolutional neural networks," IEEE CVPR 2015, pp. 4353–4361.
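To make the matching scheme concrete, below is a minimal sketch of the two-channel idea from [1] combined with the slice-averaged 3D block score described in the Methods. It assumes PyTorch; the layer sizes, the 64x64 patch size, and the names TwoChannelNet and block_similarity are illustrative assumptions, not the authors' implementation.

# Sketch of a two-channel patch-similarity network (after Zagoruyko &
# Komodakis, CVPR 2015) and the slice-averaged 3D block score from the
# abstract. All architectural details here are illustrative.
import torch
import torch.nn as nn

class TwoChannelNet(nn.Module):
    """Scores a pair of 2D patches stacked as one 2-channel image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 96, kernel_size=7, stride=3), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(96, 192, kernel_size=5), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(192, 256, kernel_size=3), nn.ReLU(inplace=True),
        )
        self.decision = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(256), nn.ReLU(inplace=True),
            nn.Linear(256, 1),  # single scalar similarity score per pair
        )

    def forward(self, a, b):
        # a, b: (N, 1, H, W) grayscale patches; stack into 2-channel input.
        x = torch.cat([a, b], dim=1)
        return self.decision(self.features(x)).squeeze(1)

def block_similarity(net, block_a, block_b):
    """Similarity of two 3D blocks of shape (D, H, W): average of the
    2D similarity scores of corresponding slices A_i and B_i."""
    a = block_a.unsqueeze(1)  # (D, 1, H, W): each slice is one patch A_i
    b = block_b.unsqueeze(1)
    net.eval()
    with torch.no_grad():
        return net(a, b).mean().item()

Training would follow the cited CVPR paper: minimize a margin-based (hinge) loss over labeled matching and non-matching patch pairs, so that, as in the reported results, matched blocks score high and unrelated blocks score near or below zero.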

Authors:
Zhu, N; Najafi, M; Hancock, S; Hristov, D [1]
  1. Stanford University Cancer Center, Palo Alto, CA (United States)
Publication Date:
2016-06-15
OSTI Identifier:
22624356
Resource Type:
Journal Article
Resource Relation:
Journal Name: Medical Physics; Journal Volume: 43; Journal Issue: 6; Other Information: (c) 2016 American Association of Physicists in Medicine; Country of input: International Atomic Energy Agency (IAEA)
Country of Publication:
United States
Language:
English
Subject:
60 APPLIED LIFE SCIENCES; 61 RADIATION PROTECTION AND DOSIMETRY; ANATOMY; DESIGN; GROUND TRUTH MEASUREMENTS; IMAGES; NEURAL NETWORKS; RADIOTHERAPY

Citation Formats

Zhu, N, Najafi, M, Hancock, S, and Hristov, D. SU-C-207B-07: Deep Convolutional Neural Network Image Matching for Ultrasound Guidance in Radiotherapy. United States: N. p., 2016. Web. doi:10.1118/1.4955603.
Zhu, N, Najafi, M, Hancock, S, & Hristov, D. SU-C-207B-07: Deep Convolutional Neural Network Image Matching for Ultrasound Guidance in Radiotherapy. United States. doi:10.1118/1.4955603.
Zhu, N, Najafi, M, Hancock, S, and Hristov, D. 2016. "SU-C-207B-07: Deep Convolutional Neural Network Image Matching for Ultrasound Guidance in Radiotherapy". United States. doi:10.1118/1.4955603.
@article{osti_22624356,
title = {SU-C-207B-07: Deep Convolutional Neural Network Image Matching for Ultrasound Guidance in Radiotherapy},
author = {Zhu, N and Najafi, M and Hancock, S and Hristov, D},
doi = {10.1118/1.4955603},
journal = {Medical Physics},
number = 6,
volume = 43,
place = {United States},
year = {2016},
month = {jun}
}