Multi-task learning for segmentation and classification of tumors in 3D automated breast ultrasound images

Yue Zhou, Houjin Chen, Yanfeng Li, Qin Liu, Xuanang Xu, Shu Wang, Pew-Thian Yap, Dinggang Shen

Research output: Contribution to journal › Article › peer-review

Abstract

Tumor classification and segmentation are two important tasks for computer-aided diagnosis (CAD) using 3D automated breast ultrasound (ABUS) images. However, they are challenging due to the significant shape variation of breast tumors and the fuzzy nature of ultrasound images (e.g., low contrast and low signal-to-noise ratio). Considering the correlation between tumor classification and segmentation, we argue that learning these two tasks jointly can improve the outcomes of both. In this paper, we propose a novel multi-task learning framework for joint segmentation and classification of tumors in ABUS images. The proposed framework consists of two sub-networks: an encoder-decoder network for segmentation and a lightweight multi-scale network for classification. To account for the fuzzy boundaries of tumors in ABUS images, our framework uses an iterative training strategy to refine feature maps with the help of probability maps obtained from previous iterations. Experimental results based on a clinical dataset of 170 3D ABUS volumes collected from 107 patients indicate that the proposed multi-task framework improves tumor segmentation and classification over single-task learning counterparts.
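
To illustrate the joint architecture and the iterative refinement described in the abstract, below is a minimal PyTorch-style sketch. The module names, channel sizes, two-class output, and the scheme of feeding the previous probability map back as an extra input channel are illustrative assumptions, not the authors' released implementation.

    # Hypothetical sketch of a joint segmentation/classification network.
    # Architecture details are assumptions for illustration only.
    import torch
    import torch.nn as nn

    class ConvBlock(nn.Module):
        """Two 3D convolutions with batch norm and ReLU."""
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.block = nn.Sequential(
                nn.Conv3d(in_ch, out_ch, 3, padding=1),
                nn.BatchNorm3d(out_ch), nn.ReLU(inplace=True),
                nn.Conv3d(out_ch, out_ch, 3, padding=1),
                nn.BatchNorm3d(out_ch), nn.ReLU(inplace=True),
            )
        def forward(self, x):
            return self.block(x)

    class MultiTaskNet(nn.Module):
        """Encoder-decoder segmentation branch plus a lightweight classification head.

        The ABUS volume is concatenated with the probability map from the previous
        iteration (a uniform prior at iteration 0), mimicking the iterative
        refinement strategy described in the abstract.
        """
        def __init__(self, base_ch=16):
            super().__init__()
            self.enc1 = ConvBlock(2, base_ch)             # image + previous probability map
            self.enc2 = ConvBlock(base_ch, base_ch * 2)
            self.pool = nn.MaxPool3d(2)
            self.up = nn.Upsample(scale_factor=2, mode="trilinear", align_corners=False)
            self.dec1 = ConvBlock(base_ch * 2 + base_ch, base_ch)
            self.seg_head = nn.Conv3d(base_ch, 1, 1)      # voxel-wise tumor probability
            self.cls_head = nn.Sequential(                # pooled encoder features -> class scores
                nn.AdaptiveAvgPool3d(1), nn.Flatten(), nn.Linear(base_ch * 2, 2)
            )
        def forward(self, x, prev_prob):
            x = torch.cat([x, prev_prob], dim=1)
            f1 = self.enc1(x)
            f2 = self.enc2(self.pool(f1))
            d1 = self.dec1(torch.cat([self.up(f2), f1], dim=1))
            seg = torch.sigmoid(self.seg_head(d1))
            cls = self.cls_head(f2)
            return seg, cls

    # Iterative refinement (sketch): the segmentation probability map from one
    # pass is fed back as an extra input channel in the next pass.
    if __name__ == "__main__":
        net = MultiTaskNet()
        volume = torch.randn(1, 1, 32, 64, 64)            # toy ABUS sub-volume
        prob = torch.full_like(volume, 0.5)               # uniform prior at iteration 0
        for _ in range(2):
            seg, cls = net(volume, prob)
            prob = seg.detach()                           # refined map for the next pass
        print(seg.shape, cls.shape)

In practice the segmentation and classification losses would be combined into a single training objective so that both sub-networks share and jointly refine the encoder features.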

Original language: English
Article number: 101918
Journal: Medical Image Analysis
Volume: 70
DOIs
Publication status: Published - May 2021

Keywords

  • ABUS image
  • Classification
  • Joint training
  • Multi-task learning
  • Segmentation

ASJC Scopus subject areas

  • Radiological and Ultrasound Technology
  • Radiology, Nuclear Medicine and Imaging
  • Computer Vision and Pattern Recognition
  • Health Informatics
  • Computer Graphics and Computer-Aided Design
