Multiatlas-Based Segmentation Editing with Interaction-Guided Patch Selection and Label Fusion

Sang Hyun Park, Yaozong Gao, Dinggang Shen

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

We propose a novel multiatlas-based segmentation method to address the segmentation editing scenario, where an incomplete segmentation is given along with a set of existing reference label images (used as atlases). Unlike previous multiatlas-based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate atlas label patches in the reference label set and derive their weights for label fusion. Specifically, user interactions provided on the erroneous parts are first divided into multiple local combinations. For each combination, the atlas label patches well-matched with both interactions and the previous segmentation are identified. Then, the segmentation is updated through the voxelwise label fusion of selected atlas label patches with their weights derived from the distances of each underlying voxel to the interactions. Since the atlas label patches well-matched with different local combinations are used in the fusion step, our method can consider various local shape variations during the segmentation update, even with only limited atlas label images and user interactions. Besides, since our method does not depend on either image appearance or sophisticated learning steps, it can be easily applied to general editing problems. To demonstrate the generality of our method, we apply it to editing segmentations of CT prostate, CT brainstem, and MR hippocampus, respectively. Experimental results show that our method outperforms existing editing methods in all three datasets.
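The core update step described in the abstract — voxelwise label fusion of selected atlas label patches, with each patch's vote weighted by the underlying voxel's distance to the user interactions — can be sketched roughly as follows. This is a minimal illustration of distance-based weighted voting, not the paper's actual implementation; the function name, the Gaussian weighting kernel, and the 0.5 voting threshold are all assumptions for the sketch.

```python
import numpy as np

def distance_weighted_fusion(patches, interactions, shape, sigma=2.0):
    """Fuse binary atlas label patches by voxelwise weighted voting.

    patches      -- list of binary label arrays, each of the given `shape`
    interactions -- list of (N_k, D) coordinate arrays, one per patch,
                    giving the user clicks associated with that patch
    Each patch's vote at a voxel is weighted by the voxel's distance to
    that patch's interactions (closer interactions -> stronger vote).
    """
    # Coordinate grid of shape (*shape, D)
    coords = np.stack(
        np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"), axis=-1
    )
    votes = np.zeros(shape)
    norm = np.zeros(shape)
    for patch, pts in zip(patches, interactions):
        # Distance from every voxel to its nearest interaction point
        d = np.min(np.linalg.norm(coords[..., None, :] - pts, axis=-1), axis=-1)
        w = np.exp(-(d ** 2) / (2.0 * sigma ** 2))  # Gaussian weight (assumed)
        votes += w * patch
        norm += w
    # Majority of the distance-weighted votes decides the final label
    return (votes / np.maximum(norm, 1e-12)) > 0.5
```

With two conflicting patches on a small grid, the fused label near each interaction follows the patch selected by that interaction, which is the qualitative behavior the abstract describes.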

Original language: English
Article number: 7299268
Pages (from-to): 1208-1219
Number of pages: 12
Journal: IEEE Transactions on Biomedical Engineering
Volume: 63
Issue number: 6
DOI: 10.1109/TBME.2015.2491612
Publication status: Published - 2016 Jun 1

Keywords

  • Distance-based voting
  • interaction-guided editing
  • label fusion
  • Segmentation editing

ASJC Scopus subject areas

  • Biomedical Engineering

Cite this

Multiatlas-Based Segmentation Editing with Interaction-Guided Patch Selection and Label Fusion. / Park, Sang Hyun; Gao, Yaozong; Shen, Dinggang.

In: IEEE Transactions on Biomedical Engineering, Vol. 63, No. 6, 7299268, 01.06.2016, p. 1208-1219.

Research output: Contribution to journal › Article

@article{9aa17b2a2c524111924a975201a35e3b,
title = "Multiatlas-Based Segmentation Editing with Interaction-Guided Patch Selection and Label Fusion",
abstract = "We propose a novel multiatlas-based segmentation method to address the segmentation editing scenario, where an incomplete segmentation is given along with a set of existing reference label images (used as atlases). Unlike previous multiatlas-based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate atlas label patches in the reference label set and derive their weights for label fusion. Specifically, user interactions provided on the erroneous parts are first divided into multiple local combinations. For each combination, the atlas label patches well-matched with both interactions and the previous segmentation are identified. Then, the segmentation is updated through the voxelwise label fusion of selected atlas label patches with their weights derived from the distances of each underlying voxel to the interactions. Since the atlas label patches well-matched with different local combinations are used in the fusion step, our method can consider various local shape variations during the segmentation update, even with only limited atlas label images and user interactions. Besides, since our method does not depend on either image appearance or sophisticated learning steps, it can be easily applied to general editing problems. To demonstrate the generality of our method, we apply it to editing segmentations of CT prostate, CT brainstem, and MR hippocampus, respectively. Experimental results show that our method outperforms existing editing methods in all three datasets.",
keywords = "Distance-based voting, interaction-guided editing, label fusion, Segmentation editing",
author = "Park, {Sang Hyun} and Yaozong Gao and Dinggang Shen",
year = "2016",
month = "6",
day = "1",
doi = "10.1109/TBME.2015.2491612",
language = "English",
volume = "63",
pages = "1208--1219",
journal = "IEEE Transactions on Biomedical Engineering",
issn = "0018-9294",
publisher = "IEEE Computer Society",
number = "6",

}

TY - JOUR

T1 - Multiatlas-Based Segmentation Editing with Interaction-Guided Patch Selection and Label Fusion

AU - Park, Sang Hyun

AU - Gao, Yaozong

AU - Shen, Dinggang

PY - 2016/6/1

Y1 - 2016/6/1

N2 - We propose a novel multiatlas-based segmentation method to address the segmentation editing scenario, where an incomplete segmentation is given along with a set of existing reference label images (used as atlases). Unlike previous multiatlas-based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate atlas label patches in the reference label set and derive their weights for label fusion. Specifically, user interactions provided on the erroneous parts are first divided into multiple local combinations. For each combination, the atlas label patches well-matched with both interactions and the previous segmentation are identified. Then, the segmentation is updated through the voxelwise label fusion of selected atlas label patches with their weights derived from the distances of each underlying voxel to the interactions. Since the atlas label patches well-matched with different local combinations are used in the fusion step, our method can consider various local shape variations during the segmentation update, even with only limited atlas label images and user interactions. Besides, since our method does not depend on either image appearance or sophisticated learning steps, it can be easily applied to general editing problems. To demonstrate the generality of our method, we apply it to editing segmentations of CT prostate, CT brainstem, and MR hippocampus, respectively. Experimental results show that our method outperforms existing editing methods in all three datasets.

AB - We propose a novel multiatlas-based segmentation method to address the segmentation editing scenario, where an incomplete segmentation is given along with a set of existing reference label images (used as atlases). Unlike previous multiatlas-based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate atlas label patches in the reference label set and derive their weights for label fusion. Specifically, user interactions provided on the erroneous parts are first divided into multiple local combinations. For each combination, the atlas label patches well-matched with both interactions and the previous segmentation are identified. Then, the segmentation is updated through the voxelwise label fusion of selected atlas label patches with their weights derived from the distances of each underlying voxel to the interactions. Since the atlas label patches well-matched with different local combinations are used in the fusion step, our method can consider various local shape variations during the segmentation update, even with only limited atlas label images and user interactions. Besides, since our method does not depend on either image appearance or sophisticated learning steps, it can be easily applied to general editing problems. To demonstrate the generality of our method, we apply it to editing segmentations of CT prostate, CT brainstem, and MR hippocampus, respectively. Experimental results show that our method outperforms existing editing methods in all three datasets.

KW - Distance-based voting

KW - interaction-guided editing

KW - label fusion

KW - Segmentation editing

UR - http://www.scopus.com/inward/record.url?scp=84976420232&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84976420232&partnerID=8YFLogxK

U2 - 10.1109/TBME.2015.2491612

DO - 10.1109/TBME.2015.2491612

M3 - Article

C2 - 26485353

AN - SCOPUS:84976420232

VL - 63

SP - 1208

EP - 1219

JO - IEEE Transactions on Biomedical Engineering

JF - IEEE Transactions on Biomedical Engineering

SN - 0018-9294

IS - 6

M1 - 7299268

ER -