Simple Yet Effective Way for Improving the Performance of Depth Map Super-Resolution

Yoon Jae Yeo, Min Cheol Sagong, Yong Goo Shin, Seung Won Jung, Sung Jea Ko

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)


In depth map super-resolution (SR), a high-resolution color image plays an important role as guidance for preventing blurry depth boundaries. However, excessive or deficient use of the color image features often causes performance degradation, such as texture-copying in flat areas or edge-smoothing at boundaries. To alleviate these problems, this letter presents a simple yet effective method for enhancing SR performance without requiring significant modifications to the original SR network. To this end, we present a self-selective concatenation (SSC), a substitute for conventional feature concatenation. In the upsampling layers of the SR network, the SSC extracts spatial and channel attention from both color and depth features so that color features can be selectively used for depth SR. Specifically, the SSC learns to use sufficient color features for rendering sharp depth boundaries, whereas their effect is reduced in smooth regions to prevent texture-copying. The proposed SSC can be included in any existing SR network that has an encoder-decoder structure. The experimental results show that the proposed method further improves the performance of existing SR networks in terms of root mean squared error and peak signal-to-noise ratio.
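The core idea of the abstract — gating color features with spatial and channel attention before concatenating them with depth features — can be illustrated with a minimal NumPy sketch. Note this is a simplified stand-in, not the paper's actual SSC module: in the real network the attention maps are produced by learned convolutional layers, whereas here fixed pooling operations take their place, and the function name `ssc_fuse` and all shapes are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ssc_fuse(depth_feat, color_feat):
    """Illustrative sketch of attention-gated concatenation.

    depth_feat, color_feat: feature maps of shape (C, H, W).
    - Channel attention (C,): squashed channel statistics of the joint
      features, deciding how much each color channel contributes.
    - Spatial attention (H, W): squashed per-pixel statistics, so color
      features are emphasized near boundaries and suppressed in smooth
      regions (preventing texture-copying).
    The gated color features are then concatenated with the depth
    features, replacing plain concatenation.
    """
    c = color_feat.shape[0]
    joint = np.concatenate([depth_feat, color_feat], axis=0)   # (2C, H, W)
    ch_att = sigmoid(joint.mean(axis=(1, 2))[:c])              # (C,)
    sp_att = sigmoid(joint.mean(axis=0))                       # (H, W)
    gated_color = color_feat * ch_att[:, None, None] * sp_att[None, :, :]
    return np.concatenate([depth_feat, gated_color], axis=0)   # (2C, H, W)
```

Because both attention maps lie in (0, 1), the color branch can only be attenuated, never amplified, which mirrors the abstract's description of selectively reducing color influence in smooth regions.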

Original language: English
Article number: 9264663
Pages (from-to): 2099-2103
Number of pages: 5
Journal: IEEE Signal Processing Letters
Publication status: Published - 2020


Keywords

  • Depth map super-resolution (SR)
  • convolutional neural network
  • deep learning

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
  • Applied Mathematics


