TY - GEN
T1 - Deep Temporal Color Constancy for AC Light Sources
AU - Yoo, Jun Sang
AU - Lee, Chan Ho
AU - Kim, Jong Ok
N1 - Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2019R1A2C1005834) and by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2020-0-01749-001) supervised by the IITP (Institute of Information & Communications Technology Planning & Evaluation).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/12/1
Y1 - 2020/12/1
AB - Most of the lights surrounding us are artificial lights, whose power is supplied by alternating current (AC). The intensities of these lights vary dynamically over time. In this paper, we propose a novel deep learning-based method for temporal color constancy. We capture this intensity variation of AC lights with a high-speed camera and use it as a cue to learn illuminant chromaticity. While most existing methods estimate the illuminant from spatial pixels, the proposed method learns temporal features from the AC flicker in a high-speed video. To learn temporal features effectively, the high-speed temporal correlation is fed into the proposed network, helping it concentrate on illuminant-attentive regions. As a result, the proposed method works well in complex illumination environments with ambient light, which has been a very hard problem for existing spatial methods. Experimental results show that the proposed method outperforms all existing methods and demonstrate that it works robustly under various illumination environments.
KW - AC flicker
KW - high-speed video
KW - temporal color constancy
KW - temporal correlation
UR - http://www.scopus.com/inward/record.url?scp=85099432897&partnerID=8YFLogxK
U2 - 10.1109/VCIP49819.2020.9301816
DO - 10.1109/VCIP49819.2020.9301816
M3 - Conference contribution
AN - SCOPUS:85099432897
T3 - 2020 IEEE International Conference on Visual Communications and Image Processing, VCIP 2020
SP - 217
EP - 221
BT - 2020 IEEE International Conference on Visual Communications and Image Processing, VCIP 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Conference on Visual Communications and Image Processing, VCIP 2020
Y2 - 1 December 2020 through 4 December 2020
ER -