SINGLE IMAGE REFLECTION REMOVAL BASED ON BI-CHANNELS PRIOR

Wei Ting Chen*, Yi Wen Chen*, Kuan Yu Chen*, Jian Jiun Ding*, Sy Yen Kuo*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Single image reflection removal is a crucial technique that can improve the performance of object detection, semantic segmentation, and various other computer vision applications. In this paper, we present a novel reflection removal algorithm based on bi-channel priors (i.e., the dark channel prior and the bright channel prior). We observe that, in the presence of reflections, the values of dark channel pixels are no longer close to 0 and those of bright channel pixels are no longer close to 1. We first demonstrate these phenomena statistically and mathematically, and then apply these properties as constraints in the proposed reflection removal optimization. Extensive experiments on several well-known benchmarks demonstrate that our approach achieves favorable reflection suppression compared with other methods.
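As a rough illustration of the bi-channel observation stated in the abstract, the sketch below (not from the paper; the patch size, image, and blending weights are illustrative assumptions) computes per-pixel dark and bright channels with a naive sliding window and shows how a superimposed reflection layer lifts the dark channel away from 0 and pulls the bright channel away from 1:

```python
import numpy as np

def _channel_extreme(img, patch, reduce_fn):
    """Per-pixel extreme over RGB and a local square patch (naive sliding window)."""
    per_pixel = reduce_fn(img, axis=2)            # collapse the color axis first
    pad = patch // 2
    padded = np.pad(per_pixel, pad, mode="edge")  # replicate borders
    h, w = per_pixel.shape
    out = np.empty_like(per_pixel)
    for i in range(h):
        for j in range(w):
            out[i, j] = reduce_fn(padded[i:i + patch, j:j + patch])
    return out

def dark_channel(img, patch=15):
    """Dark channel prior: minimum over color channels and a local patch."""
    return _channel_extreme(img, patch, np.min)

def bright_channel(img, patch=15):
    """Bright channel prior: maximum over color channels and a local patch."""
    return _channel_extreme(img, patch, np.max)

# Hypothetical example: a clean scene with true blacks and whites,
# then the same scene blended with a uniform reflection layer.
clean = np.zeros((8, 8, 3))
clean[2:6, 2:6, :] = 1.0                 # bright square on a black background
blended = 0.7 * clean + 0.3 * 0.5        # additive reflection model (weights assumed)
# dark_channel(clean, 3).min() == 0, but dark_channel(blended, 3).min() > 0;
# bright_channel(clean, 3).max() == 1, but bright_channel(blended, 3).max() < 1.
```

The loop-based window is deliberately simple; an efficient implementation would use a min/max filter, but the point here is only that blending in a reflection layer shifts both channel statistics, which is the property the paper uses as an optimization constraint.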

Original language: English
Title of host publication: 2022 IEEE International Conference on Image Processing, ICIP 2022 - Proceedings
Publisher: IEEE Computer Society
Pages: 2117-2121
Number of pages: 5
ISBN (Electronic): 9781665496209
DOIs
State: Published - 2022
Externally published: Yes
Event: 29th IEEE International Conference on Image Processing, ICIP 2022 - Bordeaux, France
Duration: 16 10 2022 – 19 10 2022

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 29th IEEE International Conference on Image Processing, ICIP 2022
Country/Territory: France
City: Bordeaux
Period: 16/10/22 – 19/10/22

Bibliographical note

Publisher Copyright:
© 2022 IEEE.

Keywords

  • Bi-channels prior
  • Layer separation
  • Reflection removal
