Base-detail image inpainting
Ruonan Zhang (Peng Cheng Laboratory), Yurui Ren (Shenzhen Graduate School, Peking University), Ge Li (SECE, Shenzhen Graduate School, Peking University), Jingfei Qiu (Peng Cheng Laboratory)

Abstract
Recent advances in image inpainting have shown exciting promise with learning-based methods. Although these methods are effective at capturing features with certain prior techniques, most of them fail to reconstruct reasonable base and detail information, so the inpainted regions appear blurry, over-smoothed, and visually implausible. We therefore propose a new "Divide and Conquer" model, Base-Detail Image Inpainting, which combines reconstructed base and detail layers to generate the final images. The base layer, carrying low-frequency information, captures the basic distribution, while the detail layer, carrying high-frequency information, supplies the fine details. The joint generator benefits from both layers, which act as guiding anchors. In addition, we evaluate our two models on three publicly available datasets, and our experiments demonstrate that our method outperforms current state-of-the-art techniques both quantitatively and qualitatively.
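The base-detail split described in the abstract can be illustrated with a simple frequency decomposition. The sketch below is only an assumption about how such a split might be computed, not the authors' implementation: the Gaussian filter choice, the sigma value, and the function names decompose_base_detail and recompose are hypothetical. A blurred copy serves as the low-frequency base layer, the residual as the high-frequency detail layer, and adding the two back reproduces the input.

import numpy as np
from scipy.ndimage import gaussian_filter

def decompose_base_detail(image, sigma=3.0):
    # Hypothetical decomposition: the paper does not specify the filter;
    # a Gaussian blur stands in for the low-frequency base extraction.
    img = image.astype(np.float32)
    base = gaussian_filter(img, sigma=(sigma, sigma, 0))  # low-frequency base layer
    detail = img - base                                   # high-frequency residual
    return base, detail

def recompose(base, detail):
    # Recombining the two layers reconstructs the input image.
    return np.clip(np.rint(base + detail), 0, 255).astype(np.uint8)

# Usage on a dummy RGB image standing in for a (masked) input.
img = (np.random.rand(128, 128, 3) * 255).astype(np.uint8)
base, detail = decompose_base_detail(img)
restored = recompose(base, detail)
assert np.array_equal(restored, img)

In the paper's setting, each layer would instead be reconstructed by its own generator before recombination; the sketch only shows that the two layers are complementary and lossless to recombine.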
DOI
10.5244/C.33.218
https://dx.doi.org/10.5244/C.33.218
BibTeX
@inproceedings{BMVC2019,
title={Base-detail image inpainting},
author={Ruonan Zhang and Yurui Ren and Ge Li and Jingfei Qiu},
year={2019},
month={September},
pages={218.1--218.10},
articleno={218},
numpages={10},
booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
publisher={BMVA Press},
editor={Kirill Sidorov and Yulia Hicks},
doi={10.5244/C.33.218},
url={https://dx.doi.org/10.5244/C.33.218}
}