International Journal of Emerging Research in Engineering, Science, and Management
Vol. 1, Issue 2, pp. 18-25, Apr-Jun 2022.
https://doi.org/10.58482/ijeresm.v1i2.4

Multi-exposure Image Fusion using Patch-based Component Decomposition

Dharmika A

M Gnanapriya

PG Scholar, Dept. of ECE, Gokula Krishna College of Engineering, Sullurpeta

Professor & Vice Principal, Dept. of ECE, Gokula Krishna College of Engineering, Sullurpeta

Abstract: Multi-exposure image fusion remains a challenging task in image processing. When multiple images with different content are combined by a fusion rule, various artifacts can arise; one of the most prominent is the ghosting effect. Ghosting can occur even during image capture, and in its mildest form it appears as image blur. The proposed technique handles ghosting as well as the many other artifacts generated during fusion. The proposed scheme introduces a completely new representation that may be explored for many different applications. First, the input images are decomposed into several patches. Because fusion involves multiple input images, the spatially corresponding patches are grouped into a class. Each patch of the class is decomposed into three logical components: signal strength, signal structure, and mean intensity. These components are computed for every patch of the class, and a fusion rule then derives the corresponding components for the whole class. The decomposition of a patch into these components is unique as well as invertible, so the fused patch is restored directly from the fused components. Simulation results demonstrate the superiority of the proposed scheme.

Keywords: multi-exposure image fusion, patch, image decomposition, deghosting, tone mapping
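The patch decomposition described in the abstract can be sketched as follows. This is a minimal illustration, assuming the three components are obtained as in patch-wise structural decomposition: mean intensity is the patch mean, signal strength is the magnitude of the zero-mean residual, and signal structure is its unit-length direction. The function names and the small-strength guard are illustrative choices, not taken from the paper.

```python
import numpy as np

def decompose_patch(x):
    """Split a patch into (strength, structure, intensity) components."""
    l = x.mean()                       # mean intensity component
    residual = x - l                   # zero-mean part of the patch
    c = np.linalg.norm(residual)       # signal strength: magnitude of residual
    # signal structure: unit-length direction; guard flat patches (c ~ 0)
    s = residual / c if c > 1e-12 else np.zeros_like(x)
    return c, s, l

def reconstruct_patch(c, s, l):
    """Invert the decomposition: patch = strength * structure + intensity."""
    return c * s + l
```

Because the decomposition is invertible, reconstructing from the three components recovers the original patch exactly, which is what allows the fused components of a patch class to be mapped back into a fused patch.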
