

Title:
METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR GENERATION OF EXTENDED DYNAMIC RANGE COLOR IMAGES
Document Type and Number:
WIPO Patent Application WO/2016/026072
Kind Code:
A1
Abstract:
In an example embodiment, a method, apparatus and computer program product are provided. The method includes facilitating receipt of a panchromatic image and a color image associated with a scene. The panchromatic image includes a first luminance data, and the color image includes a second luminance data and a first chrominance data. An extended dynamic range luminance image is generated based at least on the first luminance data and the second luminance data. An extended dynamic range color image of the scene is generated based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

Inventors:
GOVINDARAO KRISHNA ANNASAGAR (IN)
UKIL SOUMIK (IN)
MUNINDER VELDANDI (US)
WANG KONGQIAO (CN)
LI JIANGWEI (CN)
YAN HE (CN)
Application Number:
PCT/CN2014/084654
Publication Date:
February 25, 2016
Filing Date:
August 18, 2014
Assignee:
NOKIA TECHNOLOGIES OY (FI)
NAVTEQ SHANGHAI TRADING CO LTD (CN)
International Classes:
H04N5/235
Foreign References:
US20100201831A1 (2010-08-12)
CN103930923A (2014-07-16)
CN102165783A (2011-08-24)
Attorney, Agent or Firm:
KING & WOOD MALLESONS (East Tower, World Financial Centre, No. 1 Dongsanhuan Zhonglu, Chaoyang District, Beijing 0, CN)
Claims:
WHAT IS CLAIMED IS:

1. A method comprising:

facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generating an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and

generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.
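To make the flow of claim 1 concrete, the following is a minimal, non-normative sketch in Python (NumPy/OpenCV), assuming 8-bit inputs, a YCrCb working space, and a simple well-exposedness weighting for the luminance fusion; the claim itself does not prescribe a color space, a fusion rule, or any of the function names used below.

import numpy as np
import cv2

def fuse_luminance(y_pan, y_bayer):
    # Weight each luminance source by how well-exposed it is: pixels near
    # mid-gray get high weight, clipped shadows/highlights get low weight.
    def weight(y):
        return np.exp(-((y.astype(np.float32) - 128.0) ** 2) / (2 * 64.0 ** 2)) + 1e-6
    w_pan, w_bay = weight(y_pan), weight(y_bayer)
    y_edr = (w_pan * y_pan + w_bay * y_bayer) / (w_pan + w_bay)
    return np.clip(y_edr, 0, 255).astype(np.uint8)

def edr_color_image(pan_gray, color_bgr):
    # First luminance data: the panchromatic image itself. Second luminance
    # data and first chrominance data: the Y and Cr/Cb planes of the color image.
    y_bayer, cr, cb = cv2.split(cv2.cvtColor(color_bgr, cv2.COLOR_BGR2YCrCb))
    y_edr = fuse_luminance(pan_gray, y_bayer)
    # Extended dynamic range color image: fused luminance + original chrominance.
    return cv2.cvtColor(cv2.merge([y_edr, cr, cb]), cv2.COLOR_YCrCb2BGR)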

2. The method as claimed in claim 1, wherein the panchromatic image and the color image are captured by a first image sensor associated with a panchromatic camera, and a second image sensor associated with a Bayer camera, respectively, and wherein a baseline distance between the panchromatic camera and the Bayer camera is less than a threshold baseline distance.

3. The method as claimed in claim 2, further comprising metering exposures of the panchromatic camera and the Bayer camera for facilitating the panchromatic camera and the Bayer camera to capture shadow portions and highlight portions of the scene, wherein metering the exposures of the panchromatic camera and the Bayer camera facilitates in capturing the panchromatic image and the color image as an overexposed image and an underexposed image, respectively, of the scene.

4. The method as claimed in claim 3, wherein metering the exposures of the panchromatic camera and the Bayer camera comprises adjusting one or more exposure parameters associated with the panchromatic camera and the Bayer camera, the one or more exposure parameters comprising an exposure time, gain and aperture.

5. The method as claimed in any of claims 2 to 4, further comprising performing distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with an initial panchromatic image and an initial color image captured by the panchromatic camera and the Bayer camera, respectively, to generate the panchromatic image and the color image, respectively, wherein performing the distortion calibration comprises:

determining a first distortion parameter and a second distortion parameter associated with the panchromatic camera and the Bayer camera, respectively, from the initial panchromatic image and the initial color image, the initial panchromatic image and the initial color image representing a convolution of the panchromatic image with the first distortion parameter and a convolution of the color image with the second distortion parameter, respectively; and

extracting the panchromatic image and the color image from the initial panchromatic image and the initial color image, respectively, based on the first distortion parameter and the second distortion parameter, respectively.

6. The method as claimed in claim 5, further comprising aligning the panchromatic image and the color image.
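The extraction step of claim 5 treats each initial image as the true image convolved with a distortion parameter. As a non-normative illustration, assuming the distortion parameter has already been estimated as a point spread function (PSF), Wiener deconvolution is one standard way to invert such a convolution; the claim does not name a particular method, and the noise-to-signal value below is illustrative.

import numpy as np

def wiener_extract(initial_img, psf, nsr=0.01):
    # Initial image = true image convolved with the psf; invert the convolution
    # in the frequency domain, with regularization so noise is not amplified.
    img = initial_img.astype(np.float64)
    psf_full = np.zeros_like(img)
    kh, kw = psf.shape
    psf_full[:kh, :kw] = psf / psf.sum()
    psf_full = np.roll(psf_full, (-(kh // 2), -(kw // 2)), axis=(0, 1))  # center PSF at (0, 0)
    H = np.fft.fft2(psf_full)
    G = np.fft.fft2(img)
    F = G * np.conj(H) / (np.abs(H) ** 2 + nsr)  # Wiener filter
    return np.real(np.fft.ifft2(F))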

7. The method as claimed in claim 1, wherein the panchromatic image and the color image are generated by a color filter array (CFA) based sensor associated with a camera, the CFA based sensor comprising a filter array of transparent filters and color filters for capturing the first luminance data to generate the panchromatic image and the second luminance data and the first chrominance data to generate the color image, respectively.

8. The method as claimed in claim 7, further comprising metering an exposure of the filter array of transparent filters and color filters of the camera, wherein metering the exposure of the filter array of transparent filters and color filters facilitates in generating the panchromatic image as an overexposed image and the color image as an underexposed image of the scene.
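For the CFA-based variant of claims 7 and 8, the sketch below demultiplexes a raw mosaic from a hypothetical array in which each 2x2 tile carries one transparent (white) filter and R, G, B color filters; the claims do not fix a tile layout, so the positions used here are purely illustrative.

import numpy as np

def split_cfa_mosaic(raw):
    # Hypothetical 2x2 tile: W at (0, 0), R at (0, 1), G at (1, 0), B at (1, 1).
    # The transparent (W) samples carry the first luminance data, and the
    # R/G/B samples carry the second luminance data and first chrominance data.
    pan = raw[0::2, 0::2]                    # quarter-resolution panchromatic image
    color = np.stack([raw[0::2, 1::2],       # R
                      raw[1::2, 0::2],       # G
                      raw[1::2, 1::2]],      # B
                     axis=-1)
    return pan, color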

9. The method as claimed in any of claims 1 to 8, wherein generating the extended dynamic range luminance image comprises fusing the first luminance data from the panchromatic image and the second luminance data from the color image.

10. The method as claimed in any of claims 1 to 9, wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image and the first chrominance data from the color image.

11. The method as claimed in claim 1, further comprising performing distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the panchromatic image and the color image captured by the panchromatic camera and the Bayer camera, respectively, the color image being decomposed into a decomposed panchromatic image and a decomposed color image, the decomposed panchromatic image and the decomposed color image comprising the second luminance data and the first chrominance data, respectively, and wherein performing the distortion calibration comprises:

determining a set of feature pixels from the panchromatic image and a corresponding set of feature pixels from the decomposed panchromatic image based on a bi-directional pixel matching between the panchromatic image and the decomposed panchromatic image, the set of feature pixels of the panchromatic image being shifted relative to the corresponding set of feature pixels of the decomposed panchromatic image by a corresponding first plurality of shift values; and

determining, for a plurality of non-feature pixels of the panchromatic image, a corresponding second plurality of shift values based on a triangulation method applied on the set of feature pixels associated with the panchromatic image and the corresponding first plurality of shift values,

wherein generating the extended dynamic range luminance image comprises fusing the first luminance data with the second luminance data from the decomposed panchromatic image based at least on the corresponding first plurality of shift values and the corresponding second plurality of shift values,

and wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image and the first chrominance data from the color image.
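The shift propagation in claim 11 can be pictured with Delaunay-triangulation-based interpolation: shift values known at the matched feature pixels are linearly interpolated over the triangulation to obtain shifts for every non-feature pixel. This sketch assumes the feature coordinates and their shift vectors come from the bi-directional matching step; SciPy's interpolators are one concrete choice of "triangulation method", not necessarily the one intended by the claim.

import numpy as np
from scipy.interpolate import LinearNDInterpolator, NearestNDInterpolator

def propagate_shifts(feature_xy, feature_shifts, height, width):
    # feature_xy: (N, 2) coordinates of feature pixels in the panchromatic image.
    # feature_shifts: (N, 2) first plurality of shift values (dx, dy) per feature.
    linear = LinearNDInterpolator(feature_xy, feature_shifts)    # Delaunay-based
    nearest = NearestNDInterpolator(feature_xy, feature_shifts)  # hull fallback
    ys, xs = np.mgrid[0:height, 0:width]
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(np.float64)
    dense = linear(pts)
    outside = np.isnan(dense[:, 0])          # points outside the convex hull
    dense[outside] = nearest(pts[outside])
    # Second plurality of shift values, one (dx, dy) per non-feature pixel.
    return dense.reshape(height, width, 2)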

12. The method as claimed in claims 5 or 6 or 7 or 9 or 10, further comprising preprocessing the first luminance data and the second luminance data, wherein,

pre-processing the first luminance data comprises performing an interpolation of the first luminance data to generate an interpolated first luminance data, and

pre-processing the second luminance data comprises performing at least a two-level interpolation of the second luminance data to generate an interpolated second luminance data, wherein generating the extended dynamic range luminance image comprises fusing the interpolated first luminance data and the interpolated second luminance data.

13. The method as claimed in claim 12, further comprising pre-processing the first chrominance data, wherein pre-processing the first chrominance data comprises performing at least a two-level interpolation of the first chrominance data to generate an interpolated first chrominance data,

wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image with the interpolated first chrominance data.
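One plausible reading of the "two-level interpolation" of claims 12 and 13, offered only as an illustration: a first level fills in the missing samples within the color mosaic, and a second level resamples the result onto the (typically denser) panchromatic grid. The hole-filling rule and the use of scipy.ndimage are assumptions, not requirements of the claims.

import numpy as np
from scipy import ndimage

def two_level_interpolate(sparse_plane, valid_mask, out_shape):
    # Level 1: fill holes in the mosaic plane with the average of valid
    # neighbors only (normalized box filtering).
    plane = sparse_plane.astype(np.float64)
    mask = valid_mask.astype(np.float64)
    summed = ndimage.uniform_filter(plane * mask, size=3)
    counts = ndimage.uniform_filter(mask, size=3)
    level1 = np.where(mask > 0, plane, summed / np.maximum(counts, 1e-6))
    # Level 2: bilinear resample up to the panchromatic resolution.
    zoom = (out_shape[0] / level1.shape[0], out_shape[1] / level1.shape[1])
    return ndimage.zoom(level1, zoom, order=1)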

14. The method as claimed in any of claims 1 to 10, further comprising generating an intermediate color image from the color image, wherein generating the intermediate color image comprises:

performing pre-processing of the first luminance data and the second luminance data for reducing noise components in the first luminance data and the second luminance data, to thereby generate a pre-processed first luminance data and a pre-processed second luminance data;

computing, for a pixel of a plurality of pixels of the color image, a corresponding gain factor based on a comparison of corresponding first luminance components and corresponding second luminance components, the corresponding first luminance components and the corresponding second luminance components being derived from the pre-processed first luminance data and the pre-processed second luminance data, respectively; and

multiplying, for the pixel, the corresponding gain factor with a corresponding pixel value to generate a corresponding pixel of the intermediate color image, the intermediate color image comprising a second chrominance data.
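A minimal sketch of the gain computation of claim 14, assuming Gaussian smoothing as the noise-reducing pre-processing and a simple clipped ratio as the "comparison" of luminance components; the claim leaves both choices, and the clipping bounds, open.

import numpy as np
import cv2

def intermediate_color_image(pan_luma, bayer_luma, color_bgr):
    # Pre-process both luminance planes to suppress noise before comparing them.
    p = cv2.GaussianBlur(pan_luma.astype(np.float32), (5, 5), 0)
    b = cv2.GaussianBlur(bayer_luma.astype(np.float32), (5, 5), 0)
    # Per-pixel gain: how much brighter the (overexposed) panchromatic
    # luminance is than the (underexposed) color-image luminance.
    gain = np.clip(p / np.maximum(b, 1.0), 1.0, 8.0)
    # Multiply each pixel value of the color image by its gain factor.
    boosted = color_bgr.astype(np.float32) * gain[..., None]
    return np.clip(boosted, 0, 255).astype(np.uint8)  # intermediate color image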

15. The method as claimed in claim 14, further comprising de-noising the second chrominance data of the intermediate color image.

16. The method as claimed in claims 14 or 15, wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

17. The method as claimed in any of claims 1 to 8, further comprising generating an intermediate color image from the color image, wherein generating the intermediate color image comprises:

inverting the color image to generate a color negative image;

de-hazing the color negative image for recovering a de-hazed image associated with the color image; and

inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data.

18. The method as claimed in claim 17, wherein generating the extended dynamic range luminance image comprises fusing the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.
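Claims 17, 18, and 21 rely on the known observation that an inverted low-light image resembles a hazy image, so a de-hazing pass between two inversions brightens dark regions. The sketch below uses a simplified dark-channel-prior de-hazing step; the prior, the scalar airlight estimate, and the parameter values are illustrative choices, not claim requirements.

import numpy as np
import cv2

def enhance_low_light(color_bgr, omega=0.8, t_min=0.1):
    img = color_bgr.astype(np.float32) / 255.0
    neg = 1.0 - img                                 # color negative image
    # Simplified dark-channel-prior de-hazing of the negative image.
    dark = cv2.erode(neg.min(axis=2), np.ones((15, 15), np.uint8))
    airlight = float(np.percentile(neg, 99))        # crude scalar airlight estimate
    t = np.maximum(1.0 - omega * dark / max(airlight, 1e-6), t_min)
    dehazed = (neg - airlight) / t[..., None] + airlight
    # Invert back to obtain the intermediate color image.
    return (np.clip(1.0 - dehazed, 0.0, 1.0) * 255.0).astype(np.uint8)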

19. The method as claimed in claim 1, further comprising generating an intermediate color image from the color image based on a low-light removal from the color image, and de-noising the intermediate color image, wherein de-noising the intermediate color image comprises:

decomposing the intermediate color image into an initial third luminance image and an initial second chrominance image;

determining, for at least one portion of the panchromatic image, a weight information based on a difference of gray pixel values of neighboring pixels in the at least one portion; and

performing selective filtering of at least one portion of the intermediate color image corresponding to the at least one portion of the panchromatic image based on the weight information of the at least one portion of the panchromatic image to generate a third luminance data and a second chrominance data associated with the intermediate color image,

wherein the extended dynamic range luminance image is generated by fusing the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

20. The method as claimed in any of claims 18 or 19, wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image with the second chrominance data from the intermediate color image.
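A sketch of the selective de-noising of claims 19 and 20, assuming the "weight information" is a local gradient measure computed from differences of neighboring gray values in the panchromatic image, and that the filtering blends between a smoothed and an unfiltered copy of the intermediate color image; both choices are illustrative.

import numpy as np
import cv2

def selective_denoise(intermediate_bgr, pan_gray):
    # Weight from differences of neighboring gray pixel values: strong local
    # gradients in the panchromatic image indicate real scene detail.
    gx = cv2.Sobel(pan_gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(pan_gray, cv2.CV_32F, 0, 1, ksize=3)
    w = np.clip(np.hypot(gx, gy) / 255.0, 0.0, 1.0)   # 1 = keep, 0 = smooth
    smoothed = cv2.GaussianBlur(intermediate_bgr, (7, 7), 0).astype(np.float32)
    # Flat regions receive strong filtering; detailed regions are preserved.
    out = w[..., None] * intermediate_bgr.astype(np.float32) + (1.0 - w[..., None]) * smoothed
    return np.clip(out, 0, 255).astype(np.uint8)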

21. A method comprising:

facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generating an intermediate color image from the color image, wherein generating the intermediate color image comprises:

inverting the color image to generate a color negative image,

de-hazing the color negative image for recovering a de-hazed image, and

inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data;

generating an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and

generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

22. The method as claimed in claim 21, wherein the panchromatic image and the color image are captured by a first image sensor associated with a panchromatic camera and a second image sensor associated with a Bayer camera, respectively, and wherein the panchromatic image and the color image are captured as an overexposed image and an underexposed image, respectively, of the scene.

23. The method as claimed in claim 22, wherein generating the extended dynamic range luminance image comprises fusing the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

24. The method as claimed in claim 23, wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

25. An apparatus comprising:

at least one processor; and

at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to at least perform:

facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generate an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and

generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

26. The apparatus as claimed in claim 25, wherein the panchromatic image and the color image are captured by a first image sensor associated with a panchromatic camera, and a second image sensor associated with a Bayer camera, respectively, and wherein a baseline distance between the panchromatic camera and the Bayer camera is less than a threshold baseline distance.

27. The apparatus as claimed in claim 26, wherein the apparatus is further caused, at least in part to meter exposures of the panchromatic camera and the Bayer camera for facilitating the panchromatic camera and the Bayer camera to capture shadow portions and highlight portions of the scene, wherein metering the exposures of the panchromatic camera and the Bayer camera facilitates in capturing the panchromatic image and the color image as an overexposed image and an underexposed image, respectively, of the scene.

28. The apparatus as claimed in claim 27, wherein for metering the exposures of the panchromatic camera and the Bayer camera the apparatus is further caused, at least in part to adjust one or more exposure parameters associated with the panchromatic camera and the Bayer camera, the one or more exposure parameters comprising an exposure time, gain and aperture.

29. The apparatus as claimed in any of claims 25 to 28, wherein the apparatus is further caused, at least in part to perform distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with an initial panchromatic image and an initial color image captured by the panchromatic camera and the Bayer camera, respectively to generate the panchromatic image and the color image, respectively, wherein performing the distortion calibration comprises:

determining a first distortion parameter and a second distortion parameter associated with the panchromatic camera and the Bayer camera, respectively from the initial panchromatic image and the initial color image, the initial panchromatic image and the initial color image representing a convolution of the panchromatic image with the first distortion parameter and a convolution of the color image with the second distortion parameter, respectively; and

extracting the panchromatic image and the color image from the initial panchromatic image and the initial color image, respectively based on the first distortion parameter and the second distortion parameter, respectively.

30. The apparatus as claimed in claim 29, wherein the apparatus is further caused, at least in part to align the panchromatic image and the color image.

31. The apparatus as claimed in claim 25, wherein the panchromatic image and the color image are generated by a color filter array (CFA) based sensor associated with a camera, the CFA based sensor comprising a filter array of transparent filters and color filters for capturing the first luminance data to generate the panchromatic image and the second luminance data and the first chrominance data to generate the color image, respectively.

32. The apparatus as claimed in claim 31, wherein the apparatus is further caused, at least in part to meter an exposure of the filter array of transparent filters and color filters of the camera, wherein metering the exposure of the filter array of transparent filters and color filters facilitates in generating the panchromatic image as an overexposed image and the color image as an underexposed image of the scene.

33. The apparatus as claimed in any of claims 25 to 32, wherein for generating the extended dynamic range luminance image the apparatus is further caused, at least in part to fuse the first luminance data from the panchromatic image and the second luminance data from the color image.

34. The apparatus as claimed in any of claims 25 to 33, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image and the first chrominance data from the color image.

35. The apparatus as claimed in claim 25, wherein the apparatus is further caused, at least in part to perform distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the panchromatic image and the color image captured by the panchromatic camera and the Bayer camera, respectively, the color image being decomposed into a decomposed panchromatic image and a decomposed color image, the decomposed panchromatic image and the decomposed color image comprising the second luminance data and the first chrominance data, respectively, and wherein performing the distortion calibration comprises:

determining a set of feature pixels from the panchromatic image and a corresponding set of feature pixels from the decomposed panchromatic image based on a bi-directional pixel matching between the panchromatic image and the decomposed panchromatic image, the set of feature pixels of the panchromatic image being shifted relative to the corresponding set of feature pixels of the decomposed panchromatic image by a corresponding first plurality of shift values; and

determining, for a plurality of non-feature pixels of the panchromatic image, a corresponding second plurality of shift values based on a triangulation method applied on the set of feature pixels associated with the panchromatic image and the corresponding first plurality of shift values,

wherein generating the extended dynamic range luminance image comprises fusing the first luminance data with the second luminance data from the decomposed panchromatic image based at least on the corresponding first plurality of shift values and the corresponding second plurality of shift values,

and wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image and the first chrominance data from the color image.

36. The apparatus as claimed in claims 29 or 30 or 31 or 33 or 34, wherein the apparatus is further caused, at least in part to pre-process the first luminance data and the second luminance data, wherein,

pre-processing the first luminance data comprises performing an interpolation of the first luminance data to generate an interpolated first luminance data, and

pre-processing the second luminance data comprises performing at least a two-level interpolation of the second luminance data to generate an interpolated second luminance data, wherein generating the extended dynamic range luminance image comprises fusing the interpolated first luminance data and the interpolated second luminance data.

37. The apparatus as claimed in claim 36, wherein the apparatus is further caused, at least in part to pre-process the first chrominance data, wherein pre-processing the first chrominance data comprises performing at least a two-level interpolation of the first chrominance data to generate an interpolated first chrominance data,

wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image with the interpolated first chrominance data.

38. The apparatus as claimed in any of claims 25 to 34, wherein the apparatus is further caused, at least in part to generate an intermediate color image from the color image, wherein generating the intermediate color image comprises:

performing pre-processing of the first luminance data and the second luminance data for reducing noise components in the first luminance data and the second luminance data, to thereby generate a pre-processed first luminance data and a pre-processed second luminance data;

computing, for a pixel of a plurality of pixels of the color image, a corresponding gain factor based on a comparison of corresponding first luminance components and corresponding second luminance components, the corresponding first luminance components and the corresponding second luminance components being derived from the pre-processed first luminance data and the pre-processed second luminance data, respectively; and

multiplying, for the pixel, the corresponding gain factor with a corresponding pixel value to generate a corresponding pixel of the intermediate color image, the intermediate color image comprising a second chrominance data.

39. The apparatus as claimed in claim 38, wherein the apparatus is further caused, at least in part to de-noise the second chrominance data of the intermediate color image.

40. The apparatus as claimed in claims 38 or 39, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

41. The apparatus as claimed in any of claims 25 to 32, wherein the apparatus is further caused, at least in part to generate an intermediate color image from the color image, wherein generating the intermediate color image comprises:

inverting the color image to generate a color negative image;

de-hazing the color negative image for recovering a de-hazed image associated with the color image; and

inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data.

42. The apparatus as claimed in claim 41, wherein for generating the extended dynamic range luminance image the apparatus is further caused, at least in part to fuse the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

43. The apparatus as claimed in claim 25, wherein the apparatus is further caused, at least in part to generate an intermediate color image from the color image based on a low-light removal from the color image, and de-noise the intermediate color image, wherein de-noising the intermediate color image comprises:

decomposing the intermediate color image into an initial third luminance image and an initial second chrominance image;

determining, for at least one portion of the panchromatic image, a weight information based on a difference of gray pixel values of neighboring pixels in the at least one portion; and

performing selective filtering of at least one portion of the intermediate color image corresponding to the at least one portion of the panchromatic image based on the weight information of the at least one portion of the panchromatic image to generate a third luminance data and a second chrominance data associated with the intermediate color image,

wherein the extended dynamic range luminance image is generated by fusing the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

44. The apparatus as claimed in any of claims 42 or 43, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

45. The apparatus as claimed in claim 25, wherein the apparatus comprises an electronic device comprising: a user interface circuitry and user interface software configured to facilitate a user to control at least one function of the electronic device through use of a display and further configured to respond to user inputs; and

a display circuitry configured to display at least a portion of a user interface of the electronic device, the display and display circuitry configured to facilitate the user to control at least one function of the electronic device.

46. The apparatus as claimed in claim 45, wherein the electronic device comprises a panchromatic camera and a Bayer camera configured to capture the panchromatic image and the color image, respectively.

47. The apparatus as claimed in claim 46, wherein the electronic device comprises a mobile phone.

48. An apparatus comprising:

at least one processor; and

at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to at least perform:

facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generate an intermediate color image from the color image, wherein generating the intermediate color image comprises:

inverting the color image to generate a color negative image,

de-hazing the color negative image for recovering a de-hazed image, and

inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data;

generate an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and

generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

49. The apparatus as claimed in claim 48, wherein the panchromatic image and the color image are captured by a first image sensor associated with a panchromatic camera and a second image sensor associated with a Bayer camera, respectively, and wherein the panchromatic image and the color image are captured as an overexposed image and an underexposed image, respectively, of the scene.

50. The apparatus as claimed in claim 49, wherein for generating the extended dynamic range luminance image the apparatus is further caused, at least in part to fuse the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

51. The apparatus as claimed in claim 50, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

52. The apparatus as claimed in claim 48, wherein the apparatus comprises an electronic device comprising:

a user interface circuitry and user interface software configured to facilitate a user to control at least one function of the electronic device through use of a display and further configured to respond to user inputs; and

a display circuitry configured to display at least a portion of a user interface of the electronic device, the display and display circuitry configured to facilitate the user to control at least one function of the electronic device.

53. The apparatus as claimed in claim 52, wherein the electronic device comprises a panchromatic camera and a Bayer camera configured to capture the panchromatic image and the color image, respectively.

54. The apparatus as claimed in claim 53, wherein the electronic device comprises a mobile phone.

55. A computer program product comprising at least one computer-readable storage medium, the computer-readable storage medium comprising a set of instructions, which, when executed by one or more processors, cause an apparatus to at least perform:

facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generate an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and

generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

56. The computer program product as claimed in claim 55, wherein the panchromatic image and the color image are captured by a first image sensor associated with a panchromatic camera, and a second image sensor associated with a Bayer camera, respectively, and wherein a baseline distance between the panchromatic camera and the Bayer camera is less than a threshold baseline distance.

57. The computer program product as claimed in claim 56, wherein the apparatus is further caused, at least in part to meter exposures of the panchromatic camera and the Bayer camera for facilitating the panchromatic camera and the Bayer camera to capture shadow portions and highlight portions of the scene, wherein metering the exposures of the panchromatic camera and the Bayer camera facilitates in capturing the panchromatic image and the color image as an overexposed image and an underexposed image, respectively, of the scene.

58. The computer program product as claimed in claim 57, wherein for metering the exposures of the panchromatic camera and the Bayer camera the apparatus is further caused, at least in part to adjust one or more exposure parameters associated with the panchromatic camera and the Bayer camera, the one or more exposure parameters comprising an exposure time, gain and aperture.

59. The computer program product as claimed in any of claims 55 to 58, wherein the apparatus is further caused, at least in part to perform distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with an initial panchromatic image and an initial color image captured by the panchromatic camera and the Bayer camera, respectively to generate the panchromatic image and the color image, respectively, wherein performing the distortion calibration comprises:

determining a first distortion parameter and a second distortion parameter associated with the panchromatic camera and the Bayer camera, respectively from the initial panchromatic image and the initial color image, the initial panchromatic image and the initial color image representing a convolution of the panchromatic image with the first distortion parameter and a convolution of the color image with the second distortion parameter, respectively; and

extracting the panchromatic image and the color image from the initial panchromatic image and the initial color image, respectively based on the first distortion parameter and the second distortion parameter, respectively.

60. The computer program product as claimed in claim 59, wherein the apparatus is further caused, at least in part to align the panchromatic image and the color image.

61. The computer program product as claimed in claim 55, wherein the panchromatic image and the color image are generated by a color filter array (CFA) based sensor associated with a camera, the CFA based sensor comprising a filter array of transparent filters and color filters for capturing the first luminance data to generate the panchromatic image and the second luminance data and the first chrominance data to generate the color image, respectively.

62. The computer program product as claimed in claim 61, wherein the apparatus is further caused, at least in part to meter an exposure of the filter array of transparent filters and color filters of the camera, wherein metering the exposure of the filter array of transparent filters and color filters facilitates in generating the panchromatic image as an overexposed image and the color image as an underexposed image of the scene.

63. The computer program product as claimed in any of claims 55 to 62, wherein for generating the extended dynamic range luminance image the apparatus is further caused, at least in part to fuse the first luminance data from the panchromatic image and the second luminance data from the color image.

64. The computer program product as claimed in any of claims 55 to 63, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image and the first chrominance data from the color image.

65. The computer program product as claimed in claim 55, wherein the apparatus is further caused, at least in part to perform distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the panchromatic image and the color image captured by the panchromatic camera and the Bayer camera, respectively, the color image being decomposed into a decomposed panchromatic image and a decomposed color image, the decomposed panchromatic image and the decomposed color image comprising the second luminance data and the first chrominance data, respectively, and wherein performing the distortion calibration comprises:

determining a set of feature pixels from the panchromatic image and a corresponding set of feature pixels from the decomposed panchromatic image based on a bi-directional pixel matching between the panchromatic image and the decomposed panchromatic image, the set of feature pixels of the panchromatic image being shifted relative to the corresponding set of feature pixels of the decomposed panchromatic image by a corresponding first plurality of shift values; and

determining, for a plurality of non-feature pixels of the panchromatic image, a corresponding second plurality of shift values based on a triangulation method applied on the set of feature pixels associated with the panchromatic image and the corresponding first plurality of shift values,

wherein generating the extended dynamic range luminance image comprises fusing the first luminance data with the second luminance data from the decomposed panchromatic image based at least on the corresponding first plurality of shift values and the corresponding second plurality of shift values,

and wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image and the first chrominance data from the color image.

66. The computer program product as claimed in claims 59 or 60 or 61 or 63 or 64, wherein the apparatus is further caused, at least in part to pre-process the first luminance data and the second luminance data, wherein,

pre-processing the first luminance data comprises performing an interpolation of the first luminance data to generate an interpolated first luminance data, and

pre-processing the second luminance data comprises performing at least a two-level interpolation of the second luminance data to generate an interpolated second luminance data, wherein generating the extended dynamic range luminance image comprises fusing the interpolated first luminance data and the interpolated second luminance data.

67. The computer program product as claimed in claim 66, wherein the apparatus is further caused, at least in part to pre-process the first chrominance data, wherein pre-processing the first chrominance data comprises performing at least a two-level interpolation of the first chrominance data to generate an interpolated first chrominance data,

wherein generating the extended dynamic range color image comprises fusing the extended dynamic range luminance image with the interpolated first chrominance data.

68. The computer program product as claimed in any of claims 55 to 64, wherein the apparatus is further caused, at least in part to generate an intermediate color image from the color image, wherein generating the intermediate color image comprises:

performing pre-processing of the first luminance data and the second luminance data for reducing noise components in the first luminance data and the second luminance data, to thereby generate a pre-processed first luminance data and a pre-processed second luminance data;

computing, for a pixel of a plurality of pixels of the color image, a corresponding gain factor based on a comparison of corresponding first luminance components and corresponding second luminance components, the corresponding first luminance components and the corresponding second luminance components being derived from the pre-processed first luminance data and the pre-processed second luminance data, respectively; and

multiplying, for the pixel, the corresponding gain factor with a corresponding pixel value to generate a corresponding pixel of the intermediate color image, the intermediate color image comprising a second chrominance data.

69. The computer program product as claimed in claim 68, wherein the apparatus is further caused, at least in part to de-noise the second chrominance data of the intermediate color image.

70. The computer program product as claimed in claims 68 or 69, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

71. The computer program product as claimed in any of claims 55 to 62, wherein the apparatus is further caused, at least in part to generate an intermediate color image from the color image, wherein generating the intermediate color image comprises:

inverting the color image to generate a color negative image;

de-hazing the color negative image for recovering a de-hazed image associated with the color image; and

inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data.

72. The computer program product as claimed in claim 71, wherein for generating the extended dynamic range luminance image the apparatus is further caused, at least in part to fuse the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

73. The computer program product as claimed in claim 55, wherein the apparatus is further caused, at least in part to generate an intermediate color image from the color image based on a low-light removal from the color image, and de-noise the intermediate color image, wherein de-noising the intermediate color image comprises:

decomposing the intermediate color image into an initial third luminance image and an initial second chrominance image;

determining, for at least one portion of the panchromatic image, a weight information based on a difference of gray pixel values of neighboring pixels in the at least one portion; and

performing selective filtering of at least one portion of the intermediate color image corresponding to the at least one portion of the panchromatic image based on the weight information of the at least one portion of the panchromatic image to generate a third luminance data and a second chrominance data associated with the intermediate color image,

wherein the extended dynamic range luminance image is generated by fusing the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

74. The computer program product as claimed in any of claims 72 or 73, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

75. A computer program product comprising at least one computer-readable storage medium, the computer-readable storage medium comprising a set of instructions, which, when executed by one or more processors, cause an apparatus to at least perform:

facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generate an intermediate color image from the color image, wherein generating the intermediate color image comprises:

inverting the color image to generate a color negative image,

de-hazing the color negative image for recovering a de-hazed image, and

inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data;

generate an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and

generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

76. The computer program product as claimed in claim 75, wherein the panchromatic image and the color image are captured by a first image sensor associated with a panchromatic camera and a second image sensor associated with a Bayer camera, respectively, and wherein the panchromatic image and the color image are captured as an overexposed image and an underexposed image, respectively, of the scene.

77. The computer program product as claimed in claim 76, wherein for generating the extended dynamic range luminance image the apparatus is further caused, at least in part to fuse the first luminance data from the panchromatic image, the second luminance data from the color image, and the third luminance data from the intermediate color image.

78. The computer program product as claimed in claim 77, wherein for generating the extended dynamic range color image the apparatus is further caused, at least in part to fuse the extended dynamic range luminance image with the second chrominance data from the intermediate color image.

79. An apparatus comprising:

means for facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

means for generating an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and

means for generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

80. An apparatus comprising:

means for facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

means for generating an intermediate color image from the color image, wherein generating the intermediate color image comprises:

means for inverting the color image to generate a color negative image,

means for de-hazing the color negative image for recovering a de-hazed image, and

means for inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data;

means for generating an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and

means for generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

81. A computer program comprising program instructions which when executed by an apparatus, cause the apparatus to:

facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generate an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and

generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

82. A computer program comprising program instructions which when executed by an apparatus, cause the apparatus to:

facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data;

generate an intermediate color image from the color image, wherein generating the intermediate color image comprises:

inverting the color image to generate a color negative image,

de-hazing the color negative image for recovering a de-hazed image, and

inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data;

generate an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and

generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

83. An apparatus substantially as hereinbefore described with reference to the accompanying drawings.

84. A method substantially as hereinbefore described with reference to the accompanying drawings.

Description:
METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR GENERATION OF EXTENDED DYNAMIC RANGE COLOR IMAGES

TECHNICAL FIELD

[0001] Various implementations relate generally to a method, an apparatus, and a computer program product for generation of extended dynamic range color images.

BACKGROUND

[0002] Various electronic devices, for example, cameras, mobile phones, and other multimedia devices, are widely used for capturing images and/or videos of a scene. Some of these electronic devices also feature a dual camera that can be utilized for capturing images with different exposure settings, and a combination of such images results in an extended dynamic range image. Such extended dynamic range images suffer from various defects, such as motion artifacts (defects that appear in an image due to motion in the scene), inability to be extended to video, poor quality in low-light or shadow areas of the scene, and the like. In some instances, the captured extended dynamic range image is of a degraded quality due to improper fusion of images captured by the electronic device. In some scenarios, the improper fusion of images may be attributed to phenomena such as those relating to calibration of cameras in the electronic device, distortion of the images, alignment of the images, noise in the images, and the like.

SUMMARY OF SOME EMBODIMENTS

[0003] Various aspects of example embodiments are set out in the claims.

[0004] In a first aspect, there is provided a method comprising: facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generating an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

[0005] In a second aspect, there is provided a method comprising: facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generating an intermediate color image from the color image, wherein generating the intermediate color image comprises: inverting the color image to generate a color negative image, de-hazing the color negative image for recovering a de-hazed image, and inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data; generating an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

[0006] In a third aspect, there is provided an apparatus comprising: at least one processor; and at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least: facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generate an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

[0007] In a fourth aspect, there is provided an apparatus comprising: at least one processor; and at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least: facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generate an intermediate color image from the color image, wherein generating the intermediate color image comprises: inverting the color image to generate a color negative image, de-hazing the color negative image for recovering a de-hazed image, and inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data; generate an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

[0008] In a fifth aspect, there is provided a computer program product comprising at least one computer-readable storage medium, the computer-readable storage medium comprising a set of instructions, which, when executed by one or more processors, cause an apparatus to at least perform: facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generate an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

[0009] In a sixth aspect, there is provided a computer program product comprising at least one computer-readable storage medium, the computer-readable storage medium comprising a set of instructions, which, when executed by one or more processors, cause an apparatus to at least perform: facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generate an intermediate color image from the color image, wherein generating the intermediate color image comprises: inverting the color image to generate a color negative image, de-hazing the color negative image for recovering a de-hazed image, and inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data; generate an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

[0010] In a seventh aspect, there is provided an apparatus comprising: means for facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; means for generating an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and means for generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

[0011] In an eighth aspect, there is provided an apparatus comprising: means for facilitating receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; means for generating an intermediate color image from the color image, wherein the means for generating the intermediate color image comprises: means for inverting the color image to generate a color negative image, means for de-hazing the color negative image for recovering a de-hazed image, and means for inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data; means for generating an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and means for generating an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

[0012] In a ninth aspect, there is provided a computer program comprising program instructions which, when executed by an apparatus, cause the apparatus to: facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generate an extended dynamic range luminance image based at least on the first luminance data and the second luminance data; and generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the first chrominance data associated with the color image.

[0013] In a tenth aspect, there is provided a computer program comprising program instructions which, when executed by an apparatus, cause the apparatus to: facilitate receipt of a panchromatic image and a color image associated with a scene, the panchromatic image comprising a first luminance data, and the color image comprising a second luminance data and a first chrominance data; generate an intermediate color image from the color image, wherein generating the intermediate color image comprises: inverting the color image to generate a color negative image, de-hazing the color negative image for recovering a de-hazed image, and inverting the de-hazed image to generate the intermediate color image, the intermediate color image comprising a third luminance data and a second chrominance data; generate an extended dynamic range luminance image based on the first luminance data, the second luminance data, and the third luminance data; and generate an extended dynamic range color image of the scene based on the extended dynamic range luminance image and the second chrominance data associated with the intermediate color image.

BRIEF DESCRIPTION OF THE FIGURES

[0014] Various embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which:

[0015] FIGURE 1 illustrates a device, in accordance with an example embodiment;

[0016] FIGURE 2 illustrates an apparatus for generation of an extended dynamic range color image, in accordance with an example embodiment;

[0017] FIGURE 3A illustrates an example representation of generation of an extended dynamic range luminance image, in accordance with an example embodiment;

[0018] FIGURE 3B illustrates an example representation of generation of an extended dynamic range color image, in accordance with an example embodiment;

[0019] FIGURES 4A, 4B, 4C and 4D illustrate an example representation of a method for alignment of images for facilitating generation of an extended dynamic range color image, in accordance with an example embodiment;

[0020] FIGURES 5A, 5B and 5C illustrate an example representation of a method for performing de-noising during generation of an extended dynamic range color image, in accordance with an example embodiment;

[0021] FIGURE 6 is a flowchart depicting an example method for generation of an extended dynamic range color image, in accordance with an example embodiment;

[0022] FIGURE 7 is a flowchart depicting an example method for generation of an extended dynamic range color image, in accordance with another example embodiment;

[0023] FIGURE 8 is a flowchart depicting an example method for generation of an extended dynamic range color image, in accordance with yet another example embodiment;

[0024] FIGURES 9A and 9B illustrate a flowchart depicting an example method for generation of an extended dynamic range color image, in accordance with still another example embodiment;

[0025] FIGURE 10 is a flowchart depicting an example method for generation of an extended dynamic range color image, in accordance with still another example embodiment;

[0026] FIGURE 11 is a flowchart depicting an example method for generation of an extended dynamic range color image, in accordance with still another example embodiment;

[0027] FIGURES 12A and 12B illustrate a panchromatic image and a color image, respectively associated with a scene, in accordance with an example embodiment; and

[0028] FIGURE 12C illustrates an extended dynamic range color image of the scene being generated from a panchromatic image and a color image, in accordance with an example embodiment.

DETAILED DESCRIPTION

[0029] Example embodiments and their potential effects are understood by referring to FIGURES 1 through 12C of the drawings.

[0030] FIGURE 1 illustrates a device 100 in accordance with an example embodiment. It should be understood, however, that the device 100 as illustrated and hereinafter described is merely illustrative of one type of device that may benefit from various embodiments and, therefore, should not be taken to limit the scope of the embodiments. As such, it should be appreciated that at least some of the components described below in connection with the device 100 may be optional and thus, in an example embodiment, the device 100 may include more, fewer or different components than those described in connection with the example embodiment of FIGURE 1. The device 100 could be any of a number of types of mobile electronic devices, for example, portable digital assistants (PDAs), pagers, mobile televisions, gaming devices, cellular phones, all types of computers (for example, laptops, mobile computers or desktops), cameras, audio/video players, radios, global positioning system (GPS) devices, media players, mobile digital assistants, or any combination of the aforementioned, and other types of communications devices.

[0031] The device 100 may include an antenna 102 (or multiple antennas) in operable communication with a transmitter 104 and a receiver 106. The device 100 may further include an apparatus, such as a controller 108 or other processing device that provides signals to and receives signals from the transmitter 104 and receiver 106, respectively. The signals may include signaling information in accordance with the air interface standard of the applicable cellular system, and/or may also include data corresponding to user speech, received data and/or user generated data. In this regard, the device 100 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. By way of illustration, the device 100 may be capable of operating in accordance with any of a number of first, second, third and/or fourth-generation communication protocols or the like. For example, the device 100 may be capable of operating in accordance with second-generation (2G) wireless communication protocols IS-136 (time division multiple access (TDMA)), GSM (global system for mobile communication), and IS-95 (code division multiple access (CDMA)), or with third-generation (3G) wireless communication protocols, such as Universal Mobile Telecommunications System (UMTS), CDMA2000, wideband CDMA (WCDMA) and time division-synchronous CDMA (TD-SCDMA), with 3.9G wireless communication protocols such as evolved-universal terrestrial radio access network (E-UTRAN), with fourth-generation (4G) wireless communication protocols, or the like. As an alternative (or additionally), the device 100 may be capable of operating in accordance with non-cellular communication mechanisms, for example, computer networks such as the Internet, local area networks, wide area networks, and the like; short range wireless communication networks such as Bluetooth® networks, Zigbee® networks, Institute of Electrical and Electronics Engineers (IEEE) 802.11x networks, and the like; or wireline telecommunication networks such as a public switched telephone network (PSTN).

[0032] The controller 108 may include circuitry implementing, among others, audio and logic functions of the device 100. For example, the controller 108 may include, but is not limited to, one or more digital signal processor devices, one or more microprocessor devices, one or more processor(s) with accompanying digital signal processor(s), one or more processor(s) without accompanying digital signal processor(s), one or more special-purpose computer chips, one or more field-programmable gate arrays (FPGAs), one or more controllers, one or more application-specific integrated circuits (ASICs), one or more computer(s), various analog to digital converters, digital to analog converters, and/or other support circuits. Control and signal processing functions of the device 100 are allocated between these devices according to their respective capabilities. The controller 108 thus may also include the functionality to convolutionally encode and interleave message and data prior to modulation and transmission. The controller 108 may additionally include an internal voice coder, and may include an internal data modem. Further, the controller 108 may include functionality to operate one or more software programs, which may be stored in a memory. For example, the controller 108 may be capable of operating a connectivity program, such as a conventional Web browser. The connectivity program may then allow the device 100 to transmit and receive Web content, such as location-based content and/or other web page content, according to a Wireless Application Protocol (WAP), Hypertext Transfer Protocol (HTTP) and/or the like. In an example embodiment, the controller 108 may be embodied as a multi-core processor such as a dual or quad core processor. However, any number of processors may be included in the controller 108.

[0033] The device 100 may also comprise a user interface including an output device such as a ringer 110, an earphone or speaker 112, a microphone 114, a display 116, and a user input interface, which may be coupled to the controller 108. The user input interface, which allows the device 100 to receive data, may include any of a number of devices, such as a keypad 118, a touch display, a microphone or other input device. In embodiments including the keypad 118, the keypad 118 may include numeric (0-9) and related keys (#, *), and other hard and soft keys used for operating the device 100. Alternatively or additionally, the keypad 118 may include a conventional QWERTY keypad arrangement. The keypad 118 may also include various soft keys with associated functions. In addition, or alternatively, the device 100 may include an interface device such as a joystick or other user input interface. The device 100 further includes a battery 120, such as a vibrating battery pack, for powering various circuits that are used to operate the device 100, as well as optionally providing mechanical vibration as a detectable output.

[0034] In an example embodiment, the device 100 includes a media capturing element, such as a camera, video and/or audio module, in communication with the controller 108. The media capturing element may be any means for capturing an image, video and/or audio for storage, display or transmission. In an example embodiment in which the media capturing element is a camera module 122, the camera module 122 may include a digital camera capable of forming a digital image file from a captured image. As such, the camera module 122 includes all hardware, such as a lens or other optical component(s), and software for creating a digital image file from a captured image. Alternatively, the camera module 122 may include the hardware needed to view an image, while a memory device of the device 100 stores instructions for execution by the controller 108 in the form of software to create a digital image file from a captured image. In an example embodiment, the camera module 122 may further include a processing element such as a co-processor, which assists the controller 108 in processing image data, and an encoder and/or decoder for compressing and/or decompressing image data. The encoder and/or decoder may encode and/or decode according to a JPEG standard format or another like format. For video, the encoder and/or decoder may employ any of a plurality of standard formats such as, for example, standards associated with H.261, H.262/MPEG-2, H.263, H.264, H.264/MPEG-4, MPEG-4, and the like. In some cases, the camera module 122 may provide live image data to the display 116. Moreover, in an example embodiment, the display 116 may be located on one side of the device 100 and the camera module 122 may include a lens positioned on the opposite side of the device 100 with respect to the display 116 to enable the camera module 122 to capture images on one side of the device 100 and present a view of such images to the user positioned on the other side of the device 100.

[0035] The device 100 may further include a user identity module (UIM) 124. The UIM 124 may be a memory device having a processor built in. The UIM 124 may include, for example, a subscriber identity module (SIM), a universal integrated circuit card (UICC), a universal subscriber identity module (USIM), a removable user identity module (R-UIM), or any other smart card. The UIM 124 typically stores information elements related to a mobile subscriber. In addition to the UIM 124, the device 100 may be equipped with memory. For example, the device 100 may include volatile memory 126, such as volatile random access memory (RAM) including a cache area for the temporary storage of data. The device 100 may also include other non-volatile memory 128, which may be embedded and/or may be removable. The non-volatile memory 128 may additionally or alternatively comprise an electrically erasable programmable read only memory (EEPROM), flash memory, hard drive, or the like. The memories may store any number of pieces of information, and data, used by the device 100 to implement the functions of the device 100.

[0036] FIGURE 2 illustrates an apparatus 200 for generation of an extended dynamic range color image, in accordance with an example embodiment. The apparatus 200 may be employed, for example, in the device 100 of FIGURE 1. However, it should be noted that the apparatus 200 may also be employed on a variety of other devices, both mobile and fixed, and therefore, embodiments should not be limited to application on devices such as the device 100 of FIGURE 1. Alternatively, embodiments may be employed on a combination of devices including, for example, those listed above. Accordingly, various embodiments may be embodied wholly at a single device, for example, the device 100, or in a combination of devices. Furthermore, it should be noted that the devices or elements described below may not be mandatory and thus some may be omitted in certain embodiments.

[0037] The apparatus 200 includes or otherwise is in communication with at least one processor 202 and at least one memory 204. Examples of the at least one memory 204 include, but are not limited to, volatile and/or non-volatile memories. Some examples of the volatile memory include, but are not limited to, random access memory, dynamic random access memory, static random access memory, and the like. Some examples of the non-volatile memory include, but are not limited to, hard disks, magnetic tapes, optical disks, programmable read only memory, erasable programmable read only memory, electrically erasable programmable read only memory, flash memory, and the like. The memory 204 may be configured to store information, data, applications, instructions or the like for enabling the apparatus 200 to carry out various functions in accordance with various example embodiments. For example, the memory 204 may be configured to buffer input data comprising media content for processing by the processor 202. Additionally or alternatively, the memory 204 may be configured to store instructions for execution by the processor 202.

[0038] An example of the processor 202 may include the controller 108. The processor 202 may be embodied in a number of different ways. The processor 202 may be embodied as a multi-core processor, a single core processor, or a combination of multi-core processors and single core processors. For example, the processor 202 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In an example embodiment, the multi-core processor may be configured to execute instructions stored in the memory 204 or otherwise accessible to the processor 202. Alternatively or additionally, the processor 202 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 202 may represent an entity, for example, physically embodied in circuitry, capable of performing operations according to various embodiments while configured accordingly. For example, if the processor 202 is embodied as two or more of an ASIC, FPGA or the like, the processor 202 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, if the processor 202 is embodied as an executor of software instructions, the instructions may specifically configure the processor 202 to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor 202 may be a processor of a specific device, for example, a mobile terminal or network device adapted for employing embodiments by further configuration of the processor 202 by instructions for performing the algorithms and/or operations described herein. The processor 202 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 202.

[0039] A user interface 206 may be in communication with the processor 202. Examples of the user interface 206 include, but are not limited to, input interface and/or output interface. The input interface is configured to receive an indication of a user input. The output user interface provides an audible, visual, mechanical or other output and/or feedback to the user. Examples of the input interface may include, but are not limited to, a keyboard, a mouse, a joystick, a keypad, a touch screen, soft keys, and the like. Examples of the output interface may include, but are not limited to, a display such as a light emitting diode display, thin-film transistor (TFT) display, liquid crystal display, active-matrix organic light-emitting diode (AMOLED) display, a microphone, a speaker, ringers, vibrators, and the like. In an example embodiment, the user interface 206 may include, among other devices or elements, any or all of a speaker, a microphone, a display, and a keyboard, touch screen, or the like. In this regard, for example, the processor 202 may comprise user interface circuitry configured to control at least some functions of one or more elements of the user interface 206, such as, for example, a speaker, ringer, microphone, display, and/or the like. The processor 202 and/or user interface circuitry comprising the processor 202 may be configured to control one or more functions of one or more elements of the user interface 206 through computer program instructions, for example, software and/or firmware, stored on a memory, for example, the at least one memory 204, and/or the like, accessible to the processor 202.

[0040] In an example embodiment, the apparatus 200 may include an electronic device. Some examples of the electronic device include a communication device, a media capturing device with or without communication capabilities, computing devices, and the like. Some examples of the electronic device may include a mobile phone, a personal digital assistant (PDA), and the like. Some examples of the computing device may include a laptop, a personal computer, and the like. In an example embodiment, the electronic device may include a user interface, for example, the user interface 206, having user interface circuitry and user interface software configured to facilitate a user to control at least one function of the electronic device through use of a display and further configured to respond to user inputs. In an example embodiment, the electronic device may include display circuitry configured to display at least a portion of the user interface 206 of the electronic device. The display and the display circuitry may be configured to facilitate the user to control at least one function of the electronic device.

[0041] In an example embodiment, the electronic device may be embodied to include a transceiver. The transceiver may be any device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software. For example, the processor 202 operating under software control, or the processor 202 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof, configures the apparatus or circuitry to perform the functions of the transceiver. The transceiver may be configured to receive media content. Examples of the media content may include audio content, video content, data, and a combination thereof.

[0042] In an example embodiment, the electronic device may be embodied to include one or more image sensors 208 for capturing a color image and a panchromatic image of a scene. In various embodiments, the panchromatic image may also be referred to as a black-and-white image, a grayscale image or a monochrome image. In an example embodiment, the color image may also be referred to as a Bayer image or an RGB image. In some example embodiments, the panchromatic image and the color image may be utilized for generation of extended dynamic range color images of the scene. Herein, the term 'dynamic range' of an image may refer to a variation of luminance (or light) levels of a scene captured by an image capturing device. In an embodiment, the 'dynamic range' of the image may be determined as a ratio between maximum and minimum measurable light intensities, or simply a ratio between the lightest and darkest regions of the image. In some embodiments, an image having a wider dynamic range than the dynamic range of an image captured by a standard camera may be referred to as an extended dynamic range image. In an example embodiment, the extended dynamic range color image may recreate the complete scene contents even if the scene includes a wide range of lighting conditions. An example of the extended dynamic range color image is illustrated and described further with reference to FIGURES 12A-12C.
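As a rough illustration of the ratio definition above, the following sketch computes a dynamic range figure for an image array; the synthetic 12-bit capture and the noise-floor parameter are assumptions for illustration only, not part of the described apparatus.

```python
# Illustrative only: dynamic range as the ratio between the brightest and
# darkest measurable intensities, also expressed in stops (log base 2).
import numpy as np

def dynamic_range(image, noise_floor=1.0):
    """Return the dynamic range as a ratio and in stops."""
    lightest = float(image.max())
    darkest = max(float(image[image > 0].min()), noise_floor)
    ratio = lightest / darkest
    return ratio, np.log2(ratio)

# Example: a synthetic 12-bit sensor capture spanning most of its range.
capture = np.random.randint(4, 4095, size=(480, 640))
ratio, stops = dynamic_range(capture)
print(f"dynamic range: {ratio:.0f}:1 ({stops:.1f} stops)")
```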

[0043] In an example embodiment, the one or more image sensors 208 may be embodied to include multiple camera components for capturing the color image and the panchromatic image of the scene. For example, in an embodiment, the one or more image sensors 208 may be embodied in a panchromatic camera and a Bayer camera such that the panchromatic camera may capture the panchromatic image of the scene while the Bayer camera may capture the color image of the scene. Herein, 'panchromatic image' refers to a single band image displayed as shades of gray, and 'color image' refers to an image that includes color information for a plurality of pixels.

[0044] In various example embodiments, the panchromatic camera may include a first image sensor that is sensitive to light of all visible colors in a scene, and may generate the panchromatic image. In various example embodiments, the Bayer camera may include a second image sensor that is sensitive to light of colors such as red, blue and green in the scene, and may generate the color image. In an example embodiment, the panchromatic camera and the Bayer camera may together constitute a dual camera which may be embodied within the electronic device of the apparatus 200. In an example embodiment, the panchromatic camera and the Bayer camera may collectively be configured to capture a luminance data and a chrominance data associated with the scene. For example, the panchromatic camera may facilitate in capturing the panchromatic image that may include a first luminance data, while the Bayer camera may facilitate in capturing the color image that may include a second luminance data and a first chrominance data of the scene. In an embodiment, the term 'luminance' may refer to perceived brightness of a scene, and 'chrominance' refers to color information of the scene.

[0045] In another example embodiment, the one or more image sensors 208 of the electronic device may include a color filter array (CFA) based sensor that may be configured to capture a luminance data and a chrominance data of the scene. An example of the CFA based sensor may include RGBW sensors. In an example embodiment, the RGBW sensors may include a filter array of transparent filters and color filters. In an example embodiment, the transparent filters and the color filters may facilitate in capturing the first luminance data, and the second luminance data and the first chrominance data of the scene. For example, the transparent filters may facilitate in capturing a first luminance data associated with the scene while the color filters may facilitate in capturing a second luminance data and a first chrominance data associated with the scene.

[0046] In various example embodiments, the one or more image sensors 208 may be in communication with the processor 202 and/or other components of the apparatus 200. The one or more image sensors 208 may be in communication with other imaging circuitries and/or software, and are configured to capture digital images or to make a video or other graphic media files, or to capture luminance data and chrominance data associated with the scene. The one or more image sensors 208 and other circuitries, in combination, may be an example of at least one camera module such as the camera module 122 of the device 100.

[0047] These components (202-208) may communicate with each other via a centralized circuit system 210 to generate an extended dynamic range color image of a scene based at least on the first luminance data, the second luminance data and the first chrominance data derived from the panchromatic image and the color image of the scene. The centralized circuit system 210 may be various devices configured to, among other things, provide or enable communication between the components (202-208) of the apparatus 200. In certain embodiments, the centralized circuit system 210 may be a central printed circuit board (PCB) such as a motherboard, main board, system board, or logic board. The centralized circuit system 210 may also, or alternatively, include other printed circuit assemblies (PCAs) or communication channel media.

[0048] In an example embodiment, the processor 202 is configured to, with the content of the memory 204, and optionally with other components described herein, cause the apparatus 200 to facilitate receipt of the panchromatic image (P) and the color image (C) associated with the scene. Herein, the term 'scene' may refer to an arrangement (natural, manmade, sorted or assorted) of one or more objects of which images and/or videos can be captured. In an example embodiment, the panchromatic image (P) and the color image (C) may be captured by a dual camera having a panchromatic camera and a Bayer camera. In an embodiment, the panchromatic image P may include a first luminance data (L1), and the color image C may include a second luminance data (L2) and a first chrominance data (C1). In an example embodiment, a baseline distance (d) between the panchromatic camera and the Bayer camera is less than a threshold baseline distance. Herein, the 'baseline distance' may refer to a distance between two cameras, for example, the distance between the panchromatic camera and the Bayer camera.

[0049] In another example embodiment, the panchromatic image P and the color image C may be generated or rendered by a camera that includes a CFA based sensor. It will be noted that the CFA based sensors may not be able to generate a separate panchromatic image P and color image C of the scene. Herein, 'generating or rendering of the panchromatic image P and the color image C by the CFA based sensors' refers to derivation of the first luminance data L1 associated with the panchromatic image P, and the second luminance data L2 and the first chrominance data C1 associated with the color image C. In an example embodiment, the filter array of transparent filters may be utilized for deriving the first luminance data L1 and the second luminance data L2 of the scene, and the color filters may be utilized for deriving the first chrominance data C1 of the scene.

[0050] In an example embodiment, the one or more image sensors 208, for example the sensors embodied in the panchromatic camera and the Bayer camera or the CFA based sensors, may be configured within the apparatus 200. In some other embodiments, the one or more image sensors 208 may be configured outside the apparatus 200, and may facilitate capture of the panchromatic image P and the color image C. For example, in some example embodiments, the apparatus 200 may be caused to send instructions for capturing of the panchromatic image P and the color image C of the scene by an external panchromatic camera and an external Bayer camera, respectively, that is accessible/communicably coupled to the apparatus 200. Herein, the panchromatic camera includes any camera that may be capable of capturing panchromatic images of the scene, and the Bayer camera may include any camera that may be capable of capturing color images of the scene. In some example embodiments, the panchromatic image P and the color image C may be prerecorded or stored in the apparatus 200, or may be received from sources external to the apparatus 200. In such example embodiments, the apparatus 200 may be caused to receive the panchromatic image P and the color image C from external storage mediums such as a DVD, a Compact Disk (CD), a flash drive, a memory card, or from external storage locations through the Internet, Bluetooth®, and the like. In an example embodiment, a processing means may be configured to facilitate receipt of the panchromatic image P and the color image C associated with the scene. An example of the processing means may include the processor 202, which may be an example of the controller 108, and/or the one or more image sensors 208.

[0051] In an example embodiment, the apparatus 200 may be caused to facilitate metering of an exposure of the one or more image sensors 208. Herein, 'metering' may refer to controlling exposure or the amount of light entering into the camera. In various embodiments, the metering may be performed by adjusting parameters such as gain (analog and digital), exposure time (or shutter speed) and aperture of the camera. In various embodiments, the metering may facilitate in generating or capturing images in various lighting conditions. In an example embodiment, metering the exposure of the one or more image sensors 208 may facilitate in capturing 'shadow portions' and 'highlight portions' in the image. Herein, the 'shadow portions' of the image may refer to those portions in the image that are darker (having lower light intensity) as compared to other regions of the image, and may include fewer details as compared to other portions of the image. In various embodiments, the shadow portions of the image correspond to areas of the scene that are blocked from a light source or darker portions of the scene. For example, such an image may occur when the objects being photographed have a background that is amply lit, that is, the foreground is back-lit. The term 'highlight portions' may refer to regions in the image that include brighter regions or highly illuminated regions of the image. The highlight portions of the image may include regions that may be overexposed while capturing the image of the scene. The details of the shadow portions and highlight portions of an image are explained further in detail with reference to FIGURES 12A and 12B.

[0052] In an embodiment, the metering of the exposures of the one or more image sensors 208 is performed in such a manner that the sensors capturing the first luminance data L1 may capture an overexposed image of the scene while the sensors capturing the second luminance data L2 and the first chrominance data C1 may capture an underexposed image of the scene. Herein, 'overexposed portions of an image' refers to portions in the image that may have recorded an excessive amount of light, and 'underexposed portions of an image' may refer to portions of the image that may have recorded little light. In an example embodiment, in case the one or more image sensors 208 are embodied in the panchromatic camera and the Bayer camera, the exposures of the panchromatic camera and the Bayer camera may be metered for facilitating the panchromatic camera and the Bayer camera to capture the shadow portions and the highlight portions, respectively, of the scene. In an example embodiment, metering the exposures of the respective one or more image sensors 208 associated with the panchromatic camera and the Bayer camera facilitates in capturing the panchromatic image P and the color image C as the overexposed image and the underexposed image, respectively, of the scene. In an example embodiment, the exposures of the respective one or more image sensors 208 associated with the panchromatic camera and the Bayer camera may be metered by adjusting one or more exposure parameters associated with the panchromatic camera and the Bayer camera. The one or more exposure parameters include an exposure time, gain and aperture. Herein, 'exposure time' (or shutter speed) may refer to a length of time during which a shutter of a camera is open when capturing an image by the camera, 'gain' may refer to an exposure parameter that controls sensitivity of a camera sensor to light, and 'aperture' may refer to an exposure parameter that controls an area over which light can enter a camera. In an example embodiment, the exposures of the respective one or more image sensors 208 associated with the panchromatic camera and the Bayer camera may be metered by adjusting an exposure time associated with the panchromatic camera to be equal to an exposure time associated with the Bayer camera. In other example embodiments, the gain and the aperture can also be varied to meter the exposures of the respective one or more image sensors 208 associated with the panchromatic camera and the Bayer camera. In some example embodiments, if only one exposure parameter is adjusted for metering the exposures of the respective one or more image sensors 208 associated with the panchromatic camera and the Bayer camera, then the exposure parameter should be adjusted such that the panchromatic image P is captured as the overexposed image (as pixels of the panchromatic image P have higher sensitivity, resulting in brighter pixels) and the color image C is captured as the underexposed image.

[0053] In an example embodiment wherein the one or more image sensors 208 are embodied in CFA based sensors of a camera, the metering of the one or more image sensors 208 includes metering of an exposure of the filter array of transparent filters and color filters of the camera. In an example embodiment, metering the filter array of transparent filters and color filters facilitates in generating the panchromatic image P as the overexposed image and the color image C as the underexposed image of the scene.
As already discussed, in this embodiment, where CFA based sensors are utilized for capturing an image of the scene, 'generating' the panchromatic image P refers to deriving the first luminance data L1 from the filter array of transparent filters, and the second luminance data L2 and the first chrominance data C1 from the color filters of the CFA based sensor.
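The following is a minimal sketch of how such a metering policy might be expressed in code. The Exposure type, the meter_pair helper and the one-stop gain bias are hypothetical stand-ins rather than an API from the document; the sketch only illustrates biasing the panchromatic exposure brighter and the Bayer exposure darker while keeping the exposure times equal, as suggested above.

```python
# Hypothetical sketch: derive a (panchromatic, Bayer) exposure pair from one
# base metering so that the panchromatic capture is overexposed and the
# Bayer capture is underexposed. Here only gain is biased; exposure time is
# kept equal for both cameras, and aperture is left unchanged.
from dataclasses import dataclass

@dataclass
class Exposure:
    exposure_time_s: float  # shutter speed in seconds
    gain: float             # sensor sensitivity multiplier
    aperture_f: float       # f-number

def meter_pair(base: Exposure, bias_stops: float = 1.0):
    pan = Exposure(base.exposure_time_s, base.gain * 2 ** bias_stops, base.aperture_f)
    bayer = Exposure(base.exposure_time_s, base.gain / 2 ** bias_stops, base.aperture_f)
    return pan, bayer

pan, bayer = meter_pair(Exposure(exposure_time_s=1 / 60, gain=2.0, aperture_f=2.8))
print(pan, bayer)
```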

[0054] In an example embodiment, the apparatus 200 may be caused to receive the panchromatic image P and the color image C that may be distortion-free. For example, when the panchromatic camera that is utilized to capture the panchromatic image P and the Bayer camera that is utilized to capture the color image C are located/positioned in such a manner that the distance between the panchromatic camera and the Bayer camera is less than a threshold baseline distance, the panchromatic image P and the color image C may be distortion-free. In some example embodiments, the panchromatic camera and the Bayer camera may capture images that may include distortion artifacts. In an embodiment, the 'distortion or distortion artifacts' in an image may refer to warping and bending of objects and/or straight lines in the image so as to make them appear curved. For example, the captured images may show objects stretched, for example towards the edges of the frame. In some examples, distortion artifacts may distort images by curving straight lines in the images. In some other examples, due to distortion artifacts, the objects in an image may appear disproportionately large or distorted when compared to objects in the background. In some embodiments, the apparatus 200 may be caused to receive an initial panchromatic image (IP) and an initial color image (IC) that may include distortion artifacts. In such embodiments, the apparatus 200 may be caused to perform distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the initial panchromatic image IP and the initial color image IC captured by the panchromatic camera and the Bayer camera, respectively, to generate the panchromatic image P and the color image C, respectively. Herein, the term 'distortion calibration' may refer to removal of distortion from distorted images captured by the panchromatic camera and the Bayer camera by calibrating the panchromatic camera and the Bayer camera. The removal of distortion by performing distortion calibration facilitates in improving the quality of image fusion for generating the extended dynamic range color image. In an example embodiment, a processing means may be configured to perform the distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the initial panchromatic image IP and the initial color image IC. An example of the processing means may include the processor 202, which may be an example of the controller 108.

[0055] In an example embodiment, for performing distortion calibration, the apparatus 200 may be caused to determine a first distortion parameter (D1) and a second distortion parameter (D2) associated with the panchromatic camera and the Bayer camera, respectively. In an embodiment, the distortion associated with the panchromatic camera and the Bayer camera may be modelled by the first distortion parameter D1 and the second distortion parameter D2, respectively. In an example embodiment, the apparatus 200 may be caused to determine the first distortion parameter D1 and the second distortion parameter D2 from the initial panchromatic image IP and the initial color image IC. In an example embodiment, the initial panchromatic image IP and the initial color image IC may represent a convolution of the panchromatic image P with the first distortion parameter D1 , and a convolution of the color image C with the second distortion parameter D2, respectively. For example, the initial panchromatic image IP may be a convolution of the first distortion parameter D1 with the panchromatic image P, and may be represented as per the following equation (1a):

IP(x, y) = D1(x, y) * P(x, y) (1a)

where IP(x, y) represents the pixel value of the initial panchromatic image IP at pixel location (x, y), D1(x, y) represents the first distortion parameter D1 at the pixel location (x, y), and P(x, y) represents the panchromatic image P.

[0056] In an example embodiment, the initial color image IC may be a convolution of the second distortion parameter D2 with the color image C, and may be represented as per the following equation (1b):

IC(x, y) = D2(x, y) * C(x, y) (1b)

where IC(x, y) represents the pixel value of the initial color image IC at pixel location (x, y), D2(x, y) represents the second distortion parameter D2 at the pixel location (x, y), and C(x, y) represents the color image C.
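A short sketch of the distortion model in equations (1a) and (1b) follows; the Gaussian blur kernels standing in for the distortion parameters D1 and D2, and the random test images, are illustrative assumptions rather than the calibration data of the document.

```python
# Sketch: the captured initial images are modeled as the ideal images
# convolved with per-camera distortion kernels, as in equations (1a)-(1b).
import numpy as np
from scipy.signal import convolve2d

def gaussian_kernel(size=5, sigma=1.0):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

P = np.random.rand(64, 64)        # ideal panchromatic image (stand-in)
C = np.random.rand(64, 64)        # ideal single-channel color plane (stand-in)
D1 = gaussian_kernel(sigma=1.0)   # stand-in for the panchromatic distortion
D2 = gaussian_kernel(sigma=1.5)   # stand-in for the Bayer distortion

IP = convolve2d(P, D1, mode="same", boundary="symm")  # equation (1a)
IC = convolve2d(C, D2, mode="same", boundary="symm")  # equation (1b)
```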

[0057] In an example embodiment, the apparatus 200 may be caused to extract the panchromatic image P and the color image C from the initial panchromatic image IP and the initial color image IC, respectively, based on the first distortion parameter D1 and the second distortion parameter D2, respectively. In an example embodiment, for extracting the panchromatic image P from the initial panchromatic image IP and the color image C from the initial color image IC, Fourier transform may be applied on equations (1a) and (1b), respectively, for example, as represented in the following equations (2a) and (2b):

IP'(X, Y) = D1'(X, Y) * P'(X, Y) (2a)

IC'(X, Y) = D2'(X, Y) * C'(X, Y) (2b)

where IP', IC', D1', D2', P' and C' represent the Fourier transforms of IP, IC, D1, D2, P and C, respectively, and '*' in equations (2a) and (2b) denotes pointwise multiplication in the frequency domain. The panchromatic image P and the color image C may then be recovered, and a combined image I obtained, as:

I(x, y) = P(x, y) + C(x, y) = F⁻¹( IP'(X, Y)/D1'(X, Y) + IC'(X, Y)/D2'(X, Y) )

where F⁻¹ denotes the inverse Fourier transform.

[0058] In an example embodiment, the distortion calibration of the panchromatic camera and the Bayer camera may not be performed if the first distortion parameter D1 is used in place of the second distortion parameter D2, such that the distortion parameters for the panchromatic camera and the Bayer camera are the same. In such cases, the image fusion may be performed directly without computing the first distortion parameter and the second distortion parameter. In this case, the initial panchromatic image IP and the initial color image IC may be represented as:

IP(x, y) = D3(x, y) * P(x, y) (3a)

IC(x, y) = D3(x, y) * C(x, y) (3b)

where D3 represents a third distortion parameter common to the panchromatic camera and the Bayer camera.

[0059] On applying the Fourier transform to equations (3a) and (3b), the following equations (4a) and (4b) may be generated:

IP'(X, Y) = D3'(X, Y) * P'(X, Y) (4a)

IC'(X, Y) = D3'(X, Y) * C'(X, Y) (4b)

from which the combined image I may be obtained as:

I(x, y) = P(x, y) + C(x, y)

= F⁻¹( F(X, Y)(IP'(X, Y) + IC'(X, Y)) )

= f(x, y) * (IP(x, y) + IC(x, y))

where F(X, Y) = 1/D3'(X, Y), f(x, y) is the inverse Fourier transform of F(X, Y), and '*' in the last expression denotes convolution.
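The following sketch illustrates the frequency-domain recovery implied by the equations above, using a regularized division so that frequencies where the distortion spectrum is near zero are not amplified; the epsilon regularizer is an assumption, not part of the document.

```python
# Sketch: since convolution becomes multiplication under the Fourier
# transform, each ideal image can be recovered by dividing the transformed
# capture by the transformed distortion kernel (Wiener-style regularization).
import numpy as np

def deconvolve_fft(observed, kernel, eps=1e-3):
    """Recover an image from its convolution with a known kernel."""
    H = np.fft.fft2(kernel, s=observed.shape)    # D'(X, Y)
    G = np.fft.fft2(observed)                    # IP'(X, Y) or IC'(X, Y)
    F = G * np.conj(H) / (np.abs(H) ** 2 + eps)  # regularized G / H
    return np.real(np.fft.ifft2(F))

# With IP, IC, D1, D2 as in the previous sketch:
# P_hat = deconvolve_fft(IP, D1)
# C_hat = deconvolve_fft(IC, D2)
# I = P_hat + C_hat   # combined image, matching the recovery expression above
```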

[0060] In an example embodiment, the initial panchromatic image IP and the initial color image IC captured by the panchromatic camera and the Bayer camera, respectively, may be misaligned. In an example embodiment, the initial panchromatic image IP and the initial color image IC may be misaligned due to distortion in the initial panchromatic image IP and the initial color image IC. Examples of the distortion in the initial panchromatic image IP and the initial color image IC may include geometrical distortion and shift distortion. Herein, the term 'geometric distortion' may refer to distortion in the captured images due to the lenses of the panchromatic camera and the Bayer camera. The term 'shift distortion' may refer to distortion in the captured images due to the distance between the optical axes of the panchromatic camera and the Bayer camera. In an example embodiment, the apparatus 200 may be caused to perform distortion calibration of the panchromatic camera and the Bayer camera for performing distortion removal from the initial distorted images captured by the panchromatic camera and the Bayer camera, and subsequent alignment of the distortion-free panchromatic image and color image. In an example embodiment, a processing means may be configured to perform distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the panchromatic image P and the color image C captured by the panchromatic camera and the Bayer camera, respectively. An example of the processing means may include the processor 202, which may be an example of the controller 108.

[0061] In an example embodiment, for performing the distortion calibration, the apparatus 200 is caused to decompose the color image C into a decomposed panchromatic image (DP) and a decomposed color image (DC). In an example embodiment, the decomposed panchromatic image DP and the decomposed color image DC may include the second luminance data L2 and the first chrominance data C1, respectively. In an example embodiment, the apparatus 200 may be caused to determine a set of feature pixels (FP1) from the panchromatic image P and a corresponding set of feature pixels (FP2) from the decomposed panchromatic image DP based on a bi-directional pixel matching between the panchromatic image P and the decomposed panchromatic image DP. In an example embodiment, due to image distortion, the set of feature pixels FP1 of the panchromatic image P may be shifted relative to the corresponding set of feature pixels FP2 of the decomposed panchromatic image DP by a corresponding first plurality of shift values (S1). Herein, the term 'feature pixels' may refer to pixels representing important features or information of the scene. Herein, the term 'bi-directional pixel matching' may refer to matching the set of feature pixels FP1 determined in the panchromatic image P to the corresponding set of feature pixels FP2 from the decomposed panchromatic image DP, and matching the corresponding set of feature pixels FP2 determined in the decomposed panchromatic image DP to the set of feature pixels FP1 from the panchromatic image P. In an example embodiment, for pixels in the panchromatic image P and the decomposed panchromatic image DP, a Hessian matrix and corresponding eigenvalues may be computed to determine the set of feature pixels FP1 and the corresponding set of feature pixels FP2 in the panchromatic image P and the decomposed panchromatic image DP, respectively. As used herein, the Hessian matrix is a symmetric matrix whose eigenvalues are used by a Hessian affine detector to detect features of an image. A local maximum is determined by comparing pixels of the panchromatic image P or the color image C with a set of neighboring pixels. A pixel may be determined to be a local maximum if the neighboring pixels are below a threshold and the pixel itself is above the threshold; such a pixel is selected as a feature pixel. It should be noted that, in a manner similar to the determination of the feature pixel using the Hessian matrix as described above, other feature pixels from the set of feature pixels FP1 and the corresponding set of feature pixels FP2 may be determined. In an embodiment, a feature pixel in the set of feature pixels FP1 or a feature pixel in the corresponding set of feature pixels FP2 may define a feature vector such that the feature vector describes local feature statistics of a local area surrounding the feature pixel.
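A rough sketch of Hessian-based feature selection in this spirit is shown below; the Gaussian derivative scale, the local-maximum window size and the response threshold are illustrative assumptions, and the closed-form eigenvalues follow from the symmetric 2x2 per-pixel Hessian.

```python
# Sketch: per-pixel Hessian [[Ixx, Ixy], [Ixy, Iyy]] from Gaussian
# derivatives; pixels whose smaller eigenvalue is a local maximum above a
# threshold are kept as feature pixels.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def hessian_features(image, sigma=1.5, threshold=0.01):
    Ixx = gaussian_filter(image, sigma, order=(0, 2))  # d2/dx2 (x = axis 1)
    Iyy = gaussian_filter(image, sigma, order=(2, 0))  # d2/dy2 (y = axis 0)
    Ixy = gaussian_filter(image, sigma, order=(1, 1))  # mixed derivative
    # Eigenvalues of a symmetric 2x2 matrix in closed form; keep the smaller.
    tr = Ixx + Iyy
    root = np.sqrt(((Ixx - Iyy) / 2) ** 2 + Ixy**2)
    lam_min = tr / 2 - root
    # Local maxima of the response that also exceed the threshold.
    is_peak = lam_min == maximum_filter(lam_min, size=5)
    return np.argwhere(is_peak & (lam_min > threshold))

pan = np.random.rand(128, 128)
print(len(hessian_features(pan)), "feature pixels")
```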

[0062] In an embodiment, the apparatus 200 is caused to perform bi-directional pixel matching on determination of the set of feature pixels FP1 and the corresponding set of feature pixels FP2. For example, for a feature pixel FP1₁ in the set of feature pixels FP1, a local search may be performed in a corresponding neighborhood region in the decomposed panchromatic image DP to determine a matching feature pixel FP2₁ having a highest similarity to the feature pixel FP1₁ (based on a similarity score). The similarity score of the matching feature pixel FP2₁ may be higher than other available scores and higher than a threshold similarity score. A similar matching process may be repeated for the matching feature pixel FP2₁ in a corresponding neighborhood region in the panchromatic image P to determine whether the feature pixel FP1₁ and the feature pixel FP2₁ form a best match. By performing such bi-directional pixel matching, the set of feature pixels FP1 of the panchromatic image P may be determined to be shifted relative to the corresponding set of feature pixels FP2 of the decomposed panchromatic image DP by the corresponding first plurality of shift values S1. In an example embodiment, a triangulation method, for example a 'Delaunay triangulation method', may be applied to the set of feature pixels FP1 to determine a local mesh network based on an assumption that the corresponding first plurality of shift values S1 of pixels in an image plane meet local continuity and local linearity (due to continuity of misalignments between pixel pairs from the panchromatic image P and the decomposed panchromatic image DP). The Delaunay triangulation is a collection of edges that satisfies an empty circle property such that, for each edge, there is a circle that includes only the endpoints of the edge. The edges connecting the feature pixels further determine a local mesh network. A method of determining the set of feature pixels FP1 from the panchromatic image P and the corresponding set of feature pixels FP2 from the decomposed panchromatic image DP based on the bi-directional pixel matching is described further with reference to FIGURE 4A. A method of determining the corresponding first plurality of shift values S1 is described further with reference to FIGURE 4B and FIGURE 4C. In an example embodiment, a processing means may be configured to determine the set of feature pixels FP1 from the panchromatic image P and the corresponding set of feature pixels FP2 from the decomposed panchromatic image DP. An example of the processing means may include the processor 202, which may be an example of the controller 108.
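The following sketch illustrates the bi-directional matching idea; it uses a sum-of-squared-differences patch score (where a lower score means a higher similarity) as a stand-in for the similarity score named above, and omits border handling and the threshold test for brevity.

```python
# Sketch: match a feature pixel in P to its best candidate in DP, then match
# back from DP to P; accept the pair only if the backward match returns the
# original pixel (the bi-directional consistency check).
import numpy as np

def best_match(src, dst, y, x, radius=8, patch=3):
    """Most similar patch center in dst near (y, x) of src (SSD score)."""
    ref = src[y - patch:y + patch + 1, x - patch:x + patch + 1]
    best, best_score = None, np.inf
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            yy, xx = y + dy, x + dx
            cand = dst[yy - patch:yy + patch + 1, xx - patch:xx + patch + 1]
            if cand.shape != ref.shape:
                continue  # skip candidates clipped by the image border
            score = float(((ref - cand) ** 2).sum())
            if score < best_score:
                best, best_score = (yy, xx), score
    return best

def bidirectional_match(P, DP, y, x):
    fwd = best_match(P, DP, y, x)           # P -> DP
    if fwd is None:
        return None
    back = best_match(DP, P, *fwd)          # DP -> P must return (y, x)
    return fwd if back == (y, x) else None

P = np.random.rand(64, 64)
DP = np.roll(P, (1, 2), axis=(0, 1))        # DP shifted by (1, 2)
print(bidirectional_match(P, DP, 32, 32))   # expected: (33, 34)
```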

[0063] In this example embodiment of performing the distortion calibration, the apparatus 200 may be caused to determine, for a plurality of non-feature pixels (NFP1) of the panchromatic image P, a corresponding second plurality of shift values S2 based on a triangulation method applied on the set of feature pixels FP1 associated with the panchromatic image P and the corresponding first plurality of shift values S1. Herein, the term 'non-feature pixels' may refer to the pixels that may not be determined as feature pixels. In an example embodiment, the triangulation method may be applied to infer the corresponding second plurality of shift values S2 based on an assumption of local linearity. A method of determining the corresponding second plurality of shift values S2 is described further with reference to FIGURE 4D. In an example embodiment, a processing means may be configured to determine the corresponding second plurality of shift values S2 for the plurality of non-feature pixels NFP1 of the panchromatic image P. An example of the processing means may include the processor 202, which may be an example of the controller 108.
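As an illustration of inferring S2 under the local-linearity assumption, the sketch below triangulates the feature pixels and interpolates shifts linearly inside each triangle; the coordinates and shift values are made up for the example.

```python
# Sketch: a Delaunay mesh over the feature pixels; each non-feature pixel's
# shift is interpolated barycentrically from its enclosing triangle.
import numpy as np
from scipy.interpolate import LinearNDInterpolator
from scipy.spatial import Delaunay

feature_px = np.array([[10, 10], [10, 50], [50, 10], [50, 50]])      # FP1 (y, x)
shifts = np.array([[1.0, 0.5], [1.2, 0.4], [0.8, 0.6], [1.1, 0.5]])  # S1 (dy, dx)

tri = Delaunay(feature_px)                   # local mesh network over FP1
interp = LinearNDInterpolator(tri, shifts)   # piecewise-linear inside triangles

non_feature_px = np.array([[30, 30], [20, 40]])  # NFP1
print(interp(non_feature_px))                    # corresponding S2 estimates
```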

[0064] In an example embodiment, the apparatus 200 may be caused to align the panchromatic image P and the color image C based on the first plurality of shift values S1 and the corresponding second plurality of shift values S2, after the distortion calibration of the panchromatic camera and the Bayer camera. In an example embodiment, a processing means may be configured to align the panchromatic image P and the color image C. An example of the processing means may include the processor 202, which may be an example of the controller 108. In an example embodiment, the aligned panchromatic image P and color image C may include the first luminance data L1, and the second luminance data L2 and the first chrominance data C1, respectively.

[0065] In an example embodiment, the apparatus 200 may be caused to generate an extended dynamic range luminance image (EL) based at least on the first luminance data L1 and the second luminance data L2. In an example embodiment, generating the extended dynamic range luminance image EL includes fusing the first luminance data L1 from the panchromatic image P and the second luminance data L2 from the color image C. In an example embodiment, the first luminance data L1 and the second luminance data L2 may be fused based on one or more image fusion methods. Examples of image fusion methods may include, but are not limited to, principal component analysis based image fusion and wavelet transform image fusion. In an example embodiment, a processing means may be configured to generate the extended dynamic range luminance image EL based on the first luminance data L1 and the second luminance data L2. An example of the processing means may include the processor 202, which may be an example of the controller 108.
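The following sketch fuses the two luminance planes with a simple per-pixel well-exposedness weighting; this weighting is a stand-in assumption, since the document names principal component analysis based and wavelet transform based fusion without detailing them.

```python
# Sketch: blend overexposed L1 and underexposed L2 (both in [0, 1]) by
# weighting each pixel by how close it is to mid-gray, i.e. well exposed.
import numpy as np

def fuse_luminance(L1, L2, sigma=0.2):
    w1 = np.exp(-((L1 - 0.5) ** 2) / (2 * sigma**2))
    w2 = np.exp(-((L2 - 0.5) ** 2) / (2 * sigma**2))
    return (w1 * L1 + w2 * L2) / (w1 + w2 + 1e-8)

L1 = np.clip(np.random.rand(64, 64) + 0.3, 0, 1)  # overexposed panchromatic
L2 = np.clip(np.random.rand(64, 64) - 0.3, 0, 1)  # underexposed Bayer
EL = fuse_luminance(L1, L2)                       # extended dynamic range luminance
```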

[0066] In an example embodiment, the apparatus 200 may be caused to pre-process the first luminance data L1 and the second luminance data L2, and utilize the pre-processed first luminance data and the pre-processed second luminance data to generate the extended dynamic range luminance image EL. Additionally or alternatively, the apparatus 200 may be caused to pre-process the first chrominance data C1, and utilize the pre-processed first chrominance data and the extended dynamic range luminance image EL to generate the extended dynamic range color image. In an example embodiment, a processing means may be configured to pre-process the first luminance data L1, the second luminance data L2, and the first chrominance data C1. An example of the processing means may include the processor 202, which may be an example of the controller 108.

[0067] In this example embodiment of pre-processing the first luminance data L1, the apparatus 200 is caused to perform an interpolation of the first luminance data L1 to generate an interpolated first luminance data (IL1). Herein, the term 'interpolation' may refer to an imaging method to increase the number of pixels in a digital image. It will be noted that the term 'interpolation' may also be referred to as 'resampling'. In an example embodiment, the interpolation of the first luminance data L1 is performed by an 8-neighborhood interpolation method. In this example embodiment of pre-processing the second luminance data L2, the apparatus 200 is caused to perform at least a two-level interpolation of the second luminance data L2 to generate an interpolated second luminance data (IL2). In an example embodiment, the two-level interpolation of the second luminance data L2 is performed by the 8-neighborhood interpolation method. In an example embodiment, a processing means may be configured to perform the two-level interpolation of the second luminance data L2. An example of the 8-neighborhood interpolation is described further with reference to FIGURE 3A.
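The document does not spell out the exact 8-neighborhood scheme, so the following sketch shows one plausible reading as an assumption: each unknown pixel is filled with the mean of its up-to-eight known neighbors.

```python
# Sketch: fill unknown pixels from their 8-connected known neighbors.
import numpy as np

def interpolate_8_neighborhood(data, known_mask):
    out = data.astype(float).copy()
    h, w = data.shape
    for y in range(h):
        for x in range(w):
            if known_mask[y, x]:
                continue  # keep known pixels untouched
            ys = slice(max(y - 1, 0), y + 2)
            xs = slice(max(x - 1, 0), x + 2)
            neigh, mask = data[ys, xs], known_mask[ys, xs]
            if mask.any():
                out[y, x] = neigh[mask].mean()
    return out

rng = np.random.default_rng(0)
data = rng.random((16, 16))
mask = rng.random((16, 16)) > 0.5            # which pixels are known
filled = interpolate_8_neighborhood(data * mask, mask)
```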

[0068] In an example embodiment, the apparatus 200 may be caused to generate the extended dynamic range luminance image EL by fusing the interpolated first luminance data IL1 and the interpolated second luminance data IL2. An example of the extended dynamic range luminance image EL being generated based on the fusion of the interpolated first luminance data IL1 and the interpolated second luminance data IL2 is explained further with reference to FIGURE 3A. In an example embodiment, the apparatus 200 may further be caused to perform fusion of the extended dynamic range luminance image EL with the first chrominance data C1 to generate an extended dynamic range color image. In some other embodiments, the apparatus 200 may be caused to pre-process the color image C for improving the first chrominance data C1, and the improved first chrominance data C1 may be utilized for generation of the extended dynamic range color image.

[0069] In an embodiment of pre-processing the color image C, the color image C may be pre-processed such that the pre-processed color image may have an intensity similar to the intensity of the panchromatic image P. In an example embodiment, for pre-processing the color image C, the apparatus 200 may be caused to generate an intermediate color image IC from the color image C. In this example embodiment of generating the intermediate color image IC from the color image C, the apparatus 200 is caused to perform pre-processing of the first luminance data L1 and the second luminance data L2 for reducing noise components in the first luminance data L1 and the second luminance data L2, to thereby generate a pre-processed first luminance data (PL1) and a pre-processed second luminance data (PL2). In an example embodiment, the noise components in the first luminance data L1 and the second luminance data L2 may be reduced by applying a smoothing method to the first luminance data L1 and the second luminance data L2. It will be noted that the term 'smoothing' may refer to a technique in image processing that facilitates in blurring images and removing detail and noise. In an example embodiment, a processing means may be configured to perform pre-processing of the first luminance data L1 and the second luminance data L2. An example of the processing means may include the processor 202, which may be an example of the controller 108.

[0070] In this example embodiment of generating the intermediate color image IC from the color image C, the apparatus 200 is further caused to compute, for a pixel of a plurality of pixels of the color image C, a corresponding gain factor based on a comparison of corresponding first luminance components (LC1) and corresponding second luminance components (LC2) associated with the pixel. In an example embodiment, the corresponding first luminance components LC1 and the corresponding second luminance components LC2 for pixels may be derived from the pre-processed first luminance data PL1 and the pre-processed second luminance data PL2, respectively. It will be noted that, in a manner similar to the process of the computation of the corresponding gain factor for the pixel as described above, other gain factors for pixels (other than the pixel) of the plurality of pixels of the color image C may also be computed. In an example embodiment, a processing means may be configured to compute the corresponding gain factor for the pixel. An example of the processing means may include the processor 202, which may be an example of the controller 108.
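A minimal Python sketch of this gain-factor computation (together with the multiplication step described in the next paragraph) follows; the Gaussian smoothing, the simple luminance ratio standing in for the 'comparison', and the function names are illustrative assumptions rather than the patent's prescribed implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def intermediate_color_image(pan_lum, col_lum, color, eps=1e-6):
    """Sketch: per-pixel gain from smoothed luminance data (PL1, PL2),
    applied to the color image to match the panchromatic intensity."""
    # Pre-process L1 and L2 with a smoothing method to suppress noise.
    pl1 = gaussian_filter(pan_lum.astype(np.float64), sigma=2.0)
    pl2 = gaussian_filter(col_lum.astype(np.float64), sigma=2.0)
    # Gain factor per pixel; a simple ratio is assumed as the "comparison".
    gain = pl1 / (pl2 + eps)
    # Multiply each pixel value of the color image by its gain factor.
    return np.clip(color.astype(np.float64) * gain[..., None], 0, 255)
```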

[0071] In this example embodiment of generating the intermediate color image IC from the color image C, the apparatus 200 is further caused to multiply, for the pixel, the corresponding gain factor with a corresponding pixel value to determine a corresponding pixel value for a corresponding pixel of the intermediate color image IC. In an example embodiment, determining a corresponding pixel value for a corresponding pixel of the intermediate color image IC may facilitate in determination of the second chrominance data (C2) associated with the intermediate color image IC. It should be noted that, in a manner similar to the process of determination of the corresponding pixel value of the corresponding pixel of the intermediate color image IC as described above, other corresponding pixel values of other pixels (other than the corresponding pixel) of the intermediate color image IC may also be generated.

[0072] In an example embodiment, the apparatus 200 may be caused to further de-noise the second chrominance data C2 of the intermediate color image IC. The second chrominance data C2 of the intermediate color image IC may be de-noised based on one or more de-noising methods such as bilateral filtering, minimum spanning tree based filtering, block matching and 3D filtering, and the like. In an example embodiment, a processing means may be configured to multiply, for the pixel, the corresponding gain factor with the corresponding pixel value to generate the corresponding pixel of the intermediate color image IC. An example of the processing means may include the processor 202, which may be an example of the controller 108.

[0073] In another example embodiment of generating the intermediate color image IC from the color image C for improving the chrominance data of the color image C, the apparatus 200 is caused to invert the color image C to generate a color negative image (CN). The color negative image CN may represent an image that may be inverted from its original colors to its negative colors. In an example embodiment, if a pixel on the color image C has (R,G,B) on three color channels, the pixel at the same position on the color negative image CN may have (255-R, 255-G, 255-B) on the three color channels, and so the color negative image CN may appear hazy. In an example embodiment, a method of generating the color negative image CN is based on applying a haze imaging model to the color image C. In an example embodiment, the haze imaging model may be expressed as I(x) = J(x)t(x) + A(1 - t(x)), where I represents observed intensity, J represents scene radiance, A represents global atmospheric light, and t represents a medium transmission describing the portion of light that is not scattered and reaches the camera. In an example embodiment, a processing means may be configured to invert the color image C to generate the color negative image CN. An example of the processing means may include the processor 202, which may be an example of the controller 108.

[0074] In an example embodiment of generating the intermediate color image IC from the color image C, the apparatus 200 is caused to de-haze the color negative image CN for recovering the chrominance data and luminance data of the color negative image CN. In an embodiment, the apparatus 200 may be caused to de-haze the color negative image CN to generate a de-hazed image (DH) associated with the color image C.
Herein, the term 'de-hazing the color negative image' may refer to a method of removing haze from the color negative image CN to thereby increase the visibility and sharpness of the image. One such technique of de-hazing the color negative image CN is set forth in Kaiming He, Jian Sun and Xiaoou Tang, "Single Image Haze Removal Using Dark Channel Prior", in the 2009 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2009). In an example embodiment, a processing means may be configured to de-haze the color negative image CN for recovering the de-hazed image DH. An example of the processing means may include the processor 202, which may be an example of the controller 108. In an example embodiment of generating the intermediate color image IC from the color image C, the apparatus 200 is caused to invert the de-hazed image DH to generate the intermediate color image IC. In an example embodiment, the intermediate color image IC may include a third luminance data (L3) and a second chrominance data C2. In an example embodiment, a processing means may be configured to invert the de-hazed image DH to generate the intermediate color image IC. An example of the processing means may include the processor 202, which may be an example of the controller 108. In an example embodiment, the apparatus 200 may be caused to generate the extended dynamic range luminance image EL by fusing the first luminance data L1 from the panchromatic image P, the second luminance data L2 from the color image C, and the third luminance data L3 from the intermediate color image IC. In an example embodiment, the apparatus 200 may be caused to generate the extended dynamic range color image (EC) by fusing the extended dynamic range luminance image EL with the second chrominance data C2.
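As a rough illustration of this invert–de-haze–invert pipeline, the following Python sketch applies a much-simplified dark channel prior in the spirit of the cited paper; the patch size, the fixed 100-pixel atmospheric-light sample, and the assumption of float RGB values in [0, 1] are illustrative choices, not the patent's specification:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dehaze_dark_channel(img, patch=15, omega=0.95, t0=0.1):
    """Simplified dark-channel-prior de-hazing (after He et al., CVPR 2009)."""
    dark = minimum_filter(img.min(axis=2), size=patch)   # dark channel
    # Atmospheric light A: mean color of the brightest dark-channel pixels.
    idx = np.unravel_index(np.argsort(dark, axis=None)[-100:], dark.shape)
    A = np.maximum(img[idx].mean(axis=0), 1e-6)
    # Transmission t(x) from the haze model I = J*t + A*(1 - t).
    t = 1.0 - omega * minimum_filter((img / A).min(axis=2), size=patch)
    J = (img - A) / np.maximum(t, t0)[..., None] + A     # scene radiance
    return np.clip(J, 0.0, 1.0)

def enhance_low_light(color):
    """[0073]-[0074]: invert, de-haze the hazy negative, invert back."""
    negative = 1.0 - color              # color negative image CN
    return 1.0 - dehaze_dark_channel(negative)  # intermediate color image IC
```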

[0075] In an example embodiment, the intermediate color image IC may be generated by processing the color image C for removal of low-light portions associated with the color image C. Herein, the 'low-light portions of the color image' may refer to those portions of the image that are associated with lower illumination as compared to the rest of the portions. In an embodiment, the processing of the color image C for removal of low-light portions may be performed to increase illumination or brighten the low-light portions. In an example embodiment, the portions corresponding to low-light portions of the processed color image (or the intermediate color image IC) may contain noise, and may further be processed to de-noise the intermediate color image IC. In an example embodiment, the de-noising of the intermediate color image IC may be performed based on the panchromatic image P, as the panchromatic image P is associated with much less noise as compared to the noise in the color image C and the intermediate color image IC. In an example embodiment, the apparatus 200 may be caused to perform a noise filtering of the color image based on the first luminance data associated with the panchromatic image.

[0076] In this example embodiment of de-noising the intermediate color image IC, the apparatus 200 may be caused to decompose the intermediate color image IC into an initial third luminance image IL3 and an initial second chrominance image IC2. Herein, 'decomposing the intermediate color image IC' may refer to separating the intermediate color image IC into corresponding luminance components and chrominance components. In an example embodiment, a processing means may be configured to decompose the intermediate color image IC. An example of the processing means may include the processor 202, which may be an example of the controller 108. In this example embodiment of de-noising the intermediate color image IC, the apparatus 200 may be caused to determine, for at least one portion of the panchromatic image P, a weight information based on a difference of gray pixel values of neighboring pixels in the at least one portion. The weight information (or weight masks) for the at least one portion of the panchromatic image P may be determined (or extracted) from the panchromatic image P to measure the information richness surrounding the at least one portion. In an embodiment, the apparatus 200 may be caused to determine the weight information for a plurality of portions of the panchromatic image P. In some example embodiments, the plurality of portions of the panchromatic image P may be associated with pixel groups in the panchromatic image P. In some example embodiments, the plurality of portions of the panchromatic image P may collectively form the complete panchromatic image P. In some example embodiments, the plurality of portions of the panchromatic image P may not collectively form the complete panchromatic image P, but may configure the panchromatic image P partially. In an example embodiment, the difference between a gray pixel value of the pixel and a gray pixel value of a neighboring pixel may assist in providing the weight information of the pixel. A method of determining the weight information associated with the portions/pixels of the panchromatic image P is described further with reference to FIGURE 5B. In an example embodiment, a processing means may be configured to determine the weight information for the at least one portion of the panchromatic image P. An example of the processing means may include the processor 202, which may be an example of the controller 108.

[0077] In this example embodiment of de-noising the intermediate color image IC, the apparatus 200 is caused to perform selective filtering of at least one portion of the intermediate color image IC based on the weight information of the at least one portion of the panchromatic image P, wherein the at least one portion of the intermediate color image IC corresponds to the at least one portion of the panchromatic image P. In an example embodiment, the apparatus 200 is caused to decompose the intermediate color image IC to generate components of the intermediate color image IC, wherein the components of the intermediate color image IC include a third luminance data L3 and a second chrominance data C2. In this example embodiment of de-noising the intermediate color image IC, the apparatus 200 is caused to perform selective filtering of portions of the initial third luminance image IL3 and the initial second chrominance image IC2 based on the weight information associated with the corresponding portions/pixels in the panchromatic image P. Hence, noise is filtered out from the initial third luminance image IL3 and the initial second chrominance image IC2 to generate the third luminance data L3 and the second chrominance data C2, respectively. A method of performing the selective filtering of the portions of the intermediate color image IC to generate the third luminance data L3 and the second chrominance data C2 is described further with reference to FIGURE 5C. In an example embodiment, a processing means may be configured to perform the selective filtering of the portions of the intermediate color image IC to generate the third luminance data L3 and the second chrominance data C2. An example of the processing means may include the processor 202, which may be an example of the controller 108. In this example embodiment, the apparatus 200 may be caused to generate the extended dynamic range luminance image EL by fusing the first luminance data L1 from the panchromatic image P, the second luminance data L2 from the color image C, and the third luminance data L3 from the intermediate color image IC.

[0078] Various embodiments of generation of the extended dynamic range luminance image EL of the scene based on the first luminance data L1 from the panchromatic image P, the second luminance data L2 from the color image C, and the third luminance data L3 from the intermediate color image IC are explained further with reference to FIGURES 8, 9A-9B, and 11. In various embodiments, the extended dynamic range luminance image EL of the scene is generated based only on the first luminance data L1 and the second luminance data L2. Such embodiments are explained with reference to FIGURES 6 and 7.

[0079] It will be understood that in various embodiments the extended dynamic range luminance image EL may be generated based on the first luminance data L1 from the panchromatic image P and the second luminance data L2 from the color image C only. However, utilization of such an extended dynamic range luminance image may result in generation of an extended dynamic range color image that may contain noise and/or may be associated with poor color quality. Various embodiments as described above and in conjunction with FIGURES 3A to 12C may facilitate in pre-processing at least one of the panchromatic image P and the color image C to generate the intermediate color image IC (and intermediate chrominance data) that may be utilized for generating a high quality extended dynamic range color image. For instance, in various example embodiments, the apparatus 200 may be caused to generate the extended dynamic range color image EC of the scene by fusing the extended dynamic range luminance image EL with a second chrominance data C2 (or a chrominance data derived from an intermediate color image IC and/or an intermediate luminance image). In various other example embodiments, the extended dynamic range color image EC of the scene is generated based on fusing the extended dynamic range luminance image EL with an interpolated first chrominance data (IC1). An example of the extended dynamic range color image EC that is generated based on fusing the extended dynamic range luminance image EL with the interpolated first chrominance data IC1 is shown and described with reference to FIGURE 3B.

[0080] FIGURE 3A illustrates an example representation of generation of an extended dynamic range luminance image, and of the extended dynamic range color image EC, in accordance with an example embodiment. As discussed with reference to some embodiments of FIGURE 2, the extended dynamic range luminance image EL may be generated by fusing at least the first luminance data L1 associated with the panchromatic image P and the second luminance data L2 of the color image C. In some embodiments, the color image C may be pre-processed for improving the quality of color in the extended dynamic range color image EC that may be generated by fusing the extended dynamic range luminance image EL with the color image C.

[0081] In an example embodiment, the color image C may be decomposed into the second luminance data L2 and the first chrominance data C1. In an example embodiment, for performing pre-processing of the color image C, the second luminance data L2 may be interpolated (a first-level interpolation of the two-level interpolation) using an interpolation method, for example an 8-neighborhood interpolation method, to generate a (first-level) interpolated second luminance data. In some embodiments, the first luminance data L1 and the (first-level) interpolated second luminance data may be images of similar size, but may be distorted due to a capture distance D and the baseline distance d associated with the panchromatic camera and the Bayer camera. In an embodiment, the first luminance data L1 and the (first-level) interpolated second luminance data may further be processed to minimize the effects of distortion.

[0082] In an example embodiment, the processing of the first luminance data L1 and the (first-level) interpolated second luminance data may be performed by interpolating the first luminance data L1 and the (first-level) interpolated second luminance data to generate an interpolated first luminance data IL1 and a (second-level) interpolated second luminance data IL2. In an example embodiment, the first luminance data L1 and the (first-level) interpolated second luminance data may be interpolated based on an 8-neighborhood interpolation method. In an example embodiment, the interpolated first luminance data IL1 and the interpolated second luminance data IL2 may be fused to generate the extended dynamic range luminance image EL.

[0083] As shown in FIGURE 3A, an extended dynamic range luminance image EL is illustrated. Particularly, a portion 300 of the extended dynamic range luminance image EL is illustrated. It will be noted that the portion 300 is a pixel-level diagram of a portion of the extended dynamic range luminance image EL and is representative of the extended dynamic range luminance image EL, and is shown for the purpose of explanation of the 8-neighborhood interpolation method. The same 8-neighborhood interpolation method may be extended throughout the image to generate the complete extended dynamic range luminance image EL. The pixel-level representation of the extended dynamic range luminance image EL 300 illustrates gray pixels 302, 304, 306, 308, 310, 312, 314, 316 and 318 as the pixels of the extended dynamic range luminance image. Herein, the pixels 302, 306, 310 and 314 may be the gray pixels of the panchromatic image P, and may be derived from the interpolated first luminance data IL1. Also, a gray pixel, for example the gray pixel 304, may be a pixel of the color image C and may be derived from the interpolated second luminance data IL2. It will be noted that, in a manner similar to the process of the fusion of the gray pixels 302, 304, 306, 310 and 314, other gray pixels from the interpolated first luminance data IL1 and the interpolated second luminance data IL2 may be fused to generate the extended dynamic range luminance image EL.

[0084] By performing the 8-neighborhood interpolation for the first luminance data L1 and the second luminance data L2, the number of pixels in an image may be increased and missing color (gray) information for the pixels may be determined. For instance, for a pixel G represented as the pixel 318, missing color (gray) information associated with the pixel 318 is determined by the 8-neighborhood interpolation method. The missing color information of the pixel 318 may be approximated by taking an average of pixel values of pixels in the neighborhood of the pixel 318, for example, the 8-neighborhood pixels including the gray pixels G11 to G18 that are represented as the gray pixels 302, 304, 306, 308, 310, 312, 314, and 316. Since the gray pixels 302, 304, 306, 310 and 314 include some color (gray) information, the missing color (gray) information of the pixel 318 is interpolated from known samples of corresponding components at the gray pixels 302, 304, 306, 308, 310, 312, 314, and 316.
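The 8-neighborhood averaging described above may be sketched as follows, assuming a dense grid in which a boolean mask marks the pixels whose gray values are known; the function name and the simple looping strategy are illustrative:

```python
import numpy as np

def eight_neighborhood_fill(values, known):
    """Fill each missing pixel with the average of its known 8-neighbors,
    as for pixel G (318) in FIGURE 3A."""
    known = known.astype(bool)
    out = values.astype(np.float64).copy()
    h, w = values.shape
    for y in range(h):
        for x in range(w):
            if known[y, x]:
                continue
            # Collect the known samples among the up-to-8 surrounding pixels.
            neigh = [values[ny, nx]
                     for ny in range(max(y - 1, 0), min(y + 2, h))
                     for nx in range(max(x - 1, 0), min(x + 2, w))
                     if (ny, nx) != (y, x) and known[ny, nx]]
            if neigh:
                out[y, x] = float(np.mean(neigh))
    return out
```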

[0085] Referring to FIGURE 3B, a pixel-level representation of a portion of the interpolated first chrominance data IC1 is illustrated. The first chrominance data C1 of the color image C may be pre-processed for improving color information associated with the color image C. In some example embodiments, the color image C may be smaller in size as compared to the extended dynamic range luminance image EL, and therefore may be pre-processed prior to performing fusion of the first chrominance data C1 from the color image C with at least the first luminance data L1 from the panchromatic image P. In an example embodiment, the color image C may be smaller in size due to a spatial resolution of the second image sensor associated with the Bayer camera being lower (for example, a 2 megapixel camera sensor) than a spatial resolution of the first image sensor associated with the panchromatic camera (for example, a 5 megapixel camera sensor), as detail information in a resulting image, for example the extended dynamic range luminance image EL, is mainly composed from the panchromatic image P. Herein, 'spatial resolution' may refer to the number of pixels used in constructing an image. In one example, the quality of the color image C and the image quality of a dual camera (including the panchromatic camera and the Bayer camera) can be improved if the pixel size of pixels in the second image sensor associated with the Bayer camera is larger, leading to a smaller number of pixels for the same sensor area. In an embodiment, for performing pre-processing of the color image C, the first chrominance data C1 may be interpolated (a two-level interpolation) by using an interpolation method, for example a bilinear interpolation method, to generate the interpolated first chrominance data IC1. As shown in FIGURE 3B, the pixel-level representation of the interpolated first chrominance data IC1 320 illustrates color pixel sets 322, 324, 326, and 328 as color pixel sets of the interpolated first chrominance data IC1. For instance, the color pixel set 322 includes color pixels corresponding to Red, Green, Blue, and Green colors represented as R1, G11, B1, and G12, the color pixel set 324 includes color pixels R2, G21, B2, and G22, the color pixel set 326 includes color pixels R3, G31, B3, and G32, and the color pixel set 328 includes color pixels R4, G41, B4, and G42. The extended dynamic range luminance image EL and the interpolated first chrominance data IC1 are fused to generate the extended dynamic range color image EC. It should be noted that, in a manner similar to the process of the fusion of pixels from the extended dynamic range luminance image EL and the interpolated first chrominance data IC1, as described above, other pixels from the extended dynamic range luminance image EL and the interpolated first chrominance data IC1 are fused to generate the extended dynamic range color image EC.

[0086] By performing the bilinear interpolation for the first chrominance data C1 in two levels, the number of pixels in the intermediate color image IC is increased and missing color information (two of the three primary colors) for the pixels may be determined. For instance, for a pixel 330, missing color (green) information associated with the pixel 330 may be determined by the bilinear interpolation method. The missing color (green) information of the pixel 330 may be interpolated by taking a linear average of neighboring pixels representing the missing color (green) information. For example, to determine the missing green color information for the pixel 330, components from the neighboring pixels representing the green color information from the color pixel sets 322, 324, 326 and 328 may be averaged. Hence the missing green color information of the pixel 330, or G, is interpolated as (G12 + G22 + G31 + G41)/4, where G12, G22, G31 and G41 are representative of the green color information (green pixel values) in the color pixel sets 322, 324, 326 and 328, respectively.
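A hedged sketch of this bilinear averaging follows, assuming one color plane with a boolean mask of known samples and a 4-neighbor kernel matching the (G12 + G22 + G31 + G41)/4 example; real Bayer layouts may also require diagonal neighbors, and the function name is illustrative:

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_fill(plane, known):
    """Fill missing samples of one color plane with the average of the
    available 4-connected known neighbors."""
    known = known.astype(bool)
    k = np.array([[0., 1., 0.],
                  [1., 0., 1.],
                  [0., 1., 0.]])
    sums = convolve(plane * known, k, mode='mirror')      # neighbor sums
    counts = convolve(known.astype(float), k, mode='mirror')
    out = plane.astype(np.float64).copy()
    missing = ~known
    out[missing] = sums[missing] / np.maximum(counts[missing], 1.0)
    return out
```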

[0087] FIGURES 4A, 4B and 4C illustrate an example representation of a method for alignment of images, for example a panchromatic image P and a color image C, for facilitating generation of an extended dynamic range color image EC, in accordance with an example embodiment. Herein, the color image C may be decomposed into a decomposed panchromatic image DP and a decomposed color image DC. In an example embodiment, the alignment of the panchromatic image P and the color image C may be performed by determining sets of feature pixels in the panchromatic image P and the decomposed panchromatic image DP, followed by a bi-directional pixel matching between the corresponding sets of feature pixels. Based on the bi-directional pixel matching, shift values between the corresponding feature pixels of the panchromatic image P and the decomposed panchromatic image DP may be determined. The shift values between the corresponding feature pixels of the panchromatic image P and the decomposed panchromatic image DP may be utilized for determining the shift values between non-feature pixels of the panchromatic image P and the decomposed panchromatic image DP. Based on the shift values between the non-feature pixels of the panchromatic image P and the decomposed panchromatic image DP, the panchromatic image P and the color image C may be aligned.

[0088] In an example embodiment, the sets of feature pixels of the panchromatic image P and the decomposed panchromatic image DP may be determined based on a bi-directional pixel matching between the panchromatic image P and the decomposed panchromatic image DP. For example, referring to FIGURE 4A, the panchromatic image P (for example, an image 402) and a decomposed panchromatic image DP (for example, an image 404) are illustrated. In an example embodiment, the image 402 may include a set of feature pixels FP1, for example, feature pixels 406a, 406b, 406c, and the like. It will be noted that the set of feature pixels FP1 includes more pixels than the pixels 406a, 406b, and 406c; however, for the sake of brevity of description, only three of the feature pixels are shown as numbered in FIGURE 4A. As illustrated in FIGURE 4A, the set of feature pixels may include the pixels shown as dots in the image 402. In an example embodiment, the set of feature pixels FP1 may be determined using the Hessian matrix. For example, for the pixels in the panchromatic image P and the decomposed panchromatic image DP, a Hessian matrix and corresponding eigenvalues may be computed to determine the set of feature pixels FP1 and the corresponding set of feature pixels FP2 in the panchromatic image P and the decomposed panchromatic image DP, respectively. As used herein, the Hessian matrix is a symmetric matrix, including the eigenvalues, which is used by a Hessian affine detector to detect features of an image. A local maximum may be determined by comparing pixels of the panchromatic image P or the color image C with a set of neighboring pixels. A pixel may be determined as a local maximum if the neighboring pixels are below a threshold maxima and the pixel is above the threshold maxima. The pixel is hence selected as a feature pixel. It will be noted that, in a manner similar to the process of the determination of the feature pixel using the Hessian matrix as described above, other feature pixels from the set of feature pixels FP1 and the corresponding set of feature pixels FP2 (for example, feature pixels 408a, 408b, and 408c) may be determined for the image 404. It will be noted that the corresponding set of feature pixels FP2 may include more pixels than the pixels 408a, 408b, and 408c; however, for the sake of brevity of description, only three of the feature pixels are shown as numbered in FIGURE 4A. As illustrated in FIGURE 4A, the corresponding set of feature pixels FP2 may include the pixels shown as dots in the image 404.
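One plausible reading of this Hessian-based detection is sketched below, using the determinant of the Hessian (the product of its two eigenvalues) as the response together with a 5x5 local-maximum test; the smoothing scale, window size, and threshold are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def hessian_feature_pixels(gray, sigma=2.0, thresh=0.01):
    """Return (row, col) locations whose Hessian response is a local maximum
    above a threshold."""
    g = gray.astype(np.float64)
    # Second derivatives of the smoothed image form the Hessian per pixel.
    ixx = gaussian_filter(g, sigma, order=(0, 2))
    iyy = gaussian_filter(g, sigma, order=(2, 0))
    ixy = gaussian_filter(g, sigma, order=(1, 1))
    det = ixx * iyy - ixy ** 2          # product of the two eigenvalues
    local_max = det == maximum_filter(det, size=5)
    return np.argwhere(local_max & (det > thresh))
```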

[0089] In an example embodiment, the bi-directional pixel matching is performed between the set of feature pixels FP1 in the image 402 and the corresponding set of feature pixels FP2 in the image 404 in order to determine a corresponding first plurality of shift values S1 between the set of feature pixels FP1 and the corresponding set of feature pixels FP2. In an embodiment, the feature pixels in the set of feature pixels FP1 and the corresponding set of feature pixels FP2 may be represented as feature vectors. For example, a feature pixel 406a in the image 402 may be represented as a feature vector V(P). In an example embodiment, the feature vector V(P) may be a SURF (Speeded-up Robust Features) feature vector.

[0090] In order to determine a shift value S11 for the feature pixel 406a, the feature vector V(P) may be matched with a corresponding feature vector in the image 404. For the feature vector V(P) in the image 402, a local search may be performed within a corresponding neighborhood region 410 in the image 404. The corresponding neighborhood region 410 may include one or more feature vectors. In an embodiment, a similarity score may be assigned to the vectors of the one or more feature vectors in the corresponding neighborhood region 410 based on a similarity of the one or more feature vectors with the feature vector V(P). In an example embodiment, the feature vector with the highest similarity score, and having a similarity score higher than a pre-defined threshold similarity score, may be determined as a corresponding feature vector V(DP) for the feature vector V(P). In an example embodiment, the feature vector V(DP) may be a SURF feature vector.

[0091] In an example embodiment, for the feature vector V(DP), a similar matching process (as described above) may be repeated in a corresponding neighborhood region, for example, a region 412 in the image 402, in order to confirm that the feature vector V(P) and the feature vector V(DP) are a best match. The shift value S11 may thereby be determined by taking a difference of the feature vector V(P) and the feature vector V(DP). By performing such bi-directional pixel matching for the remaining feature pixels (other than the feature pixel 406a) of the set of feature pixels FP1 of the image 402 (the panchromatic image P) with the corresponding set of feature pixels FP2 of the image 404 (the decomposed panchromatic image DP), the remaining shift values (other than the shift value S11) of the corresponding first plurality of shift values S1 may be determined. The set of feature pixels FP1, after the bi-directional matching, may then be defined as a collection S = {(FP11, S11), (FP12, S12), ..., (FP1i, S1i), ..., (FP1n, S1n)}, where FP1i = (Xi, Yi) is a feature pixel location in an image plane, n is the number of matched feature pixels, and S1i is the corresponding shift value of FP1i.
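A compact sketch of such bi-directional (mutual best) matching over two descriptor sets might look as follows; cosine similarity over all descriptors stands in for the neighborhood-restricted SURF search described above, and the threshold value and function name are assumptions:

```python
import numpy as np

def bidirectional_match(desc_p, desc_dp, min_score=0.8):
    """Mutual-best matching of two descriptor sets (one descriptor per row,
    assumed non-zero); returns index pairs (i in P, j in DP)."""
    a = desc_p / np.linalg.norm(desc_p, axis=1, keepdims=True)
    b = desc_dp / np.linalg.norm(desc_dp, axis=1, keepdims=True)
    sim = a @ b.T                       # cosine similarity scores
    fwd = sim.argmax(axis=1)            # best DP candidate for each P feature
    bwd = sim.argmax(axis=0)            # best P candidate for each DP feature
    # Keep pairs that agree in both directions and exceed the threshold.
    return [(i, j) for i, j in enumerate(fwd)
            if bwd[j] == i and sim[i, j] >= min_score]
```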

[0092] In an example embodiment, based on the feature pixel locations, a mesh network associated with the image may be constructed for determining a second plurality of shift values S2 associated with non-feature pixels of the images 402 and 404. In an example embodiment, the feature pixels determined for the image, for example the image 402, may be connected to form a mesh network. For example, as illustrated in FIGURE 4B, a local mesh network 414 associated with the image 402 may be constructed based on the set of feature pixels FP1. In an example embodiment, the mesh network 414 may have the properties of local linearity and local continuity. In an example embodiment, the shift values of pixels and neighbors thereof may lie on or close to a locally linear patch of the manifold, which therefore characterizes each local patch as a triangle structure. In an example embodiment, the mesh network 414 may be constructed by applying a triangulation method, for example a Delaunay triangulation method, to the set of feature pixels FP1 of the image 402. In an example embodiment, based on the Delaunay triangulation method, regions or patches in the image 402 may be modeled with a triangle mesh. In an example embodiment, the Delaunay triangulation method may be utilized for modeling a plurality of triangle meshes for a plurality of regions or patches in the image 402, so as to define a set T comprising the plurality of triangle meshes. In an example embodiment, the set T comprising the plurality of triangle meshes may be defined as T = {T1, T2, ..., Ti, ..., Tm}, where m is the number of triangle meshes and Ti comprises three different points and their corresponding warping shift values from the collection S.
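In practice, such a Delaunay mesh over the matched feature pixels can be obtained directly from a library; a minimal sketch with toy coordinates (the values are purely illustrative) follows:

```python
import numpy as np
from scipy.spatial import Delaunay

# Matched feature pixel locations (Xi, Yi) from the collection S; toy values.
fp1 = np.array([[10.0, 12.0], [80.0, 15.0], [45.0, 70.0],
                [90.0, 88.0], [15.0, 85.0]])
mesh = Delaunay(fp1)      # Delaunay triangulation over the feature pixels
# Each row of mesh.simplices lists the vertex indices of one triangle Ti.
print(mesh.simplices)
```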

[0093] In an example embodiment, based on a linear geometry of the triangular meshes of the mesh network 414, the second plurality of shift values for non-feature pixels in the mesh network 414 of the image 402 may be determined. In an example embodiment, the second plurality of shift values S2 for the non-feature pixels may be determined based on an interpolation operation being performed using the first plurality of shift values S1 of the set of feature pixels FP1.

[0094] Referring to FIGURE 4C, an example representation of a method for determining the corresponding second plurality of shift values S2 for the plurality of non-feature pixels NFP1 of the image 402 (the panchromatic image P) is illustrated, in accordance with an example embodiment. In an example embodiment, the shift values of the corresponding second plurality of shift values S2 may be determined for a first set of non-feature pixels of the plurality of non-feature pixels NFP1 present within the local mesh network 414. In an example embodiment, a triangle 416 (encircled by a circle with the vertices of the triangle touching the circle) in the local mesh network 414 is considered to describe the determination of the shift values for the first set of non-feature pixels present within the local mesh network 414. In an example embodiment, the triangle 416 may be represented by three vertices. For example, the triangle 416 of the mesh network 414 may be shown to include vertices such as L(1), L(2), and L(3). Herein, the vertices L(1), L(2), and L(3) may correspond to the feature pixels FP11, FP12, and FP13 of the set of feature pixels FP1. A triangle representing the vertices L(1), L(2), and L(3) of the triangle 416 is illustrated as a triangle 416(1). In an example embodiment, for each non-feature pixel, for example a non-feature pixel 418 in the triangle 416, based on local linearity, the location of the non-feature pixel 418 may be determined using an interpolation operation on the locations of the three vertices L(1), L(2), and L(3) and the corresponding shift values S(1), S(2), and S(3), respectively. A triangle representing the corresponding warping shift values S(1), S(2), and S(3) at the three vertices L(1), L(2), and L(3) of the triangle 416 is illustrated as a triangle 416(2).

[0095] In an example embodiment, the location of the non-feature pixel 418 may be given as N = Σk Wk L(k), where k = 1, 2, 3 and Wk is a weight coefficient. In an example embodiment, the weight coefficients Wk may be determined as per the following equation (5):

N = W1 L(1) + W2 L(2) + W3 L(3)

=> N = (L(1) L(2) L(3)) · (W1 W2 W3)^T

=> (L(1) L(2) L(3))^T · N = (L(1) L(2) L(3))^T · (L(1) L(2) L(3)) · (W1 W2 W3)^T

=> | L(1)N |   | L(1)L(1)  L(1)L(2)  L(1)L(3) |   | W1 |
   | L(2)N | = | L(1)L(2)  L(2)L(2)  L(2)L(3) | · | W2 |
   | L(3)N |   | L(1)L(3)  L(2)L(3)  L(3)L(3) |   | W3 |        (5)

[0096] Using the above equation (5), a shift value (SN) of the non-feature pixel N may be estimated as a linear combination with the same weight coefficients, and may be represented as per the following equation (6):

SN = Σk Wk S(k)        (6)

[0097] In an embodiment, based on equation (6), the shift values for the plurality of non-feature pixels of a triangle, for example the triangle 416 of the mesh network 414, may be determined. In an example, a triangle represented as triangle 416(3) may be determined, showing a vector representation for the plurality of non-feature pixels within the triangle 416 in addition to the vector representation for pixels at the vertices of the triangle 416.
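Solving the small linear system of equation (5) for the weights W1, W2 and W3 amounts to taking barycentric coordinates of the non-feature pixel within its enclosing triangle, which the triangulation object from the earlier sketch exposes directly; a hedged sketch, with mesh (a scipy Delaunay object), shifts (one shift vector per feature pixel) and point as assumed inputs:

```python
import numpy as np
from scipy.spatial import Delaunay

def shift_at(mesh, shifts, point):
    """Barycentric weights W1..W3 of the enclosing triangle (equation (5)),
    then S_N = sum_k W_k * S(k) (equation (6))."""
    p = np.asarray(point, dtype=float)
    s = int(mesh.find_simplex(p[None, :])[0])
    if s < 0:
        raise ValueError("point lies outside the triangulated mesh")
    t = mesh.transform[s]                 # affine map to barycentric coords
    b = t[:2].dot(p - t[2])
    w = np.append(b, 1.0 - b.sum())       # weight coefficients W_k
    return w @ shifts[mesh.simplices[s]]  # interpolated shift value S_N
```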

[0098] Referring now to FIGURE 4D, a plurality of shift values of the corresponding second plurality of shift values S2 are determined for a second set of non-feature pixels of the plurality of non-feature pixels NFP2 present outside the local mesh network 414. To determine the shift values for the non-feature pixels NFP2 present outside the local mesh network 414, the local mesh network 414 may be expanded by extending two sides of the triangles in the set of triangles until the intersection of the extended sides with an image border, for example, an image border 420, is reached. In an embodiment, the intersection of the extended sides with the image border 420 may configure a polygon. For example, on extending the sides of the triangle 416, the extended sides may intersect the image border 420 of the image 402 to generate a polygon 422. The polygon 422 is considered to describe the determination of the second plurality of shift values S2 for the second set of non-feature pixels NFP2 present outside the local mesh network 414. For a non-feature pixel, for example, a pixel 424 in the polygon 422, the triangle (for example, the triangle 416) that is extended to generate the polygon 422 is represented with the three vertices L(1), L(2), and L(3). A triangle representing the vertices L(1), L(2), and L(3) corresponding to the feature pixels FP11, FP12, and FP13 of the triangle 416 in the polygon 422 is illustrated as 422(1). A spatial location of the non-feature pixel N (shown as LN in FIGURE 4D) is determined using an interpolation operation on the locations of the three vertices L(1), L(2), and L(3) and the corresponding shift values S(1), S(2), and S(3). A polygon representing the vertices L(1), L(2), and L(3) and the corresponding shift values S(1), S(2), and S(3) in the polygon 422 is illustrated as a polygon 422(2). In an embodiment, a shift value SN for the non-feature pixel 424 (as illustrated in the polygon 422(2)) may be determined in a manner similar to the determination of the shift value SN of the non-feature pixel 418 in the triangle 416, by using the equations (5) and (6). A polygon 422(3) may be determined that may provide a vector representation for the non-feature pixels (the second set of non-feature pixels) present outside the triangle 416 in addition to the vertices of the triangle 416.

[0099] FIGURES 5A, 5B and 5C illustrate an example representation of a method for performing de-noising during generation of an extended dynamic range color image EC, in accordance with an example embodiment. In an example embodiment, for generation of the extended dynamic range color image EC, a panchromatic image P and a color image C associated with a scene may be captured; and at least a luminance data from the panchromatic image P and a luminance data from the color image C may be fused to generate an extended dynamic range luminance image EL. The extended dynamic range luminance image EL may then be fused with a chrominance data from the color image C to generate the extended dynamic range color image EC. In some scenarios, the color image C may be pre-processed so as to improve the chrominance data (for example, a first chrominance data C1) associated with the color image C.

[00100] In an example embodiment, the color image C may be pre-processed such that the pre-processed color image may have intensity similar to the intensity of the panchromatic image P. In an example embodiment, the pre-processed color image (also referred to as the intermediate color image IC) may be associated with lower noise components. In an example embodiment, the intermediate color image IC may be generated by processing the color image C for removal of low-light portions associated with the color image C. Herein, the 'low-light portions of the color image' may refer to those portions of the image that are associated with lower illumination as compared to the rest of the portions. In an embodiment, the processing of the color image C for removal of low-light portions may be performed to increase illumination or brighten the low-light portions. In an example embodiment, the portions corresponding to low-light portions of the processed color image (or the intermediate color image IC) may contain noise, and may further be processed to de-noise the intermediate color image IC.

[00101] Referring to FIGURE 5A, an implementation of de-noising of the intermediate color image IC is illustrated, in accordance with an example embodiment. In an example embodiment, the de-noising of the intermediate color image IC, for example, an intermediate color image 502 may be performed based on the panchromatic image P, for example, a panchromatic image 504 associated with the intermediate color image 502. In an example embodiment, the de-noising of the intermediate color image 502 is performed based on the panchromatic image 504 as the panchromatic image 504 is associated with much less noise as compared to the noise in the color image C and the intermediate color image 502.

[00102] The intermediate color image 502 is shown to include a pixel (x0, y0) represented as 506. The intermediate color image 502 may be decomposed into an initial third luminance image IL3 and an initial second chrominance image IC2. In order to de-noise the pixel (x0, y0) (represented as 506) in the intermediate color image 502, an information richness associated with a corresponding pixel (x1, y1) (represented as 508) in the panchromatic image 504 may be utilized. In an example embodiment, the information richness associated with the corresponding pixel 508 may be determined based on a weight information computed for the corresponding pixel 508. In an example embodiment, the information richness of the corresponding pixel 508 may be computed by performing a weight mask computation on a local area of the corresponding pixel 508 in the panchromatic image 504. Herein, the term 'weight mask' may refer to a reproduced area or pattern that may include representations of critical regions of an image by means of weights for use in selective area correction. Hence, by performing the weight mask computation on a local area of the corresponding pixel 508 of the panchromatic image 504, a weight mask 510 (represented as M(x0, y0)) may be determined. In an embodiment, the weight mask 510 may be applied to the pixel 506 in the intermediate color image 502. In an example embodiment, the pixels of the intermediate color image 502 that correspond to those pixels of the panchromatic image 504 that are associated with a weight information less than a threshold weight may be filtered smoothly. In an embodiment, selective filtering of the pixels of the panchromatic image 504 based on corresponding weight masks associated with those pixels may facilitate in removal of noise at the corresponding pixels in the intermediate color image 502. The weight mask computation for the pixels of the panchromatic image 504 is explained further in detail with reference to FIGURE 5B and FIGURE 5C.

[00103] Referring to FIGURE 5B, weight mask computation of the local area, for example a local area 520, of a pixel, for example the pixel 508 in the panchromatic image 504, is illustrated. In an example embodiment, the weight mask computation for a pixel in the panchromatic image 504 may be performed based on a difference of gray pixel values of that pixel and neighboring pixels in the panchromatic image 504. Herein, the local area 520 corresponding to the pixel 508 is illustrated as a box of size L*L that is centered at the pixel 508 of the panchromatic image 504. In an embodiment, gray pixel values of the pixel 508 and a neighboring pixel 522 of the panchromatic image 504 may be represented as g(x0, y0) and g(p, q), respectively. In an example embodiment, for the neighboring pixel 522, a difference (Δ) of the gray pixel values of the neighboring pixel 522 and the pixel 508 may be represented as per the following equation (7):

Δ = |g(p, q) - g(x0, y0)|        (7)

[00104] In an example embodiment, the weight information corresponding to the neighboring pixel (p, q) may include a weight element w(p, q) in the weight mask 510. In an example embodiment, based on the difference Δ (determined as per equation (7)), the weight element w(p, q) (represented as 524) for the neighboring pixel 522 may be determined as per the following equation (8):

w(p, q) = T - Δ, if Δ < T; w(p, q) = 0, otherwise        (8)

where T is a threshold pixel value associated with the pixel 508.

[00105] In an example embodiment, a value of the weight element w(p, q) may facilitate in measuring the information richness inside the local area 520 of the pixel 508. For instance, if the local area 520 is blank, most of the differences Δ may equal zero, and the values of the corresponding weight elements w(p, q) may be equal to T, thereby implying that most pixels in the local area 520 contribute equal weights for the subsequent de-noising (or filtering) of the image 502. However, if the pixel 508 is an edge point or a texture point, most weight elements w(p, q) in the local area 520 are zeroes, and do not contribute to the de-noising of the intermediate color image 502, thereby implying that a pixel associated with an edge point or a texture point may be retained, as it is informative in the image 502. In an example embodiment, the values of T and L may determine the extent of the de-noising (or the filtering power). The value of the weight element w(x0, y0) is represented as 526 in FIGURE 5B. Herein, it will be noted that the description of the computation of the weight mask for the intermediate color image, as explained with reference to FIGURES 5A and 5B, is equally applicable to the component images, such as the initial third luminance image IL3 or the initial second chrominance image IC2 of the intermediate color image 502.
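A sketch of this weight-mask computation for one pixel follows; the linear falloff used for equation (8) matches the behavior described above (weights equal to T in blank areas, zero across edges) but is an assumed reconstruction, as are the default values of L and T:

```python
import numpy as np

def weight_mask(pan, x0, y0, L=7, T=20.0):
    """Equations (7)-(8) for one pixel: differences of the local area to the
    center gray value, mapped to weights that vanish across edges/texture.
    Assumes (x0, y0) lies at least L//2 pixels from the image border."""
    r = L // 2
    area = pan[y0 - r:y0 + r + 1, x0 - r:x0 + r + 1].astype(np.float64)
    delta = np.abs(area - float(pan[y0, x0]))     # equation (7)
    return np.where(delta < T, T - delta, 0.0)    # assumed linear form of (8)
```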

[00106] Referring now to FIGURE 5C, a component image 530 of an intermediate image, for example the intermediate color image 502, is illustrated. It will be noted that the component image 530 of the intermediate color image 502 may be one of the luminance component of the intermediate color image 502, the initial third luminance image IL3 and the initial second chrominance image IC2 of the intermediate color image 502. In an example embodiment, the components or component image 530 of the intermediate color image 502 (for example, the luminance component of the intermediate color image 502 or the chrominance components of the intermediate color image 502) may be de-noised by selectively filtering the portions/pixels of the respective components/component image based on the weight masks as determined from the above methods.

[00107] In the component image 530 of the intermediate color image 502, corresponding to a pixel 532, a local area 534 of size L*L that is also centered at the pixel 532 is determined. A weight mask, for example the weight mask 510, may be determined for the component image 530. If a gray pixel value of the pixel 532 in the image 530 is f(x0, y0) and a gray pixel value of a neighboring pixel 536 in the image 530 is f(p, q), weight elements of the pixels in the local area 534 are determined, and the new pixel values of the component image 530 after de-noising or filtering may be determined as per the following equation (9):

f'(x0, y0) = [ Σ(p,q) w(p, q) · f(p, q) ] / [ Σ(p,q) w(p, q) ]        (9)

where f(p, q) is the gray pixel value of the pixel (p, q) in the local area 534 of size L*L, and the summations run over the pixels of the local area 534. In an embodiment, the gray pixel values may be updated pixel by pixel by applying a corresponding weight mask, for example the weight mask M(x0, y0) (represented as 510), to filter noise present in the component image 530, with reliable weighting information (the weight elements w(p, q)) obtained from the image 504. The image 530, after the de-noising or the filtering, is represented as an image 538 in FIGURE 5C. Also, the pixel corresponding to the pixel 532 is represented as pixel 540 in the image 538.
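Putting equations (7) through (9) together, a hedged end-to-end sketch of the selective filtering of one component image (borders left unfiltered for brevity, and the normalization by the summed weights in equation (9) assumed) might be:

```python
import numpy as np

def filter_component(comp, pan, L=7, T=20.0):
    """De-noise one component image using weight masks taken from the
    aligned panchromatic image (equation (9))."""
    r = L // 2
    out = comp.astype(np.float64).copy()
    h, w = comp.shape
    for y in range(r, h - r):
        for x in range(r, w - r):
            # Weight mask from the panchromatic local area (equations (7)-(8)).
            area = pan[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
            delta = np.abs(area - float(pan[y, x]))
            mask = np.where(delta < T, T - delta, 0.0)
            s = mask.sum()
            if s > 0:
                patch = comp[y - r:y + r + 1, x - r:x + r + 1]
                out[y, x] = (mask * patch).sum() / s   # weighted average
    return out
```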

[00108] It will be noted that FIGURES 3A to 5C are provided for the representation of examples only, and should not be considered limiting to the scope of the various example embodiments. It should be noted that the extended dynamic range color images generated using above methods provide an image of high quality.

[00109] FIGURE 6 is a flowchart depicting an example method 600 for generation of an extended dynamic range color image (EC), in accordance with an example embodiment. Example references are made to FIGURES 2 to 5C for the description of the method 600. The method 600 depicted in the flowchart may be executed by, for example, the apparatus 200 of FIGURE 2.

[00110] At 602, the method 600 includes facilitating receipt of a panchromatic image (P) and a color image (C) associated with a scene. In an example embodiment, the panchromatic image P may include a first luminance data (L1), and the color image C may include a second luminance data (L2) and a first chrominance data (C1). In an example embodiment, the panchromatic image P may be captured by a first image sensor of the one or more image sensors 208 (FIGURE 2) in a panchromatic camera. In an example embodiment, the panchromatic camera may be present in or otherwise accessible to the apparatus 200. In an example embodiment, the color image C may be captured by a second image sensor of the one or more image sensors 208 (FIGURE 2) in a Bayer camera present in or otherwise accessible to the apparatus 200. In an example embodiment, a baseline distance between the panchromatic camera and the Bayer camera may be less than a threshold baseline distance. In an example embodiment, if the baseline distance is much less than the threshold baseline distance, the panchromatic image P and the color image C can be aligned easily (pixel to pixel) using techniques known in the art.

[00111] In an example embodiment, the exposures of the panchromatic camera and the Bayer camera may be metered for facilitating the panchromatic camera and the Bayer camera to capture shadow portions and highlight portions of the scene. In an example embodiment, metering the exposures of the panchromatic camera and the Bayer camera facilitates in capturing the panchromatic image P and the color image C as an overexposed image and an underexposed image, respectively, of the scene. In an example embodiment, the exposures of the panchromatic camera and the Bayer camera may be metered by adjusting one or more exposure parameters associated with the panchromatic camera and the Bayer camera. The one or more exposure parameters include an exposure time, gain and aperture. In an example, given the sensitivity of the panchromatic camera, using an identical exposure will result in a significantly brighter image, thus resulting in the color image C being underexposed with respect to the panchromatic image P. In an example embodiment, the exposures of the panchromatic camera and the Bayer camera may be metered by adjusting an exposure time associated with the panchromatic camera to be equal to an exposure time associated with the Bayer camera. In other example embodiments, the gain and the aperture can also be varied to meter the exposures of the panchromatic camera and the Bayer camera.

[00112] In an example embodiment, the terms 'panchromatic image' and 'color image' may be indicative of the first luminance data L1, and the second luminance data L2 and the first chrominance data C1, respectively. For example, in case an image of the scene is captured using a camera embodying a CFA (color filter array) based sensor, then instead of capturing the panchromatic image P and the color image C, the CFA based sensor may capture the first luminance data L1, the second luminance data L2 and the first chrominance data C1 from the scene. The CFA based sensor includes a filter array of transparent filters and color filters such that the filter array of transparent filters and color filters may be utilized for capturing the first luminance data L1, and the second luminance data L2 and the first chrominance data C1, respectively. In an example embodiment, an exposure of the filter array of transparent filters and color filters of the camera may be metered. In an example embodiment, metering the exposures of the filter array of transparent filters and color filters facilitates in generating the panchromatic image P as an overexposed image and the color image C as an underexposed image of the scene.

[00113] At 604, the method 600 includes generating an extended dynamic range luminance image (EL) based at least on the first luminance data L1 and the second luminance data L2. In an example embodiment, the extended dynamic range luminance image EL is generated by fusing the first luminance data L1 and the second luminance data L2. In an example embodiment, the fusion of the first luminance data L1 and the second luminance data L2 is performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion. In some embodiments, the extended dynamic range luminance image EL may be generated based on fusing the first luminance data L1, the second luminance data L2, and an intermediate luminance data (for example, the third luminance data L3), wherein the intermediate luminance data may be generated by performing processing of the color image C. Various embodiments describing the generation of the extended dynamic range luminance image EL are explained further with reference to FIGURES 8, 9A and 9B, 10 and 11.
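One of the named fusion methods, principal component analysis based fusion, can be sketched as follows for two aligned, equal-sized luminance images; weighting the inputs by the dominant eigenvector of their covariance is a common formulation of PCA fusion, assumed here rather than taken from the patent:

```python
import numpy as np

def pca_fuse(l1, l2):
    """PCA-based fusion: weights from the dominant eigenvector of the
    covariance of the two luminance images."""
    data = np.stack([l1.ravel().astype(np.float64),
                     l2.ravel().astype(np.float64)])
    cov = np.cov(data)                        # 2x2 covariance matrix
    vals, vecs = np.linalg.eigh(cov)
    v = np.abs(vecs[:, np.argmax(vals)])      # dominant eigenvector
    w = v / v.sum()                           # normalized fusion weights
    return w[0] * l1 + w[1] * l2
```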

[00114] At 606, the method 600 includes generating an extended dynamic range color image (EC) of the scene based on the extended dynamic range luminance image EL and at least the first chrominance data C1 associated with the color image C. In an example embodiment, the extended dynamic range color image EC of the scene is generated by fusing the extended dynamic range luminance image EL and the first chrominance data C1. In an example embodiment, the fusion of the extended dynamic range luminance image EL and the first chrominance data C1 is performed by one or more image fusion methods, for example the principal component analysis based image fusion and the wavelet transform image fusion. In some embodiments, the extended dynamic range color image EC may be generated based on the extended dynamic range luminance image EL, the first chrominance data C1 , and an intermediate chrominance data (for example, the second chrominance data C2), wherein the intermediate chrominance data may be generated by performing processing of the color image C. Various embodiments describing the generation of the extended dynamic range color image EC are explained further with reference to FIGURES 8, 9A and 9B, 10 and 11.
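If the luminance and chrominance data are held in a YCbCr-style representation, this final fusion step reduces to recombining the planes and converting back to RGB; the full-range BT.601 coefficients below are an assumption, as the patent does not name a color space:

```python
import numpy as np

def fuse_luma_chroma(el, cb, cr):
    """Recombine the extended luminance EL with chrominance (Cb, Cr) planes
    and convert to RGB (full-range BT.601 coefficients assumed)."""
    r = el + 1.402 * (cr - 128.0)
    g = el - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
    b = el + 1.772 * (cb - 128.0)
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)
```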

[00115] FIGURE 7 is a flowchart depicting a first example method for generation of an extended dynamic range color image EC, in accordance with an example embodiment. Example references are made to FIGURES 2 to 5C for the description of the method 700. The method 700 depicted in the flowchart may be executed by, for example, the apparatus 200 of FIGURE 2.

[00116] At 702, the method 700 includes facilitating receipt of a panchromatic image (P) and a color image (C) associated with a scene. In an example embodiment, the panchromatic image P includes a first luminance data (L1), and the color image C includes a second luminance data (L2) and a first chrominance data (C1). In an example embodiment, the panchromatic image P is captured by a panchromatic camera present in or otherwise accessible to the apparatus 200. In an example embodiment, the color image C is captured by a Bayer camera present in or otherwise accessible to the apparatus 200. In an example embodiment, a baseline distance between the panchromatic camera and the Bayer camera may be less than a threshold baseline distance. In an example embodiment, if the baseline distance is much less than the threshold baseline distance, the panchromatic image P and the color image C can be aligned easily (pixel to pixel) using techniques known in the art.

[00117] At 704, the method 700 includes generating an extended dynamic range luminance image (EL) by fusing the first luminance data L1 from the panchromatic image P and the second luminance data L2 from the color image C. In an example embodiment, the fusion of the first luminance data L1 and the second luminance data L2 may be performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion.

[00118] At 706, the method 700 includes generating an intermediate color image (IC) from the color image C. In an example embodiment, the intermediate color image may be generated by performing a method illustrated at 708 - 712. At 708, the method 706 includes performing pre-processing of the first luminance data L1 and the second luminance data L2 for reducing noise components in the first luminance data L1 and the second luminance data L2, to thereby generate a pre-processed first luminance data (PL1) and a pre-processed second luminance data (PL2), respectively. In an example embodiment, the noise components in the first luminance data L1 and the second luminance data L2 may be reduced by applying a smoothing method to the first luminance data L1 and the second luminance data L2. At 710, for a pixel of a plurality of pixels of the color image C, a corresponding gain factor may be computed based on a comparison of corresponding first luminance components LC1 and corresponding second luminance components LC2 associated with the pixel. In an example embodiment, the corresponding first luminance components LC1 and the corresponding second luminance components LC2 may be derived from the pre-processed first luminance data PL1 and the pre-processed second luminance data PL2, respectively. It will be noted that, in a manner similar to the computation of the corresponding gain factor for the pixel as described above, gain factors for pixels (other than the pixel) of the plurality of pixels of the color image C may also be computed. At 712, for the pixel, the corresponding gain factor may be multiplied with a corresponding pixel value to generate a corresponding pixel value associated with a corresponding pixel of the intermediate color image IC. It will be noted that, in a manner similar to the multiplication of the corresponding gain factor for the pixel with the corresponding pixel value as described above, other corresponding pixels (other than the corresponding pixel) of the intermediate color image IC may also be generated. In an example embodiment, the intermediate color image IC may include a second chrominance data (C2).

[00119] At 714, the method 700 includes de-noising the second chrominance data C2 of the intermediate color image IC. The second chrominance data C2 of the intermediate color image IC may be de-noised based on a de-noising method. At 716, the method 700 includes generating an extended dynamic range color image (EC) of the scene by fusing the extended dynamic range luminance image EL with the second chrominance data C2 from the intermediate color image IC.

[00120] FIGURE 8 is a flowchart depicting an example method for generation of an extended dynamic range color image, in accordance with yet another example embodiment. Example references of FIGURES 2 to 5C may be made for the description of the method 800. The method 800 depicted in the flowchart may be executed by, for example, the apparatus 200 of FIGURE 2.

[00121] At 802, the method 800 includes facilitating receipt of a panchromatic image (P) and a color image (C) associated with a scene. In an example embodiment, the panchromatic image P may include a first luminance data (L1), and the color image C may include a second luminance data (L2) and a first chrominance data (C1). In an example embodiment, the panchromatic image P and the color image C may be captured by one or more image sensors, for example the one or more image sensors 208 (FIGURE 2). In an example embodiment, the one or more image sensors may be embodied in a panchromatic camera and a Bayer camera such that the panchromatic camera may capture the panchromatic image P of the scene while the Bayer camera may capture the color image C of the scene. In an example embodiment, the one or more image sensors may be present in or otherwise accessible to the apparatus 200.

[00122] At 804, the method 800 includes generating an intermediate color image (IC) from the color image C by performing a method illustrated at 806 - 810. At 806, the method 804 includes inverting the color image C to generate a color negative image (CN). The color negative image CN may represent an image that may be inverted from its original colors to its negative colors. The color negative image CN may appear hazy. In an example embodiment, if a pixel on the color image C has (R,G,B) on three color channels, the pixel at the same position on the color negative image CN may have (255-R, 255-G, 255-B) on the three color channels. At 808, the method 804 includes de-hazing the color negative image CN for recovering a de-hazed image (DH) associated with the color image C. Herein, the term 'de-hazing the color negative image' may refer to a method of removing haze from the color negative image CN to thereby increase visibility and sharpness of the image. At 810, the method 804 includes inverting the de-hazed image DH to generate the intermediate color image IC. In an example embodiment, the intermediate color image IC includes a third luminance data (L3) and a second chrominance data (C2). A sketch of operations 806 - 810 is given below.

[00123] At 812, the method 800 includes generating an extended dynamic range luminance image (EL) by fusing the first luminance data L1 from the panchromatic image P, the second luminance data L2 from the color image C, and the third luminance data L3 from the intermediate color image IC. At 814, the method 800 includes generating an extended dynamic range color image (EC) by fusing the extended dynamic range luminance image EL with the second chrominance data C2 from the intermediate color image IC. In an example embodiment, the fusion of the extended dynamic range luminance image EL with the second chrominance data C2 may be performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion.

[00124] FIGURES 9A and 9B are a flowchart depicting an example method for generating an extended dynamic range color image EC, in accordance with still another example embodiment. Example references of FIGURES 2 to 5C may be made for the description of the method 900. The method 900 depicted in the flowchart may be executed by, for example, the apparatus 200 of FIGURE 2.

[00125] At 902, the method 900 includes capturing an initial panchromatic image (IP) and an initial color image (IC) by a panchromatic camera and a Bayer camera, respectively. In an example embodiment, the initial panchromatic image IP and the initial color image IC may be captured by one or more image sensors, for example the one or more image sensors 208 (FIGURE 2). In an example embodiment, the one or more image sensors may be embodied in the panchromatic camera and the Bayer camera such that the panchromatic camera may capture the initial panchromatic image IP of the scene while the Bayer camera may capture the initial color image IC of the scene. In an example embodiment, the one or more image sensors may be present in or otherwise accessible to the apparatus 200.
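Referring back to operations 806 - 810 of the method 800, the sketch below illustrates the invert/de-haze/invert pipeline in Python with NumPy and SciPy. A dark-channel-prior de-hazer is used here as one possible de-hazing method (the embodiment does not prescribe a specific one), and all parameter values are illustrative assumptions.

    import numpy as np
    from scipy.ndimage import minimum_filter

    def dehaze(img, patch=15, omega=0.95, t0=0.1):
        # Dark channel: per-pixel minimum over color channels, followed by
        # a local minimum filter over a patch.
        dark = minimum_filter(img.min(axis=2), size=patch)
        # Atmospheric light: mean color of the brightest dark-channel pixels.
        brightest = np.argsort(dark, axis=None)[-max(dark.size // 1000, 1):]
        a = img.reshape(-1, 3)[brightest].mean(axis=0)
        # Transmission estimate, clamped to t0 to limit over-amplification.
        t = 1.0 - omega * minimum_filter((img / a).min(axis=2), size=patch)
        t = np.maximum(t, t0)[..., None]
        return (img - a) / t + a

    def invert_dehaze_invert(color_rgb):
        cn = 255.0 - color_rgb.astype(np.float64)   # 806: color negative CN
        dh = dehaze(cn)                             # 808: de-hazed image DH
        return np.clip(255.0 - dh, 0, 255)          # 810: intermediate image IC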

[00126] In an embodiment, the panchromatic camera and the Bayer camera may capture images that may include distortion artifacts. In an embodiment, the 'distortion or distortion artifacts' in an image may refer to warping and bending of objects and/or straight lines in the image so as to make the objects and/or straight lines appear as curved. For example, the captured images may show objects stretched, for example towards the edges of the frame. In some examples, the distortion artifacts may distort images by curving straight lines in the images. In some other examples, due to the distortion artifacts, the objects in the image may appear disproportionately large or distorted when compared to objects in the background. In an embodiment, the panchromatic camera and the Bayer camera may be calibrated for compensating the distortion associated with the initial panchromatic image IP and the initial color image IC captured by the panchromatic camera and the Bayer camera, respectively. It will be noted that the terms 'initial panchromatic image IP' and 'initial color image IC' may refer to a panchromatic image (P) and a color image (C) captured by the panchromatic camera and the Bayer camera, respectively. Also, with distortion compensation, as described in method 900, the 'initial panchromatic image IP' and the 'initial color image IC' may be processed to generate the panchromatic image P and the color image C, respectively, that may be free of the distortion artifacts. For example, at 904, the method 900 includes performing distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the initial panchromatic image IP and the initial color image IC, respectively, to generate the panchromatic image P and the color image C, respectively.

[00127] In an example embodiment, the distortion calibration of the panchromatic camera and the Bayer camera is performed at 906 and 908. At 906, the method 904 includes determining a first distortion parameter (D1) and a second distortion parameter (D2) associated with the panchromatic camera and the Bayer camera, respectively. In an example embodiment, the first distortion parameter (D1) and the second distortion parameter (D2) may be determined from the initial panchromatic image IP and the initial color image IC, respectively. In an example embodiment, the initial panchromatic image IP and the initial color image IC may represent a convolution of the panchromatic image P with the first distortion parameter D1 and a convolution of the color image C with the second distortion parameter D2, respectively. Thus, a de-convolution of the initial panchromatic image IP and the initial color image IC may result in generation of the panchromatic image P and the color image C.

[00128] At 908, the method 904 includes extracting the panchromatic image P and the color image C from the initial panchromatic image IP and the initial color image IC, respectively based on the first distortion parameter D1 and the second distortion parameter D2, respectively. In an example embodiment, a Fourier transform may be applied on a pair of convolution equations (for example, equations (1) and (2) described with reference to FIGURE 2) that represent the convolution of the panchromatic image P with the first distortion parameter D1 and the convolution of the color image C with the second distortion parameter D2, respectively to extract the panchromatic image P from the initial panchromatic image IP and the color image C from the initial color image IC.
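A minimal sketch of this frequency-domain extraction follows. The regularization constant (a Wiener-style stabilizer for near-zero spectral values) is an illustrative assumption, since equations (1) and (2) of FIGURE 2 are not reproduced in this excerpt.

    import numpy as np

    def extract_by_deconvolution(observed, distortion_kernel, eps=1e-3):
        # The captured image models the clean image convolved with the
        # distortion parameter, so a regularized division of their Fourier
        # spectra recovers the clean image.
        h = np.fft.fft2(distortion_kernel, s=observed.shape)
        g = np.fft.fft2(observed)
        f = g * np.conj(h) / (np.abs(h) ** 2 + eps)
        return np.real(np.fft.ifft2(f))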

[00129] The extracted panchromatic image P and the color image C may be aligned, at 910. In an example embodiment, the panchromatic image P includes a first luminance data (L1) and the color image C includes a second luminance data (L2) and a first chrominance data (C1).

[00130] At 912, a pre-processing of the first luminance data L1 and the second luminance data L2 may be performed. In an example embodiment, the pre-processing of the first luminance data L1 and the second luminance data L2 may be performed by a method illustrated at 914 - 916. At 914, an interpolation of the first luminance data L1 may be performed to generate an interpolated first luminance data (IL1). In an example embodiment, the interpolation of the first luminance data L1 is performed by an 8-neighborhood interpolation method. At 916, at least a two-level interpolation of the second luminance data L2 may be performed to generate an interpolated second luminance data (IL2). In an example embodiment, the two-level interpolation of the second luminance data L2 is performed by the 8-neighborhood interpolation method. An example of the 8-neighborhood interpolation method is described in detail with reference to FIGURE 3A.
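Since FIGURE 3A is not reproduced in this excerpt, the following is only a plausible sketch of an 8-neighborhood interpolation: each missing sample is filled with the mean of its valid 8-connected neighbors. The mask convention and looping strategy are illustrative assumptions.

    import numpy as np

    def interp_8_neighborhood(plane, valid_mask):
        # Fill each missing sample (valid_mask == 0) with the mean of its
        # valid 8-connected neighbors.
        out = plane.astype(np.float64).copy()
        h, w = plane.shape
        for y, x in zip(*np.where(valid_mask == 0)):
            ys = slice(max(y - 1, 0), min(y + 2, h))
            xs = slice(max(x - 1, 0), min(x + 2, w))
            nb, nm = plane[ys, xs], valid_mask[ys, xs]
            if nm.any():
                out[y, x] = nb[nm > 0].mean()
        return out

For the two-level interpolation at 916, such a pass may simply be applied twice.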

[00131] At 918, the method 900 includes generating an extended dynamic range luminance image (EL) by fusing the interpolated first luminance data IL1 and the interpolated second luminance data IL2. In an example embodiment, the fusion of the interpolated first luminance data IL1 and the interpolated second luminance data IL2 may be performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion.

[00132] At 920, pre-processing of the first chrominance data C1 may be performed. In an example embodiment, the first chrominance data C1 is pre-processed by performing at least a two-level interpolation of the first chrominance data C1 to generate an interpolated first chrominance data (IC1). In an example embodiment, the two-level interpolation of the first chrominance data C1 is performed by a bilinear interpolation method. An example of the bilinear interpolation method is described in detail with reference to FIGURE 3B.
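Interpreting 'two-level' as two successive 2x bilinear upsamplings (an assumption, as FIGURE 3B is not reproduced here), a sketch using SciPy could be:

    import numpy as np
    from scipy.ndimage import zoom

    def two_level_bilinear(c1):
        # Each level doubles the resolution with bilinear (order=1)
        # interpolation; two levels yield the interpolated data IC1.
        level1 = zoom(c1.astype(np.float64), 2, order=1)
        return zoom(level1, 2, order=1)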

[00133] At 922, an extended dynamic range color image EC of the scene may be generated by fusing the extended dynamic range luminance image EL with the interpolated first chrominance data IC1.

[00134] FIGURE 10 is a flowchart depicting an example method 1000 for generation of an extended dynamic range color image, in accordance with an example embodiment. Example references are made to FIGURES 2 to 5C for the description of the method 1000. The method 1000 depicted in the flowchart may be executed by, for example, the apparatus 200 of FIGURE 2.

[00135] At 1002, the method 1000 includes facilitating receipt of a panchromatic image (P) and a color image (C) associated with a scene. In an example embodiment, the panchromatic image P includes a first luminance data (L1). In an example embodiment, the panchromatic image P and the color image C may be captured by one or more image sensors, for example the one or more image sensors 208 (FIGURE 2). In an example embodiment, the one or more image sensors may be embodied in a panchromatic camera and a Bayer camera such that the panchromatic camera may capture the panchromatic image P of the scene while the Bayer camera may capture the color image C of the scene. In an example embodiment, the one or more image sensors may be present in or otherwise accessible to the apparatus 200.

[00136] In an embodiment, the panchromatic camera and the Bayer camera may capture images that may include distortion artifacts. For example, the panchromatic image P and the color image C captured by the panchromatic camera and the Bayer camera may include warping and bending of objects and/or straight lines in the respective captured images. In some other examples, due to the distortion artifacts, the objects in an image may appear disproportionately large or distorted when compared to objects in the background. In an embodiment, the panchromatic camera and the Bayer camera may be calibrated for compensating distortion associated with the panchromatic image P and the color image C captured by the panchromatic camera and the Bayer camera, respectively.

[00137] At 1004, the method 1000 includes decomposing the color image C into a decomposed panchromatic image (DP) and a decomposed color image (DC). The decomposed panchromatic image DP and the decomposed color image DC may include a second luminance data (L2) and a first chrominance data (C1), respectively. A sketch of such a decomposition is given below. At 1006, the method 1000 includes performing distortion calibration of the panchromatic camera and the Bayer camera for compensating distortion associated with the panchromatic image and the decomposed panchromatic image DP. In an example embodiment, the distortion calibration of the panchromatic camera and the Bayer camera may be performed by a method illustrated at 1008 and 1010.
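As a hedged sketch of the decomposition at 1004 (the embodiment does not name a specific color space; BT.601 YCbCr is assumed here purely for illustration):

    import numpy as np

    def decompose_color_image(color_rgb):
        r, g, b = (color_rgb[..., i].astype(np.float64) for i in range(3))
        # BT.601 luma as the decomposed panchromatic image DP (second
        # luminance data L2).
        dp = 0.299 * r + 0.587 * g + 0.114 * b
        # Chrominance planes as the decomposed color image DC (first
        # chrominance data C1).
        cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
        cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
        return dp, np.stack([cb, cr], axis=-1)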

[00138] At 1008, a set of feature pixels (FP1) from the panchromatic image P and a corresponding set of feature pixels (FP2) from the decomposed panchromatic image DP may be determined. In an example embodiment, the set of feature pixels FP1 and the corresponding set of feature pixels FP2 may be determined based on a bi-directional pixel matching between the panchromatic image P and the decomposed panchromatic image DP. In an example embodiment, upon performing the bi-directional pixel matching, it may be determined that the set of feature pixels FP1 of the panchromatic image P are shifted relative to the corresponding set of feature pixels FP2 of the decomposed panchromatic image DP by a corresponding first plurality of shift values (S1). In an example embodiment, for each pixel in the panchromatic image P and the decomposed panchromatic image DP, a Hessian matrix and corresponding eigenvalues may be computed to determine the set of feature pixels FP1 and the corresponding set of feature pixels FP2 in the panchromatic image P and the decomposed panchromatic image DP, respectively. On determination of the set of feature pixels FP1 and the corresponding set of feature pixels FP2, the bi-directional pixel matching is performed. A triangulation method, for example a Delaunay triangulation method, may be applied to the set of feature pixels FP1 to determine a local mesh network based on an assumption that the corresponding first plurality of shift values S1 of pixels in an image plane meet local continuity and local linearity (due to continuity of misalignments between pixel pairs from the panchromatic image P and the decomposed panchromatic image DP). An example of the Delaunay triangulation method is explained with reference to FIGURE 4B.
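A minimal sketch of the Hessian-based feature detection and the mesh construction follows. The Gaussian scale, the use of the smaller-magnitude eigenvalue as the feature response, and the threshold are illustrative assumptions.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from scipy.spatial import Delaunay

    def hessian_feature_pixels(img, thresh=5.0, sigma=1.5):
        g = gaussian_filter(img.astype(np.float64), sigma)
        gy, gx = np.gradient(g)
        gyy, gyx = np.gradient(gy)
        gxy, gxx = np.gradient(gx)
        # Closed-form eigenvalues of the per-pixel 2x2 Hessian
        # [[gxx, gxy], [gxy, gyy]].
        tr, det = gxx + gyy, gxx * gyy - gxy * gxy
        lam = tr / 2.0 - np.sqrt(np.maximum(tr * tr / 4.0 - det, 0.0))
        return np.argwhere(np.abs(lam) > thresh)   # (row, col) feature pixels

    # Local mesh network over the feature pixels FP1:
    # fp1 = hessian_feature_pixels(panchromatic)
    # mesh = Delaunay(fp1)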

[00139] At 1010, for a plurality of non-feature pixels (NFP1) of the panchromatic image P, a corresponding second plurality of shift values (S2) may be determined. In an example embodiment, the corresponding second plurality of shift values (S2) may be determined based on a triangulation method, for example the Delaunay triangulation method, applied on the set of feature pixels FP1 associated with the panchromatic image P and the corresponding first plurality of shift values S1. In an example embodiment, the triangulation method may be applied to infer the corresponding second plurality of shift values S2 based on an assumption of local linearity, as sketched below.

[00140] At 1012, an extended dynamic range luminance image (EL) may be generated by fusing the first luminance data L1 with the second luminance data L2 from the decomposed panchromatic image DP based at least on the corresponding first plurality of shift values S1 and the corresponding second plurality of shift values S2. In an example embodiment, the fusion of the first luminance data L1 with the second luminance data L2 may be performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion.
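Referring back to operation 1010: piecewise-linear interpolation over the Delaunay mesh (barycentric weights inside each triangle) realizes the local-linearity assumption. SciPy's LinearNDInterpolator performs exactly this and is used below as an illustrative stand-in.

    import numpy as np
    from scipy.interpolate import LinearNDInterpolator

    def infer_shifts(fp1, s1, nfp1):
        # fp1: (N, 2) feature-pixel coordinates; s1: (N, 2) shift values S1;
        # nfp1: (M, 2) non-feature-pixel coordinates. The interpolator
        # triangulates fp1 (Delaunay) and interpolates linearly within each
        # triangle, yielding the second plurality of shift values S2.
        # Pixels outside the convex hull of fp1 yield NaN and need a fallback.
        return LinearNDInterpolator(fp1, s1)(nfp1)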

[00141] At 1014, an extended dynamic range color image (EC) of the scene may be generated by fusing the extended dynamic range luminance image EL with the first chrominance data C1 from the decomposed color image DC. In an example embodiment, the fusion of the extended dynamic range luminance image EL with the first chrominance data C1 may be performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion.

[00142] FIGURE 11 is a flowchart depicting an example method 1100 for generation of an extended dynamic range color image, in accordance with still another example embodiment. Example references are made to FIGURES 2 to 5C for the description of the method 1100. The method 1100 depicted in the flowchart may be executed by, for example, the apparatus 200 of FIGURE 2.

[00143] At 1102, the method 1100 includes facilitating receipt of a panchromatic image (P) and a color image (C) associated with a scene. In an example embodiment, the panchromatic image P includes a first luminance data (L1), and the color image C includes a second luminance data (L2) and a first chrominance data (C1). In an example embodiment, the panchromatic image P and the color image C may be captured by one or more image sensors, for example the one or more image sensors 208 (FIGURE 2). In an example embodiment, the one or more image sensors may be embodied in a panchromatic camera and a Bayer camera such that the panchromatic camera may capture the panchromatic image P of the scene while the Bayer camera may capture the color image C of the scene. In an example embodiment, the one or more image sensors may be present in or otherwise accessible to the apparatus 200.

[00144] In an example embodiment, an intermediate color image (IC) may be generated by processing the color image C for removal of low-light portions associated with the color image C, at 1104. Herein, the 'low-light portions of the color image' may refer to those portions of the color image C that are associated with lower illumination as compared to the rest of the portions. In an embodiment, the processing of the color image C for removal of low-light portions may be performed to increase illumination or brighten the low-light portions.

[00145] In an example embodiment, the portions corresponding to the low-light portions of the processed color image (or the intermediate color image IC) may include noise, and may further be processed to de-noise the intermediate color image IC. In an example embodiment, the de-noising of the intermediate color image IC may be performed based on the panchromatic image P, as the panchromatic image P is associated with much less noise as compared to the noise in the color image C and the intermediate color image IC. At 1106, de-noising of the intermediate color image IC may be performed. In an example embodiment, the de-noising of the intermediate color image may be performed by a method illustrated at 1108 - 1112. At 1108, the intermediate color image IC may be decomposed into an initial third luminance image (IL3) and an initial second chrominance image (IC2). Herein, 'decomposing' the intermediate color image IC may refer to separating the intermediate color image IC into its corresponding luminance components and chrominance components to thereby generate the initial third luminance image IL3 and the initial second chrominance image IC2, respectively. At 1110, for at least one portion of the panchromatic image P, a weight information may be determined based on a difference of gray pixel values of neighboring pixels in the at least one portion. In an embodiment, the weight information (or weight mask) may be determined (or extracted) from the panchromatic image P to measure information richness surrounding a particular portion. For the portions or pixels in the panchromatic image P, the corresponding weight information may be determined. In an example embodiment, the difference between a gray pixel value of the pixel and a gray pixel value of a neighboring pixel may assist in providing the weight information of the corresponding portion or pixel.

[00146] At 1112, a selective filtering of at least one portion of the intermediate color image IC corresponding to the at least one portion of the panchromatic image P may be performed based on the weight information of the at least one portion of the panchromatic image P. In an embodiment, the selective filtering may facilitate in generating a third luminance data (L3) and a second chrominance data (C2) associated with the intermediate color image IC. In an example embodiment, the selective filtering may be performed on portions of the initial third luminance image IL3 and the initial second chrominance image IC2 based on the weight information associated with the pixel in the panchromatic image P. Hence, noise is filtered out from the initial third luminance image IL3 and the initial second chrominance image IC2 to generate the third luminance data L3 and the second chrominance data C2. A sketch of operations 1110 - 1112 is given below.
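A minimal sketch of operations 1110 - 1112 follows; the 3x3 local-difference weight and the Gaussian smoother are illustrative assumptions standing in for whichever weight mask and filter the embodiment employs.

    import numpy as np
    from scipy.ndimage import gaussian_filter, uniform_filter

    def selective_filter(channel, pan, sigma=1.5):
        # 1110: weight mask from gray-level differences of neighboring
        # pixels in the panchromatic image (1 = detail-rich, 0 = flat).
        p = pan.astype(np.float64)
        w = np.abs(p - uniform_filter(p, size=3))
        w = w / (w.max() + 1e-6)
        # 1112: filter strongly where the panchromatic image is flat
        # (differences there are likely noise), lightly where detail-rich.
        smoothed = gaussian_filter(channel.astype(np.float64), sigma)
        return w * channel.astype(np.float64) + (1.0 - w) * smoothed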

[00147] At 1114, an extended dynamic range luminance image (EL) may be generated by fusing the first luminance data L1 from the panchromatic image P, the second luminance data L2 from the color image C, and the third luminance data L3 from the intermediate color image IC. In an example embodiment, the fusion of the first luminance data L1 from the panchromatic image P, the second luminance data L2, and the third luminance data L3 from the intermediate color image IC may be performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion.

[00148] At 1116, an extended dynamic range color image (EC) may be generated by fusing the extended dynamic range luminance image EL with the second chrominance data C2 from the intermediate color image IC. In an example embodiment, the fusion of the extended dynamic range luminance image EL with the second chrominance data C2 may be performed by one or more image fusion methods, for example principal component analysis based image fusion and wavelet transform image fusion.

[00149] It should be noted that to facilitate discussions of the flowcharts of FIGURES 6, 7, 8, 9A-9B, 10, and 11, certain operations are described herein as constituting distinct steps performed in a certain order. Such implementations are examples only and non-limiting in scope. Certain operations may be grouped together and performed in a single operation, and certain operations may be performed in an order that differs from the order employed in the examples set forth herein. Moreover, certain operations of the methods 600, 700, 800, 900, 1000 and 1100 are performed in an automated fashion. These operations involve substantially no interaction with the user. Other operations of the methods 600, 700, 800, 900, 1000 and 1100 may be performed in a manual fashion or a semi-automatic fashion. These operations involve interaction with the user via one or more user interface presentations.

[00150] The methods depicted in these flowcharts may be executed by, for example, the apparatus 200 of FIGURE 2. Operations of the flowcharts, and combinations of operations in the flowcharts, may be implemented by various means, such as hardware, firmware, processor, circuitry and/or other device associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described in various embodiments may be embodied by computer program instructions. In an example embodiment, the computer program instructions, which embody the procedures described in various embodiments, may be stored by at least one memory device of an apparatus and executed by at least one processor in the apparatus. Any such computer program instructions may be loaded onto a computer or other programmable apparatus (for example, hardware) to produce a machine, such that the resulting computer or other programmable apparatus embodies means for implementing the operations specified in the flowchart. These computer program instructions may also be stored in a computer-readable storage memory (as opposed to a transmission medium such as a carrier wave or electromagnetic signal) that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture, the execution of which implements the operations specified in the flowchart. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a non-transitory computer-implemented process such that the instructions, which execute on the computer or other programmable apparatus, provide operations for implementing the operations in the flowchart. The operations of the methods are described with the help of the apparatus 200. However, the operations of the methods can be described and/or practiced by using any other apparatus.

[00151] FIGURES 12A and 12B illustrate a panchromatic image and a color image, respectively, associated with a scene, in accordance with an example embodiment, and FIGURE 12C illustrates an extended dynamic range color image of the scene being generated from the panchromatic image and the color image, in accordance with an example embodiment.

[00152] Referring to FIGURE 12A, a panchromatic image 1202 of the scene is illustrated. The panchromatic image 1202 is an image that provides only luminance data (for example, the first luminance data L1). In an example embodiment, the panchromatic image 1202 is captured by a first image sensor of one or more image sensors (for example, the one or more image sensors 208 of the apparatus 200) in a panchromatic camera. In another example embodiment, the panchromatic image 1202 may be rendered by transparent filters in a filter array of a color filter array (CFA) based sensor associated with a camera. An exposure of the panchromatic camera or the camera with the transparent filters in the CFA based sensor is metered to generate the panchromatic image 1202 as an overexposed image (an image having increased brightness). The panchromatic image 1202 hence captures details in shadow portions of a scene. For example, herein, the panchromatic image 1202 illustrates a scene of a room including an illuminated lamp 1204, a chair 1206, and a picture-board 1208 against a patterned background 1210, and with a patterned ceiling 1212. Some of the portions of the illuminated lamp 1204, some areas of the chair 1206 and the picture-board 1208 in the panchromatic image 1202 are seen as highly saturated portions (overly bright or highlight areas of the panchromatic image 1202), with most of the corresponding details being totally washed-out or not visible due to light from the illuminated lamp 1204 being overly bright. The patterned background 1210, the patterned ceiling 1212 and the remaining areas of the chair 1206, in the panchromatic image 1202, are seen as highlight portions (bright areas of the panchromatic image 1202), with corresponding details being visible and partially washed-out due to overexposure to light in the scene. The panchromatic image 1202 is an example of an image that is captured against a bright background or a bright light source (for example, the illuminated lamp 1204) in the scene, resulting in the overexposed image which looks washed-out or white.

[00153] Referring now to FIGURE 12B, a color image 1222 of the scene is illustrated, in accordance with an example embodiment. In an example embodiment, the color image 1222 is an image that provides both the luminance data (for example, the second luminance data L2) and a chrominance data (for example, the first chrominance data C1) which is a measure of color information associated with the scene. In an example embodiment, the color image 1222 is captured by an image sensor in a Bayer camera of one or more image sensors (for example, the one or more image sensors 208 of the apparatus 200). In another example embodiment, the color image 1222 may be rendered by color filters in the filter array of the CFA based sensor associated with the camera. Herein, an exposure of the Bayer camera or the camera with the color filters in the CFA based sensor is metered to generate the color image 1222 as an underexposed image (an image having decreased brightness). The color image 1222 hence captures details in highlight portions of the scene. The color image 1222 illustrates the scene of the room including an illuminated lamp 1224, a chair 1226, and a picture-board 1228 against a patterned background 1230, and with a patterned ceiling 1232. The illuminated lamp 1224, some areas of the chair 1226, and the picture-board 1228 in the color image 1222 are highlight portions (bright areas of the color image 1222) with most of the corresponding details being visible. The patterned background 1230, the patterned ceiling 1232 and the remaining areas of the chair 1226, in the color image 1222, are seen as dark portions, with corresponding details being invisible due to underexposure to the light in the scene.

[00154] Referring now to FIGURE 12C, an extended dynamic range color image 1242 (for example, the extended dynamic range color image EC as described with respect to the above figures) is illustrated, in accordance with an example embodiment. The extended dynamic range color image 1242 is an enhanced image generated based on image fusion of the panchromatic image 1202 and the color image 1222. As illustrated, the extended dynamic range color image 1242 is a fused image that may be generated based on the details in the shadow portions of the scene (from the panchromatic image 1202) and the details in the highlight portions of the scene (from the color image 1222). The extended dynamic range color image 1242 illustrates the scene of the room including an illuminated lamp 1244, a chair 1246, and a picture-board 1248 against a patterned background 1250, and with a patterned ceiling 1252. By performing pre-processing operations on the panchromatic image 1202 and the color image 1222, as described with respect to various embodiments disclosed in FIGURES 2 - 11, the extended dynamic range color image 1242 is generated. In the extended dynamic range color image 1242, the illuminated lamp 1244, the chair 1246, the picture-board 1248, the patterned background 1250, and the patterned ceiling 1252 are seen as optimally illuminated portions, and not as dark portions or saturated portions, with corresponding details being clearly visible and having better contrast in the shadow portions and the highlight portions. Some visible details in the extended dynamic range color image 1242 include written information and pictures on the picture-board 1248, shape details of the illuminated lamp 1244 and the chair 1246, color and pattern information of the patterned background 1250, and a block pattern in the patterned ceiling 1252. The extended dynamic range color image 1242 is an example of an image having a high quality of detail and color where the shadow portions and the highlight portions are reproduced optimally. The extended dynamic range color image 1242 also has a higher dynamic range as compared to either of the panchromatic image 1202 or the color image 1222.

[00155] Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is to generate an extended dynamic range color image of a high image quality from a panchromatic image and a color image associated with a scene. In various example embodiments, the panchromatic image and the color image may be captured by a panchromatic camera and a Bayer camera, respectively. In various embodiments, the panchromatic camera and the Bayer camera may be embodied in a dual camera. In an example embodiment, the panchromatic camera may be metered for shadows and the Bayer camera may be metered for highlights. In an example embodiment, the panchromatic image and the color image may be captured by a camera embodying a CFA based sensor (for example, RGBW), such that the image captured by the camera may include luminance data and chrominance data. In this embodiment, the camera may not capture the panchromatic image and the color image separately, but may facilitate in provisioning of the first luminance data, the second luminance data and the first chrominance data. In various embodiments, the W pixels of the RGBW sensor may be metered for shadows and the RGB pixels may be metered for highlights. In various embodiments, the first luminance data and the second luminance data may be fused to generate an extended dynamic range luminance image, and then the extended dynamic range luminance image may be fused with the first chrominance data to generate an extended dynamic range color image.

[00156] Various embodiments facilitate generation of an extended dynamic range color image having a dynamic range greater than the dynamic ranges of either the panchromatic image or the color image. Various embodiments provide methods that enable the panchromatic image and the color image to be captured by a single capture, thereby precluding the possibility of motion artifacts. Also, the methods provided by various embodiments are equally applicable to video content, as the methods described herein facilitate elimination of the motion artifacts. Moreover, the embodiments described herein are associated with lower computational complexity.

[00157] Also, various embodiments disclose methods for enabling distortion calibration of images (panchromatic and color images), and alignment of the images for generating the extended dynamic range color image of high image quality that is also free from residual errors. Further, various embodiments disclose methods that enable filtering of noise from the images to thereby render the extended dynamic range color image of the high image quality. Various embodiments provide methods of generating the extended dynamic range color images by employing the panchromatic camera and the Bayer camera with a least baseline distance or by employing a camera with a color filter array (CFA) based sensor, and such a setup of either the panchromatic camera and the Bayer camera or the camera can be integrated in small hand-held devices, for example mobile phones.

[00158] Various embodiments described above may be implemented in software, hardware, application logic or a combination of software, hardware and application logic. The software, application logic and/or hardware may reside on at least one memory, at least one processor, an apparatus or a computer program product. In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of an apparatus described and depicted in FIGURES 1 and/or 2. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.

[00159] If desired, the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined.

[00160] Although various aspects of the embodiments are set out in the independent claims, other aspects comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.

[00161] It is also noted herein that while the above describes example embodiments of the invention, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present disclosure as defined in the appended claims.