Title:
PLENOPTIC SUB APERTURE VIEW SHUFFLING FOR A RICHER COLOR SAMPLING
Document Type and Number:
WIPO Patent Application WO/2018/002097
Kind Code:
A1
Abstract:
A system and method for generating multiple images with rich color acquisition using a plenoptic camera having a main lens disposed in front of a micro array of lenses, a mosaic color filter array and an image sensor, characterized in that it comprises: capturing a first set of images using an ordinary state of an electrically controllable birefringent medium being disposed between said main lens and said micro array of lenses, said ordinary state providing an ordinary ray to each pixel; capturing a second set of images using an extraordinary state of said electrically controllable birefringent medium, said extraordinary state splitting the light from said main lens into an ordinary ray and an extraordinary ray respectively impinging on two adjacent pixels of different colors, said extraordinary ray being shifted by a distance of one pixel on said image sensor; performing a weighted subtraction of information about said second set of images from information about said first set of images; and generating a final set of images with rich color information from said weighted subtraction and said first set of images.

Inventors:
DRAZIC VALTER (FR)
SCHUBERT ARNO (FR)
LE SCOUARNEC NICOLAS (FR)
Application Number:
PCT/EP2017/065922
Publication Date:
January 04, 2018
Filing Date:
June 27, 2017
Assignee:
THOMSON LICENSING (FR)
International Classes:
H04N9/04; H04N5/225; H04N5/232; H04N9/07; H04N13/232
Foreign References:
KR20140061234A2014-05-21
US20130270421A12013-10-17
US20110001866A12011-01-06
JPH09130818A1997-05-16
Other References:
None
Attorney, Agent or Firm:
HUCHET, Anne et al. (FR)
Claims:
CLAIMS

1. A method of generating multiple images comprising:

generating (410) a first set of images using a first state of an electro optical polarization modulator, said modulator being disposed between a main lens and a micro array of lenses with a plurality of apertures;

generating (420) a second set of images using a second state of an electro optical polarization modulator;

calculating (430) information about said second set of images from information about said first set; and

generating (440) a final set of images after said calculation.

2. A system for generating multiple images, comprising:

a main lens (310) disposed in front of a micro array of lenses (352) associated with a plurality of apertures;

an electrically controlled electro optical modulator (325) disposed between said micro array of lenses and said main lens; said electro optical modulator functioning between a first state (340) and a second state (330) upon application of an electrical voltage;

means for generating a first set of images using said first state of an electro optical polarization modulator;

means for generating a second set of images using a second state of an electro optical polarization modulator; and

a processor (430) configured to calculate information about said second set of images from information about said first set of captured images to generate (440) a final set of images with enhanced color intensity and characteristics.

3. The method of claim 1 or the system of claim 2, wherein border pixels generated from apertures that create micro image cross talk are discarded prior to the calculation step.

4. The method of claim 1 or 3, or the system of claim 2 or 3, wherein said electro optical polarization modulator is originally in a first state and changes to a second state after an electrical voltage is applied to said electro optical polarization modulator.

5. The method of claim 4, or the system of claim 4, wherein said electro optical polarization modulator returns to its original first state in the absence of electrical voltage.

6. The method of claim 1 or 4, or the system of claim 2 or 4, wherein said electro optical polarization modulator is made from a birefringent crystalline material.

7. The method of claim 6 or the system of claim 6, wherein said birefringent crystalline material is a Liquid Crystal (LC).

8. The method of claim 1 or 7, or the system of claim 2 or 6, wherein said electro optical polarization modulator produces two simultaneous rays from the light received from said main lens, such that a first ray is mapped into a first lower aperture and a second ray is mapped into a second aperture located above said first lower aperture.

9. The method of claim 1 or the system of claim 2, wherein said plenoptic camera has a sensor, and said micro lens array is disposed between said sensor and said main lens.

10. The method of claim 9, or the system of claim 9, wherein the distance between said micro lens array and said sensor is greater than 10 microns but less than 100 microns.

11. The method of claim 8, or the system of claim 8, wherein said first and said second rays have different angles of refraction.

12. The method of claim 11, or the system of claim 11, wherein said first and second rays have different indices of refraction.

13. The method of claim 12, or the system of claim 12, wherein said first ordinary ray and said second extraordinary ray are polarized in directions perpendicular to one another.

14. The method of claim 13, or the system of claim 13, wherein said ordinary and extraordinary rays are created with different propagation distances.

Description:
PLENOPTIC SUB APERTURE VIEW SHUFFLING FOR A RICHER COLOR SAMPLING

TECHNICAL FIELD

The present disclosure relates generally to digital image processing and more particularly to plenoptic imaging techniques using demosaicing.

BACKGROUND

This section is intended to introduce the reader to various aspects of art, which may be related to various aspects of the present invention that are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.

Capturing the scene's light field has long been of interest in the field of computational photography, and the recent release of hand-held plenoptic cameras has introduced the potential of light field imaging to the mass market. A plenoptic camera uses a micro lens array that is positioned in the image plane of a main lens and before an array of photo sensors onto which one micro-image (also called sub-image) is projected. By placing a micro lens array between the main lens and the sensor, a plenoptic camera captures the direction of the light bundles that enter the camera, in addition to their position, intensity and color. Captured data is then demultiplexed to provide a matrix of horizontally and vertically aligned views from slightly different points of view over the scene. Consequently, each micro-image depicts an area of the captured scene, and each pixel associated with that micro-image shows this certain area from the point of view of a certain sub-aperture location on the main lens exit pupil. The raw image of the scene is then obtained as the sum of all the micro-images acquired from respective portions of the photo-sensor array.

With light fields, a number of natural applications have arisen, such as depth estimation or post-capture refocusing. Raw data conversion is complex and involves several issues that need to be resolved. One such issue is that of accurate color conversion. Demosaicing processes aim to recover the color content of the scene from mosaicked captured raw data. Unfortunately, many practical problems interfere with full color image reconstruction. Another problem occurs when performing geometric operations; these operations often cause errors that are due to poor calculation of interpolated color values. These and other issues impact full recovery and color conversions such as those addressed by demosaicing processes. Consequently, there is a need for improved techniques that can provide better color recovery using captured raw data.
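By way of illustration only, the following minimal sketch shows how raw plenoptic data of the kind described above can be demultiplexed into sub-aperture views, under the simplifying assumption of a square micro-lens grid of pitch p pixels perfectly aligned with the sensor (real devices, including the hexagonal grid discussed later, require calibration; the function name is illustrative):

```python
import numpy as np

def demultiplex(raw, p):
    """Rearrange a raw plenoptic image into sub-aperture views.

    Assumes an idealized square micro-lens grid of pitch `p` pixels,
    perfectly aligned with the sensor; `raw` has shape (H, W) with H and W
    multiples of p. Returns an array of shape (p, p, H//p, W//p): one view
    per sub-aperture position (u, v) on the main lens exit pupil.
    """
    H, W = raw.shape
    micro = raw.reshape(H // p, p, W // p, p)   # split into p x p micro-images
    return micro.transpose(1, 3, 0, 2)          # (u, v, micro-row, micro-col)
```

Each (u, v) slice of the result is one view of the scene as seen from one sub-aperture of the main lens exit pupil.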

SUMMARY

A system and method are provided for generating multiple images of different color intensity and characteristics using a plenoptic camera having a main lens disposed in front of a micro array of lenses having a plurality of apertures. In one embodiment the method comprises capturing a first set of images using a first state of an electro optical polarization modulator and capturing a second set of images using a second state of the electro optical polarization modulator. The modulator is disposed between the main lens and the array of lenses with a plurality of apertures. Subsequently, information about the second set of images is subtracted from information about the first set. A final set of images is generated after the subtraction such that the final set of images has enhanced color intensity and characteristics.

In another embodiment, a system for generating multiple images of different color intensity and characteristics is provided, comprising a main lens disposed in front of a micro array of lenses associated with a plurality of apertures and an electrically controlled electro optical modulator disposed between the main lens and the micro array of lenses. The electro optical modulator functions between a first state and a second state upon application of an electrical voltage. Means for capturing a first set of images using the first state of the electro optical polarization modulator and means for capturing a second set of images using the second state of the electro optical polarization modulator are provided. A processor configured to subtract information about the second set of images from information about the first set of captured images to create a final set of images with enhanced color intensity and characteristics is also provided.

Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the invention are described in detail herein and are considered a part of the claimed invention. For a better understanding of the invention with advantages and features, refer to the description and to the drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

The present disclosure will be better understood and illustrated by means of the following embodiments and execution examples, which are in no way limiting, with reference to the appended figures:

Figure 1 is a block diagram showing a color filter array (CFA) used in a demosaicing process;

Figure 2A is a depiction of an image and a micro lens array captured with a color filter array (CFA) pattern;

Figure 2B is an illustration of a demultiplexed image captured using a color filter and a micro lens array;

Figure 3A is a depiction of a diagram for a sub-aperture to pixel mapping of both ordinary and extraordinary states according to one embodiment;

Figure 3B provides a zoomed and more detailed depiction of the embodiment shown in Figure 3A; and

Figure 4 is a flow chart depiction of the process explained above according to one embodiment.

In Figures 1-4, the represented blocks are purely functional entities that do not necessarily correspond to physically separate entities. Namely, they could be developed in the form of software or hardware, or be implemented in one or several integrated circuits comprising one or more processors.

Wherever possible, the same reference numerals will be used throughout the figures to refer to the same or like parts.

DESCRIPTION

It is to be understood that the figures and descriptions of the present invention have been simplified to illustrate elements that are relevant for a clear understanding of the present invention, while eliminating, for purposes of clarity, many other elements found in typical digital multimedia content delivery methods and systems. However, because such elements are well known in the art, a detailed discussion of such elements is not provided herein. The disclosure herein is directed to all such variations and modifications.

In regular cameras, a lens is used to focus the light reflected or emitted from objects into a real image on the light-sensitive surface inside the camera during a timed exposure. With an electronic image sensor, an electrical charge is produced at each pixel, which is then processed and stored in a digital image file for further use. In classic photography, the focal surface is approximately a plane, or focal plane, perpendicular to the optical axis of the camera.

By contrast, in a plenoptic camera, each micro-image depicts an area of the captured scene and each pixel associated with that micro-image shows this certain area from the point of view of a certain sub-aperture location on the main lens exit pupil. The raw image of the scene is then obtained as a result of the sum of all the micro-images and the raw image contains the angular information of the light-field. Consequently, neighbor pixels in a raw image contain different angular information as each pixel corresponds to a different view.

Figure 1 is a depiction of a plenoptic micro-image matrix using a color filter array (CFA) used in a demosaicing process. Demosaicing processes are digital image processing techniques that use a color filter array to reconstruct a full color image from incomplete color samples. The color samples are often an output of one or more image sensors overlaid with a CFA. A color filter array is a mosaic of color filters in front of one or more image sensors. Commercially, the most commonly used CFA configuration is the Bayer filter, and Figure 1 provides an example of such a filter. As shown, odd rows alternate red (R) and green (G) filters, while the intervening even rows alternate green (G) and blue (B) filters.

In this example, there are twice as many green filters as red or blue ones, catering to the human eye's higher sensitivity to green light. The color sub-sampling of a CFA by its nature results in aliasing, and therefore optical anti-aliasing filters are placed in the optical path between the image sensor and the lens to reduce the false color artifacts (chromatic aliases) introduced by interpolation. Since each pixel of the sensor is behind a color filter, the output is an array of pixel values, each indicating a raw intensity of one of the three filter colors. The Bayer filter is an example of a CFA that uses a multivariate interpolation on a uniform grid. Thus, an algorithm is needed to estimate for each pixel the color levels for all color components, rather than a single component. Conventional demosaicing algorithms reconstruct a full color image from such spatially undersampled color channels output from the CFA or other filters.

The mathematical operations involved in such algorithms are simple, as they are based on nearby instances of the same color component. The simplest interpolation method relies on nearest-neighbor interpolation, which simply copies an adjacent pixel of the same color channel. However, such methods are unsuitable for any application where detail and image quality are important. In addition, although these methods can obtain good results in homogeneous image regions, they are prone to severe demosaicing artifacts in regions with edges and details when used with pure-color CFAs. More sophisticated demosaicing algorithms exploit the spatial and/or spectral correlation of pixels within a color image, but they are still problematic, as will be seen in more detail by referring to Figure 2B.
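As a concrete illustration of the nearest-neighbor approach just described, the sketch below copies each missing color sample from an adjacent pixel of the same channel. It assumes an RGGB Bayer layout and the helper names are illustrative; it is not taken from this disclosure:

```python
import numpy as np

def bayer_masks(h, w):
    """Boolean masks for an assumed RGGB Bayer layout."""
    r = np.zeros((h, w), bool)
    g = np.zeros((h, w), bool)
    b = np.zeros((h, w), bool)
    r[0::2, 0::2] = True                    # red on even rows, even columns
    g[0::2, 1::2] = True
    g[1::2, 0::2] = True                    # green on the two remaining sites
    b[1::2, 1::2] = True                    # blue on odd rows, odd columns
    return r, g, b

def demosaic_nearest(raw):
    """Nearest-neighbor demosaicing: every missing sample is copied from an
    adjacent pixel of the same color channel (crude fill, wraps at borders)."""
    h, w = raw.shape
    out = np.zeros((h, w, 3), raw.dtype)
    for c, mask in enumerate(bayer_masks(h, w)):
        plane = np.where(mask, raw, 0)
        plane = np.maximum(plane, np.roll(plane, 1, axis=1))   # copy rightwards
        plane = np.maximum(plane, np.roll(plane, 1, axis=0))   # then downwards
        out[..., c] = plane
    return out
```

Because samples are copied rather than interpolated, fine detail is lost, which is precisely why such methods are unsuitable where image quality matters.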

Figure 2A is an exemplary embodiment showing an image obtained with a color filter array (CFA) pattern and a hexagonal lens grid using a plenoptic camera. Figure 2A depicts the appearance of micro-images in a plenoptic camera. The lighter shade represents green (G), the darkest shade represents blue (B), and the medium gray shade represents red (R). In this example, each pixel is illuminated by a sub-aperture of the exit pupil of the main lens. Due to the hexagonal sampling, the residual rotation between the lens and pixel grids, and the CFA, once the sub-aperture views are demultiplexed, the ensuing views may have information or color missing in some areas. In order to recover the missing portions of the views or objects in a scene, it is possible to demosaic the raw data obtained by a plenoptic camera and then demultiplex it to recover the views. The problem is that in most instances this leads to color artifacts on the views. Consider a case where a neighbor-pixel reconstruction is used on a plenoptic raw image that contains different angular information (each pixel under a microlens corresponds to a different view). Demosaicing the raw plenoptic image in this case will potentially mix angular information incorrectly. Traditional algorithms that interpolate neighboring color values create so-called view cross-talk artifacts, which lead to erroneous results. Furthermore, it has been shown that disparity estimation from views obtained from the demosaiced raw image is prone to even larger errors.

Figure 2B is an illustration of a demultiplexed image as discussed. In the illustrated example of Figure 2B, a scene is provided with views or images of one or more objects, captured by a plenoptic camera. Looking at the view of Figure 2B, it is difficult to recognize the image in the scene being presented. The overall contours of an object are visible in Figure 2B, but there is not sufficient information in the image to allow for detailed object recognition. Even in instances where shade, intensity and colors are not totally missing, the complementary color information has to be interpolated from distant pixels, which is problematic in most instances. The latter means that the missing color is at best estimated (or even guessed) from other objects which may not even be local to the image. In Figure 2B, the image is not demosaiced, so color information is missing, but it is demultiplexed in the appropriate manner. In other words, the raw data is demultiplexed into views of the scene without being demosaiced. This helps avoid the occurrence of color artifacts, such as those produced by crosstalk, and helps recover the missing color information. A block-matching technique, such as the sketch given below, can be used to estimate pixel disparities. Subsequently, reliable estimated disparities can be used to demosaic the views, exploiting the redundant sampling of the scene. The results do not contain color artifacts, in contrast to state-of-the-art methods. Accurate view demultiplexing and sub-pixel accuracy of the estimated disparities lead to demosaiced views of higher spatial resolution. This improvement in quality can be achieved without bearing the complexity of additional super-resolution steps.
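A minimal sketch of such a block-matching disparity estimation, restricted to an integer-pixel, one-dimensional search between two neighboring sub-aperture views (the patch size and search range are assumptions chosen for illustration; sub-pixel refinement and reliability checks are omitted):

```python
import numpy as np

def block_match_disparity(view_a, view_b, patch=7, max_disp=8):
    """Integer-pixel disparity between two neighboring sub-aperture views,
    found by minimizing the sum of absolute differences over a small patch."""
    a = view_a.astype(np.float32)
    b = view_b.astype(np.float32)
    h, w = a.shape
    r = patch // 2
    disp = np.zeros((h, w), np.int32)
    for y in range(r, h - r):
        for x in range(r + max_disp, w - r):
            ref = a[y - r:y + r + 1, x - r:x + r + 1]
            costs = [np.abs(ref - b[y - r:y + r + 1, x - d - r:x - d + r + 1]).sum()
                     for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(costs))
    return disp
```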
In one embodiment, this concept can be used to provide a means for a much richer color acquisition. In this embodiment, an electro-optical polarization modulator is used: a medium that becomes birefringent when a voltage is applied.

Birefringence is the property of crystalline materials that gives rise to the production of two refracted rays when a ray hits them. This property is due to the non-isotropic distribution of atoms in the medium and occurs in crystalline media whose crystalline mesh structure is strongly non-isotropic. Quartz and calcite are natural materials which exhibit birefringence. With those media, when a non-polarized ray hits one of their surfaces at a particular orientation, two rays are created at refraction. One of these rays has one polarization characteristic and the other has a slightly different one. According to the Snell-Descartes law, $n_i \sin\theta_i = n_r \sin\theta_r$, where $n_i$ and $n_r$ are the respective (and different) refractive indices of the incident and refracting media, and $\theta_i$ and $\theta_r$ are the incident and refracted angles.

In a birefringent medium, the ray that obeys the Snell-Descartes law is called the ordinary ray, and the medium has one ordinary index of refraction $n_o$. The second ray that is created undergoes another refraction; it propagates in a direction within the material governed by the extraordinary index $n_e$ and is polarized perpendicularly to the ordinary ray. In the birefringent medium, two rays are thus created with different propagation directions.

Birefringent materials refract rays according to Snell's law, but the effective index of refraction in the medium depends upon the input polarization state and the angle the refracted ray makes with respect to the crystal axis. In a birefringent material, the two types of rays are defined as ordinary and extraordinary. Ordinary rays are refracted according to Snell's principle $n \sin\theta = n_o \sin\theta'$, where the subscript "o" indicates the ordinary index. For extraordinary rays, the refraction law provides that $n \sin\theta = n(\theta_w) \sin\theta'$, where the effective index of refraction in the birefringent material is a function of the angle $\theta_w$. The angle $\theta_w$ is the angle between the crystal axis vector $\hat{a}$ and the refracted wave vector $\hat{k}$. Additionally, the ray vector $\hat{s}$, which points in the direction of energy propagation, does not follow the wave vector $\hat{k}$, but makes a small angle with respect to $\hat{k}$. In isotropic media, the vectors $\hat{k}$ and $\hat{s}$ are the same. Therefore, for most optical designs, the vector $\hat{k}$ must be taken into consideration. In these cases, the angle $\theta_w$ is defined as $\cos\theta_w = \hat{k} \cdot \hat{a}$.

The effective index of refraction is defined by

$$n(\theta_w) = \frac{n_o n_e}{\sqrt{n_e^2 \cos^2\theta_w + n_o^2 \sin^2\theta_w}},$$

where $n_o$ is the ordinary and $n_e$ is the extraordinary index of refraction.

The angle $\alpha$ between $\hat{k}$ and $\hat{s}$ is defined by $\cos\alpha = \hat{k} \cdot \hat{s}$, where

$$\tan\alpha = \frac{(n_e^2 - n_o^2)\tan\theta_w}{n_e^2 + (n_o \tan\theta_w)^2},$$

and the vectors $\hat{k}$ and $\hat{s}$ are both coplanar with the crystal axis vector $\hat{a}$. The wave vector $\hat{k}$ points along the normal to the wavefront, while $\hat{s}$ points along the direction of energy propagation.
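These relations can be evaluated numerically. The sketch below implements the standard uniaxial-crystal formulas given above; the index values are merely indicative of a high-birefringence material and are not taken from this disclosure:

```python
import numpy as np

def effective_index(theta_w, n_o, n_e):
    """Effective index n(theta_w) seen by the extraordinary wave
    in a uniaxial birefringent medium."""
    return (n_o * n_e) / np.sqrt(n_e**2 * np.cos(theta_w)**2
                                 + n_o**2 * np.sin(theta_w)**2)

def walkoff_angle(theta_w, n_o, n_e):
    """Angle alpha between the wave vector k and the ray (Poynting) vector s."""
    t = np.tan(theta_w)
    return np.arctan((n_e**2 - n_o**2) * t / (n_e**2 + (n_o * t)**2))

# Indicative values for a high-birefringence LC mixture (delta-n > 0.2,
# as mentioned later in the text; the exact numbers are assumptions).
n_o, n_e = 1.50, 1.71
theta_w = np.deg2rad(30.0)
print(effective_index(theta_w, n_o, n_e))            # index seen by the e-ray
print(np.rad2deg(walkoff_angle(theta_w, n_o, n_e)))  # walk-off angle in degrees
```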

Figure 3A illustrates a diagram for a sub-aperture to pixel mapping of both ordinary and extraordinary states. As shown, a plurality of rays (shown generally as 320) are passed through an electro optical modulator shown by reference numeral 325. On the right-hand side of the figure is the exit pupil of the main lens, shown generally by reference numeral 327. This main lens exit pupil is further divided into sub-apertures V1 to V12, shown collectively by reference numeral 352. In this embodiment, a birefringent medium is provided that can be electrically controlled. A variety of such media are available, as can be appreciated by those skilled in the art. One such example could be one that incorporates twisted nematic (TN) liquid crystals. In one embodiment, the TN liquid crystal can be sandwiched between two glass plates having transparent Indium Tin Oxide (ITO) electrodes. In one example, in its ground state, with no voltage applied to the TN cell, the sub-apertures are each imaged onto one sole pixel per micro-image following the green lines, which depict the ordinary rays.

Figure 3B provides a zoomed and more detailed picture depicting the area 310. In this zoomed picture (310), one micro-lens and one column of the red-green pixels from the CFA (such as a Bayer CFA) are depicted. The red portions are illustrated by reference numeral 312 and the green portions by reference numeral 315. The illustration of 310 shows how an ordinary ray can be mapped such that each sub-aperture is aligned to one pixel. To aid understanding, it is better to start by following the extraordinary rays 330 as provided in Figure 3A first, and to follow these in reverse order back to Figure 3B's more detailed area. This results in a flow that runs from pixels to sub-apertures.

Looking at Figure 3A, and following the rays from left to right, the ray 320 follows the green ray path through the electro optical modulator. It crosses the optical center of its micro-lens and then hits the birefringent cell. The cell is in its "extraordinary" state with an applied voltage and thus, this time, two rays are generated: the green one, which refracts as the ordinary ray, and the blue one, which refracts as the extraordinary ray. When both hit the exit pupil of the main lens, the ordinary ray ends up in one sub-aperture while the extraordinary ray maps into the sub-aperture above. This means that if we begin from the exit pupil, the sub-aperture V11's ordinary ray will hit the bottom-most pixel, which is a green one. From the same V11 sub-aperture, the extraordinary ray will hit the second pixel from the bottom, which is red.

Adding color information to a plenoptic image requires two images to be taken sequentially. A first image is taken at t0 with the cell in the ordinary state. The pixels on the sensor record the following values:

PV(t0,V11), PR(t0,V10), PV(t0,V9), PR(t0,V8), ...

A second image is taken at t1 with the cell in its extraordinary state. Two rays of equal intensities are generated if there is no polarizing phenomenon in the scene:

PV(t1,V11)/2 + PV(t1,V12)/2, PR(t1,V10)/2 + PR(t1,V11)/2, PV(t1,V9)/2 + PV(t1,V10)/2, PR(t1,V8)/2 + PR(t1,V9)/2, ...

Looking at Figure 3, it is evident that the borders of the aperture will not benefit from increased color information, but border pixels are in any case discarded because of micro-image crosstalk and vignetting. So, in one embodiment, the first image is subtracted from two times the second shot for pixels that are not borderline. This will be the second pixel from the bottom and can be defined as: PR = 2×[PR(t1,V10)/2 + PR(t1,V11)/2] − PR(t0,V10). If there is no movement between the two shots at t=0 and t=1 (t0 and t1), then:

PR = 2×[PR(t1,V10)/2 + PR(t1,V11)/2] − PR(t0,V10) = PR(V10) + PR(V11) − PR(V10) = PR(V11).

Hence, a red component for the sub-aperture V11 is produced, while from the shot at t0, a green component for the sub-aperture V11 is also present. Applying this concept to the rest of the pixels, if two shots of the scene are produced at t0 and t1, a linear combination of pixel values can be obtained from them that provides two times more color information than conventional methods usually provide. In one example, the additional component used in this system is a twisted nematic (TN) cell.
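The per-pixel arithmetic above can be written compactly. The sketch below assumes the two exposures are already registered and that border pixels affected by micro-image crosstalk and vignetting have been masked out; the array names are illustrative:

```python
import numpy as np

def recover_shuffled_color(img_t0, img_t1, valid):
    """Weighted subtraction of two sequential exposures.

    img_t0 : samples recorded with the cell in its ordinary state,
             e.g. PR(t0,V10) at a red pixel.
    img_t1 : samples recorded with the cell in its extraordinary state,
             e.g. PR(t1,V10)/2 + PR(t1,V11)/2 at the same pixel.
    valid  : boolean mask excluding border pixels (crosstalk, vignetting).

    For a static scene, 2*img_t1 - img_t0 equals the sample shuffled in
    from the neighboring sub-aperture, e.g. PR(V11).
    """
    out = np.zeros(img_t0.shape, np.float32)
    out[valid] = 2.0 * img_t1[valid].astype(np.float32) - img_t0[valid]
    return out
```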

The Liquid Crystal (LC) can have a large difference in value between the ordinary $n_o$ and extraordinary $n_e$ indices of refraction. In some cases, an LC mixture named MLC-9200-100 (Refractive Indices of Liquid Crystals for Display Applications) is known for having $n_e - n_o > 0.2$, which is a very large difference but is needed in order to reduce the thickness of the cell of Figure 3. This thickness is not yet compatible with a placement of the cell between the micro-lens array and the sensor, but the cell can be placed between the exit pupil and the micro-lens array, as it reduces to a few mm in thickness (the micro-lens to sensor distance is in the range of 10-100 microns).

Figure 4 is a flow chart depiction of the process explained above according to one embodiment. Figure 4 illustrates the steps of a method of generating multiple images of different color intensity and characteristics using a plenoptic camera having a main lens disposed ahead of an array of lenses having a plurality of apertures. As shown in Figure 4 at step 410, a first set of images is captured using a first state of an electro optical polarization modulator. In one embodiment, the modulator is disposed between said main lens and the array of lenses with a plurality of apertures. In step 420, a second set of images is captured using a second state of the electro optical polarization modulator. In step 430, information about the second set of images is subtracted, for example with a configured processor, from information about the first set. In step 440, a final set of images is generated after the subtraction such that the final set of images has enhanced color intensity and characteristics.

In one embodiment, a system for generating multiple images of different color intensity and characteristics can be used to carry out the method steps of Figure 4 and the arrangement of Figures 3A and 3B. In this embodiment, a main lens (310) is disposed in front of an array of lenses (352) associated with a plurality of apertures. An electrically controlled electro optical modulator (325) such as shown in Figure 3A can be disposed between the main lens and the micro array of lenses. The electro optical modulator functions between the two states (330 and 340 in Figure 3A) upon application of an electrical voltage. The first set of images is then captured using the first state of the electro optical polarization modulator and a second set of images is also captured using the second state of the electro optical polarization modulator, as discussed in conjunction with Figure 4. Subsequently, a processor can be incorporated into the system that is configured to subtract information about the second set of images from information about the first set of captured images to generate (440 in Figure 4) a final set of images with enhanced color intensity and characteristics. In this manner rich color information can be obtained, even in a very complicated situation where a plenoptic camera delivers very sparse color information.
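Putting steps 410-440 together, a hypothetical acquisition loop might look like the sketch below. The camera interface (set_cell_voltage, capture_raw), the drive voltage and the square-grid demultiplexing are placeholders and simplifying assumptions, not elements specified in this disclosure:

```python
import numpy as np

def demultiplex(raw, p):
    """Square-grid demultiplexing (same simplifying assumption as the
    sketch in the background section)."""
    H, W = raw.shape
    return raw.reshape(H // p, p, W // p, p).transpose(1, 3, 0, 2)

def capture_rich_color_views(camera, p, border=1):
    """Hypothetical two-state acquisition following steps 410-440.

    `camera.set_cell_voltage(volts)` and `camera.capture_raw()` are
    stand-ins for device-specific routines not specified here.
    """
    camera.set_cell_voltage(0.0)        # step 410: ordinary state, shot at t0
    raw_t0 = camera.capture_raw()
    camera.set_cell_voltage(5.0)        # step 420: extraordinary state, shot at t1
    raw_t1 = camera.capture_raw()

    views_t0 = demultiplex(raw_t0, p).astype(np.float32)
    views_t1 = demultiplex(raw_t1, p).astype(np.float32)

    # Step 430: weighted subtraction, discarding border sub-apertures that
    # suffer from micro-image crosstalk and vignetting.
    valid = np.zeros(views_t0.shape, bool)
    valid[border:-border, border:-border, :, :] = True
    extra = np.where(valid, 2.0 * views_t1 - views_t0, 0.0)

    # Step 440: the final set combines the samples recorded at t0 with the
    # recovered samples, which belong to the neighboring sub-aperture
    # (e.g. the red value PR(V11) recovered at the pixel of sub-aperture V10).
    return views_t0, extra
```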