

Title:
METHOD AND APPARATUS FOR CONTROLLING IMAGE CAPTURE
Document Type and Number:
WIPO Patent Application WO/2015/150622
Kind Code:
A1
Abstract:
Embodiments of the present invention provide a method and apparatus for controlling image capture. There is disclosed a method for detecting an environmental parameter for controlling image capture, the method comprising: obtaining eye information of a user of an electronic device using a first camera on the electronic device; detecting an environmental parameter of an environment where the user is located based on the eye information captured by the first camera; and responsive to a predefined condition being satisfied, controlling image capture by a second camera on the electronic device at least in part based on the detected environmental parameter, the first camera and the second camera being located at different sides of the electronic device. There is also disclosed a corresponding apparatus, an electronic device, and a computer program product.

Inventors:
CHU WEILI (CN)
GUO FENGYANG (CN)
Application Number:
PCT/FI2015/050178
Publication Date:
October 08, 2015
Filing Date:
March 18, 2015
Assignee:
NOKIA CORP (FI)
International Classes:
H04N5/232; G02B7/28; G03B7/08; G03B15/00; G03B19/07; G06F3/01; G06T7/00; G06V10/147
Foreign References:
US20100110368A12010-05-06
JP5061968B22012-10-31
JP2011085829A2011-04-28
CN202309887U2012-07-04
US20090015689A12009-01-15
JP2004343315A2004-12-02
JP2008182369A2008-08-07
JP2013207721A2013-10-07
Other References:
See also references of EP 3127323A4
Attorney, Agent or Firm:
NOKIA TECHNOLOGIES OY et al. (IPR Department, Karakaari 7, Espoo, FI)
Claims:

1. A method of detecting an environmental parameter to control image capture, the method comprising:

obtaining eye information of a user of an electronic device using a first camera on the electronic device;

detecting an environmental parameter of an environment where the user is located based on the eye information obtained by the first camera; and

responsive to a predefined condition being satisfied, controlling capture of an image by a second camera on the electronic device at least in part based on the detected environmental parameter, the first camera and the second camera being located at different sides of the electronic device.

2. The method according to claim 1, wherein detecting the environmental parameter comprises determining the environmental parameter from the eye information using a predefined mapping.

3. The method according to claim 2, further comprising:

calibrating the predefined mapping for the user before using the predefined mapping.

4. The method according to claim 3, wherein calibrating the predefined mapping for the user comprises:

receiving the user's selection of a calibration image from a set of calibration images, the set of calibration images being captured using different environmental parameters under a same image capture condition;

obtaining calibration eye information of the user using the first camera; and

calibrating the predefined mapping based on the user's selection of the calibration image and the calibration eye information.

5. The method according to claim 4, wherein the predefined mapping at least indicates an environmental parameter and eye information that are associated with each of a set of image capture conditions,

wherein receiving the user's selection of a calibration image from a set of calibration images comprises:

receiving the user's selection of an image capture condition from the set of image capture conditions;

determining the environmental parameter associated with the selected image capture condition based on the predefined mapping;

selecting a set of candidate environmental parameters within a predefined neighborhood of the environmental parameter;

controlling the second camera to capture an image based on each of the set of candidate environmental parameters to generate a set of calibration images; and

receiving the user's selection of the calibration image from the set of calibration images,

wherein calibrating the predefined mapping comprises:

determining default eye information corresponding to the candidate environmental parameter associated with the selected calibration image; and

calibrating the predefined mapping based on the calibration eye information and the default eye information.

6. The method according to any one of claims 1 to 5, wherein the eye information includes a sclera color, and wherein detecting the environmental parameter comprises detecting a color temperature of the environment based on the sclera color.

7. The method according to any one of claims 1 to 5, wherein the eye information includes a pupil ratio, and wherein detecting the environmental parameter comprises detecting brightness of illumination in the environment based on the pupil ratio.

8. The method according to any one of claims 1 to 5, further comprising:

detecting a reference environmental parameter of the environment based on information obtained by the second camera while capturing the image;

comparing the environmental parameter detected based on the eye information and the reference environmental parameter; and

responsive to a difference between the environmental parameter and the reference environment parameter being lower than a predefined threshold, determining that the predefined condition is satisfied.

9. The method according to any one of claims 1 to 5, further comprising:

receiving a user input that commands to control the capture of the image by the second camera based on the environmental parameter; and

responsive to receiving the user input, determining that the predefined condition is satisfied.

10. The method according to any one of claims 1 to 5, wherein controlling the capture of the image by the second camera comprises controlling at least one of:

exposure time;

sensitivity;

a gain of at least one color channel; and

white balance processing.

11. An apparatus for detecting an environmental parameter to control image capture, the apparatus comprising:

an eye information obtaining unit configured to obtain eye information of a user of an electronic device using a first camera on the electronic device;

an environment detecting unit configured to detect an environmental parameter of an environment where the user is located based on the eye information obtained by the first camera; and

an image capture control unit configured to, responsive to a predefined condition being satisfied, control capture of an image by a second camera on the electronic device at least in part based on the detected environmental parameter, the first camera and the second camera being located at different sides of the electronic device.

12. The apparatus according to claim 11, wherein the environment detecting unit is configured to determine the environmental parameter from the eye information using a predefined mapping.

13. The apparatus according to claim 12, further comprising:

a calibrating unit configured to calibrate the predefined mapping for the user before using the predefined mapping.

14. The apparatus according to claim 13, further comprising:

a user selection receiving unit configured to receive the user's selection of a calibration image from a set of calibration images, the set of calibration images being captured using different environmental parameters under a same image capture condition; and

a calibration eye information obtaining unit configured to obtain calibration eye information of the user using the first camera,

the calibrating unit being configured to calibrate the predefined mapping based on the user's selection of the calibration image and the calibration eye information.

15. The apparatus according to claim 14, wherein the predefined mapping at least indicates an environmental parameter and eye information that are associated with each of a set of image capture conditions,

wherein the user selection receiving unit comprises:

a first receiving unit configured to receive the user's selection of an image capture condition from the set of image capture conditions;

an environmental parameter determining unit configured to determine the environmental parameter associated with the selected image capture condition based on the predefined mapping;

a candidate parameter selecting unit configured to select a set of candidate environmental parameters within a predefined neighborhood of the environmental parameter;

a calibration image generating unit configured to control the second camera to capture an image based on each of the set of candidate environmental parameters to generate a set of calibration images; and

a second receiving unit configured to receive the user's selection of the calibration image from the set of calibration images,

wherein the apparatus further comprises a default eye information determining unit configured to determine default eye information corresponding to the candidate environmental parameter associated with the selected calibration image,

and wherein the calibrating unit is configured to calibrate the predefined mapping based on the calibration eye information and the default eye information.

16. The apparatus according to any one of claims 11 to 15, wherein the eye information includes a sclera color, and wherein the environment detecting unit comprises a color temperature detecting unit configured to detect a color temperature of the environment based on the sclera color.

17. The apparatus according to any one of claims 11 to 15, wherein the eye information includes a pupil ratio, and wherein the environment detecting unit comprises a brightness detecting unit configured to detect brightness of illumination in the environment based on the pupil ratio.

18. The apparatus according to any one of claims 11 to 15, further comprising:

a reference environment detecting unit configured to detect a reference environmental parameter of the environment based on information obtained by the second camera while capturing the image;

an environmental parameter comparing unit configured to compare the environmental parameter detected based on the eye information and the reference environmental parameter; and

a first condition determining unit configured to, responsive to a difference between the environmental parameter and the reference environment parameter being lower than a predefined threshold, determine that the predefined condition is satisfied.

19. The apparatus according to any one of claims 11 to 15, further comprising:

a user input receiving unit configured to receive a user input that commands to control the capture of the image by the second camera based on the environmental parameter; and

a second condition determining unit configured to, responsive to receiving the user input, determine that the predefined condition is satisfied.

20. The apparatus according to any one of claims 11 to 15, wherein the image capture control unit is configured to control at least one of:

exposure time;

sensitivity;

a gain of at least one color channel; and

white balance processing.

21. An electronic device, comprising:

a controller;

a first camera;

a second camera, the first and second cameras located at different sides of the electronic device; and

an apparatus according to any one of claims 11 to 20.

22. A computer program product for detecting an environmental parameter to control image capture, the computer program product being tangibly stored on a non-transient computer readable medium and including a machine executable instruction which, when executed, causes the machine to perform steps of the method according to any one of claims 1 to 10.

Description:
METHOD AND APPARATUS FOR CONTROLLING IMAGE CAPTURE

FIELD OF THE INVENTION

Embodiments of the present invention relate to the field of image processing, and more specifically to a method and apparatus for detecting environmental parameters so as to control image capture.

BACKGROUND OF THE INVENTION

With the development of image capture and processing technologies, more and more user devices have an image capture function. For example, most portable mobile devices (e.g., mobile phones, personal digital assistants (PDAs), tablet computers, etc.) are now equipped with a camera capable of capturing images. When a user uses such a mobile device to capture an image, environmental parameters such as illumination and color temperature vary with the environment, and such environmental parameters have a direct influence on image capture. For example, depending on the environmental parameters, a camera needs to adjust its exposure time, sensor sensitivity, gains of different color channels, and/or automatic white balance processing. Otherwise, the captured image might be problematic, e.g., have incorrect brightness or tone.

Conventionally, in order to adapt image capture to different environments, a user may be allowed to manually set camera parameters so as to guarantee that exposure time, photosensitivity, white balance and the like match the current environment. However, for a user, particularly a common consumer lacking expertise, setting camera parameters is a complex process whose accuracy cannot be guaranteed.

Another known approach is to pre-set and save profiles for a plurality of typical environments for the user to select from. Examples of such profiles may include "highlight," "night," "automatic," etc. In use, a user may select a profile based on a specific environment so as to apply the corresponding environmental parameters during image capture. However, it would be appreciated that pre-stored profiles rarely match exactly the particular environment where a user is located. Therefore, an image captured by applying the preset environmental parameters defined in a profile still cannot achieve optimal quality.

In order to achieve adaptive processing of a captured image, it has been proposed to automatically detect illumination and color temperature in the environment. However, known methods have their own defects. For example, one solution is to detect white points in a scene for automatic image processing such as white balance; however, when the scene contains no apparent white object that can serve as a reference, this method fails. In view of the above, a technical solution that can detect environmental parameters more accurately and effectively to control image capture is desirable in the art.

SUMMARY OF THE INVENTION

In order to address the above and other potential problems, the present invention provides a method and apparatus for controlling image capture.

In one aspect of the present invention, there is provided a method for detecting an environmental parameter for controlling image capture. The method comprises: obtaining eye information of a user of an electronic device using a first camera on the electronic device; detecting an environmental parameter of an environment where the user is located based on the eye information captured by the first camera; and responsive to a predefined condition being satisfied, controlling image capture by a second camera on the electronic device at least in part based on the detected environmental parameter, the first camera and the second camera being located at different sides of the electronic device. Embodiments in this aspect further comprise a corresponding computer program product.

In another aspect of the present invention, there is provided an apparatus for detecting an environmental parameter for controlling image capture. The apparatus comprises: an eye information obtaining unit configured to obtain eye information of a user of an electronic device using a first camera on the electronic device; an environment detecting unit configured to detect an environmental parameter of an environment where the user is located based on the eye information captured by the first camera; and an image capture control unit configured to, responsive to a predefined condition being satisfied, control image capture by a second camera on the electronic device at least in part based on the detected environmental parameter, the first camera and the second camera being located at different sides of the electronic device.

It would be appreciated from the description below that, according to embodiments of the present invention, while a user captures an image using one camera on an electronic device, another independent camera of the electronic device may be used to obtain eye information of the user. Because a human eye can provide stable reference information (e.g., sclera color, pupil ratio, and the like) that can be used for environment detection, environmental parameters, such as color temperature and luminance, of the environment where the user is located may be estimated based on the obtained eye information. Therefore, an environmental parameter estimated using one camera may be used to control the other camera during image capture, e.g., to perform automatic exposure and/or automatic white balance processing. In this way, an image of higher quality can be achieved.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS

The above and other objectives, features and advantages of embodiments of the present invention will become more comprehensible through reading the following detailed description with reference to the accompanying drawings. In the figures, several embodiments of the present invention are illustrated in an exemplary, rather than limitative, manner, wherein:

Fig. 1 shows a block diagram of an electronic device in which exemplary embodiments of the present invention may be implemented;

Fig. 2 shows a flow diagram of a method for detecting an environmental parameter for controlling image capture according to exemplary embodiments of the present invention;

Fig. 3 shows a flow diagram of a method for calibrating a predefined mapping between eye information and an environmental parameter for a specific user according to exemplary embodiments of the present invention;

Fig. 4A and Fig. 4B show schematic diagrams of calibrating a predefined mapping based on an offset vector according to exemplary embodiments of the present invention;

Fig. 5 shows a block diagram of an apparatus for detecting an environmental parameter to control image capture according to exemplary embodiments of the present invention.

In respective figures, same or corresponding reference numerals represent same or corresponding portions.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

As summarized above and detailed below, one of the main ideas of the present invention is that while a user uses one camera (for example, a rear camera) on an electronic device to capture an image, another separate camera (for example, a front camera) of the electronic device may be used to capture eye information of the user. Because a human eye can provide stable reference information (for example, sclera color, pupil ratio, and the like) that can be used for environment detection, environmental parameters, such as color temperature and luminance, of the environment where the user is located may be estimated based on the obtained eye information. Therefore, an environmental parameter estimated using one camera may be used to control the other camera during image capture, for example, to perform automatic exposure and/or automatic white balance processing. In this way, an image of higher quality can be achieved. Moreover, in some embodiments, personalized customization of environment detection may be achieved for different users, thereby further enhancing user experience.

Note that the term "image" used in the context of the present disclosure includes not only a static image but also a dynamic image (i.e., video). Additionally, the term "image capture" used here not only includes the shooting process of the image, but also includes any post-processing procedure for the shot image.

Hereinafter, several exemplary embodiments shown in the drawings will be referenced to describe the principle and spirit of the present invention. It should be understood that these embodiments are described only for enabling those skilled in the art to better understand and then further implement the present invention, not intended to limit the scope of the present invention in any manner.

Reference is first made to Fig. 1, in which a block diagram of an electronic device 100 in which exemplary embodiments of the present invention may be implemented is presented. According to embodiments of the present invention, the electronic device 100 may be a portable electronic device such as a mobile phone. However, it should be understood that this is only exemplary and non-limitative. Other types of user devices may also easily employ embodiments of the present invention, such as a personal digital assistant (PDA), a pager, a mobile computer, a mobile TV, a game device, a laptop computer, a camera, a video camera, a GPS device, or other type of voice and text communication system.

The electronic device 100 may have a communication function. To this end, as shown in the figure, the electronic device 100 may comprise one or more antennas 112 operable to communicate with a transmitter 114 and a receiver 116. The electronic device 100 further comprises at least one controller 120. It should be understood that the controller 120 comprises circuits needed for implementing all functions of the electronic device 100. For example, the controller 120 may comprise a digital signal processor device, a microprocessor device, an A/D converter, a D/A converter, and other support circuits. The control and signal processing functions of the electronic device 100 are distributed among these devices according to their respective capabilities. The electronic device 100 may further comprise a user interface, e.g., a ringer 122, a speaker 124, a loudspeaker 126, a display or viewfinder 128, and a keypad 130, all of which are coupled to the controller 120. In particular, the electronic device 100 comprises a first camera 136 and a second camera 137 for capturing static and/or dynamic images. According to embodiments of the present invention, the first camera 136 and the second camera 137 are located at different sides of the electronic device 100. For example, in some embodiments, the first camera 136 may be a front camera of the electronic device 100; in other words, it is located at the same side of the electronic device 100 as the display or viewfinder 128. The second camera 137 may be a rear camera of the electronic device 100; in other words, it is located at the side of the electronic device 100 opposite the display or viewfinder 128. Of course, such a location relationship is not required. Any other location relationship between the first camera 136 and the second camera 137 is possible, as long as they can capture images of different scenes in use. The scope of the present invention is not limited in this aspect.

The electronic device 100 further comprises a battery 134, such as a vibrating battery pack, for supplying power to various circuits needed for operating the electronic device 100 and optionally providing mechanical vibration as a detectable output. The electronic device 100 may further comprise a user identity module (UIM) 138. The UIM 138 is generally a memory device having a built-in processor. The UIM 138 may, for example, comprise a subscriber identity module (SIM), a universal integrated circuit card (UICC), a universal subscriber identity module (USIM), or a removable user identity module (R-UIM), etc. The UIM 138 may comprise a card connection detecting module according to embodiments of the present invention.

The electronic device 100 further comprises a storage device. For example, the electronic device 100 may comprise a volatile memory 140, for example, a volatile random access memory (RAM) including a cache area for temporarily storing data. The electronic device 100 may further comprise other non-volatile memory 142 that may be either built in or removable. The non-volatile memory 142 may additionally or alternatively comprise, for example, an EEPROM or a flash memory. The memory may store any of a plurality of information segments and data used by the electronic device 100 so as to perform the functions of the electronic device 100.

It should be understood that the structural diagram in Fig. 1 is only for the purpose of illustration and is not intended to limit the scope of the present invention. In some cases, some components may be added or removed according to actual needs.

Fig. 2 shows a flow diagram of a method 200 for detecting an environmental parameter for controlling image capture according to exemplary embodiments of the present invention. It would be appreciated that the method 200 may be executed by an electronic device 100 as described above with reference to Fig. 1, e.g., by its controller 120. For the sake of discussion, the method 200 will be described below with reference to the electronic device 100 as shown in Fig. 1.

The method 200 starts at step S201, where eye information of a user of the electronic device 100 is obtained using the first camera 136 of the electronic device 100. According to embodiments of the present invention, the obtaining of the eye information of the user by the first camera 136 may be triggered by various different conditions. For example, in some embodiments, step S201 may be performed in response to detecting that the user is capturing an image using the second camera 137. As an example, when the user captures an image using the second camera 137 by operating a control such as a physical key on the electronic device 100 or a button on a user interface (UI), the controller 120 of the electronic device 100 may detect the action and correspondingly trigger the first camera 136 to obtain the eye information. As mentioned above, according to embodiments of the present invention, the first camera 136 may be a front camera of the electronic device 100, while the second camera 137 may be a rear camera of the electronic device 100. When the user holds up the electronic device 100 to capture an image of a front scene using the second (rear) camera 137, the first (front) camera 136 is oriented toward the user's face and thus can obtain the eye information of the user.

Alternatively or additionally, the step S201 of the method 200 may also be triggered by different events. For example, in some embodiments, once the controller 120 of the electronic device 100 detects that the user has activated a camera program on the electronic device 100 (e.g., by clicking a corresponding icon), it may command the first camera 136 to start detecting the user's eye. Once the first camera 136 detects that the user's eye appears in its field of view, the acquisition of the eye information begins. The first camera 136 may use any currently known or future developed technologies to implement the eye detection, including, but not limited to, an image processing-based algorithm, a model identifying algorithm, and an algorithm based on eyeball optical characteristics. The scope of the present invention is not limited in this aspect. Moreover, it is to be understood that any other trigger conditions for step S201 are possible, and the scope of the present invention is not limited in this aspect.

In some embodiments of the present invention, the eye information detected in step S201 may comprise the color of the sclera of the user's eye. As is known, the sclera is substantially the white area of the human eye, and its color varies with the color temperature of the environment. Therefore, by detecting the sclera color, the color temperature of the environment where the user is currently located can be detected, and the color temperature may in turn be used to control the second camera 137 during image capture. For example, the color temperature may be used for automatic white balance processing, as will be detailed below. According to embodiments of the present invention, any currently known or future developed technologies may be used to detect the sclera color in the eye area, including, but not limited to, image processing technologies, model identifying technologies, optical knowledge-based technologies, and the like. The scope of the present invention is not limited in this aspect.

Alternatively or additionally, in some embodiments of the present invention, the eye information detected in step S201 may comprise a ratio of the pupil or iris, for example, with respect to the eyeball area or the entire eye area. As is known, the pupil ratio of a human eye varies with the brightness of the illumination in the environment. Therefore, by detecting the pupil ratio, illumination conditions such as the brightness of the illumination in the environment where the user is currently located may be detected, and the brightness may in turn be used to control the automatic exposure processing (e.g., adjusting the exposure time) of the second camera during image capture, as will be discussed below. According to embodiments of the present invention, any currently known or future developed pupil detecting and processing technologies may be used, including, for example, image processing technologies, model identifying technologies, and optical knowledge-based technologies (for example, the pupil-cornea reflection vector approach), and the like. The scope of the present invention is not limited in this aspect.

It should be understood that the sclera color and pupil ratio described above are only examples of the eye information detected in step S201 and are not intended to limit embodiments of the present invention. In step S201, any alternative or additional eye information that is available for detecting an environmental parameter may be obtained. The scope of the present invention is not limited in this aspect.
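Only as an illustration of the two measurements just described, the following Python sketch derives the sclera color tuple and the pupil ratio from a front-camera frame. It assumes the eye has already been localized and segmented into sclera, pupil, and whole-eye masks (the segmentation algorithm itself is outside the sketch), and all function and variable names are hypothetical.

    import numpy as np

    def sclera_color_tuple(frame: np.ndarray, sclera_mask: np.ndarray) -> tuple:
        """Return the (R/G, B/G) tuple characterizing the sclera color.

        frame: H x W x 3 RGB image from the first (front) camera.
        sclera_mask: H x W boolean mask marking sclera pixels.
        """
        # Average each color channel over the sclera pixels only.
        r, g, b = (frame[..., c][sclera_mask].mean() for c in range(3))
        return (r / g, b / g)

    def pupil_ratio(pupil_mask: np.ndarray, eye_mask: np.ndarray) -> float:
        """Return the pupil area as a fraction of the whole eye area."""
        return pupil_mask.sum() / eye_mask.sum()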

Next, the method 200 proceeds to step S202 of detecting an environmental parameter of the environment where the user of the electronic device 100 is located based on the eye information captured by the first camera 136. As discussed above, the environmental parameter may include any parameter that may influence image capture and/or processing, including, but not limited to, illumination brightness, color temperature, etc.

According to embodiments of the present invention, an association relationship between the eye information detected in step S201 and a corresponding environmental parameter may be pre-defined based on prior knowledge and stored in the electronic device 100. In other words, in step S202, an environmental parameter may be determined from the eye information based on a predefined mapping.

Only for the sake of illustration, sclera color is considered as an example of eye information. Color temperatures corresponding to different sclera colors may be pre-measured and stored by, for example, a manufacturer of the electronic device 100 or a provider of a camera program. As an example, in some exemplary embodiments, such an association relationship may be stored as a mapping table in the electronic device 100, for example, in the non-volatile memory 142. Alternatively, the predefined mapping may be stored at a location independent of the electronic device 100, for example, in a server, and may be accessed by the electronic device 100. Similarly, as another example, the association relationships between different pupil ratios and illumination brightness may also be measured and stored in the mapping table in advance.
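As a minimal sketch of such a lookup (step S202), the snippet below matches an observed sclera tuple against a stored mapping table by nearest neighbor. The table values are illustrative placeholders, not data from the patent, and a real implementation might use interpolation or another matching rule.

    import numpy as np

    # (R/G, B/G) -> color temperature in kelvin; all values are illustrative.
    MAPPING_TABLE = {
        (1.30, 0.65): 2800,   # e.g. "tungsten"
        (1.10, 0.80): 5500,   # e.g. "noon"
        (1.00, 0.95): 6500,   # e.g. "cloudy"
        (0.90, 1.10): 10000,  # e.g. "blue sky"
    }

    def detect_color_temperature(sclera_tuple: tuple) -> int:
        """Return the color temperature whose stored sclera tuple is closest."""
        keys = np.array(list(MAPPING_TABLE.keys()))
        dists = np.linalg.norm(keys - np.asarray(sclera_tuple), axis=1)
        return list(MAPPING_TABLE.values())[int(np.argmin(dists))]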

In particular, according to some embodiments of the present invention, the mapping table may be directly used. The mapping table may also be updated by the device manufacturer and/or any other service providers. Alternatively or additionally, in some embodiments, a predefined mapping table may be calibrated for a user of the electronic device 100, thereby achieving a more accurate and/or personalized image capture and processing. Exemplary embodiments in this aspect will be detailed below.

The method 200 then proceeds to step S203 of determining whether a predefined condition is satisfied. Several examples of the predefined condition will be discussed later. In step S203, if it is determined that the predefined condition is not satisfied (branch "No"), the method 200 proceeds to step S204 of controlling image capture by the second camera 137 without using the environmental parameter detected by the first camera 136 in step S202. The operation in step S204 is similar to known solutions and will not be detailed here.

On the contrary, if it is determined in step S203 that the predefined condition is satisfied (branch "Yes"), the method 200 proceeds to step S205 of controlling image capture by the second camera 137 of the electronic device 100 at least in part based on the environmental parameter detected in step S202.

Traditionally, during the image capture process, the second camera 137 only captures and processes information obtained by itself. Taking white balance processing as an example, the second camera 137 will locate white points in the scene and perform automatic white balance processing when capturing an image, using the information provided by those white points. However, when there are few or no white points in the target scene, the automatic white balance processing might have a poor effect or even fail. Consequently, the quality of the resulting image deteriorates.

In contrast, because the sclera of the eye can stably provide a white color as a reference, the color temperature of the environment may be accurately detected using the first camera 136. By controlling image capture in step S205 based on the detected color temperature, embodiments of the present invention can perform automatic white balance processing on the image captured by the second camera 137 more accurately and effectively. Correspondingly, the second camera 137 need not perform location and processing of white points.

Similarly, as mentioned above, the brightness of the environment may be determined based on the pupil ratio obtained by the first camera 136. In such an embodiment, in step S205, the exposure time of the second camera 137 may be controlled at least in part based on the detected ambient brightness, thereby achieving more accurate automatic exposure processing. Alternatively or additionally, the detected environmental parameter may also be used to control the sensitivity of the second camera 137, gains of one or more color channels, and the like. It should be noted that the environmental parameter detected in step S202 may be used to control the second camera 137 in any appropriate manner, thereby capturing an image of high quality.
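As a minimal sketch of this control step (S205), the snippet below derives white balance gains and an exposure time from the eye-derived parameters. Because the sclera is nominally white, gains that map the observed sclera tuple back to neutral remove the ambient color cast; the exposure rule is a deliberately crude, assumed proportionality, not the patent's method.

    def white_balance_gains(sclera_tuple: tuple) -> tuple:
        """Per-channel gains (g_r, g_g, g_b) that render the sclera neutral."""
        rg, bg = sclera_tuple
        return (1.0 / rg, 1.0, 1.0 / bg)

    def exposure_time_ms(pupil_ratio: float,
                         base_ms: float = 10.0,
                         reference_ratio: float = 0.3) -> float:
        """Scale exposure with the pupil ratio: a dilated pupil suggests a dark
        environment, so the exposure time is lengthened proportionally."""
        return base_ms * (pupil_ratio / reference_ratio)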

Referring back to step S203 of the method 200, according to embodiments of the present invention, any appropriate predefined condition may be used according to the specific needs and application scenarios to enable the environmental parameter detected by the first camera to be used in image capture of the second camera.

For example, in some embodiments, the predefined condition may be a user input or command. For example, an option may be provided in the settings of the electronic device 100, such that the user may control this option to enable or disable use of the environmental parameter detected by the first camera during the image capture process of the second camera. In such an embodiment, responsive to receiving such a user selection, it is determined in step S203 that the predefined condition is satisfied.

Alternatively or additionally, in some embodiments, a difference between the environments where the first camera and the second camera are located may be determined, and a comparison between that difference and a predefined threshold may act as the predefined condition in step S203. Specifically, it would be appreciated that during the image capture process of the second camera, the second camera itself will also obtain information from the shot scene, and such information may also be used for detecting environmental parameters of the environment, such as illumination brightness, color temperature, etc. For the sake of discussion, the environmental parameter detected by the second camera may be referred to as a "reference environmental parameter." The reference environmental parameter may then be compared with the environmental parameter detected by the first camera in step S202. If the difference therebetween is lower than the predefined threshold, it may be decided in step S203 that the predefined condition is satisfied.

In such an embodiment, by determining the difference between the environmental parameters detected by the first camera and the second camera, respectively, potential deterioration of image quality can be effectively avoided. For example, consider the following situation: a user indoors uses the electronic device 100 to shoot an outdoor scene through a door or window. In this scenario, what the first camera facing the user detects is the indoor environmental parameter, while what the second camera 137 facing the target scene detects is the outdoor environmental parameter. If the difference between the indoor and outdoor environments is very large, the image quality would deteriorate if image capture by the second camera 137 were controlled based on the environmental parameter detected by the first camera 136. In this case, according to the above embodiments of the present invention, the electronic device 100 may automatically refrain from using the environmental parameter detected by the first camera 136 during the image capture process of the second camera 137, based on a comparison between the two sets of environmental parameters, thereby avoiding the above problem.
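A minimal sketch of this threshold test (step S203) follows; the 1000 K threshold is an illustrative assumption rather than a value from the patent.

    def condition_satisfied(eye_estimate_k: float,
                            reference_k: float,
                            threshold_k: float = 1000.0) -> bool:
        """True when the first camera's eye-derived estimate agrees with the
        second camera's reference parameter closely enough to be used, which
        filters out mismatches such as the indoor/outdoor case above."""
        return abs(eye_estimate_k - reference_k) < threshold_k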

It should be understood that the above are only several examples of the predefined condition and are not intended to limit the scope of the present invention. Any other appropriate predefined conditions are possible. For example, in some embodiments, the predefined condition may be set to "null," such that in step S203 the predefined condition is always determined to be satisfied. In other words, in this case, image capture by the second camera is always controlled based on the environmental parameter detected by the first camera.

Please note that the sequence of the respective steps in the above-described method 200 is only exemplary and is not intended to limit the scope of the present invention. For example, as mentioned above, in some embodiments, the predefined condition in step S203 may be an input selected by the user. In this case, step S203 may be executed first. In other words, in such an embodiment, it will first be determined whether the predefined condition is satisfied, and the subsequent steps of the method are only performed when the predefined condition is satisfied. Other alternative sequences are also possible, and the scope of the present invention is not limited in this aspect.

In particular, as mentioned above, in some embodiments, the electronic device 100 may store an association relationship between eye information of the user and a corresponding environmental parameter in the form of a mapping table. Such a mapping table may be used directly. Alternatively, considering that the eyes of different users may respond differently to the same environment (e.g., have different sclera colors at the same color temperature), in some embodiments, the predefined mapping between eye information and environmental parameters may be calibrated for a specific user of the electronic device 100.

In some embodiments, for example, an auxiliary tool such as a color card may be used to sample the user's eye color under a given condition and perform calibration accordingly. Alternatively or additionally, in order to enhance calibration efficiency and user-friendliness, in some embodiments, calibration may be implemented as follows. First, the user's selection of a calibration image from a set of calibration images is received, the set of calibration images being captured using different environmental parameters under the same image capture condition. The user's calibration eye information is obtained by the first camera 136. Afterwards, calibration may be implemented based on the user's selection of the calibration image and the calibration eye information. Hereinafter, an exemplary embodiment in this aspect will be described with reference to Fig. 3.

Fig. 3 shows a flow diagram of a method 300 for calibrating a predefined mapping between eye information and an environmental parameter for a specific user according to exemplary embodiments of the present invention. According to the embodiment of the present invention, the method 300 may be performed before the above described method 200. For example, the method 300 may be performed when the user first configures or uses the electronic device 100 or its image capture function, so as to implement calibration for the user.

In the embodiment as shown in Fig. 3, the mapping table records the environmental parameters and corresponding eye information that are associated with each image capture condition in a set of image capture conditions. The image capture conditions may be predefined, e.g., blue sky, cloudy, noon, tungsten, candle light, etc. For each image capture condition, the mapping table indicates an associated environmental parameter and corresponding eye information. Only for the sake of illustration and discussion, an example of color temperature as an environmental parameter is considered. In this case, the corresponding eye information may be sclera color, which may be characterized by a tuple (R/G, B/G), where R, G and B represent the values of the red, green and blue channels of the sclera color, respectively. Therefore, a mapping table may be defined as below (it is to be understood that all the numerical values are only exemplary).

Table 1: exemplary predefined mapping of image capture conditions to color temperatures and corresponding sclera color tuples (R/G, B/G)

It should be noted that this mapping table might also record values of other environmental parameters and corresponding eye information, where such environmental parameters are not associated with a specific image capture condition. For example, in the exemplary predefined mapping shown in Table 1, in addition to the color temperatures associated with the respective predefined image capture conditions, other color temperatures may also be recorded, for example, a color temperature of 6700K and a tuple (R/G, B/G) indicating the corresponding sclera color.

In order to adaptively calibrate the predefined mapping for a user of the electronic device 100, in step S301 of the method 300, the user's selection of a specific image capture condition is received. For the sake of discussion, it is supposed that the user selects the "cloudy" image capture condition.

In step S302, an environmental parameter associated with the image capturing condition selected by the user is determined based on the predefined mapping. In this example, it is the color temperature 6500K which is associated with the image capture condition "cloudy."

Next, in step S303, a set of candidate environmental parameters is selected within a predefined neighborhood of the environmental parameter determined in step S302. For example, centered on the environmental parameter determined in step S302, sampling may be performed within a predefined range at a given step length to obtain the candidate environmental parameters. In the example discussed, centered on the color temperature 6500K, sampling may be performed in both directions at a step length of 200K, for example. In this way, a set of candidate environmental parameters {6100K, 6300K, 6500K, 6700K, 6900K} may be obtained. It is seen that the candidate environmental parameters include the environmental parameter determined in step S302 itself. It is to be understood that this example is only for the purpose of illustration, and any other sampling manner is possible as well. The scope of the present invention is not limited in this aspect.

In step S304 of the method 300, the user's eye information is obtained using the first camera 136 of the electronic device 100. In this example, the obtained eye information at least includes a sclera color. For the sake of discussion, the eye information obtained in step S304 may be referred to as "calibration eye information." In step S305, which is essentially simultaneous with step S304, the second camera 137 is controlled to capture an image of the target scene based on each of the candidate environmental parameters determined in step S303, thereby obtaining a set of images which may be referred to as "calibration images." In other words, each calibration image is an image captured based on a different candidate environmental parameter. By way of example, given each color temperature among the candidates {6100K, 6300K, 6500K, 6700K, 6900K}, the second camera is controlled to apply the respective automatic white balance processing during image capture, thereby obtaining five calibration images.

Next, in step S306, the user's selection of one of the calibration images is received. Specifically, according to embodiments of the present invention, each resulting calibration image may be presented to the user on a user interface (UI) of the electronic device 100, such that the user can select the calibration image that he/she believes best matches the current image capture condition. It would be appreciated that the calibration image selected by the user is the image that he/she believes to have the best quality, both visually and psychologically. Such a selection reflects not only objective image quality but also, to some extent, the user's preference. In this way, while the predefined mapping is calibrated, personalized customization for a specific user is achieved as well. For the sake of discussion, it is supposed in this example that the user selects the calibration image captured based on the candidate environmental parameter of 6700K color temperature.
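A minimal sketch of the sampling rule of step S303, using the step length and neighborhood size from the example above (both of which the patent presents as merely exemplary):

    def candidate_parameters(center_k: int, step_k: int = 200,
                             n_side: int = 2) -> list:
        """Sample symmetrically around the mapped parameter, e.g.
        candidate_parameters(6500) -> [6100, 6300, 6500, 6700, 6900]."""
        return [center_k + i * step_k for i in range(-n_side, n_side + 1)]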

Then, in step S307, the eye information corresponding to the candidate environmental parameter associated with the calibration image selected by the user is determined. For the sake of discussion, the eye information determined in step S307 may be referred to as "default eye information." For example, in the example discussed, the sclera color corresponding to the candidate environmental parameter of 6700K color temperature may be determined in step S307 as the default eye information. In some embodiments, the default eye information corresponding to the candidate environmental parameter associated with the selected calibration image may already be recorded in the predefined mapping. In this case, in step S307, the default eye information may be directly read out from the predefined mapping. Alternatively, in some other embodiments, the default eye information corresponding to a certain candidate environmental parameter might not be recorded in the predefined mapping. In this case, a technique such as interpolation may be used to estimate or calculate the eye information corresponding to the candidate environmental parameter based on nearby known environmental parameters and eye information.

Next, in step S308, the predefined mapping table is calibrated based on the calibration eye information obtained in step S304 and the default eye information determined in step S307. For example, in some embodiments, an offset vector directed from the default eye information to the calibration eye information may be constructed. The calibration of the predefined mapping is then performed based on the offset vector.

Specifically, in such an embodiment, a coordinate system may be defined using the tuple (R/G, B/G) representing sclera color. For example, R/G may act as the horizontal axis, while B/G acts as the vertical axis. In this way, each tuple (R/G, B/G) represents a point in the coordinate system. Based on the tuples (R/G, B/G) indicated in the predefined mapping, a curve may be obtained through, for example, a curve fitting technique; this curve is referred to as the "mapping curve." As an example, Fig. 4A shows a mapping curve 401 generated based on the mapping table shown in Table 1 above. The calibration eye information obtained in step S304 defines point A in the coordinate system, while the default eye information obtained in step S307 defines point B in the coordinate system. In particular, as mentioned above, the default eye information is generated based on the predefined mapping; therefore, point B is located on the mapping curve 401. The vector directed from point B to point A may thus be defined as an offset vector. Based on the offset vector, as shown in Fig. 4B, the mapping curve 401 may be translated to obtain an updated mapping curve 402. Calibration of the predefined mapping may then be completed based on the updated mapping curve 402.
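A minimal sketch of this offset-vector calibration follows. It translates every stored (R/G, B/G) tuple by the vector from point B (the default eye information, on the fitted curve) to point A (the user's measured calibration eye information); the polynomial re-fit in the trailing comment is one possible curve-fitting choice, not one mandated by the patent.

    import numpy as np

    def calibrate_mapping(stored_tuples: np.ndarray,
                          point_a: np.ndarray,
                          point_b: np.ndarray) -> np.ndarray:
        """Translate the mapping curve by the offset vector directed from B to A.

        stored_tuples: N x 2 array of (R/G, B/G) points defining curve 401.
        point_a: calibration eye information measured for this user.
        point_b: default eye information read from the predefined mapping.
        """
        offset = point_a - point_b      # offset vector B -> A
        return stored_tuples + offset   # translated points define curve 402

    # The updated curve 402 can then be re-fitted for later lookups, e.g.:
    # coeffs = np.polyfit(updated[:, 0], updated[:, 1], deg=2)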

It should be understood that although calibration of the predefined mapping has been described with reference to sclera color and color temperature, the same calibration mechanism is likewise applicable to other environmental parameters and corresponding eye information, e.g., brightness and pupil ratio.

Fig. 5 shows a block diagram of an apparatus 500 for detecting an environmental parameter to control image capture according to exemplary embodiments of the present invention. As shown, according to embodiments of the present invention, the apparatus 500 comprises: an eye information obtaining unit 501 configured to obtain eye information of a user of an electronic device using a first camera on the electronic device; an environment detecting unit 502 configured to detect an environmental parameter of an environment where the user is located based on the eye information obtained by the first camera; and an image capture control unit 503 configured to, responsive to a predefined condition being satisfied, control capture of an image by a second camera on the electronic device at least in part based on the detected environmental parameter, the first camera and the second camera being located at different sides of the electronic device.

In some embodiments, the eye information includes a sclera color, and the environment detecting unit 502 may comprise a color temperature detecting unit configured to detect a color temperature of the environment based on the sclera color. In some embodiments, the eye information includes a pupil ratio, and the environment detecting unit 502 may comprise a brightness detecting unit configured to detect brightness of illumination in the environment based on the pupil ratio.

In some embodiments, the environment detecting unit 502 may be configured to determine the environmental parameter from the eye information using a predetermined mapping. In such an embodiment, the apparatus 500 may also comprise a calibrating unit configured to calibrate the predefined mapping for the user before using the predefined mapping. In some embodiments, the apparatus 500 further comprises: a user selection receiving unit configured to receive the user's selection of a calibration image from a set of calibration images, where the set of calibration images are captured using different environmental parameters under a same image capture condition; and a calibration eye information obtaining unit configured to obtain calibration eye information of the user using the first camera. In such embodiments, the calibrating unit is configured to calibrate the predefined mapping based on the user's selection of the calibration image and the calibration eye information.

In some embodiments, the predefined mapping at least indicates an environmental parameter and eye information that are associated with each of a set of image capture conditions. In such embodiments, the user selection receiving unit comprises: a first receiving unit configured to receive the user's selection of an image capture condition from the set of image capture conditions; an environmental parameter determining unit configured to determine the environmental parameter associated with the selected image capture condition based on the predefined mapping; a candidate parameter selecting unit configured to select a set of candidate environmental parameters within a predefined neighborhood of the environmental parameter; a calibration image generating unit configured to control the second camera to capture an image based on each of the set of candidate environmental parameters to generate a set of calibration images; and a second receiving unit configured to receive the user's selection of the calibration image from the set of calibration images. Accordingly, in such embodiments, the apparatus 500 further comprises a default eye information determining unit configured to determine default eye information corresponding to the candidate environmental parameter associated with the selected calibration image. The calibrating unit is configured to calibrate the predefined mapping based on the calibration eye information and the default eye information.

In some embodiments, the apparatus 500 may also comprise: a reference environment detecting unit configured to detect a reference environmental parameter of the environment based on information obtained by the second camera when capturing the image; an environmental parameter comparing unit configured to compare the environmental parameter detected based on the eye information and the reference environmental parameter; and a first condition determining unit configured to, responsive to a difference between the environmental parameter and the reference environmental parameter being lower than a predefined threshold, determine that the predefined condition is satisfied.

Alternatively or additionally, in some embodiments, the apparatus 500 may also comprise: a user input receiving unit configured to receive a user input that commands to control the image capture by the second camera based on the environmental parameter; and a second condition determining unit configured to, responsive to receiving the user input, determine that the predefined condition is satisfied. In some embodiments, in controlling the image capture by the second camera, the image capture control unit 503 may be configured to control at least one of the following: the exposure time, the sensitivity, the gain of at least one color channel, and the white balance processing.

It should be noted that, for the sake of clarity, Fig. 5 does not show some optional units or sub-units included in the apparatus 500. However, the respective features described above with reference to Figs. 1-4 are likewise applicable to the apparatus 500 and are therefore not detailed here. Moreover, the term "unit" here may be either a hardware module or a software unit module. Correspondingly, the apparatus 500 may be implemented in various manners. For example, in some embodiments, the apparatus may be implemented partially or completely by software and/or firmware, e.g., implemented as a computer program product embodied on a computer readable medium. Alternatively or additionally, the apparatus 500 may be implemented partially or completely based on hardware, e.g., implemented as an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on chip (SOC), a field programmable gate array (FPGA), and the like. The scope of the present invention is not limited in this aspect.

Only for the purpose of illustration, several exemplary embodiments of the present invention have been described above. Embodiments of the present invention may be implemented by hardware, software, or a combination of software and hardware. The hardware part may be implemented using dedicated logic; the software part may be stored in a memory and executed by an appropriate instruction executing system, e.g., a microprocessor, or by dedicatedly designed hardware. In particular, both of the methods described above with reference to Figs. 2 and 3 may be implemented as a computer program product for detecting an environmental parameter to control image capture, the computer program product being tangibly stored on a non-transient computer readable medium and including a machine executable instruction which, when executed, causes the machine to perform the steps of the method 200 and/or the method 300.

Those of ordinary skill in the art may understand that the above apparatus and method may be implemented using computer-executable instructions and/or included in processor control code. In implementation, such code is provided on a medium such as a magnetic disk, CD or DVD-ROM, a programmable memory such as a read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system of the present invention may be implemented by a very large scale integrated circuit or gate array, a semiconductor such as logic chips and transistors, or the hardware circuitry of programmable hardware devices such as field programmable gate arrays and programmable logic devices; it may also be implemented by various kinds of processor-executable software, or by a combination of the above hardware circuits and software, such as firmware.

It is to be understood that although several units or sub-units of the apparatus have been mentioned in the above detailed description, such division is merely exemplary and not mandatory. In fact, according to embodiments of the present invention, the features and functions of two or more modules described above may be embodied in one module. Conversely, the features and functions of one module described above may be embodied by a plurality of modules. In addition, although the operations of the method of the present invention are described in a specific order in the accompanying drawings, this does not require or suggest that these operations must be executed in that specific order, or that the desired result can only be achieved by executing all of the illustrated operations. On the contrary, the steps depicted in the flowcharts may change their execution order. Additionally or alternatively, some steps may be omitted, a plurality of steps may be combined into one step for execution, and/or one step may be decomposed into a plurality of steps for execution.

Although the present invention has been described with reference to several embodiments, it is to be understood that the present invention is not limited to the embodiments disclosed herein. The present invention is intended to embrace various modifications and equivalent arrangements falling within the spirit and scope of the appended claims. The scope of the appended claims accords with the broadest interpretation, thereby embracing all such modifications and equivalent structures and functions.