Title:
GESTURE RESPONSIVE IMAGE CAPTURE CONTROL AND/OR OPERATION ON IMAGE
Document Type and Number:
WIPO Patent Application WO/2013/169259
Kind Code:
A1
Abstract:
Methods, apparatuses and storage medium associated with controlling capture of images and/or processing captured images are disclosed herein. In embodiments, a storage medium may include instructions configured to enable a device having an image capture component, in response to execution of the instructions by the device, to enable the device to provide one or more services to control the image capture component responsive to a gesture, to perform a post-capture processing operation on an image captured by the image capture component responsive to the gesture, or to do both. Other embodiments may be disclosed or claimed.

Inventors:
BILGEN ARAS (US)
KELLEY SEAN V (US)
Application Number:
PCT/US2012/037388
Publication Date:
November 14, 2013
Filing Date:
May 10, 2012
Assignee:
INTEL CORP (US)
BILGEN ARAS (US)
KELLEY SEAN V (US)
International Classes:
G06F3/048; G06F3/01; G06F9/44; H04N5/232
Domestic Patent References:
WO 2007/000743 A2 (2007-01-04)
Foreign References:
US 2011/0013049 A1 (2011-01-20)
US 2009/0309997 A1 (2009-12-17)
US 2011/0261213 A1 (2011-10-27)
US 2006/0026521 A1 (2006-02-02)
US 2010/0020221 A1 (2010-01-28)
Other References:
See also references of EP 2847649A4
Attorney, Agent or Firm:
FOX, Ryan C. (Williamson & Wyatt P.C., Pacwest Center, 1211 SW 5th Avenue, Suite 1500-200, Portland, Oregon, US)
Claims

What is claimed is:

1. At least one non-transitory computer-readable storage medium having a plurality of instructions configured to enable a device having an image capture component, in response to execution of the instructions by the device, to enable the device to provide one or more services to control a setting of the image capture component responsive to a gesture, to perform a post-capture processing operation on an image captured by the image capture component responsive to the gesture, or to do both.

2. The at least one computer-readable storage medium of claim 1, wherein the one or more services to control a setting of the image capture component include one or more services to control a zoom factor of the image capture component responsive to the gesture.

3. The at least one computer-readable storage medium of claim 2, wherein the one or more services to control a zoom factor of the image capture component is configured to control the zoom factor responsive to a finger span of a pinch gesture.

4. The at least one computer-readable storage medium of any one of claims 1-3, wherein the one or more services to perform a post-capture processing operation on an image captured by the image capture component include one or more services to crop an image captured by the image capture component responsive to the gesture.

5. The at least one computer-readable storage medium of claim 4, wherein the one or more services to crop an image captured by the image capture component is configured to crop an image captured by the image capture component responsive to a center of a finger span of a pinch gesture.

6. A method for operating a device having an image capture component, comprising: receiving, by the device, input of a gesture; and

controlling the image capture component responsive to the gesture, performing a post-capture processing operation on an image captured by the image capture component responsive to the gesture, or doing both.

7. The method of claim 6, wherein controlling the image capture component includes controlling a setting of the image capture component responsive to the gesture.

8. The method of claim 7, wherein controlling a setting of the image capture component includes controlling a zoom factor of the image capture component responsive to the gesture.

9. The method of claim 8, wherein controlling a zoom factor of the image capture component comprises controlling the zoom factor responsive to a finger span of a pinch gesture.

10. The method of any one of claims 6 - 9, wherein performing a post-capture processing operation on an image captured by the image capture component includes cropping an image captured by the image capture component responsive to the gesture.

11. The method of claim 10, wherein cropping an image captured by the image capture component comprises cropping an image captured by the image capture component responsive to a center of a finger span of a pinch gesture.

12. An apparatus comprising:

an image capture component configured to capture an image; and

one or more logic units coupled with the image capture component, and configured to control a setting of the image capture component responsive to a gesture, perform a post-capture processing operation on an image captured by the image capture component responsive to the gesture, or do both.

13. The apparatus of claim 12, wherein the one or more logic units are configured to control a zoom factor of the image capture component responsive to the gesture.

14. The apparatus of claim 13, wherein the one or more logic units are configured to control the zoom factor responsive to a finger span of a pinch gesture.

15. The apparatus of any one of claims 12-14, wherein the one or more logic units are configured to crop an image captured by the image capture component responsive to the gesture.

16. The apparatus of claim 15, wherein the one or more logic units is configured to crop an image captured by the image capture component responsive to a center of a finger span of a pinch gesture.

17. The apparatus of claim 12, wherein the one or more logic units comprises a gesture detector configured to map a finger span of a pinch gesture to a zoom setting of the image capture component.

18. The apparatus of claim 12, wherein the one or more logic units comprises a gesture detector configured to map a center of a finger span of a pinch gesture to a center of cropping for an image captured by the image capture component.

19. The apparatus of claim 18, wherein the one or more logic units comprises an image stream processor coupled with the gesture detector, and configured to cooperate with the gesture detector to provide an image of a current view of the image capture component for a user of the device to use to gesture a control of the image capture component.

20. The apparatus of claim 19, wherein the one or more logic units comprises an image stream processor coupled with gesture detector, and configured to cooperate with the gesture detector to provide an image of a current view of the image capture component for a user of the device to use to gesture an operation to be performed on an image captured by the image capture component.

21. The apparatus of claim 20, wherein the image stream processor is configured to crop a first image captured by the image capture component in response to a gesture of a cropping operation to be performed on the first image.

22. The apparatus of any one of claims 17-21, wherein the one or more logic units further comprises an image capture controller coupled with the image stream processor, wherein the image stream processor is configured to provide the control to the image capture controller, and the image capture controller is configured to provide the control to the image capture component.

23. The apparatus of claim 22, wherein the apparatus is a selected one of a camera, a smartphone, a computing tablet, or a laptop computer.

24. The apparatus of any one of claim 15, wherein the apparatus is a selected one of a camera, a smartphone, a computing tablet, or a laptop computer.

25. An apparatus comprising:

an image capture component configured to capture an image;

a gesture detector configured to map a gesture to a control setting of the image capture component, to an operation to be performed on an image captured by the image capture component, or to both;

an image stream processor coupled with the gesture detector, and configured to cooperate with the gesture detector to provide an image of a current view of the image capture component for a user of the device to use to gesture the control setting, the operation, or both; and

an image capture controller coupled with the image stream processor, and configured to provide the image stream processor with the current view, and to control a setting of the image capture component responsive to the control setting provided by the image stream processor, wherein the image stream processor is further configured to provide the control setting to the image capture controller responsive to a mapping to the control setting by the gesture detector.

26. The apparatus of claim 25, wherein the control setting comprises a zoom setting, or the operation comprises a cropping operation.

27. The apparatus of any one of claims 25-26, wherein the apparatus is a selected one of a camera, a smartphone, a computing tablet, or a laptop computer.

28. An apparatus comprising:

an image capture component configured to capture an image;

a gesture detector configured to map a finger span of a pinch gesture to a zoom setting of the image capture component, and to map a center of the finger span of the pinch gesture to a center of cropping for an image captured by the image capture component;

an image stream processor coupled with the gesture detector, and configured to cooperate with the gesture detector to provide an image of a current view of the image capture component for a user of the device to use to gesture the zoom setting and the cropping operation; and

an image capture controller coupled with the image stream processor, and configured to provide the image stream processor with the current view, and to control the image capture component responsive to the zoom setting provided by the image stream processor, wherein the image stream processor is further configured to provide the zoom setting to the image capture controller responsive to a mapping to the zoom setting by the gesture detector;

wherein the apparatus is a selected one of a camera, a smartphone, a computing tablet, or a laptop computer.

Description:
Gesture Responsive Image Capture Control and/or Operation on Image

Technical Field

This application relates to the technical field of imaging, more specifically to methods and apparatuses associated with gesture responsive image capture control and/or operation on a captured image.

Background

The background description provided herein is for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.

In recent years, digital cameras, from low-end ultra-compact cameras to high-end professional single-lens reflex (SLR) cameras, have all seen major technological advances, e.g., higher resolution image sensors, better focus metering, and so forth. A number of these advances have also found their way into mobile devices, such as smartphones. Today, photo capturing, viewing and sharing are major usage scenarios of mobile devices. Many mobile devices allow for very compelling, user-friendly interactions with captured images. Users can zoom in on captured images with their fingers, flick through huge image collections quickly, and manipulate the images using gestures. However, such compelling, user-friendly interactions are typically limited to captured-image viewing on mobile devices.

BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present invention will be described by way of exemplary embodiments, but not limitations, illustrated in the accompanying drawings in which like references denote similar elements, and in which:

Figure 1 illustrates an example image capturing device equipped with gesture responsive image capture control and/or operation on captured images;

Figure 2 illustrates an imaging method using the image capturing device of Figure 1;

Figure 3 illustrates an example image of a current view for a user to gesture a control and/or an image operation; and

Figure 4 illustrates an example non-transitory computer-readable storage medium having instructions configured to enable practice of all or selected aspects of the method of Figure 2; all arranged in accordance with embodiments of the present disclosure.

DETAILED DESCRIPTION

Methods, apparatuses and storage medium associated with gesture responsive image capture control and/or operation on images are described herein.

Various aspects of the illustrative embodiments will be described using terms commonly employed by those skilled in the art to convey the substance of their work to others skilled in the art. However, it will be apparent to those skilled in the art that alternate embodiments may be practiced with only some of the described aspects. For purposes of explanation, specific numbers, materials, and configurations are set forth in order to provide a thorough understanding of the illustrative embodiments. However, it will be apparent to one skilled in the art that alternate embodiments may be practiced without the specific details. In other instances, well-known features are omitted or simplified in order not to obscure the illustrative embodiments.

Various operations will be described as multiple discrete operations, in turn, in a manner that is most helpful in understanding the illustrative embodiments; however, the order of description should not be construed to imply that these operations are necessarily order dependent. In particular, these operations need not be performed in the order of presentation. Further, descriptions of operations as separate operations should not be construed as requiring that the operations be necessarily performed independently and/or by separate entities. Descriptions of entities and/or modules as separate modules should likewise not be construed as requiring that the modules be separate and/or perform separate operations. In various embodiments, illustrated and/or described operations, entities, data, and/or modules may be merged, broken into further sub-parts, and/or omitted.

The phrase "in one embodiment" or "in an embodiment" is used repeatedly. The phrase generally does not refer to the same embodiment; however, it may. The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "A and/or B" means "(A), (B), or (A and B)". The phrase "at least one of A, B and C" means "(A), (B), (C), (A and B), (A and C), (B and C) or (A, B and C)'\ Figure 1 illustrates an example image capturing device equipped with gesture response image capture control and/or operation on images, in accordance with various embodiments of the present disclosure. As illustrated, embodiments of device 100 may include gesture detector 102, image stream processor 104, image capture controller 106, image capture component 108 and touch-sensitive display 110, coupled with one another as shown. Collectively, these elements may enable a user of device 100 to control one or more image capture settings of image capture component 108, and/or having one or more operations performed on images captured by image capture component 108, using gestures 120 inputted through, e.g., touch-sensitive display 110.

An example of an image capture setting may include, but is not limited to, a zoom setting. An example of an operation on images may include, but is not limited to, a cropping operation.

In embodiments, gesture detector 102 may be configured to receive gestures 120, and in response, map gestures 120 to a control setting for image capture component 108, and/or an operation to be performed on an image captured by image capture component 108. In embodiments, gestures 120 may be a pinch gesture inputted, e.g., through touch-sensitive display 110. More specifically, gesture detector 102 may map the finger span of the pinch gesture to a zoom setting for image capture component 108. Additionally or alternatively, gesture detector 102 may also map the center of the finger span of the pinch gesture to a cropping operation to be performed on an image captured by image capture component 108. In embodiments, gesture detector 102 may be configured with a storage medium, e.g., non-volatile memory, to store a table of mappings. In embodiments, gesture detector 102 may be configured with an interface and logic to facilitate registration of these mappings programmatically or by a user. Gesture detector 102 may be implemented in hardware, software, or a combination of both.
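By way of illustration only, and not as part of the claimed subject matter, the gesture-to-setting mapping described above might be sketched in software roughly as follows. The class names, the normalized display coordinates, and the linear span-to-zoom scaling are assumptions made for this example, not details taken from the disclosure:

```python
# Illustrative sketch only: a hypothetical gesture detector that maps a pinch
# gesture's finger span to a zoom setting, and the center of the span to a
# crop center. Names, coordinate conventions and the linear scaling are
# assumptions made for demonstration purposes.
from dataclasses import dataclass


@dataclass
class PinchGesture:
    x1: float  # first finger position, in normalized display coordinates (0..1)
    y1: float
    x2: float  # second finger position
    y2: float


class GestureDetector:
    def __init__(self, min_zoom: float = 1.0, max_zoom: float = 8.0):
        self.min_zoom = min_zoom
        self.max_zoom = max_zoom

    def map_to_zoom(self, g: PinchGesture) -> float:
        # Finger span (Euclidean distance between the two fingers), clamped to
        # [0, 1] and scaled linearly into the supported zoom range.
        span = ((g.x2 - g.x1) ** 2 + (g.y2 - g.y1) ** 2) ** 0.5
        span = min(max(span, 0.0), 1.0)
        return self.min_zoom + span * (self.max_zoom - self.min_zoom)

    def map_to_crop_center(self, g: PinchGesture) -> tuple:
        # The center of the finger span becomes the center of the cropping
        # operation to be performed on the captured image.
        return ((g.x1 + g.x2) / 2.0, (g.y1 + g.y2) / 2.0)
```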

In embodiments, image stream processor 104 may be configured to present an image of a current view 122 as seen by image capture component 108 for display on touch-sensitive display 110, for a user to interact with, e.g., through gesture detector 102. In embodiments, as described earlier, the interactions may include, but are not limited to, gesturing a control setting for image capture component 108, and/or gesturing operations to be performed on a captured image. In embodiments, for the former types of interactions, image stream processor 104 may be configured to forward the gestured control setting to image capture controller 106. For the latter types of interactions, image stream processor 104 may be configured to perform the gestured operation on the presented image, after capture. In embodiments, image stream processor 104 may be implemented in hardware, software, or a combination of both.
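Again purely as an illustrative sketch, with hypothetical class and method names rather than the disclosed implementation, the two roles of the image stream processor described above, forwarding a gestured control setting and performing a gestured post-capture operation, could look roughly like this:

```python
# Illustrative sketch only: a hypothetical image stream processor that forwards
# a gestured control setting to the image capture controller and applies a
# gestured cropping operation to an image after capture.
class ImageStreamProcessor:
    def __init__(self, capture_controller):
        self.capture_controller = capture_controller

    def present_current_view(self, frame):
        # On a real device this would push the frame to the touch-sensitive
        # display; here it simply returns the frame.
        return frame

    def forward_control_setting(self, zoom_factor: float) -> None:
        # Control-type interactions are forwarded to the image capture controller.
        self.capture_controller.set_zoom(zoom_factor)

    def crop_after_capture(self, image, center, size: int):
        # Post-capture operation: crop 'image' (a list of pixel rows) to a
        # size x size window around 'center' (given as column, row indices).
        cx, cy = center
        half = size // 2
        rows = image[max(0, cy - half): cy + half]
        return [row[max(0, cx - half): cx + half] for row in rows]
```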

Figure 3 illustrates an example image 302 of current view 122 of objects 124 for a user to gesture 120 image capture controls and/or operations on captured images.

Referring back to Figure 1, in embodiments, image capture controller 106 may be configured to control image capture component 108, including controlling operations as well as operational settings of image capture component 108. In embodiments, image capture controller 106 may be configured to change the control setting of image capture component 108, in response to the control setting information/commands forwarded from image stream processor 104, including the gestured control setting image stream processor 104 forwards on behalf of gesture detector 102. In embodiments, image capture controller 106 may be implemented in hardware, software, or a combination of both.
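A correspondingly minimal sketch of the image capture controller role described above, again with hypothetical names and with a simple settings dictionary standing in for the capture component's real configuration interface, might be:

```python
# Illustrative sketch only: a hypothetical image capture controller that applies
# a forwarded control setting (here, a zoom factor) to a camera-like capture
# component and triggers capture on request.
class ImageCaptureController:
    def __init__(self, capture_component):
        self.capture_component = capture_component

    def set_zoom(self, zoom_factor: float) -> None:
        # Clamp the requested setting and push it down to the capture component.
        self.capture_component.settings["zoom"] = max(1.0, zoom_factor)

    def capture(self):
        # Ask the capture component for an image of its current view.
        return self.capture_component.capture_image()
```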

In embodiments, image capture component 108 may include various optics and/or sensor components configured to provide a current view 122 of the environment external to device 100. The environment may include a number of objects 124 of interest, e.g., people, buildings, geographic features and so forth. Touch-sensitive display 110 may be configured to display an image of the current view, facilitate user interactions with user interface elements, and forward the user interactions to appropriate drivers or handlers of device 100 to process, including forwarding user gestures to gesture detector 102, as described earlier. In embodiments, image capture component 108 and touch-sensitive display 110 represent a broad range of these elements known in the art.

As described earlier, gesture detector 102, image stream processor 104 and image capture controller 106 may be implemented in hardware, software, or a combination of both. In embodiments, one or more of gesture detector 102, image stream processor 104 and image capture controller 106 may be implemented with programming instructions configured to be executed by a processor (not shown) of device 100. The processor may be any one of a wide range of digital signal processors, graphics processors, or general-purpose microprocessors known in the art.

Examples of device 100 may include, but are not limited to, ultra-compact cameras, point-and-shoot cameras, compact SLR cameras, medium SLR cameras, professional SLR cameras, smartphones, personal digital assistants, computing tablets, laptop computers, and so forth. Accordingly, in embodiments, device 100 may include other applications and/or components 112. For example, in the case of cameras, other applications and/or components 112 may include other photography related applications and/or components. In the case of smartphones, other applications and/or components 112 may, e.g., include telephony related applications and/or components. In the case of computing tablets, other applications and/or components 112 may, e.g., include wireless communication applications and/or components.

Referring now to Figure 2, wherein an imaging method, in accordance with various embodiments of the present disclosure, is illustrated. As shown, method 200 may start at block 202. At block 202, device 100 may be positioned and/or pointed in a particular direction to acquire a current view of the external environment, e.g., by a user of device 100 or using a positioning device (not shown). In response, an image of the current view for a user of device 100 to interact may be displayed (e.g., by image stream processor 104).

From block 202, method 200 may proceed to block 204 or block 212. At block 204, a user or a machine may interact with the displayed image, including making gestures relative to the displayed image, e.g., a pinch gesture. The gestures may be detected, e.g., by gesture detector 102. Block 212 will be further described later.

From block 204, method 200 may proceed to block 206 and/or block 208. At blocks 206 and/or 208, the gestures may be mapped to a control setting for image capture component 108 or to an operation to be performed on the captured image of the current view. The mapping, e.g., may be performed by gesture detector 102, including using, e.g., a table of registered mappings.

From block 206, the method may proceed to block 210. At block 210, after a gesture is mapped to a control setting for image capture component 108, the control setting may be changed or re-configured (e.g., by image capture controller 106).

At block 212, an image of the current view may be captured. The image may be captured, e.g., by image capture component 108, in response to a capture instruction from a user or a machine.

From blocks 208 and 212, method 200 may proceed to block 214. At block 214, an image processing operation may be performed on the captured image (e.g., by image stream processor 104). The image processing operation may, e.g., be a mapped cropping operation, in accordance with a gesture.
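Tying the illustrative sketches above together, the flow of blocks 202-214 might be exercised roughly as follows. The CaptureComponent stand-in, the concrete values, and the driver code are assumptions made for demonstration, and this sketch presupposes the GestureDetector, ImageStreamProcessor and ImageCaptureController sketches given earlier:

```python
# Illustrative sketch only: wiring the earlier hypothetical classes together to
# walk through blocks 202-214 of method 200. The CaptureComponent stand-in and
# the concrete values are assumptions made for demonstration.
class CaptureComponent:
    def __init__(self):
        self.settings = {"zoom": 1.0}

    def current_view(self):
        # Stand-in for the optics/sensor pipeline: an 8x8 grid of pixel values.
        return [[row * 8 + col for col in range(8)] for row in range(8)]

    def capture_image(self):
        return self.current_view()


if __name__ == "__main__":
    component = CaptureComponent()
    controller = ImageCaptureController(component)
    processor = ImageStreamProcessor(controller)
    detector = GestureDetector()

    # Block 202: display an image of the current view for the user to interact with.
    view = processor.present_current_view(component.current_view())

    # Block 204: the user makes a pinch gesture on the displayed image.
    pinch = PinchGesture(x1=0.3, y1=0.3, x2=0.7, y2=0.7)

    # Blocks 206 and 210: map the finger span to a zoom setting and apply it.
    processor.forward_control_setting(detector.map_to_zoom(pinch))

    # Blocks 208, 212 and 214: capture an image, then crop it around the
    # center of the finger span mapped by the gesture detector.
    captured = controller.capture()
    cx, cy = detector.map_to_crop_center(pinch)
    cropped = processor.crop_after_capture(captured, (int(cx * 8), int(cy * 8)), 4)
    print(component.settings["zoom"], len(cropped), len(cropped[0]))
```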

Accordingly, embodiments of the present disclosure advantageously enable two primary operations for capturing an image, zooming and framing, to be fused into a single, easy-to-understand operation during viewing. As those of ordinary skill in the art would appreciate, isolating an area of interest within a current view and properly framing the scene are primary activities in capturing images. Current camera subsystems do not offer user-friendly features to enable these activities to be performed together. Typically, these activities are performed separately.

Figure 4 illustrates an example non-transitory computer-readable storage medium having instructions configured to enable practice of all or selected aspects of the method of Figure 2, in accordance with various embodiments of the present disclosure. As illustrated, non-transitory computer-readable storage medium 402 may include a number of programming instructions 404. Programming instructions 404 may be configured to enable a device, e.g., device 100 (embodiments with appropriate processors), in response to execution of the programming instructions, to perform the device related operations of method 200, earlier described with references to Figure 2. In alternate embodiments, programming instructions 404 may be disposed on multiple non-transitory computer-readable storage media 402 instead. In various embodiments, the programming instructions may be configured to implement gesture detector 102, image stream processor 104, and/or image capture controller 106.

For some of these embodiments, at least one of the processor(s) may be packaged together with programming instructions 404 to practice some or all aspects of method 200. For some of these embodiments, at least one of the processor(s) may be packaged together with programming instructions 404 to practice some or all aspects of method 200 to form a System in Package (SiP). For some of these embodiments, at least one of the processor(s) may be integrated on the same die with programming instructions 404 to practice some or all aspects of method 200. For some of these embodiments, at least one of the processor(s) may be integrated on the same die with programming instructions 404 to practice some or all aspects of method 200 to form a System on Chip (SoC). For at least one embodiment, the SoC may be utilized in a smartphone, cell phone, tablet, or other mobile device.

Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a wide variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described, without departing from the scope of the embodiments of the present disclosure. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifestly intended that the embodiments of the present disclosure be limited only by the claims thereof.

In embodiments, at least one non-transitory computer-readable storage medium is provided with a plurality of instructions configured to enable a device having an image capture component, in response to execution of the instructions by the device, to enable the device to provide one or more services to control a setting of the image capture component responsive to a gesture, to perform a post-capture processing operation on an image captured by the image capture component responsive to the gesture, or to do both.

In embodiments, the one or more services to control a setting of the image capture component include one or more services to control a zoom factor of the image capture component responsive to the gesture. In embodiments, the one or more services to control a zoom factor of the image capture component is configured to control the zoom factor responsive to a finger span of a pinch gesture. In embodiments, the one or more services to perform a post-capture processing operation on an image captured by the image capture component include one or more services to crop an image captured by the image capture component responsive to the gesture. In embodiments, the one or more services to crop an image captured by the image capture component is configured to crop an image captured by the image capture component responsive to a center of a finger span of a pinch gesture.

In embodiments, a method for operating a device having an image capture component includes receiving, by the device, input of a gesture; and controlling a setting of the image capture component responsive to the gesture, performing a post-capture processing operation on an image captured by the image capture component responsive to the gesture, or doing both.

In embodiments, controlling a setting of the image capture component includes controlling a zoom factor of the image capture component responsive to the gesture. In embodiments, controlling a zoom factor of the image capture component comprises controlling the zoom factor responsive to a finger span of a pinch gesture. In embodiments, performing a post-capture processing operation on an image captured by the image capture component includes cropping an image captured by the image capture component responsive to the gesture. In embodiments, cropping an image captured by the image capture component comprises cropping an image captured by the image capture component responsive to a center of a finger span of a pinch gesture.

In embodiments, an apparatus for imaging includes an image capture component configured to capture an image; and one or more logic units coupled with the image capture component, and configured to control the image capture component responsive to a gesture, perform a post-capture processing operation on an image captured by the image capture component responsive to the gesture, or do both.

In embodiments, the one or more logic units are configured to control a setting of the image capture component responsive to a gesture. In embodiments, the one or more logic units are configured to control a zoom factor of the image capture component responsive to the gesture. In embodiments, the one or more logic units are configured to control the zoom factor responsive to a finger span of a pinch gesture. In embodiments, the one or more logic units are configured to crop an image captured by the image capture component responsive to the gesture. In embodiments, the one or more logic units are configured to crop an image captured by the image capture component responsive to a center of a finger span of a pinch gesture.

In embodiments, the one or more logic units comprise a gesture detector configured to map a finger span of a pinch gesture to a zoom setting of the image capture component. In embodiments, the one or more logic units comprise a gesture detector configured to map a center of a finger span of a pinch gesture to a center of cropping for an image captured by the image capture component. In embodiments, the one or more logic units comprise an image stream processor coupled with the gesture detector, and configured to cooperate with the gesture detector to provide an image of a current view of the image capture component for a user of the device to use to gesture a control of the image capture component. In embodiments, the one or more logic units comprise an image stream processor coupled with the gesture detector, and configured to cooperate with the gesture detector to provide an image of a current view of the image capture component for a user of the device to use to gesture an operation to be performed on an image captured by the image capture component. In embodiments, the image stream processor is configured to crop a first image captured by the image capture component in response to a gesture of a cropping operation to be performed on the first image. In embodiments, the one or more logic units further comprise an image capture controller coupled with the image stream processor, wherein the image stream processor is configured to provide the control to the image capture controller, and the image capture controller is configured to provide the control to the image capture component.

In embodiments, the apparatus is a selected one of a camera, a smartphone, a computing tablet, or a laptop computer.