Title:
ASYMMETRIC SENSOR ARRAY FOR CAPTURING IMAGES
Document Type and Number:
WIPO Patent Application WO/2015/010098
Kind Code:
A1
Abstract:
This document describes techniques and apparatuses for implementing an asymmetric sensor array for capturing images. These techniques and apparatuses enable better resolution, depth of color, or low-light sensitivity than many conventional sensor arrays.

Inventors:
STETSON PHILIP SEAN (US)
NEIFELD MARK A (US)
MUSATENKO YURIY (US)
Application Number:
PCT/US2014/047313
Publication Date:
January 22, 2015
Filing Date:
July 18, 2014
Assignee:
GOOGLE INC (US)
International Classes:
H04N5/225; H04N5/369; H04N9/09
Domestic Patent References:
WO2009151903A2 (2009-12-17)
Foreign References:
EP2472581A2 (2012-07-04)
US20050134712A1 (2005-06-23)
EP2451150A2 (2012-05-09)
US20080151084A1 (2008-06-26)
EP1978740A1 (2008-10-08)
US7507944B1 (2009-03-24)
EP2388987A1 (2011-11-23)
Other References:
HONG HUA ET AL: "Dual-sensor foveated imaging system", APPLIED OPTICS, OPTICAL SOCIETY OF AMERICA, WASHINGTON, DC, US, vol. 47, no. 3, 20 January 2008 (2008-01-20), pages 317-327, XP001511036, ISSN: 0003-6935, DOI: 10.1364/AO.47.000317
Attorney, Agent or Firm:
COLBY, Michael K. (601 W. Main Avenue, Suite 130, Spokane, WA, US)
Claims:
CLAIMS

What is claimed is:

1. An asymmetric sensor array comprising:

a main sensor having a main resolution and angled at a main angle; and multiple peripheral sensors having peripheral resolutions, the peripheral resolutions asymmetric to the main resolution, at least one of the multiple peripheral sensors positioned at a peripheral angle different from the main angle of the main sensor.

2. The asymmetric sensor array as recited in claim 1, wherein the main resolution is a first number of pixels and the peripheral resolutions are each a second number of pixels, the first number of pixels being larger than the second number of pixels.

3. The asymmetric sensor array as recited in claim 1, wherein the main sensor includes a first size of pixels and the peripheral sensors include a second size of pixels, the first size smaller than the second size.

4. The asymmetric sensor array as recited in claim 1, wherein the peripheral angle of the at least one of the multiple peripheral sensors enables a depth of image to be created for an image of a scene based on peripheral data of the scene sensed by the at least one of the peripheral sensors and main data of the same scene sensed by the main sensor.

5. The asymmetric sensor array as recited in claim 1, wherein the main sensor is monochrome with a clear color filter.

6. The asymmetric sensor array as recited in claim 1, wherein the main sensor includes a filter permitting infrared radiation to be sensed by the main sensor.

7. The asymmetric sensor array as recited in claim 1, wherein the main sensor is centered between the peripheral sensors.

8. The asymmetric sensor array as recited in claim 7, wherein the peripheral sensors include two or four peripheral sensors.

9. The asymmetric sensor array as recited in claim 1, wherein the peripheral sensors are Bayer sensors.

10. An imaging device comprising:

an imager, the imager comprising:

an asymmetric sensor array having a main sensor and two or more peripheral sensors; and

a lens stack for each of the main and peripheral sensors;

one or more computer processors; and

one or more computer-readable storage media having instructions stored thereon that, responsive to execution by the one or more computer processors, implement an image manager capable of performing operations comprising:

receiving, from the main sensor, sensor data, the sensor data including a high-resolution, monochromatic image of a scene;

receiving, from the peripheral sensors, peripheral sensor data, the peripheral sensor data including multiple low-resolution color images of the scene, at least one of the multiple low-resolution color images being sensed at an angle different from an angle of reception of the sensor data of the main sensor;

determining, based on at least one of the multiple low-resolution color images, a depth map; and

constructing a final image using the low-resolution color images, the depth map, and the monochromatic high-resolution image.

11. The imaging device of claim 10, wherein the final image includes the high resolution of the high-resolution, monochromatic image and the colors of the multiple low-resolution color images.

12. The imaging device of claim 10, wherein pixels of the peripheral sensors are larger than pixels of the main sensor.

13. The imaging device of claim 10, wherein the imaging device is capable of constructing, without a focusing mechanism, the final image in focus for objects of a scene that are beyond two meters from the imager.

14. The imaging device of claim 10, further comprising a near-far toggle focus system, the near-far toggle focus system effective to enable the image manager to construct the final image in focus, the focus on objects between one and two meters from the imager or beyond two meters from the imager.

15. An imager comprising:

a main sensor having a main lens stack, the main sensor having a main resolution and angled at a main angle; and

multiple peripheral sensors having respective peripheral lens stacks and peripheral resolutions, the peripheral resolutions asymmetric to the main resolution, at least one of the multiple peripheral sensors positioned at a peripheral angle different from the main angle of the main sensor.

16. The imager of claim 15, wherein the main sensor and the multiple peripheral sensors are within a single die or substrate.

17. The imager of claim 15, wherein the main lens stack includes an auto-focus device capable of determining a focus of the main sensor in part using depth data captured by the peripheral sensors.

18. The imager of claim 15, wherein the imager is capable of focusing, without an auto-focus mechanism, on objects in scenes beyond about two meters from the main lens.

19. The imager of claim 15, wherein the main sensor is a monochromatic sensor and the peripheral sensors are color sensors.

20. The imager of claim 19, wherein the main sensor has both a higher number and smaller size of pixels than each of the peripheral sensors.

Description:
ASYMMETRIC SENSOR ARRAY FOR CAPTURING IMAGES

PRIORITY APPLICATION

[0001] This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 61/856,449, entitled "Asymmetric Array Camera" and filed on July 19, 2013, the disclosure of which is incorporated in its entirety by reference herein.

BACKGROUND

[0002] This background description is provided for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, material described in this section is neither expressly nor impliedly admitted to be prior art to the present disclosure or the appended claims.

[0003] Current sensor arrays for capturing images have partially addressed the need for a small form factor in the Z dimension for cameras and other imaging devices. These conventional sensor arrays, however, have various limitations. First, images captured by each sensor of the array must be combined in some manner through computational effort to construct the final image, which has varied success and requires computing resources. Second, this construction of the final image can be scene-dependent, meaning that some scenes result in relatively poor image quality. Third, these conventional sensor arrays often struggle to provide high-resolution images, especially if there are any flaws in the sensors or lenses.

BRIEF DESCRIPTION OF THE DRAWINGS

[0004] Apparatuses of and techniques using an asymmetric sensor array for capturing images are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:

Figure 1 illustrates an example environment in which an asymmetric sensor array for capturing images can be enabled.

Figure 2 illustrates an example of an asymmetric sensor array of figure 1, shown in both cross-section and plan views.

Figure 3 illustrates alternative asymmetric sensor arrays, all shown in plan view.

Figure 4 illustrates lens stacks of different Z-heights relative to sensor sizes of sensors in an asymmetric sensor array.

Figure 5 illustrates the imaging device of figure 1 in greater detail.

Figure 6 illustrates example methods that use an asymmetric sensor array to capture images and, with those images, create a final image.

Figure 7 illustrates various components of an electronic device that can implement an asymmetric sensor array for capturing images in accordance with one or more embodiments.

DETAILED DESCRIPTION

[0005] Conventional sensor arrays use an array of equivalent image sensors to realize a final image. These sensor arrays enable a camera to have a low Z-height relative to the quality of the final image. Compared to a single sensor that provides a similar image quality, for example, sensor arrays have a low Z-height. This is due to a relationship between sensor size and the Z-height of the lens that focuses the image onto the sensor. Thus, a four-megapixel single sensor requires, assuming similar lens characteristics, a much taller Z-height than an array of four one-megapixel sensors. Each of the four one-megapixel sensors is smaller and thus uses a shorter Z-height.
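To make this scaling concrete, the following sketch (an editorial illustration, not part of the original disclosure) computes relative Z-heights under the stated assumption that Z-height is roughly proportional to the sensor's linear dimension for similar lens characteristics; the pixel pitch and focal ratio are arbitrary example values:

```python
import math

def z_height_mm(pixel_count, pixel_pitch_um, focal_ratio=1.0):
    """Approximate lens Z-height as proportional to the sensor's linear
    dimension (square sensor, similar lens characteristics assumed)."""
    side_px = math.sqrt(pixel_count)                  # pixels per side
    sensor_width_mm = side_px * pixel_pitch_um / 1000.0
    return focal_ratio * sensor_width_mm

# One 4-megapixel sensor versus four 1-megapixel sensors at the same pitch:
single = z_height_mm(4_000_000, pixel_pitch_um=1.4)   # ~2.8 mm
arrayed = z_height_mm(1_000_000, pixel_pitch_um=1.4)  # ~1.4 mm
print(single, arrayed)  # each smaller sensor needs roughly half the Z-height
```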

[0006] These conventional sensor arrays, however, have various limitations, such as failing to realize sharp optics, depth of color, scene-independent image reconstruction, or low-light sensitivity.

[0007] Consider, for example, a conventional sensor array having a 2 x 2 grid of sensors, the sensors having red, green, green, and blue pixels to capture images. Each of the four sensors in the array includes small repeating squares having four pixels each: one pixel that senses red, one blue, and two green. The two green pixels are used to determine resolution (e.g., sharpness) in addition to the color green. Mathematically, a one-megapixel sensor is then capable of one-half-megapixel resolution. Through various computational processes, which are not the topic of this disclosure, this one-half-megapixel resolution can be interpolated to improve the resolution (again, with varied success) by about 20% of the sensor's full pixel count. Thus, the one-megapixel red, green, green, blue sensor can result in a final resolution of about 0.7 megapixels, though this final resolution has limitations as noted above.
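Stated directly, the arithmetic above works out as follows; note that reading the 20% interpolation gain as a fraction of the sensor's full pixel count is an editorial interpretation chosen to match the 0.7-megapixel figure:

```python
sensor_mp = 1.0                     # one-megapixel RGGB Bayer sensor
luma_mp = sensor_mp / 2.0           # only half the pixels carry sharpness detail
interp_gain_mp = 0.20 * sensor_mp   # interpolation recovers ~20% of full count
final_mp = luma_mp + interp_gain_mp
print(final_mp)                     # 0.7 -> ~0.7 effective megapixels
```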

[0008] To maximize this resolution, conventional sensor arrays use small color pixels to increase the number of pixels in a sensor and thus keep the sensor size down, which in turn keeps the Z-height relatively low. Small color pixels, however, often fail to handle noise well, as each pixel's light-gathering ability is limited by its size, and thus small pixels have poorer signal-to-noise ratios than large pixels. Conventional sensor arrays often forgo use of large pixels, however, because doing so increases the Z-height or reduces the final resolution of the image.
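This pixel-size trade-off follows from photon shot noise, where SNR = N/sqrt(N) = sqrt(N) for N collected photons, and N scales with pixel area. A minimal sketch of that relationship, with illustrative numbers not taken from the disclosure:

```python
import math

def shot_noise_snr_db(pixel_pitch_um, photons_per_um2):
    """Shot-noise-limited SNR: signal N over noise sqrt(N), i.e., sqrt(N),
    where N is proportional to pixel area."""
    n = photons_per_um2 * pixel_pitch_um ** 2
    return 20.0 * math.log10(math.sqrt(n))

# Doubling the pixel pitch quadruples the area and adds ~6 dB of SNR:
print(shot_noise_snr_db(1.1, 100.0))  # small main-sensor pixel: ~20.8 dB
print(shot_noise_snr_db(2.2, 100.0))  # large peripheral pixel:  ~26.8 dB
```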

[0009] Consider instead, however, an example asymmetric sensor array for capturing images. This asymmetric sensor array, instead of using small color pixels and equivalent sensors, uses a central monochrome sensor for resolution and peripheral, relatively large-pixel color sensors for color. The central monochrome sensor provides high resolution using small pixels. The peripheral, large-pixel color sensors provide color and, due to their pixel size, have excellent signal-to-noise ratios, and thus provide truer color, better color in low-light situations, or other benefits described below. While these peripheral color sensors have lower resolution than the central sensor, the human eye distinguishes less detail in color than it does in greyscale (e.g., the image's resolution or sharpness). Therefore, this asymmetric sensor array provides a final image that conforms to the human eye's characteristics: high sharpness and truer color, as well as less sensitivity to low-light and other adverse scene characteristics.

[0010] The following discussion first describes an operating environment, then example asymmetric sensor arrays, then a detailed description of an example imaging device, followed by techniques that may be employed in this environment and imaging device, and ends with an example electronic device.

Example Environment

[0011] Figure 1 illustrates an example environment 100 in which an asymmetric sensor array for capturing images can be embodied. Example environment 100 includes an imaging device 102 capturing images of a scene 104. Imaging device 102 includes an imager 106, which includes lens stacks 108 and asymmetric sensor array 110, shown combined and separate.

[0012] Asymmetric sensor array 110 includes a main sensor 112 having a main resolution and angled at a main angle 114. Here main angle 114, as shown relative to object 116 of scene 104, is at ninety degrees. Asymmetric sensor array 110 also includes multiple peripheral sensors 118. These peripheral sensors 118 have peripheral resolutions or colors that are asymmetric to the main colors or resolution. Sensors can be asymmetric to each other by having different numbers of pixels, color-sensing of pixels, sizes of pixels, or sensor sizes.

[0013] Peripheral sensors 118 (shown at 118-1 and 118-2) can be positioned at peripheral angles 120 (shown as peripheral angles 120-1 and 120-2, respectively), which are different from main angle 114 of main sensor 112. This difference in angles enables a depth of image to be created for an image of a scene based on peripheral data of the scene sensed by one of the peripheral sensors and main data of the same scene sensed by the main sensor, based on different peripheral data sensed by a different peripheral sensor of the same scene, or based on some combination of these. Thus, peripheral angle 120-1, at 5° off of main angle 114, captures an image of object 116 that differs from the image of object 116 captured by main sensor 112 at main angle 114.
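In the simplest two-view case, this depth-from-angle idea reduces to classic triangulation: an object seen from two offset viewpoints shifts by a disparity inversely proportional to its distance. A minimal sketch, with illustrative focal-length and baseline values not taken from the disclosure:

```python
def depth_mm(focal_px, baseline_mm, disparity_px):
    """Two-view triangulation: depth = f * B / d, where d is how far the
    object's image shifts between the main and a peripheral sensor."""
    if disparity_px <= 0:
        return float("inf")  # no measurable shift: object effectively at infinity
    return focal_px * baseline_mm / disparity_px

# e.g., a 1500-pixel focal length and a 10 mm main-to-peripheral baseline:
print(depth_mm(1500, 10.0, 7.5))  # 2000.0 mm, i.e., an object two meters away
```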

[0014] Consider figure 2, which illustrates an example of asymmetric sensor array 110 of figure 1. In figure 1, asymmetric sensor array 110 is shown oriented vertically and in cross section and is also shown at cross-section view 202 in figure 2 but oriented horizontally. Figure 2 illustrates, expanded and in a plan view 204, asymmetric sensor array 110. Note that asymmetric sensor array 110 is structured such that main sensor 112 is centered between peripheral sensors 118.

[0015] As mentioned above, main sensor 112 may include various resolutions and types. In the example of figure 2, main sensor 112 is monochrome with a clear color filter. This monochrome aspect improves signal-to-noise ratio (SNR) in low-light situations and, as noted, enables high detail for a given pixel count, though in monochrome (e.g., grayscale). Thus, main sensor 112 enables higher detail than color-pixel sensors. Main sensor 112 can also perform better in low-light environments due to its improved SNR. In some cases, main sensor 112 also includes a filter permitting infrared radiation to be sensed by main sensor 112. Typically, infrared radiation is not desired for color-pixel sensors because infrared radiation inhibits color fidelity. Here, however, main sensor 112 is monochrome, and thus this typical limitation is not present. Further, by permitting infrared radiation to be sensed, the bandwidth captured by the imager is expanded into the near infrared (IR). This also improves SNR in low-light scenes, in some cases so much that main sensor 112 may capture images in near darkness. IR sensing may, in some cases, permit a faster exposure time as well as better capture of moving objects in a scene, which can be useful for still images and for capture of multiple images in recording a video, especially for high-resolution capture of video.

[0016] This illustration shows resolutions of main sensor 112 and peripheral sensors 118 in terms of a number and size of squares, which are here assumed to be pixels. While simplified for visual clarity (showing millions of pixels is not possible in this type of illustration), main sensor 112 includes four times the number of pixels of each of peripheral sensors 118, and peripheral sensors 118 include pixels that are four times as large as those of main sensor 112.

[0017] The estimation of a depth map for images (e.g., a per-pixel estimation of the distance between a camera and a scene) improves with image SNR. Given this, and the use of peripheral sensors 118 at some angle relative to main sensor 112 for depth mapping, the larger pixels of peripheral sensors 118 can improve depth mapping by improving SNR. In more detail, smaller pixels have less capacity to absorb photons and thus capture less signal relative to noise. Therefore, the larger pixels allow for a better signal-to-noise ratio, which aids in depth mapping, in accurate color representation, and in low-light scenes.

[0018] In addition to the example shown in figure 2, consider figure 3, which illustrates alternative asymmetric sensor arrays 302, all shown in plan view. Asymmetric sensor array 302-1 includes a clear, high-pixel-count main sensor and two color-pixel peripheral sensors with larger pixels but a smaller overall physical size than that of the main sensor. Asymmetric sensor array 302-2 includes a clear, high-pixel-count main sensor and four peripheral color-pixel sensors with same-size pixels as those of the main sensor. Asymmetric sensor array 302-3 includes a clear or color high-pixel-count main sensor and two large-size, high-pixel-count peripheral sensors with color pixels. The color sensors may be Bayer filter sensors, panchromatic cell sensors, improved or angled Bayer-type filter sensors (e.g., EXR and X-Trans by Fujifilm™), or other color-capturing sensors. The color sensors may sense a wide variety of colors and be structured with color pixels such as red, green, green, and blue pixels arranged in squares; red, green, blue, and white pixels, also in squares; or angled, double-pixel (non-square) color pixels of cyan, magenta, and yellow, or of red, green, and blue in roughly equal amounts (rather than more green); and so forth.

[0019] Asymmetric sensor arrays 302 and 110 are examples of, rather than limitations on, the types of asymmetric sensor arrays contemplated by this disclosure. This description now turns to lens stacks 108.

[0020] As noted for figure 1, imaging device 102 includes imager 106, which includes asymmetric sensor array 110 and lens stacks 108. Figure 4 illustrates one of lens stacks 108 in detail to describe the relationship between sensor size and Z-height mentioned above. As shown, and assuming a similar lens material and physical structure, Z-height 402 is related (in some cases proportional) to the dimensions of the sensor, such as X-width 404 of main sensor 112. As shown, a height of lens 406, together with some distance to focus light (focal distance 408), makes up Z-height 402, which is related to X-width 404 (as well as to Y-breadth, not shown).

[0021] Figure 4 also illustrates an example asymmetric sensor array 410 with a main sensor 412 having an X-width 414 related to a lens-stack height 416 and two peripheral sensors 418 having a smaller X-width 420 related to a smaller lens-stack height 422. This illustration shows that a Z-height of an imager is related to sensor size (Y-breadth is illustrated and equal to respective X-widths). A plan view 424 of asymmetric sensor array 410 is also provided to show this relationship. While not required, in some cases the main and peripheral sensors (e.g., main sensor 412 and peripheral sensors 418) are integrated into a single die or substrate. Imager 106 may also be structured as an integrated apparatus, such as including asymmetric sensor array 410 along with lens stacks for each of the sensors in the array.

[0022] While not shown, the various imagers may include, or imaging device 102 may include separate from the various imagers, an auto-focus device capable of determining a focus of the main sensor in part using depth data captured by the peripheral sensors. This is not required, as in some cases no auto-focus is needed. Use of an auto-focus device can depend on a desired image quality and on the size and resolution of the sensors used to deliver this image quality. This can be balanced, however, against the undesirable focus lag of many current auto-focus mechanisms. The asymmetric sensor array can reduce this focus lag by decreasing the iterative adjust-and-sense operation of current auto-focus systems. The iterative adjust-and-sense operation is decreased by using depth information captured by the peripheral sensors to guide the auto-focus system, thereby reducing the number of iterations required to achieve focus (a sketch of this seeding approach appears below).

[0023] Furthermore, these various imagers can be structured to be capable of focusing on objects in scenes beyond about two meters from the lens of the main sensor without a focusing mechanism. If focusing on objects within two meters is desired, a simpler optical system that adjusts focus only for near-field scenes (objects within one to two meters) can be used. This simpler optical system can be a near-far toggle, for example.
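A minimal sketch of the depth-seeded focusing described in paragraph [0022]; `lens` and `sharpness_of` are hypothetical hardware and image-pipeline hooks, and the candidate offsets are arbitrary example values rather than anything specified by the disclosure:

```python
def focus_with_depth_seed(lens, sharpness_of, roi_depth_mm):
    """Seed focus from the peripheral sensors' depth estimate, then refine
    with a few adjust-and-sense steps instead of a full focus sweep."""
    best_pos, best_score = None, float("-inf")
    for offset_mm in (-100, -50, 0, 50, 100):   # small refinement window
        pos = roi_depth_mm + offset_mm
        lens.move_to_focus_distance(pos)        # hypothetical lens hook
        score = sharpness_of(lens.capture())    # contrast metric on main sensor
        if score > best_score:
            best_pos, best_score = pos, score
    lens.move_to_focus_distance(best_pos)       # settle on the best position
    return best_pos
```

Because the search starts near the correct distance, a handful of adjust-and-sense iterations replaces the long sweep a conventional contrast auto-focus system would perform.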

[0024] Having generally described asymmetric sensor arrays and imagers, this discussion now turns to figure 5, which illustrates imaging device 102 of figure 1 in greater detail. Imaging device 102 is illustrated with various non-limiting example devices: smartphone 102-1, laptop 102-2, television 102-3, desktop 102-4, tablet 102-5, and camera 102-6. Imaging device 102 includes processor(s) 504 and computer-readable media 506, which includes memory media 508 and storage media 510. Applications and/or an operating system (not shown) embodied as computer-readable instructions on computer-readable media 506 can be executed by processor(s) 504 to provide some or all of the functionalities described herein. Computer-readable media 506 also includes image manager 512. As noted above, imaging device 102 includes imager 106, which in turn includes lens stacks 108 and asymmetric sensor array 110, and in some cases a focusing module 514, which may be software or hardware or both (e.g., the above-mentioned auto-focus system).

[0025] In some cases, imaging device 102 is in communication with, but may not necessarily include, imager 106 or elements thereof. Captured images are then received by imaging device 102 from imager 106 via the one or more I/O ports 516. I/O ports 516 can include a variety of ports, such as by way of example and not limitation, high-definition multimedia (HDMI), digital video interface (DVI), display port, fiber-optic or light-based, audio ports (e.g., analog, optical, or digital), USB ports, serial advanced technology attachment (SATA) ports, peripheral component interconnect (PCI) express based ports or card slots, serial ports, parallel ports, or other legacy ports. Imaging device 102 may also include network interface(s) 518 for communicating data over wired, wireless, or optical networks. By way of example and not limitation, network interface 518 may communicate data over a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, a point-to-point network, a mesh network, and the like.

Example Methods

[0026] The following discussion describes methods by which techniques are implemented to enable use of asymmetric sensor arrays for capturing images. These methods can be implemented utilizing the previously described environment and example sensor arrays and imagers, such as shown in figures 1-5. Aspects of these example methods are illustrated in figure 6 as operations performed by one or more entities. The orders in which operations of these methods are shown and/or described are not intended to be construed as a limitation, and any number or combination of the described method operations can be combined in any order to implement a method, or an alternate method.

[0027] Figure 6 illustrates example methods 600 that use an asymmetric sensor array to capture images and, with those images, create a final image. At 602, sensor data is received from a main sensor of an asymmetric sensor array. The main sensor, as noted above, may be monochromatic, and thus the sensor data includes a high-resolution, monochromatic image of a scene. Using imaging device 102 of figures 1 and 5 as an example, image manager 512 receives, from main sensor 112, sensor data capturing scene 104 and object 116. Assuming also that main sensor 112 and peripheral sensors 118 are as shown in figure 2, main sensor 112 provides sensor data that is monochromatic and high resolution, with good data for low-light scenes.

[0028] At 604, peripheral sensor data including multiple color images of the scene is received from peripheral sensors. One or more of the multiple color images can be sensed at an angle different from the angle of reception of the sensor data of the main sensor. As noted above, this different angle enables creation of a depth map, along with the other uses also described above. Also, in some example asymmetric sensor arrays, such as asymmetric sensor arrays 110, 302-1, and 302-2 (but not 302-3) of figures 2 and 3, the color images are low resolution relative to the resolution of the main sensor.

[0029] Continuing the ongoing example, the peripheral sensor data from peripheral sensors 118 includes two low-resolution color images of scene 104, both of which are sensed at angles differing from that of main sensor 112, namely by five degrees (see figure 1), though other angles may instead be used.

[0030] At 606, a depth map is determined based on the multiple color images. This depth map includes information relating to distances of surfaces in a scene (such as object 116 of scene 104 of figure 1), though these distances may be relative to a focal plane, to other objects in the scene, or to the imager or sensors. Here image manager 512 of figure 5 receives the sensor data from main sensor 112 and the peripheral sensor data from peripheral sensors 118. Image manager 512 then determines the depth map based on the color images being sensed at different angles from the main sensor's high-resolution image (whether color or monochrome).
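The disclosure does not specify image manager 512's depth-mapping algorithm; one common, generic stand-in is block matching between two of the peripheral images, sketched below for 2-D grayscale numpy arrays using a sum-of-absolute-differences score:

```python
import numpy as np

def disparity_map(left, right, patch=8, max_disp=16):
    """Brute-force block matching: for each patch of `left`, find the
    horizontal shift into `right` with the lowest sum of absolute
    differences. Larger disparity means a nearer surface."""
    h, w = left.shape
    disp = np.zeros((h // patch, w // patch), dtype=np.int32)
    for by in range(0, h - patch + 1, patch):
        for bx in range(0, w - patch + 1, patch):
            ref = left[by:by + patch, bx:bx + patch].astype(np.int32)
            best_sad, best_d = None, 0
            for d in range(0, min(max_disp, bx) + 1):
                cand = right[by:by + patch, bx - d:bx - d + patch].astype(np.int32)
                sad = int(np.abs(ref - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[by // patch, bx // patch] = best_d
    return disp  # convert to depth via depth = f * B / disparity, as above
```

Production pipelines add sub-pixel refinement and smoothness regularization, but this captures the core step: disparity between angled views, converted to depth by triangulation.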

[0031] At 608, a final image is constructed using the depth map, the multiple color images, and the high-resolution image. Image manager 512, for example, may "paint" the low-resolution color images from peripheral sensors 118 onto the high-resolution, monochromatic image from main sensor 112, in part with use of the depth map. By so doing, methods 600 create a final image having object 116 in focus, with high sharpness, accurate color and depth of color, and, in many cases, using fewer computational resources or less time (in focusing or processing).
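One way to approximate this "painting" step is to upsample a low-resolution color image and rescale its chroma so that its luminance matches the sharp monochrome image; the sketch below omits the depth-based warping for brevity, and the Rec. 601 luma weights are an assumption, as the disclosure does not name a color model:

```python
import numpy as np

def paint_color(mono_hi, color_lo):
    """Combine a high-resolution monochrome image (H x W, uint8) with a
    low-resolution color image (h x w x 3, uint8): upsample the color,
    then scale it so its luminance matches the sharp monochrome detail."""
    H, W = mono_hi.shape
    h, w, _ = color_lo.shape
    ys = np.arange(H) * h // H                  # nearest-neighbour row map
    xs = np.arange(W) * w // W                  # nearest-neighbour column map
    color_up = color_lo[ys][:, xs].astype(np.float32)
    luma = color_up @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
    gain = mono_hi.astype(np.float32) / np.maximum(luma, 1e-6)
    return np.clip(color_up * gain[..., None], 0.0, 255.0).astype(np.uint8)
```

The result carries the main sensor's sharpness in its luminance and the peripheral sensors' color in its chroma, matching the division of labor described above.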

Example Electronic Device

[0032] Figure 7 illustrates various components of an example electronic device 700 that can be implemented as an imaging device as described with reference to any of the previous figures 1-6. The electronic device may be implemented as any one or combination of a fixed or mobile device, in any form of a consumer, computer, portable, user, communication, phone, navigation, gaming, audio, camera, messaging, media playback, and/or other type of electronic device, such as imaging device 102 described with reference to figures 1 and 5.

[0033] Electronic device 700 includes communication transceivers 702 that enable wired and/or wireless communication of device data 704, such as received data, transmitted data, or sensor data as described above. Example communication transceivers include NFC transceivers, WPAN radios compliant with various IEEE 802.15 (Bluetooth™) standards, WLAN radios compliant with any of the various IEEE 802.11 (WiFi™) standards, WWAN (3GPP-compliant) radios for cellular telephony, wireless metropolitan area network (WMAN) radios compliant with various IEEE 802.16 (WiMAX™) standards, and wired local area network (LAN) Ethernet transceivers.

[0034] Electronic device 700 may also include one or more data input ports 706 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source (e.g., other image devices or imagers). Data input ports 706 may include USB ports, coaxial cable ports, and other serial or parallel connectors (including internal connectors) for flash memory, DVDs, CDs, and the like. These data input ports may be used to couple the electronic device to components (e.g., imager 106), peripherals, or accessories such as keyboards, microphones, or cameras.

[0035] Electronic device 700 of this example includes processor system 708 (e.g., any of application processors, microprocessors, digital-signal-processors, controllers, and the like), or a processor and memory system (e.g., implemented in a SoC), which process (i.e., execute) computer-executable instructions to control operation of the device. Processor system 708 (processor(s) 708) may be implemented as an application processor, embedded controller, microcontroller, and the like. A processing system may be implemented at least partially in hardware, which can include components of an integrated circuit or on-chip system, digital-signal processor (DSP), application-specific integrated circuit (ASIC), field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon and/or other hardware.

[0036] Alternatively or in addition, electronic device 700 can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits, which are generally identified at 710 (processing and control 710). Hardware-only devices in which an asymmetric sensor array for capturing images may be embodied include those that convert, without computer processors, sensor data into voltage signals by which to control focusing systems (e.g., focusing module 514).

[0037] Although not shown, electronic device 700 can include a system bus, crossbar, or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.

[0038] Electronic device 700 also includes one or more memory devices 712 that enable data storage, examples of which include random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. Memory device(s) 712 provide data storage mechanisms to store the device data 704, other types of information and/or data, and various device applications 720 (e.g., software applications). For example, operating system 714 can be maintained as software instructions within memory device 712 and executed by processors 708. In some aspects, image manager 512 is embodied in memory devices 712 of electronic device 700 as executable instructions or code. Although represented as a software implementation, image manager 512 may be implemented as any form of a control application, software application, signal-processing and control module, or hardware or firmware installed on imager 106.

[0039] Electronic device 700 also includes audio and/or video processing system 716 that processes audio data and/or passes through the audio and video data to audio system 718 and/or to display system 722 (e.g., a screen of a smart phone or camera). Audio system 718 and/or display system 722 may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio component and/or to a display component via an RF (radio frequency) link, S-video link, HDMI (high-definition multimedia interface), composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link, such as media data port 724. In some implementations, audio system 718 and/or display system 722 are external components to electronic device 700. Alternatively or additionally, display system 722 can be an integrated component of the example electronic device, such as part of an integrated touch interface. Electronic device 700 includes, or has access to, imager 106, which includes lens stacks 108 and asymmetric sensor array 110 (or 302 or 410). Sensor data is received from imager 106 and/or asymmetric sensor array 110 by image manager 512, here shown stored in memory devices 712, which when executed by processor 708 constructs a final image as noted above.

[0040] Although embodiments of an asymmetric sensor array for capturing images have been described in language specific to features and/or methods, the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of an asymmetric sensor array for capturing images.