

Title:
PYRAMIDAL STRUCTURE FOR PLENOPTIC PIXEL
Document Type and Number:
WIPO Patent Application WO/2023/006508
Kind Code:
A1
Abstract:
Some embodiments of an apparatus, for example a light-field apparatus, may include a microlens; a color filter system; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors comprising two or more photodiodes formed in a silicon layer, wherein a pyramid shape is etched in the two or more respective silicon photodiodes formed in the silicon layer.

Inventors:
CHATAIGNIER GUILLAUME (FR)
VANDAME BENOIT (FR)
VAILLANT JÉROME (FR)
Application Number:
PCT/EP2022/070209
Publication Date:
February 02, 2023
Filing Date:
July 19, 2022
Assignee:
INTERDIGITAL CE PATENT HOLDINGS SAS (FR)
COMMISSARIAT ENERGIE ATOMIQUE (FR)
International Classes:
H01L27/146
Foreign References:
US20170094152A12017-03-30
Other References:
CHATAIGNIER GUILLAUME ET AL: "Joint electromagnetic and ray-tracing simulations for quad-pixel sensor and computational imaging", OPTICS EXPRESS, vol. 27, no. 21, 8 October 2019 (2019-10-08), pages 30486, XP055975387, Retrieved from the Internet [retrieved on 20221123], DOI: 10.1364/OE.27.030486
BARDONNET F ET AL: "Pixels with add-on structures to enhance quantum efficiency in the near infrared", SPIE SMART STRUCTURES AND MATERIALS + NONDESTRUCTIVE EVALUATION AND HEALTH MONITORING, 2005, SAN DIEGO, CALIFORNIA, UNITED STATES, SPIE, US, vol. 11871, 12 September 2021 (2021-09-12), pages 118710Y - 118710Y, XP060147242, ISSN: 0277-786X, ISBN: 978-1-5106-4548-6, DOI: 10.1117/12.2597142
CHATAIGNIER, G.VANDAME, B.VAILLANT, J.: "Joint Electromagnetic and Ray-Tracing Simulations for Quad-Pixel Sensor and Computational Imaging", OPTICS EXPRESS, vol. 27, 14 October 2019 (2019-10-14), pages 21
KOBAYASHI, MASAHIRO: "A Low Noise and High Sensitivity Image Sensor with Imaging and Phase-Difference Detection AF in All Pixels", ITE TRANS. ON MEDIA TECHN. AND APPLICATIONS (MTA, vol. 4, no. 2, 2016, pages 123 - 128
Attorney, Agent or Firm:
AWA SWEDEN AB (SE)
Claims:
CLAIMS

1. An apparatus comprising: a microlens; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors comprising two or more photodiodes formed in a silicon layer, wherein a pyramidal structure is etched in the two or more respective silicon photodiodes formed in the silicon layer.

2. The apparatus of claim 1, wherein the pyramidal structure is formed to generate an optical prism deviating light away from a center of the two or more plenoptic subpixel sensors.

3. The apparatus of any one of claims 1-2, wherein the pyramidal structure is configured to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors.

4. The apparatus of any one of claims 1-3, wherein the pyramidal structure is configured to increase sensitivity of at least one of the two or more plenoptic subpixel sensors.

5. The apparatus of any one of claims 1-4, wherein the pyramidal structure is configured to increase angular discrimination for light incident on the microlens at an angle near a chief ray angle of the microlens.

6. The apparatus of any one of claims 1-5, wherein the pyramid structure is configured to confine light to one of the two or more plenoptic subpixel sensors.

7. The apparatus of any one of claims 1-6, wherein the pyramid structure is configured to reflect light away from at least one of the two or more plenoptic subpixel sensors.

8. The apparatus of any one of claims 1-7, wherein the silicon layer is etched with a trench conforming to a deep trench isolation (DTI) structure.

9. The apparatus of any one of claims 1-8, wherein a focal length of the microlens is less than a threshold.

10. The apparatus of any one of claims 1-9, wherein the two or more plenoptic sensors comprise a dual pixel sensor, and wherein the pyramidal structure comprises a triangular structure.

11. The apparatus of any one of claims 1-10, wherein the two or more plenoptic sensors comprise a quad pixel sensor.

12. The apparatus of any one of claims 1-11, wherein the pyramidal structure comprises a material layer above the silicon layer.

13. The apparatus of any one of claims 1-12, wherein the pyramidal structure conforms to pyramid-shaped trenches etched in the silicon layer.

14. The apparatus of any one of claims 1-13, further comprising a color filter system.

15. The apparatus of claim 14, wherein the pyramidal structure comprises a material layer above the silicon layer, and wherein the material layer comprises an oxide layer between the color filter system and at least one of the two or more plenoptic subpixel sensors.

16. The apparatus of claim 15, wherein the pyramidal structure further comprises a color filter layer of the color filter system above the material layer.

17. The apparatus of claim 16, wherein the color filter layer comprises a color filter resin.

18. The apparatus of any one of claims 14-15, wherein the color filter system comprises a planarization layer and a color filter layer.

19. The apparatus of any one of claims 14-18, wherein at least one of the two or more plenoptic subpixel sensors is adjacent to the color filter system.

20. The apparatus of any one of claims 14-18, further comprising an oxide layer, wherein the oxide layer is adjacent to the color filter system, wherein at least one of the two or more plenoptic subpixel sensors is adjacent to the oxide layer, and wherein the oxide layer is between the color filter system and at least one of the two or more plenoptic subpixel sensors.

21. The apparatus of claim 20, wherein a base of a pyramid portion of the pyramidal structure is adjacent to the oxide layer.

22. The apparatus of any one of claims 14-21, wherein the color filter system is adjacent to the microlens.

23. The apparatus of any one of claims 14-22, wherein two or more plenoptic subpixel sensors are adjacent to the color filter system.

24. The apparatus of any one of claims 14-23, wherein the color filter system comprises a filter pattern of two or more subpixels.

25. The apparatus of any one of claims 1-24, wherein a horizontal facet angle of the pyramidal structure is 54.7 degrees.

26. The apparatus of any one of claims 1-25, wherein a base of the pyramidal structure is equal to half of a diameter of the microlens.

27. The apparatus of any one of claims 1-26, wherein the two or more plenoptic subpixel sensors are arranged in a 2x1 grid, and wherein the pyramidal structure is centered on a line dividing the 2x1 grid in half.

28. The apparatus of any one of claims 1-26, wherein the two or more plenoptic subpixel sensors are arranged in a 2x2 grid, and wherein the pyramidal structure is centered on a center of the 2x2 grid.

29. The apparatus of any one of claims 1-26, wherein the two or more plenoptic subpixel sensors are arranged in a 3x3 grid, and wherein the pyramidal structure is centered on a center of the 3x3 grid.

30. The apparatus of any one of claims 1-26, wherein the pyramidal structure is a frustum, and wherein a flat top of the frustum is configured to match a center subpixel sensor of the 3x3 grid.

31. The apparatus of any one of claims 1-30, wherein a tip of the pyramidal structure is a square.

32. The apparatus of any one of claims 1-30, wherein a base of the pyramidal structure is a disk.

33. The apparatus of any one of claims 1-30, wherein a tip of the pyramidal structure is a disk.

34. The apparatus of any one of claims 1-33, wherein the two or more plenoptic subpixel sensors are arranged in a grid array, and wherein the pyramidal structure is centered on a center of the grid array.

35. The apparatus of any one of claims 1-27, wherein a pyramid portion of the pyramidal structure is a triangle extruded in at least one direction.

36. The apparatus of any one of claims 1-29, wherein a pyramid portion of the pyramidal structure is a polygon extruded in at least one direction.

37. The apparatus of any one of claims 1-31, wherein a base of a pyramid portion of the pyramidal structure is a square.

38. The apparatus of any one of claims 1-37, further comprising one or more external insulators adjacent to an external edge of at least one of the plenoptic subpixel sensors.

39. The apparatus of any one of claims 1-38, wherein the microlens is configured to shift a chief ray angle corresponding to a pixel offset of two pixels.

40. The apparatus of any one of claims 1-39, wherein a size of a base of a pyramid portion of the pyramidal structure is adjusted based on a curvature of the microlens.

41. The apparatus of any one of claims 1-40, wherein the pyramidal structure is symmetrical along a first plane cutting through a center of the pyramidal structure, wherein the pyramidal structure is symmetrical along a second plane cutting through the center of the pyramidal structure, and wherein the first plane is perpendicular to the second plane.

42. The apparatus of any one of claims 1-41, wherein the pyramidal structure comprises silicon dioxide (SiO2).

43. The apparatus of any one of claims 1-41, wherein the pyramidal structure comprises silicon oxide.

44. The apparatus of any one of claims 1-43, wherein the pyramidal structure is configured to reduce optical crosstalk between at least two pixels.

45. The apparatus of any one of claims 1-43, wherein the pyramidal structure is configured to reduce optical crosstalk between at least two microlenses.

Description:
PYRAMIDAL STRUCTURE FOR PLENOPTIC PIXEL

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] The present application is an International Application, which claims benefit of European Patent Application No. EP21306035, entitled “PYRAMIDAL STRUCTURE FOR PLENOPTIC PIXEL,” filed July 26, 2021, which is hereby incorporated by reference in its entirety.

BACKGROUND

[0002] An integrated plenoptic sensor is an imaging sensor in which a microlens is shared between multiple subpixels. An integrated plenoptic sensor records a "4D light-field" which may give an indication of the angle of incidence of light. An integrated plenoptic sensor may be used in cameras and smartphones to drive the autofocus of the main lens. An integrated plenoptic sensor may provide additional abilities, such as passive depth mapping, refocusing, or aberration correction.

[0003] The present disclosure relates to plenoptic cameras. A plenoptic camera is similar to a common camera with a lens system and a light sensor, with the addition of a micro-lens array over the image sensor. Each micro-lens produces a micro-image on the sensor. The resulting plenoptic image may be referred to as a 4D light field, which gives indications on the sensor and pupil coordinates of the photon trajectory. For later display and processing, the 4D light field may be processed through an operation known as projection into a 2D refocused image. The projection operation allows for the possibility of tuning the focalization distance.

[0004] In some plenoptic cameras, each pixel of the light sensor is covered by a color filter that primarily allows light of one color to reach the corresponding pixel. In some such cameras, the color filters are arranged as a so-called Bayer filter. The conventional Bayer filter allows one color— red, green, or blue— to be recorded by each corresponding pixel. When an image has been captured using a Bayer filter, each pixel has only one associated color value, corresponding to the color of the filter associated with that pixel. From this image, it may be desirable to obtain an image in which each of the pixels has all three color values. This may be done with processing to obtain the two missing color values for each pixel. Such processing techniques are referred to as demosaicing. Demosaicing can be a non-trivial process, particularly for images or regions of images that cover highly textured areas.
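As a concrete illustration of demosaicing (the document does not prescribe any particular algorithm), one common approach is bilinear interpolation; the sketch below estimates only the green channel of an RGGB Bayer mosaic, with each missing green value taken as the mean of the available green values among its 4-neighbors. The function name and pattern assumption are illustrative.

```python
import numpy as np

def demosaic_green(raw):
    """Bilinear green-channel demosaic for an RGGB Bayer mosaic (sketch)."""
    h, w = raw.shape
    ys, xs = np.mgrid[0:h, 0:w]
    green_mask = (ys + xs) % 2 == 1           # G sites of an RGGB pattern
    g = np.where(green_mask, raw, 0.0)
    gp = np.pad(g, 1)                         # zero-pad borders
    mp = np.pad(green_mask.astype(float), 1)
    # sum of the 4-neighbors and count of green sites among them
    nsum = gp[:-2, 1:-1] + gp[2:, 1:-1] + gp[1:-1, :-2] + gp[1:-1, 2:]
    ncnt = mp[:-2, 1:-1] + mp[2:, 1:-1] + mp[1:-1, :-2] + mp[1:-1, 2:]
    interp = np.divide(nsum, ncnt, out=np.zeros_like(nsum), where=ncnt > 0)
    return np.where(green_mask, raw, interp)
```

A full demosaicer would interpolate red and blue the same way; textured regions, as noted above, require more elaborate (edge-aware) kernels.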

[0005] Bayer color filters have been used with plenoptic cameras. To process 4D light field images captured with such cameras, demosaicing may be performed concurrently with a 2D refocusing process.

Plenoptic Sampling of 4D Light-Field Data

[0006] Conventional plenoptic cameras 100 are similar to ordinary 2D cameras with a main lens 102 and the addition of a micro-lens array 104 set just in front of the sensor 106 as illustrated schematically in FIG. 1. The sensor pixels under each micro-lens record a respective micro-lens image.

[0007] Plenoptic cameras record 4D light-field data which can be transformed into various by-products such as re-focused images with freely selected distances of focalization.

[0008] The sensor of a light-field camera records an image which is made of a collection of 2D small images arranged within a larger 2D image. Each micro-lens in the array, and each corresponding small micro-lens image generated under that lens, may be indexed by the coordinates (i, j). The pixels of the light field may be associated with four coordinates (x, y, i, j), where (x, y) identifies the location of the pixel in the complete image. The 4D light field recorded by the sensor may be represented by L(x, y, i, j).

[0009] FIG. 2 schematically illustrates the image 200 which is recorded by the sensor. Each micro-lens produces a micro-image 202 which is schematically represented by a circle (the shape of the small image depends on the shape of the micro-lenses, which is typically circular). Pixel coordinates are labelled (x, y). p 206 is the distance between two consecutive micro-images; p is not necessarily an integer value. Micro-lenses are chosen such that p 206 is larger than a pixel size δ. Micro-lens images are referenced by their coordinate (i, j). Each micro-lens image samples the pupil of the main lens with the (u, v) coordinate system. Some pixels might not receive any photons from any micro-lens; those pixels may be disregarded. Indeed, the inter micro-lens space may be masked out to prevent photons from passing outside a micro-lens (if the micro-lenses have a square shape, no masking is needed). The center of a micro-lens image (i, j) is located on the sensor at the coordinate (x_{i,j}, y_{i,j}). θ is the angle between the square lattice of pixels and the square lattice of micro-lenses. In FIG. 2, θ = 0. Assuming the micro-lenses are arranged according to a regular square lattice, (x_{i,j}, y_{i,j}) can be computed by the following equation, considering (x_{0,0}, y_{0,0}) the pixel coordinate of the micro-lens image (0, 0):

x_{i,j} = p (i cos θ − j sin θ) + x_{0,0},  y_{i,j} = p (i sin θ + j cos θ) + y_{0,0}    (Eqs. 1-2)

[0010] FIG. 2 also illustrates that an object from the scene may be visible on several contiguous micro-lens images, with each image 204 being illustrated as a dark square dot. The distance between two consecutive views of an object is w 208. This distance w is referred to herein as the replication distance. An object is theoretically visible on r consecutive micro-lens images, with r = ⌊ p / |p − w| ⌋, where r is the number of consecutive micro-lens images in one dimension, and ⌊·⌋ is the floor function. An object is theoretically visible in r² micro-lens images. Depending on the shape of the micro-lens image, some of the r² views of the object might be invisible.

Optical Properties of Light-Field Cameras

[0011] The distances p and w introduced in the previous sub-section are given in units of pixel size. They can be converted into physical unit distances (e.g., meters), respectively P and W, by multiplying them by the pixel size δ, such that W = δw and P = δp. These distances can vary depending on the light-field camera characteristics.

[0012] FIG. 3 and FIG. 4 are schematic side illustrations of different light-field cameras 300, 400 assuming a perfect thin-lens model. The main lens in these examples has a focal length F and an aperture Φ. The micro-lens array is made of micro-lenses having a focal length f. The pitch of the micro-lens array is φ. The micro-lens array 304, 404 is located at a distance D from the main lens 302, 402, and a distance d from the sensor 306, 406. The object (not visible on the figures) is located at a distance z from the main lens (toward the left). This object is focused by the main lens at a distance z' from the main lens (toward the right). FIG. 3 illustrates the case where D > z', and FIG. 4 illustrates the case where D < z'. In both cases, the micro-lens images can be in focus depending on d and f. FIGs. 3 and 4 illustrate examples of so-called type II plenoptic cameras.

[0013] In an alternative light-field camera design referred to as a type I plenoptic camera 500, the parameters are selected such that the focal length f = d, the distance between the micro-lens array 504 and the sensor 506. An example of such a design is illustrated in FIG. 5. This design is made such that the main lens 502 is focusing images close to the micro-lens array 504. If the main lens 502 is focusing exactly on the micro-lens array 504, then W = ∞. Also, the micro-lens images are fully out-of-focus and equal to a constant (not considering noise).

[0014] The replication distance W varies with z, the distance of the object. To establish the relation between W and z, one may refer to the thin lens equation

1/z + 1/z' = 1/F    (Eq. 3)

and to the Thales law

(D − z')/φ = (D − z' + d)/W    (Eq. 4)

[0015] Combining the previous two equations, one can deduce

W = φ (1 + d/(D − z'))    (Eq. 5)

[0016] The relation between W and z does not assume that the micro-lens images are in focus. Micro-lens images may be in focus when the thin lens equation is satisfied, such that

1/(D − z') + 1/d = 1/f    (Eq. 6)

[0017] Also from the Thales law, one derives P as follows:

P = φe, where e = (D + d)/D    (Eq. 7)

[0018] The ratio e defines the enlargement between the micro-lens pitch and the micro-lens image pitch. This ratio is very close to 1 since D ≫ d.
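The relations of this sub-section can be exercised numerically. The sketch below is illustrative: the function names are not from the document, the equation forms are inferred from the thin-lens and Thales derivation above, and the test values are arbitrary. The thin-lens convention 1/z + 1/z' = 1/F is assumed; all lengths share one unit.

```python
def image_distance(z, F):
    """Distance z' at which a main lens of focal length F focuses an
    object at distance z (from 1/z + 1/z' = 1/F)."""
    return 1.0 / (1.0 / F - 1.0 / z)

def replication_distance(z, F, D, d, phi):
    """Physical replication distance W = phi * (1 + d / (D - z')) for a
    micro-lens array of pitch phi at distance D from the main lens and
    d from the sensor."""
    zp = image_distance(z, F)
    return phi * (1.0 + d / (D - zp))

def micro_image_pitch(D, d, phi):
    """Micro-image pitch P = phi * e, with enlargement e = (D + d) / D,
    which is close to 1 since D >> d."""
    return phi * (D + d) / D
```

For example, with D ≫ d the computed P exceeds φ only slightly, matching the statement that e is very close to 1.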

Sub-Aperture Images

[0019] Some of the plenoptic cameras as described above have the following properties: the micro-lens array has a square lattice (like the pixel array) and has no rotation versus the pixels; and the micro-lens image diameter is equal to an integer number of pixels (or almost equal to an integer number of pixels). These properties are satisfied by most feasible plenoptic sensors. These properties allow for the generation of images known as subaperture images.

[0020] A sub-aperture image collects all of the 4D light-field pixels having the same relative position within their respective micro-lens image, for example all of the pixels having the same (u, v) coordinates. If the array of micro-lenses has the size I × J, then each sub-aperture image also has size I × J. And if there is a p × p array of pixels under each micro-lens, then there are p × p sub-aperture images. If the number of pixels of the sensor is N_x × N_y, then each sub-aperture image may have the size of N_x/p × N_y/p.

[0021] FIGs. 6A-6B schematically illustrate a conversion from a captured light-field image L(x, y, i, j) into a series of sub-aperture images S(a, b, u, v). FIG. 6A illustrates a light-field image 600 (with size 24 × 16 pixels in this simplified example, although real-world examples generally include many more pixels), with each pixel position being given by coordinates (x, y). Each of the micro-lenses 602 (illustrated schematically by a circle) is associated with a 4 × 4 micro-image, with positions in the micro-image being given by coordinates (u, v). The micro-images are arranged in a 6 × 4 array, with each micro-image being indexed by coordinates (i, j). As seen in FIG. 6A, an object 604 (represented by a solid round dot) is imaged in nine of the micro-images.

[0022] FIG. 6B illustrates a set 650 of sixteen, i.e. 4 × 4, sub-aperture images 652 generated from the light field of FIG. 6A. Each sub-aperture image 652 has a size of I × J pixels (6 × 4 in this simplified example, corresponding to the number of micro-images). A position within each sub-aperture image is indicated by coordinates (a, b), where 0 ≤ a < I and 0 ≤ b < J. Each 2D sub-aperture image 652 may be identified by pupil coordinates (u, v) and denoted by S(u, v).

[0023] An example of generating a sub-aperture image from a light-field image is as follows. In FIG. 6A, the top-left pixel of each micro-image within the light-field image is shaded. All of these pixels are combined into a single sub-aperture image, namely the sub-aperture image 652 at the top-left of FIG. 6B.

[0024] The relations between (x, y, i, j) and (a, b, u, v) may be expressed as follows:

i = ⌊x/p⌋, j = ⌊y/p⌋, u = x mod p, v = y mod p, a = i, b = j    (Eq. 8)

where ⌊·⌋ denotes the floor function, and mod denotes the modulo function.
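For an integer micro-image pitch p and a micro-lens lattice aligned with the pixel grid (θ = 0), this remapping of the raw sensor image into sub-aperture images is a pure reshape. A minimal NumPy sketch (illustrative, not the patent's implementation):

```python
import numpy as np

def to_sub_apertures(L, p):
    """Rearrange a raw light-field image L (shape Ny x Nx) into an array S
    of shape (p, p, Ny // p, Nx // p), where S[v, u] is the sub-aperture
    image gathering, from every micro-image, the pixel at relative
    position (u, v)."""
    Ny, Nx = L.shape
    assert Ny % p == 0 and Nx % p == 0
    # split each axis into (micro-image index, position within micro-image),
    # then bring the pupil coordinates (v, u) to the front
    S = L.reshape(Ny // p, p, Nx // p, p)   # axes: (b, v, a, u)
    return S.transpose(1, 3, 0, 2)          # axes: (v, u, b, a)
```

For each sensor pixel, a = x // p and u = x mod p (and likewise b, v from y), matching Eq. 8.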

[0025] If p is not exactly an integer but close to an integer, then the sub-aperture images can be computed by considering the distance between the micro-lens images equal to ⌈p⌉, the integer just greater than p. This case occurs especially when the micro-lens diameter φ is equal to an integer number of pixels. In that case, p = φe is slightly larger than φ, since e = (D + d)/D is slightly greater than 1. The advantage of considering ⌈p⌉ is that the sub-aperture images are computed without interpolation, since one pixel L(x, y, i, j) corresponds to an integer-coordinate sub-aperture pixel S(a, b, u, v). The drawback is that the portion of the pupil from which photons are recorded is not constant within a given sub-aperture image S(u, v). As a result, the sub-aperture image S(u, v) does not exactly sample the (u, v) pupil coordinate.

[0026] In cases where p is not an integer, or where the micro-lens array is rotated versus the pixel array, the sub-aperture images may be computed using interpolation, since the centers (x_{i,j}, y_{i,j}) of the micro-lenses are not at integer coordinates.

[0027] Within the light-field image L(x, y, i, j), an object is made visible on several micro-images with a replication distance w. On the sub-aperture images, an object is also visible several times. From one sub-aperture image to the next horizontal one, an object coordinate (a, b) appears shifted by the disparity ρ. The relation between ρ and w can be expressed by:

ρ = 1 / (w − p)    (Eq. 9)

[0028] Also, it is possible to establish a relation between the disparity ρ and the distance z of the object by combining equations (5) and (9):

Eq. 10

Projecting light-field pixels on a re-focus image.

[0029] Image refocusing consists in projecting the light-field pixels L(x, y, i, j) recorded by the sensor into a 2D refocused image of coordinate (X, Y). The projection may be performed by shifting the micro-images (i, j), where w_focus is the selected replication distance corresponding to z_focus, the distance of the objects that appear in focus in the computed refocused image, and s is a zoom factor which controls the size of the refocused image. The value of the light-field pixel L(x, y, i, j) is added on the refocused image at coordinate (X, Y). If the projected coordinate is non-integer, the pixel is added using interpolation. To record the number of pixels projected into the refocus image, a weight-map image having the same size as the refocus image is created. This image is initially set to 0. For each light-field pixel projected on the refocused image, the value of 1.0 is added to the weight-map at the coordinate (X, Y). If interpolation is used, the same interpolation kernel is used for both the refocused and the weight-map images. After all of the light-field pixels are projected, the refocused image is divided pixel by pixel by the weight-map image. This normalization step provides for brightness consistency of the normalized refocused image.
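The projection-and-normalization procedure above can be sketched as follows. This is an illustrative NumPy implementation, not the patent's code: it assumes an integer micro-image pitch p aligned with the pixel grid, uses the common micro-image shift X = s·(x − w_focus·i), Y = s·(y − w_focus·j) (the document's exact equation is not reproduced here), and rounds to the nearest pixel instead of interpolating.

```python
import numpy as np

def refocus(L, p, w_focus, s=1.0):
    """Splat light-field pixels onto a refocused image and normalize by a
    weight map. L: raw 2D light-field image; p: micro-image pitch (pixels);
    w_focus: selected replication distance; s: zoom factor."""
    Ny, Nx = L.shape
    out_h, out_w = int(s * Ny) + 1, int(s * Nx) + 1
    img = np.zeros((out_h, out_w))
    wmap = np.zeros((out_h, out_w))          # counts projected pixels
    ys, xs = np.mgrid[0:Ny, 0:Nx]
    ii, jj = xs // p, ys // p                # micro-image indices (i, j)
    X = np.rint(s * (xs - w_focus * ii)).astype(int)
    Y = np.rint(s * (ys - w_focus * jj)).astype(int)
    ok = (X >= 0) & (X < out_w) & (Y >= 0) & (Y < out_h)
    np.add.at(img, (Y[ok], X[ok]), L[ok])    # accumulate pixel values
    np.add.at(wmap, (Y[ok], X[ok]), 1.0)     # accumulate weights
    # normalization step: divide pixel by pixel by the weight map
    return np.divide(img, wmap, out=np.zeros_like(img), where=wmap > 0)
```

The weight-map division is what keeps brightness consistent when several light-field pixels land on the same refocused coordinate.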

Addition of the Sub-aperture Images to Compute the Re-focus Image.

[0030] In another technique of performing refocusing, the refocused images can be computed by summing up the sub-aperture images S(u, v), taking into consideration the disparity ρ_focus for which objects at distance z_focus are in focus.

The sub-aperture pixels are projected on the refocused image, and a weight-map records the contribution of each pixel, following the same procedure described above.
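The sub-aperture summation above can be sketched as follows (an illustrative implementation, not from the document): each sub-aperture image is shifted in proportion to its pupil offset by the chosen disparity, and the shifted images are averaged. Integer shifts via np.roll stand in for the interpolation and weight map of a full implementation.

```python
import numpy as np

def refocus_from_subapertures(S, rho_focus):
    """Sum shifted sub-aperture images. S has shape (p, p, H, W), with
    S[v, u] the sub-aperture image at pupil coordinate (u, v);
    rho_focus is the disparity at which objects should appear in focus."""
    p = S.shape[0]
    c = (p - 1) / 2.0                        # pupil center
    acc = np.zeros(S.shape[2:])
    for v in range(p):
        for u in range(p):
            # shift each view against its disparity so in-focus objects align
            dy = int(round(rho_focus * (v - c)))
            dx = int(round(rho_focus * (u - c)))
            acc += np.roll(S[v, u], (dy, dx), axis=(0, 1))
    return acc / (p * p)
```

With ρ_focus = 0 the views are summed unshifted, which corresponds to objects whose views coincide across the pupil.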

SUMMARY

[0031] An example apparatus in accordance with some embodiments may include a microlens; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors including two or more photodiodes formed in a silicon layer, wherein a pyramidal structure is etched in the two or more respective silicon photodiodes formed in the silicon layer.

[0032] For some embodiments of the example apparatus, the pyramidal structure may be formed to generate an optical prism deviating light away from a center of the two or more plenoptic subpixel sensors.

[0033] For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors.

[0034] For some embodiments of the example apparatus, the pyramidal structure may be configured to increase sensitivity of at least one of the two or more plenoptic subpixel sensors.

[0035] For some embodiments of the example apparatus, the pyramidal structure may be configured to increase angular discrimination for light incident on the microlens at an angle near a chief ray angle of the microlens.

[0036] For some embodiments of the example apparatus, the pyramid structure may be configured to confine light to one of the two or more plenoptic subpixel sensors.

[0037] For some embodiments of the example apparatus, the pyramid structure may be configured to reflect light away from at least one of the two or more plenoptic subpixel sensors.

[0038] For some embodiments of the example apparatus, the silicon layer may be etched with a trench conforming to a deep trench isolation (DTI) structure.

[0039] For some embodiments of the example apparatus, a focal length of the microlens may be less than a threshold.

[0040] For some embodiments of the example apparatus, the two or more plenoptic sensors may include a dual pixel sensor; and the pyramidal structure may include a triangular structure.

[0041] For some embodiments of the example apparatus, the two or more plenoptic sensors may include a quad pixel sensor.

[0042] For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer.

[0043] For some embodiments of the example apparatus, the pyramidal structure may conform to pyramid-shaped trenches etched in the silicon layer.

[0044] Some embodiments of the example apparatus may further include a color filter system.

[0045] For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer; and the material layer may include an oxide layer between the color filter system and at least one of the two or more plenoptic subpixel sensors.

[0046] For some embodiments of the example apparatus, the pyramidal structure may further include a color filter layer of the color filter system above the material layer.

[0047] For some embodiments of the example apparatus, the color filter layer may include a color filter resin.

[0048] For some embodiments of the example apparatus, the color filter system may include a planarization layer and a color filter layer.

[0049] For some embodiments of the example apparatus, a horizontal facet angle of the pyramidal structure may be 54.7 degrees.

[0050] For some embodiments of the example apparatus, a base of the pyramidal structure may be equal to half of a diameter of the microlens.

[0051] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2x1 grid, and the pyramidal structure may be centered on a line dividing the 2x1 grid in half.

[0052] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2x2 grid, and the pyramidal structure may be centered on a center of the 2x2 grid.

[0053] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 3x3 grid, and the pyramidal structure may be centered on a center of the 3x3 grid.

[0054] For some embodiments of the example apparatus, the pyramidal structure may be a frustum, and a flat top of the frustum may be configured to match a center subpixel sensor of the 3x3 grid.

[0055] For some embodiments of the example apparatus, a tip of the pyramidal structure may be a square.

[0056] For some embodiments of the example apparatus, a base of the pyramidal structure may be a disk.

[0057] For some embodiments of the example apparatus, a tip of the pyramidal structure may be a disk.

[0058] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a grid array, and the pyramidal structure may be centered on a center of the grid array.

[0059] For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a triangle extruded in at least one direction.

[0060] For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a polygon extruded in at least one direction.

[0061] For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be a square.

[0062] Some embodiments of the example apparatus may further include one or more external insulators adjacent to an external edge of at least one of the plenoptic subpixel sensors.

[0063] For some embodiments of the example apparatus, the microlens may be configured to shift a chief ray angle corresponding to a pixel offset of two pixels.

[0064] For some embodiments of the example apparatus, at least one of the two or more plenoptic subpixel sensors may be adjacent to the color filter system.

[0065] Some embodiments of the example apparatus may further include an oxide layer, wherein the oxide layer may be adjacent to the color filter system, wherein at least one of the two or more plenoptic subpixel sensors may be adjacent to the oxide layer, and wherein the oxide layer may be between the color filter system and at least one of the two or more plenoptic subpixel sensors.

[0066] For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be adjacent to the oxide layer.

[0067] For some embodiments of the example apparatus, a size of a base of a pyramid portion of the pyramidal structure may be adjusted based on a curvature of the microlens.

[0068] For some embodiments of the example apparatus, the pyramidal structure may be symmetrical along a first plane cutting through a center of the pyramidal structure, the pyramidal structure may be symmetrical along a second plane cutting through the center of the pyramidal structure, and wherein the first plane may be perpendicular to the second plane.

[0069] For some embodiments of the example apparatus, the color filter system may be adjacent to the microlens.

[0070] For some embodiments of the example apparatus, two or more plenoptic subpixel sensors may be adjacent to the color filter system.

[0071] For some embodiments of the example apparatus, the color filter system may include a filter pattern of two or more subpixels.

[0072] For some embodiments of the example apparatus, the pyramidal structure may include silicon dioxide (SiO2).

[0073] For some embodiments of the example apparatus, the pyramidal structure may include silicon oxide.

[0074] For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two pixels.

[0075] For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two microlenses.

BRIEF DESCRIPTION OF THE DRAWINGS

[0076] FIG. 1 is a schematic illustration of an example plenoptic camera.

[0077] FIG. 2 is a schematic illustration of example light field data recorded by a plenoptic sensor.

[0078] FIG. 3 is a schematic illustration of the parameters of a plenoptic type II camera with W>P.

[0079] FIG. 4 is a schematic illustration of the parameters of a plenoptic type II camera with W<P.

[0080] FIG. 5 is a schematic illustration of the parameters of a plenoptic type I camera with f = d.

[0081] FIGs. 6A-6B are schematic illustrations of conversion of light-field pixels into sub-aperture images.

[0082] FIG. 7 is a system diagram illustrating an example wireless transmit/receive unit (WTRU) that may be used to capture and/or process plenoptic images according to some embodiments.

[0083] FIG. 8A is a schematic side view illustrating an example quad pixel according to some embodiments.

[0084] FIG. 8B is a schematic plan view illustrating an example quad pixel according to some embodiments.

[0085] FIG. 9A is a schematic plan view illustrating an example sub aperture image (SAI) pattern for an array of quad pixels according to some embodiments.

[0086] FIG. 9B is a schematic plan view illustrating example sub aperture images (SAIs) according to some embodiments.

[0087] FIG. 10A is a schematic side view illustrating an example diffraction pattern inside a pixel for light entering at normal incidence according to some embodiments.

[0088] FIG. 10B is a schematic side view illustrating an example diffraction pattern inside a pixel for light entering at an off-angle according to some embodiments.

[0089] FIGs. 11A to 11D are schematic plan views illustrating example sub aperture images A to D for an out-of-focus white spot without diffraction according to some embodiments.

[0090] FIG. 11E is a schematic plan view illustrating an example summation of sub aperture images A to D in FIGs. 11A to 11D according to some embodiments.

[0091] FIGs. 11F to 11I are schematic plan views illustrating example sub aperture images A to D for an out-of-focus white spot with diffraction according to some embodiments.

[0092] FIG. 11J is a schematic plan view illustrating an example summation of sub aperture images A to D in FIGs. 11F to 11I according to some embodiments.

[0093] FIG. 12 is a schematic side view illustrating an example crosstalk between microlenses according to some embodiments.

[0094] FIGs. 13A-13E are schematic plan views illustrating an example Chief Ray Angle correction via a microlens shift according to some embodiments.

[0095] FIG. 14A is a graph illustrating example left and right photo detect (PD) signals vs. angle of incidence according to some embodiments.

[0096] FIG. 14B is a graph illustrating an example ratio of left and right PD signals (Pabs_left / Pabs_right) vs. angle of incidence according to some embodiments.

[0097] FIG. 15A is a schematic side view illustrating an example low angle of incidence for a quad pixel without a pyramid shape etched in the silicon according to some embodiments.

[0098] FIG. 15B is a schematic side view illustrating an example higher angle of incidence for a quad pixel without a pyramid shape etched in the silicon according to some embodiments.

[0099] FIG. 15C is a schematic side view illustrating an example low angle of incidence for a quad pixel with a pyramid shape etched in the silicon according to some embodiments.

[0100] FIG. 15D is a schematic side view illustrating an example higher angle of incidence for a quad pixel with a pyramid shape etched in the silicon according to some embodiments.

[0101] FIG. 16A is a graph illustrating example left and right photo detect (PD) signals vs. angle of incidence for a 3.5 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments.

[0102] FIG. 16B is a graph illustrating an example ratio of left and right PD signals (Pabs_left / Pabs_right) and a threshold vs. angle of incidence for a 3.5 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments.

[0103] FIG. 17A is a graph illustrating example left and right photo detect (PD) signals vs. angle of incidence for a 3.5 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments.

[0104] FIG. 17B is a graph illustrating an example ratio of left and right PD signals (Pabs_left / Pabs_right) and a threshold vs. angle of incidence for a 3.5 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments.

[0105] FIG. 18A is a graph illustrating example left and right photo detect (PD) signals (Pabs_left, Pabs_right) vs. angle of incidence for a 1.6 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments.

[0106] FIG. 18B is a graph illustrating an example ratio of left and right PD signals (Pabs_left / Pabs_right) and a threshold vs. angle of incidence for a 1.6 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments.

[0107] FIG. 19A is a graph illustrating example left and right photo detect (PD) signals vs. angle of incidence for a 1.6 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments.

[0108] FIG. 19B is a graph illustrating an example ratio of left and right PD signals (Pabs_left / Pabs_right) and a threshold vs. angle of incidence for a 1.6 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments.

[0109] FIG. 20A is a schematic plan view illustrating an example pyramid for a dual pixel according to some embodiments.

[0110] FIG. 20B is a schematic side view illustrating an example pyramid for a dual pixel according to some embodiments.

[0111] FIG. 21 A is a schematic plan view illustrating an example pyramid for a quad pixel according to some embodiments.

[0112] FIG. 21 B is a schematic side view illustrating an example pyramid for a quad pixel according to some embodiments.

[0113] FIG. 22A is a schematic plan view illustrating an example pyramid for a nona pixel according to some embodiments.

[0114] FIG. 22B is a schematic side view illustrating an example pyramid for a nona pixel according to some embodiments.

[0115] FIG. 23A is a schematic plan view illustrating an example pyramid for a 45° square based pyramidal structure according to some embodiments.

[0116] FIG. 23B is a schematic plan view illustrating an example pyramid for a disk based pyramidal structure according to some embodiments.

[0117] FIG. 23C is a schematic side view illustrating an example pyramid for a pyramidal structure with a curved convex facet according to some embodiments.

[0118] FIG. 23D is a schematic side view illustrating an example pyramid for a pyramidal structure with a curved concave facet according to some embodiments.

[0119] FIG. 24A is a schematic side view illustrating an example quad pixel with a material layer with a pyramidal structure according to some embodiments.

[0120] FIG. 24B is a schematic side view illustrating an example quad pixel with a color filter with a pyramidal structure according to some embodiments.

[0121] FIG. 24C is a schematic side view illustrating an example quad pixel with a color filter, a material layer, and a pyramidal structure according to some embodiments.

[0122] The entities, connections, arrangements, and the like that are depicted in—and described in connection with—the various figures are presented by way of example and not by way of limitation. As such, any and all statements or other indications as to what a particular figure “depicts,” what a particular element or entity in a particular figure “is” or “has,” and any and all similar statements—that may in isolation and out of context be read as absolute and therefore limiting—may only properly be read as being constructively preceded by a clause such as “In at least one embodiment, ....” For brevity and clarity of presentation, this implied leading clause is not repeated ad nauseam in the detailed description.

DETAILED DESCRIPTION

[0123] A wireless transmit/receive unit (WTRU) may be used, e.g., as a plenoptic camera in some embodiments described herein.

[0124] FIG. 7 is a system diagram illustrating an example WTRU 702. As shown in FIG. 7, the WTRU 702 may include a processor 718, a transceiver 720, a transmit/receive element 722, a speaker/microphone 724, a keypad 726, a display/touchpad 728, non-removable memory 730, removable memory 732, a power source 734, a camera 736, and/or other peripherals 738, among others. It will be appreciated that the WTRU 702 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.

[0125] The processor 718 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 718 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 702 to operate in a wireless environment. The processor 718 may be coupled to the transceiver 720, which may be coupled to the transmit/receive element 722. While FIG. 7 depicts the processor 718 and the transceiver 720 as separate components, it will be appreciated that the processor 718 and the transceiver 720 may be integrated together in an electronic package or chip.

[0126] The transmit/receive element 722 may be configured to transmit signals to, or receive signals from, a base station over the air interface 716. For example, in one embodiment, the transmit/receive element 722 may be an antenna configured to transmit and/or receive RF signals. In an embodiment, the transmit/receive element 722 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 722 may be configured to transmit and/or receive both RF and light signals. It will be appreciated that the transmit/receive element 722 may be configured to transmit and/or receive any combination of wireless signals.

[0127] The transceiver 720 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 722 and to demodulate the signals that are received by the transmit/receive element 722. The WTRU 702 may have multi-mode capabilities. Thus, the transceiver 720 may include multiple transceivers for enabling the WTRU 702 to communicate via multiple radio access technologies, such as New Radio and IEEE 802.11, for example.

[0128] The processor 718 of the WTRU 702 may be coupled to, and may receive user input data from, the speaker/microphone 724, the keypad 726, the display/touchpad 728 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit), and/or the camera 736. The processor 718 may also output user data to the speaker/microphone 724, the keypad 726, and/or the display/touchpad 728. In addition, the processor 718 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 730 and/or the removable memory 732. The non-removable memory 730 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 732 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 718 may access information from, and store data in, memory that is not physically located on the WTRU 702, such as on a server or a home computer (not shown).

[0129] The processor 718 may receive power from the power source 734, and may be configured to distribute and/or control the power to the other components in the WTRU 702. The power source 734 may be any suitable device for powering the WTRU 702. For example, the power source 734 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.

[0130] The processor 718 may also be coupled to the GPS chipset 736, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 702. In addition to, or in lieu of, the information from the GPS chipset 736, the WTRU 702 may receive location information over the air interface 716 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 702 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.

[0131] The processor 718 may further be coupled to other peripherals 738, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 738 may include an accelerometer, an e-compass, a satellite transceiver, additional digital camera (for photographs and/or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a Virtual Reality and/or Augmented Reality (VR/AR) device, an activity tracker, and the like. The peripherals 738 may include one or more sensors, the sensors may be one or more of a gyroscope, an accelerometer, a hall effect sensor, a magnetometer, an orientation sensor, a proximity sensor, a temperature sensor, a time sensor; a geolocation sensor; an altimeter, a light sensor, a touch sensor, a magnetometer, a barometer, a gesture sensor, a biometric sensor, and/or a humidity sensor.

[0132] The emulation devices may be designed to implement one or more tests of other devices in a lab environment and/or in an operator network environment. For example, the one or more emulation devices may perform the one or more, or all, functions while being fully or partially implemented and/or deployed as part of a wired and/or wireless communication network in order to test other devices within the communication network. The one or more emulation devices may perform the one or more, or all, functions while being temporarily implemented/deployed as part of a wired and/or wireless communication network. The emulation device may be directly coupled to another device for purposes of testing and/or may perform testing using over-the-air wireless communications.

[0133] The one or more emulation devices may perform the one or more, including all, functions while not being implemented/deployed as part of a wired and/or wireless communication network. For example, the emulation devices may be utilized in a testing scenario in a testing laboratory and/or a non-deployed (e.g., testing) wired and/or wireless communication network in order to implement testing of one or more components. The one or more emulation devices may be test equipment. Direct RF coupling and/or wireless communications via RF circuitry (e.g., which may include one or more antennas) may be used by the emulation devices to transmit and/or receive data.

[0134] As noted previously, an integrated plenoptic sensor is an imaging sensor in which a microlens is shared between multiple subpixels. An integrated plenoptic sensor records a “4D light-field” which may give indications on the angle of incidence of light. An integrated plenoptic sensor may be used on cameras and smartphones to drive the autofocus of the main lens. Furthermore, an integrated plenoptic sensor may provide additional abilities, such as passive depth mapping, refocusing, or aberration correction.

[0135] FIG. 8A is a schematic side view illustrating an example quad pixel according to some embodiments. FIG. 8B is a schematic plan view illustrating an example quad pixel according to some embodiments. For some embodiments, a quad pixel may be composed of a microlens covering a 2x2 square array of subpixels, multiple layers of different materials including a color filter, and the silicon which absorbs the light. A quad pixel may have trenches to isolate pixels and subpixels electrically and optically. According to some embodiments, a pyramid shape (not shown in FIGS. 8A and 8B) may be inside the silicon to avoid or reduce optical crosstalk between microlenses. For some embodiments, a pyramid shape may be inside the silicon to avoid or reduce optical crosstalk between pixels. FIG. 8B shows four subpixels and four photo diodes associated with each subpixel. The photo diode sensors 852, 854, 856, 858 are labeled PD-A, PD-B, PD-C, and PD-D.

[0136] In particular, an example practical implementation of the example design shown in FIG. 8A is based on a back side illuminated (BSI) pixel including a microlens 802 placed on top of planar layers with a given thickness: for example, a planarization layer 804 made of transparent resin (e.g., 500 nm), a color filter 806 (e.g., in the range of 500 nm to 1 µm), a passivation layer made of silicon dioxide 808 (e.g., 650 nm), and silicon including the four sub-pixels (4 µm) forming the 2x2 square array. For some embodiments, the planarization layer 804 may be made of the same material as the microlens 802. For some embodiments, the planarization layer 804 is made of a different resin than the microlens 802. For some embodiments, the planarization layer 804 is a transparent under-layer that has a thickness in the range of 200 nm to 500 nm. The color filter 806 may be, for example, a red color filter, a green color filter, or a blue color filter. In some embodiments, the color filter 806 may have subpixel areas of different color. For some embodiments, the passivation layer 808 may be made of one or more layers of inorganic material, including, e.g., hafnium oxide, aluminum oxide, and/or tantalum oxide, and may be capped by silicon oxide. Collectively, these materials are known as high-k dielectric materials in the microelectronics industry and may be non-stoichiometric oxides. For example, silicon oxide may be non-stoichiometric. For some embodiments, silicon dioxide may be used instead of silicon oxide. For some embodiments, the total thickness of the planarization layer 804, the color filter 806, and the passivation layer 808 may be within the range of slightly less than 100 nm to less than 1 µm. Of course, the materials and dimensions are merely examples. According to the example, deep trench isolation (DTI) 812 (e.g., silicon oxide or occasionally tungsten) inside silicon may be used to isolate the photodiodes from each other (e.g., 100 nm wide).
For some embodiments, capacitive DTI may be used and may include deep trenches filled with, e.g., (1) a thin silicon oxide liner and/or (2) a core of polycrystalline silicon that may be biased. For some embodiments, the tungsten isolation 810 may be used at the passivation layer level to “guide” the light to the silicon (and then to prevent crosstalk from one pixel to another pixel) and/or to mask part of the pixel to protect part of the pixel from light. In addition, isolation of the color filter may be present. The isolation may be made of material with a lower refractive index than the color filter resin. The isolation may be silicon oxide or may be a dedicated resin for some embodiments. For some embodiments, the isolation may be associated with tungsten walls 860. FIG. 8A shows a view 800 from the side and FIG. 8B shows a view 850 from the top (not to scale).

[0137] Examples of quad-pixel sensors, along with example quad-pixel sensor designs and examples of related wave optics simulations and results, are included in: Chataignier, G., Vandame, B., and Vaillant, J., Joint Electromagnetic and Ray-Tracing Simulations for Quad-Pixel Sensor and Computational Imaging, 27:21 OPTICS EXPRESS (Oct. 14, 2019) ("Chataignier").

[0138] For some embodiments, a shape such as a pyramid shape (not shown in FIGs. 8A or 8B) may be inside (e.g., etched inside or into) the silicon to avoid or reduce optical crosstalk between microlenses. For some embodiments, the pyramid shape and/or pyramidal structure may have one or more of (or any combination of) the following positive effects: (1) to reduce optical crosstalk between adjacent microlenses; (2) to reduce optical crosstalk between adjacent (subpixel) sensors; (3) to reduce the width of the “blind” angular zone, which is denoted as Ra in FIG. 14B's graph of the (Pabs_left / Pabs_right) ratio vs. angle of incidence; (4) to increase sensitivity between adjacent (subpixel) sensors; (5) to increase angular discrimination of light incident on the microlens (or main lens for some embodiments) at an angle near a chief ray angle of the microlens. For some embodiments, one or more external insulators may be adjacent to an external edge of a photo detector. For example, an external insulator (or isolator for some embodiments) may be attached to or next to a portion of a photo detector, such as the example shown in FIG. 8A. For some embodiments, one or more external insulators may be adjacent to an external edge of at least one of the plenoptic subpixel sensors.

[0139] FIG. 9A is a schematic plan view illustrating an example sub aperture image (SAI) pattern for an array of quad pixels according to some embodiments. FIG. 9B is a schematic plan view illustrating example sub aperture images (SAIs) according to some embodiments.

[0140] FIG. 9A shows an array 900 of quad pixels. Each microlens produces a micro-image 902 which is schematically represented by a circle. A set 950 of Sub Aperture Images (SAIs) 952, 954, 956, 958 may be used with photo diode sensors arranged in a subpixel configuration. The SAIs are formed by each subpixel 904, 906, 908, 910 having the same position under the array of microlenses. Depending on the focus distance, the distance to an object, and the geometrical aberrations of the main lens, the object may appear shifted between the SAIs. The amount of translation, or disparity, is the basis of certain algorithms. The SAIs may be shifted (for some embodiments globally) to refocus or locally correct the main lens’ aberrations. The final image is the summation of the shifted SAIs.
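
As an illustration of how SAIs may be assembled, the following Python sketch (a hypothetical helper, not part of any embodiment) splits a raw quad-pixel mosaic into its sub-aperture images by gathering, for each subpixel position under the microlens, the matching subpixel under every microlens:

```python
import numpy as np

def extract_sais(raw, n=2):
    """Split a raw plenoptic mosaic into n*n sub-aperture images (SAIs).

    Each microlens covers an n x n block of subpixels; SAI (i, j)
    collects the subpixel at offset (i, j) under every microlens.
    """
    h, w = raw.shape
    assert h % n == 0 and w % n == 0, "mosaic must tile evenly"
    return {(i, j): raw[i::n, j::n] for i in range(n) for j in range(n)}

# Toy 4x4 mosaic: four microlenses, each covering subpixels A B / C D.
raw = np.array([[0, 1, 0, 1],
                [2, 3, 2, 3],
                [0, 1, 0, 1],
                [2, 3, 2, 3]])
sais = extract_sais(raw)
# SAI (0, 0) gathers every "A" subpixel into a 2x2 image.
```

For a 2x2 quad pixel, each SAI has one quarter of the sensor resolution; disparity estimation then compares these per-position images.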

[0141] A quad-pixel was simulated using electromagnetic simulations with Finite Difference Time Domain (FDTD) to study and optimize the design of quad pixels of 1.75 µm and 0.8 µm in size. The power absorbed by the silicon (Pabs) as a function of the angle of incidence is called the angular response of the pixel. Two types of crosstalk have been identified: inside the microlens and between the microlenses. In the manner of Kobayashi, Masahiro, A Low Noise and High Sensitivity Image Sensor with Imaging and Phase-Difference Detection AF in All Pixels, 4:2 ITE TRANS. ON MEDIA TECHN. AND APPLICATIONS (MTA) 123-128 (2016) ("Kobayashi"), a specific figure of merit was determined and used to compare different quad pixel designs. These comparisons are described later in this specification.

[0142] FIG. 10A is a schematic side view illustrating an example diffraction pattern inside a pixel for light entering at normal incidence according to some embodiments. FIG. 10B is a schematic side view illustrating an example diffraction pattern inside a pixel for light entering at an off-angle according to some embodiments.

[0143] When light enters the microlens, the light forms a diffraction spot, which has a spatial extent. The diffraction spot may cause crosstalk between subpixels, especially near normal incidence (perpendicular to the surface). FIG. 10A shows a diffraction pattern 1000 for light entering the curved microlens at the top of the figure and forming a diffraction spot at the surface of the silicon. The diffraction spot has a spatial extent (or width) that causes optical crosstalk between two subpixel sensors. FIG. 10B shows a diffraction pattern 1050 for light entering the curved microlens at an off-angle to the top of the figure and forming a diffraction spot mostly located above the left subpixel sensor and having less crosstalk than FIG. 10A.

[0144] For light entering a microlens at small angles of incidence, diffraction inside the pixel may be a source of crosstalk between the subpixels. Reducing crosstalk between subpixels may help avoid pollution between SAIs. With more crosstalk, the SAIs look more similar. If the SAIs look more similar, disparity estimation is more difficult because there are fewer clues indicating depth.
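
The spatial extent of the diffraction spot may be estimated with the classical Airy formula; the sketch below is a back-of-envelope illustration only, assuming an ideal, aberration-free microlens in air and ignoring the refractive indices of the stack:

```python
import math

def airy_spot_diameter(wavelength_um, f_number):
    """First-zero (Airy) diameter of the diffraction spot: d = 2.44 * lambda * N.

    Rough estimate only; the real spot inside the stacked dielectrics
    also depends on the media's refractive indices.
    """
    return 2.44 * wavelength_um * f_number

# Green light (0.55 um) focused by a hypothetical f/2 microlens:
d = airy_spot_diameter(0.55, 2.0)  # ~2.68 um, wider than a 1.75 um subpixel
```

Even this crude estimate shows why the spot can straddle several subpixels of a small quad pixel near normal incidence.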

[0145] FIGs. 11A to 11D are schematic plan views illustrating example sub aperture images A to D for an out-of-focus white spot without diffraction according to some embodiments. FIG. 11E is a schematic plan view illustrating an example summation of sub aperture images A to D in FIGs. 11A to 11D according to some embodiments. FIGs. 11F to 11I are schematic plan views illustrating example sub aperture images A to D for an out-of-focus white spot with diffraction according to some embodiments. FIG. 11J is a schematic plan view illustrating an example summation of sub aperture images A to D in FIGs. 11F to 11I according to some embodiments.

[0146] Compare the sub aperture images 1100, 1110, 1120, 1130, 1140 of FIGs. 11A to 11E with the sub aperture images 1150, 1160, 1170, 1180, 1190 of FIGs. 11F to 11J. Both sets of figures show the same out-of-focus point, but the diffraction in FIGs. 11F to 11J causes crosstalk. The crosstalk induces a blur function shown in each of the sub aperture images (SAIs) of FIGs. 11F to 11I. The blur function prevents an accurate disparity estimation. The amount of crosstalk may depend on the size of the microlens in which larger microlenses have more crosstalk, and crosstalk may be difficult to avoid.

[0147] FIG. 12 is a schematic side view illustrating an example crosstalk between microlenses according to some embodiments. Crosstalk between microlenses happens when the angle of incidence of light is too high. There are two main reasons: the aperture of the main lens is too big, or the chief ray angle is too high or not well corrected (or a mix of both). Crosstalk between microlenses at a high angle of incidence degrades the final image, and reduction or avoidance of such crosstalk is desirable.

[0148] In FIG. 12, the angle of incidence is larger and off-angle. As a result, the chief ray angle of the chief ray 1212 is too high, and the light rays pass through the microlens and towards a neighboring pixel due to crosstalk instead of going towards the center of the pixel while passing through the planarization layer 1204, color filter 1206, oxide layer 1208, and photo detector 1210.

[0149] Described herein are embodiments which may decrease crosstalk between microlenses 1202. With a reduction in crosstalk, some embodiments may pair a sensor with a main lens that has a higher f-number (f/#). The f-number is the ratio of the focal length to the diameter of the lens. The higher the f-number, the smaller the aperture. Some embodiments may make the sensor more resilient with respect to the chief ray angle.

[0150] FIGs. 13A-13E are schematic plan views illustrating an example Chief Ray Angle correction via a microlens shift according to some embodiments. For purposes of this description, a Chief Ray (CR) may be defined as a light ray that passes through the center of a main lens. FIG. 13A shows a pixel structure 1300 for a telecentric lens in which the Chief Rays 1302, 1304 are perpendicular to the image plane 1308. If the main lens 1306, 1326 is not telecentric in the image space, the chief ray 1322, 1324 forms an angle, the Chief Ray Angle (CRA), with the sensor and image plane 1328 as shown in the pixel structure 1320 of FIG. 13B.

[0151] FIG. 13C shows a pixel structure 1340 for a first pixel sensor aligned with a first chief ray 1342 passing through the center of the main lens 1346 along the optical axis of the main lens. A second chief ray 1344 passes through the main lens 1346 and hits a second pixel sensor off-center 1348. FIG. 13D shows a pixel structure 1360 for an enlarged view of the chief ray 1362, the second chief ray 1364, and the second pixel sensor 1366. Such a situation may be corrected to avoid crosstalk between pixel sensors. Furthermore, such a situation may be corrected so that a centered angular response exists for each pixel, independent of the position of the sensor. For some embodiments, one way to correct the Chief Ray Angle (CRA) is to decenter (or shift) the microlens. The amount of translation may be calculated given the specifications for the main lens. FIG. 13E shows a pixel structure 1380 for a Chief Ray Angle correction 1386 in which the microlens is shifted. As a result, the chief ray 1364 shown in FIG. 13D is shifted to hit the center 1382 of the pixel sensor for the chief ray 1384 shown in FIG. 13E. For some embodiments, the main lens characteristic of CRA vs. image height gives the chief ray angle as a function of position in the image plane. Hence, for each pixel, the location of the pixel within the image plane is used to determine the CRA seen by the pixel and to determine the amount of shift of the microlens for that particular pixel.
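
The microlens shift may be approximated, for illustration, by projecting the chief ray through the optical stack. The sketch below is a simplified geometric model only: refraction inside the stack and the exact CRA-vs-image-height curve of a real main lens are ignored, and the parameter values are hypothetical:

```python
import math

def microlens_shift(stack_height_um, cra_deg):
    """Lateral microlens shift that recenters the chief ray on the pixel.

    Simplified geometry: the chief ray crosses the optical-stack height h
    at the chief ray angle, so shift ~= h * tan(CRA). Refraction inside
    the stacked layers is neglected.
    """
    return stack_height_um * math.tan(math.radians(cra_deg))

# Hypothetical example: 2 um optical stack, 15 degree CRA near the sensor edge.
shift = microlens_shift(2.0, 15.0)  # ~0.54 um shift toward the optical axis
```

In practice the shift would be tabulated per pixel from the main lens' CRA-vs-image-height characteristic, as the text describes.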

[0152] For some embodiments, the microlens may be configured to shift a chief ray angle corresponding to a pixel offset of two pixels. For example, the microlens may have a shape or arc that shifts the chief ray angle by an amount corresponding to a shift as shown in FIG. 13E. For some embodiments, a pixel (or a picture element for some embodiments) corresponds to a microlens that is an optical element that spatially samples an image. Sub-pixels may denote a matrix of elements below the microlens. The microlens shift may correspond to, e.g., the center of the pixel (or the center of the group of sub-pixels) at the silicon level. For example, the matrix of sub-pixels may include quad-pixels, 2x2-pixels, 3x3-pixels, and other arrangements for a matrix of pixels.

[0153] FIG. 14A is a graph illustrating example left and right photo detect (PD) signals vs. angle of incidence according to some embodiments. For the graph 1400 in FIG. 14A, the left photo detect signal is PD-L 1402, and the right photo detect signal is PD-R 1404.

[0154] For each simulated angle of incidence, the left/right ratio described in Kobayashi was used to compare different pixel designs. This figure of merit, the (Pabs_left / Pabs_right) ratio, was computed and compared to an arbitrary threshold. If the ratio was below the threshold, the signal was treated as “good.” Three areas on the ratio curve were distinguished: a) At low angles of incidence in which the ratio is above the threshold, the signal is considered “bad” due to microlens diffraction, as discussed above. The range of this area is denoted Ra. b) For angles of incidence in which the ratio is below the threshold, the signal is considered “good.” The range of this area is denoted Rb. c) For angles of incidence in which the ratio goes above the threshold again (angles of incidence beyond the end of Rb), the signal is considered “bad” due to crosstalk between microlenses, as discussed above.
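
The figure of merit may be sketched as follows. The numerical values below are illustrative placeholders rather than simulated data, and the 0.5 threshold is an arbitrary choice, as in the text:

```python
import numpy as np

def good_angle_mask(p_left, p_right, threshold=0.5):
    """Apply the Kobayashi-style left/right figure of merit.

    The signal at a given angle of incidence is 'good' where
    Pabs_left / Pabs_right falls below the threshold; it is 'bad' at low
    angles (microlens diffraction) and again at high angles (crosstalk
    between microlenses), where the ratio rises above the threshold.
    """
    ratio = np.asarray(p_left, float) / np.asarray(p_right, float)
    return ratio < threshold

# Hypothetical angular sweep (illustrative values, not simulated data):
angles = np.array([0, 5, 10, 15, 20, 25, 30])           # degrees
left   = np.array([1.0, 0.8, 0.4, 0.3, 0.3, 0.6, 0.9])  # Pabs_left
right  = np.ones_like(left)                              # Pabs_right
good = good_angle_mask(left, right)
# Ra spans the leading 'bad' angles; Rb spans the contiguous 'good' ones.
```

Ra and Rb then follow by measuring the widths of the bad and good runs of the mask over the swept angles.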

[0155] FIG. 14B is a graph illustrating an example ratio of left and right PD signals (Pabs_left / Pabs_right) vs. angle of incidence according to some embodiments. The graph 1450 in FIG. 14B indicates the first range (item a 1452) for angles of incidence of -5° to +5°. The second range (item b 1454) is shown for angles of incidence of 5° to 25°. The third range (item c) in FIG. 14B includes angles of incidence above 25°. For the example shown in FIG. 14B, Ra is 10° and Rb is 20°.

[0156] For some embodiments, the angular range Ra is reduced (or even minimized for some embodiments), and the angular range Rb is increased (or even maximized for some embodiments). For some embodiments, the radius of curvature of the microlens and the height of the optical stack (the distance between the bottom of the microlens and the surface of the silicon) are adjusted to reduce Ra and increase Rb.

[0157] To speed up simulation of a quad pixel, only subpixels A and B, which are shown in FIG. 8B, were simulated. The real “left signal” (Pabs,left) is the sum of subpixels A and C. The real “right signal” (Pabs,right) is the sum of subpixels B and D. Because of the symmetry of a quad pixel along the horizontal and vertical axes, the ratio calculated above was not affected. Although the left/right ratio was used above, the top/bottom ratio could have been used instead because of the symmetry of the quad pixel.
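The symmetry argument above can be made concrete with a short sketch. This is illustrative only (the absorbed-power values are hypothetical, not simulation output): for a horizontal angle scan, mirror symmetry about the horizontal axis means subpixel C absorbs the same power as A and D the same as B, so the full left/right ratio (A + C)/(B + D) reduces to A/B and only A and B need be simulated.

```python
def left_right_ratio(A, B, C, D):
    """Full quad-pixel left/right ratio: left = A + C, right = B + D."""
    return (A + C) / (B + D)

# Hypothetical absorbed powers for subpixels A and B at one angle:
A, B = 0.8, 0.2
# Mirror symmetry along the horizontal axis for a horizontal angle scan:
C, D = A, B

full = left_right_ratio(A, B, C, D)   # uses all four subpixels
reduced = A / B                       # uses only the simulated pair
assert abs(full - reduced) < 1e-12    # the ratio is unaffected
```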

[0158] FIG. 15A is a schematic side view illustrating an example low angle of incidence for a quad pixel without a pyramid shape etched in the silicon according to some embodiments. FIG. 15B is a schematic side view illustrating an example higher angle of incidence for a quad pixel without a pyramid shape etched in the silicon according to some embodiments. FIG. 15C is a schematic side view illustrating an example low angle of incidence for a quad pixel with a pyramid shape etched in the silicon according to some embodiments. FIG. 15D is a schematic side view illustrating an example higher angle of incidence for a quad pixel with a pyramid shape etched in the silicon according to some embodiments.

[0159] For some embodiments, an inverted pyramid shape is etched in the silicon to reduce crosstalk between microlenses. The use of such an inverted pyramid shape (which may be a pyramidal structure or pyramid-shaped prism depending on context for some embodiments) may increase convergence (and even ensure optimal convergence for some embodiments) of the microlens at low angles of incidence, thereby reducing Ra, and decrease (or “break”) convergence at higher angles of incidence. As such, at higher angles of incidence, the silicon absorbs more of the light before the light enters a neighboring pixel cell structure.

[0160] For some embodiments, the optical stack may be decreased (or “thinned”). For example, compare the optical stack 1500, 1520 of microlens 1502, 1522, planarization 1504, 1524, color filter 1506, 1526, oxide layer 1508, 1528, and photo detector 1510, 1530 of FIGs. 15A and 15B with the optical stack 1540, 1560 of microlens 1542, 1562, planarization 1544, 1564, color filter 1546, 1566, oxide layer 1548, 1568, pyramidal structure 1550, 1570, and photo detector 1552, 1572 of FIGs. 15C and 15D. The optical stack, for some embodiments, may be thinned by as much as the height of the inverted pyramid shape. For some embodiments, the stack (the distance between the microlens and an upper face of the silicon) may be made thinner by having part of the material (mostly the color filters, for example) fill the pyramidal structure. Generally, the silicon has a very high refractive index (n = 3.8 to 4.0 over the visible and near-infrared range). Placing oxide, resin, and/or other transparent materials (which have a refractive index of n = 1.4 to 2.2 over the same spectral range as the silicon) above the optical prism will deviate the light. For some embodiments, the horizontal facet angle may be set to 54.7°, which may be due to a manufacturing process. For example, in embodiments using a wet etching process with silicon, such an angle may be used since 54.7° is the natural angle formed by the silicon when etched, due to the crystalline structure of the silicon. The angle may vary with the use of other processes and/or materials. FIGs. 15A and 15B show a quad pixel without a pyramidal structure 1500, 1520. FIGs. 15C and 15D show a quad pixel adjusted using a pyramidal structure 1540, 1560. For some embodiments, the pyramidal structure 1550, 1570 may be generated with a photoresist process using grayscale lithography, for example, and the pyramid shape of the pyramidal structure may be transferred into the silicon using a dry etching process.
For some embodiments, the horizontal facet angle may be an angle other than 54.7° and may be tuned depending on the configuration of the pixel, the objective of the pyramidal structure, and requirements for the width of Ra and Rb.
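The light deviation at an inclined facet can be sketched with Snell's law. This is an illustrative sketch, not taken from the application's simulations: the indices n1 = 1.5 (within the stated 1.4 to 2.2 range for the transparent fill) and n2 = 4.0 (silicon) are assumed example values, and the geometry considers a vertically incident ray meeting a facet tilted 54.7° from the horizontal.

```python
import math

def refracted_angle_deg(theta_i_deg, n1=1.5, n2=4.0):
    """Angle of the transmitted ray from the facet normal (degrees),
    via Snell's law: n1 * sin(theta_i) = n2 * sin(theta_t)."""
    s = n1 / n2 * math.sin(math.radians(theta_i_deg))
    return math.degrees(math.asin(s))

# A vertically incident ray meets a facet tilted 54.7 deg from the
# horizontal at 54.7 deg from the facet normal; entering the much denser
# silicon, it is bent strongly toward the facet normal:
theta_t = refracted_angle_deg(54.7)
# Net angular deviation away from the original vertical direction:
deviation = 54.7 - theta_t
```

The transmitted ray ends up roughly 17.8° from the facet normal instead of 54.7°, i.e., tilted well away from vertical, which is consistent with the prism-like deviation of light away from the pixel center described above.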

[0161] For some embodiments, a pixel stack structure may include, e.g., a microlens, a color filter system (for example), and two or more plenoptic subpixel sensors, in which the two or more plenoptic subpixel sensors may include two or more photodiodes formed in a silicon layer, and in which a pyramid shape is etched in the two or more respective silicon photodiodes formed in the silicon layer. For some embodiments, the pyramid shape and/or pyramidal structure may be configured to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors. In some embodiments, the pyramidal structure may include a pyramid shaped insulator or pyramid shaped insulator structure. For some embodiments, a horizontal facet angle of the pyramid shape of the pyramidal structure may be 54.7 degrees. An example of a pyramidal structure 1550 is shown in FIG. 15C. For some embodiments, the color filter system may include a planarization layer and a color filter layer. An example color filter system is shown in FIG. 15C for some embodiments. For some embodiments, one or more plenoptic subpixel sensors may be adjacent to the color filter system. For some embodiments, the pixel stack structure may include an oxide layer, in which the oxide layer is adjacent to the color filter system, in which at least one of the plenoptic subpixel sensors may be adjacent to the oxide layer, and in which the oxide layer may be between the color filter system and at least one of the plenoptic subpixel sensors. For some embodiments, a base of the pyramid portion of the pyramid shape may be adjacent to the oxide layer. For some embodiments, the size of a base of the pyramid portion of the pyramid shape may be adjusted based on a curvature of the microlens. 
For example, the size of the base of the pyramid may be adjusted to be slightly wider than the area over which low-angle, near-normal-incidence light rays strike the pyramidal structure after passing through the microlens and one or more other layers, such as the planarization, color filter, and oxide layers. FIG. 15C shows one example embodiment of such a pyramidal structure. For some embodiments, the pyramid shape may be symmetrical along a first plane cutting through a center of the pyramid shape and symmetrical along a second plane cutting through the center of the pyramid shape, in which the first plane is perpendicular to the second plane. For example, the two planes may form a “plus” type shape when viewed from the top of the pixel stack structure for some embodiments. For some embodiments, the color filter system may be adjacent to the microlens. For example, the planarization layer may be part of the color filter layer or may not be present for some embodiments. For some embodiments, the base of the pyramidal shape, as seen from the top, may be rotated 45°. For some embodiments, the base of the pyramid shape may be a disk. For some embodiments, the tip of the pyramid shape may be a disk.

[0162] As shown in FIGs. 16A to 19B and discussed below, the pyramidal shape is able to increase the width of the Rb region. By tuning the microlens focal length and the distance between the microlens and the silicon, the width of the Ra region also may be reduced. The microlens focal length and the distance between the microlens and the silicon may be tuned together and validated by simulation. The effect of the pyramid shape is to deviate light away from the center of the pixel (corresponding to the microlens), thereby improving the right/left discrimination. This effect allows the distance between the microlens and the silicon to be reduced, enlarging the Rb region, thanks to the deep trench isolation (DTI) structure, which confines the light inside a given sub-pixel.

[0163] For some embodiments, an apparatus may include: a microlens; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors including two or more photodiodes formed in a silicon layer, in which a pyramidal structure is etched in the two or more respective silicon photodiodes formed in the silicon layer. For some embodiments, the pyramid structure may be formed and/or configured for one or more of the following features: to generate an optical prism deviating light away from a center of the two or more plenoptic subpixel sensors; to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors; to increase sensitivity of at least one of the two or more plenoptic subpixel sensors; to increase angular discrimination for light incident on the microlens at an angle near a chief ray angle of the microlens; to confine light to one of the two or more plenoptic subpixel sensors; and to reflect light away from at least one of the two or more plenoptic subpixel sensors. For some embodiments, the silicon layer may be etched with a trench conforming to a deep trench isolation (DTI) structure. For some embodiments, the focal length of the microlens may be less than a threshold.

[0164] FIG. 16A is a graph illustrating example left and right photo detect (PD) signals (Pabs,left, Pabs,right) vs. angle of incidence for a 3.5 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments. FIG. 16B is a graph illustrating an example ratio of left and right PD signals (Pabs,left / Pabs,right) and a threshold vs. angle of incidence for a 3.5 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments. The graphs 1600, 1650 of FIGs. 16A and 16B relate to a 3.5 µm quad pixel (1.75 µm subpixels) without a pyramid shape in which Ra is 10° and Rb is 25°. FIGs. 15A and 15B show a quad pixel without a pyramid shape. For FIGs. 16A and 16B, the reference design has an optical stack height of 1.9 µm (the distance between the bottom of the microlens and the top of the silicon).

[0165] FIG. 17A is a graph illustrating example left and right photo detect (PD) signals vs. angle of incidence for a 3.5 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments. FIG. 17B is a graph illustrating an example ratio of left and right PD signals (Pabs,left / Pabs,right) and a threshold vs. angle of incidence for a 3.5 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments. The graphs 1700, 1750 of FIGs. 17A and 17B relate to a 3.5 µm quad pixel (1.75 µm subpixels) with a pyramid shape in which Ra is 14° and Rb is out of range (at least 35°). For FIGs. 17A and 17B, the pyramid design has a stack height of 0.6 µm and a pyramid height of 1.236 µm.

[0166] FIG. 18A is a graph illustrating example left and right photo detect (PD) signals (Pabs,left, Pabs,right) vs. angle of incidence for a 1.6 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments. FIG. 18B is a graph illustrating an example ratio of left and right PD signals (Pabs,left / Pabs,right) and a threshold vs. angle of incidence for a 1.6 µm quad pixel without a pyramid shape etched in the silicon according to some embodiments. The graphs 1800, 1850 of FIGs. 18A and 18B relate to a 1.6 µm quad pixel (0.8 µm subpixels) without a pyramid shape in which Ra is 18° and Rb is 14°. The quad pixel in FIGs. 12A and 12B is “normally optimized” without using a pyramid shape. For FIGs. 18A and 18B, the reference design has an optical stack height of 1.1 µm.

[0167] FIG. 19A is a graph illustrating example left and right photo detect (PD) signals (Pabs,left, Pabs,right) vs. angle of incidence for a 1.6 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments. FIG. 19B is a graph illustrating an example ratio of left and right PD signals (Pabs,left / Pabs,right) and a threshold vs. angle of incidence for a 1.6 µm quad pixel with a pyramid shape etched in the silicon according to some embodiments. The graphs 1900, 1950 of FIGs. 19A and 19B relate to a 1.6 µm quad pixel (0.8 µm subpixels) with a pyramid shape in which Ra is 20° and Rb is 20°. For FIGs. 19A and 19B, the pyramid design has a stack height of 0.7 µm and a pyramid height of 0.565 µm.

[0168] The threshold line in FIGs. 16B, 17B, 18B, and 19B is a ratio equal to 0.2. In FIG. 14B, the ratio is two-sided in which the angle of incidence is both positive and negative. In FIGs. 16B, 17B, 18B, and 19B, the ratio is one-sided in which the angle of incidence is only positive. The angular range Ra in FIGs. 16B, 17B, 18B, and 19B is doubled when using the definition for angular range Ra as in FIG. 14B due to this one-sided vs. two-sided ratio.

[0169] In some embodiments, using a structure with a pyramidal or pyramid shape provides substantial gains on the crosstalk between microlenses, with reduced (or even minimal) losses on the crosstalk between subpixels. In some embodiments, this pyramidal structure allows the use of a wide-aperture main lens, and this structure makes the sensor more robust with respect to the chief ray angle.

[0170] Constraints in the manufacturing process may limit the degree to which the optical stack components may be thinned, thereby explaining the losses on the Ra range. To preserve color filtering functionality, for some embodiments a limit may be imposed on the amount of height reduction of the color filter. For some embodiments, the base of the pyramid may be set equal to half of the diameter of the microlens, and the height of the pyramid may be set according to equation 13:

pyramid height = 0.5 × (microlens diameter) / tan(54.7°) = 0.354 × (microlens diameter)    Eq. 13

For some embodiments, the 54.7° facet angle may be used to determine the height of the pyramid. For some embodiments, the horizontal facet angle may be an angle other than 54.7°, and the pyramid height may be calculated as shown in Eq. 14:

pyramid height = 0.5 × (microlens diameter) / tan(horizontal facet angle)    Eq. 14

[0171] For some embodiments, a base of the pyramid shape may be equal to half of a diameter of the microlens. For example, the size of the base may be calculated using Eq. 13 for some embodiments.
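Equations 13 and 14 can be checked numerically with a short sketch (illustrative only; the `pyramid_height` helper is not part of the application):

```python
import math

def pyramid_height(microlens_diameter, facet_angle_deg=54.7):
    """Eq. 14 (Eq. 13 when the facet angle is 54.7 deg): the pyramid base
    is half the microlens diameter, and the height is the half-base
    divided by the tangent of the horizontal facet angle."""
    return 0.5 * microlens_diameter / math.tan(math.radians(facet_angle_deg))

h_35 = pyramid_height(3.5)   # 3.5 um microlens -> ~1.24 um pyramid
h_16 = pyramid_height(1.6)   # 1.6 um microlens -> ~0.57 um pyramid
```

For the 54.7° facet, 0.5 / tan(54.7°) ≈ 0.354, which recovers the 0.354 × (microlens diameter) factor of Eq. 13 and is consistent with the pyramid heights quoted for the 3.5 µm and 1.6 µm pixel examples above.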

[0172] FIG. 20A is a schematic plan view illustrating an example pyramid (e.g., an inverted pyramid) for a dual pixel sensor configuration according to some embodiments. FIG. 20B is a schematic side view illustrating an example pyramid for a dual pixel sensor configuration according to some embodiments. For some embodiments, a pyramidal structure 2002, 2052, which may include a pyramid shaped insulator or other material or combination of materials, may be used with a dual pixel plenoptic sensor, such as a 2x1 grid of rectangular subpixels 2004, 2054, in which a pyramid shape is extended in one direction to create a triangular trench (e.g., a triangular prism shaped trench). FIGs. 20A and 20B show an example of how such a pyramid shape may be configured with a dual pixel sensor for some embodiments.

[0173] For some embodiments, the pyramidal structure may conform to pyramid-shaped trenches etched in the silicon layer. For example, the pyramidal structure may fill in, e.g., a pyramid shaped trench such as the triangle shaped trench and form a flat base even with the top of two or more photo detectors as shown in FIG. 20B for some embodiments. For some embodiments, the plenoptic subpixel sensors may be arranged in a 2x1 grid in which the pyramid shape is centered on a line dividing the 2x1 grid in half. For example, the arrangement may be as shown in FIG. 20A in which the pyramid shape is symmetrical along the dashed line for some embodiments. For some embodiments, the pyramid portion of the pyramid shape is a triangle extruded in at least one direction. For example, a triangle may be extruded along the dashed line as shown in FIGs. 20A and 20B for some embodiments.

[0174] It will be understood, of course, in some embodiments and in different manufacturing processes in some embodiments, that a pyramidal structure as part of a dual or multi-pixel sensor configuration may approximate, or exhibit generally, a pyramid shape in observable, or potentially measurable, respects, but the pyramidal structure might not be formed as a “perfect” pyramid, pyramidal, or triangular abstract or geometric shape. The pyramidal structure may also have, e.g., convex or concave features, examples of which are shown in FIGs. 23C and 23D (discussed below).

[0175] FIG. 21A is a schematic plan view illustrating an example pyramid (e.g., an inverted pyramid) for a quad pixel sensor configuration according to some embodiments. FIG. 21B is a schematic side view illustrating an example pyramid for a quad pixel sensor configuration according to some embodiments. For some embodiments, a pyramid shape 2102, 2152 may be used with a quad pixel plenoptic sensor, such as a 2x2 grid of rectangular subpixels 2104, 2154. FIGs. 21A and 21B show an example of how such a pyramid shape may be configured with a quad pixel sensor for some embodiments.

[0176] For some embodiments, the plenoptic subpixel sensors may be arranged in a 2x2 grid, in which the pyramidal shape is centered on a center of the 2x2 grid. For example, the pyramid shape may have a square base which is centered around the center point of the 2x2 grid as shown in FIG. 21A for some embodiments. In some embodiments, the pyramid shape may be rotated 45 degrees from what is shown in FIG. 21A so that the base of the pyramid is a diamond with the four corners of the square aligning with the dashed lines shown in FIG. 21A. For some embodiments, the plenoptic subpixel sensors may be arranged in a grid array in which the pyramid shape is centered on a center of the grid array. For some embodiments, the base of the pyramid portion of the pyramid shape may be a square. For some embodiments, in which a color filter system is included, two or more plenoptic subpixel sensors may be adjacent to the color filter system. For example, the color filter system may include a color filter layer that is directly adjacent to the top of one or more photo detectors (or portions thereof) for some embodiments. For some embodiments, the color filter system may include a filter pattern of two or more subpixels.

[0177] FIG. 22A is a schematic plan view illustrating an example pyramid (e.g., an inverted pyramid) for a nona pixel sensor configuration according to some embodiments. FIG. 22B is a schematic side view illustrating an example pyramid for a nona pixel sensor configuration according to some embodiments. For some embodiments, a pyramid shape 2202, 2252 may be used with a nona pixel plenoptic sensor, such as a 3x3 grid of square subpixels 2204, 2254, in which the tip of a pyramid shape is centered on the center subpixel for some embodiments. For some embodiments, the tip or top of the pyramid may be a flat plateau. For some embodiments, the tip of the pyramid may be configured to match the area of the center subpixel. An example of such a configuration is shown in FIGs. 22A and 22B for some embodiments.

[0178] An angle of 54.7° was used with the analysis described herein because such an angle is the natural angle of silicon when using wet etching. Alternatively, other angles may be used. For example, greyscale lithography may be used for some embodiments. For some embodiments, a non-straight face, such as the examples shown in FIGs. 23C and 23D for some embodiments and described below, may be used with the pyramid shape. For some embodiments, the pyramid shape may be adapted according to the shape of the microlens (e.g., via diameter or via radius of curvature of the microlens). For some embodiments, the pyramid shape may be adapted to address a manufacturing constraint. For example, one or more of the layers of a pixel apparatus may be adjusted (or minimized), such as the height of the planarization, color filter, or other layer. Furthermore, a passivation process may be used to reduce the reactivity of one or more surfaces of a pixel sensor apparatus.

[0179] For some embodiments, the plenoptic subpixel sensors may be arranged in a 3x3 grid, and the pyramid shape may be centered on a center of the 3x3 grid. For some embodiments, the pyramid shape may be a frustum, and the flat top of the frustum may be configured to match a center subpixel sensor of the 3x3 grid. For example, the pyramid shape may be symmetrical and centered around the center subpixel sensor of the 3x3 grid for some embodiments. In some embodiments, the tip of the pyramid shape is a square. For example, the tip of the pyramidal structure may be clipped so that the tip is a square. The square tip may correspond to a square center subpixel sensor of a 3x3 grid of subpixels. For some embodiments, the pyramid portion of the pyramid shape may be a polygon extruded in at least one direction.
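The frustum geometry described above can be sketched as follows. This is a hypothetical geometry helper, not from the application: for a frustum whose flat top matches the center subpixel of a 3x3 grid, the height follows from the facet angle and the base and top widths, and the example dimensions are illustrative units.

```python
import math

def frustum_height(base_width, top_width, facet_angle_deg=54.7):
    """Height of a square frustum whose facets are inclined
    facet_angle_deg from the horizontal: the facet rises over a
    horizontal run of half the base/top width difference."""
    return 0.5 * (base_width - top_width) * math.tan(math.radians(facet_angle_deg))

# Example: a nona pixel 3.0 units wide (subpixels 1.0 unit wide), with the
# flat top of the frustum matching the 1.0-unit-wide center subpixel:
h = frustum_height(3.0, 1.0)

# With top_width = 0, the frustum degenerates to a full pyramid and the
# formula reduces to the Eq. 14 relation with base = base_width.
```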

[0180] FIG. 23A is a schematic plan view illustrating an example pyramid for a 45° square based pyramidal structure according to some embodiments. The pyramidal structure 2302 may be symmetrical and centered around the center of the pixel for some embodiments. The pyramidal structure may be rotated 45° in comparison with the orientation of the subpixels 2304, the boundaries of which are indicated by the dashed lines.

[0181] FIG. 23B is a schematic plan view illustrating an example pyramid for a disk based pyramidal structure according to some embodiments. The pyramidal structure 2322 may be symmetrical and centered around the center of the pixel for some embodiments. For some embodiments, the pixel may include four subpixels 2324. The pyramidal structure may be a cone with the tip of the cone centered on the center of the pixel for some embodiments.

[0182] FIG. 23C is a schematic side view illustrating an example pyramid for a pyramidal structure with a curved convex facet according to some embodiments. The pyramidal structure 2342 may be symmetrical and centered around the center of the pixel for some embodiments. For some embodiments, the pixel may include four subpixels 2344. The pyramidal structure may have curved convex facets for some embodiments. The base of the pyramidal structure (not shown) may be a square or a circular disk for some embodiments.

[0183] FIG. 23D is a schematic side view illustrating an example pyramid for a pyramidal structure with a curved concave facet according to some embodiments. The pyramidal structure 2362 may be symmetrical and centered around the center of the pixel for some embodiments. For some embodiments, the pixel may include four subpixels 2364. The pyramidal structure may have curved concave facets for some embodiments. The base of the pyramidal structure (not shown) may be a square or a circular disk for some embodiments. [0184] FIG. 24A is a schematic side view illustrating an example quad pixel with a material layer with a pyramidal structure according to some embodiments. For some embodiments, the material layer 2408 may include a pyramidal structure (or shape) as shown in FIG. 24A. For some embodiments, the material layer (which may be called a passivation layer or an oxide layer in some embodiments) may include a flat plane section with a thickness attached to a pyramid shape. The material layer may be filled with an inorganic material, such as, e.g., hafnium oxide, aluminum oxide, tantalum oxide, and/or silicon oxide for some embodiments. For some embodiments, a pixel structure 2400 may include a microlens 2402, a planarization layer 2404, a color filter 2406, a material layer 2408, and one or more photo detectors 2410 as shown in FIG. 24A.

[0185] FIG. 24B is a schematic side view illustrating an example quad pixel with a color filter with a pyramidal structure according to some embodiments. In some embodiments, the material layer may follow a pyramidal structure (or shape) with a thickness as shown in FIG. 24B. For some embodiments, the color filter may include a pyramid shape as shown in FIG. 24B. In some embodiments, the color filter may include a flat plane section with a thickness attached to a pyramid shape. For some embodiments, the material layer may be filled with an inorganic material, such as, e.g., hafnium oxide, aluminum oxide, tantalum oxide, and/or silicon oxide. For some embodiments, a pixel structure 2430 may include a microlens 2432, a planarization layer 2434, a color filter 2436, a material layer 2438, and one or more photo detectors 2440 as shown in FIG. 24B.

[0186] FIG. 24C is a schematic side view illustrating an example quad pixel with a color filter, a material layer, and a pyramidal structure according to some embodiments. In some embodiments, the material layer may follow a pyramidal structure (or shape) with a thickness as shown in FIG. 24C. For some embodiments, the material layer may be filled with silicon dioxide (SiO2). In some embodiments, a separate pyramidal structure may be used with the material layer that follows a pyramid with a thickness as shown in FIG. 24C. The pyramidal structure, the material layer, and the color filter may each be a different material than the other two layers for some embodiments. For some embodiments, the pyramidal structure may be filled with an inorganic material, such as, e.g., hafnium oxide, aluminum oxide, tantalum oxide, and/or silicon oxide, and the color filter and the material layer may each be filled with color filter resin. For some embodiments, a pixel structure 2460 may include a microlens 2462, a planarization layer 2464, a color filter 2466, a pyramidal structure 2468, a material layer 2470, and one or more photo detectors 2472 as shown in FIG. 24C.

[0187] For some embodiments, the pyramidal structure may include a material layer above the silicon layer. For some embodiments, the pyramidal structure may include a color filter layer of the color filter system above the material layer. For some embodiments, the color filter layer may include a color filter resin. For some embodiments, the material layer may include an oxide layer between the color filter system and at least one of the two or more plenoptic subpixel sensors. For some embodiments, the pyramidal structure may be made of silicon dioxide (SiO2).

[0188] An example apparatus in accordance with some embodiments may include a microlens; and two or more plenoptic subpixel sensors, the two or more plenoptic subpixel sensors including two or more photodiodes formed in a silicon layer, wherein a pyramidal structure is etched in the two or more respective silicon photodiodes formed in the silicon layer.

[0189] For some embodiments of the example apparatus, the pyramidal structure may be formed to generate an optical prism deviating light away from a center of the two or more plenoptic subpixel sensors.

[0190] For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two of the two or more plenoptic subpixel sensors.

[0191] For some embodiments of the example apparatus, the pyramidal structure may be configured to increase sensitivity of at least one of the two or more plenoptic subpixel sensors.

[0192] For some embodiments of the example apparatus, the pyramidal structure may be configured to increase angular discrimination for light incident on the microlens at an angle near a chief ray angle of the microlens.

[0193] For some embodiments of the example apparatus, the pyramid structure may be configured to confine light to one of the two or more plenoptic subpixel sensors.

[0194] For some embodiments of the example apparatus, the pyramid structure may be configured to reflect light away from at least one of the two or more plenoptic subpixel sensors.

[0195] For some embodiments of the example apparatus, the silicon layer may be etched with a trench conforming to a deep trench isolation (DTI) structure.

[0196] For some embodiments of the example apparatus, a focal length of the microlens may be less than a threshold.

[0197] For some embodiments of the example apparatus, the two or more plenoptic sensors may include a dual pixel sensor; and the pyramidal structure may include a triangular structure. [0198] For some embodiments of the example apparatus, the two or more plenoptic sensors may include a quad pixel sensor.

[0199] For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer.

[0200] For some embodiments of the example apparatus, the pyramidal structure may conform to pyramid-shaped trenches etched in the silicon layer.

[0201] Some embodiments of the example apparatus may further include a color filter system.

[0202] For some embodiments of the example apparatus, the pyramidal structure may include a material layer above the silicon layer; and the material layer may include an oxide layer between the color filter system and at least one of the two or more plenoptic subpixel sensors.

[0203] For some embodiments of the example apparatus, the pyramidal structure may further include a color filter layer of the color filter system above the material layer.

[0204] For some embodiments of the example apparatus, the color filter layer may include a color filter resin.

[0205] For some embodiments of the example apparatus, the color filter system may include a planarization layer and a color filter layer.

[0206] For some embodiments of the example apparatus, a horizontal facet angle of the pyramidal structure may be 54.7 degrees.

[0207] For some embodiments of the example apparatus, a base of the pyramidal structure may be equal to half of a diameter of the microlens.

[0208] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2x1 grid, and the pyramidal structure may be centered on a line dividing the 2x1 grid in half.

[0209] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 2x2 grid, and the pyramidal structure may be centered on a center of the 2x2 grid.

[0210] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a 3x3 grid, and the pyramidal structure may be centered on a center of the 3x3 grid.

[0211] For some embodiments of the example apparatus, the pyramidal structure may be a frustum, and a flat top of the frustum may be configured to match a center subpixel sensor of the 3x3 grid. [0212] For some embodiments of the example apparatus, a tip of the pyramidal structure may be a square.

[0213] For some embodiments of the example apparatus, a base of the pyramidal structure may be a disk.

[0214] For some embodiments of the example apparatus, a tip of the pyramidal structure may be a disk.

[0215] For some embodiments of the example apparatus, the two or more plenoptic subpixel sensors may be arranged in a grid array, and the pyramidal structure may be centered on a center of the grid array.

[0216] For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a triangle extruded in at least one direction.

[0217] For some embodiments of the example apparatus, a pyramid portion of the pyramidal structure may be a polygon extruded in at least one direction.

[0218] For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be a square.

[0219] Some embodiments of the example apparatus may further include one or more external insulators adjacent to an external edge of at least one of the plenoptic subpixel sensors.

[0220] For some embodiments of the example apparatus, the microlens may be configured to shift a chief ray angle corresponding to a pixel offset of two pixels.

[0221] For some embodiments of the example apparatus, at least one of the two or more plenoptic subpixel sensors may be adjacent to the color filter system.

[0222] Some embodiments of the example apparatus may further include an oxide layer, wherein the oxide layer may be adjacent to the color filter system, wherein at least one of the two or more plenoptic subpixel sensors may be adjacent to the oxide layer, and wherein the oxide layer may be between the color filter system and at least one of the two or more plenoptic subpixel sensors.

[0223] For some embodiments of the example apparatus, a base of a pyramid portion of the pyramidal structure may be adjacent to the oxide layer.

[0224] For some embodiments of the example apparatus, a size of a base of a pyramid portion of the pyramidal structure may be adjusted based on a curvature of the microlens.

[0225] For some embodiments of the example apparatus, the pyramidal structure may be symmetrical along a first plane cutting through a center of the pyramidal structure, the pyramidal structure may be symmetrical along a second plane cutting through the center of the pyramidal structure, and the first plane may be perpendicular to the second plane.
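The symmetry condition of paragraph [0225] can be illustrated with a minimal geometric sketch. The dimensions below (base width, tip width, height) are hypothetical placeholders, not values from this application; the sketch only shows that a square-base frustum centered on the subpixel grid is mirror-symmetric across two perpendicular planes through its center.

```python
# Minimal sketch (hypothetical dimensions): a square-base frustum centered
# on the z axis, checked for mirror symmetry across the two perpendicular
# planes x = 0 and y = 0, as in paragraph [0225].

def frustum_corners(base_w, tip_w, height):
    """Return the 8 corners of a square frustum centered on the z axis."""
    corners = []
    for w, z in ((base_w, 0.0), (tip_w, height)):
        h = w / 2.0
        corners += [(-h, -h, z), (h, -h, z), (-h, h, z), (h, h, z)]
    return corners

def symmetric_across(corners, axis):
    """True if the corner set is invariant under mirroring axis -> -axis."""
    mirrored = {tuple(-c if i == axis else c for i, c in enumerate(p))
                for p in corners}
    return mirrored == set(corners)

corners = frustum_corners(base_w=1.0, tip_w=0.4, height=0.5)
print(symmetric_across(corners, 0))  # mirror across the plane x = 0 -> True
print(symmetric_across(corners, 1))  # mirror across the plane y = 0 -> True
```

The same check applies to the other base and tip shapes enumerated above (disk tip, polygonal base), since each is invariant under the two mirrorings when centered on the grid.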

[0226] For some embodiments of the example apparatus, the color filter system may be adjacent to the microlens.

[0227] For some embodiments of the example apparatus, two or more plenoptic subpixel sensors may be adjacent to the color filter system.

[0228] For some embodiments of the example apparatus, the color filter system may include a filter pattern of two or more subpixels.

[0229] For some embodiments of the example apparatus, the pyramidal structure may include silicon dioxide (SiO2).

[0230] For some embodiments of the example apparatus, the pyramidal structure may include silicon oxide.

[0231] For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two pixels.

[0232] For some embodiments of the example apparatus, the pyramidal structure may be configured to reduce optical crosstalk between at least two microlenses.

[0233] Note that various hardware elements of one or more of the described embodiments are referred to as “modules” that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as those commonly referred to as RAM, ROM, etc.

[0234] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.