

Title:
AUTOMATIC OBFUSCATION ENGINE FOR COMPUTER-GENERATED DIGITAL IMAGES
Document Type and Number:
WIPO Patent Application WO/2019/126389
Kind Code:
A1
Abstract:
A method and apparatus are disclosed for identifying an object with specific characteristics and automatically obfuscating part or all of a digital image corresponding to that object. The obfuscation comprises pixelation, color alteration, and/or contrast alteration. The obfuscation optionally can be performed only when the digital image is being viewed by certain client devices.

Inventors:
GARCIA LLORENC (US)
TUCKER MORGAN (US)
Application Number:
PCT/US2018/066603
Publication Date:
June 27, 2019
Filing Date:
December 19, 2018
Assignee:
IMVU INC (US)
International Classes:
G06T11/60; G06T3/40; G06T11/00; G06V10/40; H04L29/06
Foreign References:
US20130152014A12013-06-13
US20160125244A12016-05-05
US20130117861A12013-05-09
Attorney, Agent or Firm:
YAMASHITA, Brent (US)
Claims:
What Is Claimed Is:

1. A method of automatically generating an obfuscated image, comprising:

receiving characteristics data for an object comprising a first sub-object and a second sub-object;

determining, based on the characteristics data, that the first sub-object should be obfuscated;

receiving a first set of pixel data for the first sub-object, the first set of pixel data comprising data elements for each pixel in a first plurality of pixels; and

transforming the first set of pixel data into a second set of pixel data, the second set of pixel data comprising data elements for each pixel in a second plurality of pixels, wherein the second plurality of pixels is fewer in number than the first plurality of pixels and each of the data elements in the second set of pixel data is calculated from data elements for one or more of the pixels in the first plurality of pixels.


2. The method of claim 1, further comprising:

scaling the second set of pixel data to generate a third set of pixel data; and

rendering the third set of pixel data on a display of a computing device.

3. The method of claim 1, further comprising:

altering the color content of the second set of pixel data to generate a third set of pixel data;

scaling the third set of pixel data to generate a fourth set of pixel data; and

rendering the fourth set of pixel data on a display of a computing device.

4. The method of claim 1, further comprising:

altering the second set of pixel data to alter the contrast between pixels in the second set of pixel data to generate a third set of pixel data;

scaling the third set of pixel data to generate a fourth set of pixel data; and

rendering the fourth set of pixel data on a display of a computing device.

5. The method of claim 1, further comprising:

altering the color content of the second set of pixel data to generate a third set of pixel data;

altering the third set of pixel data to alter the contrast between pixels in the third set of pixel data to generate a fourth set of pixel data;

scaling the fourth set of pixel data to generate a fifth set of pixel data; and

rendering the fifth set of pixel data on a display of a computing device.

6. The method of claim 1, further comprising:

rendering the first set of pixel data on a display of a first computing device; and

rendering the second set of pixel data on a display of a second computing device.

7. The method of claim 2, further comprising:

rendering the first set of pixel data on a display of another computing device.

8. The method of claim 3, further comprising:

rendering the first set of pixel data on a display of another computing device.

9. The method of claim 4, further comprising:

rendering the first set of pixel data on a display of another computing device.

10. The method of claim 5, further comprising:

rendering the first set of pixel data on a display of another computing device.

11. A computing device comprising a processing unit configured by a program of instructions to:

identify an object to be obfuscated;

receive a first set of pixel data for the object, the first set of pixel data comprising data elements for each pixel in a first plurality of pixels; and

transform the first set of pixel data into a second set of pixel data, the second set of pixel data comprising data elements for each pixel in a second plurality of pixels, wherein the second plurality of pixels is fewer in number than the first plurality of pixels and each of the data elements in the second set of pixel data is calculated from data elements for one or more of the pixels in the first plurality of pixels.

12. The computing device of claim 11, wherein the processing unit is further configured by the program of instructions to:

scale the second set of pixel data to generate a third set of pixel data; and

render the third set of pixel data on a display of a computing device.

13. The computing device of claim 11, wherein the processing unit is further configured by the program of instructions to:

alter the color content of the second set of pixel data to generate a third set of pixel data;

scale the third set of pixel data to generate a fourth set of pixel data; and

render the fourth set of pixel data on a display of a computing device.

14. The computing device of claim 11, wherein the processing unit is further configured by the program of instructions to:

alter the second set of pixel data to alter the contrast between pixels in the second set of pixel data to generate a third set of pixel data;

scale the third set of pixel data to generate a fourth set of pixel data; and

render the fourth set of pixel data on a display of a computing device.

15. The computing device of claim 11, wherein the processing unit is further configured by the program of instructions to:

alter the color content of the second set of pixel data to generate a third set of pixel data;

alter the third set of pixel data to alter the contrast between pixels in the third set of pixel data to generate a fourth set of pixel data;

scale the fourth set of pixel data to generate a fifth set of pixel data; and

render the fifth set of pixel data on a display of a computing device.

16. The computing device of claim 11, wherein the processing unit is further configured by the program of instructions to:

render the first set of pixel data on a display of a first computing device; and

render the second set of pixel data on a display of a second computing device.

17. The computing device of claim 12, wherein the processing unit is further configured by the program of instructions to:

render the first set of pixel data on a display of another computing device.

18. The computing device of claim 13, wherein the processing unit is further configured by the program of instructions to:

render the first set of pixel data on a display of another computing device.

19. The computing device of claim 14, wherein the processing unit is further configured by the program of instructions to:

render the first set of pixel data on a display of another computing device.

20. The computing device of claim 15, wherein the processing unit is further configured by the program of instructions to:

render the first set of pixel data on a display of another computing device.

Description:
AUTOMATIC OBFUSCATION ENGINE FOR

COMPUTER-GENERATED DIGITAL IMAGES

RELATED APPLICATION

[0001] This application claims priority to U.S. Patent Application No. 15/853,405, entitled "AUTOMATIC OBFUSCATION ENGINE FOR COMPUTER-GENERATED DIGITAL IMAGES," filed on December 22, 2017.

TECHNICAL FIELD

[0002] A method and apparatus are disclosed for identifying an object with specific characteristics and automatically obfuscating part or all of a digital image corresponding to that object. The obfuscation comprises pixelation, color alteration, and/or contrast alteration. The obfuscation optionally can be performed only when the digital image is being viewed by certain client devices.

BACKGROUND OF THE INVENTION

[0003] As computing technology continually improves, the ability to quickly generate and render digital images on a display is becoming more and more sophisticated. Computer-generated images have become extremely realistic and often comprise layers of different details.

[0004] At the same time, realistic images are not always desirable for all viewers. For example, if a minor is operating a client device, the content provider may not want that minor to be able to see images containing adult content, such as images containing nudity, violence, or disturbing depictions. Numerous other reasons exist for wanting to shield certain users from certain content. For example, there may be privacy or intellectual property concerns with certain images, or the content provider may wish for only certain individuals, and not the general public, to be able to see the images.

[0005] To date, content has been shielded from viewers through access controls, for example, by preventing certain users from accessing certain content altogether, such as by denying access to a video file. This is an overly restrictive approach, as it prevents users from seeing the entire content even though the objectionable portion may be only a small portion of the overall content in terms of pixels or time displayed on the screen.

[0006] What is needed is a mechanism for automatically identifying an object for which obfuscation is desired, identifying the specific structure that should be obfuscated, and then obfuscating the structure prior to display on a screen. What is further needed is a mechanism for achieving this result in a way that does not detract from the viewing of the overall image containing the specific structure. What is further needed is the ability to perform such obfuscation only for certain client computing devices and not others.

SUMMARY OF THE INVENTION

[0007] A method and apparatus are disclosed for identifying an object with specific characteristics and automatically obfuscating part or all of a digital image corresponding to that object. The obfuscation comprises pixelation, color alteration, and/or contrast alteration. The obfuscation optionally can be performed only when the digital image is being viewed by certain client devices.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] Figure 1 depicts hardware components of a client device.

[0009] Figure 2 depicts software components of the client device.

[0010] Figure 3 depicts a plurality of client devices in communication with a server.

[0011] Figure 4 depicts an obfuscation engine.

[0012] Figure 5 depicts an object identification engine for identifying objects for which an associated image should be obfuscated.

[0013] Figure 6 depicts pixel data and an image for an exemplary object for which obfuscation is to be performed.

[0014] Figure 7 depicts a pixelation engine operating upon pixel data from an object.

[0015] Figure 8 depicts a color engine operating upon pixel data from an object.

[0016] Figure 9 depicts a contrast engine operating upon pixel data from an object.

[0017] Figure 10 depicts a pixelation engine, color engine, and contrast engine operating upon pixel data from an object.

[0018] Figure 11 depicts the display of an image and an altered image derived from the same object, where the image is displayed on one client device and the altered image is concurrently displayed on another client device.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0019] Figure 1 depicts hardware components of client device 100. These hardware components are known in the prior art. Client device 100 is a computing device that comprises processing unit 110, memory 120, non-volatile storage 130, positioning unit 140, network interface 150, image capture unit 160, graphics processing unit 170, and display 180. Client device 100 can be a smartphone, notebook computer, tablet, desktop computer, gaming unit, wearable computing device such as a watch or glasses, or any other computing device.

[0020] Processing unit 110 optionally comprises a microprocessor with one or more processing cores. Memory 120 optionally comprises DRAM or SRAM volatile memory. Non-volatile storage 130 optionally comprises a hard disk drive or flash memory array. Positioning unit 140 optionally comprises a GPS unit or GNSS unit that communicates with GPS or GNSS satellites to determine latitude and longitude coordinates for client device 100, usually output as latitude data and longitude data. Network interface 150 optionally comprises a wired interface (e.g., Ethernet interface) or wireless interface (e.g., 3G, 4G, GSM, 802.11, the protocol known by the trademark "Bluetooth," etc.). Image capture unit 160 optionally comprises one or more standard cameras (as are currently found on most smartphones and notebook computers). Graphics processing unit 170 optionally comprises a controller or processor for generating graphics for display. Display 180 displays the graphics generated by graphics processing unit 170, and optionally comprises a monitor, touchscreen, or other type of display.

[0021] Figure 2 depicts software components of client device 100. Client device 100 comprises operating system 210 (such as the operating systems known by the trademarks "Windows," "Linux," "Android," "iOS," or others) and client application 220. Client application 220 comprises lines of software code executed by processing unit 110 and/or graphics processing unit 170 to perform the functions described below. For example, client device 100 can be a smartphone sold with the trademark "Galaxy" by Samsung or "iPhone" by Apple, and client application 220 can be a downloadable app installed on the smartphone or a browser running code obtained from server 300 (described below). Client device 100 also can be a notebook computer, desktop computer, game system, or other computing device, and client application 220 can be a software application running on client device 100 or a browser on client device 100 running code obtained from server 300. Client application 220 forms an important component of the inventive aspect of the embodiments described herein, and client application 220 is not known in the prior art.

[0022] With reference to Figure 3, three instantiations of client device 100 are shown: client devices 100a, 100b, and 100c. These are exemplary devices, and it is to be understood that any number of different instantiations of client device 100 can be used.

[0023] Client devices 100a, 100b, and 100c each communicate with server 300 using network interface 150. Server 300 runs server application 320. Server application 320 comprises lines of software code that are designed specifically to interact with client application 220.

[0024] Figure 4 depicts engines contained within client application 220, within server application 320, or split between client application 220 and server application 320. One of ordinary skill in the art will understand and appreciate that the functions described below can be distributed between server application 320 and client application 220.

[0025] Client application 220 and/or server application 320 comprise obfuscation engine 400, scaler 440, and object identification engine 450. Obfuscation engine 400 comprises pixelation engine 410, color engine 420, and/or contrast engine 430. Obfuscation engine 400, pixelation engine 410, color engine 420, contrast engine 430, scaler 440, and object identification engine 450 each comprise lines of software code executed by processing unit 110 and/or graphics processing unit 170, and/or additional integrated circuitry, to perform certain functions. For example, scaler 440 might comprise software executed by processing unit 110 and/or graphics processing unit 170 and/or might comprise hardware scaling circuitry comprising integrated circuits.

[0026] Obfuscation engine 400 receives an input, typically comprising pixel data, and performs an obfuscation function using one or more of pixelation engine 410, color engine 420, contrast engine 430, and/or other engines on the input to generate an output, where the output can then be used to generate an image that is partially or wholly obfuscated.

[0027] Pixelation engine 410 performs an obfuscation function by receiving input pixel data and pixelating the received input pixel data to generate output pixel data, where the output pixel data generally contains fewer pixels than the input pixel data and each individual pixel in the output pixel data is based on one or more pixels in the input pixel data.

[0028] Color engine 420 performs an obfuscation function by receiving input pixel data and altering the color of one or more pixels in the input pixel data to generate output pixel data.

[0029] Contrast engine 430 performs an obfuscation function by receiving input pixel data and altering the contrast between two or more pixels in the input pixel data to generate output pixel data.

[0030] Scaler 440 performs a scaling function by receiving input pixel data and scaling the input pixel data to generate output pixel data. Scaler 440 can be used, for example, if the input pixel data is arranged in a different size configuration (e.g., y rows of x pixels per row) than the size configuration of display 180 of client device 100 on which the image is to be displayed (e.g., c rows of d pixels per row).
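
In a shader-based embodiment, the scaling function can be illustrated with a minimal fragment-shader sketch in the same style as the source code examples later in this description; the names tMap and vUv are illustrative assumptions rather than identifiers taken from the patent.

precision mediump float;

uniform sampler2D tMap; // input pixel data (y rows of x pixels) bound as a texture
varying vec2 vUv;       // normalized position within the c-by-d target area

void main() {
    // Sampling at the normalized coordinate resizes the image to the render
    // target; the texture's filter mode (nearest or linear) determines how
    // values between source pixels are interpolated.
    gl_FragColor = texture2D(tMap, vUv);
}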

[0031] Object identification engine 450 identifies one or more objects or sub-objects upon which obfuscation is to be performed.

[0032] With reference to Figure 5, object identification engine 450 analyzes object 500 and provides an input to obfuscation engine 400. Object 500 optionally comprises data structure 510 and is associated with pixel data 520 and image 530. Data structure 510 comprises sub-objects 501 and 504 and characteristics 506 and 507. Sub-object 501 comprises characteristics 502 and 503, and sub-object 504 comprises characteristic 505. Pixel data 520 optionally corresponds to object 500 at a specific moment in time, and image 530 is the image that would be generated based on pixel data 520 if no alteration occurred.

[0033] An example of object 500 might be a character in a video game or virtual world, and examples of sub-objects 501 and 504 might be a shirt and pants that the character wears. Another example of object 500 might be a digital photograph, and examples of sub-objects 501 and 504 might be a face and body. Another example of object 500 might be landscape imagery, and examples of sub-objects 501 and 504 might be sunlight and a mountain. One of ordinary skill in the art will appreciate that these examples are not limiting, and object 500 can be any number of possible objects.

[0034] Optionally, one or more of characteristics 502, 503, 505, 506, and 507 can be a characteristic for which obfuscation is desired. For example, the characteristic might indicate that an item is secret or private (such as a person's face/identity, or financial information) or that the item is not appropriate for viewing by all audiences (such as an item with sexual content, violent content, etc.). In the example where object 500 is a character in a video game or virtual world and sub-object 501 is a shirt, characteristic 502 might be "adult only," "see-through," or "invisible." Object identification engine 450 examines all portions of object 500 and identifies sub-objects or objects for which obfuscation is desired, such as sub-object 501 (e.g., a see-through shirt). Once such items are identified, object identification engine 450 sends object 500, sub-object 501, or their associated pixel data to obfuscation engine 400.

[0035] In another embodiment, object identification engine 450 comprises image recognition engine 540, which will analyze pixel data 520 or image 530 and compare it to a set of known pixel data or images contained in database 550. If a match is found, then object identification engine 450 will identify object 500 or a relevant sub-object as an object to be obfuscated and send object 500, the relevant sub-object 501, or their associated pixel data to obfuscation engine 400. This embodiment is useful for identifying known images for which obfuscation is desired. For example, one might do this with images protected by a copyright or trademark for which no license has been obtained, or one might also do this with images known to be offensive.

[0036] With reference now to Figure 6, it is assumed that sub-object 501 is sent to obfuscation engine 400, along with pixel data 620, where pixel data 620 is the portion of pixel data 520 that corresponds to sub-object 501 (e.g., shirt). In this embodiment, pixel data 620 comprises an array of pixel data, the array comprising i columns and j rows of pixel data values p_column,row, where each pixel data value contains data that can be used to generate a pixel on display 180. For example, p_column,row can comprise 32 bits (8 bits for red, 8 bits for green, 8 bits for blue, and, optionally, 8 bits for alpha channel or transparency). One of ordinary skill in the art will appreciate that p_column,row can comprise other numbers of bits and that 32 bits is just one possible embodiment. It is to be further understood that pixel data 620 need not be in array form and could constitute any collection of pixel data values. Obfuscation engine 400 will act upon pixel data 620 using one or more of pixelation engine 410, color engine 420, and contrast engine 430. Image 630 is the image that would be displayed based on pixel data 620 absent any alteration.

[0037] In Figure 7, pixelation engine 410 is shown. Pixelation engine 410 receives pixel data 620 and pixelates the data to generate pixelated data 720. In this embodiment, pixelated data 720 comprises an array of pixel data, the array comprising m columns and n rows of pixel data values q_column,row, where each pixel data value contains data that can be used to generate a pixel on display 180. Typically, m < i and n < j. For instance, i and j might be 32 and 32, and m and n might be 16 and 16 or 8 and 8. That is, a 32x32 array of pixel data might be pixelated into an array of 16x16 or 8x8. In this example, q_column,row can comprise 32 bits (8 bits for red, 8 bits for green, 8 bits for blue, and, optionally, 8 bits for alpha channel or transparency). One of ordinary skill in the art will appreciate that q_column,row can comprise other numbers of bits and that 32 bits is just one possible embodiment.

[0038] There are numerous approaches for determining the value of each q_column,row. In one embodiment, q_column,row is a weighted average of all pixels in pixel data 620 that are within the same relative location within the array. For example, when pixelated data 720 is a 16x16 array, the second pixel in the top row can be considered to occupy a space equal to 1/16 of the width of the array by 1/16 of the height of the array, starting at a location that is 1/16 in from the left edge in the horizontal direction and at the top edge in the vertical direction. With that relative size and location in mind, one could then determine the same relative size and location in the 32x32 array represented by pixel data 620. Because pixel data 620 has a larger array size than pixelated data 720, each pixel q_column,row will correspond to some or all of more than one pixel p_column,row. Each q value can be calculated as a weighted average of those p values based on the portion of each p pixel that is covered by the q pixel.

[0039] Contained below is exemplary source code that can be used by pixelation engine 410 for performing the pixelation function. This code can be used to obtain samples at many positions within pixel data 620 on a given texture and to perform an average on those values to generate a pixel value. In this exemplary code, the variable "color" is q_column,row.

vec4 color = vec4(0.0); // accumulator for the sampled colors
vec2 origin = getSampleOrigin();
float sampleWidth = pixelWidth / float(widthSamples);
float sampleHeight = pixelHeight / float(heightSamples);
for (int i = 0; i < widthSamples; i++) {
    for (int j = 0; j < heightSamples; j++) {
        // Read the source texture at each sample position within the
        // region covered by this output pixel.
        vec2 coord = origin + vec2(sampleWidth * float(i), sampleHeight * float(j));
        color += texture2D(tMap, coord);
    }
}
// Average the accumulated samples to produce the pixelated value
// (alpha is averaged along with the color channels).
color /= float(widthSamples * heightSamples);

[0040] Because pixelated data 720 will not have the same array size as pixel data 620, the resulting pixelated image 730 will be smaller than image 630. However, the end result will be scaled by scaler 440 into the appropriate size for display 180, resulting in scaled, pixelated image 735.

[0041] In Figure 8, color engine 420 is shown. Color engine 420 receives pixel data 620 and alters the color of one or more pixels in pixel data 620 to generate color-altered pixel data 820. Here, the array sizes of pixel data 620 and color-altered pixel data 820 are the same (i.e., i columns x j rows). However, color engine 420 applies a filter to each pixel data value p_column,row to generate a color-altered pixel data value r_column,row. In this example, r_column,row can comprise 32 bits (8 bits for red, 8 bits for green, 8 bits for blue, and, optionally, 8 bits for alpha channel or transparency). One of ordinary skill in the art will appreciate that r_column,row can comprise other numbers of bits and that 32 bits is just one possible embodiment.

[0042] Any number of different filters can be applied. For example, a grayscale filter can be applied to translate each pixel data value p_column,row into a grayscale value, such that the resulting color-altered image 830 is a grayscale image. As another example, a bright color filter can be applied to translate each pixel data value p_column,row into a bright color selected from a specific set of bright colors (e.g., fuchsia, bright green, etc.). As another example, a sepia filter can be applied to translate each pixel data value p_column,row into a sepia-colored value.
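
For illustration, a grayscale filter can be sketched in the same style as the sepia example below; the luminance weights are the standard ITU-R BT.601 coefficients and are an assumption, not values taken from the patent text.

vec4 color;
vec4 grayColor;
// Weighted sum of the channels approximating perceived brightness.
float gray = (color.r * 0.299) + (color.g * 0.587) + (color.b * 0.114);
grayColor.r = gray;
grayColor.g = gray;
grayColor.b = gray;
grayColor.a = color.a; // transparency passes through unchanged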

[0043] Contained below is exemplary source code that can be used by color engine 420 for performing the color alteration function to generate a sepia-colored value. This code will transform the given color into a sepia tone color. Here, sepiaColor.r is the "r" value, sepiaColor.g is the "g" value, sepiaColor.b is the "b" value, and sepiaColor.a is the "a" value of r_column,row.

vec4 color;
vec4 sepiaColor;
// Apply the sepia weighting to each color channel of the input pixel.
sepiaColor.r = (color.r * 0.393) + (color.g * 0.769) + (color.b * 0.189);
sepiaColor.g = (color.r * 0.349) + (color.g * 0.686) + (color.b * 0.168);
sepiaColor.b = (color.r * 0.272) + (color.g * 0.534) + (color.b * 0.131);
sepiaColor.a = color.a; // transparency passes through unchanged

[0044] In Figure 9, contrast engine 430 is shown. Contrast engine 430 receives pixel data 620 and alters the contrast between pixels to generate contrast-altered pixel data 920. Here, the array sizes of pixel data 620 and contrast-altered pixel data 920 are the same (i.e., i columns x j rows). However, contrast engine 430 applies a filter to each pixel data value p_column,row to generate a contrast-altered pixel data value s_column,row. In this example, s_column,row can comprise 32 bits (8 bits for red, 8 bits for green, 8 bits for blue, and, optionally, 8 bits for alpha channel or transparency). One of ordinary skill in the art will appreciate that s_column,row can comprise other numbers of bits and that 32 bits is just one possible embodiment.

[0045] Any number of different contrast filters can be applied. For example, a filter can be applied to increase the contrast between pixels, or a filter can be applied to decrease the contrast between pixels. The latter is typically more useful in obfuscating images for the human eye.

[0046] Contained below is exemplary source code that can be used by contrast engine 430 for performing the contrast alteration function to alter the contrast between pixels. In this example, the code decreases the contrast of the given color by making an interpolation towards white, controlled by contrastFactor. Here, the variable color.rgb is s_column,row.

vec4 color;
// Interpolate each channel toward white (vec3(1.0)); a contrastFactor of
// 0.0 leaves the color unchanged and 1.0 yields pure white.
color.rgb = mix(color.rgb, vec3(1.0), contrastFactor);
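
For completeness, an increase in contrast can be sketched in the same style by interpolating away from mid-gray; the variable contrastGain is a hypothetical parameter (values above 1.0 increase contrast) that does not appear in the patent text.

vec4 color;
// Extrapolating away from mid-gray (vec3(0.5)) with a factor above 1.0
// pushes dark pixels darker and bright pixels brighter; the result may
// need clamping to the [0.0, 1.0] range.
color.rgb = mix(vec3(0.5), color.rgb, contrastGain);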

[0047] It is to be understood that pixelation engine 410, color engine 420, and contrast engine 430 can be applied in varying combinations and in different orders. For example, only one of them might be applied, or two or three of them can be applied, and the order in which they are applied can vary. Obfuscation engine 400 optionally will allow the administrator of server application 320 to select which engine to apply in a given situation.

[0048] In Figure 10, an example is shown in which pixelation engine 410, color engine 420, and contrast engine 430 are all applied. Here, pixelation engine 410 receives pixel data 620. Its output 621 is then provided to color engine 420, and the output 622 of color engine 420 is provided as an input to contrast engine 430. The end result is pixelated, color-altered, contrast-altered pixel data 1020, comprising an array of pixel data, the array comprising m columns and n rows of pixel data values t_column,row, where each pixel data value contains data that can be used to generate a pixel on display 180. In this example, t_column,row can comprise 32 bits (8 bits for red, 8 bits for green, 8 bits for blue, and, optionally, 8 bits for alpha channel or transparency). One of ordinary skill in the art will appreciate that t_column,row can comprise other numbers of bits and that 32 bits is just one possible embodiment. Scaler 440 ultimately will be used to scale the image to the ideal size for display 180, here shown as scaled, pixelated, color-altered, contrast-altered image 1035.
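
To make the chaining concrete, below is a minimal single-pass fragment-shader sketch that approximates the three stages in sequence. The uniforms blockCount and contrastFactor, the texture name tMap, and the varying vUv are illustrative assumptions; the embodiment described above runs the engines as separate stages rather than as one shader, and it averages source pixels rather than sampling one per block.

precision mediump float;

uniform sampler2D tMap;       // source pixel data 620 bound as a texture
uniform float blockCount;     // e.g., 16.0 for a 16x16 pixelation grid
uniform float contrastFactor; // 0.0 = unchanged, 1.0 = fully white
varying vec2 vUv;

void main() {
    // Pixelation: snap the sample coordinate to the center of its block so
    // that every fragment within a block reads the same source texel.
    vec2 blockUv = (floor(vUv * blockCount) + 0.5) / blockCount;
    vec4 color = texture2D(tMap, blockUv);

    // Color alteration: the sepia weights from paragraph [0043].
    vec3 sepia;
    sepia.r = (color.r * 0.393) + (color.g * 0.769) + (color.b * 0.189);
    sepia.g = (color.r * 0.349) + (color.g * 0.686) + (color.b * 0.168);
    sepia.b = (color.r * 0.272) + (color.g * 0.534) + (color.b * 0.131);

    // Contrast alteration: interpolate toward white, as in paragraph [0046].
    gl_FragColor = vec4(mix(sepia, vec3(1.0), contrastFactor), color.a);
}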

[0049] The value of the invention can be seen in comparing scaled, pixelated, color-altered, contrast-altered image 1035 to image 630 in Figure 10. The obfuscation is readily apparent, and its value can be appreciated by those of ordinary skill in the art as well as anyone who has ever desired to shield minors or other users from certain content.

[0050] In Figure 11, obfuscation engine 400 can be utilized only for certain client devices 100.

In this example, client device 100a is operated by an adult and client device 100b is operated by a minor. This information is known by server 300, for example, based on the user profiles of the users operating client devices 100a and 100b. As a result, object identification engine 450 determines that obfuscation of sub-object 501 is desired for client device 100b but not for client device 100a. Thereafter, client device 100a renders image 630, which is an unaltered image generated for object 500, but client device 100b renders scaled, pixelated, color-altered, contrast-altered image 1035.

[0051] In the example where object 500 is a character and sub-object 501 is a see-through shirt, the character would appear on client device 100a in a see-through shirt, but the character would appear on client device 100b in an obfuscated shirt.

[0052] References to the present invention herein are not intended to limit the scope of any claim or claim term, but instead merely make reference to one or more features that may be covered by one or more of the claims. Materials, processes and numerical examples described above are exemplary only, and should not be deemed to limit the claims. It should be noted that, as used herein, the terms "over" and "on" both inclusively include "directly on" (no intermediate materials, elements or space disposed there between) and "indirectly on" (intermediate materials, elements or space disposed there between). Likewise, the term "adjacent" includes "directly adjacent" (no intermediate materials, elements or space disposed there between) and "indirectly adjacent" (intermediate materials, elements or space disposed there between). For example, forming an element "over a substrate" can include forming the element directly on the substrate with no intermediate materials/elements there between, as well as forming the element indirectly on the substrate with one or more intermediate materials/elements there between.