


Title:
SYSTEM FOR CONTROLLING PIXEL ARRAY SENSOR WITH INDEPENDENTLY CONTROLLED SUB PIXELS
Document Type and Number:
WIPO Patent Application WO/2015/092794
Kind Code:
A1
Abstract:
A system for controlling a pixel array sensor with independently controlled sub pixels is provided herein. The system includes at least one image detector, comprising an array of photo-sensitive pixels, each photo-sensitive pixel comprising at least one first type photo-sensitive sub pixel and a plurality of second type photo-sensitive sub pixels; and a processor configured to control the at least one first type photo-sensitive sub pixel and the plurality of second type photo-sensitive sub pixels according to a specified exposure scheme, wherein the processor is further configured to control the at least one first type sub pixel independently of the specified exposure scheme, and wherein the processor is further configured to selectively combine data coming from the at least one first type sub pixel with data coming from at least one of the plurality of second type sub pixels.

Inventors:
GRAUER YOAV (IL)
Application Number:
PCT/IL2014/051106
Publication Date:
June 25, 2015
Filing Date:
December 17, 2014
Assignee:
BRIGHTWAY VISION LTD (IL)
International Classes:
H04N5/335; H01L27/146; H04N5/359; H04N9/083
Domestic Patent References:
WO2013157001A12013-10-24
Foreign References:
US20130208154A12013-08-15
US20110279721A12011-11-17
US20030193589A12003-10-16
US8446470B22013-05-21
DE19833207A12000-02-17
US20110260059A12011-10-27
Other References:
See also references of EP 3085076A4
Attorney, Agent or Firm:
TAL, Ophir et al. (P.O.B. 12704, 49 Herzliya, IL)
Claims:

CLAIMS

1. A system comprising:

at least one image detector, comprising an array of photo-sensitive pixels, each photo-sensitive pixel comprising at least one first type photo-sensitive sub pixel and at least one second type photo-sensitive sub pixel; and

a processor configured to control the at least one first type photo-sensitive sub pixel and the at least one second type photo-sensitive sub pixel according to a specified exposure scheme,

wherein the processor is further configured to control the at least one first type sub pixel independently of the specified exposure scheme, and

wherein the processor is further configured to selectively combine data coming from the at least one first type sub pixel with data coming from the at least one second type sub pixel.

2. The system according to claim 1, wherein the processor is further configured to control at least one of: exposure, and readout, of the at least one first type sub pixel, based on data coming from the at least one second type sub pixel.

3. The system according to claim 1, wherein the photo-sensitive sub pixels have an anti-blooming capability so that a saturated sub pixel does not affect adjacent sub pixels.

4. The system according to claim 3, wherein the anti-blooming capability exhibits a ratio that is greater than 1:1000.

5. The system according to claim 1, wherein the system is mounted on a moving platform.

6. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel is infra-red (IR) sensitive.

7. The system according to claim 6, wherein the at least one first type photo-sensitive sub pixel has a full width half maximum transmission of up to five percent of the center wavelength.

8. The system according to claim 7, wherein the at least one first type photo-sensitive sub pixel has an off band rejection of less than ten percent.

9. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel is sensitive to the visible spectrum.

10. The system according to claim 9, wherein the at least one first type photo-sensitive sub pixel has a full width half maximum transmission of up to ten percent of the center wavelength.

11. The system according to claim 10, wherein the at least one first type photo-sensitive sub pixel has an off band rejection of less than ten percent.

12. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel has a larger area than the at least one second type photo-sensitive sub pixel.

13. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel has a smaller area than the area of the at least one second type photo-sensitive sub pixel.

14. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel comprises a readout channel which is separate from the readout channel of the at least one second type photo-sensitive sub pixel.

15. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel is coupled to an amplifier configured to amplify a signal coming from the at least one first type photo-sensitive sub pixel independently of the signal coming from the at least one second type photo-sensitive sub pixel.

16. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel is synchronized with an external light source.

17. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel exposure scheme is synchronized with an external light source pulsing scheme.

18. The system according to claim 1, wherein the at least one first type photo-sensitive sub pixel exposure scheme is synchronized with an external light source modulation scheme.

19. The system according to claim 1, further comprising an external light source, and wherein the at least one first type sub pixel is coupled to a filter having a spectral range similar to a spectral range of the external light source.

20. The system according to claim 1, wherein the data selectively combined by the processor includes data captured at different exposure times.

21. The system according to claim 1, wherein the selectively combined data coming from the at least one first type sub pixel with data coming from at least one of the plurality of second type sub pixels is usable for Advanced Driver Assistance Systems (ADAS) functions.

Description:
SYSTEM FOR CONTROLLING PIXEL ARRAY SENSOR WITH INDEPENDENTLY CONTROLLED SUB PIXELS

BACKGROUND

1. TECHNICAL FIELD

[0001] The disclosed technique relates to imaging systems in general, and to a method for an object detection and classification system in particular.

2. DISCUSSION OF RELATED ART

[0002] Traditional imaging sensors use a spectral pattern with various configurations such as the Bayer pattern, "RGBW" (red, green, blue, white), "RCCC" (red, clear, clear, clear), etc. These color or clear spectral filters pass a wide spectral region which masks the pure signal. The prior art has also described narrow spectral patterns on the imaging sensor pixels, such as Fabry-Perot filters. This approach may lack spectral information due to the narrow spectral band and may also lack immunity to backscattering in an active imaging approach.

[0003] Another prior art reference, US Patent No. 8,446,470, titled "Combined RGB and IR imaging sensor", describes an imaging system with a plurality of sub-arrays having different sensing colors and infrared radiation. This proposed imaging system has an inherent drawback in imaging wide dynamic range scenery of a single spectral radiation, such as that originating from an LED or a laser, where a saturated pixel may mask (due to signal leakage) a nearby non-saturated pixel. Another drawback may occur in imaging scenery containing pulsed or modulated spectral radiation, such as that originating from an LED or a laser, where the pixel exposure is not synchronized to this type of operation.

[0004] Before describing the method of the invention, the following definitions are put forward.

[0005] The term "Visible" as used herein is a part of the electro-magnetic optical spectrum with wavelengths between 400 and 700 nanometers.

[0006] The term "Infra-Red" (IR) as used herein is a part of the Infra-Red spectrum with wavelengths between 700 nanometers and 1 mm.

[0007] The term "Near Infra-Red" (NIR) as used herein is a part of the Infra-Red spectrum with wavelengths between 700 and 1400 nanometers.

[0008] The term "Short Wave Infra-Red" (SWIR) as used herein is a part of the Infra-Red spectrum with wavelengths between 1400 and 3000 nanometers.

[0009] The term "Field Of View" (FOV) as used herein is the angular extent of a given scene, delineated by the angle of a three dimensional cone that is imaged onto an image sensor of a camera, the camera being the vertex of the three dimensional cone. The FOV of a camera at particular distances is determined by the focal length of the lens and the active image sensor dimensions.
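As a brief illustrative sketch of the FOV relation stated above (the thin-lens formula and the numeric values are assumptions for illustration, not taken from the application):

```python
import math

def field_of_view_deg(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """Angular field of view (degrees) along one sensor dimension,
    using the thin-lens approximation FOV = 2*atan(d / (2*f))."""
    return math.degrees(2.0 * math.atan(sensor_dim_mm / (2.0 * focal_length_mm)))

# Assumed example values: a 6.4 mm wide active sensor area behind an 8 mm lens
# gives roughly a 43.6 degree horizontal FOV.
print(round(field_of_view_deg(6.4, 8.0), 1))
```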

[0010] The term "Field of Illumination" (FOI) as used herein is the angular extent of a given scene, delineated by the angle of a three dimensional cone that is illuminated from an illuminator (e.g. LED, LASER, flash lamp, ultrasound transducer, etc.), the illuminator being the vertex of the three dimensional cone. The FOI of an illuminator at particular distances is determined by the focal length of the lens and the illuminator illuminating surface dimensions.

[0011] The term "pixel" or "photo-sensing pixel" as used herein is defined as a photo-sensitive element used as part of an array of pixels in an image detector device.

[0012] The term "sub pixel" or "photo-sensing sub pixel" as used herein is defined as a photo-sensitive element used as part of an array of sub pixels in a photo-sensing pixel. Thus, an image detector has an array of photo-sensing pixels and each photo-sensing pixel includes an array of photo-sensing sub pixels. Specifically, each photo-sensing sub pixel may be sensitive to a different range of wavelengths. Each of the photo-sensing sub pixels is controlled in accordance with a second type exposure and/or readout scheme.

[0013] The term "second type exposure and/or readout scheme" of a photo-sensing sub pixel as used herein is defined as a single exposure (i.e. light accumulation) of the photo-sensitive element per single signal read.

[0014] The term "first type sub pixel" or "first type photo-sensing sub pixel" as used herein relates to a photo-sensing sub pixel which is controllable beyond the second type exposure scheme.

BRIEF SUMMARY

[0015] In accordance with the disclosed technique, there is thus provided an imaging sensor (detector) or camera having a photo-sensitive pixel array configuration that combines:

1. a mosaic spectral filter array of photo-sensing sub pixels with at least two different spectral sensitivity responses;

2. a photo-sensing sub pixel exposure control mechanism for at least one type of the photo-sensing sub pixels;

3. a high anti-blooming ratio between adjacent sub pixels; and

4. a data transfer mechanism between at least two types of photo-sensing sub pixels to improve signal accumulation and noise reduction of the imaging sensor (detector).

[0016] In one embodiment of the present invention, the photo-sensitive pixel configuration as described hereinabove includes at least one sub pixel, a "first type sub pixel" or "first type photo-sensing sub pixel", which relates to a photo-sensing sub pixel controllable beyond the second type exposure scheme.

[0017] In another embodiment of the present invention, exposure control mechanism (i.e. exposure scheme) for at least one first type sub pixel may provide a single exposure per sub pixel signal readout or multiple exposures per single sub pixel readout.

[0018] In another embodiment of the present invention, pixel signal readout may be a single channel or multiple readout channels.

[0019] In another embodiment of the present invention, at least one first type sub pixel may have a signal readout channel separate from the readout channel of the other sub pixels.

[0020] The imaging sensor (detector) or camera of the present invention is suitable for use in automotive camera products, such as for mono-vision based systems, providing driver assistance functionalities such as: adaptive headlamp control systems, lane departure warning (and/or lane keeping), traffic sign recognition, front collision warning, object detection (e.g. pedestrian, animal etc.), night vision and/or the like.

[0021] The imaging sensor (detector) or camera of the present invention is suitable for use in automotive camera products, such as for stereo-vision based systems, providing driver assistance functionalities such as those described hereinabove for mono-vision based systems, as well as 3D mapping information.

[0022] Therefore, the imaging sensor of the present invention can provide multi spectral imaging (for example both visible and IR imaging) capability with an adequate Signal to Noise (S/N) and/or adequate Signal to Background (S/B) ratio for each photo-sensing sub pixel array in a single sensor frame, without a halo (blooming) effect between adjacent sub pixels, and without external filters (such as spectral, polarization, intensity etc.). Such a sub pixel configuration of visible and IR pixels is applicable to various pixelated imaging array type sensing devices. The imaging sensor of the present invention is suitable for applications in maritime cameras, automotive cameras, security cameras, consumer digital cameras, mobile phone cameras, and industrial machine vision cameras, as well as other markets and/or applications.

[0023] These, additional, and/or other aspects and/or advantages of the present invention are: set forth in the detailed description which follows; possibly inferable from the detailed description; and/or learnable by practice of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0024] The present invention will be more readily understood from the detailed description of embodiments thereof made in conjunction with the accompanying drawings, of which:

Figure 1 is a schematic illustration of the operation of a mono vision system, constructed and operative in accordance with some embodiments of the present invention;

Figure 2A-Figure 2H are images taken with a SWIR active imaging system in accordance with some embodiments of the present invention;

Figures 3A-3C are images taken with a NIR active imaging system in accordance with some embodiments of the present invention;

Figures 4A-4D are schematic diagrams of a pixel and sub pixel array in accordance with some embodiments of the present invention;

Figure 5 is a schematic of a pixel and sub pixel array control in accordance with some embodiments of the present invention;

Figure 6 is a schematic of a pixel and sub pixel array in accordance with some embodiments of the present invention;

Figure 7 is a schematic of a sensing structure with pixels in accordance with some embodiments of the present invention;

Figure 8 is a schematic of an ADAS configuration in accordance with some embodiments of the present invention;

Figure 9 is a schematic of a sensing structure with a pixel array in accordance with some embodiments of the present invention; and

Figure 10 is a schematic illustration of the operation of a stereo vision system, constructed and operative in accordance with some embodiments of the present invention.

DETAILED DESCRIPTION

[0025] Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.

[0026] In accordance with the present invention, the disclosed technique provides methods and systems for accumulating a signal by a controllable spectral sensing element.

[0027] Figure 1 is a schematic illustration of the operation of a mono vision system 10, constructed and operative in accordance with some embodiments of the present invention. System 10 may include at least a single illuminator 14 in the non-visible spectrum (e.g. NIR or SWIR, using an LED and/or laser source) in order to illuminate, for example, the environment. Furthermore, system 10 may also include at least a single mosaic spectral imaging camera 15. For automotive applications, imaging camera 15 may be located inside the vehicle, forward facing, behind the mirror in the area cleaned by the windshield wipers. Mosaic spectral imaging camera 15 may be an intensified-CCD, an intensified-CMOS (where the CCD/CMOS is coupled to an image intensifier), an electron multiplying CCD, an electron bombarded CMOS, a hybrid FPA (CCD or CMOS where the camera has two main components: Read-Out Integrated Circuits and an imaging substrate), an avalanche photo-diode FPA etc. Preferably, imaging camera 15 is a Complementary Metal Oxide Semiconductor (CMOS) Imager Sensor (CIS). System 10 may further include a system control 11 interfacing with the user via output 17. Imaging optical module 16 is adapted to operate with and detect electromagnetic wavelengths, at least those provided by illuminator 14, and may also detect electromagnetic wavelengths of the visible spectrum and of the IR spectrum. Imaging optical module 16 is further adapted for focusing incoming light onto the light sensitive area of mosaic spectral imaging camera 15. Imaging optical module 16 may be adapted for filtering certain wavelength spectrums, as may be performed by a band pass filter, and/or adapted to filter various light polarizations. Imaging optical module 16 is adapted to operate with and detect electromagnetic wavelengths similar to those detected by mosaic spectral imaging camera 15.

[0028] System 10 may include at least a single illuminator 14 in the non-visible spectrum (i.e. the NIR, SWIR or NIR/SWIR spectrum) providing a Field Of Illumination (FOI) covering a certain part of the mosaic spectral imaging camera 15 FOV. Illuminator 14 may be a Continuous Wave (CW) light source or a pulsed light source. Illuminator 14 may provide a polarized spectrum of light and/or a diffusive light.

[0029] System 10 further includes a system control 11 which may provide the synchronization of the mono vision control 12 to the illuminator control 13. System control 11 may further provide real-time image processing (computer vision) such as driver assistance features (e.g. pedestrian detection, lane departure warning, traffic sign recognition, etc.) in the case of automotive usage. Mono vision control 12 manages mosaic spectral imaging camera 15 functions such as: image acquisition (i.e. readout), de-mosaicing and the imaging sensor exposure control/mechanism. Illuminator control 13 manages illuminator 14 functions such as: ON/OFF, the light source optical intensity level and pulse triggering for a pulsed light source configuration.
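A minimal timing sketch of the synchronization described above, assuming a pulsed illuminator; the data structure, field names and numeric values are illustrative assumptions and do not describe the actual implementation of system control 11:

```python
from dataclasses import dataclass

@dataclass
class PulseScheme:
    period_us: float       # time between illuminator pulses
    pulse_width_us: float  # illuminator ON time per pulse
    gate_delay_us: float   # delay from pulse start to sub pixel exposure start
    gate_width_us: float   # sub pixel exposure time per pulse

def exposure_windows(scheme: PulseScheme, n_pulses: int):
    """Return (start, end) exposure windows, in microseconds, for first type
    sub pixels gated in synchronization with the illuminator pulse train."""
    windows = []
    for k in range(n_pulses):
        t0 = k * scheme.period_us + scheme.gate_delay_us
        windows.append((t0, t0 + scheme.gate_width_us))
    return windows

# Assumed numbers: 10 us period, 1 us pulses, exposure gated 0.5 us after each pulse.
print(exposure_windows(PulseScheme(10.0, 1.0, 0.5, 1.0), 3))
```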

[0030] Prior to describing embodiments of the invention, Figure 2A-Figure 2H demonstrate some of the drawbacks of the prior art. These images were taken with a single pattern filter imaging array having a single exposure control.

[0031] Reference is now made to Figure 2A-Figure 2H, where images were taken at nighttime with a system comprising a Continuous Wave (CW) SWIR laser illumination (i.e. 1.5 μm) and an imaging sensor sensitive to the SWIR spectrum (i.e. 0.8-1.6 μm). This vehicular system imaging sensor FOV is wider than the SWIR FOI. The imaged scenery is typical of an interurban road. In Figure 2A the vehicular headlamps are not illuminating. The SWIR image is similar to a visible or NIR reflected image where: the road markings are noticeable, safety fences on the road margins are noticeable and other objects are easily understood. In Figure 2B the vehicular headlamps are not illuminating. This SWIR image demonstrates the effect that close-by objects (i.e. trees) have on active imaging (in this case SWIR active imaging). The outcome is a saturated SWIR image. In Figure 2C-Figure 2D the vehicular headlamps are illuminating (i.e. illuminating in the visible, NIR & SWIR spectrum). These SWIR images demonstrate the effect that close-by objects (i.e. the road) and the vehicle headlamp illumination have on active imaging (in this case SWIR active imaging). The outcome is saturated SWIR images. In Figure 2E the vehicular headlamps are illuminating (i.e. illuminating in the visible, NIR & SWIR spectrum). This SWIR image demonstrates the effect that close-by objects (i.e. the road) and the vehicle headlamp illumination have on active imaging (in this case SWIR active imaging). The outcome is a saturated SWIR image, but with noticeable road asphalt cracks that may be used to provide road surface data. In Figure 2F-Figure 2G the vehicular headlamps are not illuminating. These SWIR images demonstrate the ability to observe a pedestrian crossing the road at about 50 m and about 120 m respectively with active imaging (in this case SWIR active imaging). In Figure 2H the vehicular headlamps are illuminating (i.e. illuminating in the visible, NIR & SWIR spectrum). This SWIR image demonstrates that an oncoming vehicle with its headlights operating may saturate the imaging sensor.

[0032] Reference is now made to Figure 3A-Figure 3B, where images were taken at nighttime with a system comprising a CW NIR laser illumination (i.e. 0.8 μm) and an imaging sensor sensitive to the NIR spectrum (i.e. 0.81±0.05 μm due to a spectral filter in front of the sensor) with a High Dynamic Range (HDR) of about 120dB. This vehicular system imaging sensor FOV is wider than the NIR FOI. The imaged scenery is typical of an interurban road. In Figure 3A the vehicular headlamps are illuminating (i.e. illuminating in the visible & NIR spectrum). The NIR image is similar to a visible reflected image where: the road markings are noticeable, safety fences on the road margins are noticeable and other objects are easily understood. This NIR image demonstrates the ability to observe a pedestrian walking at about 40 m while an oncoming vehicle has its headlights operating. In this scenario a pedestrian walking further away (for example at the distance of the oncoming vehicle, about 100 m) will not be noticeable with this type of active imaging (in this case NIR active imaging) due to gain control, sensor sensitivity and dynamic range. In Figure 3B the vehicular headlamps are illuminating (i.e. illuminating in the visible & NIR spectrum). This NIR image demonstrates that an oncoming vehicle, with its high beam headlights operating, may saturate the imaging sensor.

[0033] Reference is now made to Figure 3C, where the image was taken at daytime with a system comprising an imaging sensor sensitive to the NIR spectrum (i.e. 0.81±0.05 μm due to a spectral filter in front of the sensor). This image scenery is typical of an urban scenario. The NIR image is similar to a visible reflected image where: the road markings are noticeable, traffic light signals are noticeable and other objects are easily understood. This NIR image lacks wide spectral data such as: the red spectrum (i.e. the stop signs on both sides of the intersection or vehicle tail lights) or the visible spectrum for some of the lane marking traffic signals using LEDs. This NIR image demonstrates that spectral data is required from the imaging sensor in order to achieve a higher understanding of the viewed scenery.

[0034] Figure 4A illustrates a mosaic spectral imaging sensor pixel 35 (a two by two sub pixel array that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. In such a pixelated array, the imaging sensor (detector) 15 includes individual optical filters that may transmit different spectra: F1 spectrum 30a, F2 spectrum 30c, F3 spectrum 30b and F4 spectrum 30d. Each transmitted spectrum (F1, F2, F3 and F4) may include at least one of the following types of spectral filtration:

1. Long Pass Filter (LPF).

2. Short Pass Filter (SPF).

3. Band Pass Filter (BPF) with a Center Wavelength (CWL) transmission, Full Width Half Maximum (FWHM) and peak transmission.

4. Polarization.

5. Optical density (intensity).

For example, this representation can define standard pixelated filters as indicated in the following table.

Sub pixel F1 - Example 1 (Standard Bayer filter): (R), Red information, high transmission in the red spectrum. Example 2 (Standard RCCC filter): (R), Red information, high transmission in the red spectrum.

Sub pixel F2 - Example 1 (Standard Bayer filter): (G), Green information, high transmission in the green spectrum. Example 2 (Standard RCCC filter): (C), Clear information, no color filtering.
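As a rough, non-authoritative sketch of how such a two by two filter pattern tiles the full array and how one sub pixel plane can be pulled out of a raw mosaic frame (the spatial placement of F1-F4 within the cell, the array size and the helper function are assumptions for illustration, not taken from the application):

```python
import numpy as np

# Assumed 2x2 sub pixel filter pattern repeated over the array (labels per Figure 4A).
PATTERN = [["F1", "F3"],
           ["F2", "F4"]]

def sub_pixel_plane(frame: np.ndarray, label: str) -> np.ndarray:
    """Extract the quarter-resolution plane of one sub pixel type from a raw mosaic frame."""
    for r in range(2):
        for c in range(2):
            if PATTERN[r][c] == label:
                return frame[r::2, c::2]
    raise ValueError(f"unknown sub pixel label: {label}")

raw = np.arange(16, dtype=np.uint16).reshape(4, 4)  # toy 4x4 raw frame
print(sub_pixel_plane(raw, "F1"))
```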

[0035] A signal output, Signal(e), expressed in electrons, of a prior art imaging 2D sensing element (i.e. sub pixel) without an internal gain and neglecting noise can be expressed by:

Signal(e) ≈ S_λ · P(λ) · d_width · d_length · t_exposure

S_λ is the sensing element response (responsivity) to a specific wavelength (i.e. S_λ = QE(λ) · FF(λ), where QE(λ) is the quantum efficiency and FF(λ) is the sub pixel fill factor), P(λ) is the optical power density at a specific wavelength, d_width · d_length is the photo-sensing active area of the sub pixel (e.g. pin diode, buried pin diode etc.) and t_exposure is the sub pixel exposure duration to the optical power density. Thus, introducing a Color Filter Array (CFA) and/or any type of spectral pattern (as illustrated in Figure 4A) on the imaging sensor array may result in an uneven signal (Signal(e)) from each sub pixel type and/or a "spill" of signal (causing blooming/saturation) between the different spectral pattern sub pixels.
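A minimal numeric sketch of the expression above; all parameter values are assumed for illustration, and treating P(λ) as a power density converted to a photon flux is an assumption, since the application does not fix its units:

```python
# Sketch of Signal(e) ≈ S_lambda * P(lambda) * d_width * d_length * t_exposure,
# with P(lambda) converted from an optical power density to a photon flux.
H = 6.626e-34   # Planck constant [J*s]
C = 3.0e8       # speed of light [m/s]

def signal_electrons(qe, fill_factor, power_density_w_m2, wavelength_m,
                     d_width_m, d_length_m, t_exposure_s):
    """Accumulated signal in electrons for one sub pixel (no gain, noise neglected)."""
    s_lambda = qe * fill_factor                                  # S_lambda = QE * FF
    photon_flux = power_density_w_m2 * wavelength_m / (H * C)    # photons / m^2 / s
    area = d_width_m * d_length_m                                # active area
    return s_lambda * photon_flux * area * t_exposure_s

# Assumed: QE=0.5, FF=0.8, 1 mW/m^2 at 850 nm, 3 um x 3 um sub pixel, 10 ms exposure.
print(f"{signal_electrons(0.5, 0.8, 1e-3, 850e-9, 3e-6, 3e-6, 10e-3):.0f} electrons")
```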

[0034] Figure 4B illustrates a mosaic spectral imaging sensor pixel 35 (a two by two sub pixel array that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. The pattern is a standard Bayer filter array. As one can see, the filter spectral responses (transmission curves) are overlapping and "open" (i.e. high transmission) in the visible/NIR. This spectral configuration may have degraded performance in day-time. The reason is sun irradiance, which has more photons in the NIR versus the visible spectrum, resulting in a lack of visible spectrum discrimination. A common solution is introducing a constant or movable spectral filter (e.g. LPF, BPF, polarizer etc.) which reduces/blocks the NIR spectrum.

[0035] Figure 4C illustrates a mosaic spectral imaging sensor pixel 35 (a two by two sub pixel array that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. The pattern is similar to a Bayer filter array where one of the pixels (F3) is processed/fabricated with a narrow NIR filter (i.e. FWHM of 10nm). This spectral configuration provides visible & NIR spectrum capability in a single image frame in two operating modes: a passive mode (i.e. without system illumination) and an active mode (i.e. with system NIR illumination). The illumination may be pulsed (synchronized) or CW. This configuration may lack the same visible spectrum discrimination as described hereinabove. In scenarios where the active mode is operated, some of the pixels (F1, F2 & F4) may not detect the NIR illumination due to signal clutter. Signal clutter may originate in the visible spectrum or the NIR spectrum, since these pixels have wide spectral filters (i.e. transmittance is high over a wide spectrum). This spectral configuration may fail to provide a full pixel array NIR resolution at night-time and day-time.

[0036] Figure 4D illustrates a mosaic spectral imaging sensor pixel 35 (a two by two sub pixel array that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. The pattern is different from a Bayer filter array: some of the pixels are processed/fabricated with a wide spectral filter (i.e. FWHM of 100nm with an optical density of at least two outside this spectral band) and all of the pixels are processed/fabricated with a narrow spectral filter (i.e. FWHM of 10nm with an optical density of at least two outside this spectral band). This spectral configuration provides visible & NIR spectrum capability in a single image frame in two operating modes: a passive mode (i.e. without system illumination) and an active mode (i.e. with system NIR illumination). The illumination may be pulsed (synchronized) or CW. The peak transmission of each response curve may be different. This configuration is ideal for providing visible spectrum information (which is not overlapping between the pixels/filters) and NIR information (which is overlapping between the pixels/filters), so as to provide a full pixel array NIR resolution at night-time and day-time (i.e. the pixel transmissions overlap in the NIR spectral band).

[0037] Figure 5 illustrates a mosaic spectral imaging sensor (detector) pixel 35 (a two by two sub pixel array that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. Each sub pixel pattern may have an exposure control capability (32a for 30a, 32b for 30b, 32c for 30c and 32d for 30d) to enable a uniform and controllable signal accumulation (Signal(e)) for sensor pixel 35.

[0038] Figure 6 illustrates a mosaic spectral imaging sensor (detector) pixel 35 (a two by two sub pixel array that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. At least a single sub pixel pattern (i.e. first type sub pixel) may have an exposure control capability (Figure 6 illustrates four first type sub pixels: 32a for 30a, 32b for 30b, 32c for 30c and 32d for 30d) to enable a uniform and controllable signal accumulation (Signal(e)) for sensor pixel 35. An exposure control mechanism 38 may be integrated in the imaging sensor or located externally to the imaging sensor. Each sub pixel pattern exposure control mechanism (i.e. exposure scheme) may operate separately, may operate with different timing, may operate with a single exposure duration per single sub pixel signal readout and may operate with multiple exposures per single sub pixel signal readout. Exposure control mechanism 38 (controlling 32a, 32b, 32c and 32d) may be a gate-able switch, a controllable transistor or any other means of exposing and accumulating a signal in the sub pixel. Each sub pixel pattern exposure control mechanism may be synchronized or unsynchronized to an external light source such as illustrated in Figure 1 (illuminator 14). Exposure control mechanism 38 provides multi-functionality in a single imaging sensor (detector). Furthermore, exposure control mechanism 38 allows the sub pixels to operate in a second type exposure and/or readout scheme, or allows at least a single first type sub pixel to operate with a different exposure scheme. In addition, an anti-blooming mechanism is integrated in each type of sub pixel. Thus, a saturated sub pixel will not affect adjacent sub pixels (i.e. a saturated sub pixel's accumulated signal will not "spill" to nearby sub pixels). An anti-blooming ratio of above 1,000 may be sufficient. The mosaic spectral imaging sensor pixel 35 may include, internally or externally, a data transfer mechanism 39. Data transfer mechanism 39 probes each type of photo-sensing sub pixel accumulated signal to improve signal accumulation in other types of photo-sensing sub pixels. This method may be executed in the same imaging sensor 15 frame and/or in the following imaging sensor 15 frames.

[0039] Figure 7 illustrates a section of mosaic spectral imaging sensor pixels 40 (a two by two sub-array pixel 35 that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. The sub-array exposure control mechanism 38 and data transfer mechanism 39 (as described hereinabove) are not illustrated for reasons of simplicity. The imaging sensor pixels 40 resolution format may be flexible (e.g. VGA, SXGA, HD, 2k by 2k etc.). Sub-array 35 may be distributed in a uniform pattern spread or a random pattern spread over spectral imaging sensor pixels 40. The mosaic spectral imaging sensor pattern 40 readout process may be executed by rows, by columns and/or by reading out similar sub pixel types. For example, all first type sub pixels (for example F1) may be read out by a separate readout channel versus the other sub pixels (F2, F3 and F4) that are read out by a different readout channel. This readout capability provides another layer of flexibility in the imaging sensor 15.
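A minimal sketch of the kind of selective combination data transfer mechanism 39 could perform, written here as an off-chip post-processing step; the saturation threshold, exposure ratio and array values are assumptions for illustration only:

```python
import numpy as np

def combine_sub_pixels(first_type: np.ndarray, second_type: np.ndarray,
                       exposure_ratio: float, saturation: int = 4095) -> np.ndarray:
    """Selectively combine first type sub pixel data (short, gated exposure)
    with second type sub pixel data (standard exposure scheme).
    Where a second type sub pixel saturates, fall back to the co-located first
    type sample scaled by the exposure ratio, extending dynamic range."""
    scaled_first = first_type.astype(np.float64) * exposure_ratio
    use_first = second_type >= saturation
    return np.where(use_first, scaled_first, second_type.astype(np.float64))

# Assumed 12-bit data: the second type saturates on a bright LED, the first type does not.
second = np.array([[120, 4095], [300, 4095]])
first = np.array([[10, 900], [25, 1800]])
print(combine_sub_pixels(first, second, exposure_ratio=8.0))
```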

[0040] In another embodiment, fusing frames of mosaic spectral imaging sensor pixels 40 (a two by two sub pixel array 35 that is repeated over the pixelated array of imaging sensor 15) provides yet another layer of information. A fused frame may provide data such as: types of moving objects in the imaging sensor FOV, trajectories of moving objects in the imaging sensor FOV, scenery conditions (for example, ambient light level) or any other spectral, time-variant data of the viewed scenery.

[0041] In another embodiment, in the case of a moving platform (i.e. imaging sensor pixels 40 are movable), fused frames may provide yet another layer of information. A fused frame may provide a full resolution image of the viewed FOV with at least a single spectral photo-sensing sub pixel.

[0042] Advanced Driver Assistance Systems (ADAS) imaging based applications may require spectral information as presented in Figure 8. Collision avoidance and mitigation includes all types of objects such as: pedestrians, cyclists, vehicles and/or any other type of object captured by the imaging system. Type A-Type C may define a specific ADAS configuration. System 10 may provide at least the above ADAS applications where mosaic spectral imaging sensor pixels 40 are incorporated in mosaic spectral imaging camera 15. For example, Type A and Type B may be based on a CMOS imager sensor whereas Type C may be based on an InGaAs imager sensor. For example, pixel 35 Type A and pixel 35 Type B may be as follows.

In addition, each type (Type A and/or Type B) may have a different exposure control mechanism (i.e. exposure scheme) and anti-blooming ratio as defined hereinabove.

[0043] For another example, sub-array 35 Type C may be at least one of the options as follows.


In addition, each type option (Type C option 1 and/or option 2) may have a different exposure control mechanism (i.e. exposure scheme) and anti-blooming ratio as defined hereinabove.

[0044] In another embodiment, system 10 may provide at least the above ADAS applications in addition to prediction of areas of interest, where mosaic spectral imaging sensor pixels 40 are incorporated in mosaic spectral imaging camera 15. Predicted areas of interest may include: objects in the viewed scenery (e.g. road signs, vehicles, traffic lights, curvature of the road etc.) and similar systems approaching system 10.

[0045] Figure 9 illustrates a mosaic spectral imaging sensor pixel 36 (a sub-array that is repeated over the pixelated array of imaging sensor 15) constructed and operative in accordance with some embodiments of the present invention. Mosaic spectral imaging sensor pixel 36 is similar to mosaic spectral imaging sensor pixel 35 in almost every aspect except: the sub pixel dimensions and the number of sub pixels per area. In such a pixelated array, the imaging sensor 15 includes individual optical filters that transmit: F2 spectrum 30c, F3 spectrum 30b, F5 spectrum 30g, F6 spectrum 30e, F7 spectrum 30h and F8 spectrum 30f. Each transmitted spectrum (F2 to F8) may include at least one of the following types of spectral filtration:

1. Long Pass Filter (LPF).

2. Short Pass Filter (SPF).

3. Band Pass Filter (BPF) with a Center Wavelength (CWL) transmission, Full Width Half Maximum (FWHM) and peak transmission.

4. Polarization.

5. Optical density (intensity).

A sub-array exposure control mechanism 38 and data transfer mechanism 39 (as described hereinabove) are not illustrated for reasons of simplicity. As the signal output Signal(e) is directly related to the photo-sensing active area of the sub pixel (d_width · d_length), this proposed embodiment provides another layer of flexibility in sensing a wide dynamic range scene that may also have a wide spectral distribution. Taking into account also S_λ (the sensing element response, i.e. responsivity), this proposed embodiment can provide a unified imaging sensor 15 output from the entire array.
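A short worked sketch of the area argument above: since Signal(e) scales linearly with d_width · d_length, a band with (as an assumed example) a ten times weaker S_λ · P(λ) product would need roughly ten times the active area for the same accumulated signal:

```python
# Signal(e) scales linearly with the sub pixel active area d_width * d_length,
# so comparable outputs across bands can be approached by sizing the sub pixels.
# Assumed example: band B collects 10x less signal per unit area than band A.
area_a_um2 = 9.0        # 3 um x 3 um sub pixel for band A
relative_flux_b = 0.1   # band B: 10x weaker S_lambda * P(lambda) product
area_b_um2 = area_a_um2 / relative_flux_b
print(area_b_um2)       # 90 um^2, e.g. roughly a 9.5 um x 9.5 um sub pixel
```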

[0046] Figure 10 illustrates a stereo vision system 50 constructed and operative in accordance with some embodiments of the present invention. Stereo vision system 50 is similar to mono vision system 10 in almost every aspect except: an additional imaging channel and an additional processing layer are added, which also provide 3D mapping information in day-time conditions, night-time conditions and any other light conditions. Stereo vision control 52 provides the functionality of mono vision control 12 and also synchronizes each mosaic spectral imaging camera 15. Stereo vision system control 51 provides the functionality of mono vision system control 11 and also includes all algorithms for 3D mapping. Stereo vision interfacing with the user via output 21 provides the functionality of mono vision system interfacing with the user via output 17 and may also include 3D mapping information.

[0047] While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the preferred embodiments. Other possible variations, modifications, and applications are also within the scope of the invention.