


Title:
INTEGRATED TRANSCEIVER MODULE AND LIGHTING AND MONITORING ARRANGEMENT
Document Type and Number:
WIPO Patent Application WO/2023/156857
Kind Code:
A1
Abstract:
An integrated transceiver module for forward lighting, dynamic signaling and sensing comprises a substrate (SB) and an active area (AR) further comprising an array of pixels. A transceiver circuit (TC) is operable to selectively drive pixels in a first mode of operation or in a second mode of operation. In the first mode of operation, the transceiver circuit (TC) is operable to drive pixels with a forward bias so as to emit light. In the second mode of operation, the transceiver circuit (TC) is operable to drive pixels with a reverse bias so as to detect light.

Inventors:
NGUYEN HO HOAI DUC (DE)
COLOMBO MATTEO (IT)
LOLLIO ALEX (IT)
Application Number:
PCT/IB2023/050371
Publication Date:
August 24, 2023
Filing Date:
January 16, 2023
Assignee:
AMS SENSORS GERMANY GMBH (DE)
AMS ITALY S R L (IT)
International Classes:
G01S7/481; G01S17/10; G01S17/931
Foreign References:
US20030122749A12003-07-03
FR2857756A12005-01-21
US20150348504A12015-12-03
Attorney, Agent or Firm:
VIERING, JENTSCHURA & PARTNER MBB (DE)
Claims:
CLAIMS

1. An integrated transceiver module for forward lighting, dynamic signaling and sensing comprising: a substrate (SB), an active area (AR) comprising an array of pixels, a transceiver circuit (TC) operable to selectively drive pixels in a first mode of operation or in a second mode of operation; wherein: in the first mode of operation, the transceiver circuit (TC) is operable to drive pixels with a forward bias so as to emit light, and in the second mode of operation, the transceiver circuit (TC) is operable to drive pixels with a reverse bias so as to detect light.

2. The module according to claim 1, wherein the transceiver circuit (TC) is operable to address pixels individually by means of a select signal, and the pixels comprise in-pixel circuitry which is operable to selectively provide the forward bias or the reverse bias depending on the select signal.

3. The module according to claim 1 or 2, further comprising a processing unit integrated into the module.

4. The module according to one of claims 1 to 3, wherein the transceiver circuit (TC) is operable to address pixels: to form a first subset of pixels commonly operated as light emitters in the first mode of operation, or to form a second subset of pixels commonly operated as light detectors in the second mode of operation.

5. The module according to claim 4, wherein the first subset of pixels form one or more light emitting segments (LS) on the active area (AR), and/or the second subset of pixels form one or more light detecting segments (DS) on the active area (AR).

6. The module according to claim 4 or 5, wherein in a LiDAR mode of operation, the first subset forms an emitter segment (ES) and the second subset forms a detector segment (DS), spaced apart from the emitter segment (ES), the transceiver circuit (TC) is operable to drive pixels from the emitter segment (ES) to emit pulses of light, and the transceiver circuit (TC) is operable to drive pixels from the detector segment (DS) to detect incident light synchronized with the emission of pulses of light.

7. The module according to claim 6, wherein the processing unit is operable to determine a time-of-flight of emitted pulses of light and detected incident light.

8. The module according to claim 4 or 5, wherein in a structured light mode of operation, the first subset of pixels form a predefined pattern (PT) on the active area (AR), the transceiver circuit is operable to drive pixels from the pattern (PT) to emit pulses of light, and the transceiver circuit (TC) is operable to detect incident light by way of one or more of the light detecting segments (DS) and synchronized with the emission of pulses of light.

9. The module according to claim 8, wherein the processing unit is operable to determine a detected pattern (PT) from light detected by the one or more of the light detecting segments (DS).

10. The module according to one of claims 1 to 9, comprising a third subset of pixels forming a lighting segment (LS) on the active area (AR).

11. The module according to one of claims 1 to 10, wherein the pixels comprise: light-emitting diodes, micro light-emitting diodes, and/or resonant-cavity light emitting devices.

12. The module according to one of claims 1 to 11, further comprising an array of micro-lenses (ML), wherein micro-lenses of the array of micro-lenses are aligned with respective pixels of the array of pixels.

13. The module according to claim 12, wherein micro-lenses of the array of micro-lenses (ML) are aligned with one or more of the light emitting segments (ES) on the active display area (AR), and/or micro-lenses of the array of micro-lenses (ML) are aligned with one or more of the light detecting segments (DS) on the active display area (AR).

14. A lighting and monitoring arrangement comprising: at least one integrated transceiver module according to one of claims 1 to 13, and a host system comprising the module, wherein the host system comprises: an illumination device for in-cabin illumination of a vehicle, or an illumination device for exterior illumination of a vehicle.

15. The lighting and monitoring arrangement according to claim 14, comprising at least a first and a second module, wherein the first and second module are spaced apart, and the first and second module are operated in a combined LiDAR mode of operation; wherein: a first subset of pixels form an emitter segment (ES) of the first module and a second subset of pixels form a detector segment (DS) of the second module, a first transceiver circuit (TC) is operable to drive pixels from the emitter segment (ES) to emit pulses of light, and a second transceiver circuit (TC) is operable to drive pixels from the detector segment (DS) to detect incident light synchronized with the emission of pulses of light.

Description:
"INTEGRATED TRANSCEIVER MODULE AND LIGHTING AND MONITORING ARRANGEMENT"

Technical field

This disclosure relates to an integrated transceiver module, e.g., for forward lighting, dynamic signaling and sensing. The disclosure further relates to a lighting and monitoring arrangement.

Background of the disclosure

Distance sensors, such as proximity, time-of-flight or light detection and ranging (LiDAR) sensors, are employed in various fields such as mobile devices, wearables, automotive devices and the like. A focus of current developments lies on manufacturing dedicated sensor modules which are to be integrated into a host system. One requirement relates to miniaturization, but such a sensor typically has only one function, i.e., detection of distance.

It is an object of the present disclosure to provide an integrated transceiver module and a lighting and monitoring arrangement with improved sensing and detection functionality in a single device.

These objectives are achieved by the subject-matter of the independent claims. Further developments and embodiments are described in the dependent claims.

It is to be understood that any feature described in relation to any one embodiment may be used alone, or in combination with other features described herein, and may also be used in combination with one or more features of any other of the embodiments, or any combination of any other of the embodiments unless described as an alternative. Furthermore, equivalents and modifications not described below may also be employed without departing from the scope of the integrated transceiver module and the lighting and monitoring arrangement, both of which are defined in the accompanying claims.

Summary of the disclosure

The following relates to an improved concept in the field of lighting with signaling and sensing. The concept is based on the observation that light emitting devices, such as light-emitting diodes and/or semiconductor lasers, such as vertical cavity surface emitting lasers (VCSELs), depending on their bias, can be used alternately as both detector and emitter. For example, reverse biasing of said light emitting devices allows for efficient photo detection using the Stark Effect for LEDs and the Quantum-Confined Stark Effect for lasers, such as VCSELs.

The proposed concept suggests integrating an active area comprising an array of pixels into a common module, and driving pixels individually to emit light in one mode of operation. In another mode of operation, a sensing functionality can be achieved by reverse biasing of the pixels. This means that the same module can be employed as emitter and receiver. Thus, the module forms an electro-optical transducer which is electrically driven by a transceiver circuit. Consequently, lighting and sensing functionality can be included in a single device.

In at least one embodiment, an integrated transceiver module is arranged for forward lighting, dynamic signaling and sensing. The module comprises a substrate and an active area. The active area comprises an array of pixels. Furthermore, the module comprises a transceiver circuit.

The transceiver circuit is operable to selectively drive pixels in a first mode of operation or in a second mode of operation. In the first mode of operation, the transceiver circuit is operable to drive pixels with a forward bias so as to emit light. In the second mode of operation, the transceiver circuit is operable to drive pixels with a reverse bias so as to detect light.

For example, the substrate can be a silicon substrate, e.g., a silicon wafer or a diced chip of a silicon wafer. The substrate can also be of a different material such as FR4 or polyimide. For example, it is possible to grow InGaN-based LEDs and micro-LEDs directly on sapphire and transfer them afterwards. The substrate may further comprise functional layers having circuitry for operating the pixels, such as components of a readout circuit and/or a driving circuit, for instance.

The pixels may be considered semiconductor light emitting devices. For example, the pixels are integrated on or into the substrate and/or the transceiver circuit.

In at least one embodiment, the pixels are of a same type, e.g., light-emitting diodes emitting in the same spectral range. Thus, the array of pixels may be considered an array for lighting applications. The term "active area" denotes that, by means of the array of pixels, said portion of the area is capable of emitting light and/or sensing light which is incident on the active area.

The pixels can be altered in their functionality by means of a bias applied to them. The bias is provided by means of the transceiver circuit. The transceiver circuit is electrically connected to the array of pixels. For example, the transceiver circuit is configured to address the pixels individually. Thus, the transceiver operates as a driver circuit of pixels. By way of the electrical connections, the transceiver circuit provides a bias to the pixels. Which bias is applied to the pixels is defined according to the mode of operation of the module. Depending on the mode of operation, pixels can be operated as light detector or emitter. Whether a pixel operates as detector or emitter depends on the bias it receives from the transceiver circuit. For example, reverse biasing allows for efficient photo detection using the Stark Effect or Quantum-Confined Stark Effect. This way, a pixel can absorb visible or IR light. Thus, the transceiver operates as a detection circuit of pixels.
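The bias-dependent behavior can be summarized in a small model. This is a sketch for illustration only; the class names and the responsivity figure are assumptions, not taken from the application:

```python
from enum import Enum

class Bias(Enum):
    FORWARD = 1  # first mode of operation: the pixel emits light
    REVERSE = 2  # second mode of operation: the pixel detects light

class Pixel:
    """Toy behavioural model of one pixel as driven by the transceiver circuit."""

    def __init__(self):
        self.bias = Bias.FORWARD

    def step(self, drive_current_a, incident_power_w, responsivity_a_per_w=0.3):
        """Return (emits_light, photocurrent_a) for the currently applied bias.

        Forward bias: the drive current produces light emission, no photocurrent.
        Reverse bias: incident light produces a proportional photocurrent.
        """
        if self.bias is Bias.FORWARD:
            return drive_current_a > 0.0, 0.0
        return False, responsivity_a_per_w * incident_power_w

px = Pixel()
emits, _ = px.step(5e-3, 0.0)         # driven with 5 mA: the pixel emits
px.bias = Bias.REVERSE                # transceiver inverts the polarity
_, photocurrent = px.step(0.0, 1e-6)  # 1 uW incident light yields a photocurrent
```

The point of the model is that no structural change is needed to switch roles: only the applied bias differs between the two modes.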

The transceiver circuit allows the array of pixels, or parts thereof, to be operated with a light emitting or with a sensing functionality. This means that the same device can be employed as emitter and receiver, thus forming an electro-optical transceiver. Consequently, sensing functionality and illumination can be included in the module without requiring an additional sensor chip. Basically, the module acts as a transceiver, driven by the transceiver circuit. No additional optical components are required. The circuitry to invert the polarity of the pixels in order to make them optical sensors may be comprised by the drive electronics, i.e. the transceiver circuit.

The module comprises an integrated circuit, and the transceiver circuit can be considered a driver IC which is modified to include driver and receiver in one building block. For example, the transceiver circuit is also operable to read out photo-signals which are generated by those pixels which are operated as photodetectors, for instance. In this sense, the transceiver circuit comprises both a driver and a detector circuit.

In at least one embodiment, the transceiver circuit is operable to address pixels individually by means of a select signal. The pixels comprise in-pixel circuitry which is operable to selectively provide the forward bias or the reverse bias depending on the select signal. For example, the transceiver circuit issues one or more select signals to select one pixel (or more pixels). A selected pixel is provided with the forward bias or reverse bias depending on the one or more select signals, according to the intended mode of operation.

Whether a pixel has the function of emitter or detector may be determined by means of the transceiver circuit and by way of the select signal(s). This allows for a higher degree of freedom, as pixels may be selected or grouped together to operate in the same mode of operation, for example. A pixel may only be in one mode of operation at a time. However, the same pixel may be operated subsequently in different modes of operation. Different pixels may at times be operated in either different modes of operation or in a same mode of operation. This concept allows various complex detection schemes to be implemented which rely on certain sequences of modes of operation. Ultimately, the same module may be used to implement different sensing functionality using the same device, i.e. by virtue of addressing and driving pixels.
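As a rough illustration of this addressing scheme, a controller could keep a per-frame select table and translate it into bias settings; the table layout and mode names below are hypothetical, not taken from the application:

```python
def biases_from_select(select_map):
    """Translate a select table {pixel_id: "emit" | "detect"} into per-pixel biases."""
    table = {"emit": "forward", "detect": "reverse"}
    return {pid: table[mode] for pid, mode in select_map.items()}

# The same pixels can take different roles in successive frames,
# which is how sequences of modes of operation are realised.
frame_1 = biases_from_select({0: "emit", 1: "emit", 2: "detect"})
frame_2 = biases_from_select({0: "detect", 1: "emit", 2: "emit"})
```

Grouping several pixel IDs under the same mode in one frame corresponds to the subsets and segments described below.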

In at least one embodiment, the module further comprises a processing unit which is integrated into the module.

The modes of operation may be controlled by the processing unit, such as a microcontroller or a processor. The processing unit may issue the select signal(s) to individual pixels or to subsets of pixels. For example, the processing unit controls communication, timing and/or synchronization operations in order to define the sensor functionality, e.g. by alternating the modes of operation as needed. For example, the two modes may alternate within a refresh rate and, thus, may not be noticed by human perception. The detection functionality of the pixels may not interfere with the illumination function.

In at least one embodiment, the transceiver circuit is operable to address pixels to form subsets. For example, the transceiver circuit addresses pixels to form a first subset of pixels by commonly operating said pixels as light emitters in the first mode of operation. In addition, or alternatively, the transceiver circuit addresses pixels to form a second subset of pixels by commonly operating said pixels as light detectors in the second mode of operation.

In at least one embodiment, the first subset of pixels form one or more light emitting segments on the active display area. In addition, or alternatively, the second subset of pixels form one or more light detecting segments on the active display area.

The subsets allow groups of pixels to be addressed to operate in a same mode of operation at a time. Pixels of a given subset may form the segments on the active area. For example, a segment may be dedicated to illumination and one or more other segments may be dedicated to sensor functionality. Instead of forming segments, however, the pixels of any subset may be spread over the active area in a regular or irregular fashion, e.g. without any immediate neighbor operated in the respective other mode of operation. This way, pixels of a respective subset may be indiscernible from pixels of another subset.

In general, there may be pixels which are not operated in changing modes of operation. For example, pixels from a third subset may be operated in the first mode of operation at all times, i.e. the transceiver circuit drives these pixels with a forward bias so as to emit light. This way there may be a segment on the active area which is used for illumination throughout. The resulting segments may cover the active area with defined ratios. For example, there may be a 1:3 ratio of segments dedicated to sensing functionality, e.g. changing modes of operation, as opposed to segments dedicated to illumination only, i.e. not changing the mode of operation.

In at least one embodiment, in a LiDAR mode of operation, the first subset forms an emitter segment and the second subset forms a detector segment, spaced apart from the emitter segment. The transceiver circuit is operable to drive pixels from the emitter segment to emit pulses of light. The transceiver circuit is operable to drive pixels from the detector segment to detect incident light synchronized with the emission of pulses of light.

Light detection and ranging, LiDAR for short, provides a method for determining ranges (such as variable distance) by targeting an external object with light pulses emitted by the pixels of the emitter segment and measuring the time for the reflected light to return to the detector segment. A field of view is illuminated with a wide, diverging beam of light in a single pulse. Depth information is collected using the time-of-flight of the reflected pulse (i.e., the time it takes each emitted pulse to hit the target and return to the array), which requires the pulsing (emission by the emitter segment) and acquisition (detection by the detector segment) to be synchronized. A range of detection can be adjusted or extended depending on how much the emitter segment and the detector segment are spaced apart (baseline).
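The underlying relation is elementary: for a round-trip time t and speed of light c, the target distance is d = c·t/2 when the baseline is small compared with the range. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Target distance from the round-trip time of a reflected pulse.

    Assumes the emitter-detector baseline is negligible compared with the range.
    """
    return C * round_trip_s / 2.0

# A pulse returning after roughly 66.7 ns corresponds to a target about 10 m away.
```

For larger baselines the geometry becomes triangular and the simple halving is only an approximation.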

In at least one embodiment, the processing unit is operable to determine a time-of-flight of emitted pulses of light and detected incident light. For example, sensor signals generated by the pixels of the detector segment are received by the transceiver circuit and processed in a processing unit, e.g., a microcontroller, to yield proximity or time-of-flight information.

In at least one embodiment, in a structured light mode of operation, the first subset of pixels form a predefined pattern on the active area. The transceiver circuit is operable to drive pixels from the pattern to emit pulses of light. The transceiver circuit is operable to detect incident light by way of one or more of the light detecting segments and synchronized with the emission of pulses of light.

The first subset of pixels may project the predefined or known pattern (e.g., grids or horizontal bars) onto an external scene.

The projected patterns deform when striking an external surface and return to the module by way of reflection or scattering. The deformed pattern can be detected by the second subset of pixels. These pixels are operated by the transceiver circuit as light detectors, i.e. in the second mode of operation. Detection is synchronized with the emission of pulses of light. For example, the pixels are operated as detectors only after emission of pulses. Alternatively, the pixels are operated as detectors continuously.

The second subset of pixels generates detection signals which allow the returned pattern to be reconstructed. A deformation can be detected from light detected by the one or more light detecting segments. For example, vision systems (external or integrated into the module) allow depth and surface information of external objects in a scene to be calculated.
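One common way to turn the detected deformation into depth is triangulation: a pattern feature shifted laterally by d pixels, observed at focal length f with emitter-detector baseline b, lies at depth Z = f·b/d. The application does not spell this out; the function and the example values below are illustrative:

```python
def depth_from_shift(focal_px: float, baseline_m: float, shift_px: float) -> float:
    """Triangulated depth of a pattern feature from its observed lateral shift."""
    if shift_px <= 0.0:
        raise ValueError("pattern shift must be positive")
    return focal_px * baseline_m / shift_px

# Example: f = 500 px, b = 5 cm, shift = 5 px -> depth of 5 m.
```

Small shifts correspond to distant surfaces, so the resolvable shift limits the maximum depth range.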

In at least one embodiment, the processing unit is operable to determine a detected pattern from light detected by the one or more of the light detecting segments .

In at least one embodiment, the module comprises a third subset of pixels forming a lighting segment on the active area. The lighting segment may be used for lighting a scene, for example. The pixels forming the third subset may be operated only in the first mode of operation.

In at least one embodiment, the pixels comprise light-emitting diodes, micro light-emitting diodes, and/or resonant-cavity light emitting devices.

LEDs and microscopic LEDs, or micro-LEDs for short, are based on conventional technology, e.g., for forming gallium nitride based LEDs. However, micro-LEDs are characterized by a much smaller footprint. Each micro-LED can be as small as 5 micrometers in size, for example. Micro-LEDs enable a higher pixel density or a lower population density of active components, while maintaining a specific pixel brightness or luminosity. The latter aspect allows for the placement of additional active components in the pixel layer of the display, thus allowing for additional functionality and/or a more compact design. Micro-LEDs offer enhanced energy efficiency compared to conventional LEDs by featuring a significantly higher brightness of emission.

A resonant-cavity light emitting device can be considered a semiconductor device, similar to a light-emitting diode, which is operable to emit light based on a resonance process. In this process, the resonant-cavity light emitting device may directly convert electrical energy into light, e.g., when pumped directly with an electrical current to create amplified spontaneous emission. However, instead of producing stimulated emission, only spontaneous emission may result, e.g., spontaneous emission perpendicular to a surface of the semiconductor is amplified. A resonant photodetector is established when reverse biasing a resonant light emitting device, such as a VCSEL or resonant-cavity LED, for instance.

In at least one embodiment, the module further comprises an array of micro-lenses, wherein micro-lenses of the array of micro-lenses are aligned with respective pixels of the array of pixels. The optics, e.g. micro-lenses, may be monolithically grown on top of the pixels of the array or integrated / stacked on top of the module.

In at least one embodiment, micro-lenses of the array of micro-lenses are aligned with one or more of the light emitting segments on the active area. In addition, or alternatively, the micro-lenses of the array of micro-lenses are aligned with one or more of the light detecting segments on the active area. This way a dedicated optic can be created with defined fields-of-view for the respective segments.

In at least one embodiment, a lighting and monitoring arrangement comprises one or more integrated transceiver modules according to the improved concept discussed above, and a host system comprising the module. For example, the host system comprises an illumination device for in-cabin illumination of a vehicle, or an illumination device for exterior illumination of a vehicle.

In at least one embodiment, a lighting and monitoring arrangement comprises at least a first and a second module. The first and second module are spaced apart (forming a baseline). The first and second module are operated in a combined LiDAR mode of operation. In said mode of operation, a first subset of pixels forms an emitter segment of the first module and a second subset of pixels forms a detector segment of the second module. A first transceiver circuit is operable to drive pixels from the emitter segment to emit pulses of light. A second transceiver circuit is operable to drive pixels from the detector segment to detect incident light synchronized with the emission of pulses of light.

The two modules implement a LiDAR detector. Compared with a single-module LiDAR, a larger baseline and, thus, a larger range can be achieved. Further modules can be used to complement the two modules in order to dynamically create LiDAR detectors. This way, different ranges according to different directions of interest can be established.

Further embodiments of the lighting and monitoring arrangement according to the improved concept become apparent to a person skilled in the art from the embodiments of the integrated transceiver module described above, and vice versa.

The following description of figures of example embodiments may further illustrate and explain aspects of the improved concept. Components and parts with the same structure and the same effect, respectively, appear with equivalent reference symbols. Insofar as components and parts correspond to one another in terms of their function in different figures, the description thereof is not necessarily repeated for each of the following figures.

Brief description of the drawings

In the Figures:

Figures 1A, 1B show example embodiments of integrated transceiver modules,

Figures 2A to 2D show example embodiments of in-pixel circuitry,

Figures 3A, 3B show further example embodiments of integrated transceiver modules,

Figures 4A, 4B show further example embodiments of integrated transceiver modules,

Figures 5A, 5B show further example embodiments of integrated transceiver modules,

Figure 6 shows a further example embodiment of an integrated transceiver module,

Figures 7A, 7B show further example embodiments of lighting and monitoring arrangements,

Figures 8A to 8C show an example embodiment of an integrated illumination module, and

Figure 9 shows an example flowchart of a method of operating an integrated illumination module.

Detailed description

Figure 1A shows an example embodiment of an integrated transceiver module. An integrated transceiver module comprises a substrate SB, an active area AR, a transceiver circuit TC and an array of micro-lenses ML. The module constitutes an integrated circuit with all its components integrated on the substrate or electrically connected thereto. For example, the module comprises a stack of substrate, active area, transceiver circuit and array of micro-lenses.

The active area AR comprises an array of pixels. Pixels are denoted light emitting devices. For example, pixels comprise light-emitting diodes, micro light-emitting diodes, laser diodes and/or resonant-cavity light emitting devices, such as VCSEL lasers. Typically, the active area comprises pixels of a same type, e.g. light-emitting diodes. However, the active area may also comprise pixels of different types, e.g. light-emitting diodes and VCSELs. The pixels are integrated either on the transceiver circuit TC or in the substrate SB.

The array of micro-lenses ML comprises optical elements, such as micro-lenses, lenses and/or diffusers. For example, the micro-lenses are aligned with respective pixels of the array of pixels. Lenses and diffusers may be aligned with a number of pixels in order to determine a common field-of-view, for example. The optics could be monolithically grown on top of the pixels of the array or integrated / stacked on top of the module.

The transceiver circuit TC is integrated on the substrate SB. The transceiver circuit comprises circuitry to individually address pixels from the array. Pixels can be addressed by means of a select signal. Furthermore, the transceiver circuit comprises circuitry to selectively drive pixels in different modes of operation, or a combination or sequence of modes. The modes of operation are defined with respect to a bias which is provided to the respective pixels. The transceiver circuit addresses a pixel and provides either a forward bias or a reverse bias to the addressed pixel.

For example, in a first mode of operation, the transceiver circuit drives (or provides) pixels with a forward bias so as to emit light. In a second mode of operation, the transceiver circuit drives (or provides) pixels with a reverse bias so as to detect light. The pixels change their functionality depending on the bias applied to them. Depending on the mode of operation, pixels can be operated as light detector or emitter. Whether a pixel operates as detector or emitter depends on the bias it receives from the transceiver circuit. For example, reverse biasing allows for efficient photo detection using the Stark Effect or Quantum-Confined Stark Effect. This way, a pixel can absorb visible or IR light, for example. Thus, the transceiver operates as a detection circuit of pixels.

Figure 1B shows an example embodiment of an integrated transceiver module. The drawing shows the module in greater detail. The transceiver circuit comprises power terminals to receive a driving current and a communication and diagnosis interface for communication with further electronic components, such as a processing unit (external or integrated in the same module). Furthermore, the pixels comprise in-pixel circuitry which, as with the pixels, is integrated into the transceiver circuit. The in-pixel circuitry can be addressed by means of the select signal(s). Depending on the applied select signal, the pixels receive either the forward bias (pixels PX1) or the reverse bias (pixels PX2).

Figures 2A and 2B show example embodiments of in-pixel circuitry. The pixels of the array are arranged for light emission when they are provided with a forward bias Vbias,forward. However, if the same pixel is biased differently, e.g. with a reverse bias represented as Vbias,backward, the pixel is operable to detect light. The different bias conditions are provided to the pixels by inverting the polarity of the bias current between Ibias,forward and Ibias,backward, as illustrated by the diode symbols in Figure 2A (forward bias, PX1) and Figure 2B (reverse bias, PX2).

The transceiver circuit TC is configured to alter the polarity of the bias current and provide this current to the pixels during the first and second mode of operation. In the first mode of operation, an LED junction is forward biased to emit light energy at various wavelengths that depend on the materials used. The reverse of this effect is that a standard light-emitting junction can operate as a light-detecting junction in the second mode of operation, generating a photocurrent proportional to the incoming light energy. In the embodiment of Figures 2A and 2B, the in-pixel circuitry comprises MOSFET transistors which are connected along their drain and source terminals to the pixel, e.g. a light emitting diode. The gate may be coupled with a respective control terminal to receive the select signals.

The layout of the pixels PX1, PX2 and in-pixel circuitry may be optimized with respect to the structures depicted in Figures 2C and 2D. Figure 2C shows MOSFETs with a rectangular gate electrode with channel length L and channel width W. The drawing depicts an active area AA, polysilicon gate electrode G, contacts CT, metallic interconnection MI, source electrode S and drain electrode D. Basically, the MOSFETs are designed as stripes of width W and placed in parallel with length L. This layout effectively allows space requirements to be reduced. Figure 2D shows MOSFETs with a waffle structure. The drawing again depicts an active area AA, polysilicon gate electrode G, contacts CT, metallic interconnection MI, source electrode S and drain electrode D. This alternative layout may reduce space requirements even further, theoretically down to 40% of substrate area.

The individual addressing of pixels by means of the transceiver circuit TC allows subsets of pixels to be formed. These subsets may form segments or contiguous areas on the array AR of pixels. The subsets may also form defined patterns on the array. Furthermore, the forming of segments and patterns can also change over the course of operation of the module. At any time, a single pixel, or commonly those pixels associated with a respective segment or pattern, can be operated either as light emitters in the first mode of operation or as light detectors in the second mode of operation.

Figure 3A shows a further example embodiment of an integrated transceiver module. The drawing shows the array of pixels in top view. Depicted are three segments SG1, SG2, SG3 which are formed by respective subsets of pixels. Whether a pixel belongs to a respective segment is not a question of a hardwired connection, or the like. Rather, the transceiver circuit TC addresses the pixels and thereby determines whether said pixels are operated in the first or second mode of operation. In this example, pixels are addressed to form a first subset of pixels and are commonly operated as light emitters in the first mode of operation. The first subset of pixels forms a light emitting segment ES on the active area AR. Similarly, pixels are addressed to form a second subset of pixels and are commonly operated as light detectors in the second mode of operation. The second subset of pixels forms a light detecting segment DS on the active area AR.

Furthermore , some pixels are addressed to form a third subset of pixels . The third subset forms a lighting segment LS on the active display area . This lighting segment may not alter its mode of operation and may be used for illumination, e . g . of parts of a cabin .

This embodiment can be used as a LiDAR detector. In a LiDAR mode of operation, the first subset forms an emitter segment and the second subset forms a detector segment. These segments correspond to the light emitting segment ES and the light detecting segment DS and are spaced apart from each other to form a baseline. In order to implement the LiDAR functionality, the transceiver circuit TC further drives pixels from the emitter segment to emit pulses of light. Correspondingly, the transceiver circuit further drives pixels from the detector segment to detect incident light. Operation of the emitter segment and the detector segment is synchronized with the emission of pulses of light.

The module may optionally comprise a processing unit integrated into the module, such as a microcontroller or ASIC. The processing unit determines a time-of-flight between the emission of a pulse of light, as start event, and the detection of incident light, as stop event. The LiDAR mode of operation provides a method for determining ranges (such as variable distance) by targeting an external object with light pulses emitted by the pixels of the emitter segment. Measuring the time-of-flight of pulses reflected at the external object and returned to the detector segment provides a measure of distance.
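The time-of-flight calculation performed by such a processing unit reduces to a simple relation: the pulse travels the distance to the target twice, so the range is half the speed of light multiplied by the measured flight time. The function below is an illustrative sketch; its name and interface are assumptions, not taken from the application.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_tof(t_start, t_stop):
    """Range to the target from a start timestamp (pulse emission) and a
    stop timestamp (detection of the reflected pulse), both in seconds.
    The factor 0.5 accounts for the round trip of the pulse."""
    tof = t_stop - t_start
    return 0.5 * SPEED_OF_LIGHT * tof
```

For example, a round trip of about 6.67 ns corresponds to a target roughly 1 m away, which matches the in-cabin ranges discussed later in the application.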

A field of view is illuminated with a wide diverging light in a single pulse, for example. The optics, such as the micro-lens array ML, define the field of view. For example, the optics can be arranged to illuminate a desired field of view, e.g. inside a cabin. This way, range measurement can be directed into a direction of interest, e.g. where a driver is located, or not (presence detection). Depth information is collected using the time-of-flight of the reflected pulse (i.e., the time it takes each emitted pulse to hit the target and return to the array), which requires the pulsing (emission by the emitter segment) and acquisition (detection by the detector segment) to be synchronized.

Figure 3B shows a further example embodiment of an integrated transceiver module. The drawing shows the array of pixels in top view. Similar to Figure 3A, three segments are depicted which are formed by respective subsets of pixels.

In this example, pixels are addressed to form a first subset of pixels and are commonly operated as light emitters in the first mode of operation. The first subset of pixels forms a light emitting segment ES on the active area AR. Similarly, pixels are addressed to form a second subset of pixels and are also commonly operated as light emitters in the first mode of operation. The second subset of pixels forms another light emitting segment ES on the active area. Furthermore, some pixels are addressed to form a third subset of pixels. The subsets basically form lighting segments LS on the active display area. This embodiment can be used as a projector. In a projection mode of operation, one, two or all segments can be operated to emit light, e.g. at the same time or in a sequence, or only when activated. The segments may be assigned to illuminate a certain direction only. The optics, e.g. the micro-lens array, may also have segments which correspond to the respective lighting segments. This way a given lighting segment LS may be used to illuminate a dedicated field of view. The transceiver circuit TC then acts as a driver circuit which can address pixels to illuminate a desired direction of interest.

Figures 4A and 4B show further example embodiments of integrated transceiver modules. The transceiver circuit can alter the subsets, or allocate pixels to segments SG1 to SG3, simply by addressing pixels to be operated in the first or second mode of operation. This way the area or number of pixels allocated to a respective segment can be adjusted. For example, for the embodiments of Figures 3A and 3B an aspect ratio of 1:3 or 1:4 may be defined, i.e. the relative area of the lighting segment with respect to the emitter segment and/or detector segment. This way a large part of the module can be used for illumination and a smaller part for detection purposes, e.g. LiDAR. Figure 4B depicts that the shape of segments can also be adjusted, or altered during operation.

A range of detection can be adjusted or extended depending on how much the emitter segment and the detector segment are spaced apart (baseline) . The baseline can be determined by means of the transceiver circuit TC. The transceiver circuit can alter the subsets, or allocate pixels to segments, simply by addressing pixels to be operated in the first or second mode of operation. This way the emitter segment and the detector segment do not necessarily have to be fixed but may be spaced apart differently. By changing the distance between the segments, or baseline, different ranges can be detected.

Furthermore, in an embodiment not shown, the transceiver circuit TC may form further light emitting segments ES and/or light detecting segments DS. For example, more than one pair of emitter and detector segments can be formed, effectively forming a LiDAR detector with several ranges in parallel.

Figure 5A shows a further example embodiment of an integrated transceiver module. The drawing shows the array of pixels in top view. In contrast to Figure 3A, however, the first subset of pixels forms a pattern PT on the array.

For example, in a structured light mode of operation, the first subset of pixels forms a predefined pattern on the active display area. The pixels of the pattern are operated as light emitters, i.e. in the first mode of operation. The transceiver circuit is operable to drive pixels from the pattern to emit pulses of light. Thus, the first subset of pixels may project the predefined or known pattern (e.g., grids or horizontal bars) onto an external scene (as indicated in Figure 5B).

These patterns deform when striking an external surface and eventually return to the module by way of reflection or scattering. The deformed pattern can be detected by the second subset of pixels. These pixels are operated by the transceiver circuit TC as light detectors, i.e. in the second mode of operation. Detection is synchronized with the emission of pulses of light. For example, the pixels are operated as detectors only after emission of pulses. Alternatively, the pixels are operated as detectors continuously.

The second subset of pixels generates detection signals which allow the returned pattern to be reconstructed. A deformation can be detected from light detected by one or more of the light detecting segments. For example, vision systems (external or integrated into the module) can calculate depth and surface information of external objects in a scene.
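The application does not specify how depth is computed from the deformed pattern; a common approach in structured-light systems is triangulation, where the observed shift (disparity) of a pattern feature between its projected and detected position is converted into depth. The sketch below shows this standard relation; the function name and parameters are illustrative assumptions.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic triangulation relation z = f * b / d for structured-light
    or stereo setups: `focal_px` is the focal length in pixels,
    `baseline_m` the emitter-detector separation in meters, and
    `disparity_px` the observed pattern shift in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Larger disparities correspond to closer surfaces, which is why the deformation of the grid or bar pattern encodes the surface shape of the external object.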

Figure 6 shows an example embodiment of a lighting and monitoring arrangement. The sensing and emission functionality can be integrated into a single module. However, it is also possible to combine a pair or more of modules in order to build a larger lighting and monitoring arrangement. For example, a first and a second module are spaced apart, e.g. by a baseline. The first and second module are operated in a combined LiDAR mode of operation, i.e. the first module serves as emitter and the second module serves as detector.

In the combined LiDAR mode of operation, a first subset of pixels of the first module M1 forms an emitter segment ES (or pattern PT) and a second subset of the second module M2 forms a detector segment DS (or pattern PT). The emitter and detector are spaced apart from each other to form a baseline. In order to implement the LiDAR functionality, the transceiver circuit of the first module drives pixels from the emitter to emit pulses of light. Correspondingly, the transceiver circuit TC of the second module drives pixels from the detector to detect incident light. Operation of the emitter and the detector segment is synchronized with the emission of pulses of light. For example, the transceiver circuits may be electrically or optically connected to establish synchronization.

The two modules can be integrated into a host system. For example, the modules can be arranged in an illumination device for in-cabin illumination of a vehicle. The host system can also be an illumination device for exterior illumination of a vehicle, e.g. a left and a right headlamp.

In general, the functionality and features discussed herein for a single module can be applied to any pair or larger number of modules. In fact, any specific functionality, such as driving pixels in a mode of operation may be shared between modules so as to complement each other to achieve a combined functionality. Synchronization may be supported by means of one or more processing units. These units may be integrated in the modules or may be an external component, e.g. a microprocessor of the host system.

Figures 7A and 7B show further example embodiments of lighting and monitoring arrangements. There are as many possible applications as there are possible host systems. The module combines pixels to function as emitter and detector (e.g., a reverse-biased LED as photodiode and a forward-biased LED as light emitter). Modules include monolithic integration of driver and detector circuit (transceiver circuit) and optics, such as a micro-lens array. The modules allow for a smart segmented layout of emitter and receiver arrays, both in a static or dynamic fashion. Possible applications include ranging, a LiDAR cocoon, proximity sensing and in-cabin sensing, to name but a few.

Figure 7A shows possible host systems associated with a vehicle. These systems typically provide lighting inside or outside of the vehicle. Using the proposed module, however, the lighting can be complemented with sensing functionality. Examples include interior lighting, head lamps (e.g., head light, turn indicator), fog lights, exterior displays, rear and front lamps and design elements, etc. As a general guideline, the module can be used wherever there is a need for semiconductor light emitters; light detection can then be provided as well.

Figure 7B shows a LiDAR cocoon. The lighting and monitoring arrangement comprises a plurality of modules M1 to M6 which are arranged around a vehicle. Each one of the modules may be operated in the LiDAR mode of operation. However, pairs of modules can also be assigned and operated in the combined LiDAR mode of operation. This way the lighting and monitoring arrangement can be used with various combinations, baselines and, thus, ranges.

Figures 8A to 8C show an example embodiment of an integrated illumination module. The module comprises a substrate SB, an active area AR and a driver circuit DC. The drawing shows the module in side view.

The active area AR comprises an array of pixels. At least some pixels of the array are arranged in segments SG1, SG2, SG3 configured to provide illumination to a respective zone of a cabin. Pixels are also denoted light emitting devices. For example, pixels comprise light-emitting diodes, micro light-emitting diodes, laser diodes and/or resonant-cavity light emitting devices. In this example embodiment the pixels comprise VCSEL lasers and are arranged to emit visible light.

Pixels which are arranged in a segment are commonly operated to illuminate the respective zone of the cabin. Typically, the active area AR comprises pixels of a same type, e.g. light-emitting diodes. However, the active area may also comprise pixels of different types, e.g. light-emitting diodes and VCSELs. The pixels are integrated either on the driver circuit or in the substrate. In fact, the pixels, or the array, are directly integrated on the driver circuit or the substrate, i.e. the driver circuit DC and/or the substrate SB form an integrated circuit with the pixels, or array.

The module further comprises a plurality of optical elements ML. Each optical element is arranged to cover a segment SG1, SG2, SG3 of pixels. The optical elements can be a diffuser or micro-lens, for example. Each optical element is respectively configured to define a field of view of a respective illumination beam which is emitted from the pixels of a corresponding segment. The fields of view of the segments can be partially overlapping, as depicted in Figures 8B and 8C, or non-overlapping. The pixels of a segment can emit at a different wavelength, or mixture of wavelengths, compared to neighboring segments. This makes it easier to distinguish segment emissions, e.g. for the purpose of additional sensing functionality.

An example of a (refractive) optical element diffuser is a lens placed over a pixel or a segment. If the light emitted from the segment is a collimated beam, a negative lens can be used to turn the collimated beam into a divergent beam. Alternatively, a positive lens with a focal length which is much shorter than the distance to the illuminated target can be used. A larger diffusing angle, also referred to as a larger field of view, can be achieved using a stronger lens. A segment of pixels can be covered by an array of lenses, with one lens per pixel, or alternatively a single refractive lens can be used to cover the pixels of the whole segment. Alternatively, an array of prisms or other refractive optical elements can be used to diffract the light. The same optical function can be achieved with a micro-structured meta-surface or an array of micro-lenses.
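The relation between lens strength and diffusing angle can be estimated with a thin-lens sketch: a collimated beam of radius r leaving a lens diverges as if from the focal point, so the half-angle of divergence is approximately atan(r/|f|). This is a first-order approximation under the thin-lens assumption; the function name and units are illustrative.

```python
import math

def divergence_half_angle_deg(beam_radius_m, focal_length_m):
    """Thin-lens estimate of the diffusing half-angle in degrees for a
    collimated beam of radius `beam_radius_m` passing a lens of focal
    length `focal_length_m` (negative or short positive). A shorter
    (stronger) focal length yields a larger angle, i.e. a wider field
    of view."""
    return math.degrees(math.atan(beam_radius_m / abs(focal_length_m)))
```

This captures the statement above: replacing a lens by a stronger one (shorter |f|) widens the field of view of the covered segment.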

Examples of diffractive optical elements are a grating, or a small opening in an opaque screen. A smaller opening will create a larger amount of diffraction, or varying the grating constant will vary the amount of diffraction accordingly. An array of small openings can be used to create a speckle pattern based on interference between the light emerging from the openings. Holographic diffusers can be manufactured with photopolymers, and provide a further option for implementing the invention. Holographic diffusers may provide more precise control over the shape of the output beam and may thus help to homogenize the output beam to reduce a risk of hot spots compared to diffractive and/or refractive diffusers. A holographic diffuser may comprise one or more photopolymer layers comprising pseudo random, non-periodic structures, for example micro-lenses configured to provide a predetermined output field of view.

The optical elements ML, of individual or array type, can be integrated or etched directly on the pixels, or array. Thus, the optics can be considered an integral part of the integrated illumination module.

The drawing of Figure 8A shows an example with three segments of pixels, or emitter arrays. A respective optical element ML is arranged to cover the segments, so that in operation, by means of the driver circuit, respective beams of light are directed into a cabin of a vehicle, for example. The beams illuminate respective areas in the cabin. For example, a field of illumination is split into three areas according to the segments formed on the module array. The size of these areas depends on the eye safety limit for passengers in the cabin, for example.

Each area A1 to A3 can serve a different field of illumination and/or additional sensing functionality in a cabin. For example, a first area serves the driver for driver illumination and monitoring, e.g. high-resolution vital sign monitoring at a range of about 1 m. A second area serves the rear-row passengers at a range of about 2 m, and a third area serves the co-drivers.

The additional sensing functionality can be achieved by complementing the driver circuit with the proposed integrated transceiver circuit. This way the module constitutes a transceiver module for forward lighting, dynamic signaling and sensing as discussed above. All features and embodiments discussed above then apply to the integrated illumination module for in-cabin monitoring. The additional sensing functionality can also be achieved by one or more external sensors which are arranged inside or outside of the cabin. These sensors include any of proximity sensors, time-of-flight sensors, LiDAR sensors, occupancy sensors, vital sign sensors, seat belt sensors, cameras, gesture sensors, and seat sensors, for example.

The driver circuit comprises an input to receive an occupancy signal. The occupancy signal indicates a presence or occupancy of a person in the cabin. By way of the input, one or more occupancy signals can be received by the module. In turn, the driver circuit selectively drives pixels from the segments and adjusts the illumination of a respective zone of the cabin depending on the received occupancy signal. For example, only an area of the cabin is illuminated from which an occupancy signal was received, indicating that a person occupies said area.

This makes it possible to save power needed to illuminate the cabin, as only the occupied area is illuminated, while other areas are not illuminated at all, or only with reduced intensity. For example, it can be shown that the segmented array can reduce the total optical power needed by a factor of three to seven. The emission wavelengths of segments could differ, e.g. 850 nm and 940 nm, to avoid crosstalk.

The input can be implemented as an interface for the external sensor(s). The input can also be an internal terminal which provides the signal generated by the "internal" sensor, i.e. the integrated transceiver module.

Figure 9 shows an example flowchart of a method of operating an integrated illumination module. The proposed integrated illumination module allows for dynamic adjustment of the in-cabin illumination. For example, all segments are turned on for a cabin scan. Afterwards, some arrays can be switched off depending on the passengers in the car. For example, only the driver is illuminated if there are no further passengers in the cabin. The illumination field can be dynamically controlled during the journey, i.e. once there is a change in occupancy the integrated illumination module can account for that presence.

The drawing shows an example flow which can be executed when the vehicle starts (step S1). In step S2 all segments are turned on and the occupancy signals from all involved sensors, internal or external, are "scanned". In a next step occupancy is determined (step S3). Depending on the occupancy, only those areas are illuminated which correspond to respective segments of the module. Unneeded segments are turned off, or are reduced in illumination intensity (step S4). The sequence of steps S2 to S4 can be looped for dynamic scanning. The looping may stop when the vehicle stops (step S5), and the flow may return to step S1 once the vehicle is started again. Instead of switching off, a repetition rate of emitters can be decreased if no passenger is present in a particular area.
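One iteration of the scan loop (steps S2 to S4) can be sketched as follows. The sensor and driver interfaces here are hypothetical stand-ins; the application leaves the concrete interfaces open.

```python
# Hedged sketch of one S2-S4 iteration of the flow of Figure 9.
def illumination_cycle(read_occupancy, set_segment):
    """S2: turn all segments on for the cabin scan; S3: determine
    occupancy per zone; S4: keep only segments whose zone is occupied.
    `read_occupancy` returns the set of occupied segment names and
    `set_segment(name, on=...)` drives a segment; both are assumed
    callbacks into the (internal or external) sensors and the driver
    circuit."""
    segments = ("SG1", "SG2", "SG3")
    for sg in segments:                 # S2: all segments on
        set_segment(sg, on=True)
    occupied = read_occupancy()         # S3: determine occupancy
    for sg in segments:                 # S4: switch off unneeded segments
        set_segment(sg, on=(sg in occupied))
    return occupied
```

Looping this function implements the dynamic scanning described above; the loop would be entered at vehicle start (S1) and left at vehicle stop (S5). Reducing intensity or repetition rate instead of switching off would replace the boolean `on` with a drive level.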

While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the invention. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.

Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.

A number of implementations have been described. Nevertheless, various modifications may be made without departing from the spirit and scope of the invention. Accordingly, other implementations are within the scope of the claims.

For example, further aspects of the disclosure relate to or can be derived from the following.

The array may be segmented into segments or single pixels, such as VCSELs, i.e. some pixels may have a different functionality than illumination. The optics may comprise a mixture of refractive optics and diffusers, e.g. to shape beams for better reliability. Furthermore, certain areas of the cabin can be illuminated with two or more segments, i.e. by overlapping fields of view. The field of view can be adjusted for increased power saving.

The sensors, internal or external, may also monitor additional information which affects illumination. For example, a sensor can monitor the direction of the driver so that the illumination follows the driver. A monitor for vital sign monitoring can be implemented, e.g. for vital signs, the security belt, and hands on the steering wheel. This way the illumination may indicate to a passenger that a vital sign is critical or needs attention, or monitor whether the driver pays attention.

Other monitoring includes a seatbelt-closed monitor, detection of children as co-drivers (airbag), gesture detection for every beam, an additional camera for sensor fusion, and additional functionality such as face recognition (LiDAR and camera) or reading lips and translating them into commands, possibly combined with gestures. A backseat warning for strange behavior and seat adjustment for passengers can be included.

Further functionality can be complemented to or combined with illumination, including authentication without a key, authentication of all registered persons with restrictions, and one beam directed outside the cabin for authentication and unlocking the car. The speed may be limited depending on the persons, and garage and house doors may be opened depending on authentication within the car. Outside sensors (maybe more for the outdoor application) can be added. Authentication may depend on a face captured by an outside monitor for accessing the car.

LIST OF REFERENCE SIGNS

A1 to A3 area of illumination

AA active area

AR array of pixels

CT contact

D drain electrode

DS detector segment

ES emitter segment

G gate electrode

L length of channel

LS lighting segment

M1 to M6 modules

MI metallic interconnection

ML micro-lens array

PT pattern

PX1 first subset of pixels

PX2 second subset of pixels

S source electrode

SB substrate

TC transceiver circuit

W width of the channel