
Title:
DEVICE AND METHOD FOR COMPENSATING EVENT LATENCY
Document Type and Number:
WIPO Patent Application WO/2022/096086
Kind Code:
A1
Abstract:
The present disclosure relates to latency correction for events. To this end, the disclosure proposes an entity for compensating latency of a plurality of pixels, the entity being configured to: estimate illumination information for one or more pixels or pixel groups of the plurality of pixels; estimate a latency value for each of the one or more pixels or pixel groups, based on the respective estimated illumination information of the one or more pixels or pixel groups; and compensate, at least partly, time information of a triggered record of a pixel or a pixel group of the one or more pixels or pixel groups, based on the respective estimated latency value of the pixel or pixel group. This disclosure further proposes an event sensing device comprising the entity and an event sensor comprising the plurality of pixels.

Inventors:
MUUKKI MIKKO (SE)
BILCU RADU (SE)
Application Number:
PCT/EP2020/080950
Publication Date:
May 12, 2022
Filing Date:
November 04, 2020
Assignee:
HUAWEI TECH CO LTD (CN)
MUUKKI MIKKO (SE)
International Classes:
G06F16/58; H04N5/217; H04N5/232; H04N5/235
Foreign References:
US20180262705A12018-09-13
US20160320834A12016-11-03
Attorney, Agent or Firm:
KREUZ, Georg (DE)
Claims:
Claims

1. An entity (100) for compensating latency of a plurality of pixels, wherein the entity (100) is configured to: estimate illumination information (101) for one or more pixels or pixel groups of the plurality of pixels; estimate a latency value (102) for each of the one or more pixels or pixel groups, based on the respective estimated illumination information (101) of the one or more pixels or pixel groups; and compensate, at least partly, time information (103) of a triggered record of a pixel or a pixel group of the one or more pixels or pixel groups, based on the respective estimated latency value (102) of the pixel or pixel group.

2. The entity (100) according to claim 1, wherein the time information (103) of the triggered record is a timestamp associated with an event generated for the pixel or the pixel group.

3. The entity (100) according to claim 2, further configured to: compensate, at least partly, the time information (103) of the triggered record by subtracting the respective estimated latency value (102) from the timestamp of the event.

4. The entity (100) according to one of the claims 1 to 3, further configured to: estimate the illumination information (101) based on a background rate of events generated by the one or more pixels or pixel groups.

5. The entity (100) according to claim 4, further configured to: obtain information related to the one or more pixels or pixel groups; and estimate the illumination information (101) based further on the information related to the one or more pixels or pixel groups.

6. The entity (100) according to claim 5, wherein the information related to the one or more pixels or pixel groups comprises a temperature of the one or more pixels or pixel groups and/or a setting of the one or more pixels or pixel groups.

7. The entity (100) according to one of the claims 1 to 6, wherein the plurality of pixels belong to an event sensor, wherein the entity (100) is further configured to: estimate the illumination information (101) based on an RGB frame and/or a linear output of the event sensor.

8. The entity (100) according to one of the claims 1 to 7, further configured to: estimate the illumination information (101) based on a spectral response of the one or more pixels or pixel groups and/or an estimated spectrum of light received by the one or more pixels or pixel groups.

9. An event sensing device (10) comprising an entity (100) according to one of the claims 1 to 8, and an event sensor (200) comprising the plurality of pixels (201), wherein one or more first thresholds are associated with each of the plurality of pixels (201) or with each of one or more pixel groups (202) of the plurality of pixels (201), wherein the event sensor (200) is configured to: detect a change in illumination at a pixel (201) or pixel group (202); and generate an event for the pixel (201) or pixel group, if the change in illumination exceeds any one of the one or more first thresholds associated with the pixel (201) or pixel group (202), wherein the event comprises a timestamp, and the time information (103) of the triggered record is the timestamp.

10. The event sensing device (10) according to claim 9, further configured to: measure a brightness at the pixel (201) or pixel group (202); and provide the brightness of the pixel (201) or pixel group (202) to the entity (100); wherein the entity (100) is further configured to: estimate the illumination information (101) for the pixel (201) or pixel group (202) based on the brightness of the pixel (201) or pixel group (202).

11. The event sensing device (10) according to claim 9 or 10, wherein the event is associated with a timestamp, and the event sensing device (10) is further configured to: generate an event stream comprising one or more events, wherein each event is associated with a compensated timestamp, wherein the compensated timestamp equals the timestamp of the event minus the respective estimated latency value (102).

12. The event sensing device (10) according to one of the claims 9 to 11, wherein the one or more pixels (201) or pixel groups (202) are associated with a region of interest or with a field of view of a scene imaged by the event sensing device (10).

13. The event sensing device (10) according to one of the claims 9 to 12, further configured to: estimate a latency value for the pixel (201) or pixel group (202), based on the respective estimated illumination information (101) of the pixel (201) or pixel group (202), and based further on one of the one or more first thresholds associated with the pixel (201) or pixel group (202), or based on a second threshold that is calculated from the one or more first thresholds.

14. A method (400) performed by an entity (100) for compensating latency of a plurality of pixels (201), wherein the method (400) comprises: estimating (401) illumination information (101) for one or more pixels (201) or pixel groups (202) of the plurality of pixels (201); estimating (402) a latency value (102) for each of the one or more pixels (201) or pixel groups (202), based on the respective estimated illumination information (101) of the one or more pixels (201) or pixel groups (202); and compensating (403), at least partly, time information (103) of a triggered record of a pixel (201) or a pixel group (202) of the one or more pixels (201) or pixel groups (202), based on the respective estimated latency value (102) of the pixel (201) or pixel group (202).

15. A method (500) performed by an event sensing device (10), wherein the event sensing device (10) comprises an entity (100) for compensating latency of a plurality of pixels (201), and an event sensor (200) comprising a plurality of pixels (201), wherein one or more first thresholds are associated with each of the plurality of pixels (201) or with each of one or more pixel groups (202) of the plurality of pixels (201), wherein the method (500) comprises: estimating (501), by the entity (100), illumination information (101) for one or more pixels (201) or pixel groups (202) of the plurality of pixels (201); estimating (502), by the entity (100), a latency value (102) for each of the one or more pixels (201) or pixel groups (202), based on the respective estimated illumination information (101) of the one or more pixels (201) or pixel groups (202); detecting (503), by the event sensor (200), a change in illumination at a pixel (201) or pixel group (202); generating (504), by the event sensor (200), an event for the pixel (201) or pixel group (202), if the change in illumination exceeds any one of the one or more first thresholds associated with the pixel (201) or pixel group (202), wherein the event comprises a timestamp; and compensating (505), by the entity (100), at least partly, the timestamp based on the respective estimated latency value (102) of the pixel (201) or pixel group (202).

16. A computer program product comprising a program code for carrying out, when implemented on a processor, the method (400, 500) according to claim 14 or 15.

Description:
DEVICE AND METHOD FOR COMPENSATING EVENT LATENCY

TECHNICAL FIELD

The present disclosure relates generally to addressing latencies of a plurality of pixels. Particularly, the present disclosure relates to event-based cameras, and more particularly to addressing the latencies of the plurality of pixels of event-based cameras that generate events. To this end, the disclosure proposes a device and a method for compensating a latency of a plurality of pixels or pixel groups, respectively, by compensating time information (such as timestamps) of triggered records (such as events) of the plurality of pixels or pixel groups. The device and the method may thus beneficially be used for reducing the impact of local brightness differences on an event-based camera.

BACKGROUND

Event-based cameras use sensors that respond to changes in the incoming light intensity. In contrast to standard cameras, in which each pixel of the sensor captures the amount of incident light at a fixed rate, event-based cameras are asynchronous, and the pixels of event-based sensors are activated only when a change in the intensity of the incoming light is sensed. Consequently, the output data rate of such an event-based camera is variable. When there is no change of the incoming light intensity, there are no activated pixels, and thus no data is generated by the event-based camera. When there is a change of the incoming light intensity, for instance, in the case of moving objects, the pixels that capture the intensity of the objects generate so-called events (triggered by the changes of the incoming light intensity sensed by the pixels). The pixels of an event-based camera sensor usually capture the logarithm of the incoming light intensity, which is then further processed in additional sensor circuitry.

In a noise-free situation (i.e., assuming a clean log-intensity signal), since events are only generated when changes in the incoming light intensity are detected, for static scenes without motion no event pixel will generate any data (event). However, in a real-world noisy situation, even a scene without motion may appear non-static, since event pixels suffer from noise like any other sensor pixels. The noise can cause one or more event pixels to register a light intensity change that did not actually occur. Furthermore, any event or other triggered record of such a pixel or pixel group is usually associated with time information, which indicates when the pixel or pixel group detected the light intensity change. The time information may be a timestamp of the event generated by the pixel or pixel group. However, the time information may not be absolutely correct; in particular, different pixels may output different time information even though they detected the same light intensity change. This is because the event or other triggered record of such a pixel or pixel group may be generated with a certain delay after the light intensity change (an inherent property of the pixel), wherein the delay may depend on the local illuminance level of the pixel or pixel group. In an ideal case, the latency between the light intensity change (e.g., caused by a scene change) and the event that is generated is zero or very small. However, in real life the latency may vary depending on the local illuminance level at the pixel or pixel group. If different pixels or pixel groups have significantly different local illuminance levels, the time information associated with the event or other triggered record can differ significantly from pixel to pixel or pixel group to pixel group.

SUMMARY

In view of the above-mentioned deficiencies, embodiments of the present disclosure aim to reduce an impact of differing local illuminance levels (which may be enhanced by noise, for example by photon shot noise) at a plurality of pixels or pixel groups, in particular, when they are used in an event sensing entity like a sensor of an event-based camera. An objective is to eliminate at least partly the above-mentioned uncertainty of the time information, which comes from the different illumination-dependent delays of the triggered records or events generated by the pixels or pixel groups. One aim of the disclosure is thus to obtain a more ideal event sensing device for an improved event-based camera.

The objective is achieved by embodiments as provided in the enclosed independent claims. Advantageous implementations of the embodiments are further defined in the dependent claims.

A first aspect of the disclosure provides an entity for compensating latency of a plurality of pixels, wherein the entity is configured to: estimate illumination information for one or more pixels or pixel groups of the plurality of pixels; estimate a latency value for each of the one or more pixels or pixel groups, based on the respective estimated illumination information of the one or more pixels or pixel groups; and compensate, at least partly, time information of a triggered record of a pixel or a pixel group of the one or more pixels or pixel groups, based on the respective estimated latency value of the pixel or pixel group.

The entity of the first aspect is accordingly able to reduce the uncertainty of the time information of the triggered record of a pixel, and in particular to reduce or eliminate differences in the time information of the different pixels or pixel groups. This is achieved by compensating, at least partly, the time information of one pixel or pixel group, or of more than one pixel or pixel group. The entity of the first aspect thus also allows an event sensor (that comprises the plurality of pixels) to become more ideal, as the impact of local brightness differences at the pixels of the event sensor can be reduced, and thus the differences in time information associated with the triggered records or events of the pixels of the sensor can be reduced or even eliminated. This may also improve the performance of various applications using the event sensor.

In an implementation form of the first aspect, the time information of the triggered record is a timestamp associated with an event generated for the pixel or the pixel group.

Typically, the output of an event sensor for each pixel includes the timestamp as the time information. When an event (i.e., a specific triggered record) is generated for the pixel or pixel group, the calculated latency value may be used to correct the timestamp of that pixel or pixel group. For more than one pixel or pixel group, the time information can be aligned, for instance, by correcting timestamps of pixels or pixel groups that have a higher latency due to the scene conditions (i.e. differing amounts of illumination). Thus, more similar corrected timestamps may be obtained for the pixels or pixel groups.

In an implementation form of the first aspect, the entity is further configured to compensate, at least partly, the time information of the triggered record by subtracting the respective estimated latency value from the timestamp of the event.

Optionally, a corrected timestamp may be equal to the timestamp obtained for the pixel or pixel group minus the estimated latency value.

In an implementation form of the first aspect, the entity is further configured to estimate the illumination information based on a background rate of events generated by the one or more pixels or pixel groups.

Optionally, the entity may monitor (e.g., by performing an algorithm) the background events, at least background events from some pixel areas (e.g., flat areas) of the plurality of pixels. Based on the amount or rate of the background events, the entity may further estimate the local illuminance level of those pixels, respectively.
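The background-rate approach above can be sketched as follows. The inverse-linear mapping and its calibration constant are hypothetical assumptions for illustration only; a real entity would use empirically characterised sensor data:

```python
# Sketch: estimate a relative illuminance level from the background event
# rate of a pixel or pixel group. As described above, a higher background
# rate is taken to indicate a lower illuminance level. The inverse-linear
# model and the calibration constant are hypothetical placeholders.

def estimate_illuminance(background_rate_hz, calib_constant=1000.0):
    """Map a measured background event rate (Hz) to a relative
    illuminance estimate (arbitrary units)."""
    if background_rate_hz <= 0:
        raise ValueError("background rate must be positive")
    return calib_constant / background_rate_hz

# A dark, noisy area (high background rate) yields a low illuminance
# estimate; a quiet, well-lit area yields a high one.
dark = estimate_illuminance(500.0)
bright = estimate_illuminance(20.0)
assert dark < bright
```

In practice, the mapping from background rate to illuminance would be obtained by characterising the sensor under known lighting conditions, possibly per pixel area.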

In an implementation form of the first aspect, the entity is further configured to: obtain information related to the one or more pixels or pixel groups; and estimate the illumination information based further on the information related to the one or more pixels or pixel groups.

In particular, when estimating the illumination information, the entity may further consider other information related to the one or more pixels or pixel groups. This may improve an accuracy of the estimation of the latency value, and thus the compensation of the time information.

In an implementation form of the first aspect, the information related to the one or more pixels or pixel groups comprises a temperature of the one or more pixels or pixel groups and/or a setting of the one or more pixels or pixel groups.

The estimation of the illumination information based on the background rate of events may depend on the sensor settings used for the sensor that includes the pixels or pixel groups. That is, if applications use different sensor settings, then the estimation may need to be adjusted accordingly. Moreover, the background rate(s) may change as a function of the event sensor temperature, so that the temperature of the one or more pixels or pixel groups may also be relevant to the estimation of the illumination information.

In an implementation form of the first aspect, the plurality of pixels belong to an event sensor, wherein the entity is further configured to: estimate the illumination information based on an RGB frame and/or a linear output of the event sensor.

An event sensor comprising the plurality of pixels is thus disclosed. The output data of the event sensor, such as an RGB (red, green, blue) frame or a linear output such as a grey-scale output, may be used by the entity to estimate the illumination information.

In an implementation form of the first aspect, the entity is further configured to estimate the illumination information based on a spectral response of the one or more pixels or pixel groups and/or an estimated spectrum of light received by the one or more pixels or pixel groups.

As described above, the inherent latency of a pixel may change depending on the amount of light hitting the pixel (or may be caused or enhanced by noise). Further, the brightness arriving at the pixel may depend on the pixel's spectral response (e.g., the range of wavelengths the pixel is sensitive to, and how sensitive it is) and on the spectrum of the light incoming to that pixel. Therefore, the estimation of the illumination information may further take into account the spectral response of the pixel or pixel group and the estimated spectrum of the light incoming to the pixel or pixel group.
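The spectral consideration above can be sketched as an overlap between the estimated incoming spectrum and the pixel's spectral response. The wavelength bins and all numeric values below are illustrative assumptions, not characterised sensor data:

```python
# Sketch: effective brightness at a pixel as the discrete overlap of the
# estimated incoming-light spectrum with the pixel's spectral response,
# both sampled on a shared wavelength grid. Values are illustrative.

def effective_brightness(spectrum, response):
    """Approximate the integral of spectrum(lambda) * response(lambda)
    as a discrete sum over matching wavelength bins."""
    assert len(spectrum) == len(response)
    return sum(s * r for s, r in zip(spectrum, response))

# Example: a pixel more sensitive to longer wavelengths receives more
# effective light from a red-shifted spectrum than from a blue-shifted one.
response = [0.1, 0.3, 0.8]    # sensitivity at three wavelength bins
red_light = [0.1, 0.3, 0.9]
blue_light = [0.9, 0.3, 0.1]
assert effective_brightness(red_light, response) > effective_brightness(blue_light, response)
```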

A second aspect of the disclosure provides an event sensing device comprising an entity according to the first aspect, and an event sensor comprising the plurality of pixels, wherein one or more first thresholds are associated with each of the plurality of pixels or with each of one or more pixel groups of the plurality of pixels, wherein the event sensor is configured to: detect a change in illumination at a pixel or pixel group; and generate an event for the pixel or pixel group, if the change in illumination exceeds any one of the one or more first thresholds associated with the pixel or pixel group, wherein the event comprises a timestamp, and the time information of the triggered record is the timestamp.

Typically, the event is generated when the change of illumination is detected at the pixel or pixel group, and the event is associated with the timestamp. The entity, as proposed in embodiments of the disclosure, may use the calculated latency value to correct the timestamp at least partly. In this way, the impact of local brightness differences (which may be enhanced by noise) on an event sensor can be reduced.

In an implementation form of the second aspect, the event sensing device is further configured to: measure a brightness at the pixel or pixel group; and provide the brightness of the pixel or pixel group to the entity; wherein the entity is further configured to estimate the illumination information for the pixel or pixel group based on the brightness of the pixel or pixel group.

Optionally, the brightness may be estimated at the pixels of the sensor (e.g., a Bayer/color camera), for instance, from a linear output of the pixels of a dual-purpose event sensor, and/or from the background rate of the pixels of the event sensor (depending on availability).

In an implementation form of the second aspect, the event is associated with a timestamp, and the event sensing device is further configured to: generate an event stream comprising one or more events, wherein each event is associated with a compensated timestamp, wherein the compensated timestamp equals the timestamp of the event minus the respective estimated latency value.

In this way, the latency issue caused by differing local brightness for the generated events of different pixels or pixel groups can be at least partly eliminated.
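The stream-level compensation can be sketched as follows; the event tuple layout and latency map are illustrative assumptions, not a prescribed format:

```python
# Sketch: produce an event stream in which each event carries a
# compensated timestamp equal to its raw timestamp minus the latency
# estimated for its pixel. Data structures are illustrative.

def compensate_stream(events, latency_us):
    """events: list of (x, y, polarity, timestamp_us) tuples;
    latency_us: dict mapping (x, y) -> estimated latency in microseconds.
    Pixels without an estimate are left uncompensated."""
    return [(x, y, p, t - latency_us.get((x, y), 0.0))
            for (x, y, p, t) in events]

# Two pixels observe the same scene change, but the darker pixel (5, 3)
# reports it later due to its higher latency.
raw = [(0, 0, +1, 1050.0), (5, 3, -1, 1400.0)]
lat = {(0, 0): 50.0, (5, 3): 400.0}
out = compensate_stream(raw, lat)
# After compensation, both timestamps refer to (roughly) the same instant.
assert out[0][3] == 1000.0 and out[1][3] == 1000.0
```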

In an implementation form of the second aspect, the one or more pixels or pixel groups are associated with a region of interest (ROI) or with a field of view (FOV) of a scene imaged by the event sensing device.

In particular, the compensation may be used for the full FOV, or for pixel groups that belong to, or lie close to, the ROI(s). Notably, an ROI may, for example, correspond to an important object or objects in the scene.

In an implementation form of the second aspect, the event sensing device is further configured to estimate a latency value for the pixel or pixel group, based on the respective estimated illumination information of the pixel or pixel group, and based further on one of the one or more first thresholds associated with the pixel or pixel group, or based on a second threshold that is calculated from the one or more first thresholds.

Notably, one or more first thresholds may be set and used for capturing events. The latencies may be characterized or modelled with a set of threshold settings, i.e., the one or more first thresholds. The latency value may be estimated using the same thresholds that were used for capturing the events. Alternatively, a calculated threshold value may be used for estimating the latency value, for example, the closest characterized threshold value, or an averaged/interpolated value of more than one of the closest threshold values.
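A lookup of this kind can be sketched as follows. The characterisation table, the nearest-threshold selection, and the linear interpolation over illuminance are all hypothetical choices for illustration; an actual device would use its own characterisation data and interpolation scheme:

```python
# Sketch: estimate a latency value from a characterised table mapping
# threshold settings to (illuminance, latency) measurement points.
# All numbers below are illustrative, not measured data.

def estimate_latency(illuminance, threshold, table):
    """table: dict mapping a threshold setting to a list of
    (illuminance, latency_us) pairs sorted by increasing illuminance.
    Uses the nearest characterised threshold and linear interpolation
    over illuminance, clamping outside the characterised range."""
    nearest_th = min(table, key=lambda th: abs(th - threshold))
    pts = table[nearest_th]
    if illuminance <= pts[0][0]:
        return pts[0][1]
    if illuminance >= pts[-1][0]:
        return pts[-1][1]
    for (i0, l0), (i1, l1) in zip(pts, pts[1:]):
        if i0 <= illuminance <= i1:
            w = (illuminance - i0) / (i1 - i0)
            return l0 + w * (l1 - l0)

# Hypothetical characterisation for one threshold setting: latency (in
# microseconds) falls as illuminance rises.
table = {0.2: [(10, 900.0), (100, 90.0), (1000, 9.0)]}
assert estimate_latency(55, 0.2, table) == 495.0   # midway between 900 and 90
assert estimate_latency(5, 0.2, table) == 900.0    # clamped below the range
```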

A third aspect of the disclosure provides a method performed by an entity for compensating latency of a plurality of pixels, wherein the method comprises: estimating illumination information for one or more pixels or pixel groups of the plurality of pixels or pixel groups; estimating a latency value for each of the one or more pixels and or pixel groups based on the respective estimated illumination information of the one or more pixels or pixel groups; and compensating, at least partly, time information of a triggered record of a pixel or a pixel group of the one or more pixels or pixel groups, based on the respective estimated latency value of the pixel or pixel group.

Implementation forms of the method of the third aspect may correspond to the implementation forms of the entity of the first aspect described above. The method of the third aspect and its implementation forms achieve the same advantages and effects as described above for the entity of the first aspect and its implementation forms.

A fourth aspect of the disclosure provides a method performed by an event sensing device, wherein the event sensing device comprises an entity for compensating latency of a plurality of pixels, and an event sensor comprising a plurality of pixels, wherein one or more first thresholds are associated with each of the plurality of pixels or with each of one or more pixel groups of the plurality of pixels, wherein the method comprises: estimating, by the entity, illumination information for one or more pixels or pixel groups of the plurality of pixels; estimating, by the entity, a latency value for each of the one or more pixels or pixel groups, based on the respective estimated illumination information of the one or more pixels or pixel groups; detecting, by the event sensor, a change in illumination at a pixel or pixel group; generating, by the event sensor, an event for the pixel or pixel group, if the change in illumination exceeds any one of the one or more first thresholds associated with the pixel or pixel group, wherein the event comprises a timestamp; and compensating, by the entity, at least partly, the timestamp based on the respective estimated latency value of the pixel or pixel group.

Implementation forms of the method of the fourth aspect may correspond to the implementation forms of the event sensing device of the second aspect described above. The method of the fourth aspect and its implementation forms achieve the same advantages and effects as described above for the event sensing device of the second aspect and its implementation forms.

A fifth aspect of the disclosure provides a computer program product comprising a program code for carrying out, when implemented on a processor, the method according to the third aspect and any implementation forms of the third aspect, or the method according to the fourth aspect and any implementation forms of the fourth aspect.

It has to be noted that all devices, elements, units and means described in the present application could be implemented in software or hardware elements or any kind of combination thereof. All steps which are performed by the various entities described in the present application, as well as the functionalities described as being performed by the various entities, are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. Even if, in the following description of specific embodiments, a specific functionality or step to be performed by external entities is not reflected in the description of a specific detailed element of the entity which performs that specific step or functionality, it should be clear to a skilled person that these methods and functionalities can be implemented in respective software or hardware elements, or any kind of combination thereof.

BRIEF DESCRIPTION OF DRAWINGS

The above described aspects and implementation forms of the present disclosure will be explained in the following description of specific embodiments in relation to the enclosed drawings, in which

FIG. 1 shows an entity according to an embodiment of the disclosure.

FIG. 2 shows an event sensing device according to an embodiment of the disclosure.

FIG. 3 shows a latency change as a function of scene illumination according to an embodiment of the disclosure.

FIG. 4 shows a method according to an embodiment of the disclosure.

FIG. 5 shows a method according to an embodiment of the disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS

Illustrative embodiments of the disclosure are described with reference to the figures. Although this description provides a detailed example of possible embodiments and implementations, it should be noted that the details are intended to be exemplary and in no way limit the scope of the application.

Moreover, an embodiment/example may refer to other embodiments/examples. For example, any description including but not limited to terminology, elements, processes, explanations and/or technical advantages mentioned in one embodiment/example is applicable to the other embodiments/examples.

To introduce this disclosure, the operating principle of an event pixel (of an event sensor) is first described here. The pixel may capture the incoming light and then calculate the difference between the current value of the log intensity of the incoming light and the previous value. When the difference is greater than a positive threshold Th, a “+1” event may be generated. Similarly, if the difference is smaller than the negative threshold -Th, a “-1” event may be generated.

An event stream may consist of +1s or -1s (indicating that the incoming light intensity has increased or decreased, respectively), pixel coordinates, and a timestamp for each event. Typically, events can be generated with a time resolution of 1 µs; thus, event pixels can be a powerful means to detect motion in a scene. Further, since event pixels can operate in the logarithmic domain, ambient light can be intrinsically eliminated when calculating the difference between two instances of the incoming light. Therefore, event pixels can be used to obtain information about the reflectance of the objects in the scene.
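The operating principle described above can be sketched as follows; the threshold value and intensities are illustrative:

```python
# Sketch of the event-pixel principle: compare the log of the incoming
# intensity with the last stored log value and emit a +1/-1 event when
# the difference crosses the threshold Th (here an illustrative 0.2).

import math

def step(intensity, state, th=0.2):
    """Process one intensity sample. Returns (event, new_state), where
    event is +1, -1, or None, and state is the stored log intensity."""
    log_i = math.log(intensity)
    diff = log_i - state
    if diff > th:
        return +1, log_i   # brightness increased: "+1" event
    if diff < -th:
        return -1, log_i   # brightness decreased: "-1" event
    return None, state     # no event; stored reference unchanged

state = math.log(100.0)
ev, state = step(100.0, state)   # no change -> no event
assert ev is None
ev, state = step(150.0, state)   # log(1.5) ~ 0.405 > 0.2 -> +1 event
assert ev == +1
ev, state = step(100.0, state)   # log(2/3) ~ -0.405 < -0.2 -> -1 event
assert ev == -1
```

Because the comparison happens in the log domain, a multiplicative change in ambient light shifts both the current and stored values equally and cancels in the difference.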

Generally speaking, an event pixel/circuitry monitors changes of illuminance, more precisely changes of the logarithmic intensity/illuminance. The operation of such circuitry, e.g., the computation of the logarithmic intensity, may depend on the illuminance level: when the illuminance level is higher, the photocurrent will be higher; when the illuminance level is lower, the photocurrent will be lower.

In example circuitry, the logarithmic intensity is implemented by a log amplifier that has a source follower biased by the photocurrent. A smaller photocurrent causes a more delayed response than a larger photocurrent. This means that the event pixel circuitry includes a delay that depends on the illuminance level.

The photoreceptor bias may be a controllable bias or a summed photocurrent. The on/off threshold voltages (or currents) may be used to set the levels which cause the event pixel to trigger.

As previously discussed, event sensors may suffer from noise like any other sensor in the real world. From an end-user point of view, event sensors exhibit at least one type of “noise”, namely a background rate of events. Such noise may depend on the illuminance level. In particular, there may be more noise in a low-light environment and less noise in a bright environment. It is worth mentioning that if the scene or the device (i.e., the event sensor) is static, events may be generated only due to noise. But even when the scene or device is not static, events are sometimes not generated from flat areas, as event generation requires edges or textured areas, i.e., detectable contrast (a sufficient change in brightness from bright to dark, or vice versa).

The output of an event sensor for each pixel, i.e., an event, includes a “timestamp”, i.e., the time/delay from the previous event of that pixel, but this timestamp has a large uncertainty due to its dependency on the illuminance level. In an ideal case, the latency between the light intensity change (e.g., caused by a scene change) and the event that is generated is zero or very small, e.g., in the range of a microsecond (µs) or tens of microseconds. However, the latency depends greatly on the illuminance level at the pixel and can reach the level of milliseconds (ms) or tens of milliseconds. Typically, the latencies may be measured, tested, or characterized at chip level; in practice, however, what matters is the per-pixel illuminance, which depends on the scene content.
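As a rough illustration of this dependency, consider a first-order model in which latency scales inversely with illuminance, consistent with the source-follower behaviour described above. The constant is a hypothetical fitting parameter, not measured data:

```python
# Sketch: a first-order latency model in which the pixel's response delay
# is inversely proportional to illuminance, spanning roughly the
# microsecond-to-millisecond range discussed above. The constant k is a
# hypothetical fitting parameter.

def latency_us(illuminance_lux, k=10000.0):
    """Illustrative model: estimated latency in microseconds as an
    inverse function of the local illuminance."""
    if illuminance_lux <= 0:
        raise ValueError("illuminance must be positive")
    return k / illuminance_lux

assert latency_us(10000.0) == 1.0      # bright scene: ~1 us latency
assert latency_us(10.0) == 1000.0      # dim scene: ~1 ms latency
```

A real characterisation would likely not be exactly inverse-linear, but any monotonic per-pixel model of this form can serve as the latency estimate used for compensation.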

FIG. 1 shows an entity 100 for compensating latency of a plurality of pixels, according to an embodiment of the disclosure. The entity 100 may comprise processing circuitry (not shown) configured to perform, conduct or initiate the various operations of the entity 100 described herein. The processing circuitry may comprise hardware and software. The hardware may comprise analog circuitry or digital circuitry, or both analog and digital circuitry. The digital circuitry may comprise components such as application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), digital signal processors (DSPs), or multi-purpose processors. The entity 100 may further comprise memory circuitry, which stores one or more instruction(s) that can be executed by the processor or by the processing circuitry, in particular under control of the software. For instance, the memory circuitry may comprise a non-transitory storage medium storing executable software code which, when executed by the processor or the processing circuitry, causes the various operations of the entity 100 to be performed. In one embodiment, the processing circuitry comprises one or more processors and a non-transitory memory connected to the one or more processors. The non-transitory memory may carry executable program code which, when executed by the one or more processors, causes the entity 100 to perform, conduct or initiate the operations or methods described herein.

In particular, the entity 100 is configured to estimate illumination information 101 for one or more pixels or pixel groups of the plurality of pixels. That is, the illumination information 101 may refer to illumination information per pixel, or per group of pixels. The entity 100 is further configured to estimate a latency value 102 for each of the one or more pixels or pixel groups, based on the respective estimated illumination information 101 of the one or more pixels or pixel groups. Further, the entity 100 is configured to compensate, at least partly, time information 103 of a triggered record of a pixel or a pixel group of the one or more pixels or pixel groups, based on the respective estimated latency value 102 of the pixel or pixel group. Embodiments of this disclosure propose methods to estimate the latency value 102 using the illumination information 101 at the pixel(s). The estimated latency value 102 enables a compensation of the time information 103 related to a triggered record of a pixel or pixel group, or to an event generated by the pixel or pixel group.

In particular, the time information 103 of the triggered record may be a timestamp associated with such an event generated for the pixel or the pixel group.

That is, an event generated for each pixel or each pixel group includes a timestamp. According to this disclosure, when events are generated for a pixel or pixel group, the estimated latency value 102 may be used to compensate a latency (reflected in the respective timestamps) that varies depending on scene conditions, in order to obtain a corrected timestamp (ideally, with a minimal and similar impact of latency on the timestamps of all pixels or pixel groups). For instance, the corrected timestamp may equal the timestamp from the pixel or the pixel group minus the estimated latency value 102.
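This subtraction can be sketched as follows; the function name and the choice of microseconds as the time unit are illustrative assumptions made here, not part of the disclosure:

```python
def corrected_timestamp(raw_timestamp_us: float, latency_us: float) -> float:
    """Compensate an event timestamp by subtracting the estimated latency.

    raw_timestamp_us: timestamp reported with the event (microseconds).
    latency_us: latency value 102 estimated for the pixel or pixel group.
    """
    return raw_timestamp_us - latency_us
```

For example, an event reported at 1000 μs by a pixel whose estimated latency is 250 μs would be assigned a corrected timestamp of 750 μs.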

According to an embodiment of the disclosure, the entity 100 may be further configured to estimate the illumination information 101 based on a background rate of events generated by the one or more pixels or pixel groups.

Optionally, the entity may monitor, by means of an algorithm, the background events, at least from some pixel areas (e.g., flat areas). Based on the amount or rate of background events, the entity may estimate the local illuminance level of those pixels.
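A minimal sketch of such an estimate is given below, assuming a hypothetical, monotonic calibration table that maps the background event rate to illuminance for one sensor setting; all numbers and the linear interpolation rule are illustrative assumptions:

```python
import bisect

# Hypothetical calibration pairs (background event rate in events/s, illuminance
# in lux), as might be characterized for one sensor setting in the laboratory.
RATE_LUX_TABLE = [(0.1, 1.0), (1.0, 10.0), (5.0, 100.0), (20.0, 1000.0)]

def illuminance_from_background_rate(rate: float) -> float:
    """Estimate local illuminance by interpolating the calibration table."""
    rates = [r for r, _ in RATE_LUX_TABLE]
    if rate <= rates[0]:
        return RATE_LUX_TABLE[0][1]
    if rate >= rates[-1]:
        return RATE_LUX_TABLE[-1][1]
    # locate the bracketing calibration points and interpolate linearly
    i = bisect.bisect_left(rates, rate)
    (r0, l0), (r1, l1) = RATE_LUX_TABLE[i - 1], RATE_LUX_TABLE[i]
    t = (rate - r0) / (r1 - r0)
    return l0 + t * (l1 - l0)
```

In practice the table would come from per-sensor characterization, and the interpolation rule (linear, log-domain, or a fitted model) is a design choice.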

Optionally, according to an embodiment of this disclosure, the estimation of the illumination information 101 may further require information related to the one or more pixels or pixel groups. For instance, such information may comprise a temperature of the one or more pixels or pixel groups, and/or a setting of the one or more pixels or pixel groups. Accordingly, the entity 100 may be further configured to obtain the information related to the one or more pixels or pixel groups.

In particular, the mapping between the background rate and the local illuminance level may be assumed to depend on the sensor settings used by the sensor that includes the pixels or pixel groups. For instance, if applications use different sensor settings, this mapping may change accordingly. The background rate may also change as a function of the temperature of the event sensor, so temperature compensation may also be used. In temperature compensation, the mapping between the background rate and the local illuminance level may be characterized at different temperatures. For compensation at a particular temperature, the closest mapping (based on the closest characterized temperature) may be used, or a mapping interpolated from the two closest temperatures may be used.
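The temperature handling described above can be sketched as follows; the data layout (a dictionary from characterization temperature to a rate-to-lux mapping) and the linear blend between the two closest temperatures are assumptions made for illustration:

```python
def interpolate_mappings(temp_c: float, characterized: dict) -> dict:
    """Select or blend rate-to-lux mappings characterized at different temperatures.

    characterized: {temperature_degC: {background_rate: lux}}, where every
    per-temperature mapping is assumed to share the same rate keys.
    """
    temps = sorted(characterized)
    # outside the characterized range: fall back to the closest mapping
    if temp_c <= temps[0]:
        return characterized[temps[0]]
    if temp_c >= temps[-1]:
        return characterized[temps[-1]]
    # inside the range: interpolate between the two closest temperatures
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            w = (temp_c - lo) / (hi - lo)
            return {r: (1 - w) * characterized[lo][r] + w * characterized[hi][r]
                    for r in characterized[lo]}
```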

According to an embodiment of this disclosure, the plurality of pixels may belong to an event sensor. The entity 100 provided according to embodiments of the disclosure may be responsible for compensating latency for the event sensor.

The event sensor may include only event pixels; it may include dual-purpose pixels, i.e., pixels that can be used as event pixels but also as pixels of a color sensor (with a Bayer, quad-Bayer or any other color filter array (CFA) pattern) or of a monochrome sensor; it may include both event pixels and pixels as in color sensors or in monochrome sensors; or it may include a combination of these.

FIG. 2 shows an event sensing device 10 according to an embodiment of the disclosure. The event sensing device 10 may comprise processing circuitry (not shown) configured to perform, conduct or initiate the various operations of the event sensing device 10 described herein. The processing circuitry may comprise hardware and software. The hardware may comprise analog circuitry or digital circuitry, or both analog and digital circuitry. The digital circuitry may comprise components such as ASICs, FPGAs, DSPs, or multi-purpose processors. The event sensing device 10 may further comprise memory circuitry, which stores one or more instructions that can be executed by the processor or by the processing circuitry, in particular under control of the software. For instance, the memory circuitry may comprise a non-transitory storage medium storing executable software code which, when executed by the processor or the processing circuitry, causes the various operations of the event sensing device 10 to be performed. In one embodiment, the processing circuitry comprises one or more processors and a non-transitory memory connected to the one or more processors. The non-transitory memory may carry executable program code which, when executed by the one or more processors, causes the event sensing device 10 to perform, conduct or initiate the operations or methods described herein. In particular, the event sensing device 10 comprises an entity 100 and an event sensor 200. The entity 100 here may be the entity 100 as shown in FIG. 1. As shown in FIG. 2, the event sensor 200 comprises the plurality of pixels 201. Possibly, the event sensor 200 may comprise a pixel array, and the plurality of pixels 201 may be pixels of the pixel array. The plurality of pixels 201 may be grouped into pixel groups 202, and each pixel group 202 may comprise two or more pixels 201.

In particular, one or more first thresholds are associated with each of the plurality of pixels 201, or with each of one or more pixel groups 202 of the plurality of pixels 201. In particular, the event sensor 200 is configured to detect a change in illumination at a pixel 201 or pixel group 202. The event sensor 200 is further configured to generate an event for the pixel 201 or pixel group 202, if the change in illumination exceeds any one of the one or more first thresholds associated with the pixel 201 or pixel group 202. The event comprises a timestamp, and the time information 103 of the triggered record is the timestamp.

Possibly, the entity 100 may be software running on a processor outside of the event sensor 200. Such a processor may be a central processing unit (CPU), a graphics processing unit (GPU), or a neural processing unit (NPU). The entity 100 may also be an algorithm running on some hardware outside of the event sensor 200. Alternatively, the entity 100 may even be an algorithm running on some hardware or processor of the event sensor 200, if such hardware or processor exists.

In particular, the entity 100 may be configured to at least partly compensate the time information 103 of the triggered record (here the time information is a timestamp of an event) by subtracting the respective estimated latency value 102 from the timestamp of the event.

According to an embodiment of the disclosure, the event sensing device 10 may be further configured to generate an event stream comprising one or more events, wherein each event is associated with a compensated timestamp, wherein the compensated timestamp equals the timestamp of the event minus the respective estimated latency value 102. In this way, the generated event stream is associated with corrected timestamp values, thus the effect of local brightness differences (which may be caused for example by local differences in scene, and which may be enhanced by noise) can be reduced or eliminated.

It should be noted that the compensation may be applied all the time, or may be used only under difficult conditions, such as a high dynamic range or a low-light environment. Possibly, whether to enable the compensation may also depend on whether an application requires more accurate performance, for example, a more accurate timestamp. Further, it may depend on whether the application uses artificial intelligence (AI) processing, and whether the application expects an ideal output (where the compensation is very valuable) or a non-ideal output (where only partial compensation or no compensation may be preferred).

According to embodiments of the disclosure, the compensation may be used for the full field of view (FOV), or for pixel groups that belong to one or more regions of interest (ROIs) or are close to them. That is, the one or more pixels 201 or pixel groups 202 may be associated with the ROI or with the full FOV of a scene that is imaged by the event sensing device 10.

Notably, when a real event is generated from a pixel 201 or a pixel group 202, the original illuminance level can be obtained, and the new illuminance level at the moment of event generation can also be calculated (as the contrast threshold and the original illuminance are known). These facts make it possible to estimate and understand the latency value 102, as it depends on the illumination information 101 per pixel 201 or per pixel group 202.

For instance, after a pixel 201 has generated a previous event, the pixel circuitry has stored the original illumination coming to that pixel 201. The pixel circuitry will monitor if the brightness at the pixel 201 goes above a threshold. The brightness that goes above the threshold is the new illuminance level and it will cause a new event to be generated.

Notably, the threshold may be programmable in an event sensor (e.g., the event sensor 200). Typically, there are two thresholds, one for positive and one for negative events. A positive pixel value means that the scene is getting brighter, and a negative pixel value means that the scene is getting darker.

Ideally, the new illuminance (I2) may be calculated from the original illuminance (I1) and a threshold: log I2 = log I1 + threshold. The latency value 102 may be estimated by using the original illuminance, or both the original illuminance and the new illuminance.
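This log-domain relation can be written out as a small sketch; the use of the natural logarithm and a threshold expressed in log units are assumptions made here for illustration (the disclosure does not fix the logarithm base or the threshold encoding):

```python
import math

def new_illuminance(original_illuminance: float, threshold: float) -> float:
    """Ideal new illuminance I2 from original illuminance I1 and a contrast
    threshold, following log I2 = log I1 + threshold (natural log assumed).
    """
    return math.exp(math.log(original_illuminance) + threshold)
```

For instance, with a threshold of log(1.3) (a 30% contrast step), an original illuminance of 100 lux gives a new illuminance of 130 lux.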

The behavior of the pixel circuitry may be measured, characterized or modelled. It may be sufficient to know only the original illuminance for estimating the latency value 102. With a certain original illuminance, a certain photocurrent is generated, which has a biasing effect that is valid with a certain threshold setting. Different threshold settings (e.g., 10%, 30%, or 50%) may affect how the illuminance/photocurrent impacts the event generation. For instance, a higher threshold means that the differences in photocurrents are larger, and a lower threshold means that the differences in photocurrents are smaller. The smaller the differences are, the more important is the behavior at the original illuminance. The larger the threshold, the bigger the change is, and the wider the range of photocurrents that impacts the behavior of the pixel circuitry.

For this reason, the latencies as a function of the original illuminance may be characterized or modelled with a set of threshold settings (e.g., the one or more first thresholds), if more than one threshold setting is used. The latency value 102 may be corrected using the same threshold that was used for capturing the events. Optionally, the closest threshold value, or an average or interpolated value of the closest threshold values, may also be used. Thus, it may be sufficient to assume that the original illuminance is enough to understand the latency, provided that the correct threshold settings are applied. Different threshold settings may be used depending on the use case, application, or algorithm requirements, for example, if small latency or low noise is prioritized.

In a particular embodiment, the event sensing device 10 may be configured to estimate a latency value for the pixel 201 or pixel group 202, based on the respective estimated illumination information 101 of the pixel 201 or pixel group 202, and based further on one of the one or more first thresholds associated with the pixel 201 or pixel group 202. In another embodiment, the event sensing device 10 may be configured to estimate a latency value for the pixel 201 or pixel group 202, based on the respective estimated illumination information 101 of the pixel 201 or pixel group 202, and based further on a second threshold that is calculated from the one or more first thresholds. For instance, the second threshold may be an average / interpolated value of the one or more first thresholds. In measurement and characterization, the latency value 102 may be measured in a controlled laboratory environment as a function of illuminance. In modelling, a simulator may be used to evaluate the latency value 102 as a function of illuminance.
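The characterization-based estimation described above can be sketched as follows; the table contents, the closest-threshold selection rule, and the log-domain interpolation over illuminance are all illustrative assumptions, standing in for laboratory measurements or simulator output:

```python
import math

# Hypothetical characterization: per threshold setting, latency (in us)
# measured at a few illuminance levels (in lux). All numbers are illustrative.
LATENCY_TABLES = {
    0.1: {1.0: 20000.0, 100.0: 500.0, 10000.0: 50.0},
    0.3: {1.0: 30000.0, 100.0: 800.0, 10000.0: 80.0},
}

def estimate_latency(illuminance: float, threshold: float) -> float:
    """Estimate the latency value from the characterized table whose threshold
    is closest to the one used for capture, interpolating over illuminance."""
    table = LATENCY_TABLES[min(LATENCY_TABLES, key=lambda t: abs(t - threshold))]
    lux = sorted(table)
    if illuminance <= lux[0]:
        return table[lux[0]]
    if illuminance >= lux[-1]:
        return table[lux[-1]]
    for lo, hi in zip(lux, lux[1:]):
        if lo <= illuminance <= hi:
            # log-domain interpolation, since illuminance spans decades
            w = (math.log(illuminance) - math.log(lo)) / (math.log(hi) - math.log(lo))
            return (1 - w) * table[lo] + w * table[hi]
```

An alternative, as noted above, would be to interpolate between the two closest threshold tables rather than picking a single closest one.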

According to the embodiments of the disclosure, a compensation on the time information 103 may be performed at pixel level or pixel group level. The dynamically varying latency values that cause error in the reported timestamps, may be at least partly corrected.

In particular, if only a single threshold setting is used, a corrected timestamp may equal the timestamp obtained for the pixel or pixel group minus the latency value 102 that is estimated based on the original illuminance.

If more than one threshold setting is used, the corrected timestamp may equal the timestamp value obtained for the pixel or pixel group minus the latency value 102 that is estimated based on the original illuminance and the thresholds (i.e., the one or more first thresholds, and/or the second threshold).

Instead of using the background rate to estimate the illuminance information 101, the illuminance information 101 may be obtained in multi-camera systems from a frame-based camera, e.g., a camera with a color sensor (with a Bayer, quad-Bayer or any other CFA pattern) or with a monochrome sensor. From the color sensor, the illuminance estimation can be done, for instance, by using the green channel if that is closest to the event sensor response, or by combining information obtained from multiple color channels to produce a more accurate estimate of the illuminance.

Optionally, the entity 100 may be configured to estimate the illumination information 101 based on an RGB frame and/or a linear output of the event sensor.

In another implementation, the frame-based camera may be used in addition to using the background rate for estimating the illuminance information 101. For example, information obtained from the frame-based camera may be used in addition to the background-noise method, for example, by guiding the detection of the “flat” areas. Optionally, an individual illumination estimate may be obtained using each method, and a single but more robust estimate can be obtained from both illumination estimates. If the event sensor 200 is also able to measure per-pixel brightness, this information can also be used for illuminance level estimation. According to an embodiment of the disclosure, the event sensing device 10 may be configured to measure a brightness at the pixel 201 or pixel group 202. Then, the event sensing device 10 may be configured to provide the brightness of the pixel 201 or pixel group 202 to the entity 100. Accordingly, the entity 100 may be further configured to estimate the illumination information 101 for the pixel 201 or pixel group 202 based on the brightness of the pixel 201 or pixel group 202.
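Combining the two individual estimates into one more robust value can be sketched as below; the weighted-average rule and the fallback to whichever estimate is available are assumptions for illustration, and any other fusion rule (e.g., variance-weighted) could be substituted:

```python
def fuse_illuminance(bg_estimate, frame_estimate, w_bg: float = 0.5):
    """Fuse the background-rate-based and frame-camera-based illuminance
    estimates (both in lux); None marks an unavailable estimate."""
    if frame_estimate is None:
        return bg_estimate
    if bg_estimate is None:
        return frame_estimate
    # simple weighted average of the two independent estimates
    return w_bg * bg_estimate + (1.0 - w_bg) * frame_estimate
```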

Further, it is possible to characterize sensor behavior by using simulations and/or by laboratory measurements. To illustrate how the illumination level may impact the latency, a latency change as a function of scene illumination is depicted in FIG. 3. It can be seen that the latency varies depending on the incoming light. In particular, a lower illumination level causes a more severe latency compared to a higher illumination level. Notably, it may be possible to produce a more precise graph by taking into account the spectral information of the scene, the reflectance and color of objects in the scene, and how those affect the signal arriving at the pixel or pixel group. It may also take into account whether an event pixel (e.g., the pixel 201) has a color filter (e.g., red, green, blue, yellow, magenta, or cyan), and what the pixel's spectral response is. In other words, the brightness arriving at the event pixel depends on the event pixel's spectral response (e.g., the range of wavelengths the pixel is sensitive to, and how sensitive it is) and on the spectrum of the light incoming to that event pixel.

Optionally, the entity 100 may be further configured to estimate the illumination information 101 based on a spectral response of the one or more pixels 201 or pixel groups 202 and/or an estimated spectrum of light received by the one or more pixels 201 or pixel groups 202.
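The spectral weighting mentioned above can be sketched as a discrete sum; the sampled-spectrum representation (parallel lists over the same wavelength grid) is an assumption made purely for illustration:

```python
def effective_brightness(spectrum, response):
    """Approximate the signal seen by an event pixel by weighting the incoming
    light spectrum with the pixel's spectral response.

    spectrum: incoming light power, sampled at fixed wavelengths.
    response: pixel sensitivity at the same wavelengths (unitless weights).
    """
    return sum(s * r for s, r in zip(spectrum, response))
```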

FIG. 4 shows a method 400 for compensating latency of a plurality of pixels 201 according to an embodiment of the disclosure. In a particular embodiment of the disclosure, the method 400 is performed by an entity 100 as shown in FIG. 1 or FIG. 2. In particular, the method 400 comprises: a step 401 of estimating illumination information 101 for one or more pixels 201 or pixel groups 202 of the plurality of pixels 201; a step 402 of estimating a latency value 102 for each of the one or more pixels 201 or pixel groups 202, based on the respective estimated illumination information 101 of the one or more pixels 201 or pixel groups 202; and a step 403 of compensating, at least partly, time information 103 of a triggered record of a pixel 201 or a pixel group 202 of the one or more pixels 201 or pixel groups 202, based on the respective estimated latency value 102 of the pixel 201 or pixel group 202. Possibly, each of the one or more pixels 201 or pixel groups 202 may be a pixel 201 or pixel group 202 as shown in FIG. 2.

FIG. 5 shows a method 500 according to an embodiment of the disclosure. In a particular embodiment of the disclosure, the method 500 is performed by an event sensing device 10 as shown in FIG. 2. In particular, the event sensing device 10 comprises an entity 100 for compensating latency of a plurality of pixels 201, and an event sensor 200 comprising a plurality of pixels 201, wherein one or more first thresholds are associated with each of the plurality of pixels 201 or with each of one or more pixel groups 202 of the plurality of pixels 201. Possibly, each of the one or more pixels 201 or pixel groups 202 may be a pixel 201 or pixel group 202 as shown in FIG. 2.

In particular, the method 500 comprises: a step 501 of estimating, by the entity 100, illumination information 101 for one or more pixels 201 or pixel groups 202 of the plurality of pixels 201; a step 502 of estimating, by the entity 100, a latency value 102 for each of the one or more pixels 201 or pixel groups 202, based on the respective estimated illumination information 101 of the one or more pixels 201 or pixel groups 202; and a step 503 of detecting, by the event sensor 200, a change in illumination at a pixel 201 or pixel group 202. The method further comprises a step 504 of generating, by the event sensor 200, an event for the pixel 201 or pixel group 202, if the change in illumination exceeds any one of the one or more first thresholds associated with the pixel 201 or pixel group 202, wherein the event comprises a timestamp; and a step 505 of compensating, by the entity 100, at least partly, the timestamp based on the respective estimated latency value 102 of the pixel 201 or pixel group 202. Possibly, the entity 100 may be the entity as shown in FIG. 1 or FIG. 2.

The present disclosure has been described in conjunction with various embodiments as examples as well as implementations. However, other variations can be understood and effected by those skilled in the art when practicing the claimed disclosure, from a study of the drawings, this disclosure and the independent claims. In the claims as well as in the description, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single element or other unit may fulfill the functions of several entities or items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used in an advantageous implementation.

Furthermore, any method according to embodiments of the disclosure may be implemented in a computer program, having code means, which when run by processing means causes the processing means to execute the steps of the method. The computer program is included in a computer readable medium of a computer program product. The computer readable medium may comprise essentially any memory, such as a ROM (Read-Only Memory), a PROM (Programmable Read-Only Memory), an EPROM (Erasable PROM), a Flash memory, an EEPROM (Electrically Erasable PROM), or a hard disk drive.

Moreover, it is realized by the skilled person that embodiments of the entity 100, or the event sensing device 10, comprise the necessary communication capabilities in the form of e.g., functions, means, units, elements, etc., for performing the solution. Examples of other such means, units, elements and functions are: processors, memory, buffers, control logic, encoders, decoders, rate matchers, de-rate matchers, mapping units, multipliers, decision units, selecting units, switches, interleavers, de-interleavers, modulators, demodulators, inputs, outputs, antennas, amplifiers, receiver units, transmitter units, DSPs, trellis-coded modulation (TCM) encoder, TCM decoder, power supply units, power feeders, communication interfaces, communication protocols, etc. which are suitably arranged together for performing the solution.

Especially, the processor(s) of the entity 100, or the event sensing device 10, may comprise, e.g., one or more instances of a CPU, a GPU, an NPU, a processing unit, a processing circuit, a processor, an ASIC, a microprocessor, or other processing logic that may interpret and execute instructions. The expression “processor” may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones mentioned above. The processing circuitry may further perform data processing functions for inputting, outputting, and processing of data, comprising data buffering and device control functions, such as call processing control, user interface control, or the like.