Title:
ADAPTIVE READOUT CIRCUIT FOR TDC-BASED TIME-OF-FLIGHT (TOF) LIDAR
Document Type and Number:
WIPO Patent Application WO/2024/081331
Kind Code:
A1
Abstract:
Various methods and systems are disclosed to improve detection probability of time of flight lidar systems by adaptively controlling the characteristics of a filter that filters signals received from a sensor of a detection system of the lidar, prior to event detection. A lidar system can generate background signals for individual pixels of the sensor and use the background signals to adaptively control the filter to improve the detection probability and reduce the false alarm rate of the lidar system.

Inventors:
FU GENG (US)
WANG TING (US)
ZHOU YONG (US)
Application Number:
PCT/US2023/034969
Publication Date:
April 18, 2024
Filing Date:
October 11, 2023
Assignee:
MOTIONAL AD LLC (US)
International Classes:
G01S7/486; G01S7/4861; G01S7/4865; G01S7/487; G01S7/497; G01S17/931
Foreign References:
US20190250257A1, 2019-08-15
US20220196799A1, 2022-06-23
US202263379132P, 2022-10-11
US202263379655P, 2022-10-14
US202117444956A, 2021-08-12
US202117443433A, 2021-07-26
Other References:
FISH A ET AL: "Active Pixel Sensor Design: From Pixels to Systems", 1 January 2004, Chapter 4 in CMOS IMAGERS: FROM PHOTOTRANSDUCTION TO IMAGE PROCESSING, Kluwer Academic, Boston, pages 99-139, ISBN: 978-1-4020-7962-7, XP002686037
Attorney, Agent or Firm:
ALTMAN, Daniel, E. (US)
Claims:
WHAT IS CLAIMED IS:

1. A range finding system comprising: a detection subsystem that receives light from an environment and generates an event signal based at least in part on the received light, the detection subsystem comprising: a sensor that generates a sensor signal based on the received light; a filter that receives the sensor signal and generates a filtered sensor signal by filtering the sensor signal based on a transfer function of the filter; a background noise circuit that generates a background signal based on the received light, wherein the background signal indicates a background noise level; a readout control circuit that controls the transfer function based at least in part on the background signal, to maintain a probability of detecting a true event above a threshold probability; and an event detection circuit that generates the event signal based on the filtered sensor signal and a detection threshold level.

2. The range finding system of claim 1, wherein detecting the true event comprises detecting a reflected optical signal resulting from an optical probe signal emitted by the range finding system being reflected from within the environment.

3. The range finding system of any of claims 1-2, wherein the readout control circuit further controls the transfer function based at least in part on a prior sensor signal generated by the sensor before generation of the sensor signal.

4. The range finding system of any of claims 1-3, wherein the event detection circuit comprises: a comparator that generates a timing signal using at least the filtered sensor signal and the detection threshold level, wherein the timing signal indicates a delay between an emission time of a probe signal emitted by the range finding system and detection of an event; and a time-to-digital converter (TDC) that receives the timing signal and generates a digital timing signal.

5. The range finding system of any of claims 1-4, wherein the readout control circuit controls the detection threshold level based at least in part on the background signal.

6. The range finding system of any of claims 1-5, wherein the readout control circuit compares the background signal with a prior background signal to determine a change in a level of background noise received by the detection subsystem.

7. The range finding system of claim 6, wherein a field of view of the range finding system is narrow and, in response to determining that the background noise level is decreased, the readout control circuit decreases a bandwidth of the filter.

8. A method comprising: generating, by at least one processor, a sensor signal using a sensor and based on light received from an environment; receiving, by the at least one processor, the sensor signal and generating a filtered sensor signal using a filter and based on a transfer function; generating, by the at least one processor, a background signal based on the received light, wherein the background signal indicates a background noise level; controlling, by the at least one processor, the transfer function based at least in part on the background signal, to maintain a probability of detecting a true event above a threshold probability; and generating, by the at least one processor, an event signal based on the filtered sensor signal and a detection threshold level.

9. The method of claim 8, wherein detecting the true event comprises detecting the true event by detecting a reflected optical signal resulting from an optical probe signal emitted by a range finding system being reflected from within the environment.

10. The method of any of claims 8-9, wherein controlling the transfer function further comprises controlling the transfer function based at least in part on a prior sensor signal generated by the sensor before generation of the sensor signal.

11. The method of any of claims 8-10, further comprising: generating, by the at least one processor, a timing signal using at least the filtered sensor signal and the detection threshold level, wherein the timing signal indicates a delay between an emission time of a probe signal emitted by a range finding system and detection of an event; and generating, by the at least one processor, a digital timing signal using the timing signal.

12. The method of any of claims 8-11, further comprising controlling the detection threshold level based at least in part on the background signal.

13. The method of claim 12, wherein controlling the detection threshold level further comprises controlling the detection threshold level based at least in part on a prior sensor signal generated by the sensor before generating the sensor signal.

14. The method of any of claims 8-13, further comprising, by the at least one processor, comparing the background signal with a prior background signal to determine a change in a level of background noise received by a detection subsystem.

15. The method of claim 14, further comprising, by the at least one processor, decreasing a bandwidth of the filter, in response to determining that the background noise level is decreased.

16. The method of claim 15, further comprising, by the at least one processor, reducing the detection threshold level.

17. At least one non-transitory storage media storing machine-executable instructions that, when executed by at least one processor, cause the at least one processor to: generate a sensor signal using a sensor and based on light received from an environment; receive the sensor signal and generate a filtered sensor signal using a filter and based on a transfer function; generate a background signal based on the received light, wherein the background signal indicates a background noise level; control the transfer function based at least in part on the background signal, to maintain a probability of detecting a true event above a threshold probability; and generate an event signal based on the filtered sensor signal and a detection threshold level.

18. The at least one non-transitory storage media of claim 17, wherein detecting the true event comprises detecting the true event by detecting a reflected optical signal resulting from an optical probe signal emitted by a range finding system being reflected from within the environment.

19. The at least one non-transitory storage media of any of claims 17-18, wherein the machine-executable instructions further cause the at least one processor to compare the background signal with a prior background signal and determine a change in a level of background noise received by a detection subsystem.

20. The at least one non-transitory storage media of claim 19, wherein the machine-executable instructions further cause the at least one processor to decrease a bandwidth of the filter, in response to determining that the background noise level is decreased.

Description:
ADAPTIVE READOUT CIRCUIT FOR TDC-BASED TIME-OF-FLIGHT (TOF) LIDAR

INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS

[0001] This application claims the priority benefit of U.S. Patent Prov. App. 63/379,132, entitled LIDAR SYSTEM AND METHOD FOR ADAPTIVE DETECTION AND EMISSION CONTROL, filed October 11, 2022, and U.S. Patent Prov. App. 63/379,655, entitled ADAPTIVE READOUT CIRCUIT FOR TDC-BASED TIME-OF-FLIGHT (TOF) LIDAR, filed October 14, 2022. Each of the above-noted applications is incorporated herein by reference in its entirety.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] FIG. 1A is a diagram illustrating a Light Detection and Ranging (lidar) system that detects objects in an environment by emitting optical probe beams and receiving the respective reflected optical beams.

[0003] FIG. 1B is a diagram illustrating interference between two lidar systems operating in an environment.

[0004] FIG. 1C is a diagram illustrating three different lidar signal coding schemes for coding optical probe signals in the time, amplitude, or wavelength domain.

[0005] FIG. 2A is a diagram illustrating a scanning lidar system.

[0006] FIG. 2B is a diagram illustrating a flash lidar system.

[0007] FIG. 2C is a diagram illustrating a mechanical lidar system.

[0008] FIG. 3 is a block diagram illustrating an example lidar detection system.

[0009] FIG. 4A is a diagram illustrating an example sensor used in a lidar detection system and a close-up view of a pixel of the plurality of pixels included in the sensor.

[0010] FIG. 4B is a diagram illustrating an example sensor of a lidar detection system that includes a reference subpixel.

[0011] FIG. 5A is a diagram illustrating example signal and noise photocurrents generated by a pixel of a lidar sensor and a threshold level used to generate a return signal based on time-to-digital conversion.

[0012] FIG. 5B is a diagram illustrating example signal and noise photocurrents generated by a pixel of a lidar sensor, and the threshold level and sampling pulses used to generate a digital return signal based on analog-to-digital (A-to-D) conversion.

[0013] FIG. 6 is a block diagram illustrating another example lidar detection system that includes a detection control system and an event validation circuit.

[0014] FIG. 7 is a diagram illustrating signal and noise photocurrents generated during a measurement time interval that includes several measurement time windows.

[0015] FIG. 8 is a block diagram illustrating an example of a dynamically controlled lidar detection system.

[0016] FIG. 9 is a diagram illustrating an example spatial optical filter for filtering light received by the optical system of a lidar detection system.

[0017] FIG. 10 is a flow diagram illustrating an example of a process implemented by a processor of the readout circuit to generate return signals and background signals.

[0018] FIG. 11 is a flow diagram illustrating an example of a process implemented by a processor of the detection control system to reduce the false alarm rate (FAR) of the lidar detection system by controlling the optical system, the sensor, and/or the readout circuit.

[0019] FIG. 12 is a flow diagram illustrating an example of a process implemented by a processor of the event validation circuit for generating a confidence signal.

[0020] FIG. 13 is an example environment in which a vehicle including one or more components of an autonomous system can be implemented.
[0021] FIG. 14 is a diagram of one or more systems of a vehicle including an autonomous system.

[0022] FIG. 15 is a diagram of components of one or more devices and/or one or more systems of FIGs. 13 and 14.

[0023] FIG. 16 is a diagram of certain components of an autonomous system.

[0024] FIG. 17 is a diagram illustrating a Light Detection and Ranging (lidar) system with adaptive emission control.

[0025] FIG. 18A is a diagram illustrating an example pulse sequence emitted by the lidar system shown in FIG. 17.

[0026] FIG. 18B is a diagram illustrating another example pulse sequence emitted by the lidar system shown in FIG. 17.

[0027] FIG. 19A is a diagram illustrating a series of adaptively controlled optical probe signals and the respective sensor signals having varying background noise levels.

[0028] FIG. 19B is a diagram illustrating another series of adaptively controlled optical probe signals and the respective sensor signals having varying amplitudes.

[0029] FIG. 20A is a diagram illustrating a series of sensor signals having varying background noise levels, measured using adaptive detection threshold levels.

[0030] FIG. 20B is a diagram illustrating a series of sensor signals having varying amplitudes, measured using adaptive detection threshold levels.

[0031] FIG. 21 is a flow diagram illustrating an example of a process implemented by the lidar system shown in FIG. 17 for adaptively controlling the characteristics of the probe signals.

[0032] FIG. 22 illustrates a block diagram of an example lidar detection system and generation of two sensor signals by the lidar detection system upon receiving reflected optical signals from an object at two different orientations.

[0033] FIG. 23 is a block diagram illustrating the readout system of a lidar with adaptive filter and threshold control.

[0034] FIGs. 24A-24B illustrate two subsequent sensor signals output by a filter of a readout system that receives light from narrow field of view (FOV) slices, when the filter is not controlled (A) and when the filter is adaptively controlled (B).

[0035] FIGs. 24C-24D illustrate two subsequent sensor signals output by a filter of a readout system that receives light from a wide FOV, when the filter is not controlled (C) and when the filter is adaptively controlled (D).

[0036] FIG. 25 is a block diagram illustrating an example of a lidar system with adaptive readout circuit control.

[0037] FIG. 26 illustrates an example of a filter and a control circuit used to control the filter based on a background signal.

[0038] FIG. 27 is a flow diagram illustrating an example of a process implemented by the lidar system shown in FIGs. 23 and 24 for adaptively controlling the characteristics of a filter used to tailor the sensor signals before event detection.

DETAILED DESCRIPTION

Lidar System Overview

[0039] Autonomous self-driving vehicles preferably include highly accurate and reliable sensors for detecting objects and calculating their distances from the vehicle. Among various technologies developed for object detection and ranging, laser-based range finders are often used for autonomous driving systems due to their high resolution and accuracy. Laser-based range finders, or laser range finders, are sometimes called Light Detection and Ranging (lidar) or Laser Detection and Ranging (ladar) systems. The acronyms "lidar" and "ladar" can be used interchangeably to refer to an optical system that detects objects using laser light.
[0040] Lidar systems use light beams (e.g., laser beams) to detect objects in the environment surrounding the lidar and determine their distances from the lidar. In some cases, a lidar also determines the velocity of an object with respect to the lidar or determines optical characteristics (e.g., surface reflectivity) of an object. High resolution (e.g., high spatial resolution for detecting objects) and high accuracy have made lidar systems preferred sensors for many applications. In particular, lidar systems are used in various autonomous driving systems for continuous scanning of the surrounding environment of a vehicle to avoid collision between the vehicle and objects in the environment. A lidar system detects objects by sending optical probe beams (also referred to as lidar beams) to the environment and detecting the respective optical reflections off of the objects in the environment. A detection system of the lidar generates a return signal indicative of detection of a portion of an optical probe beam reflected by an object in the environment.

[0041] In addition to the reflections of the optical probe beams emitted by the lidar, in some cases, the detection system of the lidar receives light generated by sources other than the lidar (e.g., other lidars, the sun, traffic lights, cars, ambient light, and the like). In some such cases, light generated by the other sources (herein referred to as background light or environmental light) interferes with the detection of the reflections of the optical probe signals and increases the noise level of the detection system. As such, the presence of background light can result in inaccurate object detection and cause safety concerns for a system (e.g., a vehicle) that uses the lidar for detecting objects.

[0042] The disclosed methods and systems address problems associated with reception of background light by the detection system of the lidar, for example, by identifying a portion of detected light associated with the background light, and using the corresponding data/information to modify the detection system or to determine the reliability of return signals. In various embodiments, the information as to background light can be used to reduce a false alarm rate of the lidar and/or provide information usable to assess the reliability of the return signals. In one embodiment, information as to background light is used to modify the detection system to improve the signal-to-noise ratio (SNR) of the detection system. For example, information as to background light can be used to adjust (e.g., dynamically adjust) a threshold level used for distinguishing reflection of light emitted by the lidar (e.g., reflected by an object) from background light. In some other examples, other parameters of the lidar can be dynamically controlled to reduce background light and improve the signal-to-noise ratio of the detection system. In another embodiment, information as to background light is used to determine a confidence value for a return signal, providing an objective measurement as to the reliability of the corresponding detection event.
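To make the dynamic threshold adjustment concrete, the following Python sketch derives a detection threshold from measured background samples so that the per-sample false-alarm probability stays near a target. This is an illustrative model only, not the disclosed circuit: the Gaussian-noise assumption, the function name, and the numeric values are our own.

```python
from statistics import NormalDist, fmean, pstdev

def detection_threshold(background_samples, far_per_sample=1e-6):
    """Threshold that keeps the per-sample false-alarm probability
    near far_per_sample, assuming roughly Gaussian background noise
    (our simplifying assumption, not a property of the disclosure)."""
    mu = fmean(background_samples)
    sigma = pstdev(background_samples)
    # k such that P(noise > mu + k * sigma) == far_per_sample
    k = NormalDist().inv_cdf(1.0 - far_per_sample)
    return mu + k * sigma

# Re-derive the threshold as the measured background level changes.
quiet = [0.9, 1.1, 1.0, 0.95, 1.05]   # low ambient light
sunny = [4.0, 5.2, 4.6, 5.0, 4.4]     # strong background light
print(detection_threshold(quiet))      # lower threshold
print(detection_threshold(sunny))      # raised threshold
```

Raising the threshold as background light grows keeps the false alarm rate bounded at some cost in detection probability, which is the trade-off the surrounding paragraphs describe.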
[0043] The designs, systems, and methods described below can be incorporated into various types of autonomous vehicles and self-driving cars, for example those disclosed in U.S. Patent Application Nos. 17/444,956, entitled "END-TO-END SYSTEM TRAINING USING FUSED IMAGES" and filed August 12, 2021, and 17/443,433, entitled "VEHICLE LOCATION USING COMBINED INPUTS OF REDUNDANT LOCALIZATION PIPELINES" and filed July 26, 2021, the entire contents of which are incorporated by reference herein and made a part of this specification.

[0044] In some cases, an optical probe beam emitted by a lidar system includes an optical probe signal. In some such cases, the lidar detects an object and determines a distance between the object and the lidar by illuminating the object with an optical probe signal and measuring a delay between emission of the optical probe signal and reception of the corresponding reflected optical signal from the object. In some cases, the incident optical probe signal includes a temporal variation of an optical property (e.g., amplitude, phase, frequency, polarization) of a laser beam emitted by the lidar. In some cases, the incident optical probe signal can include laser pulses, which are coded using, for example, a temporal, amplitude, phase, or polarization coding scheme. For example, the optical probe signal can be a single laser pulse or pulse train, and the lidar can determine the distance from the object by measuring a delay or time-of-flight (ToF) between the transmission of one or more incident laser pulses and reception of the corresponding reflected laser pulses. Lidars that determine the distance from objects based on the time-of-flight of a laser pulse are referred to as ToF lidars. A ToF lidar can also determine the optical reflectivity of the object surface using the reflected laser pulses. In some cases, a ToF lidar can generate return signals usable to determine a position of an object and the reflectivity of the object surface.
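The delay-to-distance relationship a ToF lidar relies on is d = c·(t2 − t1)/2, the factor of two accounting for the round trip. A minimal sketch (the function name and the example values are illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(t_emit: float, t_arrive: float) -> float:
    """Convert a round-trip time of flight to a one-way range in
    meters. The division by 2 accounts for the probe pulse
    traveling to the object and back."""
    delay = t_arrive - t_emit          # (t2 - t1) from the text
    return C * delay / 2.0

# A pulse emitted at t1 = 0 and received at t2 = 1.334 microseconds
# corresponds to an object roughly 200 m away.
print(tof_range(0.0, 1.334e-6))  # ~200.0 m
```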
[0045] In some applications (e.g., to control and guide an autonomous vehicle in a complex driving environment), a lidar can continuously scan an environment (e.g., the environment surrounding the vehicle) at a relatively high scanning speed to capture the changes in the positions of the objects in the environment. For example, the lidar can scan the surrounding environment by rotating one or more optical probe beams (e.g., laser beams) around a rotational axis while scanning the direction of propagation of the laser beams in a plane parallel to the rotational axis.

[0046] The reliability of the lidar system depends on the accuracy of the optical detection process, which can be affected by the amount of noise received by or generated in the detection system of the lidar. In some cases, noise (e.g., external noise) increases the probability of false alarms or invalid detection events. Various sources of noise can interfere with the optical detection process and degrade the performance of the lidar by increasing the probability of missing light associated with reflection of optical probe beams received by the lidar system, or of falsely identifying detected light generated by other sources as the reflection of an optical probe beam emitted by the lidar system (a false alarm or invalid event). Any light that is not associated with an optical probe signal emitted by the lidar but that is received and detected by the lidar system is considered optical noise and can reduce the accuracy and reliability of the return signals and/or increase the false alarm rate of a lidar system. In some cases, optical noise can be associated with background light that is generated by light sources (e.g., the sun, vehicles, streetlights, and the like) in the environment surrounding the lidar. Given the high optical sensitivity of the lidar detection system, even low-level light emitted by a source in an environment scanned by the lidar can be detected by the lidar detection system and interfere with the detection and range finding operation of the lidar. The detection probability, accuracy, and false alarm rate (FAR) of a lidar system can be limited by the signal-to-noise ratio (SNR) of the detection system of the lidar.

[0047] In some cases, the performance of a lidar, in particular the long-range detection capability of the lidar (e.g., detecting objects at distances larger than 200 meters), is determined by the signal-to-noise ratio (SNR) of the lidar return signal. In various implementations, noise associated with background light (light not associated with the optical probe beams emitted by the lidar) that reaches the sensor element of the lidar detection system can be the dominant noise in a lidar system. Sources of background light include but are not limited to: the sun, vehicles moving around the lidar, streetlights, light generated by other lidar systems, light generated by other sources and reflected by objects in the environment, and the like.

[0048] As such, there is a need for methods and systems that can improve the reliability of lidar systems in the presence of background light in an environment monitored by the lidar systems. The systems and methods disclosed herein can be used to reduce the impact of background light on the detection and range finding function of various lidar systems. In some cases, the lidar detection system measures background light and generates real-time background signals for one or more light sensing elements (e.g., pixels of a sensor) of a plurality of light sensing elements of the lidar detection system. A background signal indicates a level or an amount of background light received by the corresponding light sensing element. In some cases, the background signals are used to reduce a false alarm rate (FAR) of the lidar, for example by identifying and reducing the noise associated with background light in real time and thereby improving the signal-to-noise ratio of the detected signal. For example, the background signals can be used to control the optical system, the sensor, and/or the readout circuit of the lidar, thereby improving the signal-to-noise ratio of the detected signal. In some cases, background signals are used to provide an indication of the level of background noise present during a measurement.
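One simple way to realize the per-pixel real-time background signal described above is a running (exponential moving) average of each pixel's output, sampled when no probe pulse is expected. The class below is a hypothetical sketch; the update rule, smoothing factor, and names are our assumptions, not the disclosed circuit.

```python
class BackgroundEstimator:
    """Per-pixel running estimate of background light.

    Photon counts sampled while no probe pulse is in flight are
    folded into an exponential moving average, so the background
    signal tracks slow changes in ambient light (sun, streetlights)
    in real time."""

    def __init__(self, n_pixels: int, alpha: float = 0.05):
        self.alpha = alpha                 # smoothing factor
        self.level = [0.0] * n_pixels      # background signal per pixel

    def update(self, pixel: int, counts: float) -> float:
        a = self.alpha
        self.level[pixel] = (1 - a) * self.level[pixel] + a * counts
        return self.level[pixel]

est = BackgroundEstimator(n_pixels=4)
for frame_counts in ([2.0, 2.1, 9.0, 2.0], [2.2, 1.9, 9.5, 2.1]):
    for px, c in enumerate(frame_counts):
        est.update(px, c)
print(est.level)  # pixel 2 reports a much higher background level
```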
[0049] In some cases, a lidar detection system includes a detection control system that uses the background signals to increase the signal-to-noise ratio (SNR) of the signals generated by a lidar sensor (sensor signals) and/or lidar return signals generated by the lidar, by dynamically controlling one or more subsystems of the lidar system (e.g., subsystems in the lidar detection system). In some cases, the dynamic control includes adjusting a parameter of one or more subsystems based on the background signals to reduce the FAR of the lidar. In some such cases, the dynamic control includes adjusting a parameter of one or more subsystems based on the background signals to reduce the contribution of background light to the photocurrents and/or return signals generated by the detection system of the lidar or by a subset of elements in the detection system. For example, background signals can be used to dynamically change the collecting FOV of the optical system, the configuration of the sensors, and/or parameters of a readout circuit (e.g., a true event validation threshold level).

[0050] In some examples, the detection control system controls a parameter of an optical system of the detection system (e.g., the size of the collecting FOV) to reduce the amount of background light received from the environment, or to reduce the portion of received background light directed to the sensor, which is configured to convert received light to photocurrents. In some cases, the detection control system controls a parameter of the sensor to reduce a portion of the photocurrents generated by the background light received by the sensor from the optical system. In some examples, the detection control system changes the active area of the sensor to control the background light contribution to a sensor signal.

[0051] In some cases, the detection control system controls a parameter of a readout system of the detection system that generates return signals upon receiving sensor signals from the sensor. For example, some of the systems described herein dynamically control a readout threshold (e.g., a true event validation threshold) of the readout system to improve detection probability while maintaining the false alarm rate (FAR) below a set FAR limit. In some cases, the methods described herein dynamically control a readout threshold (e.g., a true event validation threshold) of the readout system to reduce the FAR below a threshold value. In some cases, the readout threshold can be a threshold level used to identify a true event (detection of reflected light associated with an optical probe signal emitted by the lidar) based on a sensor signal. In some cases, the FAR includes a rate of generation of return signals that are not associated with a reflection of an optical probe signal emitted by the lidar. In some examples, a readout threshold includes a detection threshold.

[0052] As the distance and orientation of an object change with respect to a lidar system (also referred to as a lidar), the amplitude of the corresponding sensor signals can change. As such, to maintain or improve the probability of detecting a true event following the change of the distance and/or orientation of the object, one or more parameters of the readout system can be adaptively adjusted in response to the change of the distance and/or orientation of the object with respect to the lidar. The lidar system can detect a change of a distance between the object and the lidar based on a change in a delay between transmission of a probe signal and reception of the corresponding reflected signal. In some cases, the lidar can detect an orientation and/or distance change based on a change in an intensity or an average intensity of background light received by the detection system and indicated by a background signal.
[0053] In some implementations, the methods and systems described herein adaptively control a detection threshold (e.g., used to generate a timing signal) to improve detection probability while maintaining the false alarm rate (FAR) below a predefined or set FAR limit. The lidar system can control the detection threshold based at least in part on a background signal generated by the detection system. For example, the lidar system can identify a change in orientation of an object with respect to the lidar system using a background signal and/or a time of flight, and adjust the detection threshold based at least in part on an amplitude or an average amplitude of the background signal to improve a probability of detecting subsequent reflected signals.

[0054] In some cases, the detection system includes a filter that tailors a spectrum of the sensor signals generated by a sensor upon receiving reflected light, before event detection. In these cases, the filter receives the sensor signals and generates filtered sensor signals having a spectrum tailored according to a transfer function of the filter. In some examples, the filter can be an adaptive filter whose transfer function is adaptively controlled based at least in part on a background signal. Alternatively or additionally, the transfer function can be adaptively controlled and/or adjusted based on an intensity of reflected light received from an object. The lidar system (e.g., the detection system of the lidar system) can indirectly determine or estimate the intensity of the reflected light based on a measured time-of-flight indicative of a distance from the object, or by directly measuring an amplitude of a corresponding sensor signal.

[0055] In some implementations, a lidar system adaptively controls the transfer function of a filter that filters the sensor signals, and a detection threshold used for event detection, to maintain or improve the probability of detection when the reflected signals change due to a change in the distance, reflectivity, and/or orientation of objects in the environment.

[0056] Additionally, some of the lidar detection systems described below use the background signals to generate a confidence signal indicative of the reliability of a return signal generated by the detection system of a lidar. In some embodiments, the lidar detection systems described below use the information of both the detected "true" event and the real-time background signals to generate a confidence signal indicative of the reliability of a return signal generated by the detection system of a lidar. For example, the lidar detection system can use the real-time background signals associated with sensor signals received by the lidar sensor to estimate a real-time false alarm rate (FAR) and generate the confidence signal as a supplement to the corresponding return signal. The return signal can indicate a three-dimensional (3D) position and, in some cases, surface reflectivity of a detected object, and the corresponding confidence signal indicates a level of confidence for the 3D position and the surface reflectivity indicated by the return signal. In some cases, the lidar detection systems described below use a detected true event and an estimated FAR to generate a confidence signal.

[0057] In some cases, a lidar detection system includes a detection control system for dynamic control of one or more systems of the detection system based on feedback indicative of a noise level (e.g., a background noise level). In some cases, a lidar detection system having a detection control system does not generate confidence signals for the return signals. In some cases, a lidar detection system that generates confidence signals does not include a detection control system. In other embodiments, a lidar detection system is dynamically controlled by a detection control system and generates confidence signals for at least a portion of the return signals.
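As a rough illustration of transfer-function control, the sketch below models the filter as a one-pole low-pass stage and maps the background signal to the filter's cutoff frequency, following the direction recited in the claims (a decreased background noise level leads to a decreased bandwidth). The filter topology, the linear mapping, and all constants are our assumptions, not the disclosed design.

```python
import math

def one_pole_lowpass(samples, dt, f_cutoff):
    """Discrete one-pole low-pass filter, a stand-in for the filter
    transfer function: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    a = dt / (dt + 1.0 / (2.0 * math.pi * f_cutoff))
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

def adapt_cutoff(background_level, f_min=5e6, f_max=100e6, bg_ref=10.0):
    """Map a background signal level to a filter bandwidth. Per the
    claims, a lower background noise level yields a lower bandwidth;
    the linear map and the constants here are illustrative."""
    frac = min(max(background_level / bg_ref, 0.0), 1.0)
    return f_min + frac * (f_max - f_min)

pulse = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]   # toy sensor signal
for bg in (1.0, 9.0):                     # low vs. high background
    fc = adapt_cutoff(bg)
    print(f"background={bg}: cutoff={fc / 1e6:.0f} MHz",
          [round(v, 2) for v in one_pole_lowpass(pulse, 5e-9, fc)])
```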
[0058] Advantageously, some of the methods described herein can be implemented without modifying a conventional lidar detection system at a hardware level. For example, some of the methods described below can be implemented by reconfiguring the signal processing paths and procedures in a readout circuit of a lidar system (e.g., at the software level). In some cases, the implementation can include reprogramming a reconfigurable circuit (e.g., a field-programmable gate array) included in the detection system.

[0059] In some cases, the different types of noise (e.g., background noise) reduce the accuracy of the detection process by increasing the probability of falsely identifying a reflected optical signal generated by a first optical probe signal as the reflection of a second probe signal different from the first probe signal. In some examples, the first probe signal can be a probe signal emitted by another lidar system, or by another light source. In some cases, the first probe signal can be a probe signal emitted by a channel of the lidar system different from a channel that emitted the second probe signal. In some cases, the first probe signal can be a probe signal emitted along a different direction compared to a direction along which the second probe signal is emitted. In some implementations, the first and the second probe signals can be coded probe signals (e.g., pulse coded probe signals) and the noise diminishes the efficacy of the detection system in distinguishing a first coded probe signal from a second coded probe signal.

[0060] In some cases, the lidar system preserves or improves the efficacy of the detection system in distinguishing the first coded signal from the second coded signal by changing (e.g., adaptively changing) the characteristics of the coded probe signals emitted by the lidar and/or changing the detection threshold level (e.g., the readout threshold) for measuring the sensor signals and generating the return signals. Some of the systems and methods disclosed herein can be used by a lidar system to improve the probability of detection by adaptively controlling the characteristics of the probe signals and/or the detection threshold level based on variations of a noise level (e.g., background noise level), signal-to-noise ratio, and/or amplitude of reflected signals received by the lidar system.

Lidar system operation

[0061] FIG. 1A shows an example of a lidar system 100 that detects objects in an environment surrounding the lidar system 100 and determines distances between the objects and the lidar system 100. In some examples, the lidar system 100 determines a three-dimensional (3D) position and/or reflectivity of an object in the environment. In some cases, the lidar system 100 additionally determines a velocity of an object, e.g., relative to the lidar system. The lidar system 100 includes a lidar emission system 102 (referred to as emission system 102) that emits optical probe beams, and a lidar detection system 104 (referred to as detection system 104) that receives the reflected optical beams and generates return signals. In some cases, the lidar system 100 determines a reflectivity of an object surface, e.g., from the detected signal intensity with distance and optical corrections applied.
[0062] The lidar system 100 detects an object 110 by emitting an optical probe beam 108 (e.g., a pulsed laser beam) and receiving a reflected optical beam 112 corresponding to a reflection of the optical probe beam 108. In some cases, the optical probe beam 108 includes one or more optical probe signals (e.g., optical pulses) and the reflected optical beam 112 includes one or more reflected optical signals (e.g., reflected optical pulses). An optical probe signal can include a temporal variation of an optical property (e.g., amplitude, phase, or frequency) of the corresponding optical probe beam. In some examples, an optical probe signal includes a pulse train. The lidar system 100 further includes a detection system 104 and a lidar signal processing system 106. The emission system 102 emits the optical probe beam 108 toward the object 110, and the detection system 104 receives the reflected optical beam 112. In some examples, the optical probe beam 108 includes an optical signal (e.g., an optical pulse) emitted at an emission time (t1) and the reflected optical beam 112 includes a reflected optical signal received by the detection system 104. The detection system 104 determines an amplitude and an arrival time (t2) of the reflected optical signal. In some cases, the detection system determines a delay (t2-t1) between the emission time (t1) and the arrival time (t2). In some implementations, the detection system 104 generates one or more return signals 120 by converting reflected optical signals to electric signals (e.g., photocurrents or photovoltages) and using them to generate sensor signals. In various implementations, a sensor signal can be a digital or analog signal. In some cases, a return signal includes the electric signal or an amplified version of the electric signal. In some cases, the return signal indicates the arrival time (t2), the magnitude (e.g., power or intensity) of the reflected optical signal, and/or the delay (t2-t1) between the emission time (t1) and the arrival time (t2). The detection system 104 can include a plurality of sensing elements (e.g., pixels) that each generate a separate sensor signal.

[0063] The lidar system 100 further includes a lidar signal processing system 106 that receives the return signals 120, determines the presence of the object 110 in the environment, and calculates a distance between the lidar system 100 and the object 110 based at least in part on the return signals 120. In some examples where the return signal indicates a delay (t2-t1) between the emission time (t1) and the arrival time (t2), the lidar signal processing system 106 can use the delay to calculate the distance between the lidar system 100 and the object 110. In some examples where a return signal indicates the arrival time (t2), the lidar signal processing system 106 determines the delay between the emission time (t1) and the arrival time (t2), and then uses the delay to calculate the distance between the lidar system 100 and the object 110. In some examples, the lidar signal processing system 106 can generate an output signal 124 including a distance between the lidar system 100 and the object 110 and/or a velocity of the object 110 with respect to the lidar system 100.
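In a TDC-based readout, the arrival time (t2) is typically produced by a comparator that fires when the filtered sensor signal crosses the detection threshold, followed by a time-to-digital converter that quantizes the crossing time. The following software model is a simplified stand-in for that hardware path; uniform sampling, a single rising-edge crossing, and a 1 ns TDC resolution are our assumptions.

```python
def tdc_timestamp(samples, dt, threshold, lsb=1e-9):
    """Software model of a comparator followed by a TDC.

    `samples` is the filtered sensor signal sampled every `dt`
    seconds. The comparator fires at the first rising crossing of
    `threshold`; the TDC quantizes the crossing time to its
    resolution `lsb` and reports an integer code. Returns None if
    no event is detected."""
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            # linear interpolation for the sub-sample crossing time
            frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
            t_cross = (i - 1 + frac) * dt
            return round(t_cross / lsb)   # digital timing code
    return None

pulse = [0, 0, 1, 4, 9, 4, 1, 0]          # toy return pulse
code = tdc_timestamp(pulse, dt=2e-9, threshold=3.0)
print(code)                                # TDC code in 1 ns LSBs
```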
[0064] In various implementations, the optical probe beam 108 can have a wavelength within an operating wavelength range of the lidar system 100. In some cases, the operating wavelength range of the lidar is in the infrared (IR) wavelength range. In some cases, the operating wavelength range of the lidar is in the near-IR (NIR) wavelength range. In some such cases, the optical probe beam 108 has a wavelength from 800 nm to 1200 nm, or from 1200 nm to 1800 nm. In some cases, the detection system 104 has higher sensitivity for detecting light having a wavelength within the operating wavelength range of the lidar system 100.

[0065] In some cases, the detection system 104 is configured to receive light or light beams propagating toward an entrance aperture of the detection system 104 along directions within a field of view (FOV) 122 of the detection system 104. In some examples, light beams that are incident on the entrance aperture of the detection system 104 and propagate along a direction within the FOV 122 are received by a sensor that generates an electric signal (a sensor signal) proportional to the power or intensity of the received light. As such, in addition to the reflected optical beam 112, the detection system 104 receives background light 118 that is not associated with a reflection of the optical probe beam 108 but propagates toward the detection system 104 along a direction within the FOV 122 of the lidar system 100. In some cases, background light 118 includes sunlight, light associated with other lidar systems, light emitted by a moving vehicle, or constant light emitted by a static source (e.g., a streetlamp). The sensor can include one or more optical-to-electrical converters such as photodiodes (PDs), avalanche photodiodes (APDs), silicon photomultipliers (SiPMs), single-photon avalanche photodiodes (SPADs), or SPAD arrays.

[0066] In some cases, the background light 118 received by the detection system 104 increases a noise level of the detection system 104 and reduces a signal-to-noise ratio of a return signal or of a signal generated by the sensor (also referred to as a sensor signal). In some examples, the signal-to-noise ratio (SNR) can be defined as a ratio of a signal (e.g., associated with a return signal or a sensor signal) generated as a result of receiving the reflected optical beam 112, to noise associated at least partially with the background light 118. The sensor signals generated by the background light 118 are herein referred to as background noise. When the background light 118 received by the detection system 104 reduces the signal-to-noise ratio of the sensor signals, the accuracy and reliability of the corresponding return signals 120 generated using these sensor signals can be decreased. For example, when the amplitude of a signal generated by background light 118 is not significantly smaller than the amplitude of a signal generated by the reflected optical signal carried by the reflected optical beam 112, the detection system 104 cannot distinguish the signal associated with the reflected optical signal from the background noise associated with the background light, or can determine an erroneous arrival time (t2) different from the time at which the reflected optical signal is received by the detection system 104. In some cases, the background light 118 received by the detection system 104 increases a rate of generation of false return signals (herein referred to as the false alarm rate or FAR), which are not associated with reflections of optical probe signals emitted by the lidar system 100. For example, an optical signal generated by a source different from the emission system 102 (e.g., the sun, other vehicles, and the like) can be received by the detection system 104 and generate a sensor signal that is falsely identified by the detection system 104 as a reflected optical signal associated with an optical probe signal emitted by the lidar emission system.
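Using the SNR definition above, a back-of-the-envelope estimate can be computed from a return's peak amplitude and samples of the background; quantifying the noise as the standard deviation of the background samples is our simplifying assumption:

```python
import math
from statistics import pstdev

def snr_db(signal_peak, background_samples):
    """SNR of a return in dB: peak reflected-signal amplitude over
    background noise, with noise taken as the standard deviation of
    the background samples (our simplifying assumption)."""
    noise = pstdev(background_samples)
    return 20.0 * math.log10(signal_peak / noise)

print(round(snr_db(9.0, [1.0, 1.4, 0.7, 1.2, 0.9]), 1))  # ~31.4 dB
```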
In some cases, the optical probe beam emitted by a lidar (e.g., optical probe beam 108) includes a narrow beam of light having low divergence. In some such cases, the divergence of the optical probe beam 108 can be less than 0.5 degrees, less than 2 degrees, less than 5 degrees, or less than 10 degrees. In some cases, the optical probe beam 108 includes a light beam having a large degree of divergence. In some such cases, the divergence of the optical probe beam 108 can be larger than 10 degrees, larger than 20 degrees, or larger than 30 degrees. In various embodiments, a lidar system can move, scan, or rotate one or more optical probe beams over an azimuthal angular range with respect to a rotational axis of the lidar to scan an environment. In some cases, a detection system of a lidar can have a wide or a narrow field of view (FOV). In some cases, a wide field of view can have azimuthal and polar angular widths larger than 5 degrees, larger than 30 degrees, or larger than 90 degrees. In some cases, a narrow field of view can have azimuthal and polar angular widths smaller than 0.05 degrees, smaller than 0.2 degrees, or smaller than 2 degrees.

Interference between lidars

[0067] In some cases, multiple lidars operate in the same environment. In some such cases, optical probe beams emitted by a first lidar, or reflections of the optical probe beams emitted by the first lidar, are received by the detection system of a second lidar and interfere with the detection and range finding operation of the second lidar. In some examples, the first and the second lidars emit optical probe beams and signals having the same or similar optical and/or temporal characteristics. For example, the first and the second lidars can have similar or substantially identical operating wavelength ranges. In some cases, the optical probe beams emitted by the first and the second lidars have wavelengths that are detectable by the detection systems of both lidars. Thus, the detection system of the second lidar may not be capable of effectively distinguishing the reflected optical beams and the corresponding reflected optical signals associated with the first and the second lidars. In various implementations, interference between two lidars can degrade the signal-to-noise ratio (SNR) of the return signals generated by one or both lidars. In some such cases, the mutual interference between two lidars increases their FAR.

[0068] In some cases, an optical probe signal emitted by a lidar generates multiple reflected optical signals that are received by the lidar via different optical paths, including a direct optical path from an object illuminated by the corresponding optical probe beam. In some such cases, the multiple reflected optical signals interfere with each other and generate multiple sensor signals. In some cases, one or more sensor signals can be falsely identified by the detection system as a reflected optical signal generated by a reflected optical beam received by the detection system from the object via a straight optical path. As such, the detection system can generate return signals falsely indicating the presence of multiple objects at artificial distances different from the actual distance between the object and the lidar.
[0069] In some examples, the lidar detection system 104 generates a confidence signal for one or more return signals (associated with one or more detection events) generated by the detection system 104. In some cases, when the background light received by a lidar is associated with light generated by another lidar system, the confidence signal can indicate a probability that the one or more return signals are generated by reflections of the optical probe beams emitted by the lidar and not by the background light. In some cases, when multiple reflections of an optical probe beam emitted by the lidar are received by the detection system of the lidar, the confidence signal can indicate a probability that a return signal is generated by the reflection of the optical probe beam that is received directly (via a straight optical path) from the object that was illuminated by the optical probe beam. In some cases, the lidar detection system 104 generates a confidence signal indicative of the false alarm rate at a sensor output.

[0070] In some examples, the optical probe signals emitted by the lidar include coded optical probe signals. A coded optical probe signal associated with an optical probe beam can include two or more optical pulses having specific optical characteristics relative to each other, making the optical probe signal and the resulting reflected optical signals distinguishable from other optical signals associated with other optical probe beams emitted by the same lidar, by other lidars, or by other optical systems that emit optical signals (e.g., optical signals having temporal characteristics close to those of the optical probe signals emitted by the lidar). In some cases, a first optical probe signal associated with a first optical beam corresponding to a first scanned field of view (FOV) of a lidar can be coded using a first code to distinguish the first optical probe signal from a second optical probe signal coded with a second code different from the first code, where the second optical probe signal is associated with a second optical beam corresponding to a second scanned FOV. In some examples, the readout for a first pixel or a first group of pixels of the lidar sensor can be configured to detect sensor signals associated with optical probe signals coded using a first code, and the readout for a second pixel or a second group of pixels can be configured to detect sensor signals associated with optical probe signals coded using a second code different from the first code (a pixel based coding scheme).

[0071] While coding of the optical probe signals can provide a first level of protection against interference with light emitted by other lidars in the environment, it may not be sufficient to eliminate interference or reduce its impact on the lidar performance below a desired level. In some examples, the detection system 104 generates confidence signals indicating a probability that a return signal is a false return signal associated with interference with another lidar. As such, confidence signals can be used by the lidar system (or another system that receives the return signals) to avoid using false return signals that are not eliminated by the coding technique for determining the presence of objects and their distance from the lidar.
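One hedged way to turn a real-time background signal into a confidence value is to model background threshold crossings as a Poisson process and report the probability that no background event falls in the timing window. This is our illustrative model, not the disclosed estimator:

```python
import math

def confidence(noise_rate_hz, window_s):
    """Confidence that a detection is a true event rather than
    background, modeling background threshold crossings as a
    Poisson process (our assumption). `noise_rate_hz` would come
    from the real-time background signal; the confidence is the
    probability that zero background events occur in the window."""
    return math.exp(-noise_rate_hz * window_s)

# A 10 kHz background crossing rate over a 2 microsecond window:
print(round(confidence(10_000, 2e-6), 4))   # ~0.98
```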
[0072] FIG. 1B shows a first lidar system 130 (e.g., a lidar system that is the same as, or similar to, lidar system 100 of FIG. 1A) and a second lidar system 134 scanning the same environment. In the example shown, the first optical probe signal 131 is reflected by an object 110 and the corresponding reflected optical signal 132 is directly received by the first lidar system 130. In some cases, the first optical probe signal 131 is a coded optical probe signal (e.g., including two or more optical pulses having different amplitudes, delays, or frequencies).

[0073] In some cases, the second lidar system 134 (e.g., a second lidar system that is the same as, or similar to, lidar system 100 of FIG. 1A) emits a second optical probe signal 135 that is received by the first lidar system 130 after being reflected by the object 110. Additionally, the second lidar system 134 emits a third optical probe signal 137 that is directly received by the first lidar system 130. In some cases, the detection system of the first lidar system 130 can determine that the third optical probe signal 137 and the second reflected optical signal 136 do not match a code included in the first optical probe signal 131 and therefore does not generate any return signal based on these signals.

[0074] In some other cases, despite the coding of the first optical probe signal 131, the detection system of the first lidar system 130 can falsely identify the third optical probe signal 137 and the second reflected optical signal 136 generated by the second optical probe signal 135 as reflections of the optical probe signals emitted by the first lidar system 130. In some such cases, the first lidar system 130 can generate return signals based on the third optical probe signal 137 and the second reflected optical signal 136 (an example of detecting a real optical signal with incorrect decoding). Such return signals are examples of false events that can falsely indicate the presence of an object, or indicate an incorrect distance between the first lidar system 130 and the object 110. In some such cases, the detection system of the first lidar system 130 can generate a confidence signal indicative of a low probability of the return signal being associated with the optical probe signal 131. As such, the lidar signal processing system 106 (or a system separate from the lidar) that receives the return signal and the confidence signal can discard the return signal.

Pulse coded lidar probe signals

[0075] As described above, in some cases a lidar can generate coded optical probe signals. In some cases, a coded optical probe signal emitted by a lidar can include two or more optical pulses sequentially emitted with a delay t_d. Such an optical probe signal can be referred to as a "pulse coded optical signal". For example, an optical probe signal can include a delayed optical pulse emitted t_d seconds after the emission of an initial optical pulse. In some cases, the delay t_d between the two optical pulses, a ratio between the amplitudes of the two optical pulses, a phase difference between the two optical pulses, or a frequency difference between the two optical pulses can be used as a unique identifier for identifying the optical probe signals emitted by a specific lidar. In some such cases, this unique identifier can be used by a lidar to distinguish the received optical signals associated with reflections of the optical probe signals emitted by the lidar from the received optical signals associated with other optical probe signals emitted by other lidars.
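A minimal decoder for such a temporal code only needs to check whether two detected pulses are separated by the lidar's own delay t_d within a tolerance. The sketch below illustrates this; the tolerance value and function names are assumptions:

```python
def matches_code(timestamps, t_d, tol=2e-9):
    """Temporal pulse-code check: given detected pulse arrival
    times, report whether any pair is separated by the lidar's own
    inter-pulse delay t_d (within tolerance tol). A return that
    fails the check is treated as interference, per the text."""
    ts = sorted(timestamps)
    for i in range(len(ts)):
        for j in range(i + 1, len(ts)):
            if abs((ts[j] - ts[i]) - t_d) <= tol:
                return True
    return False

own_delay = 50e-9                                     # this lidar's code: 50 ns
print(matches_code([1.000e-6, 1.050e-6], own_delay))  # True: our own echo
print(matches_code([1.000e-6, 1.070e-6], own_delay))  # False: foreign signal
```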
[0076] FIG. 1C shows three different pulse coding schemes that can be implemented by a first and a second lidar system (e.g., the first lidar system 130 and the second lidar system 134 shown in FIG. 1B), or by a first scanned FOV and a second scanned FOV in the same lidar, to avoid interference between the optical probe signals or the corresponding reflections. In some cases, the lidar probe signals can be temporally encoded 140. For example, the first optical probe signal emitted by the first lidar system 130 can include a delayed pulse 146b emitted after a first delay t_d1 with respect to an initial pulse 146a, while the second optical probe signal emitted by the second lidar system 134 can include a delayed pulse 147b emitted after a second delay t_d2 with respect to an initial pulse 147a.

[0077] In some cases, the lidar probe signals can be power encoded 142. For example, the first optical probe signal emitted by the first lidar system 130 can include an initial pulse 148a having an optical power P_L lower than the optical power of a delayed pulse 148b having a high optical power P_H and emitted after a delay with respect to the initial pulse, while the second optical probe signal emitted by the second lidar system 134 can include an initial pulse 149a having an optical power P_H higher than the optical power of a delayed pulse 149b having a low optical power P_L and emitted after a delay with respect to the initial pulse.

[0078] In some cases, the lidar probe signals can be spectrally encoded 144. For example, the first optical probe signal emitted by the first lidar system 130 can include an initial pulse 150a having an optical frequency F_L lower than the optical frequency of a delayed pulse 150b having a high optical frequency F_H and emitted after a delay with respect to the initial pulse, while the second optical probe signal emitted by the second lidar system 134 can include an initial pulse 151a having an optical frequency F_H higher than the optical frequency of a delayed pulse 151b having a low optical frequency F_L and emitted after a delay with respect to the initial pulse.

[0079] While depicted separately in FIG. 1C, in some instances lidar probe signals can be coded according to a combination of any of temporal encoding, power encoding, and spectral (wavelength) encoding.

[0080] In various implementations, the effectiveness of the pulse-coding method described above for mitigating interference between different lidar systems (or other systems in the environment that can emit optical signals) can be reduced when a large number of sensors are employed, or when a large number of optical signals are emitted in a short time interval, as in autonomous vehicle applications. In various implementations, a lidar detection system can mitigate the impact of this interference by generating confidence signals for each detected event or object. In some other implementations, a detection system 104 can use coded optical probe signals and also generate confidence signals as a second layer of protection against the FAR associated with interference.

[0081] In various implementations, a confidence signal can be generated based on the background signals from individual sensing elements of the detection system of the lidar, where a background signal indicates a level of background light received by the corresponding sensing element.

Different types of lidars
[0082] FIG. 2A illustrates a scanning lidar system 201 that scans one or more narrow optical probe beams 206 over a field of view 214 (e.g., a wide field of view) of a detection system 216 of the scanning lidar system 201 and detects the corresponding reflected optical beams received through the field of view 214. In some cases, the scanning lidar system 201 can include an emission system 204 that scans the one or more light beams generated by a laser source 205 using an optical scanning system 207 (e.g., a rotating mirror). In some cases, the detection system 216 is configured to detect light received through the field of view 214 and generate return signals indicative of the presence of one or more objects (e.g., vehicles, bicycles, pedestrians) within the field of view 214 of the scanning lidar system 201, and a distance between the objects and the scanning lidar system 201. For example, the emission system 204 can generate and steer an optical probe beam 206 within the field of view 214 to detect multiple objects located within the field of view 214 of the detection system 216. In the example shown, as the optical probe beam 206 rotates between different angular positions with respect to the emission system 204, it can be reflected by a first object 212a at a first angular position, by a second object 212b at a second angular position, and by a third object 212c at a third angular position. A portion of the light reflected by each object 212a-c that propagates within the field of view 214 of the scanning lidar system 201 can reach the detection system 216 and generate one or more sensor signals. The scanning lidar system 201 can use the sensor signals generated by the light reflected by the first object 212a, the second object 212b, and the third object 212c to generate a first, a second, and a third signal indicative of the presence of the objects in the field of view 214 and usable for estimating respective distances between the objects and the scanning lidar system 201. In some cases, the detection system 216 can have a stationary field of view 214. In some cases, the field of view 214 of the detection system 216 can be reconfigurable (e.g., by a control system of the detection system 216). In some examples, the detection system 216 can receive, via the field of view 214, background light that is not associated with the reflection of the optical probe beam 206 by an object. In some such cases, the background light can saturate the detection system 216 or decrease the signal-to-noise ratio of the sensor signals and the return signals.

[0083] FIG. 2B illustrates a flash lidar system 202 that can use a single optical probe beam 222 (e.g., a highly divergent beam), generated by an emission system 220 of the flash lidar system 202, to illuminate a field of view (e.g., a large field of view). The flash lidar system 202 can include a detection system that measures reflected portions of the optical probe beam received via different sections of the field of view using a two-dimensional (2D) array of detectors (e.g., pixels). The pixels and an optical system (e.g., one or more lenses) can be configured such that each pixel detects light received from a specific portion of the field of view (e.g., received from a specific direction). For example, the optical probe beam 222 can illuminate a first object 224a, a second object 224b, and a third object 224c, and the light reflected from each object can be received by a first detector (or first detector pixel) 226a, a second detector (or second detector pixel) 226b, and a third detector (or third detector pixel) 226c of a detector array.
[0084] FIG. 2C illustrates a mechanical lidar system 203 that can use a single optical probe beam 232 (e.g., a low-divergence optical beam), generated by an emission system 230 of the mechanical lidar system 203, to illuminate a narrow field of view. In some cases, the optical probe beam 232 can include two or more beams. The mechanical lidar system 203 can rotate the optical probe beam 232 to scan the environment. The mechanical lidar system 203 can include a detection system 236 that measures a reflection of the optical probe beam 232 received via a field of view of the detection system 236. The mechanical lidar system 203 can rotate the detection system 236 and the corresponding field of view together with the emission system 230 such that the optical probe beam 232 and its reflections are transmitted and received within a narrow angular width aimed toward an object. For example, at a first lidar orientation the optical probe beam 232 and the FOV of the detection system 236 can be directed to a first object 234a, and at a second lidar orientation the optical probe beam 232 and the FOV of the detection system 236 can be directed to a second object 234b.

[0085] In various implementations, any of the lidar systems described above can be a ToF lidar system. Various methods and systems described below can be implemented in any of the lidar systems described above to increase the signal-to-noise ratio of the return signals (or sensor signals), generate confidence signals indicative of a validity of the return signals, and/or reduce the false alarm rate of the lidar.

Lidar detection system

[0086] FIG. 3 is a block diagram illustrating an example detection system 104 of a lidar system (lidar). In some cases, the detection system 104 can be the detection system of a ToF lidar. In some embodiments, the detection system 104 (also referred to simply as the "detection system") can include an optical system 310, a sensor 320 that converts light to electric signals, and a readout system 330 (also referred to as a readout processing circuit). The sensor 320 can include different types of elements for converting light to electric signals, e.g., avalanche photodiodes (APDs), silicon photomultipliers (SiPMs), arrays of single-photon avalanche diodes (SPAD arrays), or other types. The optical system 310 can direct received light 305 (e.g., a reflected optical beam) received from the environment through the FOV of the detection system 104 toward the sensor 320. In some cases, the FOV of the optical system 310 can be the FOV of the detection system 104. The sensor 320 can have a plurality of elements (pixels) dedicated to different FOVs or to the same FOV. The sensor 320 can generate a plurality of sensor signals 323 upon receiving the sensor beam 315 from the optical system 310. The readout system 330 can receive the plurality of sensor signals 323 and generate a return signal 120 (also referred to as an "event signal") indicative of an "event" (detection event). The return signal 120 can be usable for determining the presence of an object in the environment, estimating or determining the reflectivity of a surface (e.g., a surface of the object), and determining or estimating a distance between the lidar and the object. In various implementations, a return signal can be a digital or an analog signal indicative of the optical power and an arrival time of an optical signal (e.g., a reflected optical signal).
[0087] In some cases, received light 305 can include light associated with one or more optical probe beams emitted by the lidar. In some examples, the optical system 310 can be configured to collect, transform, and redirect received light 305 to generate the sensor beam 315 that illuminates at least a region of the sensor 320. In some examples, the optical system 310 can include optical elements (e.g., controllable optical elements such as lenses, mirrors, prisms, and the like) that can be reconfigured to tailor the sensor beam 315 and thereby illuminate selected regions of the sensor 320. For example, the sensor 320 can include a plurality of integrated micro-mirrors and micro-lenses that can be controlled using electric signals. In some cases, the controllable optical elements can allow controlling the FOV of the optical system 310 and/or the sensor beam 315. As such, in some cases, the optical system 310 can be used to select received light 305 and selectively direct a portion of received light 305 to a selected portion of the sensor 320. In some cases, the optical system 310 can transform a wavefront of the received light 305 to generate the sensor beam 315.

[0088] The sensor 320 can include a plurality of pixels, each configured to generate one or more sensor signals upon being illuminated by light received from the optical system 310. The optical system 310 can be reconfigured to direct all or a portion of the light received via its FOV on all or a portion of the pixels of the sensor 320. In some implementations, the sensor 320 can generate a plurality of sensor signals 323 where each sensor signal of the plurality of sensor signals is generated by one or more pixels of the sensor 320. In some cases, a pixel can include a plurality of microcells. In some cases, a pixel can include a plurality of sub-pixels where a sub-pixel includes two or more microcells. Each microcell can include an array of single-photon avalanche diodes, also known as a SPAD array. In some such cases, the sensor signal generated by the pixel can include a sum of the sensor signals generated by all or a portion of the microcells or subpixels of the pixel. In some cases, one or more microcells or subpixels can include an optical filter (e.g., a near-IR narrowband optical filter) that filters light received by the microcell or subpixel. Different microcells or subpixels can include optical filters having the same or different spectral responses.

[0089] In some cases, the pixels of the sensor 320 can be configured to detect low-intensity light associated with reflection of an optical probe beam generated by the lidar. In some cases, a pixel can include a silicon photomultiplier (SiPM) and the corresponding sensor can be referred to as a SiPM-based sensor. A SiPM-based sensor can be configured as a single-pixel sensor or an array of pixels. A microcell or a sub-pixel of a SiPM-based sensor can include a SPAD. A SiPM-based sensor can be an optical detector that senses, times, and quantifies light (or optical) signals down to the single-photon level. In some examples, a SiPM-based sensor can include a combination of microcells and a photodiode (a reference photodiode). A SiPM pixel can include a plurality of microcells in an array that share a common output (e.g., anode and cathode). In an embodiment, each microcell is a series combination of a single-photon avalanche diode (SPAD) and a quenching circuit (e.g., a resistor or transistor). In some examples, all microcells can be connected in parallel and detect photons independently. In some embodiments, a SiPM-based sensor can include one or multiple SiPM-based pixels, which can detect photons or optical return signals independently. The quenching circuit can lower a reverse voltage applied to the SiPM to a value below its breakdown voltage, thus halting the avalanche of current. The SiPM then recharges back to a bias voltage and is available to detect subsequent photons. In some cases, all of the subpixels included in a pixel can be connected in parallel and detect photons independently.
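To make the parallel-microcell behavior concrete, the output of a SiPM pixel at its shared anode can be pictured as the sum of quenched single-photon pulses from whichever microcells fired. The sketch below is only an illustrative model of that summing behavior, not the claimed sensor; the names and unit pulse charge are assumptions.

```python
# Illustrative model: a SiPM pixel's shared-anode output is proportional
# to the number of microcells that fired in a window, since each quenched
# microcell contributes roughly one unit pulse of charge.

def sipm_pixel_output(microcell_fires, pulse_charge=1.0):
    """microcell_fires: 0/1 flag per microcell for one window."""
    return pulse_charge * sum(microcell_fires)

# Example: 3 of 6 parallel microcells detect photons in the same window,
# so the pixel reports a combined signal of 3 unit charges.
print(sipm_pixel_output([1, 0, 1, 0, 1, 0]))  # 3.0
```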
[0090] In some embodiments, the SiPM-based sensor operates by reading out both the photodiodes and the microcells to produce dual output signals via two separate anodes. In some embodiments, the output of the SiPM-based sensor is a continuous analog output (e.g., a current output). In this manner, the current output of the plurality of pixels of the SiPM can be received and processed in parallel (e.g., by a readout circuit). In some embodiments, the output of the SiPM-based sensor includes individual pulses that are distinguishable and thus can be counted (e.g., a digital output). The pulses output by the SiPM can be counted to generate an output signal. The output of the SiPM-based sensor according to the present techniques enables the generation of signals within a dynamic range from a single photon to hundreds or thousands of photons detected by the microcells and the photodiodes. In some cases, a microcell and a reference photodiode can have different optical filters, e.g., the photodiode having a broadband filter and the microcell having a narrowband filter (e.g., a narrowband near-infrared filter).

[0091] FIG. 4A is a diagram illustrating an example sensor 320 of a lidar detection system (e.g., detection system 104) and a close-up view of a pixel 410 (e.g., a SiPM pixel) of the plurality of pixels that can be included in the sensor 320. As described above, the pixel 410 can include a plurality of microcells or subpixels, where a subpixel includes two or more microcells (e.g., interconnected microcells). In some cases, the plurality of microcells or subpixels can be of the same or different types. For example, a pixel can include one or more photodiode (PD) type subpixels and one or more microcells or subpixels including SPADs. In some examples, a microcell or subpixel can include a filter configured to allow light having a wavelength within a passband of the filter to be detected by the microcell or subpixel while rejecting light having a wavelength outside of the passband.

[0092] In some cases, one or more subpixels (e.g., SPAD type or PD type) of the sensor can include a broadband filter that allows light having a wavelength range outside of the operating wavelength range of the lidar to be detected. In some such cases, the one or more subpixels that include a broadband filter can be used to measure background light received by the optical system 310. In these cases, the one or more subpixels can be referred to as reference subpixels. FIG. 4B is a diagram illustrating an example sensor of a lidar detection system (e.g., detection system 104) that includes a reference subpixel 432. In some cases, the reference subpixel 432 can be a photodiode (PD) and the other microcells or subpixels (e.g., microcell 420) can be SPADs. In some embodiments, the pixel 430 can produce dual outputs including an output signal generated by the reference subpixel 432.
In these embodiments, the pixel 430 can have a first anode 434 connected to the reference subpixel 432 and a second anode 436 outputting signals associated with all or a portion of the other microcells or subpixels. In some cases, the pixel 430 can have a cathode 438 shared between all microcells and the reference subpixel.

[0093] In various embodiments, a pixel can be a SPAD array. One or more SPADs (e.g., via gate, transfer, or control transistors) can provide individual sensor signals to the readout system 330. In some examples, one or more SPADs can provide a single combined sensor signal to the readout system 330. In some cases, the sensor signals generated by the reference subpixels that include broadband filters can be individually provided to the readout system 330. In other cases, the photocurrents generated by the reference subpixels can be combined (e.g., summed) and provided to the readout system 330 as a single signal.

[0094] In some cases, a reference subpixel can include an optical filter that rejects received light having a wavelength within an operating wavelength range of the lidar or within the passband of optical filters included in other subpixels or microcells. In some cases, a reference subpixel can include a broadband optical filter that allows light having a wavelength within and outside an operating wavelength range of the lidar to reach the microcells of the reference subpixel.

[0095] In some cases, a pixel can include a broadband optical filter that filters light received by all of its microcells and subpixels. In some such cases, such a pixel can be used as a reference pixel for measuring the background light.

[0096] In some embodiments, the sensor signals 323 can be a continuous analog output (e.g., a current output). In some embodiments, the sensor signals 323 can include individual pulses that are distinguishable and thus can be counted. The output of the sensor 320 according to the present techniques can enable the generation of output signals within a dynamic range from a single photon to hundreds or thousands of photons detected by the pixel.

[0097] In some cases, a broadband optical filter included in a reference sub-pixel or a reference pixel can transmit a spectral portion of sunlight that is rejected by bandpass filters included in other subpixels or pixels in the sensor.

[0098] In some embodiments, all microcells of the pixel 410 or pixel 430 can receive light from the same portion of the FOV of the optical system 310. In some cases, two microcells or two groups of microcells can receive light from two different portions of the FOV of the optical system 310. In some embodiments, all pixels of the sensor 320 can receive light from the same portion of the FOV of the optical system 310. In some cases, two pixels or two groups of pixels can receive light from two different portions of the FOV of the optical system 310.

[0099] In various implementations, bias voltages applied to the microcells or subpixels of a pixel can be controlled individually. For example, one or more microcells or subpixels can be biased at a higher or lower voltage compared to the other microcells or subpixels. In some cases, the individual output signals received from the microcells or subpixels of a pixel can be summed by the readout system 330 to determine the output signal for the pixel.
In some embodiments, the bias voltage applied to a microcell or subpixel can be controlled based at least in part on a background signal indicative of a level or an amount of background light received by the microcell or subpixel.

Generating return signals

[0100] In some cases, the readout system 330 comprises a readout processing circuit. The readout processing circuit can include a readout circuit 632 and, in some cases, an event validation circuit 610. The readout circuit 632 can be configured to receive and process the one or more sensor signals 323 from the sensor 320. In some cases, the readout circuit 632 can generate one or more return signals using the one or more sensor signals 323. In some cases, a return signal can be generated from a sensor signal received from a single pixel of the sensor 320. In some cases, the readout circuit 632 can use a sensor signal received from a pixel and generate a return signal indicative of the optical power and the arrival time of an optical signal (e.g., a reflected optical signal) received by the pixel via the optical system 310. In some other cases, the readout circuit 632 can use a plurality of sensor signals received from a group of pixels and generate one or more return signals indicative of the optical power and the arrival time of an optical signal (e.g., a reflected optical signal) received by the group of pixels via the optical system 310. In some cases, the readout circuit 632 can determine the rise time, peak time, peak value, area, and temporal shape of the optical signal based on the one or more sensor signals. In some examples, power and timing calculations can be based on the edge, peak, and shape of the optical signal.

[0101] In some cases, the signal processing system of a lidar (e.g., a ToF lidar) can use the arrival time of the photons received by one or more pixels to calculate a distance between an object by which the photons were reflected and the lidar. In some cases, the signal processing system can additionally use a temporal behavior (e.g., shape) of sensor signals received from the sensor 320 to determine the distance.
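For reference, the underlying time-of-flight relation is d = c·Δt/2, where Δt is the round-trip delay between emission of the probe pulse and arrival of its reflection, and the factor of two accounts for the out-and-back path. A minimal sketch (the function name is illustrative, not from this application):

```python
# Minimal sketch of the ToF range calculation: distance = c * dt / 2,
# where dt is the measured round-trip time of the probe pulse.

C_M_PER_S = 299_792_458  # speed of light in vacuum

def tof_distance_m(emit_time_s, arrival_time_s):
    round_trip_s = arrival_time_s - emit_time_s
    return C_M_PER_S * round_trip_s / 2.0

# Example: a reflection arriving 500 ns after emission corresponds to an
# object roughly 75 m away.
print(f"{tof_distance_m(0.0, 500e-9):.1f} m")  # ~74.9 m
```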
[0102] In various implementations, the readout system 330 can use a sensor signal to determine the arrival time and the shape of an optical signal using different methods including, but not limited to: time-to-digital conversion (TDC), peak finding, and high-bandwidth analog-to-digital conversion.

[0103] FIGS. 5A-5B are diagrams illustrating the temporal profile of the sensor signal 323 generated by a pixel of the sensor 320 measured using two different methods. In these diagrams, the signal component and the noise component of the sensor signal are shown separately to facilitate the description of the corresponding background estimation methods. In the examples shown, the signal component 510 of the sensor signal (e.g., a photocurrent) can be a portion of the sensor signal generated by light associated with reflection of an optical probe signal emitted by the lidar, and the background (noise) component 530 can include a portion of the sensor signal generated by the background light received by the pixel.

[0104] In the example shown in FIG. 5A, the readout system can use at least one threshold level (a readout threshold level) 520a to determine an arrival time and/or the optical power of the optical signal received by the corresponding pixel based on time-to-digital conversion (TDC), and to generate a return signal.

[0105] In the example shown in FIG. 5B, the readout circuit can use a high-bandwidth (e.g., 1 GHz) ADC to convert the sensor signal to a digital signal. Subsequently, the readout circuit can determine an arrival time and/or the optical power of the optical signal received by the corresponding pixel using the resulting digital signal.

[0106] In some cases, the readout system can use various processing methods (e.g., peak finding, pulse shape fitting, thresholding, and the like) to process the sensor signal. These methods can be implemented using machine-readable instructions stored in a non-transitory memory of the readout system.

[0107] In various implementations, any of the methods described above can be implemented (e.g., by the readout system 330) to measure sensor signals received from a pixel during one or more measurement time windows (also referred to as measurement windows). In some cases, the measurement window can be a predefined time window. In some cases, the readout system can use the same or different time windows during different measurement time intervals, for different pixels, and/or for different scanning FOVs.

[0108] In some embodiments, the readout system can periodically use a first time window to measure the sensor signals received during a first measurement time interval and a second time window to measure the sensor signals received during a second measurement time interval.

[0109] As indicated in FIGS. 5A-5B, at each point in time the sensor signal generated by an individual pixel includes a noise component that will be measured along with the signal component. In some cases, the noise component can reduce the accuracy of the resulting return signal. For example, if the magnitude of the noise component is comparable with the magnitude of the signal component, the readout system can determine an arrival time that is at least partially associated with the noise component. In some cases, when the power or intensity of the background light becomes larger than that of the reflected light associated with an optical probe signal, the readout system can completely miss the signal component and cannot generate any return signal.

[0110] Various designs and methods disclosed herein can be used to improve the signal-to-noise ratio of sensor signals generated by a pixel or a group of pixels, and therefore increase the accuracy and reliability of the corresponding return signals. In some cases, the disclosed methods and systems can improve the signal-to-noise ratio of the return signal.

[0111] Some of the methods and systems described below can provide a confidence signal indicative of the probability that a return signal is associated with a reflected optical signal resulting from the reflection of an optical probe signal emitted by the lidar.

Detection system with pixel-based background light measurement

[0112] As described above, the background light generated by various optical sources in an environment scanned and monitored by a lidar system can increase the noise level and false alarm rate of the lidar system.
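To illustrate why background light drives the false alarm rate in a threshold-based (TDC-style) readout such as the one described for FIG. 5A, consider the sketch below: the event time is taken as the first sample that crosses the readout threshold, so a raised background baseline can produce an early or entirely spurious crossing. This is an illustrative toy, not the claimed circuit; the sample spacing and values are assumptions.

```python
# Illustrative threshold-crossing readout: the event time is the first
# sample at or above the readout threshold level.

def first_crossing_ns(samples, threshold, dt_ns=1.0):
    for i, amplitude in enumerate(samples):
        if amplitude >= threshold:
            return i * dt_ns  # candidate event time
    return None  # no event detected

signal = [0.2, 0.3, 0.2, 1.5, 2.0, 0.4]  # true return pulse near 3-4 ns
print(first_crossing_ns(signal, threshold=1.0))           # 3.0 (true event)
# The same pulse on top of a strong background baseline triggers at 0 ns:
print(first_crossing_ns([s + 0.9 for s in signal], 1.0))  # 0.0 (false event)
```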
[0113] In some cases, background light having a nearly constant or slowly varying magnitude (e.g., a slowly varying power or intensity) can decrease the signal-to-noise ratio (SNR) of the sensor signals 323 generated by the lidar sensor and/or the resulting return signals. In some cases, when the lidar signal processing system 106 uses sensor signals and/or return signals for determining the presence of an object in the environment and its distance from the lidar, a lower signal-to-noise ratio of the sensor signals and/or return signals results in a higher probability of falsely indicating the presence of the object, or reduced accuracy of the determined distance.

[0114] In some cases, when the background light includes optical pulses or when its magnitude varies at a relatively high speed (e.g., background light generated by another lidar as described with respect to FIG. 1B), the readout system can falsely identify the sensor signals generated by the background light as sensor signals generated by reflected optical signals associated with the optical probe signals emitted by the lidar. As such, the light generated by other lidars in the environment can interfere with the operation, and more specifically with the detection and range finding function, of the lidar.

[0115] In some cases, the detection system can quantify the amount of background light received by the detection system and dynamically control the optical system, sensor, and readout circuit to improve the SNR of the return signals, mitigate interference with other optical systems, and in general reduce the impact of the background light on the performance of the lidar system. In some cases, quantifying the amount of background light received can include generating background signals indicative of the level of background light received by a sensor of the detection system or by the pixels in the sensor.

[0116] In various implementations, the detection system can generate a background signal indicative of an amount of background light received by a pixel or a group of pixels in a sensor, and use the background signal to improve the detection probability, reduce the false alarm rate, improve the accuracy of the return signals, or at least quantify a confidence level of the return signals generated based on the sensor signals received from the sensor.

[0117] In some cases, the detection system can use the background signals to reduce the background noise associated with slowly varying or constant background light, by dynamically controlling the optical system, the sensor, and/or the readout system of the lidar in order to reduce the contribution of background light in generating the return signals. For example, the detection system can use the background signals to: reduce the amount of background light directed to the sensor, switch off one or more pixels that receive a level of background light greater than a threshold value, eliminate sensor signals received from one or more pixels that receive a level of background light greater than a threshold value, or adjust a threshold level used to distinguish portions of a sensor signal associated with background noise from portions associated with the optical signal. In some cases, the threshold level can include a voltage provided by a discrete electrical component, or a current provided by an application-specific integrated circuit (ASIC).
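One of the controls listed above, switching off (or excluding) pixels whose background exceeds a threshold, can be sketched in a few lines. The names and the simple sum used to combine pixels are illustrative assumptions, not the claimed implementation.

```python
# Illustrative per-pixel background gating: pixels whose background
# signal exceeds a threshold are excluded, and the combined return-signal
# contribution is formed only from the remaining (quiet) pixels.

def gate_noisy_pixels(sensor_signals, background_signals, bg_threshold):
    kept = [
        s for s, bg in zip(sensor_signals, background_signals)
        if bg <= bg_threshold
    ]
    return sum(kept), len(kept)

# Example: pixel 2 sees strong background (e.g., direct sunlight or
# another emitter) and is dropped from the combined signal.
total, n_used = gate_noisy_pixels(
    sensor_signals=[5.0, 4.0, 9.0],
    background_signals=[0.1, 0.2, 3.5],
    bg_threshold=1.0,
)
print(total, n_used)  # 9.0 2
```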
[0118] In some cases, when the background light includes optical pulses or its magnitude varies at a relatively high speed, the detection system can use the background signals to generate a confidence signal for one or more return signals. A confidence signal can indicate a probability that a return signal is generated by light that is associated with an optical probe signal emitted by the lidar and is received by the lidar detection system 604 via a straight optical path from an object. In some implementations, the event validation circuit 610 can generate a confidence signal for a return signal by determining a level of background light received by the sensor 320 in a period during which the return signal is generated.

[0119] FIG. 6 is a block diagram illustrating an example of a lidar detection system 604 (or detection system 604) that can generate background signals 325 and use them to improve the SNR of the return signals 120 and/or the sensor signals 323, and/or to generate confidence signals 630. In some examples, one or more background signals 325 can be associated with individual pixels of the sensor 320. In some implementations, the lidar detection system 604 can include one or more devices or components of the device 1500 described below with respect to FIG. 15. For example, the lidar detection system 604 can include a processor 1504, a memory 1506, a storage device 1508, an input interface 1510, an output interface 1512, a communication interface 1514, and a bus 1502. Various processes and functions described below with respect to the lidar detection system 604 can be performed by at least one processor (e.g., processor 1504) that executes machine-readable instructions stored in a memory (e.g., memory 1506) and/or storage component (e.g., storage device 1508) to perform the processes and functions. The lidar detection system 604 can exchange data with one or more components of a lidar system (e.g., lidar system 100 or the lidar sensors 1402b) or other devices of the vehicle 1400 (e.g., the autonomous vehicle compute 1402f), using an input interface (e.g., input interface 1510), an output interface (e.g., output interface 1512), and/or a communication interface (e.g., communication interface 1514).

[0120] In some implementations, similar to the detection system 104, the detection system 604 can include an optical system 310, a sensor 320, and a readout system 330. In some cases, the readout system 330 can include a readout circuit 632 and an event validation circuit 610. Additionally, in some examples, the detection system 604 can include a detection control system 640 configured to control the readout circuit 632, the sensor 320, and/or the optical system 310 based at least in part on a feedback signal 622 received from the readout circuit 632 or the event validation circuit 610. In various implementations, the detection system 604 can omit one of the detection control system 640 or the event validation circuit 610. In some cases, the readout circuit 632 or the event validation circuit 610 can generate the feedback signals 622 based at least in part on the background signals 325. In some cases, a feedback signal can carry information usable for controlling the optical system 310, the sensor 320, and/or the readout circuit 632 in order to improve the signal-to-noise ratio of the sensor signals 323 generated by the sensor 320 and/or the return signals 120 generated by the readout circuit 632.
[0121] In some cases, a feedback signal can indicate a distribution of ratios between individual return signals and background signals associated with pixels of the sensor 320.

[0122] In some examples, the detection control system 640 can use the one or more feedback signals 622 to improve the SNR of the sensor signals 323 and/or the return signals 120 generated by the readout circuit, by controlling the optical system 310, the sensor 320, and/or the readout circuit 632.

[0123] In some implementations, the optical system 310 can direct a portion of light incident on an input aperture of the optical system 310 to the sensor 320. The portion of light directed to the sensor 320 can include light incident on the input aperture along directions within a field of view (FOV) of the optical system 310. In some cases, the optical system 310 directs a portion of light received via the input aperture to illuminate a portion of the sensor 320. In some cases, the detection control system 640 can control the FOV and/or the illuminated portion of the sensor. In some cases, the detection control system 640 can use the feedback signals 622 to identify a portion of the FOV from which the amount of background light received exceeds a threshold level, and dynamically adjust the FOV of the optical system 310 to reduce the amount of background light received. Alternatively or in addition, the detection control system 640 can adjust the optical system 310 such that the portion of light, received via the FOV, that includes a level of background light larger than a threshold level is not directed to the sensor (e.g., is filtered out). As such, the detection control system 640 can reduce the amount of background light reaching the sensor 320 by controlling the optical system 310 and thereby improve the SNR of the return signals. The event validation circuit 610 can determine the level of background light using the background signals 325.

[0124] The sensor 320 can generate a plurality of sensor signals 323 (e.g., electric signals, such as photovoltages, photocurrents, digital signals, etc.) and transmit the sensor signals to the readout system 330. In some cases, the readout circuit 632 of the readout system 330 generates feedback signals 622, return signals 120, and/or background signals 325, based at least in part on the sensor signals 323 received from the sensor 320. In some examples, a return signal can indicate reception of a reflected optical signal by the optical system 310, and a background signal can indicate background light (e.g., a magnitude of the background light) received by one or more pixels of the sensor 320 or one or more subpixels of a pixel of the one or more pixels.

[0125] In some cases, the readout circuit 632 can transmit the return signals 120 and the background signals 325 to the event validation circuit 610. In some examples, the event validation circuit 610 can generate one or more confidence signals 630 using the return signals 120 and the background signals 325. In some such cases, the lidar detection system 604 may not include the detection control system 640. In various implementations, a confidence signal can indicate a probability that a return signal generated by the readout circuit 632 is associated with an optical probe signal emitted by the lidar.
In some cases, a confidence signal can indicate a probability that a return signal generated by the readout circuit 632 is not associated with optical probe signals, and/or the corresponding reflected optical signals, emitted by another lidar or another source of light. In some examples, the confidence signal can indicate that, within a period during which the one or more return signals were received, the level of background light received by the sensor (e.g., a portion of the sensor that provides the sensor signals from which the return signals are generated) exceeded a threshold level (a predetermined level).

[0126] In some cases, the event validation circuit 610 can generate a confidence signal for one or more return signals based at least in part on a confidence ratio between a number of pixels that have received background light below the threshold level and a number of pixels that have received background light above the threshold level during the period that the return signal was generated. In some cases, the number of pixels can be determined based on a portion of the pixels that contribute to the generation of the return signal. In some cases, the event validation circuit 610 can use the background signals received from the pixels that contribute to the generation of the return signal to determine the confidence ratio.

[0127] In some cases, the event validation circuit 610 can generate a confidence signal for one or more return signals based at least in part on the detected background light, or the signal-to-noise ratio of the return signal.
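The confidence ratio described above can be pictured as a simple quiet-to-noisy pixel count over the contributing pixels. The sketch below is only an illustration of that ratio; the exact formula, names, and threshold are assumptions rather than the claimed computation.

```python
# Illustrative confidence ratio: count contributing pixels whose
# background stayed below the threshold (quiet) versus those above it
# (noisy) during the period the return signal was generated.

def confidence_ratio(background_levels, bg_threshold):
    quiet = sum(1 for bg in background_levels if bg < bg_threshold)
    noisy = len(background_levels) - quiet
    return quiet / max(noisy, 1)  # guard against division by zero

# Example: 5 of the 6 contributing pixels were quiet, so the detected
# event is assigned a relatively high confidence ratio.
print(confidence_ratio([0.1, 0.2, 0.1, 0.3, 2.4, 0.2], bg_threshold=1.0))  # 5.0
```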
[0128] In some examples, the return signals and the corresponding confidence signals can be generated for the sensor 320 or a pixel of the sensor 320 during a given measurement time interval. In some cases, the event validation circuit can first generate individual confidence signals for each detected event and use the individual confidence signals to generate an overall confidence signal for individual pixels during the given measurement time interval. In some cases, the event validation circuit can first generate individual confidence signals for the individual pixels or groups of pixels and use the individual confidence signals to generate an overall confidence signal for the sensor during the given measurement time interval. In some cases, the measurement time interval for which a confidence signal is generated can include one or more measurement time windows (herein referred to as "measurement windows"). The event validation circuit 610 can transmit the confidence signals 630 and, in some cases, the corresponding return signals 120, to the lidar signal processing system 106 for further processing, determination of the presence of an object in an environment scanned by the lidar, and calculation of the distance and/or the velocity of the object with respect to the lidar or another reference frame. In some cases, the lidar signal processing system 106 can receive the confidence signals 630 from the event validation circuit 610, and the return signals 120 from the readout circuit 632. In some examples, the lidar signal processing system 106 can use the confidence signals 630 to select a reliable event output from the event outputs generated by the event validation circuit 610, and then process the reliable event output to generate an output signal 124 (e.g., a distance between the lidar system 100 and the object 110 and/or a velocity of the object 110 with respect to the lidar system 100). In some examples, a confidence signal can indicate that the probability that a reliable event output is falsely generated is below a threshold probability.

[0129] In some cases, the readout circuit 632 can generate individual background signals for individual pixels, or a background signal for a group of pixels of the sensor 320. In some cases, the readout circuit can generate a background signal for a pixel using sensor signals generated by the pixel during one or more measurement time windows (measurement windows). In some cases, the readout circuit 632 can generate a return signal and a background signal using a sensor signal received from a pixel during the same measurement window, or during different measurement windows. In some cases, a background signal can indicate the amount of background light received by a pixel during a measurement window. In some cases, a pixel or subpixel (e.g., a reference pixel or a reference subpixel) of the sensor 320 can be dedicated to generation of a background signal. In some such cases, the readout circuit 632 can use a sensor signal received from the reference pixel or subpixel to generate a background signal for sensor signals generated by the sensor 320 (or one or more pixels of the sensor 320). As described above, light received by the reference pixel or subpixel can be filtered by an optical filter (e.g., a broadband filter) that transmits a spectral portion of light (e.g., sunlight) that is rejected by the bandpass filters included in other subpixels or pixels of the sensor 320.

[0130] The background signal and the sensor signals generated by the sensor 320 (or one or more pixels of the sensor 320) can be generated during the same measurement time interval or the same measurement window.

[0131] In some cases, the readout circuit 632 can generate background signals for a pixel using sensor signals generated by one or more subpixels of the pixel during one or more measurement windows. In some cases, the readout circuit can generate a return signal and a background signal using a sensor signal received from the subpixel during the same measurement window, or during different measurement windows. In some cases, individual background signals can be generated for individual pixels. In some cases, one or more subpixels (e.g., a reference subpixel) of a pixel can be dedicated to generation of a background signal.

[0132] In some implementations, the background signals can be used in various applications, including but not limited to adaptive control of the lidar detection system, improving the SNR of sensor signals, and mitigation of interference with other lidar systems.

[0133] In some cases, a reference sub-pixel or a reference pixel that is dedicated to measurement of background light can include a broadband optical filter that allows light having a wavelength different from the operating wavelength of the lidar to be detected by the subpixel. In some cases, the sensor signal generated by a reference sub-pixel can be measured at selected time windows that do not correspond to reception of reflections of optical probe signals. As described above, a reference sub-pixel can have an anode separate from the anodes of other subpixels of the pixel.

[0134] In some cases, the background signals 325 can be generated by measuring the sensor signals (e.g., output currents) during measurement windows that do not include a sensor signal variation associated with the reflection of an optical probe signal generated by the lidar.
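The window-based background measurement just described (and illustrated further in FIG. 7 below) amounts to averaging the sensor output over windows that contain no probe-return pulse. A minimal sketch of that averaging, with illustrative names and an assumed single return window:

```python
# Illustrative background estimation: average the per-window mean sensor
# output over the "quiet" measurement windows, excluding the window in
# which the return pulse was detected.

def estimate_background(window_means, return_window_idx):
    quiet = [m for i, m in enumerate(window_means)
             if i != return_window_idx]
    return sum(quiet) / len(quiet)

# Example: the return pulse lands in window 2; the remaining windows
# yield a background level of about 0.2 for this pixel.
print(estimate_background([0.2, 0.1, 3.0, 0.3, 0.2], return_window_idx=2))
# 0.2
```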
[0135] Advantageously, generating a background signal indicative of real-time or nearly real-time background noise separately for each pixel during a measurement time interval can be used to improve the accuracy of the return signals generated by the pixel.

[0136] In some cases, the readout circuit 632 can generate a background signal for a pixel of the sensor 320 based at least in part on the sensor signals generated by other pixels of the sensor 320. In some such cases, the other pixels can be pixels adjacent to the pixel for which the background signal is generated.

[0137] In some examples, a background signal indicates the amount of background light received by a pixel or a subpixel within a time frame during which a reflected optical signal is received by the pixel or the subpixel.

[0138] In some cases, the feedback signals 622 can include at least a portion of the background signals 325. In some cases, the readout circuit can first generate the background signals 325 and then generate the feedback signals 622 using the background signals 325. For example, the readout circuit can use the background signals 325 to identify one or more pixels of the sensor 320 that each generate a background signal having a magnitude larger than a threshold (e.g., a threshold magnitude) and generate a feedback signal (e.g., indicative of pixel coordinates, pixel locations, or pixel identifiers) that can be used by the detection control system 640 to modify a configuration of the optical system 310, the sensor 320, and/or the readout system 330. In some cases, the detection control system 640 can improve the SNR of the return signals 120 by turning off the pixels that have generated a background signal larger than a threshold value. In some cases, the detection control system 640 can improve the SNR of the return signals 120 by reducing the contribution of the pixels that have generated a background signal larger than a threshold value and/or increasing the contribution of the pixels that generate a background signal lower than a threshold value. In some such cases, the feedback signal can include information usable for identifying the pixels that generate a background signal above or below the threshold value.

[0139] In some cases, the feedback signals 622 can be generated by the event validation circuit based at least in part on the confidence signals 630. For example, the event validation circuit 610 can identify one or more pixels of the sensor 320 that have a larger contribution to reducing the probability that a return signal is associated with an optical probe signal emitted by the lidar. As another example, the event validation circuit 610 can identify one or more pixels of the sensor 320 that, with a high probability, receive light associated with interference signals (e.g., ambient light or light from another lidar system).

[0140] In some implementations, in addition to the feedback signals 622, the detection control system 640 can receive the return signals 120 from the readout circuit 632 and control the detection system based at least in part on the return signals 120.

[0141] In some cases, the event validation circuit can generate an event signal 650 indicative of an event detected by the lidar detection system 604.

[0142] In some implementations, the readout circuit 632 can include a channel readout circuit that generates the return signals 120 and a background monitor circuit that generates the background signals 325.
[0143] In some implementations, the readout circuit 632 can generate the return signals 120 and the background signals 325 by measuring the sensor signals received during a measurement time interval. In some cases, the measurement time interval can include a plurality of measurement windows during which the sensor signal is analyzed to find a temporal variation of the sensor signal amplitude that can be associated with a reflected optical signal. In some cases, the readout circuit can use one of the methods described above with respect to FIGS. 5A and 5B to analyze and measure the sensor signals received during a measurement window, search for a peak, and determine the corresponding peak amplitude level and/or peak time. In some cases, during an analysis, the readout circuit 632 can use one or more threshold levels to identify and/or estimate the peak and/or the corresponding peak amplitude level and/or peak time. In some cases, the readout circuit 632 can use a single measurement window during a measurement time interval. In some cases, the readout circuit 632 can use two or more measurement windows during a measurement time interval. In some such cases, the readout circuit 632 can change the measurement window over one or more measurement time intervals to identify a portion of the sensor signal or pixel output associated with background light received by the corresponding pixel or subpixel (one or more microcells), and generate a background signal indicative of a magnitude of the background light. In some cases, the readout circuit 632 can select the measurement window during which the magnitude of the background light is measured based on one or more measurement windows during which a return signal has been generated. For example, the readout circuit 632 can measure the magnitude of the background light during a measurement window that is delayed by a set delay time with respect to a measurement window during which a return signal has been generated.

[0144] In some cases, the detection control system 640 can use the feedback signals 622 to adjust one or more parameters of the readout circuit 632 to improve the SNR of the return signals 120 generated by the readout circuit 632. In some cases, the detection control system 640 can reduce the contribution of background noise in the sensor signals used to generate return signals by selecting (using the feedback signal) a subset of pixels used for generating a return signal and reducing the contribution of the noisy pixels when generating the subsequent return signals. In some examples, noisy pixels can comprise a subset of pixels that generate a level of background noise greater than a threshold noise. For example, the detection control system 640 can use the feedback signal generated in a first measurement time interval, during which a first return signal is generated, to identify pixels or sub-pixels of the subset of pixels that generate background signals having magnitudes larger than a threshold level, and adjust the readout circuit 632 to reduce the contribution of the sensor signals received from the identified pixels or sub-pixels (herein referred to as noisy pixels or noisy sub-pixels) in generation of a second return signal during a second measurement time interval after the first measurement time interval. In some cases, reducing the contribution of one or more pixels or sub-pixels in generation of the second return signal can include not using the sensor signals generated by these pixels for generating the second return signal.
In some implementations, the first and second measurement time intervals can be subsequent time intervals. In some examples, the second return signal can be a subsequent return signal generated less than 1 picosecond, less than 1 nanosecond, less than 1 microsecond, or less than 1 millisecond after the first return signal.

[0145] In some cases, the detection control system 640 can use the feedback signals 622 to adjust one or more parameters of the readout circuit 632 to improve the detection probability of the object while reducing or maintaining the FAR of the lidar detection system 604 with respect to a reference FAR level. In some such cases, the readout circuit 632 can increase the probability of detection (PoD) by dynamically adjusting a readout threshold of the readout circuit 632. For example, when, during one or more measurement time intervals, the background signals associated with a pixel are larger than a threshold value, the detection control system 640 can increase the readout threshold for that pixel. In some cases, increasing the readout threshold for that pixel can reduce the probability that background noise generated by the pixel is identified as a sensor signal associated with a reflection of an optical probe signal emitted by the lidar.

[0146] In some cases, the readout threshold can include a sensor signal threshold level (also referred to as a readout threshold level) used to distinguish a portion of a sensor signal generated by the reflected light associated with an optical probe signal emitted by the lidar, from a portion of the sensor signal generated by the background light.

[0147] In some cases, the FAR (false alarm rate) of the lidar detection system 604 can include a rate of generation of return signals that are not associated with a reflection of an optical probe signal emitted by the lidar.
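The background-driven threshold adjustment described in the preceding paragraphs can be sketched as a simple per-pixel rule; the scaling constant and the max-with-a-floor policy below are illustrative assumptions, not the claimed control law.

```python
# Illustrative per-pixel readout-threshold adaptation: raise the
# threshold with the measured background to hold the false alarm rate
# down, and let it fall back to a base level when the pixel is quiet,
# preserving detection probability for weak returns.

def adapt_readout_threshold(background_level, base_threshold=1.0, k=3.0):
    """Keep the threshold a margin (k x background) above the noise
    floor, but never below the base threshold."""
    return max(base_threshold, k * background_level)

for bg in (0.1, 0.5, 2.0):
    print(bg, "->", adapt_readout_threshold(bg))
# 0.1 -> 1.0  (quiet pixel keeps the base threshold)
# 0.5 -> 1.5  (threshold tracks the rising background)
# 2.0 -> 6.0
```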
[0148] In various implementations, the readout circuit 632, the detection control system 640, and the event validation circuit 610 can be programmable circuits and can include a memory and one or more processors configured to execute computer-executable instructions stored in the memory. Accordingly, the various functions, procedures, and parameter values (e.g., various threshold values) described above with respect to generating confidence signals and dynamically controlling the lidar detection system to improve the SNR of the return signals 120 and/or reduce the FAR of the lidar detection system 604, can be implemented and controlled by modifying the computer-executable instructions executed by the different circuits and systems. For example, the detection control system can be programmed to control the readout circuit 632, the sensor 320, and/or the optical system 310 based on threshold values stored in a memory of the detection control system 640. In some embodiments, the readout circuit 632, the detection control system 640, and the event validation circuit 610 can include one or more components of the device 1500 described with respect to FIG. 15. For example, a processor and a memory of the readout circuit 632, the detection control system 640, or the event validation circuit 610 can include the memory 1506 and the processor 1504.

[0149] In some implementations, the readout circuit 632 can use a plurality of background signals to generate a sensor background signal and use a plurality of return signals to generate a return signal. In some examples, increasing the probability of detection of the lidar system can include decreasing the FAR. In some cases, improving or increasing the SNR of the return signals 120 and/or sensor signals 323 can include increasing a ratio between the return signal and the sensor background signal. In some examples, the sensor background signal can include a sum of the plurality of background signals, and the return signal can include a sum of the plurality of return signals.

[0150] FIG. 7 is a diagram illustrating a sensor signal received from a pixel during a measurement time interval 710, and background light measurement based on multiple measurement windows. To facilitate the description, the signal component 510 and the background (noise) component 530 of the sensor signal are plotted separately. The sensor signal can be an analog signal (e.g., an analog current) or a digital signal (e.g., a digital voltage or current level). In the example shown, the measurement time interval 710 is divided into several measurement windows, where during one or more of the measurement windows (e.g., measurement window 722) a signal peak is detected. In some cases, light received during the measurement windows 718, 720, 724, 726, 728, and/or 730 can be measured to generate a background signal indicative of an amplitude of the background component 530. In some cases, the background signal generated based on a sensor signal received during the measurement windows 718 or 726-730 can be used as a background signal during the measurement time interval shown in FIG. 7. In some cases, the background signal can be generated using the portions of the sensor output received during the measurement windows 718 or 726-730 by calculating an average of the corresponding signals. In some cases, the background signal generated during the measurement time interval shown in FIG. 7 can indicate a magnitude (e.g., an average magnitude) of background light having a nearly constant or slowly varying power or intensity (e.g., sunlight, or light generated by another source).

[0151] In some cases, the background signal generated for a pixel based on a measurement time interval can be used to reduce the background signals generated for the pixel during subsequent measurement time intervals or time windows. In some cases, the readout circuit 632 can generate a background signal based on the background signals generated for one or more pixels of the sensor 320 during a measurement time interval or time window, and use the background signal to adjust the sensor 320, the optical system 310, or the readout circuit 632 in real time to reduce background signals in subsequent measurement time intervals.

[0152] In some cases, the readout circuit 632 can generate a feedback signal based on a first background signal generated for the sensor 320 using sensor signals received during a first measurement time interval. In some such cases, the detection control system 640 can use the first background signal to control the sensor 320 and/or the optical system 310 such that a second background signal, generated for the sensor 320 using sensor signals received during a second measurement time interval after the first measurement time interval, is smaller than the first background signal. As such, the sensor signals and a corresponding second return signal generated during the second time interval can have a larger signal-to-noise ratio compared to the sensor signals and a corresponding first return signal generated during the first time interval.
In some examples, the second return signal can be a subsequent return signal generated less than 1 picosecond, less than 1 nanosecond, less than 1 microsecond, or less than 1 millisecond after the first return signal. In some cases, the second measurement time interval can be a subsequent measurement time interval immediately after the first measurement time interval. For example, the detection control system 640 can use the feedback signal to determine which pixels of the sensor 320 had a larger contribution to the first background signal, and turn off those pixels in the second measurement time interval. In some cases, a larger contribution to a background signal generated for a sensor can be identified by determining that the background noise (the sensor signal associated with background light) received from a pixel of the sensor is larger than a threshold value. In some cases, the threshold value can be a percentage of the background signal (e.g., 5%, 10%, 30%, 50%, 80%, or larger).

[0153] Alternatively or in addition, the detection control system 640 can use the first background signal to control the readout circuit 632 such that return signals generated from sensor signals received during a second measurement time interval have larger signal-to-noise ratios. For example, the detection control system 640 can use the feedback signal to determine the first background signal and the threshold current level (also referred to as the event validation level) used by the readout circuit 632 during the first measurement time interval, and adjust the threshold current level during the second measurement time interval to reduce the second background signal. In some examples, the second time interval can be a subsequent time interval after the first time interval.

[0154] In some embodiments, the background signal and the feedback signal can be used to control the readout circuit 632 (e.g., by controlling a threshold level), the sensor 320 (e.g., by activating or deactivating pixels, or changing a pixel group size), and/or the optical system 310 (e.g., by changing the field of view).

[0155] In some cases, the readout circuit 632 can generate a feedback signal based on a real-time background signal to control the sensor 320 and/or the optical system 310. For example, the detection control system 640 can use the feedback signal to determine which pixels of the sensor 320 have a larger contribution to the background signal, and turn off the pixel outputs (via the sensor 320) or reduce the optical transmission (via the optical system 310) for those pixels. In some cases, a larger contribution to a background signal from one or more pixels can be identified by determining that the background noise received from these pixels is larger than a threshold value.

[0156] In some embodiments, the detection control system 640 can use the background signal to control the readout circuit 632 such that the signal readout and event output have a higher probability of detection. For example, the detection control system 640 can use the feedback signal to determine the threshold level used by the readout circuit 632 and adjust the threshold level of event determination to maintain a reasonable FAR.
In various implementations, individual background signals that indicate the intensity and temporal profile of background light received by individual pixels of the sensor 320 can be used by the event validation circuit 610 to generate a confidence signal for the return signals generated by the readout circuit 632 during the measurement time interval associated with the background signals.

Example lidar system with feedback control

[0157] FIG. 8 illustrates an example lidar detection system 800. In some cases, the lidar detection system 800 can be an embodiment of the lidar detection system 604 having a detection control system 640. In the example shown, the detection control system 640 includes a readout circuit controller 822 that controls the readout system 330, a sensor controller 824 that controls the sensor 320, and an optical system controller 826 that controls the optical system 310. As described with respect to FIG. 6, the detection control system 640 can receive feedback signals 622 from the readout system 330 and use the feedback signals to dynamically control the lidar detection system 800 to reduce the contribution of the background light in the return signals generated by the readout system 330.

[0158] In some cases, the detection control system 640 can control the readout system 330 by generating readout system control signals 823 and transmitting them to the readout system 330. The detection control system 640 can generate the readout system control signals 823 using one or more feedback signals 622 received from the readout system 330. In some cases, the readout system control signals 823 can include commands and instructions usable for selecting and controlling the pixels and/or subpixels of the sensor 320 from which the return signals are generated. In some examples, the feedback signals 622 can be associated with an initial measurement time interval, and the detection control system 640 can generate readout system control signals 823 that reconfigure the readout system 330 to improve the signal-to-noise ratio and/or reliability of the return signals generated by the readout system 330 during one or more measurement time intervals after the initial measurement time interval. In some cases, reconfiguration of the readout system 330 can include adjusting the contribution of individual sensor signals generated by individual pixels or sub-pixels to the return signals. For example, the readout system control signals 823 can change a weight factor of the sensor signal generated by a pixel or a sub-pixel, in a procedure that generates a return signal using a weighted sum of the sensor signals 323 (a sketch of this weighting follows below).
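A minimal sketch of that weighted-sum reconfiguration; the weight policy (zeroing flagged pixels) and names are illustrative assumptions, not the claimed procedure.

```python
# Illustrative weighted-sum readout: the return signal is a weighted sum
# of per-pixel sensor signals, and the control signals simply derate the
# weights of pixels flagged as noisy by the feedback signals.

def weighted_return_signal(sensor_signals, weights):
    return sum(s * w for s, w in zip(sensor_signals, weights))

def derate_noisy_pixels(weights, noisy_idx, derate=0.0):
    """Return a new weight vector with flagged pixels derated;
    derate=0.0 removes their contribution entirely."""
    return [derate if i in noisy_idx else w for i, w in enumerate(weights)]

signals = [2.0, 2.5, 8.0]            # pixel 2 dominated by background light
weights = derate_noisy_pixels([1.0, 1.0, 1.0], noisy_idx={2})
print(weighted_return_signal(signals, weights))  # 4.5
```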
In some cases, reconfiguration of the sensor 320 can include adjusting the bias voltage applied to a pixel or a subpixel of the pixel, or turning off a pixel or a subpixel of the pixel. In some examples, a measurement time interval after the first measurement time interval can be a subsequent measurement time interval immediately after the first measurement time interval. [0160] In some cases, the detection control system 640 can generate the sensor control signals 825 using one or more feedback signals 622 received from the readout system 330. In some examples, the feedback signals 622 can be associated with a real-time measurement and the detection control system 640 can generate sensor control signals 825 that reconfigure the sensor 320 to select pixels or subpixels for the next (e.g., subsequent) measurement(s). In some cases, reconfiguration of the sensor 320 can include adjusting the bias voltage applied to a pixel or individual sub-pixels of the pixel, or turning off a pixel or a subpixel of the pixel. [0161] In some cases, the detection control system 640 can control the optical system 310 by generating optical system control signals 827 and transmitting them to the optical system 310. The detection control system 640 can generate the optical system control signals 827 using one or more feedback signals 622 received from the readout system 330. In some examples, the feedback signals 622 can be associated with a first measurement time interval and the detection control system 640 can generate optical system control signals 827 that reconfigure the optical system 310 to improve the signal-to-noise ratio and/or reliability of the return signals generated by the readout system 330 during one or more measurement time intervals after the first measurement time interval. In some cases, reconfiguration of the optical system 310 can include adjusting one or more optical elements of the optical system 310 to reduce an amount of background light directed to at least a portion of the pixels of the sensor 320 (e.g., by reducing a collection FOV). In some cases, the optical system control signals 827 can adjust the orientation of one or more mirrors (e.g., micromirrors), or the focal length or position of one or more lenses (e.g., microlenses), to select and redirect a portion of light received from the environment (e.g., a portion that includes a lower level of background light). For example, with continued reference to FIG. 8, in a first measurement time interval the optical system 310 can direct received light 305 within an FOV of the optical system 310 to illuminate nearly all pixels of the sensor 320. In some examples, the optical system 310 can transform the received light 305 into a sensor beam 315 (e.g., a convergent beam of light) that illuminates nearly all pixels of the sensor 320. In some implementations, the detection control system 640 can use the feedback signals 622 generated by the readout system 330 during the first measurement time interval to reconfigure the optical system 310 such that, during a second measurement time interval, a selected portion 832 of the received light 305, received via a portion of the FOV, illuminates a selected portion of the pixels of the sensor 320. In some examples, after reconfiguration, the optical system 310 can transform the received light 305 into a modified output beam of light 830 to illuminate the selected portion of the pixels of the sensor 320.
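As a concrete illustration of the per-pixel reconfiguration in paragraphs [0159]-[0161], the Python sketch below flags pixels whose share of the total background signal exceeds a fractional threshold, as candidates to be turned off or de-biased in the next measurement time interval. The 30% threshold echoes the example percentages given earlier in this document, but the data structure and function name are assumptions made for illustration.

    def select_pixels_to_disable(background_per_pixel, fraction=0.3):
        """Return the pixels whose contribution to the total background
        signal exceeds `fraction`; such pixels are candidates to be
        turned off (e.g., by dropping their bias voltage to near zero)
        during the next measurement time interval."""
        total = sum(background_per_pixel.values())
        if total <= 0.0:
            return set()
        return {pix for pix, bg in background_per_pixel.items()
                if bg / total > fraction}

    background = {"p0": 2.0, "p1": 0.3, "p2": 9.5, "p3": 0.4}
    print(select_pixels_to_disable(background))  # {'p2'} dominates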
[0162] In some examples, the feedback signals 622 can be associated with real-time measurements and the detection control system 640 can generate optical system control signals 827 that reconfigure the optical system 310 to control the optical collection path of each pixel or subpixel. In some cases, reconfiguration of the optical system 310 can include adjusting one or more optical elements of the optical system 310 to reduce the collection FOV, and thereby the amount of background light, directed to at least a portion of the pixels of the sensor 320. In some cases, the optical system control signals 827 can adjust the orientation of a number of micromirrors, the focal length of one or more lenses, or the lens positions, to select and redirect a portion of light received from the environment. For example, the optical system 310 can be capable of directing received light 305 within an FOV of the optical system 310 to illuminate nearly all pixels of the sensor 320. In some examples, the optical system 310 can transform the received light 305 into an output sensor beam 315 (e.g., a convergent beam of light) that illuminates nearly all pixels of the sensor 320. In some implementations, the detection control system 640 can use the feedback signals 622 generated by the readout system 330 to reconfigure the optical system 310 such that a selected portion 832 of the FOV is transformed by the reconfigured optical system 310 into a modified output beam of light 830 that illuminates the selected portion of the pixels of the sensor 320. [0163] In some implementations, the optical system 310 can include a spatial optical filter that does not allow light beams that do not propagate along a specified direction to reach the sensor 320. In some cases, the specified direction can be substantially parallel to an optical axis of the optical system 310. In some cases, the spatial optical filter can be a reconfigurable spatial optical filter that allows changing the specified direction using control signals. In some cases, the detection control system 640 can generate one or more optical system control signals 827 to change the specified direction of a spatial optical filter in the optical system 310 to reduce the magnitude of the background light directed toward the sensor 320. For example, the readout system 330 can use the sensor signals 323 received from the sensor 320 to identify a direction from which a larger portion of the background light is received by the optical system 310 compared to other directions associated with the FOV of the optical system 310. In some cases, the readout system 330 can generate a feedback signal indicating the identified direction, and the detection control system 640 can receive the feedback signal and adjust the specified direction of the spatial optical filter to modify the portion of the received light 305 that propagates along the specified direction. In some cases, a spatial optical filter can be used to reduce or eliminate interference between the reflected optical signals associated with the optical probe signals emitted by a lidar and the optical signals associated with other lidars. For example, the spatial optical filters can be configured to block or absorb at least a portion of the light beams that are emitted by other lidars.
In some such cases, the readout system 330 can generate a feedback signal indicating the directions associated with light received from other lidars, and the detection control system 640 can use the feedback signal to adjust the specified direction of the spatial optical filter to block at least a portion of the light beams emitted by other lidars so they cannot reach the sensor 320. [0164] In some implementations, the lidar system 800 shown in FIG. 8 can generate a confidence signal for one or more return signals generated during a measurement time interval, where a confidence signal indicates the probability of the corresponding return signal being associated with a reflection of an optical probe signal emitted by the lidar system 800. For example, similar to the lidar system 604, the readout system 330 of the lidar system 800 can generate the confidence signal using the event validation circuit 610 therein and based on the background signals 325. [0165] In some examples, the feedback signals 622 can be generated based on real-time measurement or evaluation of the return signals 120, the sensor signals 323, and/or the background signals 325. Subsequently, the detection control system 640 can generate sensor control signals 825, readout system control signals 823, and/or optical system control signals 827, to reconfigure or adjust the readout system 330, the sensor 320, and/or the optical system 310 to increase the signal-to-noise ratio (SNR) of the sensor signals and/or return signals generated after the generation of the feedback signal. In some examples, the delay between generation of the feedback signal and the resulting improvement of the SNR can be less than 1 picosecond, less than 1 nanosecond, less than 1 microsecond, or less than 1 millisecond. As such, the detection control system 640 can effectively provide real-time or near real-time improvement of the SNR and the probability of true event detection for the lidar detection system 604. [0166] FIG. 9 illustrates an example spatial optical filter 900 that rejects light beams that do not propagate in a direction parallel to an optical axis 912 of the spatial optical filter 900. In some cases, the optical axis 912 of the spatial optical filter 900 can be parallel to an optical axis of the optical system 310. In some cases, the optical axis 912 of the spatial optical filter 900 can overlap with an axis of symmetry of the FOV of the optical system 310. The spatial optical filter 900 includes a first lens 902 (e.g., an input lens), an aperture 904, and a second lens 906 (e.g., an output lens). The aperture 904 can be an opening in an opaque screen 905. In some examples, on-axis optical rays (or beams) 907 and 908 that propagate in a direction parallel to the optical axis 912 can be redirected by the first lens 902 such that they pass through the aperture 904. After passing through the aperture 904, the on-axis optical rays 907 and 908 can be redirected again by the second lens 906 toward the sensor 320. In some examples, off-axis optical rays (or beams) 909 and 910 that propagate along directions different from the optical axis 912 can be redirected by the first lens 902 such that they become incident on the screen 905 and are absorbed or reflected by the screen 905. As such, the optical rays 909 and 910 cannot reach the sensor 320. In some cases, the position of the aperture 904 can be controlled by the detection control system 640; a simple numerical model of this filter is sketched below.
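A thin-lens model makes the behavior of the spatial optical filter 900 easy to check numerically. In this sketch, a collimated beam arriving at angle theta is focused by the first lens 902 to a spot displaced by f*tan(theta) in the focal plane, where the aperture 904 sits; shifting the aperture steers the accepted direction, as described above. The focal length, aperture radius, and example angles are illustrative assumptions, not values from this document.

    import math

    def passes_aperture(theta_rad: float, focal_len_mm: float,
                        aperture_radius_mm: float,
                        aperture_offset_mm: float = 0.0) -> bool:
        """Thin-lens model of FIG. 9: a collimated beam at angle theta
        focuses to a spot displaced by f*tan(theta) in the focal plane.
        The ray bundle reaches the sensor only if that spot falls inside
        the (possibly offset) aperture; otherwise the screen 905 blocks it."""
        spot_mm = focal_len_mm * math.tan(theta_rad)
        return abs(spot_mm - aperture_offset_mm) <= aperture_radius_mm

    f_mm, r_mm = 25.0, 0.05  # e.g., a 25 mm lens and a 50 um pinhole
    for deg in (0.0, 0.05, 0.2, 1.0):  # on-axis and off-axis rays
        print(deg, passes_aperture(math.radians(deg), f_mm, r_mm))

With these example numbers the acceptance half-angle is atan(r/f), roughly 0.11 degrees, so the 0.2-degree and 1.0-degree rays are rejected, mirroring rays 909 and 910 in FIG. 9.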
In some such cases, the detection control system 640 controls the position of the aperture 904 using an electromechanical or micro-electromechanical system integrated with the screen 905. In various implementations, the first lens 902, the second lens 906, the aperture 904, and the electromechanical system can be on-chip components or components integrated together. [0167] In some cases, the optical system 310 can include the spatial optical filter 900. In some cases, a spatial optical filter used in the optical system 310 can include one or more features described with respect to the spatial optical filter 900. In some cases, the optical system 310 can include two or more spatial optical filters that filter light beams according to the principles described above with respect to the spatial optical filter 900. [0168] In various implementations, a fixed or dynamically controlled spatial optical filter used in a lidar detection system can improve the signal-to-noise ratio of the sensor signals and return signals generated by the lidar detection system. In some cases, a fixed or dynamically controlled spatial optical filter used in a lidar detection system can reduce the amount of background light reaching the sensor, the false alarm rate of the lidar, and the interference or probability of interference with other lidar systems. [0169] In some cases, a lidar that uses a lidar detection system that includes one or more features described with respect to the lidar detection system 604 can use one of the signal coding methods described with respect to FIG. 1C to reduce the probability of generating return signals associated with light emitted by other lidar systems. In some such cases, generation of confidence signals can further improve the performance of a system that uses the return signals generated by the lidar. For example, when the readout system fails to identify sensor signal variations associated with light received from other lidars and generates false return signals, the corresponding confidence signals generated for the return signals (e.g., within the measurement time intervals) can be used to reduce the effect of the false return signals. [0170] As described above, the readout circuit 632, the detection control system 640, and the event validation circuit 610 can include a memory and a processor configured to execute computer-executable instructions stored in the memory. In some examples, a memory or a processor of the readout circuit 632, the detection control system 640, or the event validation circuit 610 can include the memory 1506 and the processor 1504 of the device 1500 described below (FIG. 15). In some cases, the processors can execute a program to implement a routine or process designed to improve the SNR of the return signals 120, increase the probability of detection, and/or reduce the false alarm rate (FAR) of the lidar detection system 604. In some cases, a processor of the readout circuit 632, the detection control system 640, or the event validation circuit 610 can execute programs or implement routines that are the same as, or similar to, those stored in the memory 1506 or the storage device 1508 and described below with respect to the device 1500 in FIG. 15. [0171] FIG. 10 is a flow diagram illustrating an example of a process or routine 1000 implemented by one or more processors of the readout circuit 632 to generate the return signals 120 and the background signals 325. In some cases, the readout circuit 632 can receive sensor signals from the sensor 320 and measure the received sensor signals.
In some cases, the sensor signals can be generated continuously. In some examples, the received sensor signals can include sensor signals from each pixel of the sensor 320. In some examples, the readout circuit 632 can divide the measurement time interval into multiple time windows and measure the sensor signals received during each time window separately. [0172] At block 1002, the readout circuit 632 receives and measures sensor signals. [0173] At block 1004, the readout circuit 632 generates background signals using at least a portion of the measured sensor signals. In some cases, the readout system generates a return signal using a first portion of a sensor signal and a background signal using a second portion of the sensor signal different from the first portion, where the first and second portions of the sensor signal are received during two different measurement time windows (e.g., two non-overlapping time windows). In some cases, the readout circuit 632 generates a background signal for a pixel using sensor signals generated by the pixel during two or more measurement time windows. In some cases, the readout circuit 632 generates a background signal for a pixel using sensor signals generated by other pixels of the sensor 320. [0174] At block 1006, the readout system 330 generates a feedback signal using at least a portion of the background signals. In some cases, the feedback signal can be a value determined by background signals generated based on sensor signals measured during multiple time intervals. [0175] At block 1008, the readout system 330 transmits the feedback signal to the detection control system 640. In some cases, the feedback signal can be different for different measurement time windows. The detection control system 640 uses the feedback signal to adjust at least one of the readout circuit 632, the sensor 320, or the optical system 310. [0176] At block 1010, the readout circuit 632 transmits the return signal and the background signal to the event validation circuit 610. In some examples, the operations at block 1008 and block 1010 can be performed substantially at the same time or sequentially (e.g., with a delay). In some cases, the readout system 330 does not have an event validation circuit 610 or a detection control system 640. In some such cases, the readout circuit 632 skips block 1008 or block 1010. [0177] FIG. 11 is a flow diagram illustrating an example of a process or routine 1100 implemented by one or more processors of the detection control system 640 to reduce the FAR of the lidar detection system 604 by controlling the optical system 310, the sensor 320, and/or the readout circuit 632. In some cases, the detection control system 640 controls the optical system 310, the sensor 320, and/or the readout circuit 632 to increase the SNR of the return signals 120 and/or reduce the magnitude of one or more of the background signals 325. [0178] At block 1102, the detection control system 640 receives feedback signals from the readout circuit 632. In some cases, the detection control system 640 uses the feedback signals to dynamically control the optical system 310 and the sensor 320 and not the readout circuit 632. In these cases, the process moves to block 1104. In some cases, the detection control system 640 uses the feedback signals to dynamically control the readout circuit 632 and not the optical system 310 and the sensor 320. In these cases, the process moves to block 1112.
In other cases, the detection control system 640 dynamically controls the readout circuit 632, the optical system 310, and the sensor 320. In some such cases, the process moves to blocks 1104 and 1112 substantially at the same time or at different times. In some cases, the detection control system 640 sequentially adjusts the optical system 310, the sensor 320, and the readout circuit 632, in different orders and with different delays between adjustments. [0179] At block 1104, the detection control system 640 uses the information included in the feedback signal to identify one or more noisy pixels. In some examples, noisy pixels can comprise a subset of pixels that generate a level of background noise greater than a threshold noise level. In some examples, the detection control system 640 identifies a noisy pixel by comparing the background signal associated with the noisy pixel with a threshold level and determining that the magnitude of the background signal is larger than the threshold level. In some cases, the detection control system 640 determines the threshold level based at least in part on the background signals associated with other pixels of the sensor 320. In some cases, the threshold level can be a fixed value stored in a memory of the detection control system 640 or the lidar. In some cases, where the detection control system 640 controls both the optical system 310 and the sensor 320, the process moves to blocks 1106 and 1108 substantially at the same time or at different times. In some cases, the detection control system 640 controls the optical system 310 and not the sensor 320. In these cases, the process moves to block 1108. In some cases, the detection control system 640 controls the sensor 320 and not the optical system 310. In these cases, the process moves to block 1106. [0180] At block 1106, the detection control system 640 adjusts the bias voltage of all or a portion of the identified noisy pixels to improve the SNR of the return signals that are generated based at least in part on the noisy pixels. In some cases, the detection control system 640 turns off some of the noisy pixels (for example, by reducing the bias voltage to zero or close to zero). [0181] At block 1108, the detection control system 640 identifies a portion of the FOV of the optical system 310 from which light is directed to the noisy pixels. [0182] At block 1110, the detection control system 640 changes or adjusts the FOV to reduce the amount of light directed to the sensor from directions associated with the identified portion of the original FOV. In some cases, the detection control system 640 changes or adjusts the FOV using electromechanically controllable optical elements (e.g., micromirrors and/or microlenses). [0183] In some cases, where the optical system 310 includes a reconfigurable spatial optical filter, at block 1108 the detection control system 640 identifies a direction along which a portion of the light directed to the noisy pixel is received from the environment, and at block 1110, the detection control system 640 adjusts the reconfigurable spatial optical filter to block a portion of the light received from the environment along the identified direction, to reduce the amount of background light directed from the environment to the sensor 320. [0184] At block 1112, the detection control system 640 uses the feedback signals to determine a first background signal and an initial readout threshold level for a first pixel; the threshold update performed in the following blocks is sketched below.
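Blocks 1112 through 1118, which the next paragraphs describe, amount to a simple per-pixel rule: leave the readout threshold alone when the background is quiet, and raise it when the background exceeds a noise-magnitude limit. The Python sketch below illustrates one possible form of that rule; the additive, background-proportional update step is an assumption introduced here, since the document does not specify by how much the threshold is raised.

    def update_readout_threshold(initial_threshold: float,
                                 background_magnitude: float,
                                 noise_magnitude_limit: float,
                                 step: float = 0.1) -> float:
        """Per-pixel decision of blocks 1114-1118: if the measured
        background signal exceeds the noise-magnitude limit, raise the
        pixel's readout threshold to suppress false return signals
        (block 1118); otherwise leave it unchanged (block 1116)."""
        if background_magnitude > noise_magnitude_limit:
            return initial_threshold + step * background_magnitude
        return initial_threshold

    print(update_readout_threshold(1.0, 0.4, 0.5))  # quiet pixel: 1.0
    print(update_readout_threshold(1.0, 2.0, 0.5))  # noisy pixel: 1.2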
In some cases, a readout threshold level can be a threshold value of the sensor signal generated by the first pixel, below which the sensor signal is considered to be associated with background light and not the reflection of an optical probe signal emitted by the lidar. [0185] At the decision block 1114, the detection control system 640 determines whether the magnitude of the background signal is larger than a threshold noise magnitude. If the detection control system 640 determines that the magnitude of the background signal is smaller than the threshold noise magnitude, the process moves to block 1116 and the detection control system 640 does not change the initial readout threshold level for the first pixel. If the detection control system 640 determines that the magnitude of the background signal is larger than the threshold noise magnitude, the process moves to block 1118. [0186] At block 1118, the detection control system 640 increases the initial readout threshold level for the first pixel to reduce the probability of generation of false return signals based on the sensor signals generated by the first pixel, thereby reducing the FAR of the lidar detection system 604. [0187] FIG. 12 is a flow diagram illustrating an example of a process or routine 1200 implemented by one or more processors of the event validation circuit 610 for generating a confidence signal. [0188] At block 1202, the event validation circuit 610 receives a return signal and background signals during a measurement time interval. [0189] At block 1204, the event validation circuit 610 identifies the pixels that contributed to the generation of the return signal. [0190] At block 1206, the event validation circuit 610 determines the background signals associated with the identified pixels. [0191] At block 1208, the event validation circuit 610 generates a confidence signal for the return signal, where the confidence signal can be a value or multiple values associated with the background signal levels, interference conditions (e.g., how many return signals are detected), the internal noise signal amplitude, and/or the current FAR. In some cases, these levels can be values stored in a memory of the event validation circuit 610. [0192] The flow diagrams illustrated in FIGs. 10, 11, and 12 are provided for illustrative purposes only. It will be understood that one or more of the steps of the routines illustrated in FIGs. 10, 11, and 12 can be removed or that the ordering of the steps can be changed, unless it is otherwise explicitly stated that the ordering cannot change. Furthermore, for the purposes of illustrating a clear example, one or more particular system components are described in the context of performing various operations during each of the data flow stages. However, other system arrangements and distributions of the processing steps across system components can be used. [0193] In some aspects and/or embodiments, the devices and methods described above can be used in a lidar sensor of an autonomous system included in a vehicle, to improve the autonomous driving capability of the vehicle by reducing the probability of false alarm generation by the lidar sensor (e.g., false alarms associated with indirect light received by the lidar detection system). [0194] Referring now to FIG. 13, illustrated is example environment 1300 in which vehicles that include autonomous systems, as well as vehicles that do not, are operated.
As illustrated, environment 1300 includes vehicles 1302a–1302n, objects 1304a–1304n, routes 1306a–1306n, area 1308, vehicle-to-infrastructure (V2I) device 1310, network 1312, remote autonomous vehicle (AV) system 1314, fleet management system 1316, and V2I system 1318. Vehicles 1302a–1302n, vehicle-to-infrastructure (V2I) device 1310, network 1312, remote AV system 1314, fleet management system 1316, and V2I system 1318 interconnect (e.g., establish a connection to communicate and/or the like) via wired connections, wireless connections, or a combination of wired or wireless connections. In some embodiments, objects 1304a–1304n interconnect with at least one of vehicles 1302a–1302n, vehicle-to-infrastructure (V2I) device 1310, network 1312, remote AV system 1314, fleet management system 1316, and V2I system 1318 via wired connections, wireless connections, or a combination of wired or wireless connections. [0195] Vehicles 1302a–1302n (referred to individually as vehicle 1302 and collectively as vehicles 1302) include at least one device configured to transport goods and/or people. In some embodiments, vehicles 1302 are configured to be in communication with V2I device 1310, remote AV system 1314, fleet management system 1316, and/or V2I system 1318 via network 1312. In some embodiments, vehicles 1302 include cars, buses, trucks, trains, and/or the like. In some embodiments, vehicles 1302 are the same as, or similar to, vehicles 1400, described herein (see FIG. 14). In some embodiments, a vehicle 1400 of a set of vehicles 1400 is associated with an autonomous fleet manager. In some embodiments, vehicles 1302 travel along respective routes 1306a–1306n (referred to individually as route 1306 and collectively as routes 1306), as described herein. In some embodiments, one or more vehicles 1302 include an autonomous system (e.g., an autonomous system that is the same as or similar to autonomous system 1402). [0196] Objects 1304a–1304n (referred to individually as object 1304 and collectively as objects 1304) include, for example, at least one vehicle, at least one pedestrian, at least one cyclist, at least one structure (e.g., a building, a sign, a fire hydrant, etc.), and/or the like. Each object 1304 is stationary (e.g., located at a fixed location for a period of time) or mobile (e.g., having a velocity and associated with at least one trajectory). In some embodiments, objects 1304 are associated with corresponding locations in area 1308. [0197] Routes 1306a–1306n (referred to individually as route 1306 and collectively as routes 1306) are each associated with (e.g., prescribe) a sequence of actions (also known as a trajectory) connecting states along which an AV can navigate. Each route 1306 starts at an initial state (e.g., a state that corresponds to a first spatiotemporal location, velocity, and/or the like) and ends at a final goal state (e.g., a state that corresponds to a second spatiotemporal location that is different from the first spatiotemporal location) or goal region (e.g., a subspace of acceptable states (e.g., terminal states)). In some embodiments, the first state includes a location at which an individual or individuals are to be picked up by the AV and the second state or region includes a location or locations at which the individual or individuals picked up by the AV are to be dropped off.
In some embodiments, routes 1306 include a plurality of acceptable state sequences (e.g., a plurality of spatiotemporal location sequences), the plurality of state sequences associated with (e.g., defining) a plurality of trajectories. In an example, routes 1306 include only high-level actions or imprecise state locations, such as a series of connected roads dictating turning directions at roadway intersections. Additionally, or alternatively, routes 1306 can include more precise actions or states such as, for example, specific target lanes or precise locations within the lane areas and targeted speed at those positions. In an example, routes 1306 include a plurality of precise state sequences along the at least one high-level action sequence with a limited look-ahead horizon to reach intermediate goals, where the combination of successive iterations of limited-horizon state sequences cumulatively corresponds to a plurality of trajectories that collectively form the high-level route to terminate at the final goal state or region. [0198] Area 1308 includes a physical area (e.g., a geographic region) within which vehicles 1302a-1302n can navigate. In an example, area 1308 includes at least one state (e.g., a country, a province, an individual state of a plurality of states included in a country, etc.), at least one portion of a state, at least one city, at least one portion of a city, etc. In some embodiments, area 1308 includes at least one named thoroughfare (referred to herein as a “road”) such as a highway, an interstate highway, a parkway, a city street, etc. Additionally, or alternatively, in some examples area 1308 includes at least one unnamed road such as a driveway, a section of a parking lot, a section of a vacant and/or undeveloped lot, a dirt path, etc. In some embodiments, a road includes at least one lane (e.g., a portion of the road that can be traversed by vehicles 1302). In an example, a road includes at least one lane associated with (e.g., identified based on) at least one lane marking. [0199] Vehicle-to-Infrastructure (V2I) device 1310 (sometimes referred to as a Vehicle-to-Infrastructure or Vehicle-to-Everything (V2X) device) includes at least one device configured to be in communication with vehicles 1302 and/or V2I infrastructure system 1318. In some embodiments, V2I device 1310 is configured to be in communication with vehicles 1302, remote AV system 1314, fleet management system 1316, and/or V2I system 1318 via network 1312. In some embodiments, V2I device 1310 includes a radio frequency identification (RFID) device, signage, cameras (e.g., two-dimensional (2D) and/or three-dimensional (3D) cameras), lane markers, streetlights, parking meters, etc. In some embodiments, V2I device 1310 is configured to communicate directly with vehicles 1302. Additionally, or alternatively, in some embodiments V2I device 1310 is configured to communicate with vehicles 1302, remote AV system 1314, and/or fleet management system 1316 via V2I system 1318. In some embodiments, V2I device 1310 is configured to communicate with V2I system 1318 via network 1312. [0200] Network 1312 includes one or more wired and/or wireless networks.
In an example, network 1312 includes a cellular network (e.g., a long term evolution (LTE) network, a third generation (3G) network, a fourth generation (4G) network, a fifth generation (5G) network, a code division multiple access (CDMA) network, etc.), a public land mobile network (PLMN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the public switched telephone network (PSTN)), a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, etc., a combination of some or all of these networks, and/or the like. [0201] Remote AV system 1314 includes at least one device configured to be in communication with vehicles 1302a-1302n, V2I device 1310, network 1312, fleet management system 1316, and/or V2I system 1318 via network 1312. In an example, remote AV system 1314 includes a server, a group of servers, and/or other like devices. In some embodiments, remote AV system 1314 is co-located with the fleet management system 1316. In some embodiments, remote AV system 1314 is involved in the installation of some or all of the components of a vehicle, including an autonomous system, an autonomous vehicle compute, software implemented by an autonomous vehicle compute, and/or the like. In some embodiments, remote AV system 1314 maintains (e.g., updates and/or replaces) such components and/or software during the lifetime of the vehicle. [0202] Fleet management system 1316 includes at least one device configured to be in communication with vehicles 1302, V2I device 1310, remote AV system 1314, and/or V2I infrastructure system 1318. In an example, fleet management system 1316 includes a server, a group of servers, and/or other like devices. In some embodiments, fleet management system 1316 is associated with a ridesharing company (e.g., an organization that controls operation of multiple vehicles (e.g., vehicles that include autonomous systems and/or vehicles that do not include autonomous systems) and/or the like). [0203] In some embodiments, V2I system 1318 includes at least one device configured to be in communication with vehicles 1302, V2I device 1310, remote AV system 1314, and/or fleet management system 1316 via network 1312. In some examples, V2I system 1318 is configured to be in communication with V2I device 1310 via a connection different from network 1312. In some embodiments, V2I system 1318 includes a server, a group of servers, and/or other like devices. In some embodiments, V2I system 1318 is associated with a municipality or a private institution (e.g., a private institution that maintains V2I device 1310 and/or the like). [0204] The number and arrangement of elements illustrated in FIG. 13 are provided as an example. There can be additional elements, fewer elements, different elements, and/or differently arranged elements, than those illustrated in FIG. 13. Additionally, or alternatively, at least one element of environment 1300 can perform one or more functions described as being performed by at least one different element of FIG. 13. Additionally, or alternatively, at least one set of elements of environment 1300 can perform one or more functions described as being performed by at least one different set of elements of environment 1300.
[0205] Referring now to FIG. 14, vehicle 1400 (which can be the same as, or similar to, vehicles 1302 of FIG. 13) includes or is associated with autonomous system 1402, powertrain control system 1404, steering control system 1406, and brake system 1408. In some embodiments, vehicle 1400 is the same as or similar to vehicle 1302 (see FIG. 13). In some embodiments, autonomous system 1402 is configured to confer vehicle 1400 autonomous driving capability (e.g., implement at least one driving automation or maneuver-based function, feature, device, and/or the like that enable vehicle 1400 to be partially or fully operated without human intervention), including, without limitation, fully autonomous vehicles (e.g., vehicles that forego reliance on human intervention such as Level 5 ADS-operated vehicles), highly autonomous vehicles (e.g., vehicles that forego reliance on human intervention in certain situations such as Level 4 ADS-operated vehicles), conditional autonomous vehicles (e.g., vehicles that forego reliance on human intervention in limited situations such as Level 3 ADS-operated vehicles), and/or the like. In one embodiment, autonomous system 1402 includes operational or tactical functionality required to operate vehicle 1400 in on-road traffic and perform part or all of the Dynamic Driving Task (DDT) on a sustained basis. In another embodiment, autonomous system 1402 includes an Advanced Driver Assistance System (ADAS) that includes driver support features. Autonomous system 1402 supports various levels of driving automation, ranging from no driving automation (e.g., Level 0) to full driving automation (e.g., Level 5). For a detailed description of fully autonomous vehicles and highly autonomous vehicles, reference can be made to SAE International's standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, which is incorporated by reference in its entirety. In some embodiments, vehicle 1400 is associated with an autonomous fleet manager and/or a ridesharing company. [0206] Autonomous system 1402 includes a sensor suite that includes one or more devices such as cameras 1402a, lidar sensors 1402b, radar sensors 1402c, and microphones 1402d. In some embodiments, autonomous system 1402 can include more or fewer devices and/or different devices (e.g., ultrasonic sensors, inertial sensors, GPS receivers (discussed below), odometry sensors that generate data associated with an indication of a distance that vehicle 1400 has traveled, and/or the like). In some embodiments, autonomous system 1402 uses the one or more devices included in autonomous system 1402 to generate data associated with environment 1300, described herein. The data generated by the one or more devices of autonomous system 1402 can be used by one or more systems described herein to observe the environment (e.g., environment 1300) in which vehicle 1400 is located. In some embodiments, autonomous system 1402 includes communication device 1402e, autonomous vehicle compute 1402f, drive-by-wire (DBW) system 1402h, and safety controller 1402g. [0207] In some cases, at least the lidar sensors 1402b of the vehicle 1400 can include the lidar system 100, the lidar detection system 104, the lidar detection system 604, or the lidar detection system 800 described above.
[0208] Cameras 1402a include at least one device configured to be in communication with communication device 1402e, autonomous vehicle compute 1402f, and/or safety controller 1402g via a bus (e.g., a bus that is the same as or similar to bus 1502 of FIG. 15). Cameras 1402a include at least one camera (e.g., a digital camera using a light sensor such as a Charge-Coupled Device (CCD), a thermal camera, an infrared (IR) camera, an event camera, and/or the like) to capture images including physical objects (e.g., cars, buses, curbs, people, and/or the like). In some embodiments, camera 1402a generates camera data as output. In some examples, camera 1402a generates camera data that includes image data associated with an image. In this example, the image data can specify at least one parameter (e.g., image characteristics such as exposure, brightness, etc., an image timestamp, and/or the like) corresponding to the image. In such an example, the image can be in a format (e.g., RAW, JPEG, PNG, and/or the like). In some embodiments, camera 1402a includes a plurality of independent cameras configured on (e.g., positioned on) a vehicle to capture images for the purpose of stereopsis (stereo vision). In some examples, camera 1402a includes a plurality of cameras that generate image data and transmit the image data to autonomous vehicle compute 1402f and/or a fleet management system (e.g., a fleet management system that is the same as or similar to fleet management system 1316 of FIG. 13). In such an example, autonomous vehicle compute 1402f determines depth to one or more objects in a field of view of at least two cameras of the plurality of cameras based on the image data from the at least two cameras. In some embodiments, cameras 1402a are configured to capture images of objects within a distance from cameras 1402a (e.g., up to 100 meters, up to a kilometer, and/or the like). Accordingly, cameras 1402a include features such as sensors and lenses that are optimized for perceiving objects that are at one or more distances from cameras 1402a. [0209] In an embodiment, camera 1402a includes at least one camera configured to capture one or more images associated with one or more traffic lights, street signs, and/or other physical objects that provide visual navigation information. In some embodiments, camera 1402a generates traffic light data associated with one or more images. In some examples, camera 1402a generates TLD (Traffic Light Detection) data associated with one or more images that include a format (e.g., RAW, JPEG, PNG, and/or the like). In some embodiments, camera 1402a that generates TLD data differs from other systems described herein incorporating cameras in that camera 1402a can include one or more cameras with a wide field of view (e.g., a wide-angle lens, a fish-eye lens, a lens having a viewing angle of approximately 120 degrees or more, and/or the like) to generate images about as many physical objects as possible. [0210] Light Detection and Ranging (lidar) sensors 1402b include at least one device configured to be in communication with communication device 1402e, autonomous vehicle compute 1402f, and/or safety controller 1402g via a bus (e.g., a bus that is the same as or similar to bus 1502 of FIG. 15). Lidar sensors 1402b include a system configured to transmit light from a light emitter (e.g., a laser transmitter). Light emitted by lidar sensors 1402b includes light (e.g., infrared light and/or the like) that is outside of the visible spectrum.
In some embodiments, during operation, light emitted by lidar sensors 1402b encounters a physical object (e.g., a vehicle) and is reflected back to lidar sensors 1402b. In some embodiments, the light emitted by lidar sensors 1402b does not penetrate the physical objects that the light encounters. Lidar sensors 1402b also include at least one light detector which detects the light that was emitted from the light emitter after the light encounters a physical object. In some embodiments, at least one data processing system associated with lidar sensors 1402b generates an image (e.g., a point cloud, a combined point cloud, and/or the like) representing the objects included in a field of view of lidar sensors 1402b. In some examples, the at least one data processing system associated with lidar sensors 1402b generates an image that represents the boundaries of a physical object, the surfaces (e.g., the topology of the surfaces) of the physical object, and/or the like. In such an example, the image is used to determine the boundaries of physical objects in the field of view of lidar sensors 1402b. [0211] Radio Detection and Ranging (radar) sensors 1402c include at least one device configured to be in communication with communication device 1402e, autonomous vehicle compute 1402f, and/or safety controller 1402g via a bus (e.g., a bus that is the same as or similar to bus 1502 of FIG. 15). Radar sensors 1402c include a system configured to transmit radio waves (either pulsed or continuously). The radio waves transmitted by radar sensors 1402c include radio waves that are within a predetermined spectrum. In some embodiments, during operation, radio waves transmitted by radar sensors 1402c encounter a physical object and are reflected back to radar sensors 1402c. In some embodiments, the radio waves transmitted by radar sensors 1402c are not reflected by some objects. In some embodiments, at least one data processing system associated with radar sensors 1402c generates signals representing the objects included in a field of view of radar sensors 1402c. For example, the at least one data processing system associated with radar sensors 1402c generates an image that represents the boundaries of a physical object, the surfaces (e.g., the topology of the surfaces) of the physical object, and/or the like. In some examples, the image is used to determine the boundaries of physical objects in the field of view of radar sensors 1402c. [0212] Microphones 1402d include at least one device configured to be in communication with communication device 1402e, autonomous vehicle compute 1402f, and/or safety controller 1402g via a bus (e.g., a bus that is the same as or similar to bus 1502 of FIG. 15). Microphones 1402d include one or more microphones (e.g., array microphones, external microphones, and/or the like) that capture audio signals and generate data associated with (e.g., representing) the audio signals. In some examples, microphones 1402d include transducer devices and/or like devices. In some embodiments, one or more systems described herein can receive the data generated by microphones 1402d and determine a position of an object relative to vehicle 1400 (e.g., a distance and/or the like) based on the audio signals associated with the data.
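As a brief aside to the lidar description in paragraph [0210], the conversion that a TDC-based time-of-flight readout ultimately performs is the round-trip relation R = c*t/2: the TDC digitizes the delay between emission and detection, and half the light's round-trip distance is the target range. The delay values below are illustrative, not taken from this document.

    # Round-trip time-of-flight: R = c * t / 2.
    C_M_PER_S = 299_792_458.0  # speed of light in vacuum

    def range_from_delay(delay_s: float) -> float:
        """Target range in meters from a round-trip delay in seconds."""
        return C_M_PER_S * delay_s / 2.0

    for delay_ns in (100, 667, 2000):  # example TDC readings
        print(delay_ns, "ns ->", round(range_from_delay(delay_ns * 1e-9), 2), "m")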
[0213] Communication device 1402e includes at least one device configured to be in communication with cameras 1402a, lidar sensors 1402b, radar sensors 1402c, microphones 1402d, autonomous vehicle compute 1402f, safety controller 1402g, and/or DBW (Drive-By-Wire) system 1402h. For example, communication device 1402e can include a device that is the same as or similar to communication interface 1514 of FIG. 15. In some embodiments, communication device 1402e includes a vehicle-to-vehicle (V2V) communication device (e.g., a device that enables wireless communication of data between vehicles). [0214] Autonomous vehicle compute 1402f includes at least one device configured to be in communication with cameras 1402a, lidar sensors 1402b, radar sensors 1402c, microphones 1402d, communication device 1402e, safety controller 1402g, and/or DBW system 1402h. In some examples, autonomous vehicle compute 1402f includes a device such as a client device, a mobile device (e.g., a cellular telephone, a tablet, and/or the like), a server (e.g., a computing device including one or more central processing units, graphical processing units, and/or the like), and/or the like. In some embodiments, autonomous vehicle compute 1402f is the same as or similar to autonomous vehicle compute 1600, described herein. Additionally, or alternatively, in some embodiments autonomous vehicle compute 1402f is configured to be in communication with an autonomous vehicle system (e.g., an autonomous vehicle system that is the same as or similar to remote AV system 1314 of FIG. 13), a fleet management system (e.g., a fleet management system that is the same as or similar to fleet management system 1316 of FIG. 13), a V2I device (e.g., a V2I device that is the same as or similar to V2I device 1310 of FIG. 13), and/or a V2I system (e.g., a V2I system that is the same as or similar to V2I system 1318 of FIG. 13). [0215] Safety controller 1402g includes at least one device configured to be in communication with cameras 1402a, lidar sensors 1402b, radar sensors 1402c, microphones 1402d, communication device 1402e, autonomous vehicle compute 1402f, and/or DBW system 1402h. In some examples, safety controller 1402g includes one or more controllers (electrical controllers, electromechanical controllers, and/or the like) that are configured to generate and/or transmit control signals to operate one or more devices of vehicle 1400 (e.g., powertrain control system 1404, steering control system 1406, brake system 1408, and/or the like). In some embodiments, safety controller 1402g is configured to generate control signals that take precedence over (e.g., override) control signals generated and/or transmitted by autonomous vehicle compute 1402f. [0216] DBW system 1402h includes at least one device configured to be in communication with communication device 1402e and/or autonomous vehicle compute 1402f. In some examples, DBW system 1402h includes one or more controllers (e.g., electrical controllers, electromechanical controllers, and/or the like) that are configured to generate and/or transmit control signals to operate one or more devices of vehicle 1400 (e.g., powertrain control system 1404, steering control system 1406, brake system 1408, and/or the like). Additionally, or alternatively, the one or more controllers of DBW system 1402h are configured to generate and/or transmit control signals to operate at least one different device (e.g., a turn signal, headlights, door locks, windshield wipers, and/or the like) of vehicle 1400.
[0217] Powertrain control system 1404 includes at least one device configured to be in communication with DBW system 1402h. In some examples, powertrain control system 1404 includes at least one controller, actuator, and/or the like. In some embodiments, powertrain control system 1404 receives control signals from DBW system 1402h and powertrain control system 1404 causes vehicle 1400 to make longitudinal vehicle motion, such as start moving forward, stop moving forward, start moving backward, stop moving backward, accelerate in a direction, or decelerate in a direction, or to make lateral vehicle motion, such as performing a left turn, performing a right turn, and/or the like. In an example, powertrain control system 1404 causes the energy (e.g., fuel, electricity, and/or the like) provided to a motor of the vehicle to increase, remain the same, or decrease, thereby causing at least one wheel of vehicle 1400 to rotate or not rotate. [0218] Steering control system 1406 includes at least one device configured to rotate one or more wheels of vehicle 1400. In some examples, steering control system 1406 includes at least one controller, actuator, and/or the like. In some embodiments, steering control system 1406 causes the front two wheels and/or the rear two wheels of vehicle 1400 to rotate to the left or right to cause vehicle 1400 to turn to the left or right. In other words, steering control system 1406 causes the activities necessary for the regulation of the y-axis component of vehicle motion. [0219] Brake system 1408 includes at least one device configured to actuate one or more brakes to cause vehicle 1400 to reduce speed and/or remain stationary. In some examples, brake system 1408 includes at least one controller and/or actuator that is configured to cause one or more calipers associated with one or more wheels of vehicle 1400 to close on a corresponding rotor of vehicle 1400. Additionally, or alternatively, in some examples brake system 1408 includes an automatic emergency braking (AEB) system, a regenerative braking system, and/or the like. [0220] In some embodiments, vehicle 1400 includes at least one platform sensor (not explicitly illustrated) that measures or infers properties of a state or a condition of vehicle 1400. In some examples, vehicle 1400 includes platform sensors such as a global positioning system (GPS) receiver, an inertial measurement unit (IMU), a wheel speed sensor, a wheel brake pressure sensor, a wheel torque sensor, an engine torque sensor, a steering angle sensor, and/or the like. Although brake system 1408 is illustrated as being located on the near side of vehicle 1400 in FIG. 14, brake system 1408 can be located anywhere in vehicle 1400. [0221] In some embodiments, one or more of the lidar sensors 1402b of the autonomous system 1402 can include the lidar detection system 604 and/or the lidar detection system 800. [0222] Referring now to FIG. 15, illustrated is a schematic diagram of a device 1500. As illustrated, device 1500 includes processor 1504, memory 1506, storage device 1508, input interface 1510, output interface 1512, communication interface 1514, and bus 1502. In some embodiments, device 1500 corresponds to at least one device of vehicles 1302a-1302n, at least one device of vehicle 1400, and/or one or more devices of network 1312. In some embodiments, one or more devices of vehicles 1302a-1302n, and/or one or more devices of network 1312 include at least one device 1500 and/or at least one component of device 1500. As shown in FIG.
15, device 1500 includes bus 1502, processor 1504, memory 1506, storage device 1508, input interface 1510, output interface 1512, and communication interface 1514. [0223] Bus 1502 includes a component that permits communication among the components of device 1500. In some cases, the processor 1504 includes a processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), and/or the like), a microphone, a digital signal processor (DSP), and/or any processing component (e.g., a field-programmable gate array (FPGA), an application specific integrated circuit (ASIC), and/or the like) that can be programmed to perform at least one function. Memory 1506 includes random access memory (RAM), read-only memory (ROM), and/or another type of dynamic and/or static storage device (e.g., flash memory, magnetic memory, optical memory, and/or the like) that stores data and/or instructions for use by processor 1504. [0224] Storage device 1508 stores data and/or software related to the operation and use of device 1500. In some examples, storage device 1508 includes a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, and/or the like), a compact disc (CD), a digital versatile disc (DVD), a floppy disk, a cartridge, a magnetic tape, a CD-ROM, RAM, PROM, EPROM, FLASH-EPROM, NV-RAM, and/or another type of computer readable medium, along with a corresponding drive. [0225] Input interface 1510 includes a component that permits device 1500 to receive information, such as via user input (e.g., a touchscreen display, a keyboard, a keypad, a mouse, a button, a switch, a microphone, a camera, and/or the like). Additionally or alternatively, in some embodiments input interface 1510 includes a sensor that senses information (e.g., a global positioning system (GPS) receiver, an accelerometer, a gyroscope, an actuator, and/or the like). Output interface 1512 includes a component that provides output information from device 1500 (e.g., a display, a speaker, one or more light-emitting diodes (LEDs), and/or the like). [0226] In some embodiments, communication interface 1514 includes a transceiver-like component (e.g., a transceiver, a separate receiver and transmitter, and/or the like) that permits device 1500 to communicate with other devices via a wired connection, a wireless connection, or a combination of wired and wireless connections. In some examples, communication interface 1514 permits device 1500 to receive information from another device and/or provide information to another device. In some examples, communication interface 1514 includes an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi® interface, a cellular network interface, and/or the like. [0227] In some embodiments, device 1500 performs one or more processes described herein. Device 1500 performs these processes based on processor 1504 executing software instructions stored by a computer-readable medium, such as memory 1506 and/or storage device 1508. A computer-readable medium (e.g., a non-transitory computer readable medium) is defined herein as a non-transitory memory device. A non-transitory memory device includes memory space located inside a single physical storage device or memory space spread across multiple physical storage devices. 
[0228] In some embodiments, software instructions are read into memory 1506 and/or storage device 1508 from another computer-readable medium or from another device via communication interface 1514. When executed, software instructions stored in memory 1506 and/or storage device 1508 cause processor 1504 to perform one or more processes described herein. Additionally or alternatively, hardwired circuitry is used in place of or in combination with software instructions to perform one or more processes described herein. Thus, embodiments described herein are not limited to any specific combination of hardware circuitry and software unless explicitly stated otherwise. [0229] Memory 1506 and/or storage device 1508 includes data storage or at least one data structure (e.g., a database and/or the like). Device 1500 is capable of receiving information from, storing information in, communicating information to, or searching information stored in the data storage or the at least one data structure in memory 1506 or storage device 1508. In some examples, the information includes network data, input data, output data, or any combination thereof. [0230] In some embodiments, device 1500 is configured to execute software instructions that are either stored in memory 1506 and/or in the memory of another device (e.g., another device that is the same as or similar to device 1500). As used herein, the term “module” refers to at least one instruction stored in memory 1506 and/or in the memory of another device that, when executed by processor 1504 and/or by a processor of another device (e.g., another device that is the same as or similar to device 1500), causes device 1500 (e.g., at least one component of device 1500) to perform one or more processes described herein. In some embodiments, a module is implemented in software, firmware, hardware, and/or the like. [0231] The number and arrangement of components illustrated in FIG. 15 are provided as an example. In some embodiments, device 1500 can include additional components, fewer components, different components, or differently arranged components than those illustrated in FIG. 15. Additionally or alternatively, a set of components (e.g., one or more components) of device 1500 can perform one or more functions described as being performed by another component or another set of components of device 1500. [0232] In some implementations, one or more components or systems of the lidar detection system 604 can include one or more components of the device 1500. For example, the readout system 330, the event validation circuit 610, and/or the detection control system 640 can include the processor 1504 and/or the memory 1506. In some implementations, one or more components or systems of the lidar system 100 can include one or more components of the device 1500. For example, the lidar detection system 104 and/or the lidar signal processing system 106 can include the processor 1504 and/or the memory 1506. In some cases, the device 1500 can include the lidar detection system 104, the lidar detection system 604, or the lidar detection system 800 described above. [0233] In some implementations, one or more systems, subsystems, or devices used in the lidar system 100, the lidar system 1700, the lidar detection system 104, the lidar detection system 604, or the lidar detection system 800 described above can include one or more components of the device 1500 described above with respect to FIG. 15.
For example, the lidar detection system 1704, lidar emission system 1702, and/or the lidar signal processing system 1706, can include the processor 1504, the memory 1506, and/or the storage device 1508. [0234] Referring now to FIG. 16, illustrated is an example block diagram of an autonomous vehicle compute 1600 (sometimes referred to as an “AV stack”). As illustrated, autonomous vehicle compute 1600 includes perception system 1602 (sometimes referred to as a perception module), planning system 1604 (sometimes referred to as a planning module), localization system 1606 (sometimes referred to as a localization module), control system 1608 (sometimes referred to as a control module), and database 1610. In some embodiments, perception system 1602, planning system 1604, localization system 1606, control system 1608, and database 1610 are included and/or implemented in an autonomous navigation system of a vehicle (e.g., autonomous vehicle compute 1402f of vehicle 1400). Additionally, or alternatively, in some embodiments, perception system 1602, planning system 1604, localization system 1606, control system 1608, and database 1610 are included in one or more standalone systems (e.g., one or more systems that are the same as or similar to autonomous vehicle compute 1600 and/or the like). In some examples, perception system 1602, planning system 1604, localization system 1606, control system 1608, and database 1610 are included in one or more standalone systems that are located in a vehicle and/or at least one remote system as described herein. In some embodiments, any and/or all of the systems included in autonomous vehicle compute 1600 are implemented in software (e.g., in software instructions stored in memory), computer hardware (e.g., by microprocessors, microcontrollers, application-specific integrated circuits (ASICs), Field Programmable Gate Arrays (FPGAs), and/or the like), or combinations of computer software and computer hardware. It will also be understood that, in some embodiments, autonomous vehicle compute 1600 is configured to be in communication with a remote system (e.g., an autonomous vehicle system that is the same as or similar to remote AV system 1314, a fleet management system that is the same as or similar to fleet management system 1316, a V2I system that is the same as or similar to V2I system 1318, and/or the like). [0235] In some embodiments, perception system 1602 receives data associated with at least one physical object (e.g., data that is used by perception system 1602 to detect the at least one physical object) in an environment and classifies the at least one physical object. In some examples, perception system 1602 receives image data captured by at least one camera (e.g., cameras 1402a), the image data associated with (e.g., representing) one or more physical objects within a field of view of the at least one camera. In such an example, perception system 1602 classifies at least one physical object based on one or more groupings of physical objects (e.g., bicycles, vehicles, traffic signs, pedestrians, and/or the like). In some embodiments, perception system 1602 transmits data associated with the classification of the physical objects to planning system 1604 based on perception system 1602 classifying the physical objects.
[0236] In some embodiments, planning system 1604 receives data associated with a destination and generates data associated with at least one route (e.g., routes 1306a-n) along which a vehicle (e.g., vehicles 1302) can travel toward a destination. In some embodiments, planning system 1604 periodically or continuously receives data from perception system 1602 (e.g., data associated with the classification of physical objects, described above) and planning system 1604 updates the at least one trajectory or generates at least one different trajectory based on the data generated by perception system 1602. In other words, planning system 1604 can perform the tactical function-related tasks that are required to operate vehicles 1302a-1302n in on-road traffic. Tactical efforts involve maneuvering the vehicle in traffic during a trip, including but not limited to deciding whether and when to overtake another vehicle or change lanes, and selecting an appropriate speed, acceleration, deceleration, etc. In some embodiments, planning system 1604 receives data associated with an updated position of a vehicle (e.g., vehicles 1302) from localization system 1606 and planning system 1604 updates the at least one trajectory or generates at least one different trajectory based on the data generated by localization system 1606. [0237] In some embodiments, localization system 1606 receives data associated with (e.g., representing) a location of a vehicle (e.g., vehicles 1302a-1302n) in an area. In some examples, localization system 1606 receives lidar data associated with at least one point cloud generated by at least one lidar sensor (e.g., lidar sensors 1402b). In certain examples, localization system 1606 receives data associated with at least one point cloud from multiple lidar sensors and localization system 1606 generates a combined point cloud based on each of the point clouds. In these examples, localization system 1606 compares the at least one point cloud or the combined point cloud to a two-dimensional (2D) and/or three-dimensional (3D) map of the area stored in database 1610. Localization system 1606 then determines the position of the vehicle in the area based on localization system 1606 comparing the at least one point cloud or the combined point cloud to the map. In some embodiments, the map includes a combined point cloud of the area generated prior to navigation of the vehicle. In some embodiments, maps include, without limitation, high-precision maps of the roadway geometric properties, maps describing road network connectivity properties, maps describing roadway physical properties (such as traffic speed, traffic volume, the number of vehicular and cyclist traffic lanes, lane width, lane traffic directions, or lane marker types and locations, or combinations thereof), and maps describing the spatial locations of road features such as crosswalks, traffic signs, or other travel signals of various types. In some embodiments, the map is generated in real-time based on the data received by the perception system. [0238] In another example, localization system 1606 receives Global Navigation Satellite System (GNSS) data generated by a global positioning system (GPS) receiver. In some examples, localization system 1606 receives GNSS data associated with the location of the vehicle in the area and localization system 1606 determines a latitude and longitude of the vehicle in the area.
In such an example, localization system 1606 determines the position of the vehicle in the area based on the latitude and longitude of the vehicle. In some embodiments, localization system 1606 generates data associated with the position of the vehicle. In some examples, localization system 1606 generates data associated with the position of the vehicle based on localization system 1606 determining the position of the vehicle. In such an example, the data associated with the position of the vehicle includes data associated with one or more semantic properties corresponding to the position of the vehicle. [0239] In some embodiments, control system 1608 receives data associated with at least one trajectory from planning system 1604 and control system 1608 controls operation of the vehicle. In some examples, control system 1608 receives data associated with at least one trajectory from planning system 1604 and control system 1608 controls operation of the vehicle by generating and transmitting control signals to cause a powertrain control system (e.g., DBW system 1402h, powertrain control system 1404, and/or the like), a steering control system (e.g., steering control system 1406), and/or a brake system (e.g., brake system 1408) to operate. For example, control system 1608 is configured to perform operational functions such as a lateral vehicle motion control or a longitudinal vehicle motion control. The lateral vehicle motion control causes activities necessary for the regulation of the y-axis component of vehicle motion. The longitudinal vehicle motion control causes activities necessary for the regulation of the x-axis component of vehicle motion. In an example where a trajectory includes a left turn, control system 1608 transmits a control signal to cause steering control system 1406 to adjust a steering angle of vehicle 1400, thereby causing vehicle 1400 to turn left. Additionally, or alternatively, control system 1608 generates and transmits control signals to cause other devices (e.g., headlights, turn signals, door locks, windshield wipers, and/or the like) of vehicle 1400 to change states. [0240] In some embodiments, perception system 1602, planning system 1604, localization system 1606, and/or control system 1608 implement at least one machine learning model (e.g., at least one multilayer perceptron (MLP), at least one convolutional neural network (CNN), at least one recurrent neural network (RNN), at least one autoencoder, at least one transformer, and/or the like). In some examples, perception system 1602, planning system 1604, localization system 1606, and/or control system 1608 implement at least one machine learning model alone or in combination with one or more of the above-noted systems. In some examples, perception system 1602, planning system 1604, localization system 1606, and/or control system 1608 implement at least one machine learning model as part of a pipeline (e.g., a pipeline for identifying one or more objects located in an environment and/or the like). Database 1610 stores data that is transmitted to, received from, and/or updated by perception system 1602, planning system 1604, localization system 1606, and/or control system 1608. In some examples, database 1610 includes a storage component (e.g., a storage component that is the same as or similar to storage device 1508 of FIG. 15) that stores data and/or software related to the operation and use of at least one system of autonomous vehicle compute 1600.
In some embodiments, database 1610 stores data associated with 2D and/or 3D maps of at least one area. In some examples, database 1610 stores data associated with 2D and/or 3D maps of a portion of a city, multiple portions of multiple cities, multiple cities, a county, a state, a country, and/or the like. In such an example, a vehicle (e.g., a vehicle that is the same as or similar to vehicles 1302 and/or vehicle 1400) can drive along one or more drivable regions (e.g., single-lane roads, multi-lane roads, highways, back roads, off road trails, and/or the like) and cause at least one lidar sensor (e.g., a lidar sensor that is the same as or similar to lidar sensors 1402b) to generate data associated with an image representing the objects included in a field of view of the at least one lidar sensor. [0241] In some embodiments, database 1610 can be implemented across a plurality of devices. In some examples, database 1610 is included in a vehicle (e.g., a vehicle that is the same as or similar to vehicles 1302a-1302n and/or vehicle 1400), an autonomous vehicle system (e.g., an autonomous vehicle system that is the same as or similar to remote AV system 1314), a fleet management system (e.g., a fleet management system that is the same as or similar to fleet management system 1316 of FIG. 13), a V2I system (e.g., a V2I system that is the same as or similar to V2I system 1318 of FIG. 13), and/or the like.

Adaptive emission and detection control

[0242] In some cases, changes in the environmental condition (e.g., background light), the arrangement of objects and/or light emitting devices around a lidar system, or the direction of propagation of lidar signals, can affect the intensity of the reflected optical signals, the level of background noise, and/or the signal-to-noise ratio of the sensor signals generated by the lidar detection system. As such, these changes result in a change in the probability of detection or the false alarm rate of the lidar system. In some cases, the probability of detection or the false alarm rate of the lidar can be maintained or improved by changing (e.g., adaptively changing) the characteristics of the probe signals emitted by the lidar and/or changing the detection threshold level (e.g., the readout threshold associated with time-to-digital conversion) for measuring the sensor signals and generating the return signals. Some of the systems and methods disclosed herein can be used by a lidar system to improve the probability of detecting reflected optical signals by adaptively controlling one or more characteristics of the probe signals and/or the detection threshold level based on variations of a noise level (e.g., background noise level), and the signal-to-noise ratio or amplitude of the sensor signals (or return signals). [0243] As described above, in some implementations the optical probe beams emitted by a lidar system (also referred to as lidar) include pulse coded optical probe signals. In some cases, a pulse coded optical probe signal includes two or more optical pulses emitted sequentially as a pulse sequence. The emission system of the lidar emits a pulse sequence and receives a reflected pulse sequence resulting from reflection of the emitted pulse sequence. The lidar detection system uses the received pulse sequence to generate sensor signals and return signals usable for determining a position or a velocity of an object (e.g., by the lidar signal processing system).
In some cases, a characteristic of a pulse sequence can be used to identify a reflected optical signal associated with an emitted optical probe signal. For example, the detection system of the lidar compares a characteristic of a pulse sequence corresponding to an optical probe signal emitted by the lidar, with that of a reflected pulse sequence corresponding to a reflected optical signal received by the lidar, to determine whether the reflected optical signal is a reflection of the optical probe signal (as opposed to other probe signals emitted by the lidar, or reflection of light generated by other sources). Characteristics of a pulse sequence include but are not limited to: relations (e.g., ratios or differences) between the delays of two or more pairs of optical pulses, intensities (e.g., optical intensities) of one or more optical pulses, relations between intensities of two or more optical pulses, and/or the number of optical pulses in a sequence of optical pulses. In some cases, characteristics of a pulse sequence can include other parameters. [0244] In some cases, a series of pulse coded optical probe signals includes substantially identical pulse sequences. In some examples, each pulse sequence includes optical pulses having substantially identical intensities and pulse-to-pulse time delays. In some other examples, each pulse sequence includes two or more optical pulses having different intensities, and/or two or more pairs of pulses having different pulse-to-pulse time delays. [0245] Using pulse coded optical probe signals, instead of single pulse optical probe signals, can reduce the probability of falsely identifying light not associated with an optical probe signal as the corresponding reflected optical signal. [0246] The lidar system scans an environment by emitting a series of pulse coded optical probe signals along different directions. In some examples, at least a portion of these optical probe signals are emitted periodically with a period T. In some cases, the lidar emits a first plurality of pulse sequences (optical probe signals) during a first time interval and a second plurality of pulse sequences during a second time interval different from the first time interval. In some cases, the first and the second time intervals can be non-overlapping. Each of the first and the second plurality of pulse sequences can include substantially identical pulse sequences, but a characteristic of the pulse sequences of the first plurality of pulse sequences can be different from that of the second plurality of pulse sequences. As such, the lidar detection system can distinguish reflected pulse sequences associated with the first plurality of pulse sequences from those associated with the second plurality of pulse sequences based on this different characteristic (e.g., a different number of pulses in the pulse sequence). In some such cases, the first and second time intervals are associated with two different directions of propagation for the corresponding optical probe beams. For example, when the lidar emission system 102 (FIG. 1A) is scanning a probe beam 108, the probe beam 108 can include a first plurality of pulse sequences when propagating along a first direction and a second plurality of pulse sequences when propagating along a second direction different from the first direction.
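By way of illustration only, the pulse-sequence comparison described above can be sketched in Python as follows. The function name, tolerance values, and data layout are assumptions of this sketch and not part of the disclosed system; the sketch simply compares pulse counts, pulse-to-pulse delays, and relative pulse intensities within tolerances.

```python
from typing import List, Tuple

# A pulse sequence is modeled as a list of (time, intensity) pairs, with
# times relative to the first pulse of the sequence.
PulseSeq = List[Tuple[float, float]]

def matches_emitted_code(received: PulseSeq, emitted: PulseSeq,
                         delay_tol: float = 1e-9,
                         ratio_tol: float = 0.2) -> bool:
    """Decide whether `received` is plausibly a reflection of `emitted` by
    comparing (a) the number of pulses, (b) the pulse-to-pulse delays, and
    (c) the pulse intensities relative to the first pulse, each within a
    tolerance. Reflection rescales absolute intensity, but intensity
    ratios between pulses are preserved."""
    if len(received) != len(emitted) or not emitted:
        return False
    # Compare consecutive pulse-to-pulse delays.
    for (t0, _), (t1, _), (e0, _), (e1, _) in zip(received, received[1:],
                                                  emitted, emitted[1:]):
        if abs((t1 - t0) - (e1 - e0)) > delay_tol:
            return False
    # Compare intensity ratios relative to the first pulse.
    r_ref, e_ref = received[0][1], emitted[0][1]  # assumed nonzero
    for (_, ri), (_, ei) in zip(received, emitted):
        if abs(ri / r_ref - ei / e_ref) > ratio_tol:
            return False
    return True
```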
[0247] For a given level of noise or signal-to-noise ratio, the probability of detection of reflected optical signals can be affected by the characteristics of the respective optical probe signals. [0248] The lidar system can change a characteristic of one or more pulse sequences (e.g., the number of pulses) to maintain or decrease the false alarm rate of the lidar system. For example, the lidar system can adaptively adjust a characteristic of the one or more pulse sequences (one or more optical probe signals) to maintain the false alarm rate (FAR) of the lidar system within a target FAR range. Illustratively, the target FAR range can be a predetermined range stored in a memory of the lidar system, or the target FAR range can be determined by the lidar system (e.g., by the detection system and/or the emission system). [0249] The lidar system can change a characteristic of one or more pulse sequences (e.g., the number of pulses) to maintain or increase the probability of detecting one or more reflected optical signals resulting from the one or more pulse sequences. In some cases, the lidar system adaptively adjusts a characteristic of the one or more pulse sequences to maintain the probability of detecting the one or more reflected optical signals within a target detection probability range. The target detection probability range can be a predetermined range stored in a memory of the lidar system or can be determined by the lidar system (e.g., by the detection system and/or the emission system). [0250] In some implementations, the target range for the false alarm rate and/or the detection probability is either a user-defined range stored in a memory of the lidar system or a range determined by the lidar system. For example, the lidar determines the range based at least in part on the background signals, a determined noise level, or other parameters. [0251] In some examples, when an intensity of the reflected optical signal is much larger than noise (e.g., background noise associated with a level of background light received by the lidar), including a larger number of optical pulses in a sequence or using pulse sequences with more complex combinations of intensity and delay variations can increase the probability of detecting the respective reflected optical signals and reduce the false alarm rate. Alternatively, when the intensities of the reflected optical signals are small (e.g., reflections received from long distance objects or objects having low optical reflectivity), and/or the level of noise (e.g., background noise) is large, using optical probe signals including pulse sequences with a larger number of pulses does not improve the detection probability. [0252] In various examples, an optimal value of a characteristic of an optical pulse sequence emitted by a lidar system depends on time-varying parameters such as background light, distance of the objects from the lidar, optical reflectivity of the objects, and the like. The optimal value of a characteristic of an optical pulse sequence can be a value that reduces or minimizes the false alarm rate or maximizes the probability of detection for a given value of these parameters. [0253] In some embodiments, the lidar emission system changes a characteristic of pulse sequences emitted by the lidar (e.g., a number of pulses in the sequence) based at least in part on background signals generated by the lidar detection system (e.g., the background signals 325 generated by the lidar detection system 604).
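As a minimal sketch of the target-range logic in paragraphs [0248]-[0250], the code below checks a measured false alarm rate against a target range and nudges the pulse count accordingly. The ranges, limits, and the direction of the adjustment are hypothetical placeholders; as the surrounding paragraphs note, the appropriate adjustment depends on signal and noise conditions.

```python
from typing import Tuple

def within_target(value: float, target: Tuple[float, float]) -> bool:
    """True if a measured FAR or detection probability lies in its target range."""
    lower, upper = target
    return lower <= value <= upper

def adjust_pulse_count_for_far(n_pulses: int, measured_far: float,
                               far_target: Tuple[float, float] = (0.0, 1e-4),
                               n_max: int = 8) -> int:
    # Hypothetical policy: when the false alarm rate drifts above the target
    # range, lengthen the pulse code, since a longer code is less likely to
    # be matched accidentally by background light; otherwise leave it alone.
    if measured_far > far_target[1]:
        return min(n_max, n_pulses + 1)
    return n_pulses
```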
Additionally or alternatively, the lidar emission system 102 changes a characteristic of pulse sequences emitted by the lidar (e.g., a number of pulses in the sequence) based at least in part on the amplitude of sensor signals and/or return signals generated by the lidar detection system (e.g., the signals 323 generated by the sensor 320 and/or the return signals 120 generated by the readout circuit 632). [0254] In some implementations, the lidar detection system 604 detects a reflected pulse sequence based at least in part on a signal received from the lidar emission system, where the signal indicates a characteristic of the emitted pulse sequence. [0255] FIG. 17 is a block diagram illustrating an example of a lidar system 1700 that can detect an object, determine a distance between the object and the lidar system 1700, determine a velocity of the object relative to the lidar system 1700, determine a three-dimensional (3D) position of the object, and/or determine a reflectivity of the object. The lidar system 1700 can include one or more features described above with respect to the lidar system 100. The lidar system 1700 can be a time-of-flight (ToF) lidar that detects the position of an object based on a delay (time-of-flight) between emission of an optical probe signal and reception of the respective reflected optical signal. The lidar system 1700 can include machine readable instructions stored in a non-transitory memory of the lidar system 1700 and one or more processors that execute the machine readable instructions to control emission of the probe signals and detection of the reflected optical signals. In some embodiments, the lidar system 1700 corresponds to a lidar sensor of the lidar sensors 1402b of the vehicle 1400 described above with respect to FIG. 14. In some implementations, the lidar system 1700 can include one or more devices or components of the device 1500 described above with respect to FIG. 15. For example, the lidar system 1700 can include a processor 1504, a memory 1506, a storage device 1508, an input interface 1510, an output interface 1512, a communication interface 1514, and a bus 1502. Various processes and functions described below with respect to lidar system 1700 can be performed by processor 1504 by executing machine readable instructions stored in the memory 1506 and/or storage device 1508. In some examples, the lidar system 1700 exchanges data with one or more devices of the vehicle 1400 (e.g., the autonomous vehicle compute 1402f) using the input interface 1510, output interface 1512, and/or the communication interface 1514. [0256] In some cases, the lidar system 1700 includes a lidar emission system 1702 (referred to as emission system or emitter) that emits optical probe beams, a lidar detection system 1704 (referred to as detection system or photodetection circuit) that receives the reflected optical beams and generates return signals 1735, and a lidar signal processing system 1706 (also referred to as lidar signal processing subsystem) that determines the presence, 3D position, distance, reflectivity, and/or velocity of an object based at least in part on the return signals 1735. In some examples, the lidar signal processing system 1706 generates an output signal 1724 including a 3D position of the object with respect to the lidar system 1700, the velocity of the object, and/or the reflectivity of the object.
The lidar emission system 1702, the lidar signal processing system 1706, the detection system 1704, and the return signals 1735 are similar to the lidar emission system 102, the signal processing system 106, the detection system 104, and the return signals 120 described with respect to FIG. 1A, respectively. [0257] The lidar emission system 1702 includes an emission control system 1710 (also referred to as emission control subsystem or emission control circuit) that controls the emission time, emission period (T), and other characteristics of the emitted optical probe signals (e.g., pulse coded optical signals). For example, when the optical probe signal includes a sequence of optical pulses (referred to as a pulse sequence), the emission control system 1710 controls the number of pulses in the pulse sequence, the delay between consecutive optical pulses in the pulse sequence, the intensity of each optical pulse in the pulse sequence, or other parameters. [0258] The emission system 1702 generates detection control signals 1712 indicative of an emission time, an emission period (T), and/or other characteristics of the emitted optical probe signals and provides the detection control signals 1712 to the detection system 1704. The detection system 1704 uses a detection control signal associated with one or more emitted probe signals to identify respective reflected optical signals received from the environment. As described above, for given environmental conditions (e.g., background light), distribution of objects (long and short distances from the lidar), and object properties (e.g., optical properties), the probability of detecting a reflected optical signal resulting from reflection of a probe signal by an object in the environment can be affected by one or more characteristics of the optical probe signals. In other words, for a given object and a given level of noise (e.g., associated with background light, or lidar circuitry), changing a characteristic of the optical probe signal can increase or decrease the probability of detecting the resulting reflected optical signal. [0259] In some examples, when the optical probe signal includes a pulse sequence and the signal-to-noise ratio of the sensor signal associated with the corresponding reflected optical signal is larger than a threshold value, increasing the number of pulses in the pulse sequence can further improve or maintain the probability of detection. The threshold value can depend on the characteristics of the lidar detection system 1704 and the desired detection probability. In some cases, when the optical probe signal includes a pulse sequence and the signal-to-noise ratio of the corresponding sensor signal is below a threshold level, increasing the number of pulses in the pulse sequence can improve or maintain the probability of detection. In some cases, if the signal-to-noise ratio is too low, detection based on a larger number of pulses may reduce the probability of detection. In some such cases, when the signal-to-noise ratio of the corresponding sensor signal is below a threshold level, decreasing the number of pulses in the pulse sequence can improve or maintain the probability of detection. [0260] In some cases, when the optical probe signal includes a pulse sequence and the signal-to-noise ratio of the sensor signal associated with the corresponding reflected optical signal is larger than a threshold value, the number of pulses in a sequence can be decreased to reduce optical power and thereby power consumption.
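One way to picture the SNR-dependent trade-offs of paragraphs [0259] and [0260] is the following sketch, which maps a measured SNR to the pulse count of the next probe signal. The threshold values and pulse-count bounds are placeholder assumptions; as noted above, the actual values would depend on the detection system and the desired detection probability.

```python
def next_pulse_count(n_pulses: int, snr: float,
                     snr_power_save: float = 20.0, snr_floor: float = 2.0,
                     n_min: int = 2, n_max: int = 8) -> int:
    """Choose the number of pulses for the next optical probe signal.

    - Very high SNR: detection is already reliable, so pulses can be
      removed to reduce optical power consumption (paragraph [0260]).
    - Very low SNR: a long code can hurt detection because every pulse
      must be recovered, so pulses are also removed (paragraph [0259]).
    - In between: adding pulses can improve or maintain the detection
      probability, up to a cap.
    """
    if snr > snr_power_save or snr < snr_floor:
        return max(n_min, n_pulses - 1)
    return min(n_max, n_pulses + 1)
```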
In these cases, a lower bound for the decreased number of pulses can be determined such that the probability of detection is not reduced. [0261] In the scenarios discussed above, a threshold value of SNR used for adjusting the number of pulses in the optical probe signal can depend on the characteristics of the lidar detection system 1704, the desired detection probability, and the characteristics of the optical probe signal (e.g., the amplitude and temporal width of the optical pulses, and the like). [0262] As such, real time adaptive control of the characteristics of the optical probe signals emitted by the lidar system 1700 can improve the detection probability and reduce the false alarm rate of the lidar system 1700, e.g., when the environmental condition and/or the intensity of reflected optical signals change (e.g., when the probe signals are reflected from objects that are very far from the lidar or have low optical reflectivity). In some examples, the emission control system 1710 adjusts a characteristic of the optical probe signal according to a level of background light or a distance of an object from the lidar system 1700 determined based on previous return signals. [0263] In embodiments disclosed herein, the emission control system 1710 adaptively controls the characteristics of the optical probe signals based at least in part on emission control signals 1714 generated by the lidar system 1700. In some examples, the lidar detection system 1704 generates emission control signals 1714 based at least in part on return signals 1735 generated by the readout system (also referred to as readout processing circuit), the sensor signals 1723 generated by the sensor 1720 (similar to the sensor 320 described with respect to FIG. 3 and FIG. 6), a level of noise in the readout system 1730, and/or background signals 325 generated by the readout system 1730. The readout system 1730 can be similar to the readout system 330 described with respect to FIG. 3 and FIG. 6. In some examples, an emission control signal 1714 can include or indicate a signal-to-noise ratio (SNR) of one or more sensor signals 1723, a background noise, the amplitude (or average amplitude) of one or more sensor signals 1723, or the amplitude (or average amplitude) of one or more return signals. [0264] In some cases, the emission control system 1710 uses the emission control signals 1714 to adjust one or more characteristics of probe signals to increase the detection probability of the lidar system 1700 and/or reduce a false alarm rate associated with the output signal 1724. For example, the emission control system 1710 can use an emission control signal generated in a first measurement time interval, during which a first reflected optical signal associated with a first emitted probe signal is received, to change a characteristic of a second probe signal emitted after the first probe signal, so that the probability of detecting the second reflected optical signal associated with the reflection of the second probe signal is increased compared to that of the first reflected optical signal. In some implementations, the delay between the first and the second probe signals can be less than 1 picosecond, less than 1 nanosecond, less than 1 microsecond, or less than 1 millisecond. [0265] In some examples, the lidar detection system 1704, lidar emission system 1702, and/or the lidar signal processing system 1706 can include the device 1500 or a component of the device 1500 described above with respect to FIG. 15.
For example, various processes and functions described with respect to lidar detection system 1704, lidar emission system 1702, and/or the lidar signal processing system 1706, can be performed by the processor 1504 that executes machine readable instructions stored in the memory 1506 and/or storage device 1508. In some examples, the lidar detection system 1704 and the lidar emission system 1702 can exchange emission control signals 1714 and detection control signals 1712 using the input interface 1510, output interface 1512, and/or the communication interface 1514. [0266] In various implementations, emission system 1702 can include one or more features described above with respect to lidar emission system 102 of the lidar system 100, lidar emission system 204 of the scanning lidar system 201, emission system 220 of the flash lidar system 202, or emission system 230 of the mechanical lidar system 203. In some embodiments, the lidar emission system may include one or more emitters configured to emit one or more optical probe beams to an environment surrounding a vehicle that carries the lidar system 1700. In some examples, an emitter can include one or more laser sources configured to generate the optical probe beams. In some examples, an emitter can include one or more laser sources configured to generate laser light and an optical beam controller configured to receive the laser light from the one or more laser sources and generate one or more optical probe beams directed along one or more directions in the environment. In some cases, the lidar emission system may steer the one or more optical probe beams, e.g., using a mechanical, optomechanical, or electro-optical beam steering mechanism. In some cases, the lidar emission system can include an electronic circuit (e.g., a digital or analog electronic circuit) configured to control the optical probe beams. In some examples, the electronic circuit may be included in the emission control system 1710. [0267] In some cases, the emission control system 1710 can include a control circuit (e.g., a digital and/or analog electronic circuit) configured to control one or more characteristics of the optical probe beams emitted by the lidar emission system 1702. The characteristics of an optical probe beam can include the direction of propagation of the optical probe beam and/or the temporal variation of the optical intensity or optical power of the optical probe beam. In some examples, the temporal variation of the optical intensity or optical power of the optical probe beam can include the emission time, emission period, number of pulses in the pulse sequence, the delay between consecutive optical pulses in the pulse sequence, the intensity of each optical pulse in the pulse sequence, and the like. In some cases, the emission control system 1710 can include a non-transitory memory storing machine readable instructions and a hardware processor configured to execute the machine-readable instructions to control the optical probe beams based at least in part on emission control signal(s) 1714 received from the lidar detection system 1704. [0268] In some embodiments, the lidar detection system 1704 may comprise one or more features described above with respect to lidar detection system 104 of the lidar system 100, lidar detection system 216 of the scanning lidar system 201, the detection system (e.g., a detector array receiving light from an optical system) of the flash lidar system 202, or detection system 236 of the mechanical lidar system 203.
In some cases, the lidar detection system 1704 comprises one or more features described above with respect to the lidar detection system 604 and/or lidar detection system 800. In some embodiments, lidar detection system 1704 comprises a photodetection circuit that includes the sensor 1720 and the readout processing circuit (readout system) 1730. In some cases, the photodetection circuit includes a photodetector or a photodetector array (e.g., a two-dimensional photodetector array) and electronic circuitry configured to receive sensor signals from the photodetector or photodetector array to generate return signals and/or emission control signals 1714. The photodetector or photodetector array can include a single-photon detector, a p-i-n photodiode, an avalanche photodiode, a single-photon avalanche photodiode, or a photodetector that is capable of generating sensor signals 1723 having an SNR larger than 1, 3, 5, or 10, using reflected optical beams generated by the optical probe beams emitted by the lidar emission system 1702. [0269] In some embodiments, the electronic circuitry of the photodetection circuit comprises a non-transitory memory storing machine readable instructions and a hardware processor configured to execute the machine-readable instructions to generate an emission control signal 1714, a return signal, and/or a background signal, based at least in part on the sensor signals 1723 received from the photodetector or the photodetector array (sensor 1720) and, in some cases, based at least in part on the detection control signals 1712 received from the lidar emission system 1702. [0270] FIG. 18A shows a portion of an example of identical optical probe signals periodically emitted by the lidar system 1700 every T seconds (where T is the probe signal emission period). Each optical probe signal includes a pulse sequence 1802 of optical pulses. While the portion shown includes two probe signals (1b and 2b), the lidar system 1700 can emit a larger number of pulse sequences during a measurement period. The pulse sequence 1802 can include a number (N) of substantially identical and sequentially emitted optical pulses all having an intensity I0 and a pulse width tw. In some cases, the optical pulses can be equally spaced in the time domain, where each optical pulse is emitted with a fixed delay of td after emission of the pulse immediately preceding it. In the example shown, each probe signal includes five identical optical pulses. [0271] FIG. 18B shows a portion of another example of substantially identical optical probe signals (substantially similar pulse sequences) periodically emitted by the lidar system 1700 every T seconds. Each optical probe signal includes a pulse sequence 1804 of optical pulses. Two substantially identical optical probe signals have the same number of pulses with substantially identical delays between respective optical pulses. The respective optical pulses of two substantially identical optical probe signals have substantially the same amplitudes and pulse widths. While two pulse sequences are shown (1b and 2b), the lidar system 1700 can emit a larger number of pulse sequences during a measurement period. The pulse sequence 1804 includes a number (N) of sequentially emitted optical pulses where at least two optical pulses have different intensities.
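The two pulse-sequence formats of FIGs. 18A and 18B can be modeled with small data structures, sketched below in Python. The class and field names are invented for illustration only; times are in seconds and intensities are in arbitrary units.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UniformPulseSequence:
    """Sequence like 1802: N identical pulses of intensity i0 and width tw,
    emitted with a fixed pulse-to-pulse delay td."""
    n: int
    i0: float
    tw: float
    td: float

    def emission_times(self) -> List[float]:
        # Start time of each pulse, relative to the first pulse.
        return [k * self.td for k in range(self.n)]

@dataclass
class CodedPulseSequence:
    """Sequence like 1804: per-pulse intensities I1..In and per-gap delays
    td1..td(n-1), which need not be equal."""
    intensities: List[float]
    delays: List[float]  # one fewer entry than intensities

    def emission_times(self) -> List[float]:
        times, t = [0.0], 0.0
        for d in self.delays:
            t += d
            times.append(t)
        return times
```

For instance, `UniformPulseSequence(n=5, i0=1.0, tw=5e-9, td=20e-9)` describes a five-pulse train like the one shown in FIG. 18A.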
In some cases, the delay (or pulse spacing) td1 between a first pair of optical pulses in the pulse sequence 1804 can be different from the delay tdn between a second pair of optical pulses in the pulse sequence 1804. In some examples, two or more optical pulses in the pulse sequence 1804 can be identical and equally spaced (similar to pulses in the pulse sequence 1802). [0272] In some implementations, the optical probe beam 1708a emitted by the lidar emission system 1702 includes a plurality of optical probe signals similar to the pulse sequence 1802 (or 1804) emitted with a period T. For each plurality of optical probe signals, the lidar emission system 1702 can send one detection control signal to the detection system 1704 indicative of td and N (for optical probe signals similar to the pulse sequence 1802), or td1, td2, td3, …, tdn and corresponding relative pulse intensities (for optical probe signals similar to the pulse sequence 1804). The lidar detection system 1704 can use the detection control signal to identify the reflected optical signals resulting from reflection of at least a portion of the plurality of emitted optical probe signals. In some examples, the lidar emission system 1702 emits an optical probe beam 1708a along a first direction and another optical probe beam 1708b along a second direction different from the first direction. In some such examples, at least one characteristic of the optical probe signals included in the probe beam 1708a can be different from that of the optical probe beam 1708b. For example, when the optical probe beams 1708a and 1708b include optical probe signals similar to the pulse sequence 1802, td and/or N can be different for the optical probe beams 1708a and 1708b. [0273] In some cases, the lidar emission system 1702 emits a first series of optical probe signals during a first time period and a second series of optical probe signals during a second time period. Illustratively, the pulse sequences in a series of optical probe signals include substantially identical pulse sequences (e.g., having substantially equal pulse spacing, pulse amplitude, pulse width, and number of pulses). However, a pulse sequence of the first series can be different from a pulse sequence of the second series (e.g., having a different N, different tds, or different pulse intensities). [0274] In various implementations, the duration of the pulse sequence 1802 or 1804 can be from 50 ns to 100 ns, from 100 ns to 150 ns, from 150 ns to 200 ns, or any range formed by these values or larger or smaller values. [0275] In an adaptive emission control mode, the lidar system 1700 (e.g., the detection system 1704 of the lidar system 1700) can generate an emission control signal 1714 after receiving one or more reflected optical signals associated with the first series of optical probe signals. In some examples, the emission control signal 1714 indicates the amplitudes or average amplitudes of one or more sensor signals 1723 generated using the reflected optical signals, a background noise, and/or a signal-to-noise ratio of the one or more sensor signals 1723. In some cases, the emission control signal 1714 is generated based at least in part on the amplitudes of one or more sensor signals 1723 generated using the reflected optical signals, a background noise, and/or a signal-to-noise ratio of the one or more sensor signals 1723. The amplitude of a sensor signal can include an average amplitude calculated based on the individual amplitudes of the optical pulses included in the sensor signal.
In some cases, the emission control signal 1714 indicates a characteristic change of the optical probe signals. The emission control system 1710 receives the emission control signal and adjusts a characteristic of a second series of optical probe signals emitted after the first series, based at least in part on the emission control signal. For example, when the first series of optical probe signals includes optical probe signals similar to the pulse sequence 1802, the emission control system 1710 changes N, td, and/or I0. When the optical probe signals are similar to the pulse sequence 1804, the emission control system 1710 changes N, td1, td2, …, tdn, and/or the corresponding relative pulse intensities. In some implementations, the adjustment made to the characteristic of the second series of optical probe signals improves the probability of detecting reflected optical signals associated with the second series of optical probe signals compared to that of the optical signals associated with the first series of optical probe signals. The first and second series of optical probe signals can be emitted along the same or different directions. [0276] FIG. 19A is a diagram illustrating an example of a series of adaptively controlled optical probe signals 1902 emitted by lidar system 1700 and the respective detected sensor signals 1904 in the presence of a noise level 1905 (e.g., background noise level) having a time varying average amplitude. In the example shown, the lidar system 1700 emits optical probe signals similar to the pulse sequence 1802 and adaptively adjusts a characteristic (e.g., a number of pulses) of one or more optical probe signals upon determining that a noise level (e.g., background noise level) measured during a noise measurement window, and/or a signal-to-noise ratio (SNR) associated with the respective sensor signals, is not within acceptable ranges (e.g., acceptable ranges associated with the characteristics of the emitted optical probe signal). In some cases, an acceptable range for a parameter (e.g., background noise, SNR, or sensor signal amplitude) includes a range of values for that parameter smaller than an upper limit and larger than a lower limit. In various implementations, a noise measurement window includes a time window between two consecutive sensor signals or a time window before emission of a series of optical probe signals (e.g., substantially identical probe signals). In some cases, the number of optical probe signals emitted between two consecutive noise measurement time windows can be a set number stored in a memory of the lidar system or a number determined by the lidar system. In some cases, a measurement time window can be a dedicated measurement time window and can be different from the emission period of the optical probe signals. In some examples, a dedicated measurement time window can be longer than an emission period (T) of the optical probe signals. As mentioned above, the readout system 1730 can also determine a level of background noise in a time interval during which the sensor signals are generated. As such, the lidar system can determine a noise level during a dedicated measurement time window, during the time between emission of optical probe signals, and/or while measuring sensor signals. In some cases, the lidar system determines a noise level based on measured noise during one or more of the aforementioned time intervals.
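A minimal sketch of the payload an emission control signal 1714 might carry, per the description above (amplitudes or average amplitudes, background noise, a signal-to-noise ratio, and optionally a suggested characteristic change), is shown below. All field names are illustrative assumptions of this sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EmissionControlSignal:
    """Quantities the detection side reports back to the emission control
    system; any subset may be present (hence the Optional fields)."""
    mean_amplitude: Optional[float] = None    # average sensor signal amplitude
    background_noise: Optional[float] = None  # measured background noise level
    snr: Optional[float] = None               # signal-to-noise ratio
    suggested_n_pulses: Optional[int] = None  # optional characteristic change
```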
[0277] For example, a first optical probe signal can include four substantially identical optical pulses emitted by the lidar system 1700 at substantially equal time intervals. The detection system 1704 generates a first sensor signal having four pulses upon receiving the corresponding first reflected optical signal. The lidar system 1700 then determines a first background noise level based on reflected signals received and the corresponding sensor signals generated during one or both of a first signal time window (S1) and a first noise measurement time window (W1). The first signal time window (S1) can be a time interval during which the first reflected signal is received and the first sensor signal is generated, and the first noise measurement time window (W1) can be a time interval between the first sensor signal and the second sensor signal. In some cases, the duration of W1 can be substantially equal to the delay between the first and second sensor signals. In some implementations, every time interval between two subsequent sensor signals can include a noise measurement time window. In some implementations, the lidar system 1700 emits N optical probe signals between two subsequent noise measurement time windows or two time intervals during which a noise level is measured or determined. In some examples, the lidar system 1700 measures or determines a noise level (or an average noise level) based on two or more signal time windows and/or noise measurement time windows. For example, the lidar system determines a background noise level (or an average background noise level) based on S1 and S2, W1 and W2, or S1, W1, S2, and W2. [0278] In some cases, upon determining that a noise level (e.g., background noise) or average noise level (e.g., average background noise level) is within an acceptable range, the lidar system 1700 does not change the characteristics of the optical probe signals emitted after one or more measurement time windows or a time interval during which a noise level (or an average noise level) is measured or determined. [0279] In some cases, upon determining that a noise level (e.g., background noise) or average noise level (e.g., average background noise level) is above an upper limit, the lidar system 1700 changes the characteristics of the optical probe signals emitted after one or more measurement time windows or a time interval during which a noise level (or an average noise level) is measured or determined, to improve or maintain the detection probability (or decrease or maintain the false alarm rate). [0280] In the sensor signals 1904, the background noise level during S1, W1, and W2 is below an upper limit (or within an acceptable range) and increases above the upper limit after the second measurement time window W2. The readout system 1730 measures a background noise level during at least one of the time intervals S3, W3, S4, and W4 and determines that the measured background noise has changed and is now above an upper limit or outside an acceptable range (e.g., an acceptable range for sensor signals having four nearly identical optical pulses with certain intensities). In response to such a determination, the lidar system 1700 changes a characteristic of the subsequently emitted optical probe signals until the next noise evaluation period.
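The window-based noise estimate of paragraph [0277] could be computed along the lines of the Python sketch below, which averages per-window estimates over any combination of signal time windows (S1, S2, ...) and noise measurement time windows (W1, W2, ...). The use of a median inside signal windows is an assumption made here to suppress the reflected pulses; it is not specified by the description above.

```python
import statistics
from typing import Sequence

def background_noise_level(signal_windows: Sequence[Sequence[float]],
                           noise_windows: Sequence[Sequence[float]]) -> float:
    """Estimate a background noise level from amplitude samples recorded
    during signal windows and/or noise measurement windows.

    In a noise window no reflected signal is expected, so the mean sample
    amplitude approximates the background directly; in a signal window a
    robust statistic (the median) suppresses the reflected pulses.
    """
    estimates = []
    for w in noise_windows:
        if w:
            estimates.append(statistics.fmean(w))
    for s in signal_windows:
        if s:
            estimates.append(statistics.median(s))
    if not estimates:
        raise ValueError("no samples to estimate background noise from")
    return statistics.fmean(estimates)
```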
In the example shown, lidar system 1700 reduces the number of optical pulses in the fifth optical probe signal emitted after the fourth optical probe signal from four to three pulses to improve or maintain the detection probability (or reduce or maintain the false alarm rate). In some examples, the lidar system 1700 changes the number of optical pulses in the subsequent optical probe signals by generating an emission control signal and transmitting the emission control signal to the emission control system. [0281] With continued reference to the sensor signals 1904, upon receiving the eighth and/or ninth sensor signals, the lidar system 1700 determines that the background noise level is reduced (e.g., based on the measured background noise level during at least one of the time intervals S8, W8, S9, and W9), and is now below the upper limit or within an acceptable range for sensor signals having four nearly identical optical pulses with certain intensities. Responsive to such a determination, the lidar system can increase the number of optical pulses in the tenth optical probe signal emitted after the ninth optical probe signal from three pulses back to four pulses. [0282] In some embodiments, e.g., when the background noise level increases but is below a threshold value, the lidar system can increase the number of pulses in a probe signal. For example, with continued reference to the sensor signals 1904, the lidar system 1700 can increase the number of optical pulses in the fifth optical probe signal emitted after the fourth optical probe signal to improve or maintain the detection probability (or reduce or maintain the false alarm rate), and upon receiving the eighth and/or ninth sensor signals and determining that the background noise level is reduced, the lidar system 1700 can decrease the number of optical pulses in the tenth optical probe signal. As such, depending on the background noise level and the number of optical pulses in an optical probe signal, the lidar system 1700 may increase or decrease the number of pulses to improve or maintain the detection probability. [0283] Alternatively or in addition, the adaptive adjustment of the optical probe characteristics described above with respect to the optical probe signals 1902 and the corresponding sensor signals 1904 can be made based on the SNRs of one or more sensor signals determined during one or more signal time windows (e.g., S3-S9). [0284] FIG. 19B is a diagram illustrating a series of adaptively controlled optical probe signals 1906 emitted by lidar system 1700, comprising pulses having substantially equal amplitudes, and the corresponding detected sensor signals 1908 having varying amplitudes in the presence of a noise level 1907 having a nearly constant average value. In some examples, the smaller amplitude of the third, fourth, fifth, sixth, and seventh signals can be associated with reflection from low reflectivity objects and/or objects being farther from the lidar system 1700 compared to the objects that generate the first and the second sensor signals. In the example shown, the lidar system 1700 emits optical probe signals similar to the pulse sequence 1802 and changes a characteristic of the optical probe signals upon determining that an amplitude and/or a signal-to-noise ratio (SNR) of the respective sensor signals is not within an acceptable range (e.g., an acceptable range for the intensity of pulses in the emitted probe signal).
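The FIG. 19A behavior described above (four pulses while the background noise is acceptable, three pulses while it exceeds the upper limit, four again once it recovers) reduces to a simple rule. The sketch below is one possible form; as paragraph [0282] notes, the opposite sign of adjustment is equally valid, so the direction is system-dependent.

```python
def pulse_count_for_noise(noise_level: float, upper_limit: float,
                          n_nominal: int = 4, n_reduced: int = 3) -> int:
    """Return the pulse count for the next probe signals: shorten the code
    while the measured background noise is above the upper limit of the
    acceptable range, and restore it once the noise falls back below.
    Some systems instead increase the pulse count when noise rises (see
    paragraph [0282]); the sign of the adjustment is system-dependent."""
    return n_reduced if noise_level > upper_limit else n_nominal
```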
For example, a first optical probe signal can include four substantially identical optical pulses emitted by the lidar system 1700 at substantially equal time intervals, and the detection system (e.g., the detection system 1704) can generate a first sensor signal having four pulses upon detecting the corresponding first reflected optical signal. When the amplitude and/or SNR of the first sensor signal is within the acceptable range (e.g., for a sensor signal having four pulses), the lidar system 1700 emits the second optical probe signal with the same characteristics (e.g., the same number of pulses) as the first optical probe signal. Upon receiving the third and/or fourth sensor signals after emission of the second optical probe signal, the readout system 1730 determines that the amplitude or the SNR of the third and/or fourth sensor signals is below a lower limit of the corresponding acceptable range. Upon such a determination, the lidar system 1700 changes a characteristic of the subsequently emitted optical probe signals (e.g., by sending an emission control signal to the emission control system). In the example shown, the lidar system reduces the number of optical pulses in the fifth optical probe signal emitted after the fourth optical probe signal from four to three pulses to improve or maintain the detection probability. [0285] With continued reference to sensor signals 1908, upon receiving the eighth and/or ninth sensor signals, the readout system 1730 determines that the amplitude and/or SNR of the eighth and/or ninth sensor signals is improved and is now above the lower limit of the acceptable range, and therefore increases the number of optical pulses in the tenth optical probe signal emitted after the ninth optical probe signal from three pulses to four pulses. [0286] The adaptive control of the optical probe signal properties, however, is not limited to the examples described above with respect to FIGs. 18A-18B and 19A-19B. In some cases, for example, upon determining that the sensor signal amplitude or SNR has decreased, or the background noise has increased, the lidar system increases the number of pulses in the subsequently emitted optical probe signals. [0287] Alternatively or additionally, upon determining that the amplitude, SNR, or the background noise level of the sensor signals is not within an acceptable range (e.g., an acceptable range for the sensor signals), the lidar system 1700 can change an amplitude ratio or a delay between two optical pulses in the subsequently emitted optical probe signals. With reference to the pulse sequence 1804 (FIG. 18B), a characteristic change can include changing td1, td2, …, tdn and/or I1, I2, I3, …, In. For example, the intensities and the delays of the optical pulses in the fifth optical probe signal can change such that I1 ≠ I2 ≠ I3, I1 = I2 ≠ I3, td1 ≠ td2, or other combinations of intensity and delay changes. [0288] In various implementations, e.g., when the pulse amplitudes and the background noise of the received sensor signals both vary, the readout system 1730 determines a signal-to-noise ratio (SNR) of one or more sensor signals associated with one or more optical probe signals and changes the characteristic of the subsequently emitted optical probe signals based on the determined SNR.
For example, with reference to FIGs. 19A and 19B, the readout system 1730 determines that the SNR of the third and/or fourth sensor signals is not within the acceptable range and generates an emission control signal to reduce the number of pulses in the fifth optical probe signal from four to three. [0289] In some cases, the emission control signals 1714 are generated by another device or subsystem of the lidar system 1700 (other than the readout system 1730). In some examples, the readout system 1730 sends data including a noise level (e.g., background noise level) or an SNR to the emission control system 1710, and the emission control system 1710 determines the characteristics of the subsequent optical probe signals based at least in part on the data received from the readout circuit. [0290] In some implementations, the readout system 1730 determines a change to a characteristic of the optical probe signals based on two or more sensor signals (e.g., the most recently generated sensor signals) and the corresponding noise levels. For example, the readout system 1730 determines an average noise, SNR, or sensor signal amplitude based on the two or more sensor signals and uses the average values to determine a change to a characteristic of the subsequent optical probe signals. Alternatively, the readout system 1730 can transmit the average values to the emission control system 1710 (e.g., include them in an emission control signal 1714). Upon receiving the average values, the emission control system 1710 determines a change to a characteristic of the subsequent optical probe signals and emits the subsequent optical probe signals based on the determined characteristics. [0291] In some implementations, an acceptable range for the background noise, sensor signal SNR, and/or sensor signal amplitude depends on the characteristics of one or more optical probe signals emitted before measuring or determining the background noise, the sensor signal SNR, and the sensor signal amplitude. Upon generation of one or more sensor signals, the lidar system 1700 determines or measures the background noise, the SNR of the sensor signals, and/or the amplitude of the sensor signals and determines whether the SNR, amplitude, and/or background noise are within the respective acceptable ranges for the characteristics of the one or more probe signals. [0292] In some examples, the acceptable level of background noise measured during a signal time window or a noise measurement time window (e.g., after emission of an optical probe signal similar to the pulse sequence 1802) can depend on the number of optical pulses included in the optical probe signal. [0293] In some cases, an acceptable range for the background noise, sensor signal SNR, or amplitude can include a preset range stored in a memory of the lidar system 1700. In some cases, the lidar system 1700 (e.g., the detection system 1704 or the emission control system 1710 of the lidar) determines the acceptable range based at least in part on reference data stored in a memory of the lidar system 1700. The stored reference data can include a mapping table or matrix indicating associations between characteristics of an optical probe signal and the acceptable ranges for different parameters that can be measured, calculated, or otherwise determined by the lidar system, including but not limited to: background noise, system noise, sensor signal amplitude, pulse amplitude, SNR, and the like.
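The mapping table of paragraph [0293] might be organized as in the sketch below, keyed by the number of pulses in the probe signal; all numeric values are placeholders invented for illustration. The lower SNR limit is made larger for smaller pulse counts, matching the relationship between limits and pulse counts described below.

```python
# Hypothetical reference table: for a probe signal with N identical pulses,
# the acceptable (lower, upper) ranges for background noise, SNR, and
# sensor signal amplitude. All values are illustrative placeholders.
ACCEPTABLE_RANGES = {
    # n_pulses: {"noise": (lo, hi), "snr": (lo, hi), "amplitude": (lo, hi)}
    2: {"noise": (0.0, 0.2), "snr": (8.0, 50.0), "amplitude": (0.5, 2.0)},
    3: {"noise": (0.0, 0.4), "snr": (5.0, 60.0), "amplitude": (0.3, 2.0)},
    4: {"noise": (0.0, 0.6), "snr": (3.0, 80.0), "amplitude": (0.2, 2.0)},
}

def in_acceptable_range(n_pulses: int, parameter: str, value: float) -> bool:
    """Look up the acceptable range for a probe signal with n_pulses pulses
    and test whether a measured parameter value falls inside it."""
    lo, hi = ACCEPTABLE_RANGES[n_pulses][parameter]
    return lo <= value <= hi
```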
For example, the stored reference data can include acceptable ranges for background noise, SNR, and amplitude of a sensor signal associated with a reflected optical signal resulting from reflection of an optical probe signal having two, three, and four substantially identical optical pulses.

[0294] In some cases, the acceptable range includes a lower limit and an upper limit. In some such cases, the lower limit of the acceptable range for SNR or amplitude can be larger when the number of optical pulses in the optical probe signal is smaller. In some such cases, the upper limit of the acceptable range for SNR or amplitude can be larger when the number of optical pulses in the optical probe signal is larger.

[0295] In some cases, the lidar system 1700 can use an algorithm to determine a characteristic change for subsequent optical probe signals when the measured or determined sensor signal amplitude, sensor signal SNR, and/or background noise is not within the acceptable range for one or more previously emitted optical probe signals. In some examples, the algorithm uses a parameter associated with the lidar system 1700, and/or an environmental parameter, to determine the characteristic change.

[0296] In various implementations, the algorithm and the mapping table can be specific to the lidar system 1700. In these implementations, the algorithm and the mapping table can be stored in a non-transitory memory of the lidar system 1700 as machine readable instructions and reference data, respectively.

[0297] In some implementations, a lidar system (e.g., the lidar system 1700) adaptively controls a detection threshold of the lidar detection system (e.g., the lidar detection system 1704) to improve detection probability and/or maintain the false alarm rate (FAR) below a set FAR limit. The detection threshold can be a threshold used by the lidar detection system for determining an arrival time and/or the optical intensity of a sensor signal based on time-to-digital (TDC) conversion. In some cases, the detection threshold includes a readout threshold. As described above, the readout threshold can be a threshold level used to identify a true event (detection of reflected light associated with an optical probe signal emitted by the lidar), based on a sensor signal.

[0298] FIG. 20A is a diagram illustrating a series of sensor signals 2002 in the presence of a background noise level 1905 having a time-varying average value. In some implementations, the lidar system 1700 detects the sensor signals 2002 using adaptive detection threshold levels. The lidar detection system 1704 uses a first detection threshold level I_th1 for generating first and second return signals using the first and second sensor signals, and upon detecting a larger background noise for the third and fourth sensor signals, it changes (e.g., increases) the threshold to a second level I_th2. Subsequently, upon detecting a lower background noise for the seventh sensor signal, the lidar detection system 1704 changes the detection threshold level back to the first level I_th1, or to a different value smaller than I_th2.

[0299] FIG. 20B is a diagram illustrating a series of sensor signals 2004 in the presence of a background noise level 2005 having a nearly constant average value. In some implementations, the lidar system detects the sensor signals 2004 using adaptive detection threshold levels.
The detection system 1704 can use a first detection threshold level I_th1 for generating first and second return signals using the first and second sensor signals, and upon measuring a smaller amplitude for the third and fourth sensor signals, it can change (e.g., decrease) the threshold to a second level I_th2. Subsequently, upon detecting a larger amplitude for the seventh sensor signal, the detection system 1704 can change the detection threshold level back to the first level I_th1, or to a different value larger than I_th2.

[0300] In some cases, in an adaptive emission control mode the lidar system 1700 adaptively controls one or both of a detection threshold and a characteristic of the optical probe signals to improve or maintain a probability of detection and/or maintain or reduce a false alarm rate below a threshold value. The threshold value for the false alarm rate can be a set value stored in a non-transitory memory of the lidar system, or a value determined by the lidar system 1700 based at least in part on signals and data generated by the lidar detection system 1704 and the emission control system 1710.

[0301] FIG. 21 is a flow diagram illustrating an example of a process or routine 2100 implemented by one or more processors of the lidar system 1700 to adaptively control the characteristics of the optical probe signals emitted by the lidar emission system 1702. In various implementations, the at least one processor can be a processor of the lidar detection system 1704 and/or the emission control system 1710. In some cases, the at least one processor can be the processor 1504 of the device 1500 used in the vehicle 1400, described above with respect to FIGs. 14 and 15.

[0302] At block 2102, the lidar system 1700 emits one or more optical probe signals to an environment. In some cases, the optical probe signals can include periodically emitted identical probe signals each including a sequence of optical pulses (e.g., sequence 1802 or sequence 1804).

[0303] At block 2104, the lidar system 1700 receives light from the environment. A first portion of the received light includes one or more reflected optical signals resulting from the one or more first optical probe signals emitted at block 2102 being reflected from a first reflection point within the environment. The first reflection point can be on an object. A second portion of the received light includes light received in time intervals between the emission of the one or more optical probe signals. For example, a first portion of received light can include a reflected optical signal resulting from a first optical probe signal of the one or more optical probe signals being reflected from within the environment, and the second portion of the received light can include light received after the first portion and before emission of a second optical probe signal by the emitter (emission system).

[0304] At block 2106, the lidar system 1700 generates one or more sensor signals using the first and second portions of light received at block 2104.

[0305] At block 2108, the lidar system 1700 determines a noise level and/or a sensor signal amplitude based at least in part on the sensor signals generated at block 2106 based on one or both of the first and second portions of received light. The noise level can include a background noise. The noise level can be generated based at least on a background signal generated by the readout system 1730 of the lidar detection system 1704 using the first and/or the second portion of the received light.
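As a hedged sketch of the block 2108 computation, the noise level and SNR might be estimated from samples taken inside and outside the signal window along the following lines (the use of numpy and all names are assumptions, not the disclosed implementation):

    import numpy as np

    def estimate_noise_and_snr(signal_window, noise_window):
        # Average background noise from samples between probe emissions
        # (the second portion of the received light).
        noise_level = float(np.mean(noise_window))
        # Peak sensor-signal amplitude inside the signal window.
        amplitude = float(np.max(signal_window))
        noise_rms = float(np.std(noise_window)) or 1e-12  # guard against zero
        snr = (amplitude - noise_level) / noise_rms
        return noise_level, amplitude, snr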
In some examples, the noise level can be associated, at least partially, with the noise generated by the readout system 1730 or the background light received by the detection system 1704. In some cases, the noise level can include an average noise level determined based on a plurality of noise levels. In some cases, the signal amplitude can be an average sensor signal amplitude estimated using the intensities of the one or more sensor signals generated at block 2106. In some cases, the lidar system 1700 determines a signal-to-noise ratio based on the determined noise level and a sensor signal amplitude.

[0306] At the decision block 2110, the lidar system 1700 determines whether the noise and/or the signal amplitude generated at block 2108 is within an acceptable range. In some cases, the lidar system 1700 determines a signal-to-noise ratio (SNR) or an average signal-to-noise ratio associated with the noise and/or the signal amplitude generated at block 2108, and compares the determined SNR with a target SNR or threshold SNR. In some cases, the noise level includes a background noise level and the lidar system compares the background noise level with a threshold background noise level.

[0307] In some implementations, at block 2110 the lidar system 1700 compares a measured amplitude of at least one sensor signal of the one or more sensor signals with an acceptable range for the amplitude of the one or more sensor signals. Alternatively or additionally, the lidar system can determine whether the background signal is within an acceptable range for the background signal. The acceptable range for the amplitude of the one or more sensor signals and the background signal can be stored in a memory of the lidar system or determined by the lidar system.

[0308] If at the decision block 2110 the lidar system 1700 determines that the noise and/or the signal amplitude generated at block 2108 is within the acceptable range, the process moves back to block 2102 where the lidar system emits one or more optical probe signals having the same characteristics as the previously emitted optical probe signals.

[0309] If at the decision block 2110 the lidar system 1700 determines that the noise and/or the sensor signal amplitude generated at block 2108, based on one or both of the first and second portions of received light, is not within the acceptable range, the process moves to block 2112 where the lidar system generates an emission control signal based at least in part on the noise and/or the signal amplitude determined at block 2108. The emission control signal can be configured to change a characteristic (e.g., a number of pulses, an amplitude of a pulse, a delay between two subsequent pulses, and the like) of the probe signals.

[0310] At block 2114, the lidar system 1700 changes a characteristic of the probe signals based on the emission control signal generated at block 2112. In some implementations, the lidar system changes a characteristic of the probe signals to maintain the probability of detecting the subsequent reflected optical signals within a set range. In some examples, the set range is stored in a memory of the system. In other examples, the set range can be determined by the lidar system 1700.
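Taken together, blocks 2102-2114 might be summarized by the loop sketched below; every helper is a hypothetical stand-in for the corresponding lidar subsystem, not an API of the disclosed system, and the two-threshold pulse-count rule is elaborated in the next paragraph:

    def run_adaptive_emission_loop(lidar, probe):
        while lidar.active:
            lidar.emit(probe)                                    # block 2102
            light = lidar.receive()                              # block 2104
            sensors = lidar.generate_sensor_signals(light)       # block 2106
            noise, amplitude = lidar.estimate(sensors)           # block 2108
            if lidar.within_acceptable_range(noise, amplitude):  # block 2110
                continue                                         # re-emit unchanged
            ctrl = lidar.emission_control_signal(noise, amplitude)  # block 2112
            probe = lidar.apply_characteristic_change(probe, ctrl)  # block 2114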
As described above, depending on the noise and/or the sensor signal amplitude generated at block 2108, alone or combined with other factors (e.g., the number of pulses included in a probe signal emitted at block 2102, a signal-to-noise ratio determined at block 2108, or other factors), the lidar system 1700 can change a characteristic of the probe signals by increasing or decreasing a number of pulses in subsequent optical probe signals emitted after the change is made. In some examples, when a noise level (e.g., a background noise level) measured at block 2108 is lower than a first threshold level and larger than a second threshold level, the lidar system 1700 increases the number of pulses, and when the noise level is lower than the second threshold level, the lidar system 1700 decreases the number of pulses. In some examples, when the measured noise level is larger than the first threshold level, the lidar system 1700 increases the number of pulses to improve detection probability. In some examples, when the measured noise level is lower than the second threshold level, the lidar system 1700 decreases the number of pulses to reduce power consumption. Once the characteristic of the optical probe signals has been changed, the process 2100 proceeds back to block 2102 and the lidar system 1700 emits one or more optical probe signals with the new characteristic (e.g., a different number of optical pulses).

[0311] Alternatively or additionally, in some cases, at block 2112 the lidar system 1700 can change the detection threshold level based at least in part on the noise and/or the signal amplitude determined at block 2108 using the sensor signals generated at block 2106. In some such cases, where the lidar system 1700 does not change the characteristics of the optical probe signals, the process 2100 proceeds back to block 2106 and the lidar system 1700 generates one or more subsequent sensor signals based on the changed detection threshold level (dashed arrow).

Adaptive filter control

[0312] As the distance and orientation of objects in an environment scanned by a lidar system change, the amplitude and temporal profile of the sensor signals that the lidar generates based on reflected light from these objects can change. As such, if certain parameters of the detection system (e.g., parameters of the readout system) have fixed values, the probability of detecting events associated with these objects can change (e.g., diminish) as the objects move, rotate, or otherwise reflect different portions of the incident probe beams at different times. Additionally, as a lidar system scans different portions of its field of view (FOV), it can receive reflected light having different intensities based on the distance, reflectivity, and/or orientation of the objects in different portions of the FOV.

[0313] Some of the lidar systems and detection methods disclosed herein can maintain or improve the probability of detecting true events when the distance, reflectivity, and/or orientation of the objects in the environment change as a lidar system scans the environment, by adaptively adjusting one or more parameters of the detection system.

[0314] In some implementations, the lidar system adaptively adjusts at least one parameter of the detection system based at least in part on a background signal or a change in a background signal, where the background signal indicates an amount of background light received by the lidar.
Alternatively or in addition, the lidar system can adaptively adjust the at least one parameter of the detection system based on an intensity of a reflected signal received by the detection system. The lidar system can determine or estimate the intensity of the reflected signal received by the detection system by directly measuring the amplitude of a corresponding sensor signal, and/or by measuring a time-of-flight indicating a delay between the reception of the corresponding sensor signal and the emission of a probe signal that generates the reflected signal.

[0315] In various implementations, the adaptively adjusted parameter of the detection system includes, but is not limited to, a detection threshold in the readout circuit or a parameter (e.g., bandwidth) of a filter (herein referred to as a filter parameter) that filters sensor signals before using them for event detection. In some examples, the detection threshold includes a threshold level used by a detection system to generate a timing signal (e.g., an analog timing signal). The detection system illustratively uses time-to-digital conversion to generate the return signals using sensor signals (or filtered sensor signals). An event detection circuit can use the sensor signals and/or the timing signals to detect events. The filter parameter can be a characteristic of a transfer function of the filter. For example, the filter parameter can include a bandwidth (e.g., a 3 decibel (dB) or full-width-half-maximum bandwidth), a filter weight coefficient, a cut-off frequency, a peak transmission, a spectral shape, or other characteristics of the transfer function of the filter.

[0316] In some embodiments, the filter includes an adaptive filter whose transfer function is controllable by adjusting one or more filter parameters, e.g., by providing one or more control signals.

[0317] The lidar system can control or optimize a filter parameter using an optimization algorithm based on a background signal level, a sensor signal amplitude, or a time-of-flight associated with one or more sensor signals. For example, the lidar system can use a Least Mean Squares (LMS) algorithm to determine a value of a filter parameter (e.g., a weight coefficient or bandwidth) that produces the least mean square of an error signal representing, e.g., a difference between the desired and the actual amplitude of a sensor signal. In some cases, the lidar system includes a processor that executes machine readable instructions stored in a memory of the lidar system to determine the value of the filter parameter using the LMS algorithm. In some cases, the lidar system adjusts a bandwidth of the filter (e.g., by tuning a filter parameter) to control an integration time used for measuring and/or detecting sensor signals during event detection.

[0318] In some embodiments, the detection system of the lidar scans the FOV of the lidar to redirect light received within a portion of the FOV to one or more pixels of a sensor. For example, the detection system divides the FOV into a number of FOV slices (e.g., non-overlapping FOV slices) each including a spatial angle, and sequentially receives light via different FOV slices until the entire FOV is scanned. In some such embodiments, the lidar system adjusts a parameter of the detection system for each FOV slice, based on a background signal generated for that slice or a background signal generated based on at least one pixel illuminated by the light received via that slice.
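Hypothetically, the per-slice bookkeeping could be as simple as the sketch below, in which a background signal measured for a slice in one measurement period sets that slice's detection parameters for the next period; the scaling rules, names, and constants are all assumptions:

    from dataclasses import dataclass

    @dataclass
    class SliceParams:
        threshold: float
        bandwidth_hz: float

    slice_params = {}

    def update_slice(slice_id, background, k_threshold=4.0, base_bw=50e6):
        # Assumed rules: the detection threshold scales with the measured
        # background level, and the filter bandwidth is derated as the
        # background grows.
        slice_params[slice_id] = SliceParams(
            threshold=k_threshold * background,
            bandwidth_hz=base_bw / (1.0 + background),
        )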
For example, the lidar system can use a background signal generated based on light received via a FOV slice during a first measurement period to adjust a parameter of the detection system (e.g., a detection threshold or filter parameter) for measuring light received via the same FOV slice during a second measurement period. In some cases, the detection threshold can include a threshold level used by an event detection circuit or a threshold level used by a time-to-digital converter (TDC).

[0319] In some implementations, the methods and systems described herein are used to adaptively control a parameter of the lidar detection system to improve detection probability while maintaining the false alarm rate (FAR) below a predefined FAR limit. In some cases, the predefined FAR limit can be a value stored in a memory (e.g., non-transitory memory) of the lidar system.

[0320] FIG. 22 illustrates a block diagram of an example lidar detection system 2200 (also referred to as detection system or detection subsystem) of a lidar system and generation of two sensor signals by the lidar detection system 2200 upon receiving reflected optical signals from an object 2205 at two different orientations. The lidar detection system 2200 includes an optical system 2210, a sensor 2220, and a readout system 2230. The optical system 2210 receives light 2209 from an environment surrounding the lidar (e.g., light reflected by the object 2205) and redirects the received light 2209 to the sensor 2220 that converts the received light to sensor signals 2223. In some examples, the sensor 2220 includes one or more pixels and the sensor signal is an electrical signal generated by one or a group of pixels upon being illuminated by the light redirected to the sensor 2220 by the optical system 2210. The sensor signals 2223 are transmitted from the sensor 2220 to the readout system 2230 where they are processed and analyzed to detect an event (e.g., an event corresponding to detection of the object 2205).

[0321] In some implementations, the readout system 2230 includes a readout circuit 2232 and an event validation circuit 2234. The readout circuit 2232 detects events based on the sensor signals 2223 and the event validation circuit 2234 can validate a detected event as a true event. A true event corresponds to detection of reflected light associated with a probing signal emitted by the lidar system. As described above, in some embodiments, the event validation circuit 2234 generates a confidence signal indicative of a probability of an event identified as a true event being associated with a probing signal emitted by the lidar system. When the object 2205 rotates or moves away from the lidar system, the amplitude and/or the temporal profile of the sensor signals 2223 generated by reflected signals received from the object 2205 can change. In some cases, such a change results in a reduction of the probability of detecting the corresponding event (reception of the reflected signal). In the example shown, the sensor signal 2240a associated with a first orientation 2203 of the object 2205 has a larger amplitude compared to the sensor signal 2240b associated with a second orientation 2207 of the object 2205, where the second orientation 2207 is rotated by an angle θ with respect to the first orientation. In some cases, the rotation of the object 2205 also affects the amplitude of the noise floor or noise level 2260. In some such cases, at least a portion of the noise level 2260 includes noise generated by the background light.
In some cases, during a period when the sensor signal 2240a (or 2240b) is generated, the optical system 2210 receives light via a narrow slice of the FOV that mainly captures light reflected by the object 2205; as such, the background light received during this period includes background light reflected by the object 2205 (e.g., sunlight reflected by the object 2205) and an average amplitude of the noise level 2260 can be correlated to an orientation of the object 2205. In some such cases, the lidar system can use the noise level 2260 to adjust the readout circuit 2232 (e.g., a filter of the readout circuit 2232) to improve the probability of detecting sensor signals generated by the object 2205 as the object 2205 rotates or moves.

[0322] In some examples, the temporal alignment between the emission of probe signals and the scan rate of the FOV is adjusted such that during a first portion of the period associated with receiving light from a specific FOV slice, no sensor signal is generated, so that the background noise can be directly measured by measuring the noise level 2260. In some cases, the background noise measured during the first portion of the measurement period can be used to adjust a parameter of the detection system during a second portion of the measurement period when the sensor signal is received. Using this approach, the lidar detection system 2200 can be adaptively adjusted in real time to increase the probability of detecting an object (e.g., the object 2205) as the object changes its orientation or moves with respect to the lidar system (e.g., moves away). The same approach can be used to increase the probability of detecting true events based on light received from different FOV slices by the lidar detection system 2200. In the example shown, the lidar system can reduce a detection threshold level for detecting the sensor signal 2240b, compared to that of the sensor signal 2240a, based on background noise measured (e.g., using the noise level 2260) during a portion of the measurement period immediately before the sensor signal 2240b is generated. Alternatively or in addition, the lidar system can reduce a bandwidth of a filter that filters the sensor signal 2240b, to increase a width of the filtered sensor signal in the time domain and to allow for a larger integration time (e.g., to compensate for the reduction in amplitude). The same approach can be used when the amplitude of a sensor signal is reduced due to a longer distance between the object 2205 and the lidar system, or due to reflection of probe beams off objects having low reflectivity. In the case of longer distances, the time-of-flight of a first sensor signal can be used to determine a distance and to adjust a parameter of the detection system for detecting an event in a second sensor signal received after the first sensor signal.

[0323] FIG. 23 is a block diagram illustrating the readout system 2332 of a lidar system having adaptive filter and threshold control. The readout system 2332 receives sensor signals 2223 from the sensor 2320 and generates event signals 2360 that can indicate detection of a true event. The readout system 2332 includes a filter 2335, a background noise circuit 2325, an event detection circuit 2345, and a readout control circuit 2350. In some implementations, one or both of the readout control circuit 2350 and the background noise circuit 2325 can be included in a readout circuit of the readout system.
In some examples, the readout control circuit 2350 can be included in a control unit separate from the readout system 2332. The true event can include detecting a reflected optical signal resulting from an optical probe signal emitted by the corresponding lidar system being reflected from within the environment (e.g., by an object). The probability of detecting a true event in an event signal (of the event signals 2360) can be affected, at least partly, by the characteristics of the filter 2335 and a detection threshold level used by the comparator 2346.

[0324] In some implementations, the readout system 2332 and/or the lidar detection system 2200 include one or more devices or components of the device 1500 described above with respect to FIG. 15. For example, the readout system 2332 can include the processor 1504, the memory 1506, the storage device 1508, the input interface 1510, the output interface 1512, the communication interface 1514, and/or the bus 1502. Various processes and functions described below with respect to the readout system 2332 can be performed by at least one processor (e.g., processor 1504) that executes machine readable instructions stored in a memory (e.g., memory 1506) and/or storage component (e.g., storage device 1508) to perform the processes and functions.

[0325] The readout system 2332 can exchange data with one or more components of the lidar detection system 2200 (e.g., the sensor 2320 of the lidar detection system 2200) using an input interface (e.g., the input interface 1510), an output interface (e.g., the output interface 1512), and/or a communication interface (e.g., the communication interface 1514).

[0326] The lidar detection system 2200 can exchange data with one or more components of a lidar system (e.g., the lidar system 2500 or the lidar sensors 1402b) or other devices of the vehicle 1400 (e.g., the autonomous vehicle compute 1402f), using an input interface (e.g., the input interface 1510), an output interface (e.g., the output interface 1512), and/or a communication interface (e.g., the communication interface 1514).

[0327] In some embodiments, the readout system 2332 transmits a first portion 2223a of the sensor signals 2223 to the filter 2335 and a second portion 2223b of the sensor signals 2223 to the background noise circuit 2325. In some cases, the second portion 2223b of the sensor signals 2223 includes sensor signals generated during periods where reflections of probe signals emitted by the lidar are not expected. The filter 2335 tailors the spectrum of the first portion 2223a of the sensor signals 2223 according to its transfer function to generate filtered sensor signals 2225. In various implementations, the transfer function can include a bandpass, low pass, or high pass response. The filter 2335 can be an adaptive filter whose transfer function is controllable by adjusting one or more filter parameters. For example, a filter control circuit 2350a of the readout control circuit 2350 can adjust a filter parameter (e.g., electronically) by generating a filter control signal 2226a and sending the filter control signal 2226a to the filter 2335. The filter parameters can include a bandwidth (e.g., a 3 dB or full-width-half-maximum bandwidth), a cut-off frequency, a peak transmission, a spectral shape, a slope (e.g., roll-off slope), or other characteristics of the transfer function of the filter 2335.

[0328] In some examples, the filter 2335 can include a finite impulse response (FIR) filter. In some cases, the filter 2335 can be a digital filter.
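A minimal digital FIR stage of this kind might look like the following sketch, in which the transfer function is set entirely by the tap (weight) coefficients that a control circuit can overwrite at run time; the class name and the use of numpy are assumptions:

    import numpy as np

    class AdaptiveFIR:
        def __init__(self, weights):
            self.weights = np.asarray(weights, dtype=float)

        def set_weights(self, weights):
            # E.g., in response to a filter control signal.
            self.weights = np.asarray(weights, dtype=float)

        def apply(self, sensor_samples):
            # Convolve the digitized sensor signal with the tap weights.
            return np.convolve(sensor_samples, self.weights, mode="same")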
In various implementations, the filter 2335 (or 2635) can be implemented using a digital circuit, a computer, a field programmable gate array (FPGA), or other types of hardware. The transfer function of the filter 2335 can be tailored by adjusting one or more filter weight coefficients.

[0329] The filtered sensor signals 2225 are provided to the event detection circuit 2345 where events are detected using the filtered sensor signals 2225. As a non-limiting example, the event detection circuit 2345 includes at least one comparator 2346 and one time-to-digital converter (TDC) 2348. The comparator 2346 includes at least two input ports, one for receiving the filtered sensor signals 2225 and the other for receiving threshold control signals 2226b used to generate timing signals (e.g., analog timing signals) based on the filtered sensor signals 2225. The comparator 2346 transmits the timing signals to the TDC 2348 where they are digitized. The output of the TDC 2348 can include event signals 2360 (e.g., digital event signals). An event signal can include or indicate a true event indicating detection of a reflected optical signal resulting from an optical probe signal emitted by the corresponding lidar system being reflected from within the environment (e.g., by an object). In some cases, an event signal includes a true event indicative of a delay between emission of a probe signal by the lidar and reception of the corresponding sensor signal by the readout system 2332. In some examples, the output of the TDC 2348 is provided to an event validation circuit 2234 to further validate the true event and generate a return signal. A timing signal can indicate a delay between an emission time of a probe signal by the lidar and the detection of the corresponding event.

[0330] In some embodiments, the readout system 2332 adaptively adjusts the detection threshold levels used for generating timing signals, and/or the transfer function of the filter 2335, to maintain or increase the probability of detecting events based on the received sensor signals 2223. In some cases, the readout system 2332 adaptively adjusts the detection threshold levels used for generating timing signals, and/or the transfer function of the filter 2335, to maintain a probability of detecting true events based on the received sensor signals 2223 above a threshold probability (e.g., a predefined threshold probability). In some cases, the readout system 2332 adaptively adjusts the detection threshold levels used for generating timing signals, and/or the transfer function of the filter 2335, to increase a probability of detecting true events based on a first group of received sensor signals, compared to a probability of detecting true events based on a second group of received sensor signals received after the first group.

[0331] Additionally or alternatively, in various implementations, the readout system 2332 adaptively adjusts the detection threshold levels and/or the transfer function to maintain the FAR below a FAR limit.

[0332] In some implementations, one or both of the predefined threshold probability and the FAR limit are stored in a non-transitory memory of the lidar system. In some examples, a processor of the readout system 2332 or another processor of the corresponding lidar system can determine one or both of the predefined threshold probability and the FAR limit.
For example, the processor can determine the predefined threshold probability (or the FAR limit) for a measurement period based at least in part on sensor signals received by the readout system 2332, or background signals generated by the background noise circuit 2325, in a previous measurement period.

[0333] As a non-limiting example, the background noise circuit 2325 uses the second portion 2223b of the sensor signals 2223 to generate background signals 2224 indicative of background light received by the detection system during and/or in between receiving sensor signals 2223. The background noise circuit 2325 can generate the background signals 2224 using the methods described above with respect to generating the background signals 325 by the readout circuit 632. In some examples, the sensor 2320 can include one or more features described above with respect to the sensor 320 in FIGs. 4, 6, or 8. Accordingly, an individual background signal of the background signals 2224 can be generated for an individual pixel or a group of pixels in the sensor 2320.

[0334] The background noise circuit 2325 transmits the background signals 2224 to the readout control circuit 2350. In the example shown, the readout control circuit 2350 includes a filter control circuit 2350a and a threshold control circuit 2350b. The filter control circuit 2350a generates filter control signals 2226a based at least in part on the background signals 2224 received from the background noise circuit 2325. The threshold control circuit 2350b generates threshold control signals 2226b based at least in part on the background signals 2224 received from the background noise circuit 2325. Additionally or alternatively, one or both of the filter control circuit 2350a and the threshold control circuit 2350b can generate the filter control signals 2226a and the threshold control signals 2226b based on the sensor signals 2223, and/or signals indicative of a time-of-flight determined based on the sensor signals 2223 or the filtered sensor signals 2225. The filter control circuit 2350a transmits the filter control signals 2226a to the filter 2335 to adjust the transfer function of the filter 2335. In some examples, where the filter 2335 is a FIR filter, the filter control signals 2226a include filter weight coefficients. Accordingly, the transfer function of the FIR filter is tailored based on the filter weight coefficients. The threshold control circuit 2350b transmits the threshold control signals 2226b to the comparator 2346 to adjust the detection threshold level of the comparator 2346.

[0335] In some examples, upon receiving a sensor signal (e.g., the sensor signal 2340), the background noise circuit 2325 can determine that an estimated background noise level for the sensor signal 2340 (e.g., based on a sensor signal received before the sensor signal 2340) is decreased compared to a previously determined background noise level. In some such examples, where the FOV slice through which the corresponding reflected signal is received is narrow, a reduction of the background noise indicates that the object has reflected less light (e.g., because the distance has increased) and therefore the amplitude of the sensor signal 2340 following the background measurement can be smaller compared to a previous sensor signal. In some examples, the filter control circuit 2350a reduces the bandwidth of the filter 2335 (e.g., a band pass filter) based on the corresponding background signal, to stretch the filtered sensor signal 2342 in the time domain.
Stretching the filtered sensor signal 2342 in the time domain increases the integration time for the resulting filtered sensor signal 2342 in the event detection circuit 2345, and compensates for a possible reduction in the amplitude of the sensor signal 2340. Alternatively or in addition, the threshold control circuit 2350b can reduce a detection threshold level 2315 of the comparator 2346 such that the comparator can capture the filtered sensor signal 2342 having a reduced amplitude.

[0336] In some examples, the filter control circuit 2350a and the threshold control circuit 2350b can receive a third portion 2223c of the sensor signals 2223 and, upon determining that its amplitude has decreased compared to a previously received sensor signal, they can decrease the bandwidth of the filter 2335 and/or the threshold level of the comparator 2346.

[0337] In some embodiments, the lidar system can include reference data stored in a non-transitory memory of the lidar usable by the filter control circuit 2350a and/or the threshold control circuit 2350b for determining a change in the filter transfer function and/or threshold level based on the background signals 2224 and/or the sensor signals 2223. In some cases, the reference data can include a mapping table that includes associations between different background signal amplitudes or amplitude variations and the corresponding adjustment to the filter transfer function. Another mapping table can include associations between background signal amplitudes or amplitude variations and the corresponding adjustment to the threshold level. Similarly, some mapping tables can include associations between the sensor signals 2223, or variations of the sensor signals 2223, and the corresponding adjustment to the threshold level or the filter transfer function.

[0338] In some cases, the filter control circuit 2350a and/or the threshold control circuit 2350b can determine an adjustment to the filter transfer function and/or to the threshold level using an algorithm or a circuit configuration.

[0339] In some implementations, the background noise circuit 2325 can be configured to generate background signals 2224 indicative of the optical power or intensity of background light received by the lidar detection system 2200 (e.g., by the optical system 2210). In some cases, the background light can be a portion of the light 2209 received by the lidar detection system 2200. In some cases, the background noise circuit 2325 comprises digital and/or analog electronic circuits configured to electronically process the second portion 2223b of the sensor signals 2223 to generate the background signals 2224. In some examples, the background noise circuit 2325 can include a non-transitory memory and a hardware processor configured to execute machine-readable instructions stored in the non-transitory memory to process the second portion 2223b of the sensor signals 2223 and generate the background signals 2224.

[0340] In some implementations, the readout control circuit 2350 can be configured to generate threshold control signals 2226b and/or filter control signals 2226a using one or both of the background signals 2224 and the third portion 2223c of the sensor signals 2223.
In some cases, the readout control circuit 2350 comprises digital and/or analog electronic circuits configured to electronically process one or both of the background signals 2224 and the third portion 2223c of the sensor signals 2223 to generate the threshold control signals 2226b and/or the filter control signals 2226a. In some examples, the readout control circuit 2350 can include a non-transitory memory and a hardware processor configured to execute machine-readable instructions stored in the non-transitory memory to process one or both of the background signals 2224 and the third portion 2223c of the sensor signals 2223 to generate the threshold control signals 2226b and/or the filter control signals 2226a. In some implementations, the machine-readable instructions can include a Least Mean Squares (LMS) algorithm configured to determine filter weight coefficients used for generating the filter control signals 2226a.

[0341] In some implementations, the event detection circuit 2345 can be configured to generate event signals 2360 using the filtered sensor signals 2225 and the threshold control signals 2226b. In some examples, the event detection circuit 2345 can be a digital and/or analog electronic circuit configured to electronically compare a filtered sensor signal 2225 with a threshold control signal 2226b and generate the event signal 2360 based at least in part on the outcome of the comparison. In some cases, the event detection circuit 2345 may comprise a comparator circuit and a time-to-digital conversion circuit in communication with the comparator circuit. In some examples, the event detection circuit 2345 can include a non-transitory memory and a hardware processor configured to execute machine-readable instructions stored in the non-transitory memory to generate event signals 2360 using the filtered sensor signals 2225 and, in some cases, using the threshold control signals 2226b. In some implementations, the machine-readable instructions comprise a digital implementation of a comparator and a time-to-digital converter configured to generate the event signals 2360 using the filtered sensor signals 2225 and, in some cases, using the threshold control signals 2226b.

[0342] As a non-limiting example, FIGs. 24A-24B illustrate two subsequent filtered sensor signals output by a filter (e.g., the filter 2335) of the readout system (e.g., the readout system 2332) of a lidar when the filter is not controlled (FIG. 24A) and when the filter is adaptively controlled (FIG. 24B). The corresponding sensor signals are generated using reflected light received via a narrow FOV or a narrow FOV slice, resulting in the background noise being associated with background light reflected by an object that also reflects the light that generates the corresponding sensor signals. As such, in this case, the background noise can indicate the reflectivity of the object or the amplitude of a portion of a probe signal reflected by the object. The detection system (e.g., the lidar detection system 2200) of the lidar receives a first reflected signal and generates a first filtered sensor signal 2408 during a first measurement window (W1) 2401 and uses a first threshold level 2413 to generate a timing signal. In a second (subsequent) measurement window (W2) 2402, the detection system measures background noise 2468 and generates a background signal.
In the example shown, the background noise 2468 decreases from the measurement window W1 (or a measurement window prior to the measurement window W1) to the measurement window W2; as such, the background signal indicates that the background noise has decreased. In a third measurement window (W3) 2403 after the second measurement window (W2) 2402, the detection system receives a second reflected signal substantially identical to the first reflected signal (having substantially the same intensity and temporal profile). When the filter is not controlled (FIG. 24A), in response to receiving the second reflected signal during the third measurement window (W3) 2403, the detection system generates a second filtered sensor signal 2409 that is substantially identical to the first filtered sensor signal 2408 (e.g., having substantially the same amplitude, shape, and full-width-half-maximum). When the filter is adaptively controlled (FIG. 24B), the detection system decreases a bandwidth of the filter based on the background signal generated during the second measurement window (W2) 2402, e.g., to increase the integration time for a second filtered sensor signal 2410. Thus, the second filtered sensor signal 2410 generated in response to receiving the second reflected signal is stretched in the time domain compared to the first filtered sensor signal 2408. Additionally, when the filter is adaptively controlled, the detection system can decrease a second threshold level 2415 (compared to the threshold level 2413) for measuring the second filtered sensor signal 2410. However, in the absence of the adaptive filter control, the detection system does not change the threshold level 2414 (compared to the threshold level 2413) when measuring the second filtered sensor signal 2409 (the threshold level 2414 is substantially equal to the threshold level 2413). While in the example described above with respect to FIGs. 24A and 24B the detection system, receiving light from a narrow FOV, reduces the filter bandwidth in response to a reduction of the background noise, it should be understood that in some other examples, the detection system can increase the filter bandwidth in response to a reduction of the background noise.

[0343] In some cases, the FOV (or FOV slice) based on which the readout system is controlled can be large enough to receive background light not reflected from the object associated with a sensor signal. In some such cases, in response to determining that the background light has increased, the filter control circuit 2350a and/or the threshold control circuit 2350b decreases the bandwidth of the filter 2335 to reduce the background noise. Additionally, the threshold control circuit 2350b can change the threshold level of the comparator 2346 to maintain or increase the probability of detection of the sensor signal. As a non-limiting example, FIGs. 24C-24D illustrate two subsequent filtered sensor signals output by a filter (e.g., the filter 2335) of the readout system (e.g., the readout system 2332) of a lidar when the filter is not controlled (FIG. 24C) and when the filter is adaptively controlled (FIG. 24D). In this example, the corresponding sensor signals are generated using reflected light received via a wide FOV; as such, the background noise can be associated with background light that is not reflected by an object that reflects the light used to generate the corresponding sensor signals.
The detection system of the lidar receives a first reflected signal, generates a first filtered signal 2406 during a first measurement window (W1) 2401, and uses a first threshold level 2416 to generate a timing signal. In a second (subsequent) measurement window (W2) 2402, the detection system of the lidar can determine the background light and generate a background signal. In the example shown, the background noise 2470 can increase from the measurement window W1 (or a measurement window prior to the measurement window W1) to the measurement window W2; as such, the background signal indicates that the background noise has increased. In a third measurement window (W3) 2403 after the second measurement window (W2) 2402, the detection system can receive a second reflected signal substantially identical to the first reflected signal (having substantially the same intensity and temporal profile). When the filter 2335 is not controlled (FIG. 24C), in response to receiving the second reflected signal during the third measurement window (W3) 2403, the detection system generates a second filtered sensor signal 2411 that is substantially identical to the first filtered sensor signal 2406 (e.g., having substantially the same amplitude, shape, and full-width-half-maximum). When the filter is controlled (FIG. 24D), the detection system can decrease a bandwidth of the filter based on the background signal generated during the second measurement window W2 (e.g., to increase the integration time for the second filtered sensor signal) and/or to reduce background noise. Thus, in response to receiving the second reflected signal, the detection system generates a second filtered sensor signal 2412 that is stretched in the time domain compared to the first filtered sensor signal 2406; additionally, the background noise 2471 in the third measurement window (W3) 2403 is reduced as a result of the bandwidth reduction. In some cases, when the filter is adaptively controlled (FIG. 24D), the detection system decreases a threshold level 2419 for measuring the second filtered sensor signal 2412 to compensate for the smaller amplitude of the second filtered sensor signal 2412. In some examples, the detection system does not change the bandwidth of the filter but increases the threshold level 2417 to keep it above the noise floor. However, in the absence of the adaptive filter control, the detection system does not change the threshold level 2417 for detecting the second filtered signal 2411 (the threshold level 2416 is substantially equal to the threshold level 2417). While in the examples described above with respect to FIGs. 24C and 24D the detection system that receives light from a wide FOV reduces the filter bandwidth in response to an increase in the background signal, it should be understood that in some other examples, the detection system can increase the filter bandwidth in response to an increase in the background signal.

[0344] A lidar system can receive light reflected by an object associated with a sensor signal, and light directly received from an environment surrounding the object, via its field of view (FOV). In some cases, when the FOV is narrow, an amount of light (e.g., power of light) received from the surrounding environment is negligible compared to an amount of light received from the object (e.g., less than 5%, less than 10%, or less than 20%). In some cases, when the FOV is wide, an amount of light (e.g., power of light) received from the surrounding environment can be larger than 20% of the light received from the object.
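As a hedged summary of the FIGs. 24A-24D behavior, the sign of the bandwidth adjustment can depend on whether the FOV is narrow or wide; in the sketch below the names, the multiplicative update, and the gain are assumptions, and, as noted above, other implementations can use the opposite sign:

    def adapt_readout(params, background_change, narrow_fov, gain=0.2):
        # params: dict with "bandwidth_hz" and "threshold" entries (hypothetical).
        if narrow_fov and background_change < 0:
            # Narrow FOV, background decreased: the object likely reflects less
            # light, so stretch the pulse (smaller bandwidth) and lower the
            # threshold to keep capturing the weaker filtered signal.
            params["bandwidth_hz"] *= 1.0 - gain
            params["threshold"] *= 1.0 - gain
        elif not narrow_fov and background_change > 0:
            # Wide FOV, background increased: narrow the bandwidth to reject
            # background noise; the threshold can move either way (FIG. 24D).
            params["bandwidth_hz"] *= 1.0 - gain
        return params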
[0345] In some cases, the lidar detection system described above with respect to FIGs. 24A-24D includes the readout system 2332, the background noise circuit 2325 that generates the background signals, the filter control circuit 2350a that controls the bandwidth of the filter 2335, and the threshold control circuit 2350b that controls the threshold of the comparator 2346.

[0346] In various examples, the filter control circuit 2350a and/or the threshold control circuit 2350b can use an algorithm, reference data, or a mapping table for determining or changing the detection threshold of the comparator 2346 and/or the transfer function of the filter 2335 based on a plurality of parameters including, but not limited to, the background signal. As such, depending on the values of the other parameters, the readout system 2332 can respond differently to the same change in the background signal (e.g., with respect to adjusting the transfer function or the detection threshold used by the comparator).

[0347] FIG. 25 is a block diagram illustrating an example lidar system 2500 with adaptive readout circuit control. In this example, a sensor 2520 transmits a first portion of the sensor signals to a readout circuit 2530 and a second portion of the sensor signals to a sensor pixel monitor circuit 2525. The sensor pixel monitor circuit 2525 uses the second portion of the sensor signals to generate signals indicative of the background light received by one or more pixels of the sensor 2520. The control unit 2570 uses signals received from the sensor pixel monitor circuit 2525 to generate control signals usable by a readout control circuit 2550 for controlling one or more parameters of the readout circuit 2530. For example, the readout control circuit 2550 can control a detection threshold of the readout circuit 2530 (e.g., a threshold level used by an event detection circuit) and/or a transfer function of a filter that filters a portion of the sensor signals received from the sensor 2520 before they are received by an event detection circuit 2345 of the readout circuit 2530. In some implementations, the control unit 2570 can include non-transitory memory storing machine readable instructions and at least one processor configured to execute the machine readable instructions to at least control one or more parameters of the readout circuit 2530 based on signals received from the sensor pixel monitor circuit 2525. In some implementations, the readout control circuit 2550 can generate control signals for controlling a threshold level of the readout circuit 2530 and/or the transfer function of a filter in the readout circuit 2530, for light received from an FOV or a selected FOV portion (also referred to as an FOV slice) 2532 of the FOV 2505. The optical system 2510 can redirect light received from an FOV or FOV slice 2532 to illuminate one or more pixels of the sensor 2520. Accordingly, the sensor pixel monitor circuit 2525 can generate a signal (e.g., a background signal) based on one or more illuminated pixels associated with the FOV slice 2532. In some cases, the FOV slice 2532 can include light reflected off or generated by an object 2580 in the environment. In some such cases, an amount of light directly received from the environment via the slice 2532 can be less than 5%, 10%, or 20% of the light received from the object 2580.

[0348] In some implementations, the sensor pixel monitor circuit 2525 and the readout control circuit 2550 can be included in the readout circuit 2530.
In some examples, the readout circuit 2530 can include the readout system 2332 of FIG. 23.

[0349] Additionally, the control unit 2570 can control an emitter 2504 (e.g., a laser source) of the lidar system 2500 using an emission control circuit 2502. In some cases, the control unit 2570 can control the emitter 2504 to further maintain or increase the probability of detecting an event associated with emission of a probe beam by the lidar system 2500.

[0350] FIG. 26 illustrates an example of a filter 2635 and a filter control circuit 2650 used to control the filter 2635 based on a background signal 2634, in a lidar. In the example shown, the filter 2635 is a finite impulse response (FIR) filter. The filter control circuit 2650 and the filter 2635 can include one or more features described above for the filter control circuit 2350a and the filter 2335, respectively. In some embodiments, the sensor signal 2623 can be received from a sensor of a detection system (e.g., the sensor 2320) and the background signal 2634 can be generated by a background noise circuit (e.g., the background noise circuit 2325) of the same detection system. In some examples, the sensor signal 2623 and the background signal 2634 can be analog signals and the filter control circuit 2650 and the filter 2635 can include digital circuits. In some such examples, a first analog-to-digital converter (ADC) 2602 and a second ADC 2604 can convert the sensor signal 2623 and the background signal 2634 to digital signals, respectively.

[0351] The digital signal output by the ADC 2604 is transmitted to the filter control circuit 2650, which determines one or more filter weight coefficients for configuring the transfer function (e.g., a finite impulse response transfer function) of the filter 2635. The filter control circuit 2650 can control or optimize the filter weight coefficients based on the digitized background signal. Alternatively or in addition, the filter control circuit 2650 can control or optimize the filter weight coefficients based on the digitized sensor signal output by the ADC 2602. In some examples, the filter control circuit 2650 implements a Least Mean Squares (LMS) algorithm to determine the filter weight coefficients. The LMS algorithm produces the least mean square of an error signal representing, e.g., a difference between the desired and the actual amplitude of the sensor signal 2623. The filter control circuit 2650 can generate a threshold weight signal 2655 including the determined filter weight coefficients, and transmit the threshold weight signal 2655 to the filter 2635.

[0352] The digital signal output by the ADC 2602 is transmitted to the filter 2635, which digitally filters the digitized sensor signal based on a transfer function configured according to filter weights adjusted using the threshold weight signals 2655 received from the filter control circuit 2650.

[0353] FIG. 27 is a flow diagram illustrating an example of a process 2700 implemented by the lidar system shown in FIG. 23 or 25 to adaptively control the characteristics of a filter (e.g., the filter 2335) used to tailor the sensor signals (e.g., the sensor signals 2223) before event detection in a detection system of the lidar. The detection system of the lidar can include the lidar detection system 2200 and/or the readout system 2332. In various implementations, the process 2700 can be performed by a processor that executes machine readable instructions stored in a memory of the lidar system.
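As one illustration of such machine readable instructions, the LMS weight update described for FIG. 26 might be written as follows; the step size mu, the factor of two, and all names are assumptions, not the disclosed implementation:

    import numpy as np

    def lms_update(weights, x_window, desired, mu=1e-3):
        # One LMS step: x_window holds the most recent len(weights) digitized
        # samples, and "desired" is the target amplitude for the filtered output.
        y = float(np.dot(weights, x_window))  # actual filtered amplitude
        error = desired - y                   # desired minus actual amplitude
        return weights + 2.0 * mu * error * x_window  # gradient-descent step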
The processor can be a processor of the lidar detection system 2200, the readout system 2230, or the readout circuit 2232 therein. In some implementations, the processor can be the processor 1504, and the memory can be the memory 1506 (or the storage device 1508), of the device 1500. In some examples, the processor can be a processor of the lidar sensors 1402b or other devices of the vehicle 1400 (e.g., the autonomous vehicle compute 1402f).

[0354] At block 2702, the detection system receives light from the environment. The received light can include a reflected signal resulting from reflection of a probe signal emitted by the lidar, and/or background light reflected by an object or directly received by the optical system of the lidar. In some examples, the received light includes light received via a wide FOV or a selected narrow FOV slice (e.g., the selected FOV portion 2532).

[0355] At block 2704, the detection system of the lidar generates a background signal indicative of background light received by a pixel or a group of pixels of the lidar sensor. In some cases, the light received by a pixel or a group of pixels can include light received via the wide FOV or the selected narrow FOV slice.

[0356] At block 2706, the detection system of the lidar (e.g., a filter control circuit of the detection system) determines a filter response (e.g., a filter transfer function) based at least in part on the background signal generated at block 2704. In some examples, the filter response (or transfer function) can include a finite impulse response (FIR). In some cases, the lidar system (e.g., a readout circuit of the lidar system) determines the transfer function of the filter by determining a filter weight and generating a threshold weight signal based on the determined filter weight.

[0357] At block 2708, the detection system of the lidar adjusts the transfer function or response of a filter of the detection system (e.g., a filter used for filtering the sensor signals before event detection) based on the determined filter response. In some cases, the filter can include a configurable filter.

[0358] In some implementations, the lidar system can continuously, periodically, or aperiodically repeat the process 2700 to adaptively control the response or the transfer function of the filter.

[0359] In some implementations, any of the lidar systems described above can include at least one non-transitory storage medium storing machine-executable instructions and at least one processor that executes the machine-executable instructions to control one or more devices, systems, or subsystems of the lidar system (e.g., the lidar detection system, or the emission control system). In some cases, the processor can control a readout system, a validation circuit, a detection control system, a sensor, or an optical system of the lidar detection system.

Example embodiments

[0360] Some additional nonlimiting examples of embodiments discussed above are provided below. These should not be read as limiting the breadth of the disclosure in any way.

Group 1

[0361] Example 1.
Example embodiments [0360] Some additional nonlimiting examples of embodiments discussed above are provided below. These should not be read as limiting the breadth of the disclosure in any way. Group 1 [0361] Example 1. A range finding system comprising: a detection subsystem that receives light from an environment and generates an event signal based at least in part on the received light, the detection subsystem comprising: a sensor that generates a sensor signal based on the received light; a filter that receives the sensor signal and generates a filtered sensor signal by filtering the sensor signal based on a transfer function of the filter; a background noise circuit that generates a background signal based on the received light, wherein the background signal indicates a background noise level; a readout control circuit that controls the transfer function based at least in part on the background signal, to maintain a probability of detecting a true event above a threshold probability; and an event detection circuit that generates the event signal based on the filtered sensor signal and a detection threshold level. [0362] Example 2. The range finding system of Example 1, wherein detecting the true event comprises detecting a reflected optical signal resulting from an optical probe signal emitted by the range finding system being reflected from within the environment. [0363] Example 3. The range finding system of any of Examples 1-2, wherein filtering the sensor signal comprises transforming a frequency spectrum of the sensor signal. [0364] Example 4. The range finding system of any of Examples 1-3, wherein the readout control circuit further controls the transfer function based at least in part on a prior sensor signal generated by the sensor before generation of the sensor signal. [0365] Example 5. The range finding system of any of Examples 1-4, wherein the event detection circuit comprises: a comparator that generates a timing signal using at least the filtered sensor signal and the detection threshold level, wherein the timing signal indicates a delay between an emission time of a probe signal emitted by the range finding system, and detection of an event; and a time-to-digital converter (TDC) that receives the timing signal and generates a digital timing signal. [0366] Example 6. The range finding system of any of Examples 1-5, wherein the readout control circuit controls the detection threshold level based at least in part on the background signal. [0367] Example 7. The range finding system of Example 6, wherein the readout control circuit controls the detection threshold level based at least in part on a prior sensor signal generated by the sensor before generating the sensor signal. [0368] Example 8. The range finding system of any of Examples 1-7, wherein the transfer function comprises a finite impulse response (FIR). [0369] Example 9. The range finding system of any of Examples 1-8, wherein the filter comprises a reconfigurable filter and the readout control circuit adaptively controls the transfer function of the reconfigurable filter. [0370] Example 10. The range finding system of any of Examples 1-9, wherein the readout control circuit controls the transfer function of the filter by determining a filter weight. [0371] Example 11. The range finding system of any of Examples 1-10, wherein the readout control circuit compares the background signal with a prior background signal to determine a change in a level of background noise received by the detection subsystem. [0372] Example 12.
The range finding system of Example 11, wherein a field of view of the range finding system is narrow and in response to determining that the background noise level is decreased, the readout control circuit decreases a bandwidth of the filter. [0373] Example 13. The range finding system of Example 11, wherein a field of view of the range finding system is wide and in response to determining that the background noise level is increased, the readout control circuit decreases a bandwidth of the filter. [0374] Example 14. The range finding system of any of Examples 12 and 13, wherein the readout control circuit further reduces the detection threshold level. [0375] Example 15. A method comprising: generating, by at least one processor, a sensor signal using a sensor and based on light received from an environment; receiving, by the at least one processor, the sensor signal and generating a filtered sensor signal using a filter and based on a transfer function; generating, by the at least one processor, a background signal based on the received light, wherein the background signal indicates a background noise level; controlling, by the at least one processor, the transfer function based at least in part on the background signal, to maintain a probability of detecting a true event above a threshold probability; and generating, by the at least one processor, an event signal based on the filtered sensor signal and a detection threshold level. [0376] Example 16. The method of Example 15, wherein detecting the true event comprises detecting a reflected optical signal resulting from an optical probe signal emitted by a range finding system being reflected from within the environment. [0377] Example 17. The method of any of Examples 15-16, wherein generating the filtered sensor signal comprises transforming a frequency spectrum of the sensor signal based on the transfer function. [0378] Example 18. The method of any of Examples 15-17, wherein controlling the transfer function further comprises controlling the transfer function based at least in part on a prior sensor signal generated by the sensor before generation of the sensor signal. [0379] Example 19. The method of any of Examples 15-18, further comprising: generating, by the at least one processor, a timing signal using at least the filtered sensor signal and the detection threshold level, wherein the timing signal indicates a delay between an emission time of a probe signal emitted by a range finding system, and detection of an event; and generating, by the at least one processor, a digital timing signal using the timing signal. [0380] Example 20. The method of any of Examples 15-19, further comprising controlling the detection threshold level based at least in part on the background signal. [0381] Example 21. The method of Example 20, wherein controlling the detection threshold level further comprises controlling the detection threshold level based at least in part on a prior sensor signal generated by the sensor before generating the sensor signal. [0382] Example 22. The method of any of Examples 15-21, wherein the transfer function comprises a finite impulse response (FIR). [0383] Example 23. The method of any of Examples 15-22, wherein the filter comprises a reconfigurable filter and controlling the transfer function comprises adaptively controlling the transfer function of the reconfigurable filter. [0384] Example 24.
The method of any of Examples 15-23, wherein controlling the transfer function comprises controlling the transfer function by determining a filter weight. [0385] Example 25. The method of any of Examples 15-24, further comprising, by the at least one processor, comparing the background signal with a prior background signal to determine a change in a level of background noise received by a detection subsystem. [0386] Example 26. The method of Example 25, further comprising, by the at least one processor, decreasing a bandwidth of the filter, in response to determining that the background noise level is decreased. [0387] Example 27. The method of Example 26, further comprising, by the at least one processor, reducing the detection threshold level. [0388] Example 28. At least one non-transitory storage media storing machine-executable instructions that, when executed by at least one processor, cause the at least one processor to: generate a sensor signal using a sensor and based on light received from an environment; receive the sensor signal and generate a filtered sensor signal using a filter and based on a transfer function; generate a background signal based on the received light, wherein the background signal indicates a background noise level; control the transfer function based at least in part on the background signal, to maintain a probability of detecting a true event above a threshold probability; and generate an event signal based on the filtered sensor signal and a detection threshold level. [0389] Example 29. The at least one non-transitory storage media of Example 28, wherein detecting the true event comprises detecting a reflected optical signal resulting from an optical probe signal emitted by a range finding system being reflected from within the environment. [0390] Example 30. The at least one non-transitory storage media of any of Examples 28-29, wherein the machine-executable instructions cause the at least one processor to generate the filtered sensor signal by transforming a frequency spectrum of the sensor signal based on the transfer function. [0391] Example 31. The at least one non-transitory storage media of any of Examples 28-29, wherein the machine-executable instructions cause the at least one processor to control the transfer function further based on a prior sensor signal generated by the sensor before generation of the sensor signal. [0392] Example 32. The at least one non-transitory storage media of any of Examples 28-31, wherein the machine-executable instructions further cause the at least one processor to: generate, by the at least one processor, a timing signal using at least the filtered sensor signal and the detection threshold level, wherein the timing signal indicates a delay between an emission time of a probe signal emitted by a range finding system, and detection of an event; and generate, by the at least one processor, a digital timing signal using the timing signal. [0393] Example 33. The at least one non-transitory storage media of any of Examples 28-32, wherein the machine-executable instructions further cause the at least one processor to control the detection threshold level based on the background signal. [0394] Example 34.
The at least one non-transitory storage media of Example 33, wherein the machine-executable instructions cause the at least one processor to control the detection threshold level further based on a prior sensor signal generated by the sensor before generating the sensor signal. [0395] Example 35. The at least one non-transitory storage media of any of Examples 28-34, wherein the transfer function comprises a finite impulse response (FIR). [0396] Example 36. The at least one non-transitory storage media of any of Examples 28-35, wherein the filter comprises a reconfigurable filter and controlling the transfer function comprises adaptively controlling the transfer function of the reconfigurable filter. [0397] Example 37. The at least one non-transitory storage media of any of Examples 28-36, wherein the machine-executable instructions cause the at least one processor to control the transfer function by determining a filter weight. [0398] Example 38. The at least one non-transitory storage media of any of Examples 28-37, wherein the machine-executable instructions further cause the at least one processor to compare the background signal with a prior background signal and determine a change in a level of background noise received by a detection subsystem. [0399] Example 39. The at least one non-transitory storage media of Example 38, wherein the machine-executable instructions further cause the at least one processor to decrease a bandwidth of the filter, in response to determining that the background noise level is decreased. [0400] Example 40. The at least one non-transitory storage media of Example 39, wherein the machine-executable instructions further cause the at least one processor to reduce the detection threshold level. [0401] Example 41. A system comprising: a detection subsystem that receives light from an environment and generates an event signal based at least in part on the received light, the detection subsystem comprising: a sensor that generates a sensor signal based on the received light; a filter that receives the sensor signal and generates a filtered sensor signal by filtering the sensor signal based on a transfer function of the filter; a background noise circuit that generates a background signal based on the received light, wherein the background signal indicates a background noise level; a readout control circuit that controls the transfer function based at least in part on the background signal, to maintain a probability of detecting a true event above a threshold probability; and an event detection circuit that generates the event signal based on the filtered sensor signal and a detection threshold level. [0402] Example 42. The system of Example 41, wherein detecting the true event comprises detecting a reflected optical signal resulting from an optical probe signal emitted by the system being reflected from within the environment. [0403] Example 43. The system of any of Examples 41-42, wherein filtering the sensor signal comprises transforming a frequency spectrum of the sensor signal. [0404] Example 44. The system of any of Examples 41-43, wherein the readout control circuit further controls the transfer function based at least in part on a prior sensor signal generated by the sensor before generation of the sensor signal. [0405] Example 45.
The system of any of Examples 41-44, wherein the event detection circuit comprises: a comparator that generates a timing signal using at least the filtered sensor signal and the detection threshold level, wherein the timing signal indicates a delay between an emission time of a probe signal emitted by the system, and detection of an event; and a time-to-digital converter (TDC) that receives the timing signal and generates a digital timing signal. [0406] Example 46. The system of any of Examples 41-45, wherein the readout control circuit controls the detection threshold level based at least in part on the background signal. [0407] Example 47. The system of Example 46, wherein the readout control circuit controls the detection threshold level based at least in part on a prior sensor signal generated by the sensor before generating the sensor signal. [0408] Example 48. The system of any of Examples 41-47, wherein the transfer function comprises a finite impulse response (FIR). [0409] Example 49. The system of any of Examples 41-48, wherein the filter comprises a reconfigurable filter and the readout control circuit adaptively controls the transfer function of the reconfigurable filter. [0410] Example 50. The system of any of Examples 41-49, wherein the readout control circuit controls the transfer function of the filter by determining a filter weight. [0411] Example 51. The system of any of Examples 41-50, wherein the readout control circuit compares the background signal with a prior background signal to determine a change in a level of background noise received by the detection subsystem. [0412] Example 52. The system of Example 51, wherein a field of view of the system is narrow and in response to determining that the background noise level is decreased, the readout control circuit decreases a bandwidth of the filter. [0413] Example 53. The system of Example 51, wherein a field of view of the system is wide and in response to determining that the background noise level is increased, the readout control circuit decreases a bandwidth of the filter. [0414] Example 54. The system of any of Examples 52 and 53, wherein the readout control circuit further reduces the detection threshold level.
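Before turning to Group 2, a brief sketch of the comparator-plus-TDC event detector recited in Examples 5 and 45 above may help: the comparator reduces the filtered waveform to a threshold-crossing time, and the TDC digitizes the delay from emission to that crossing. The model below is an illustrative simplification (single shot, first crossing only, ideal sampling), with hypothetical names:

```python
import numpy as np

C_M_PER_S = 299_792_458.0  # speed of light

def detect_event_tdc(filtered, threshold, sample_period_s):
    """Toy comparator-plus-TDC: return (delay_s, range_m) for the first
    sample of the filtered signal at or above the detection threshold,
    or None when nothing crosses (no event this shot). Time zero is the
    emission time of the probe signal.
    """
    crossings = np.flatnonzero(np.asarray(filtered) >= threshold)
    if crossings.size == 0:
        return None
    delay_s = float(crossings[0]) * sample_period_s  # quantized TDC output
    range_m = 0.5 * C_M_PER_S * delay_s              # halve the round trip
    return delay_s, range_m
```

For a 1 GHz sample clock, sample_period_s would be 1e-9, so each TDC bin corresponds to roughly 15 cm of range.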
Group 2 [0415] Example 1. A system, comprising: an emitter that emits one or more of a first optical probe signal directed into an environment; a photodetection circuit that receives light from the environment and generates an emission control signal based at least in part on one or both of a first portion and a second portion of the received light, wherein the first portion of the received light comprises one or more first reflected optical signals resulting from the one or more of the first optical probe signal being reflected at a first point of reflection in the environment, and the second portion of the received light comprises light received after the first portion and before emission of a second probe signal by the emitter; and a control circuit that controls a characteristic of the second optical probe signal emitted by the emitter after the emission of the one or more of the first optical probe signal, based at least in part on the emission control signal, to maintain or increase a probability of detecting at least a second reflected optical signal resulting from the second optical probe signal being reflected by a second point of reflection in the environment. [0416] Example 2. The system of Example 1, wherein the first and second points of reflection are on a single object. [0417] Example 3. The system of Example 1, wherein the one or more of the first optical probe signal and the second optical probe signal are emitted along two different directions. [0418] Example 4. The system of Example 1, wherein the control circuit controls a characteristic of at least a second optical probe signal to maintain the probability of detecting the second reflected optical signal within a target detection probability range. [0419] Example 5. The system of Example 1, wherein the control circuit controls a characteristic of the second optical probe signal to maintain or improve the probability of detecting the second reflected optical signal by maintaining or reducing a false alarm rate (FAR) of the system. [0420] Example 6. The system of Example 1, wherein the control circuit controls a characteristic of the second optical probe signal to maintain or improve the probability of detecting the second reflected optical signal by maintaining a false alarm rate of the system within a target FAR probability range. [0421] Example 7. The system of Example 4, wherein the target detection probability range is a predefined range stored in a memory of the system. [0422] Example 8. The system of Example 4, wherein the system further determines the target detection probability range. [0423] Example 9. The system of Example 6, wherein the target FAR probability range is a predefined range stored in a memory of the system. [0424] Example 10. The system of Example 6, wherein the system further determines the target FAR probability range. [0425] Example 11. The system of Example 1, wherein the emission control signal indicates a level of background light received by the detection system, and wherein the background light is not associated with the one or more of the first optical probe signal. [0426] Example 12. The system of Example 1, wherein the emission control signal indicates an optical intensity of the one or more reflected optical signals. [0427] Example 13. The system of Example 1, wherein the emission control signal indicates a level of noise generated by the detection system. [0428] Example 14. The system of Example 1, wherein the photodetection circuit generates the emission control signal based on a portion of the light received during a time interval between two subsequent first reflected optical signals of the one or more first reflected optical signals. [0429] Example 15. The system of Example 1, wherein the photodetection circuit generates the emission control signal based on at least one of the one or more first reflected optical signals. [0430] Example 16. The system of Example 1, wherein the control circuit controls a characteristic of the second optical probe signal based on reference data stored in a non-transitory memory of the system, wherein the reference data comprises at least one mapping table. [0431] Example 17. The system of Example 16, wherein the mapping table indicates associations between characteristics of the second optical probe signal and at least one of: a background signal generated by the detection system, where the background signal indicates a level of background light; and an amplitude of a sensor signal associated with an intensity of at least one of the one or more first reflected optical signals. [0432] Example 18.
The system of Example 1, wherein the second optical probe signal comprises a second sequence of optical pulses and the characteristic of the second optical probe signal comprises at least one of: a second number of optical pulses in the second sequence of optical pulses, a delay between two consecutive optical pulses in the second sequence of optical pulses, or an intensity of an optical pulse in the second sequence of optical pulses. [0433] Example 19. The system of Example 18, wherein the first optical probe signal comprises a first sequence of optical pulses having a first number of pulses. [0434] Example 20. The system of Example 19, wherein the control circuit controls the characteristic of the second optical probe signal by changing the delay between two consecutive optical pulses in the second sequence of optical pulses compared to a delay between two respective consecutive pulses in the first sequence of optical pulses. [0435] Example 21. The system of Example 19, wherein the control circuit controls the characteristic of the second optical probe signal by changing the intensity of at least one optical pulse in the second sequence of optical pulses compared to the intensity of a respective optical pulse in the first sequence of optical pulses. [0436] Example 22. The system of Example 19, wherein the control circuit controls the characteristic of the second optical probe signal by changing the second number of optical pulses to a number different from the first number. [0437] Example 23. The system of Example 21, wherein the control circuit increases the intensity of the at least one optical pulse in the second sequence of optical pulses, when the emission control signal indicates that an optical intensity of the one or more first reflected optical signals is smaller than an optical intensity of one or more reflected optical signals received before the one or more first reflected optical signals. [0438] Example 24. The system of Example 22, wherein the second number is smaller than the first number, when the emission control signal indicates that an optical intensity of the one or more first reflected optical signals is smaller than an optical intensity of one or more reflected optical signals received before the one or more first reflected optical signals. [0439] Example 25. The system of Example 21, wherein the control circuit increases the intensity of the at least one optical pulse in the second sequence of optical pulses, when the emission control signal indicates a level of background light received by the photodetection circuit has increased. [0440] Example 26. The system of Example 22, wherein the second number is smaller than the first number when the emission control signal indicates a level of background light received by the photodetection circuit has increased. [0441] Example 27. The system of Example 1, wherein the photodetection circuit comprises: a sensor that receives the one or more first reflected optical signals and generates one or more sensor signals in response to receiving the one or more first reflected optical signals; and a readout system that generates a background signal based on the received light, wherein the background signal indicates a magnitude of light generated by a light source different from the emitter. [0442] Example 28. The system of Example 27, wherein the photodetection circuit generates the emission control signal based at least in part on an amplitude of the one or more sensor signals. [0443] Example 29.
The system of Example 27, wherein the photodetection circuit generates the emission control signal based at least in part on an average amplitude of the one or more sensor signals. [0444] Example 30. The system of Example 27, wherein the photodetection circuit generates the emission control signal based at least in part on the background signal. [0445] Example 31. The system of Example 27, wherein the photodetection circuit generates the emission control signal based at least in part on a noise level in the readout system. [0446] Example 32. The system of Example 27, wherein the photodetection circuit generates the emission control signal based at least in part on a signal-to-noise ratio of at least one of the one or more sensor signals. [0447] Example 33. The system of Example 27, wherein the light source comprises another light-emitting system or sunlight. [0448] Example 34. The system of Example 27, wherein the photodetection circuit controls a detection threshold level based at least in part on the background signal. [0449] Example 35. The system of Example 27, wherein the photodetection circuit generates the emission control signal based at least in part on an acceptable range for an amplitude of one or more sensor signals. [0450] Example 36. The system of Example 27, wherein the photodetection circuit generates the emission control signal based at least in part on an acceptable range for the background signal. [0451] Example 37. The system of any of Examples 35 and 36, wherein the acceptable range is a predefined range stored in a memory of the system. [0452] Example 38. The system of any of Examples 35 and 36, wherein the system further determines the acceptable range.
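The per-shot adaptation that Group 2 recites (fewer pulses and/or higher pulse intensity when background light rises or returns weaken, per Examples 21-26) can be summarized in a few lines. The policy below is only one hedged reading of those examples; the thresholds, step sizes, and names are invented for illustration:

```python
def next_probe_parameters(background_rose, reflection_weakened,
                          prev_pulse_count, prev_intensity,
                          max_intensity=1.0):
    """Illustrative emission-control policy: emit fewer but stronger
    pulses in the next probe signal when background light has risen or
    the latest returns have weakened.
    """
    pulse_count, intensity = prev_pulse_count, prev_intensity
    if background_rose or reflection_weakened:
        pulse_count = max(1, prev_pulse_count - 1)            # cf. Examples 24, 26
        intensity = min(max_intensity, prev_intensity * 1.5)  # cf. Examples 23, 25
    return pulse_count, intensity
```

The mapping table of Examples 16-17 would replace this arithmetic with a lookup keyed on the background signal and the sensor-signal amplitude.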
Group 3 [0453] Example 1. A method implemented by at least one processor of a range finding system, the method comprising: emitting, by the at least one processor, one or more of a first optical probe signal directed into an environment using a light emitter; generating, by the at least one processor, an emission control signal based at least in part on one or both of a first portion and a second portion of light received by a photodetection circuit from the environment, wherein the first portion of the received light comprises one or more first reflected optical signals resulting from the one or more of the first optical probe signal being reflected at a first point of reflection in the environment, and the second portion of the received light comprises light received after the first portion and before emission of a second probe signal by the emitter; and controlling, by the at least one processor, a characteristic of the second optical probe signal emitted by the emitter after emitting the one or more of the first optical probe signal, based at least in part on the emission control signal, to maintain or increase a probability of detecting at least a second reflected optical signal resulting from the second optical probe signal being reflected by a second point of reflection in the environment. [0454] Example 2. The method of Example 1, wherein the first and second points of reflection are on a single object. [0455] Example 3. The method of Example 1, wherein the one or more of the first optical probe signal and the second optical probe signal are emitted along two different directions. [0456] Example 4. The method of Example 1, wherein controlling the characteristic of the at least the second optical probe signal comprises maintaining the probability of detecting the at least the second reflected optical signal within a target detection probability range. [0457] Example 5. The method of Example 4, wherein maintaining the probability of detecting the at least the second reflected optical signal within a target detection probability range comprises maintaining a false alarm rate of the system within a target FAR probability range. [0458] Example 6. The method of Example 4, wherein the detection probability range is stored in a memory of the system. [0459] Example 7. The method of Example 4, further comprising, by the at least one processor, determining the detection probability range. [0460] Example 8. The method of Example 5, wherein the FAR probability range is stored in a memory of the system. [0461] Example 9. The method of Example 5, further comprising, by the at least one processor, determining the FAR probability range. [0462] Example 10. The method of Example 1, wherein the emission control signal indicates a level of background light received by the detection system, wherein the background light is not associated with the one or more of the first optical probe signal. [0463] Example 11. The method of Example 1, wherein the emission control signal indicates an optical intensity of the one or more of the first reflected optical signals. [0464] Example 12. The method of Example 1, wherein the emission control signal indicates a level of noise generated by the detection system. [0465] Example 13. The method of Example 1, wherein generating the emission control signal comprises generating the emission control signal based on a portion of the light received during a time interval between two subsequent first reflected optical signals of the one or more first reflected optical signals. [0466] Example 14. The method of Example 1, wherein generating the emission control signal comprises generating the emission control signal based on at least one of the one or more first reflected optical signals. [0467] Example 15. The method of Example 1, wherein controlling the characteristic of the second optical probe signal comprises controlling the characteristic of the second optical probe signal based on reference data stored in a non-transitory memory of the system, wherein the reference data comprises at least one mapping table. [0468] Example 16. The method of Example 15, wherein the mapping table indicates associations between characteristics of the second optical probe signal and at least one of: a background signal generated by the detection system, where the background signal indicates a level of background light; and an amplitude of a sensor signal associated with intensity of at least one of the one or more first reflected optical signals. [0469] Example 17. The method of Example 1, wherein the second optical probe signal comprises a second sequence of optical pulses and the characteristic of the second optical probe signal comprises at least one of: a second number of optical pulses in the second sequence of optical pulses, a delay between two consecutive optical pulses in the second sequence of optical pulses, or an intensity of an optical pulse in the second sequence of optical pulses. [0470] Example 18. The method of Example 17, wherein the first optical probe signal comprises a first sequence of optical pulses having a first number of pulses. [0471] Example 19.
The method of Example 18, wherein controlling the characteristic of the second optical probe signal comprises changing the delay between two consecutive optical pulses in the second sequence of optical pulses compared to a delay between two respective consecutive pulses in the first sequence of optical pulses. [0472] Example 20. The method of Example 18, wherein controlling the characteristic of the second optical probe signal comprises changing the intensity of at least one optical pulse in the second sequence of optical pulses compared to the intensity of a respective optical pulse in the first sequence of optical pulses. [0473] Example 21. The method of Example 18, wherein controlling the characteristic of the second optical probe signal comprises changing the second number of optical pulses to a number different from the first number. [0474] Example 22. The method of Example 21, wherein changing the intensity of at least one optical pulse in the second sequence of optical pulses comprises increasing the intensity of at least one optical pulse in the second sequence of optical pulses compared to the intensity of a respective optical pulse in the first sequence of optical pulses, when the emission control signal indicates that an optical intensity of the one or more of the first reflected optical signals is smaller than an optical intensity of one or more reflected optical signals received before the one or more of the first reflected optical signals. [0475] Example 23. The method of Example 22, wherein the second number is smaller than the first number, when the emission control signal indicates that an optical intensity of the one or more of the first reflected optical signals is smaller than an optical intensity of one or more reflected optical signals received before the one or more of the first reflected optical signals. [0476] Example 24. The method of Example 21, wherein changing the intensity of at least one optical pulse in the second sequence of optical pulses comprises increasing the intensity of at least one optical pulse in the second sequence of optical pulses compared to the intensity of a respective optical pulse in the first sequence of optical pulses, when the emission control signal indicates a level of background light received by the photodetection circuit has increased. [0477] Example 25. The method of Example 22, wherein the second number is smaller than the first number when the emission control signal indicates a level of background light received by the photodetection circuit has increased. [0478] Example 26. The method of Example 1, further comprising, by the at least one processor: in response to receiving the one or more of the first reflected optical signals, generating one or more sensor signals based on the one or more of the first reflected optical signals; and generating a background signal based on the received light, wherein the background signal indicates a magnitude of light generated by a light source different from the emitter. [0479] Example 27. The method of Example 26, wherein generating the emission control signal comprises generating the emission control signal based at least in part on an amplitude of the one or more sensor signals. [0480] Example 28. The method of Example 27, wherein generating the emission control signal comprises generating the emission control signal based at least in part on an average amplitude of the one or more sensor signals. [0481] Example 29.
The method of Example 26, wherein generating the emission control signal comprises generating the emission control signal based at least in part on the background signal. [0482] Example 30. The method of Example 26, wherein generating the emission control signal comprises generating the emission control signal based at least in part on a noise level in a readout system. [0483] Example 31. The method of Example 26, wherein generating the emission control signal comprises generating the emission control signal based at least in part on a signal-to-noise ratio of at least one of the one or more sensor signals. [0484] Example 32. The method of any of Examples 26-31, wherein the light source comprises another light-emitting system or sunlight. [0485] Example 33. The method of Example 26, further comprising controlling a detection threshold level based at least in part on the background signal. [0486] Example 34. The method of Example 26, wherein generating the emission control signal comprises comparing a measured amplitude of one or more sensor signals with an acceptable range for the amplitude of the one or more sensor signals. [0487] Example 35. The method of Example 26, wherein generating the emission control signal comprises determining whether the background signal is within an acceptable range for the background signal. [0488] Example 36. The method of any of Examples 34 and 35, wherein the acceptable range is stored in a memory of the system. [0489] Example 37. The method of any of Examples 34 and 35, further comprising, by the at least one processor, determining the acceptable range. Group 4 [0490] Example 1. A system comprising: an emitter that emits a first optical probe signal directed into an environment; a photodetection circuit that receives light from the environment and generates an emission control signal based at least in part on one or both of a first portion and a second portion of the received light, wherein the first portion of the received light comprises a first reflected optical signal resulting from the first optical probe signal being reflected from within the environment, and the second portion of the received light comprises light received after the first portion and before emission of a second optical probe signal by the emitter; and a control circuit that controls a characteristic of the second optical probe signal based at least in part on the emission control signal to maintain or increase a probability of detecting a second reflected optical signal resulting from the second optical probe signal being reflected from within the environment, compared to a probability of detecting the first reflected optical signal. [0491] Example 2. The system of Example 1, wherein the control circuit controls the characteristic of the second optical probe signal to maintain the probability of detecting the second reflected optical signal within a predefined range. [0492] Example 3. The system of any of Examples 1-2, wherein the emission control signal indicates a level of background light received by the photodetection circuit, and wherein the background light is not associated with the first optical probe signal. [0493] Example 4.
The system of any of Examples 1-3, wherein the control circuit further controls a characteristic of the second optical probe signal based on reference data stored in a non-transitory memory of the system, and wherein the reference data comprises at least one mapping table that indicates associations between characteristics of the second optical probe signal and at least one of: a background signal generated by the photodetection circuit, where the background signal indicates a level of background light; or an amplitude of a sensor signal associated with an intensity of the first reflected optical signal. [0494] Example 5. The system of any of Examples 1-4, wherein the second optical probe signal comprises a second sequence of optical pulses and the characteristic of the second optical probe signal comprises at least one of: a second number of optical pulses in the second sequence of optical pulses, a delay between two consecutive optical pulses in the second sequence of optical pulses, or an intensity of an optical pulse in the second sequence of optical pulses. [0495] Example 6. The system of Example 5, wherein the first optical probe signal comprises a first sequence of optical pulses having a first number of pulses and when the emission control signal indicates that a level of background light received by the photodetection circuit has increased, the control circuit controls the characteristic of the second optical probe signal by changing the second number of optical pulses such that the second number is smaller than the first number. [0496] Example 7. The system of Example 5, wherein the control circuit increases the intensity of at least one optical pulse in the second sequence of optical pulses when the emission control signal indicates a level of background light received by the photodetection circuit has increased. [0497] Example 8. The system of any of Examples 1-7, wherein the photodetection circuit comprises: a sensor that receives the first reflected optical signal and generates a sensor signal in response to receiving the first reflected optical signal; and a readout processing circuit that generates a background signal based on the received light, wherein the background signal indicates a magnitude of light generated by a light source different from the emitter; wherein the photodetection circuit generates the emission control signal based at least in part on one or both of an average amplitude of the sensor signal or an average amplitude of the background signal. [0498] Example 9.
A method comprising: emitting, by at least one processor of a system, a first optical probe signal directed into an environment using an emitter of the system; generating, by the at least one processor, an emission control signal based at least in part on one or both of a first portion and a second portion of light received by a photodetection circuit from the environment, wherein the first portion of the received light comprises a first reflected optical signal resulting from the first optical probe signal being reflected within the environment, and the second portion of the received light comprises light received after the first portion and before emission of a second optical probe signal by the emitter; determining, by the at least one processor, a characteristic of the second optical probe signal, based at least in part on the emission control signal, to maintain or increase a probability of detecting a second reflected optical signal resulting from the second optical probe signal being reflected within the environment compared to a probability of detecting the first reflected optical signal; and emitting the second optical probe signal having the determined characteristics. [0499] Example 10. The method of Example 9, wherein determining the characteristic of the second optical probe signal comprises determining the characteristic of the second optical probe signal to maintain a probability of detecting the second reflected optical signal within a predefined range. [0500] Example 11. The method of any of Examples 9-10, wherein the emission control signal indicates a level of background light received by the photodetection circuit, wherein the background light is not associated with the first optical probe signal. [0501] Example 12. The method of any of Examples 9-11, wherein determining the characteristic of the second optical probe signal comprises determining the characteristic of the second optical probe signal based on reference data stored in a non-transitory memory of the system, wherein the reference data comprises at least one mapping table indicating associations between characteristics of the second optical probe signal and at least one of: a background signal generated by the photodetection circuit, wherein the background signal is generated based on a portion of received light that is not associated with the first reflected optical signal; or an amplitude of a sensor signal associated with an intensity of the first reflected optical signal. [0502] Example 13. The method of any of Examples 9-12, wherein the second optical probe signal comprises a second sequence of optical pulses and the characteristic of the second optical probe signal comprises at least one of: a second number of optical pulses in the second sequence of optical pulses, a delay between two consecutive optical pulses in the second sequence of optical pulses, or an intensity of an optical pulse in the second sequence of optical pulses. [0503] Example 14. The method of Example 13, wherein the first optical probe signal comprises a first sequence of optical pulses having a first number of pulses and determining the characteristic of the second optical probe signal comprises changing the second number of optical pulses to a number different from the first number of optical pulses, when the emission control signal indicates a level of background light received by the photodetection circuit has increased. [0504] Example 15.
The method of Example 13, wherein the first optical probe signal comprises a first sequence of optical pulses, and determining the characteristic of the second optical probe signal comprises changing the intensity of at least one optical pulse in the second sequence of optical pulses compared to the intensity of a respective optical pulse in the first sequence of optical pulses. [0505] Example 16. At least one non-transitory storage media storing machine-executable instructions that, when executed by at least one processor, cause the at least one processor to: emit a first optical probe signal directed into an environment using an emitter; generate an emission control signal based at least in part on one or both of a first portion and a second portion of light received by a photodetection circuit from the environment, wherein the first portion of the received light comprises a first reflected optical signal resulting from the first optical probe signal being reflected within the environment and the second portion of the received light comprises light received after the first portion and before emission of a second optical probe signal by the emitter; determine a characteristic of at least the second optical probe signal based at least in part on the emission control signal to maintain or increase a probability of detecting a second reflected optical signal resulting from the second optical probe signal being reflected within the environment, compared to a probability of detecting the first reflected optical signal; and emit the second optical probe signal having the determined characteristics, using the emitter. [0506] Example 17. The at least one non-transitory storage media of Example 16, wherein the first optical probe signal and the second optical probe signal are emitted along two different directions. [0507] Example 18. The at least one non-transitory storage media of any of Examples 16-17, wherein the emission control signal indicates a level of background light received by the photodetection circuit, and wherein the background light is not associated with the first optical probe signal. [0508] Example 19. The at least one non-transitory storage media of any of Examples 16-18, wherein the emission control signal indicates an optical intensity of the first reflected optical signal. [0509] Example 20. The at least one non-transitory storage media of any of Examples 16-19, wherein the machine-executable instructions cause the at least one processor to control the characteristic of the second optical probe signal based on reference data stored in the at least one non-transitory storage media, wherein the reference data comprises at least one mapping table that indicates associations between characteristics of the second optical probe signal and at least one of: a background signal generated by the photodetection circuit, where the background signal indicates a level of background light; and an amplitude of a sensor signal associated with an intensity of the first reflected optical signal.
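As background for the detection-probability and false-alarm-rate language used throughout the example groups above, the standard threshold-detector relations from detection theory (textbook results, not taken from the specification) are worth recording. For a return pulse of amplitude $A$ in additive Gaussian noise of standard deviation $\sigma$, a detection threshold $\gamma$ gives, per sample,

$$P_{FA} = Q\!\left(\frac{\gamma}{\sigma}\right), \qquad P_D = Q\!\left(\frac{\gamma - A}{\sigma}\right), \qquad Q(x) = \frac{1}{\sqrt{2\pi}}\int_x^{\infty} e^{-t^2/2}\,dt.$$

Since in-band noise power, and hence $\sigma$, grows with filter bandwidth, narrowing the filter when background light rises lets the threshold $\gamma$ hold steady, or even drop as in Examples 14, 27, and 40 of Group 1, without raising $P_{FA}$; this is the trade that the readout control circuit exploits.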
Terminology [0510] In this description numerous specific details are set forth in order to provide a thorough understanding of the present disclosure for the purposes of explanation. It will be apparent, however, that the embodiments described by the present disclosure can be practiced without these specific details. In some instances, well-known structures and devices are illustrated in block diagram form in order to avoid unnecessarily obscuring aspects of the present disclosure. [0511] Specific arrangements or orderings of schematic elements, such as those representing systems, devices, modules, instruction blocks, data elements, and/or the like are illustrated in the drawings for ease of description. However, it will be understood by those skilled in the art that the specific ordering or arrangement of the schematic elements in the drawings is not meant to imply that a particular order or sequence of processing, or separation of processes, is required unless explicitly described as such. Further, the inclusion of a schematic element in a drawing is not meant to imply that such element is required in all embodiments or that the features represented by such element cannot be included in or combined with other elements in some embodiments unless explicitly described as such. [0512] Although the terms first, second, third, and/or the like are used to describe various elements, these elements should not be limited by these terms. The terms first, second, third, and/or the like are used only to distinguish one element from another. For example, a first contact could be termed a second contact and, similarly, a second contact could be termed a first contact without departing from the scope of the described embodiments. The first contact and the second contact are both contacts, but they are not the same contact. [0513] The terminology used in the description of the various described embodiments herein is included for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an” and “the” are intended to include the plural forms as well and can be used interchangeably with “one or more” or “at least one,” unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this description specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. [0514] As used herein, the term “if” is, optionally, construed to mean “when”, “upon”, “in response to determining,” “in response to detecting,” and/or the like, depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining,” “in response to determining,” “upon detecting [the stated condition or event],” “in response to detecting [the stated condition or event],” and/or the like, depending on the context. Also, as used herein, the terms “has”, “have”, “having”, or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based at least partially on” unless explicitly stated otherwise.