

Title:
APPARATUS, SYSTEMS AND METHODS FOR DETECTING LIGHT
Document Type and Number:
WIPO Patent Application WO/2021/123956
Kind Code:
A1
Abstract:
An apparatus comprising: a first optical path for providing light from a first field of view; a second optical path for providing light from a second field of view; modulation means for applying different effective spatial modulation to the light from the first field of view and the light from the second field of view to provide spatially modulated light from the first field of view and spatially modulated light from the second field of view; and a detector configured to simultaneously detect the spatially modulated light from the first field of view and the spatially modulated light from the second field of view.

Inventors:
YUAN XIN (US)
COSS MICHAEL (US)
QIAO MU (US)
LIU XUAN (US)
Application Number:
PCT/IB2020/060818
Publication Date:
June 24, 2021
Filing Date:
November 17, 2020
Assignee:
NOKIA TECHNOLOGIES OY (FI)
International Classes:
G02B26/08; H04N5/225; G03B37/00
Domestic Patent References:
WO2016191367A1 (2016-12-01)
WO2008108840A1 (2008-09-12)
Foreign References:
US10382700B1 (2019-08-13)
US20140231650A1 (2014-08-21)
US10198790B1 (2019-02-05)
Claims:
I/we claim:

1. An apparatus comprising: a first optical path for providing light from a first field of view; a second optical path for providing light from a second field of view; modulation means for applying different effective spatial modulation to the light from the first field of view and the light from the second field of view to provide spatially modulated light from the first field of view and spatially modulated light from the second field of view; and a detector configured to simultaneously detect the spatially modulated light from the first field of view and the spatially modulated light from the second field of view.

2. An apparatus as claimed in claim 1, wherein the different effective spatial modulation is configured to cause modulation by different spatial patterns at the detector of the light from the first field of view and the light from the second field of view.

3. An apparatus as claimed in claim 1 or 2, wherein the modulation means comprises a common modulation element configured to apply the same spatial modulation to the light from the first field of view and to the light from the second field of view.

4. An apparatus as claimed in claim 3, wherein the common modulation element is configured to apply the same spatially varying amplitude modulation to the light from the first field of view and to the light from the second field of view, and the modulation means is configured to provide a spatial shift at the detector between the amplitude modulated light from the first field of view and the amplitude modulated light from the second field of view.

5. An apparatus as claimed in claim 4, wherein the spatial shift is a lateral shift parallel to at least one of a row or a column of the detector.

6. An apparatus as claimed in any preceding claim, wherein the detector is a low speed camera.

7. An apparatus as claimed in any preceding claim, wherein the apparatus is configured to use separate optical channels for the light from the first field of view and for the light from the second field of view.

8. An apparatus as claimed in any preceding claim, wherein the modulation means comprises a common optical path for the light from the first field of view and the light from the second field of view.

9. An apparatus as claimed in any preceding claim, wherein the spatial modulation is randomized, pixelated modulation over a two-dimensional area.

10. An apparatus as claimed in any preceding claim, wherein the spatial modulation is pixelated modulation over a two-dimensional area, and the pixelated modulation comprises pixels arranged in rows and columns that are parallel to rows and columns of pixels of the detector when projected onto the detector.

11. An apparatus as claimed in any preceding claim, wherein the modulation means is dynamic modulation means for applying different time-varying effective spatial modulation to the light from the first field of view and the light from the second field of view to provide spatially modulated light from the first field of view and spatially modulated light from the second field of view.

12. An apparatus as claimed in claim 11, when dependent upon claim 3, wherein the common modulation element is configured to apply the same time-varying spatial modulation to the light from the first field of view and to the light from the second field of view.

13. An apparatus as claimed in claim 12, wherein the dynamic modulation means is configured to provide a spatial shift at the detector between the modulated light from the first field of view and the modulated light from the second field of view.

14. An apparatus as claimed in any preceding claim, wherein the apparatus comprises: a first polarization element in the first optical path for polarizing the light from the first field of view; a second polarization element, different to the first polarization element, in the second optical path for polarizing the light from the second field of view; means for combining the differently polarized light from the first field of view and the second field of view; wherein the modulation means comprises: a common dynamic modulation element for applying a time-varying spatial modulation to the combined, differently polarized, light from the first field of view and the second field of view to produce combined, differently polarized, spatially modulated, light from the first field of view and the second field of view; means for selectively splitting, based on polarization, the combined, differently polarized, spatially modulated, light from the first field of view and the second field of view to produce in the first optical path polarized, spatially modulated light from the first field of view and to produce in the second optical path polarized, spatially modulated light from the second field of view; and means for applying a different spatial shift to the polarized, spatially modulated light from the first field of view in the first optical path compared to the polarized, spatially modulated light from the second field of view in the second optical path; and wherein the detector is configured to simultaneously detect the differently shifted, differently polarized, spatially modulated light from the first field of view and the second field of view.

15. An apparatus as claimed in claim 14, wherein the common dynamic modulation element is a single dynamic digital mirror that is movable or a fixed pattern spatially coded aperture that is movable.

16. An apparatus as claimed in claim 14 or 15, wherein the means for applying a different spatial shift comprises an angled optic.

17. A method comprising using the apparatus of any preceding claim in the production of an image of the first field of view and an image of the second field of view.

18. A method as claimed in claim 17, wherein the image of the first field of view and the image of the second field of view are products obtained directly by means of the method.

19. A method comprising: providing light from a first field of view via a first optical path; providing light from a second field of view via a second optical path; applying different effective spatial modulation to the light from the first field of view and the light from the second field of view to provide spatially modulated light from the first field of view and spatially modulated light from the second field of view; detecting simultaneously the spatially modulated light from the first field of view and the spatially modulated light from the second field of view.

Description:
Apparatus, Systems and Methods for Detecting Light

TECHNOLOGICAL FIELD

Examples of the disclosure relate to apparatus, systems and methods for detecting light. In particular some examples relate to apparatus, systems and methods for detecting light that has been subjected to compressive sensing.

BACKGROUND

According to the theory of compressive sensing, traditional sampling is replaced by measurements of inner products with random vectors.

Light modulated by reflection from or transmission through an object, when detected directly by a two-dimensional pixelated detector, is an oversampled field that has a sparser representation in some domain. As a consequence, detecting coded fields (sparse incoherent fields rather than the whole field) can capture sufficient information to characterise the object.
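The replacement of traditional sampling by inner products with random vectors can be sketched numerically. The following is an illustrative example only, not part of the application; the sizes and variable names are hypothetical:

```python
import numpy as np

# Illustrative sketch: compressive sensing records m inner products
# of the scene x with random vectors (rows of A), with m << n,
# instead of sampling all n values of x directly.
rng = np.random.default_rng(0)

n = 64                            # number of scene samples (flattened)
m = 16                            # number of compressive measurements
x = rng.random(n)                 # the (unknown) scene
A = rng.integers(0, 2, (m, n))    # random binary sensing vectors
y = A @ x                         # the recorded measurements

# Far fewer values are recorded than there are scene samples.
assert y.shape == (m,) and m < n
```

Recovery of x from y then exploits the sparser representation of the field referred to above.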

BRIEF SUMMARY

According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: a first optical path for providing light from a first field of view; a second optical path for providing light from a second field of view; modulation means for applying different effective spatial modulation to the light from the first field of view and the light from the second field of view to provide spatially modulated light from the first field of view and spatially modulated light from the second field of view; and a detector configured to simultaneously detect the spatially modulated light from the first field of view and the spatially modulated light from the second field of view.

In some but not necessarily all examples, the different effective spatial modulation causes modulation by different spatial patterns at the detector of the light from the first field of view and the light from the second field of view.

In some but not necessarily all examples, the modulation means comprises a common modulation element that applies the same spatial modulation to the light from the first field of view and to the light from the second field of view.

In some but not necessarily all examples, the common modulation element applies the same spatially varying amplitude modulation to the light from the first field of view and to the light from the second field of view, and the modulation means provides a spatial shift at the detector between the amplitude modulated light from the first field of view and the amplitude modulated light from the second field of view.

In some but not necessarily all examples, the spatial shift is a lateral shift parallel to a row or a column of the detector.

In some but not necessarily all examples, the detector is a low speed camera.

In some but not necessarily all examples, the apparatus is configured to use separate optical channels for the light from the first field of view and for the light from the second field of view.

In some but not necessarily all examples, the modulation means comprises a common optical path for the light from the first field of view and the light from the second field of view.

In some but not necessarily all examples, the spatial modulation is randomized, pixelated modulation over a two-dimensional area.

In some but not necessarily all examples, the pixelated modulation comprises pixels arranged in rows and columns that are parallel to rows and columns of pixels of the detector when projected onto the detector.

In some but not necessarily all examples, the modulation means is dynamic modulation means for applying different time-varying effective spatial modulation to the light from the first field of view and the light from the second field of view to provide spatially modulated light from the first field of view and spatially modulated light from the second field of view.

In some but not necessarily all examples, the dynamic modulation means comprises a common modulation element that applies the same time-varying spatial modulation to the light from the first field of view and to the light from the second field of view.

In some but not necessarily all examples, the dynamic modulation means provides a spatial shift at the detector between the modulated light from the first field of view and the modulated light from the second field of view.

In some but not necessarily all examples, the apparatus comprises: a first polarization element in the first optical path for polarizing the light from the first field of view; a second polarization element, different to the first polarization element, in the second optical path for polarizing the light from the second field of view; means for combining the differently polarized light from the first field of view and the second field of view; wherein the modulation means comprises: a common dynamic modulation element for applying a time-varying spatial modulation to the combined, differently polarized, light from the first field of view and the second field of view to produce combined, differently polarized, spatially modulated, light from the first field of view and the second field of view; means for selectively splitting, based on polarization, the combined, differently polarized, spatially modulated, light from the first field of view and the second field of view to produce in the first optical path polarized, spatially modulated light from the first field of view and to produce in the second optical path polarized, spatially modulated light from the second field of view; and means for applying a different spatial shift to the polarized, spatially modulated light from the first field of view in the first optical path compared to the polarized, spatially modulated light from the second field of view in the second optical path; and wherein the detector is configured to simultaneously detect the differently shifted, differently polarized, spatially modulated light from the first field of view and the second field of view.

In some but not necessarily all examples, the common dynamic modulation element is a single dynamic digital mirror or a fixed pattern spatially coded aperture that is moved.

In some but not necessarily all examples, the means for applying a different spatial shift comprises an angled optic.

In some but not necessarily all examples, a method comprises using the apparatus in the production of an image of the first field of view and an image of the second field of view.

In some but not necessarily all examples, the image of the first field of view and the image of the second field of view are products obtained directly by means of the method.

According to various, but not necessarily all, embodiments of the invention there is provided a method comprising: providing light from a first field of view via a first optical path; providing light from a second field of view via a second optical path; applying different effective spatial modulation to the light from the first field of view and the light from the second field of view to provide spatially modulated light from the first field of view and spatially modulated light from the second field of view; detecting simultaneously the spatially modulated light from the first field of view and the spatially modulated light from the second field of view.

According to various, but not necessarily all, embodiments of the invention there is provided examples as claimed in the appended claims.

BRIEF DESCRIPTION

For a better understanding of various examples that are useful for understanding the detailed description, reference will now be made by way of example only to the accompanying drawings in which:

Fig. 1 illustrates an example apparatus for compressive sensing;

Fig. 2 illustrates an example of the apparatus illustrated in Fig. 1;

Fig. 3 shows the compressive sensing principle of examples of the disclosure; and

Fig. 4 illustrates an example of the apparatus illustrated in Fig. 2.

DETAILED DESCRIPTION

Examples of the disclosure relate to an apparatus 1 for compressive sensing comprising: a first optical path 14 for providing light 9 from a first field of view 16; a second optical path 24 for providing light 9 from a second field of view 26; modulation means 3 for applying different effective spatial modulation to the light 9 from the first field of view 16 and the light 9 from the second field of view 26 to provide spatially modulated light 18 from the first field of view 16 and spatially modulated light 28 from the second field of view 26; and a detector 7 configured to simultaneously detect the spatially modulated light 18 from the first field of view 16 and the spatially modulated light 28 from the second field of view 26.

In at least some examples, modulation means 3 applies the same spatial modulation to the light 9 from the first field of view 16 and the light 9 from the second field of view 26 followed by a differential spatial offset between the spatially modulated light 9 from the first field of view 16 and the spatially modulated light 9 from the second field of view 26. This applies different effective spatial modulation to the light 9 from the first field of view 16 and the light 9 from the second field of view 26 to provide spatially modulated light 18 from the first field of view 16 and spatially modulated light 28 from the second field of view 26 that have been differently coded at detection. The detector 7 simultaneously detects a first image of the first field of view 16 and a second image of the second field of view 26 that have been differently coded. The different coding allows the first image and the second image to be recovered from the output of the detector 7.

In some but not necessarily all examples, multiple images for each field of view can be detected in a single exposure time of the detector. Each of the multiple images, from the multiple fields of view, can be differently coded allowing them to be recovered from the output of the detector 7. The different coding can, for example, be achieved using a variable code that is common to both fields of view, and a spatial offset that is different between the fields of view and is fixed. The variation in the code can be a variation over time or a variation over frequency. The different coding can, for example, be achieved using a fixed code that is common to both fields of view, and a spatial offset that is different between the fields of view and is variable. The variation in the spatial offset can be a variation over time or a variation over frequency.
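The single-exposure coding described above can be sketched as follows. This is an illustrative model only, not taken from the application; a common code modulates both scenes, a fixed lateral offset differentiates them, and the detector integrates their sum in one exposure:

```python
import numpy as np

# Hypothetical sketch: two fields of view share one random code;
# a lateral shift gives each a different effective code, and a
# single detector exposure records the coded sum.
rng = np.random.default_rng(1)

h, w = 8, 8
scene1 = rng.random((h, w))       # image from first field of view
scene2 = rng.random((h, w))       # image from second field of view

mask = rng.integers(0, 2, (h, w)).astype(float)  # common spatial code
code1 = mask                                     # effective code, first FOV
code2 = np.roll(mask, 1, axis=1)                 # shifted => second code

# One detector integration period captures both coded images at once.
measurement = code1 * scene1 + code2 * scene2
assert measurement.shape == (h, w)
```

Because code1 and code2 differ, processing can in principle separate the two images from the single measurement.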

Fig. 1 schematically illustrates an example apparatus 1. The example apparatus 1 comprises: a first optical path 14 configured to provide light 9 from the first field of view 16; a second optical path 24 configured to provide light 9 from the second field of view 26; modulation means 3 for applying different effective spatial modulation to the light 9 from the first field of view 16 and the light 9 from the second field of view 26 to provide spatially modulated light 18 from the first field of view 16 and spatially modulated light 28 from the second field of view 26; and a detector 7 configured to simultaneously detect the spatially modulated light 18 from the first field of view 16 and the spatially modulated light 28 from the second field of view 26.

The light 9 from the first field of view 16 can, for example, arrive from any suitable source. The light 9 from the first field of view 16 can, for example, comprise light that has been reflected from or that has passed through a first scene or a first object. The light 9 from the first field of view 16 encodes a first image that is detectable by the detector 7.

The light 9 from the second field of view 26 can, for example, arrive from any suitable source. The light 9 from the second field of view 26 can, for example, comprise light that has been reflected from or that has passed through a second scene or a second object. The light 9 from the second field of view 26 encodes a second image that is detectable by the detector 7.

The different effective spatial modulation causes modulation, by different spatial patterns at the detector 7, of the light 9 from the first field of view 16 and the light 9 from the second field of view 26.

The different spatial patterns represent different codes that differently code the light 9 from the first field of view 16 and the light 9 from the second field of view 26. In this way, the light 9 from the first field of view 16 and the light 9 from the second field of view 26 are code-divided at the detector 7. This code-division allows the separation of the first image of first field of view 16 and the second image of the second field of view 26 from the detector output by processing, for example using processor 10.

In this example, but not necessarily all examples, the modulation means 3 comprises a common modulation element 32 that applies the same spatial modulation to the light 9 from the first field of view 16 and to the light 9 from the second field of view 26 to produce, respectively, spatially modulated light 11 from the first field of view 16 and spatially modulated light 11 from the second field of view 26. The modulation means 3 additionally comprises spatial shift means 34 that provides a spatial shift, when measured at the detector 7, between the spatially modulated light 11 from the first field of view 16 and the spatially modulated light 11 from the second field of view 26.

In the illustrated example, the common modulation element 32 applies the same spatially varying amplitude modulation to the light 9 from the first field of view 16 and to the light 9 from the second field of view 26 to produce, respectively amplitude modulated light 11 from the first field of view 16 and amplitude modulated light 11 from the second field of view 26.

Different combinations of the common spatial modulation and the differential spatial shift can provide different effective spatial modulation. Each different combination of the common spatial modulation and a differential spatial shift can provide a different code for an image thus code dividing the images.
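The code-dividing effect of a common code plus a differential shift can be illustrated numerically. This sketch is not from the application; it simply checks that laterally shifted copies of one random mask behave as distinct, weakly correlated codes:

```python
import numpy as np

# Illustrative only: a shifted copy of a random mask is almost
# uncorrelated with the original, so the two act as different codes.
rng = np.random.default_rng(2)

mask = rng.integers(0, 2, (32, 32)).astype(float)
shifted = np.roll(mask, 3, axis=1)   # differential spatial shift

def corr(a, b):
    # Normalized cross-correlation of two equal-size arrays.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

# Perfect self-correlation, weak correlation with the shifted copy.
assert abs(corr(mask, mask) - 1.0) < 1e-9
assert abs(corr(mask, shifted)) < 0.3
```

This weak cross-correlation is what allows the differently coded images to be separated by processing at the detector output.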

The modulation means 3 comprises a common optical path 36 for the light 9 from the first field of view 16 and the light 9 from the second field of view 26. The common modulation element 32 and the spatial shift means 34 are both in the common optical path 36.

MODULATION

The common modulation element 32 can be configured to selectively remove information from the light 9 so that only portions of the light 9 are detected.

The common modulation element 32 can, for example, comprise any means which may be arranged to spatially modulate the light 9. The spatial modulation occurs over a transverse cross-sectional area of the light 9. The modulation comprises amplitude modulation that varies in dependence upon a location within the transverse cross-sectional area of the incident light 9.

In some examples the common modulation element 32 comprises a spatially coded aperture. The spatially coded aperture provides for spatial modulation over a cross-sectional area of the light 9 that passes through (or is reflected from) the coded aperture. The coded aperture is coded to provide amplitude modulation that varies in dependence upon a location within the aperture. The coded aperture defines a fixed two-dimensional pattern of spatially varying transparency. The spatially coded aperture physically modulates the incident light 9 to a spatially compressed/sparse format. The spatially coded aperture may comprise a non-uniform optical mask or any other suitable type of aperture that provides amplitude modulation that varies in dependence upon a location within the aperture. In some examples the coded aperture is a fixed coded aperture for which the amplitude modulation is fixed in space and time.

The spatially coded aperture may be a two-dimensional spatially coded aperture or any other suitable type of aperture. The two-dimensional spatially coded aperture defines a two-dimensional plane. The light 9 may travel in a direction normal (orthogonal) to the two-dimensional plane.

In other examples the common modulation element 32 could comprise a liquid crystal on silicon (LCOS) modulator, a digital micromirror device (DMD) array or any other suitable type of spatially coded aperture 32.

The common modulation element 32 can comprise multiple different portions that have a particular transparency. In some examples the common modulation element 32 may comprise at least a first portion having a first level of transparency to the input light 9 and at least a second portion having a second, different level of transparency to the input light 9. In some examples the common modulation element 32 may comprise at least multiple spatially distributed non-overlapping first portions, that are distributed over an area in two dimensions and have a first level of transparency to the light 9 and at least multiple spatially distributed non-overlapping second portions that are distributed over the area in two dimensions and have a second, different level of transparency to the input light 9. In at least some examples, the spatially distributed first portions and the spatially distributed second portions do not overlap. The spatially distributed first portions and the spatially distributed second portions can be contiguous and, in some examples, the spatially distributed first portions and the spatially distributed second portions completely fill the area.

The different levels of transparency may allow different levels of light to pass through the common modulation element 32. In some examples the common modulation element 32 may be a binary common modulation element 32 so that only two different absorbencies are provided by the respective portions of the common modulation element 32. In other examples the common modulation element 32 may be a grey-scale modulator and may comprise more than two different levels of transparency in the different portions of the common modulation element 32.

The different portions of the common modulation element 32 may be arranged in any suitable pattern. In some examples the respective portions of the common modulation element 32 having different transparencies are pixelated and arranged in a pixelated pattern.
The pixelated arrangement may have the respective portions of the common modulation element 32 arranged in an array of columns and rows of pixels. In some examples, the pixels are square or rectangular.

The spatially coded aperture 32 can comprise multiple different portions that are coded with a particular transparency, for example, the coded aperture 32 can be pixelated and comprise multiple different portions (pixels) that are arranged as an array in rows and columns, where the pixels are coded with a particular transparency. The two-dimensional pattern of pixels (portions) that have a first transparency is different to the two-dimensional pattern of pixels (portions) that have a second transparency, different to the first transparency.

The transparency at each pixel defines a two-dimensional pattern of spatially varying transparency. In some examples, the transparency at each pixel in a row defines a one dimensional pattern of spatially varying transparency that does not repeat or does not repeat within a minimum number of columns. In some examples, the transparency at each pixel in a column defines a one-dimensional pattern of spatially varying transparency that does not repeat or does not repeat within a minimum number of rows. In some examples, the transparency at each pixel defines a two-dimensional pattern of spatially varying transparency that has a random or pseudorandom spatial distribution. In some examples, the pixels are coded as either opaque or transparent. In other examples, the pixels are coded using grey scale.

The spatial modulation can therefore be a randomized, pixelated modulation over a two-dimensional area.

The size p of the pixels (of the common modulation element 32) when projected onto the detector 7, can be directly proportional to a size d of pixels of the detector 7.

The rows and columns of the pixels (of the common modulation element 32), when projected onto the detector 7, can be parallel to rows and columns of pixels of the detector.

The number of transparent pixels, partially transparent pixels, and opaque pixels in a spatially coded aperture 32 can vary in different implementations of the disclosure. In some examples approximately half of the pixels of the modulation element 32 could be absorbent so that half of the incident area of the modulation element 32 acts to block the input light 9 while the other half allows the incident light to pass, or partially pass through in a spatially-coded format.
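A binary coded aperture of the kind described above can be sketched as a random 0/1 array. This is an illustrative construction only, not a mask from the application; the dimensions are hypothetical:

```python
import numpy as np

# Illustrative pixelated, randomized binary mask: 1 = transparent,
# 0 = opaque. A uniform random code leaves roughly half of the
# aperture area transmissive.
rng = np.random.default_rng(3)

rows, cols = 16, 16
mask = rng.integers(0, 2, (rows, cols))

transparent_fraction = mask.mean()
assert 0.3 < transparent_fraction < 0.7
```

A grey-scale modulator would instead draw values from more than two transparency levels, as noted above.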

In some examples the different portions (e.g. pixels) of the common modulation element 32 may be arranged in a random pattern (which encompasses pseudo random patterns) that is random in two dimensions. The random pattern may be an irregular pattern. The random pattern might not be defined or arranged in relation to any specific object. In other examples the respective portions (e.g. pixels) of the common modulation element 32 may be arranged in a predetermined pattern.

The predetermined pattern may be selected according to the source of the light 9. It can for example be selected according to the object or type of object that is to be imaged in one or both fields of view 16, 26.

In some examples the common modulation element 32 may be fixed in position relative to the other components of the apparatus 1. In other examples the common modulation element 32 may be arranged to be moveable between imaging measurements relative to the other components of the apparatus 1.

In some examples the transparency of the portions of the common modulation element 32 may be wavelength dependent. In such examples the modulation of the input light 9 by the respective portions of the common modulation element 32 will be dependent upon the wavelengths within the incident light 9.

The common modulation element 32 provides spatially modulated light 11 comprising spatially modulated light 11 from the first field of view 16 and the spatially modulated light 11 from the second field of view 26.

SPATIAL SHIFT

The spatial shift means 34 provides a spatial shift (an offset), when measured at the detector 7, between the spatially modulated light 11 from the first field of view 16 and the spatially modulated light 11 from the second field of view 26.

In some but not necessarily all examples, the spatial shift, when measured at the detector 7, is a lateral shift parallel to a row or a column of pixels in the detector 7.

In some but not necessarily all examples, the spatial shift is only in the plane of the light 9. In at least some examples, the spatial shift is only in one dimension. That one dimension can be aligned with a row (or a column) of pixels in the spatially coded aperture and/or pixels of the detector 7.

The spatial shift means 34 can be an optical element configured to provide a spatial shift (an offset), when measured at the detector 7, between the spatially modulated light 11 from the first field of view 16 and the spatially modulated light 11 from the second field of view 26.

A spatial offset can be achieved in a transmission configuration by passing light 9 from only one of the fields of view 16, 26 through a trapezoidal prism with input and output faces that are parallel to each other and at an angle to the path of the light 9. The trapezoidal prism could, for example, be a rectangular prism (cuboid) with a constant rectangular cross-section (in plane of angle) or a right parallelepiped with a constant parallelogram cross-section (in plane of angle).

A spatial offset can be achieved in a transmission configuration by differentially polarizing light from the different fields of view 16, 26 and then passing them simultaneously through a birefringent trapezoidal prism that has a refractive index that depends upon the polarization of the light. The birefringent trapezoidal prism has input and output faces that are parallel to each other and at an angle to the path of the light 9. The birefringent trapezoidal prism could, for example, be a rectangular prism (cuboid) with a constant rectangular cross-section (in plane of angle) or a right parallelepiped with a constant parallelogram cross-section (in plane of angle).

A spatial offset can be achieved in a reflection configuration by reflecting light 9 from the fields of view 16, 26 off differently angled mirrors. Other spatial shift means 34 may be used other than the examples provided.

DETECTOR

The detector 7 is configured to simultaneously detect the spatially modulated light 18 from the first field of view 16 and the spatially modulated light 28 from the second field of view 26.

The detector can be arranged to transduce incident light into an electrical output signal 15. In some examples the detector 7 may comprise a charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS) sensors, or any other suitable type of sensor.

In some examples the detector 7 may comprise a two-dimensional array of sensors (pixels).

In some examples the detector 7 is a low speed camera. In this context a low speed camera is, for example, a video camera with a frame rate of less than 100 frames per second, for example 30 frames per second. A low speed camera can, for example, have a period of detector integration (a frame period) that is greater than 10ms, for example 33.3ms.

The output electrical signal 15 from the detector can, for example, be stored in a memory and/or processed and/or transmitted. The reduced bandwidth provided by compressed sensing is therefore advantageous.

POLARIZATION

Fig. 2 illustrates an example of the apparatus illustrated in Fig. 1. In this example polarization is used, before modulation, to separate the light 9 from the first field of view 16 and the light 9 from the second field of view 26 into different optical channels. This allows the light 9 from the first field of view 16 and the light 9 from the second field of view 26 to be in orthogonal channels despite sharing a common optical path and a common modulation element 32.

In other examples, it may be possible to separate the channels in another domain such as frequency or time.

In Fig. 2, the apparatus 1 comprises: a first polarization element 70 in the first optical path 14 for polarizing (P1) the light 9 from the first field of view 16 and a second polarization element 71, different to the first polarization element 70, in the second optical path 24 for polarizing (P2) the light 9 from the second field of view 26. In this example, one of the first polarization element 70 and the second polarization element 71 provides horizontal polarization and the other provides vertical polarization.

The apparatus 1 comprises combining means 72 for combining the differently polarized light 9 from the first field of view 16 and the second field of view 26. The combining means 72 can for example be a 2x1 beam combiner.

The modulation means 3 applies different effective spatial modulation to the combined differently polarized light 9 from the first field of view 16 and the second field of view 26 to provide spatially modulated light 18 from the first field of view 16 and spatially modulated light 28 from the second field of view 26.

The detector 7 is configured to simultaneously detect the differently shifted, differently polarized, spatially modulated light 18, 28 from the first field of view 16 and the second field of view 26.

In this example, the modulation means 3 comprises a common modulation element 32 and a polarization-dependent spatial shift means 34.

The modulation element 32 applies common spatial modulation to the combined, differently polarized, light 9 from the first field of view 16 and the second field of view 26 to produce combined, differently polarized, spatially modulated, light 11 from the first field of view 16 and the second field of view 26.

The apparatus 1 also comprises a polarization-dependent spatial shift means 34 for applying a different spatial shift to the polarized, spatially modulated light 18 from the first field of view 16 compared to the polarized, spatially modulated light 28 from the second field of view 26.

MEASUREMENT

The light 9 used during measurement has a broad spectrum (it is broadband). It comprises light that has a broad frequency spectrum. The broadband light 9 can, for example, comprise light that has wavelengths that differ by over 20nm or even over 300nm. The broadband light 9 can, for example, comprise light that has wavelengths that differ by between 20nm and 50nm. The broadband light 9 can, for example, comprise light that has wavelengths from 450nm to 700nm (or even 900nm).

The spatially modulated light 11 is a three-dimensional data cube [x, y, n] with a two-dimensional slice [x, y] for each field of view channel n coded by the same spatially coded aperture that has variable transparency in the x-y plane.

The spatially modulated and spatially offset light 18, 28 represents a skewed version of the sparse three-dimensional data cube. The skew (offset) is within the x-y plane. In the example illustrated in Fig. 2 it is in the y-direction only.

Each spatially coded two-dimensional slice [x, y] for a field of view channel n is shifted (offset) by an amount y_n.

The detector 7 detects the superposition of the offset spatially coded two-dimensional slices [x, y] for each field of view channel n. This reduces the sparse three-dimensional data cube to a two-dimensional projection in a single shot.

It collapses overlapping differently masked images for different field of view channels to a single image. The different masked images are incoherent.

In other examples the detector 7 may comprise a linear detector which may be scanned across a detecting plane.

In some but not necessarily all examples, the processing means 10 uses the output signal 15 from the detector 7 to produce a first image of the first field of view 16 and a second image of the second field of view 26.

The processing means 10 can be a part of the apparatus 1 or, as shown in Fig. 1, separate from the apparatus 1. In some examples, the processing means 10 is remote from the apparatus 1. The processing means 10 can comprise a processor or controller and memory. The processing means 10 can load and use a computer program stored in the memory to perform its functions.

Fig. 3 shows the compressive sensing principle of examples of the disclosure.

Each of the images I_n corresponds to a different field of view n. The image I_14 is a first image corresponding to the first field of view 16 provided via the first optical path 14. The image I_24 is a second image corresponding to the second field of view 26 provided via the second optical path 24. The different spatial images define a three-dimensional signal [x, y, n].

In the example of Fig. 3 the spatial modulation element 32 comprises a two-dimensional spatially coded aperture. Other types of modulation element 32 may be used in other examples of the disclosure, for example as previously described.

The spatial images I_14, I_24 in the input light 9 are modulated by the common modulation element 32 to produce spatially modulated light 11.

The spatially modulated light 11 is a sparse three-dimensional data cube [x, y, n] with a two-dimensional slice [x, y] for each field of view channel n coded by the same spatially coded aperture that has variable transparency in the x-y plane.

The spatially modulated beam of light 11 provided by the spatial modulation element 32 is then differentially spatially shifted (in the y-direction only). The different field of view channels n are shifted by a different amount in the y-direction as shown schematically in Fig. 3.

The spatially modulated and shifted light 18, 28 is then incident upon the detector 7. The detector 7 comprises a plurality of pixels 25. Only one pixel 25 is shown for clarity in Fig. 3. The plurality of pixels 25 may be arranged in any suitable array. In the example of Fig. 3 the plurality of pixels 25 may be arranged in a matrix array comprising rows and columns. Each pixel 25 detects the summation of the modulated and spatially shifted light 18, 28 for the different fields of view 16, 26 for the area covered by the pixel 25.

As the different images I_14, I_24 are shifted by different amounts, they have passed through different (incoherent) coded apertures. The detector 7 detects the superposition of the offset spatially coded two-dimensional slices [x, y] for each field of view channel n. This reduces the sparse three-dimensional data cube to a compressed two-dimensional projection in a single shot.

In the above examples the input light 9 can be represented as N field of view channels. Each of the field of view channels has a spatial size N_x × N_y.

The signal provided to the detector 7 may be represented as:

S_m(x, y) = Σ_n S_0(x, y, n) · M(x, y, n)

where S_0(x, y, n) represents the input light 9 from a particular field of view n, and M(x, y, n) represents the combined effect of the spatial modulation and differential spatial offset.

Converting from the object space [x, y] to the detector space [i, j], the measurement z_{i,j} of S_m(x, y), obtained by the (i, j)th pixel, where 1 ≤ i ≤ N_x and 1 ≤ j ≤ N_y, is given by equation (1):

z_{i,j} = Σ_{n=1}^{N} S_0(i, j, n) · M(i, j, n)     (1)

where S_0(i, j, n) is the (quantized) three-dimensional input signal and M(i, j, n) is a (quantized) function representing a combination of the common spatial modulation and the differential spatial offset. The value n represents a field of view channel. The function M(i, j, n) will be dependent on the transparencies of the portions of the spatial modulation element 32, the spatial arrangement of the portions of the spatial modulation element 32, the spatial offset means 34 and any other suitable factors.

The function M(i, j, n) can be modelled as a series of 2D masks, one for each field of view, each 2D mask being generated by the same spatially coded aperture mask with an appropriate field of view dependent shift.

In this example, but not necessarily all examples, let us assume a one-to-one correspondence between the [i, j] space at the detector 7, where 1 ≤ i ≤ N_x and 1 ≤ j ≤ N_y, and the [x, y] space at the coded aperture, where 1 ≤ x ≤ N_x and 1 ≤ y ≤ N_y. As an example, when the spatial shift means 34 causes a spatial shift d(Δn) in the y-direction then:

M(i, j, n) = T(i, j − d(Δn))

where T(x, y) denotes the transmission pattern of the common spatially coded aperture.

The 2D mask for each field of view channel can be represented as a matrix. This allows the measurement z obtained by the pixels 25 to be written in matrix form as equation (5):

z = H s     (5)

where z is the N_x N_y × 1 vectorized version of the measurement obtained by the pixels 25, that is z = [z_{1,1}, z_{2,1}, ..., z_{N_x,N_y}]^T; s is the N_x N_y N × 1 stacked vector of the three-dimensional input light S_0(x, y, n), that is s = [s_1^T, s_2^T, ..., s_N^T]^T where s_n is the vectorized 2D slice for field of view channel n; and H is the N_x N_y × N_x N_y N sensing matrix which can, in an appropriate basis, be represented by a concatenation of diagonal matrices:

H = [D_1, D_2, ..., D_N], where D_n = diag(vec(M(·, ·, n))).
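The measurement model can be sketched numerically. The code below is a minimal illustration with assumed dimensions, shifts and random binary mask values (not the specific implementation of the disclosure): it builds per-channel effective masks by shifting a common coded aperture in the y-direction, forms the single-shot measurement, and checks that it equals the matrix form z = Hs with H a concatenation of diagonal blocks:

```python
import numpy as np

rng = np.random.default_rng(0)
Nx, Ny, N = 8, 8, 2        # spatial size and number of field of view channels
d = [0, 2]                 # per-channel shift in the y-direction (assumed values)

mask = rng.integers(0, 2, size=(Nx, Ny + max(d)))  # common coded aperture, padded in y
S0 = rng.random((Nx, Ny, N))                       # input data cube S_0[x, y, n]

# Effective per-channel mask M[:, :, n]: the same aperture read out with a
# channel-dependent offset in y.
M = np.stack([mask[:, d[n]:d[n] + Ny] for n in range(N)], axis=2)

# Single-shot measurement: each pixel sums the masked, shifted channels.
z = (M * S0).sum(axis=2)

# Equivalent matrix form z = H s, with H a concatenation of diagonal blocks.
H = np.hstack([np.diag(M[:, :, n].ravel()) for n in range(N)])
s = np.concatenate([S0[:, :, n].ravel() for n in range(N)])
assert np.allclose(H @ s, z.ravel())
```

Note that H has N_x·N_y rows and N_x·N_y·N columns, so the detector output compresses the data cube by a factor of N.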

Any suitable compressive sensing inversion algorithms may be used by processing means 10 to solve equation (5) to obtain the desired image. For example, non-linear optimization can be used to reproduce the first and second images.
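As one illustration of such an inversion, the sketch below recovers a sparse signal from a compressed measurement with the iterative shrinkage-thresholding algorithm (ISTA), one of many applicable solvers; the matrix size, sparsity level and regularisation weight are assumed example values, not parameters from the disclosure:

```python
import numpy as np

def ista(H, z, lam=0.01, iters=300):
    """Iterative shrinkage-thresholding (ISTA): minimise
    0.5 * ||z - H s||^2 + lam * ||s||_1 by alternating gradient steps
    with soft-thresholding, which promotes a sparse solution s."""
    step = 1.0 / np.linalg.norm(H, 2) ** 2   # 1 / Lipschitz constant of the gradient
    s = np.zeros(H.shape[1])
    for _ in range(iters):
        s = s - step * (H.T @ (H @ s - z))                        # gradient step
        s = np.sign(s) * np.maximum(np.abs(s) - step * lam, 0.0)  # soft threshold
    return s

rng = np.random.default_rng(1)
H = rng.integers(0, 2, size=(32, 64)).astype(float)  # toy sensing matrix
s_true = np.zeros(64)
s_true[rng.choice(64, size=5, replace=False)] = 1.0  # sparse "scene"
z = H @ s_true                                       # compressed measurement
s_hat = ista(H, z)                                   # recovered estimate
```

The same structure applies to the multi-field-of-view case: H would be the concatenated diagonal-block sensing matrix and the recovered s would be unstacked into one image per field of view.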

The sparsity of the transfer function H, which represents the combined effect of the spatial modulating means 3, causes information compression.

It will be appreciated from the foregoing that the apparatus 1 can be used to simultaneously compressively sense two or more fields of view. The approach is not necessarily limited to simultaneously compressively sensing only two fields of view.

MULTIPLE IMAGES IN A SINGLE SHOT

In some but not necessarily all examples, multiple images I for each field of view 16, 26 can be detected in a single exposure time of the detector 7. Each of the multiple images I, from the multiple fields of view 16, 26, can be differently coded allowing them to be recovered from the output 15 of the detector 7. The different coding can, for example, be achieved:
i) using a variable code that is common to both fields of view, and a spatial offset that is different between the fields of view and is fixed. The variation in the code can be: a) a variation over time (time domain) or b) a variation over frequency (spectral domain); or
ii) using a fixed code that is common to both fields of view, and a spatial offset that is different between the fields of view and is also variable. The variation in the spatial offset can be: a) a variation over time (time domain) or b) a variation over frequency (spectral domain).

Considering the temporal domain scenarios i) a) and ii) a), each image plane in the temporal data cube (x,y,t) is modulated by a different code. Detection integrates temporally distinct image planes, but the data cube can be recovered by isolating each temporal image plane based on the distinct uncorrelated code patterns for the planes.

The modulation means 3 is a dynamic modulation means 3 for applying different time-varying effective spatial modulation to the light 9 from the first field of view 16 and the light 9 from the second field of view 26 to provide spatially modulated light 18 from the first field of view 16 and spatially modulated light 28 from the second field of view 26. The time-varying effective spatial modulation varies over a period of detector integration.

In the scenario i) a), the dynamic modulation means 3 comprises a dynamic common modulation element 32 that applies the same time-varying spatial modulation to the light 9 from the first field of view 16 and to the light 9 from the second field of view 26. The dynamic modulation means 3 additionally comprises spatial shift means 34 that provides a fixed spatial shift at the detector between the modulated light 9 from the first field of view 16 and the modulated light 9 from the second field of view 26. The dynamic common modulation element 32 could comprise, for example, a liquid crystal on silicon (LCOS) modulator, a digital micromirror device (DMD) array, a fixed pattern (passive) spatially coded aperture that is moved (translated or rotated). Movement can be performed by an electrically controlled actuator. One example of an electrically controlled actuator is a motor, for example a servo motor. In some examples, the actuator is controlled by the processing means 10. The processing means 10 acts as a driver. In other examples, the actuator is pre-programmed to perform a defined series of relative movements.

In the scenario ii) a), the dynamic modulation means 3 comprises a fixed common modulation element 32 that applies the same fixed spatial modulation to the light 9 from the first field of view 16 and to the light 9 from the second field of view 26. The dynamic modulation means 3 additionally comprises dynamic spatial shift means 34 that provides a time-varying spatial shift at the detector 7 between the modulated light 9 from the first field of view 16 and the modulated light 9 from the second field of view 26. The time-varying spatial shift can, for example, be achieved by moving an optical element configured to provide a spatial shift (an offset), when measured at the detector 7, between the spatially modulated light 11 from the first field of view 16 and the spatially modulated light 11 from the second field of view 26. The optical element could be a trapezoidal prism or a mirror, for example. Movement can be performed by an electrically controlled actuator. One example of an electrically controlled actuator is a motor, for example a servo motor. In some examples, the actuator is controlled by the processing means 10. The processing means 10 acts as a driver. In other examples, the actuator is pre-programmed to perform a defined series of relative movements.

Considering the spectral domain scenarios i) b) and ii) b), each image plane in the spectral data cube (x,y,f) is modulated by a different code. Detection integrates spectrally distinct image planes, but the data cube can be recovered by isolating each spectral image plane based on the distinct uncorrelated code patterns for the planes.

The modulation means 3 is spectral modulation means for applying different frequency-varying effective spatial modulation to the light 9 from the first field of view 16 and the light 9 from the second field of view 26 to provide spatially modulated light 18 from the first field of view 16 and spatially modulated light 28 from the second field of view 26.

In scenario i) b) the spectral modulation means 3 comprises a spectral common modulation element 32 that applies the same frequency-varying spatial modulation to the light 9 from the first field of view 16 and to the light 9 from the second field of view 26. The spectral modulation means 3 additionally comprises spatial shift means 34 that provides a fixed spatial shift at the detector 7 between the modulated light 9 from the first field of view 16 and the modulated light 9 from the second field of view 26. The frequency-varying spatial modulation may be provided by a coded aperture that has a frequency-dependent code pattern.

In scenario ii) b) the spectral modulation means 3 comprises a fixed common modulation element 32 that applies a fixed spatial modulation to the light 9 from the first field of view 16 and to the light 9 from the second field of view 26. The spectral modulation means 3 additionally comprises spectral spatial shift means 34 that provides a frequency-dependent spatial shift at the detector 7 between the modulated light 9 from the first field of view 16 and the modulated light 9 from the second field of view 26. The spectral spatial shift means 34 can, for example, be a dispersing element. A dispersing element may comprise a refractive element, for example a prism, or a diffractive element, for example a grating, which can be a transmissive diffraction grating, a reflective diffraction grating or any other suitable element. The dispersing element can be a prism or a combination of prisms. A prism is a polyhedron with two faces parallel, and with the surface normals of the other faces lying in the same plane. The or each prism can be a triangular prism. The triangular prism can have a constant triangular cross-section that has the shape of an isosceles triangle or an equilateral triangle.

AN EXAMPLE IMPLEMENTATION

Fig. 4 illustrates an example of an apparatus 1 that uses a time-variable code that is common to both fields of view, and a spatial offset that is different between the fields of view and is fixed (scenario i) a)).

The apparatus 1 comprises, as described for Fig. 2: a first polarization element 70 in the first optical path 14 for polarizing (P1) the light 9 from the first field of view 16; a second polarization element 71, different to the first polarization element 70, in the second optical path 24 for polarizing (P2) the light 9 from the second field of view 26; combining means 72 for combining the differently polarized light 9 from the first field of view 16 and the second field of view 26; modulation means 3 and the detector 7.

The modulation means 3 comprises: a common dynamic modulation element 32 for applying a time-varying spatial modulation to the combined, differently polarized, light 9 from the first field of view 16 and the second field of view 26 to produce combined, differently polarized, spatially modulated, light 11 from the first field of view 16 and the second field of view 26; splitting means 73 for selectively splitting, based on polarization, the combined, differently polarized, spatially modulated, light 11 from the first field of view 16 and the second field of view 26 to produce in the first optical path 14 polarized, spatially modulated light 18 from the first field of view 16 and to produce in the second optical path 24 polarized, spatially modulated light 28 from the second field of view 26; and spatial shift means 34 for applying a different spatial shift to the polarized, spatially modulated light 18 from the first field of view 16 in the first optical path 14 compared to the polarized, spatially modulated light 28 from the second field of view 26 in the second optical path 24.

The detector 7 is configured to simultaneously detect the differently shifted, differently polarized, spatially modulated light 18, 28 from the first field of view 16 and the second field of view 26.

In this example, the common dynamic modulation element 32 is a single digital micromirror device (DMD) array. However other suitable types of spatial modulators include a liquid crystal on silicon (LCOS) modulator and a fixed pattern (passive) spatially coded aperture that is moved (translated or rotated).

In this example, the spatial shift means 34 for applying a different spatial shift comprises an angled optic. The optic is angled in that an axis of the optic is angled to the path of light 9.

In some but not necessarily all examples, the angled optic is an angled transparent trapezoidal prism. The spatial offset is achieved by passing light 9 from the second field of view 26 (but not light from the first field of view 16) through the trapezoidal prism once. If the light 9 from the second field of view 26 passes through the trapezoidal prism on an outward path from the common dynamic modulation element 32, then it does not pass through the trapezoidal prism on a return path towards the detector 7. If the light 9 from the second field of view 26 does not pass through the trapezoidal prism on an outward path from the common dynamic modulation element 32, then it does pass through the trapezoidal prism on a return path towards the detector 7. The paths can be controlled using mirrors and a beam combiner. The trapezoidal prism is angled because it has input and output faces that are parallel to each other and at an angle to the path of the light 9. The trapezoidal prism could, for example, be a rectangular prism (cuboid) with a constant rectangular cross-section (in plane of angle) or a right parallelepiped with a constant parallelogram cross-section (in plane of angle).

In some examples, the spatial shift means 34 for applying a different spatial shift is an angled optic comprising an angled mirror and does not comprise a transparent trapezoidal prism.

The angled mirror can, for example, be a flat mirror that has a normal vector tilted at an angle to the path of the light 9 from the second field of view 26 coming from the common dynamic modulation element 32.
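In the mirror case, a tilt of the flat mirror by an angle deviates the reflected beam by twice that angle, so the offset at the detector grows with the remaining propagation distance. A minimal sketch, in which the tilt angle and mirror-to-detector distance are assumed example values rather than figures from the disclosure:

```python
import math

def mirror_offset_mm(tilt_deg, distance_mm):
    """Lateral offset at the detector from tilting a flat mirror: a tilt
    of alpha deviates the reflected beam by 2*alpha, so after propagating
    distance_mm the beam is displaced by distance_mm * tan(2*alpha)."""
    return distance_mm * math.tan(math.radians(2.0 * tilt_deg))

# Assumed example geometry: 0.1 degree tilt, 100 mm from mirror to detector.
offset = mirror_offset_mm(0.1, 100.0)
```

This geometry makes the offset easy to tune: either the tilt or the propagation distance can be adjusted to set the shift between the two field of view channels at the detector.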

The term “comprise” is used in this document with an inclusive not an exclusive meaning. That is any reference to X comprising Y indicates that X may comprise only one Y or may comprise more than one Y. If it is intended to use “comprise” with an exclusive meaning then it will be made clear in the context by referring to “comprising only one...” or by using “consisting”.

In this brief description, reference has been made to various examples. The description of features or functions in relation to an example indicates that those features or functions are present in that example. The use of the term ‘example’ or “for example” or “may” in the text denotes, whether explicitly stated or not, that such features or functions are present in at least the described example, whether described as an example or not, and that they can be, but are not necessarily, present in some of or all other examples. Thus “example”, “for example” or “may” refers to a particular instance in a class of examples. A property of the instance can be a property of only that instance or a property of the class or a property of a sub-class of the class that includes some but not all of the instances in the class. It is therefore implicitly disclosed that a feature described with reference to one example but not with reference to another example, can where possible be used in that other example but does not necessarily have to be used in that other example.

Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed. Features described in the preceding description may be used in combinations other than the combinations explicitly described.

Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not.

Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.

Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.