

Title:
STEREO IMAGING APPARATUS
Document Type and Number:
WIPO Patent Application WO/2020/120842
Kind Code:
A1
Abstract:
An apparatus (500) comprises: - an imaging device (CAM1) comprising an input element (100), a focusing unit (300) and an image sensor (DET1), wherein the input element (100) has a symmetry axis (AX1), wherein a viewing region (ZONE1) surrounds the input element (100), and wherein the imaging device (CAM1) is arranged to form an annular image (IMG1) of the viewing region (ZONE1) on the image sensor (DET1), - a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second diffracted light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2), - wherein the imaging device (CAM1) is arranged to form a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and the imaging device (CAM1) is arranged to form a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second diffracted light ray (LR2b, LR2c) to the image sensor (DET1).

Inventors:
MÄKINEN JUKKA-TAPANI (FI)
OJALA KAI (FI)
Application Number:
PCT/FI2019/050886
Publication Date:
June 18, 2020
Filing Date:
December 12, 2019
Assignee:
TEKNOLOGIAN TUTKIMUSKESKUS VTT OY (FI)
International Classes:
G03B35/10; G03B37/06; G06V10/00; H04N13/229
Domestic Patent References:
WO2015181443A1, 2015-12-03
Foreign References:
US20180039050A1, 2018-02-08
US20130147919A1, 2013-06-13
US20110074917A1, 2011-03-31
Attorney, Agent or Firm:
BERGGREN OY (FI)
Claims:
CLAIMS

1. An apparatus (500), comprising:

- an imaging device (CAM1) comprising an input element (100), a focusing unit (300) and an image sensor (DET1), wherein the input element (100) has a symmetry axis (AX1), wherein a viewing region (ZONE1) surrounds the input element (100), and wherein the imaging device (CAM1) is arranged to form an annular image (IMG1) of the viewing region (ZONE1) on the image sensor (DET1),

- a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2),

- wherein the imaging device (CAM1) is arranged to form a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and the imaging device (CAM1) is arranged to form a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second light ray (LR2b, LR2c) to the image sensor (DET1).

2. The apparatus (500) of claim 1, wherein the diffractive element (G1) comprises a plurality of substantially linear diffractive features (DF1), the diffractive features (DF1) are located on a substantially cylindrical surface which is concentric with an axis (AX1) of symmetry of the input element (100), and the linear diffractive features (DF1) are substantially parallel with the axis (AX1).

3. The apparatus (500) of claim 1 or 2, wherein the radial distance (rP1a) of the first image point (P1a) from the center (AX1) of the annular image (IMG1) is substantially equal to the radial distance (rP1b) of the second image point (P1b) from the center (AX1) of the annular image (IMG1).

4. The apparatus (500) according to any of the claims 1 to 3, wherein the input element (100) is arranged to provide deflected light (LB10) by deflecting light of the light rays (LR2a, LR2b, LR2c) received from the diffractive element (G1), and the focusing unit (300) is arranged to focus the deflected light (LB10) to the image sensor (DET1).

5. The apparatus (500) according to any of the claims 1 to 4, wherein the input element (100) is axially symmetric with respect to a first axis (AX1), and wherein the viewing region (ZONE1) surrounds the first axis (AX1).

6. The apparatus (500) according to any of the claims 1 to 5, wherein the line density (1/dG) of the diffractive element (G1) is in the range of 50/mm to 1200/mm.

7. The apparatus (500) according to any of the claims 1 to 6, wherein the diffractive element (G1) comprises a diffractive foil wrapped around the input element (100).

8. The apparatus (500) according to any of the claims 1 to 7, wherein the radius of curvature (rG1) of the grating element (G1) is greater than 10 times the width (Weff) of the effective optical aperture of the imaging device (CAM1).

9. The apparatus (500) according to any of the claims 1 to 8, comprising a spectrally selective optical filter (FIL1) to limit the spectral bandwidth of light focused to the image sensor (DET1).

10. The apparatus (500) according to any of the claims 1 to 9, wherein the apparatus (500) comprises a data processing unit (CNT1) configured to detect the angular position (φ1) of the first image point (P1a) with respect to the angular position (φ2) of the second image point (P1b, P1c), and to determine a distance value (L1) from the detected angular position (φ1).

11. The apparatus (500) according to any of the claims 1 to 10, wherein the apparatus (500) is configured to recognize a first partial image (SUB1a) of a first object (OBJ1) and a second partial image (SUB1b, SUB1c) of the first object (OBJ1) by image recognition.

12. The apparatus (500) according to any of the claims 1 to 11, wherein the input element (100) is a catadioptric element, which comprises a refractive input surface (SRF1), a first reflective surface (SRF2), a second reflective surface (SRF3), and a refractive output surface (SRF4).

13. The apparatus (500) according to any of the claims 1 to 11, wherein the input element (100) comprises a reflective surface selected from a group consisting of a paraboloid surface, a hyperboloid surface, a conical surface, and an ellipsoid surface.

14. The apparatus (500) according to any of the claims 1 to 13, wherein a first angle (α1) defining the first boundary (ZB1) of the viewing region (ZONE1) is in the range of +10° to +60° with respect to a horizontal plane (REF1), and a second angle (α2) defining a second boundary (ZB2) of the viewing region (ZONE1) with respect to a horizontal plane (REF1) is in the range of -60° to -10°.

15. A vehicle (1000), comprising the apparatus (500) according to any of the claims 1 to 14.

16. A method, comprising:

- using an omnidirectional imaging device (CAM1) to form an annular image (IMG1) of a viewing region (ZONE1) on an image sensor (DET1), wherein the viewing region (ZONE1) surrounds an input element (100) of the imaging device (CAM1), and wherein the input element (100) has an axis (AX1) of symmetry,

- using a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2),

- forming a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and

- forming a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second light ray (LR2b, LR2c) to the image sensor (DET1).

17. The method of claim 16, wherein the diffractive element (G1) comprises a plurality of substantially linear diffractive features (DF1), the diffractive features (DF1) are located on a substantially cylindrical surface which is concentric with an axis (AX1) of symmetry of the input element (100), and the linear diffractive features (DF1) are substantially parallel with the axis (AX1).

18. The method of claim 16 or 17, wherein the input element (100) is arranged to provide deflected light (LB10) by deflecting light of the light rays (LR2a, LR2b, LR2c) received from the diffractive element (G1), and the focusing unit (300) is arranged to focus the deflected light (LB10) to the image sensor (DET1).

19. The method according to any of the claims 16 to 18, comprising detecting an angular position (φ1) of the first image point (P1a) with respect to the second image point (P1b, P1c), and determining a distance value (L1) from the detected angular position (φ1).

Description:
STEREO IMAGING APPARATUS

FIELD

Some aspects relate to capturing a stereo image.

BACKGROUND

It is known that a stereo image of an object may be captured by capturing a first image of the object with a first camera, by capturing a second image of the object with a second camera, and by associating the first captured image with the second captured image.

It is known that the distance to the object may be determined by capturing the first image of the object with the first camera, by capturing the second image of the object with the second camera, and by comparing the first captured image with the second captured image.

It is known that the distance between a single camera and an object may be determined e.g. by capturing a first image of the object when the camera is at a first transverse position, by moving the camera in a transverse direction, by capturing a second image of the object when the camera is in a second transverse position, and by comparing the first captured image with the second captured image.

SUMMARY

Some versions may relate to a stereo imaging apparatus. Some versions may relate to an apparatus for measuring a distance. Some versions may relate to a method for capturing a stereo image. Some versions may relate to a method for measuring a distance. Some versions may relate to a method for monitoring position of an object. Some versions may relate to measuring the position of a vehicle. Some versions may relate to a position monitoring device. Some versions may relate to a vehicle, which comprises a position monitoring device.

According to an aspect, there is provided an apparatus (500), comprising:

- an imaging device (CAM1) comprising an input element (100), a focusing unit (300) and an image sensor (DET1), wherein the input element (100) has a symmetry axis (AX1), wherein a viewing region (ZONE1) surrounds the input element (100), and wherein the imaging device (CAM1) is arranged to form an annular image (IMG1) of the viewing region (ZONE1) on the image sensor (DET1),

- a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2),

- wherein the imaging device (CAM1) is arranged to form a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and the imaging device (CAM1) is arranged to form a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second light ray (LR2b, LR2c) to the image sensor (DET1).

Further aspects are defined in the claims.

The stereo imaging apparatus may comprise a curved diffraction grating, which surrounds an input element of an omnidirectional imaging device. The input element may be e.g. a catadioptric lens or a paraboloid reflector, which has an axis of symmetry. The imaging device may have a wide viewing region about the symmetry axis. The imaging device and the apparatus may have a viewing region, which completely surrounds the axis. The viewing region may represent a 360° angle about the axis. The input element of the apparatus may be called e.g. an omnidirectional lens or a panoramic lens. The imaging device may be called e.g. an omnidirectional camera or a panoramic camera. The stereo imaging apparatus may form an annular image of the viewing region on the image sensor. The apparatus may be arranged to capture the annular stereo image formed on the image sensor.

The annular stereo image may comprise a first partial image of an object and a second partial image of the same object, wherein the angular position of the first partial image with respect to the second partial image may depend on the distance between the object and the input element. The apparatus may be arranged to detect the angular position of the first partial image with respect to the second partial image, and the apparatus may be arranged to determine a distance value from the detected angular position of the first partial image.

Distance information may be determined from angular separation between adjacent partial images of the annular image. The distance information may be determined from the annular image e.g. by using image recognition and/or by using directed illumination.

The apparatus may be configured to determine a distance between an object and the apparatus by analyzing the captured annular image.

The analysis of the captured annular image may also be performed in a distributed manner, e.g. by using a service running on a remote computer and/or in an internet server.

The apparatus may have a 360° horizontal view around the vertical axis. The monitoring device may provide position information by measuring the positions of objects. The monitoring device may provide position information e.g. for controlling operation of a vehicle. The information may be used e.g. for controlling the velocity and/or direction of the vehicle.

The apparatus may be arranged to provide information about the positions of objects located near said apparatus. The position information may be used e.g. for controlling operation of an autonomous vehicle. The control system of an autonomous vehicle may use real time information about positions of the objects e.g. in order to control the velocity and/or direction of the vehicle. The viewing region may completely surround the grating element. The viewing region may correspond to a 360° horizontal view around the vertical axis. The apparatus may measure distances to objects which are within the viewing region. The apparatus may measure distances to multiple objects by analyzing a single captured image. The objects may be located at arbitrary positions around the grating element. The monitoring device does not need to comprise any moving parts. In particular, the monitoring device does not need to comprise a rotating mirror.

BRIEF DESCRIPTION OF THE DRAWINGS

In the following examples, several variations will be described in more detail with reference to the appended drawings, in which

Fig. 1 shows, by way of example, in a three-dimensional view, an omnidirectional imaging device,

Fig. 2 shows, by way of example, in a three-dimensional view, a stereo imaging apparatus,

Fig. 3 shows, by way of example, in a three-dimensional view, a portion of a diffraction grating,

Fig. 4 shows, by way of example, in a top view, diffracted light rays formed by a cylindrical diffraction grating,

Fig. 5a shows, by way of example, in a top view, diffracted light rays formed by the cylindrical diffraction grating, in a situation where the object point is at a first distance,

Fig. 5b shows, by way of example, in a top view, diffracted light rays formed by the cylindrical diffraction grating, in a situation where the object point is at a second different distance,

Fig. 5c shows, by way of example, in a three-dimensional view, the azimuthal direction of a first light ray, and the azimuthal direction of a second light ray,

Fig. 6a shows, by way of example, in a cross-sectional side view, propagation of light in the stereo imaging apparatus,

Fig. 6b shows, by way of example, in a three-dimensional view, propagation of light in the stereo imaging apparatus,

Fig. 7 shows, by way of example, in a top view, angular position of a first partial image and angular position of a second partial image,

Fig. 8 shows, by way of example, in a side view, an effective input pupil of the imaging device,

Fig. 9 shows, by way of example, in a three-dimensional view, upper and lower boundary of a viewing region of the stereo imaging apparatus,

Fig. 10 shows, by way of example, in a three-dimensional view, a vehicle, which comprises the stereo imaging apparatus,

Fig. 11 shows, by way of example, functional units of the stereo imaging apparatus,

Fig. 12 shows, by way of example, in a cross-sectional side view, a stereo imaging apparatus, which comprises a curved reflective surface, and

Fig. 13 shows, by way of example, in a cross-sectional side view, a stereo imaging apparatus, which comprises a wavefront modifying unit.

DETAILED DESCRIPTION

Referring to Fig. 1, the omnidirectional imaging device CAM1 may comprise an input element 100, an aperture stop AS1, a focusing unit 300, and an image sensor DET1. The input element 100 may gather light LB1 from the viewing region ZONE1. The input element 100 may receive light LB1 from the viewing region ZONE1, and the input element 100 may deflect at least a part of the received light to the focusing optics 300. The input element 100 may provide deflected light LB10 towards the focusing unit 300. The focusing unit 300 may form the annular image IMG1 on the image sensor DET1 by focusing the deflected light LB10. The input element 100 may provide a deflected beam LB10 e.g. by reflecting light LB1 gathered from the viewing region ZONE1, and the focusing unit 300 may provide focused light LB12 by focusing light of a deflected light beam LB10. The focused light may impinge on the image sensor DET1 so as to form an annular image IMG1.

The omnidirectional imaging device CAM1 may be arranged to receive light from the viewing region ZONE1. The imaging device CAM1 may be arranged to form the annular image IMG1 on an image sensor DET1. The image sensor DET1 may convert the optical image IMG1 into a digital image. The imaging device CAM1 may be arranged to capture an annular image IMG1, which comprises images of objects OBJ1 located in the viewing region ZONE1. The captured image IMG1 may be subsequently processed and/or analyzed e.g. by using a data processor CNT1 (Fig. 11).

The imaging device CAM1 may form an image SUB1 of an object OBJ1, which is located in the viewing region ZONE1. The input element 100 may gather light LB1 from the object OBJ1. The image SUB1 may be called e.g. a partial image or a sub-image. The imaging device CAM1 may form an image P1 of a point P0 of the object OBJ1. The image point P1 may be a point of the partial image SUB1. The annular image IMG1 may comprise the partial image SUB1, which in turn may comprise the image point P1.

The input element 100 may be axially symmetric with respect to the axis AX1. The viewing region ZONE1 may completely surround the axis AX1. The input element 100 may be e.g. a catadioptric lens, which is arranged to receive light from the viewing region ZONE1. The catadioptric lens may have e.g. two refractive surfaces and two reflective surfaces to provide a folded optical path. The folded optical path may allow reducing the size of the imaging device CAM1. The imaging device CAM1 may have a low height, due to the folded optical path.

The optical elements of the imaging device CAM1 may be axially symmetric with respect to the symmetry axis AX1. The imaging device CAM1 may form an annular image IMG1 of a viewing region ZONE1, which surrounds the symmetry axis AX1.

The image sensor DET1 may be e.g. a CCD sensor or a CMOS sensor. CCD means charge coupled device. CMOS means Complementary Metal Oxide Semiconductor.

The imaging device CAM1 may comprise an aperture stop AS1 e.g. to define an input pupil EPU1 of the imaging device CAM1. The aperture stop AS1 may e.g. limit or reduce optical aberrations of the imaging device CAM1. The aperture stop AS1 may be positioned e.g. between the input element 100 and the focusing unit 300.

SX, SY and SZ may denote orthogonal directions. The symmetry axis AX1 may be parallel with the direction SZ.

The vertical direction SZ may be parallel to the direction of gravity, but the direction SZ does not need to be parallel to the direction of gravity.

A reference plane REF1 may be perpendicular to the axis AX1. The viewing region ZONE1 may have a first conical boundary ZB1 and a second conical boundary ZB2. α1 may denote an angle between the first conical boundary ZB1 and the reference plane REF1. α2 may denote an angle between the second conical boundary ZB2 and the reference plane REF1. The angle α1 may be e.g. in the range of +10° to +60°, and the angle α2 may be e.g. in the range of -60° to 0°.

The reference plane REF1 may be e.g. a horizontal plane. The viewing region ZONE1 may have an upper conical boundary ZB1 and a lower conical boundary ZB2. The vertical field of view (= α1 + |α2|) of the device 100 may be e.g. in the range of 10° to 120°.

The imaging device CAM1 may further comprise a wavefront modifying unit 200 to increase the resolution of the image IMG1 (Fig. 13). The wavefront modifying unit 200 may e.g. correct wavefront distortion of a light beam received from the input element 100. The wavefront modifying unit 200 may e.g. collimate a light beam received from the input element 100.

Referring to Fig. 2, the stereo imaging apparatus 500 may comprise a curved diffraction grating element G1 and the omnidirectional imaging device CAM1. The curved diffraction grating element G1 may surround the input element 100 of the imaging device CAM1.

The grating element G1 may provide diffracted light LB2a, LB2b, LB2c by diffracting light received from an object point P0 of an object OBJ1. The object OBJ1 may be located in the viewing region ZONE1. The object point P0 may be located in the viewing region ZONE1.

The apparatus 500 may comprise:

- an input element 100 to gather light from a viewing region ZONE1,

- an image sensor DET1,

- a focusing unit 300 to form an annular image IMG1 of the viewing region ZONE1 on the image sensor DET1 by focusing the gathered light, and

- a diffractive element G1 surrounding the input element 100,

wherein the diffractive element G1 is arranged to provide diffracted light rays by diffracting light gathered from an object point P0 located in the viewing region ZONE1, wherein the imaging device CAM1 is arranged to form partial images SUB1a, SUB1b, SUB1c on the image sensor DET1 by focusing light of the diffracted light rays to the image sensor DET1.

The curved diffraction grating G1 may completely surround the input element 100 of the imaging device CAM1. The curved diffraction grating G1 may be e.g. a substantially cylindrical diffractive element, which may encircle the input element 100. The diffractive features DF1 of the grating element G1 may be located on a substantially cylindrical surface. The grating G1 may provide two or more diffracted light beams LB2a, LB2b, LB2c by diffracting light received from the point P0 of the object OBJ1. The grating G1 may provide a first diffracted light beam LB2a by diffracting light LB1 received from the point P0. The grating G1 may provide a second diffracted light beam LB2b by diffracting light LB1 received from the point P0. The grating G1 may provide a third diffracted light beam LB2c by diffracting light LB1 received from the point P0. The first diffracted light beam LB2a may be formed e.g. in the diffraction order m=1. The second diffracted light beam LB2b may be formed e.g. in the diffraction order m=0. The third diffracted light beam LB2c may be formed e.g. in the diffraction order m=-1. The diffracted light beams LB2a, LB2b, LB2c may propagate towards the symmetry axis AX1 of the input element 100.

The imaging device CAM1 may form a first image point P1a by focusing light of the first diffracted light beam LB2a. The imaging device CAM1 may form a second image point P1b by focusing light of the second diffracted light beam LB2b. The imaging device CAM1 may form a third image point P1c by focusing light of the third diffracted light beam LB2c.

The input element 100 may provide deflected light (LB10) by deflecting light (LR2a, LR2b, LR2c) received from the diffractive element G1. The input element 100 may provide deflected light (LB10) by deflecting light of the light beams (LR2a, LR2b, LR2c) received from the diffractive element G1. The focusing unit 300 may focus the deflected light (LB10) to the image sensor DET1. The focusing unit 300 may form e.g. the image points P1a, P1b, P1c on the image sensor DET1, by focusing the deflected light (LB10).

The angle φ1 may denote the angular distance between the first image point P1a and the second image point P1b of the annular image IMG1, with respect to the symmetry axis AX1. The angle φ2 may denote the angular distance between the second image point P1b and the third image point P1c of the annular image IMG1, with respect to the symmetry axis AX1.

The angular distance φ1, φ2 may depend on the distance L1 between the point P0 and the symmetry axis AX1. The angular distance φ1 and/or φ2 may be determined from the annular image IMG1. The distance L1 may be determined from the angular distance φ1 and/or φ2, respectively.
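The angular distances can be measured from the captured annular image by converting the pixel coordinates of the detected image points into azimuth angles about the image center. A minimal sketch of this step follows; the function names and the example coordinates are illustrative and are not taken from the application:

```python
import math

def azimuth_deg(x, y, cx, cy):
    """Azimuth angle (degrees) of an image point (x, y) about the
    center (cx, cy) of the annular image."""
    return math.degrees(math.atan2(y - cy, x - cx))

def angular_separation_deg(p_a, p_b, center):
    """Angular distance between two image points about the image center,
    wrapped into the range 0..180 degrees."""
    cx, cy = center
    d = azimuth_deg(p_a[0], p_a[1], cx, cy) - azimuth_deg(p_b[0], p_b[1], cx, cy)
    d = abs(d) % 360.0
    return 360.0 - d if d > 180.0 else d

# Example: two image points at the same radial distance (200 px) from
# the image center, 30 degrees apart in azimuth
center = (512.0, 512.0)
p1a = (512.0 + 200.0 * math.cos(math.radians(30.0)),
       512.0 + 200.0 * math.sin(math.radians(30.0)))
p1b = (512.0 + 200.0, 512.0)  # azimuth 0 degrees
phi1 = angular_separation_deg(p1a, p1b, center)  # 30 degrees
```

The wrap into 0..180° keeps the separation well defined when the two points straddle the atan2 branch cut at ±180°.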

The grating element G1 may be e.g. a transmissive surface relief grating, which comprises a plurality of substantially parallel grooves positioned according to a grating constant dG.

The diffractive element G1 of the apparatus 500 may be formed e.g. by wrapping diffractive foil around the input element 100 of the imaging device CAM1. The diffractive element G1 of the apparatus 500 may be a substantially cylindrical piece of diffractive foil, which surrounds the input element 100.

Referring to Fig. 3, the diffraction grating G1 may comprise a plurality of longitudinal diffractive features DF1. POR1 may denote a portion of the curved diffraction grating element G1. POR1 may denote a portion of a cylindrical diffraction grating element G1. The diffractive features DF1 may be diffractive lines. The diffractive features DF1 may be e.g. longitudinal grooves and/or ridges. The symbol dG denotes the grating constant of the diffraction grating G1. The grating constant dG may mean the distance between centers of adjacent similar diffractive linear features DF1. The line density NG of the grating G1 may be equal to the inverse (1/dG) of the grating constant dG. The line density NG means the number of substantially similar diffractive features per unit length. The line density NG (= 1/dG) may indicate e.g. the number of diffractive grooves DF1 per length of 1 mm. The line density NG of the grating G1 may be e.g. in the range of 50/mm to 1200/mm. The line density NG of the grating G1 may be e.g. in the range of 50 lines per mm to 1200 lines per mm.

The diffraction grating G1 may form one or more diffracted light rays LR2a1, LR2a0, LR2a-1 by diffracting light of an input ray LR1a received from an object point P0. The light ray LR2a1 may be diffracted e.g. in the diffraction order m=1. The light ray LR2a0 may be diffracted e.g. in the diffraction order m=0. The light ray LR2a-1 may be diffracted e.g. in the diffraction order m=-1.

The angle θm=1 may denote the angle between the direction of the light ray LR2a1 and the surface normal N1 of the grating G1. The angle θm=0 may denote the angle between the direction of the light ray LR2a0 and the surface normal N1. The angle θm=-1 may denote the angle between the direction of the light ray LR2a-1 and the surface normal N1. The angle θin may denote the angle between the direction of the input ray LR1a and the surface normal N1.

The diffracted light ray LR2a0 may be substantially parallel with the input ray LR1a. The directions of the diffracted light rays may be determined e.g. according to the following diffraction grating equation:

dG·(sin θin − sin θm) = m·λ    (1)

where m denotes the diffraction order and λ denotes the wavelength of the light.

For example, θm=1 ≈ 40° in a situation where the wavelength λ = 650 nm and the grating constant dG = 1/1000 mm. The line density NG of the grating G1 may then be equal to 1000/mm.
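The grating equation above can be checked numerically for the quoted example values. The sketch below assumes normal incidence (θin = 0), which the text does not state explicitly; the function name is illustrative:

```python
import math

def diffraction_angle_deg(wavelength_m, d_m, m, theta_in_deg=0.0):
    """Diffraction angle theta_m (degrees) from the grating equation
    d*(sin(theta_in) - sin(theta_m)) = m*lambda.
    Returns None when the order m is evanescent (|sin(theta_m)| > 1)."""
    s = math.sin(math.radians(theta_in_deg)) - m * wavelength_m / d_m
    if abs(s) > 1.0:
        return None  # this diffraction order does not propagate
    return math.degrees(math.asin(s))

# lambda = 650 nm, line density 1000/mm -> grating constant dG = 1 micrometre
theta_1 = diffraction_angle_deg(650e-9, 1e-6, m=1)  # magnitude close to 40 degrees
theta_0 = diffraction_angle_deg(650e-9, 1e-6, m=0)  # zero order: undeflected
theta_2 = diffraction_angle_deg(650e-9, 1e-6, m=2)  # evanescent for these values
```

With these values |sin θm=1| = λ/dG = 0.65, giving |θm=1| ≈ 40.5°, consistent with the roughly 40° quoted in the text; the second order does not propagate because 2·λ/dG exceeds 1.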

The imaging apparatus 500 may optionally comprise an optical filter FIL1 to limit the spectral bandwidth of light impinging on the image sensor DET1 (Fig. 6a). The optical filter FIL1 may have a fixed bandwidth. The bandwidth of the optical filter FIL1 may be e.g. narrower than or equal to 20 nm, advantageously narrower than or equal to 10 nm. The bandwidth of the optical filter FIL1 may be e.g. in the range of 1 nm to 20 nm. The bandwidth of the optical filter FIL1 may be e.g. in the range of 1 nm to 10 nm. The spectral position of the center of the passband of the filter FIL1 may be fixed.

The optical filter FIL1 may also have an adjustable bandwidth. The spectral position of the center of the passband of the filter FIL1 may also be adjustable. Alternatively, or in addition, the object OBJ1 may be illuminated with illuminating light, which has a narrow bandwidth. The bandwidth of the optical filter FIL1 and/or the illuminating light may be e.g. narrower than or equal to 20 nm, advantageously narrower than or equal to 10 nm. The object OBJ1 may be illuminated e.g. with laser light which has a bandwidth less than 1 nm.

Referring to Fig. 4, the apparatus 500 may comprise a substantially cylindrical diffractive element G1, which may be substantially concentric with the symmetry axis AX1. For example, a diffraction grating film G1 may be e.g. wrapped around the symmetry axis AX1 to form a substantially cylindrical diffractive element. The longitudinal diffractive features DF1 of the diffractive element G1 may be substantially parallel with the symmetry axis AX1.

The apparatus 500 may receive light LB1 from the object point P0 located in the viewing region ZONE1. The light LB1 may comprise light rays LR1a, LR1b, LR1c. The grating G1 may form diffracted light rays LR2a1, LR2a0, LR2a-1 by diffracting light of the light ray LR1a, corresponding to the diffraction orders m=1, m=0, and m=-1, respectively. The grating G1 may form diffracted light rays LR2b1, LR2b0, LR2b-1 by diffracting light of the light ray LR1b. The grating G1 may form diffracted light rays LR2c1, LR2c0, LR2c-1 by diffracting light of the light ray LR1c.

The diffracted light rays LR2a-1, LR2b0, LR2c1 may propagate towards the symmetry axis AX1. The angle φ1 may denote the (azimuthal) angle between the light rays LR2a-1 and LR2b0. The angle φ2 may denote the (azimuthal) angle between the light rays LR2b0 and LR2c1. The angles φ1, φ2 between the directions of the diffracted light rays LR2a-1, LR2b0, LR2c1 may depend on the distance L1 between the object point P0 and the symmetry axis AX1.

Referring to Fig. 5a, the stereo imaging apparatus 500 may simultaneously form several partial images SUB1a, SUB1b, SUB1c of the same object OBJ1 on the image sensor DET1. The angular position of the partial images SUB1a and SUB1c in the image IMG1 may depend on the distance L1 of the object OBJ1. The angular distance φ1, φ2 between the image points P1a, P1b, P1c (when viewed from the symmetry axis AX1) may depend on the distance L1. The distance L1 may be determined e.g. from the angle φ1, from the angle φ2 and/or from the sum φ1 + φ2.

The omnidirectional imaging device CAM1 may substantially maintain the azimuthal direction of each light ray which propagates towards the axis AX1 and subsequently through the input element 100 of the omnidirectional imaging device CAM1 to the image sensor DET1. rG1 may denote the distance between the grating element G1 and the symmetry axis AX1. The inner diameter of the substantially cylindrical grating element G1 may be equal to 2·rG1. The grating element G1 may be in contact with the input element 100, or the inner diameter of the grating element G1 may be greater than the outer diameter of the input element 100. Selecting a large distance rG1 may facilitate measuring large distances L1. Selecting a small distance rG1 may minimize the size of the apparatus 500.

Fig. 5b shows a second situation where the distance L1 between an object OBJ1 and the axis AX1 is smaller than the distance L1 of Fig. 5a. The angular values φ1 and φ2 of Fig. 5b are smaller than the angular values φ1 and φ2 of Fig. 5a. The angle φ1 may increase with increasing distance L1.

Referring to Fig. 5c, the azimuthal direction of a light ray may be defined with respect to a reference direction (e.g. -SX or +SX) and/or with respect to the azimuthal direction of a reference light ray (e.g. LR2b0).

The grating element G1 may receive light rays LR1a, LR1b from an object point P0. The light ray LR1a may impinge on the grating element G1 at a point PG1a. The light ray LR1b may impinge on the grating element G1 at a different point PG1b. The grating G1 may form light rays LR2a1, LR2a0, LR2a-1 by diffracting light of the light ray LR1a, corresponding to the diffraction orders m=1, m=0, and m=-1. The grating G1 may form light rays LR2b1, LR2b0, LR2b-1 by diffracting light of the light ray LR1b. The first light ray LR2a-1 and the second light ray LR2b0 may propagate towards the axis AX1.

REF1 may denote a plane, which is perpendicular to the axis AX1 . The plane REF1 may be defined by the directions SX and SY. The axis AX1 may be parallel with the direction SZ.

The first diffracted light ray LR2a-1 may have a projection LR2a'-1 on the plane REF1. The second light ray LR2b0 may have a projection LR2b'0 on the plane REF1. The azimuth angle φ3 of the ray LR2a-1 may denote the angle between the projection LR2a'-1 of the first diffracted light ray LR2a-1 and a reference direction (-SX). The azimuth angle of the ray LR2b0 may denote the angle between the projection LR2b'0 of the second light ray LR2b0 and the reference direction (-SX).

The azimuthal direction of the ray LR2a-1 may mean the direction of the projection LR2a'-1 of the ray LR2a-1 on the plane REF1. The azimuthal direction of the ray LR2b0 may mean the direction of the projection LR2b'0 of the ray LR2b0 on the plane REF1. The azimuthal direction of the ray LR2c1 (see Fig. 5b) may mean the direction of the projection of the ray LR2c1 on the plane REF1.

The angle φ3 may specify the azimuthal direction of the ray LR2a-1 with respect to the reference direction (-SX). A corresponding angle may specify the azimuthal direction of the ray LR2b0 with respect to the reference direction (-SX). The angle φ1 may specify the azimuthal direction of the ray LR2a-1 with respect to the projection LR2b'0. The angle φ2 may specify the azimuthal direction of the ray LR2c1 with respect to the projection LR2b'0.
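
The projection-based definition of the azimuth angle can be sketched numerically. The helper below is hypothetical (the function name and the sign convention, counter-clockwise positive about +SZ, are assumptions, not taken from the document): it drops the SZ component of a ray direction to obtain its projection on the plane REF1 and measures the signed angle against the reference direction -SX.

```python
import math

def azimuth_deg(ray, ref=(-1.0, 0.0)):
    """Azimuth angle (deg) of a 3D ray direction (sx, sy, sz).

    The ray is projected onto the plane REF1 by discarding its SZ
    component; the signed angle between the projection and the
    reference direction (default -SX) is then returned.
    """
    px, py = ray[0], ray[1]               # projection onto REF1
    cross = ref[0] * py - ref[1] * px     # proportional to sin of signed angle
    dot = ref[0] * px + ref[1] * py       # proportional to cos of signed angle
    return math.degrees(math.atan2(cross, dot))
```

A ray pointing along -SX (with any SZ component) has azimuth 0°, reproducing the idea that the zenith component does not affect the azimuthal direction.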

The light ray LR1a may have a projection LR1a' on the plane REF1. The light ray LR1a may have a zenith angle θ1a with respect to the direction (-SZ). The light ray LR1b may have a projection LR1b' on the plane REF1. The light ray LR1b may have a zenith angle θ1b with respect to the direction (-SZ).

Fig. 6a shows, by way of example, propagation of light through the apparatus 500.

The input element 100 may be e.g. a catadioptric element, which comprises a refractive input surface SRF1 , a first reflective surface SRF2, a second reflective surface SRF3, and a refractive output surface SRF4. The grating element G1 may provide diffracted light LB2 by diffracting light LB1 received e.g. from an object point P0.

The input surface SRF1 may provide refracted light LB3 by refracting the diffracted light LB2. The input surface SRF1 may refract the light LB3 towards the first reflective surface SRF2. The first reflective surface SRF2 may reflect light LB4 towards the second reflective surface SRF3. The second reflective surface SRF3 may reflect light LB5 towards the output surface SRF4. The output surface SRF4 may provide output light LB10 to the focusing unit 300 through an aperture stop AS1. The focusing unit 300 may focus light received from the input element 100 to the image sensor DET1. The focusing unit 300 may provide focused light LB12 to the image sensor DET1. The focusing unit 300 may form an image point P1 on the image sensor DET1 by focusing the light received from the input element 100.

The aperture stop AS1 of the apparatus 500 may limit the effective aperture of the omnidirectional imaging device CAM1 such that device CAM1 may form a substantially sharp partial image SUB1 a by focusing light diffracted by the curved diffractive element G1.

The f-number of the omnidirectional imaging device CAM1 may be e.g. substantially equal to 1.68, the vertical field of view of the viewing region ZONE1 may be e.g. from -15° to +15°, the outer diameter of the annular image IMG1 may be e.g. substantially equal to 8 mm, and the inner diameter of the annular image IMG1 may be e.g. substantially equal to 4 mm.

The diameter of the input element 100 may be e.g. 24 mm, and the height of the omnidirectional imaging device CAM1 may be e.g. 24 mm in the direction of the symmetry axis AX1. The diffractive features DF1 may be e.g. linear grooves. The line density NG of the cylindrical grating element G1 may be e.g. 1000 lines/mm. The line density NG of the cylindrical grating element G1 may be e.g. in the range of 50 lines/mm to 1200 lines/mm.
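
For a given wavelength and line density, only a few diffraction orders can leave the grating as propagating waves, namely those with |m|·λ·NG ≤ 1. The sketch below illustrates this for normal incidence with the example figures above (550 nm is an assumed illustrative wavelength, not a value from the document).

```python
import math

def propagating_orders(wavelength, line_density, max_order=3):
    """Diffraction orders m with |m| * wavelength * line_density <= 1,
    i.e. orders that can leave the grating as propagating waves, together
    with their diffraction angles (deg) for normal incidence."""
    orders = {}
    for m in range(-max_order, max_order + 1):
        s = m * wavelength * line_density
        if abs(s) <= 1.0:
            orders[m] = math.degrees(math.asin(s))
    return orders

# 550 nm light on a 1000 lines/mm grating: only m = -1, 0, +1 propagate.
angles = propagating_orders(550e-9, 1e6)
```

With these numbers the first orders leave at about ±33°, while |m| ≥ 2 would require |sin θ| > 1 and is evanescent, consistent with the apparatus using only the orders m = -1, 0, +1.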

Referring to Fig. 6b, a light ray LR2a-1 diffracted in the diffraction order m=-1 towards the axis AX1 may contribute to forming a first image point P1a on the image sensor DET1. A light ray LR2b0 diffracted in the diffraction order m=0 towards the axis AX1 may contribute to forming a second image point P1b. A light ray LR2c1 diffracted in the diffraction order m=1 towards the axis AX1 may contribute to forming a third image point P1c. The image points P1a, P1b, P1c may be images of the same object point P0 located in the viewing region ZONE1. The distance L1 of the point P0 may be determined e.g. from the angular position of the image point P1a with respect to the points P1b, P1c.

Referring to Fig. 7, the image points P1a, P1b, P1c may be formed at a radial distance rP1a from the center of the annular image IMG1. The apparatus 500 may form the image points P1a, P1b, P1c of the object point P0 substantially at the same radial distance rP1a from the axis AX1. rP1a may denote a distance between the image point P1a and the center of the annular image IMG1. rP1b may denote a distance between the image point P1b and the center of the annular image IMG1. rP1c may denote a distance between the image point P1c and the center of the annular image IMG1. The radial distance rP1b may be substantially equal to the radial distance rP1a. The radial distance rP1c may be substantially equal to the radial distance rP1a.

The apparatus 500 may form three or more image points P1a, P1b, P1c of the object point P0 such that the image points P1a, P1b, P1c are not on the same line.

The device CAM1, 500 may capture the annular image IMG1. The image sensor DET1 may convert the optical annular image IMG1 into a digital image DIMG.

The annular image IMG1 may have an inner boundary IB1 and an outer boundary OB2. The inner boundary IB1 may be e.g. the image of the first conical boundary ZB1 of the viewing region VIEW1. The outer boundary OB2 may be e.g. the image of the second conical boundary ZB2 of the viewing region VIEW1.

Alternatively, the inner boundary IB1 may be the image of the second conical boundary ZB2 and the outer boundary OB2 may be the image of the first conical boundary ZB1.

The center of the annular image IMG1 may coincide with the symmetry axis AX1. The annular image IMG1 may have an inner radius rMIN and an outer radius rMAX. The diameter dMAX may be substantially equal to 2·rMAX. The image information representing the omnidirectional viewing region ZONE1 may be formed in the annular region between the inner radius rMIN and the outer radius rMAX.

The ratio of the inner radius rMIN to the outer radius rMAX may be e.g. in the range of 0.3 to 0.7.
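
For downstream processing, the captured annular image is often unwrapped into a rectangular panoramic strip. The following is a minimal sketch of such an unwrapping, not an implementation from the document: the function name is hypothetical, nearest-neighbour sampling is chosen for brevity, and the image is a plain list-of-lists grayscale raster.

```python
import math

def unwrap_annular(img, cx, cy, r_min, r_max, width, height):
    """Nearest-neighbour polar-to-rectangular unwrapping of an annular image.

    Rows run from r_max (top) down to r_min (bottom); columns sweep the
    full 0..360 degree azimuth range, so each column corresponds to one
    azimuthal direction of the omnidirectional viewing region."""
    out = [[0] * width for _ in range(height)]
    for row in range(height):
        # radius decreases linearly from r_max (row 0) to r_min (last row)
        r = r_min + (r_max - r_min) * (height - 1 - row) / max(height - 1, 1)
        for col in range(width):
            phi = 2.0 * math.pi * col / width
            x = int(round(cx + r * math.cos(phi)))
            y = int(round(cy + r * math.sin(phi)))
            if 0 <= y < len(img) and 0 <= x < len(img[0]):
                out[row][col] = img[y][x]
    return out
```

In practice a bilinear interpolation (or a library routine such as OpenCV's warpPolar) would give smoother results; the sketch only shows the coordinate mapping.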

The annular image IMG1 may surround a central region CREG1. The apparatus 500 may operate such that light gathered from the omnidirectional viewing region ZONE1 is not focused to the central region CREG1.

The method may comprise using an image recognition algorithm to recognize the partial images SUB1a, SUB1b, SUB1c of an object OBJ1 in the annular image IMG1. The apparatus 500 may be arranged to operate such that the angular distance φ1 between the image points P1a and P1b is equal to the angular distance φ2 between the image points P1b, P1c. The method may comprise using information about the equality of the angles (φ1=φ2) for verifying and/or facilitating image recognition. The method may comprise determining whether the annular image comprises three substantially identical partial images, which are separated by the same angular distance φ1 (=φ2).
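
The equal-spacing check can be expressed compactly. The helper below is a hypothetical sketch (its name, the degree units and the tolerance value are assumptions): given the azimuthal positions of three candidate image points, it verifies that the two angular gaps agree within a tolerance, which is the verification criterion φ1 = φ2 described above.

```python
def consistent_triplet(a1, a2, a3, tol_deg=1.0):
    """Check that three azimuthal image positions (deg) are equally spaced,
    i.e. phi1 == phi2 within a tolerance, as expected for the partial
    images SUB1a, SUB1b, SUB1c of one object point."""
    phi1 = (a2 - a1) % 360.0
    phi2 = (a3 - a2) % 360.0
    diff = abs(phi1 - phi2)
    # compare on the circle so triplets straddling 0/360 deg also pass
    return min(diff, 360.0 - diff) <= tol_deg
```

A recognition pipeline could use this as a cheap filter: only triplets of similar-looking partial images that also satisfy the equal-angle condition are accepted as images of a single object point.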

The input element 100 and the focusing unit 300 may be arranged to form the annular optical image IMG1 on the image sensor DET1.

Referring to Fig. 8, the imaging device CAM1 may comprise an aperture stop AS1 to define the width weff of an effective optical aperture EPU1 of the imaging device CAM1. The imaging device CAM1 may comprise an aperture stop AS1 to define the height heff of the effective optical aperture EPU1. The aperture EPU1 may also be called e.g. an entrance pupil. The aperture stop AS1 may define the entrance pupil EPU1 of the imaging device CAM1 by preventing propagation of marginal rays.

The aperture stop AS1 may improve the sharpness of the image IMG1 by preventing propagation of marginal rays, which could cause blurring of the optical image IMG1. The aperture stop AS1 may be arranged to prevent propagation of rays, which may cause blurring of the optical image IMG1. The aperture stop AS1 may be arranged to define the dimensions of the entrance pupil EPU1. The entrance pupil EPU1 may have a width weff and a height heff.

For example, the aperture stop AS1 may define the entrance pupil EPU1 of the imaging device CAM1 such that the effective F-number of the imaging device CAM1 is in the range of 1.5 to 10.

The aperture stop AS1 may improve the sharpness of the image IMG1 by limiting optical aberrations caused by the curved diffraction grating G1. The width weff of the effective optical aperture of the imaging device CAM1 may be e.g. smaller than 10% of the radius of curvature rG1 of the grating element G1, so as to limit optical aberrations caused by the curved diffraction grating G1. The radius of curvature rG1 of the grating element G1 may be e.g. greater than 10 times the width weff of the effective optical aperture of the imaging device CAM1, so as to limit optical aberrations caused by the curved diffraction grating G1.

Referring to Fig. 9, the omnidirectional viewing region ZONE1 may completely surround the input element 100. The viewing region ZONE1 may have an upper conical boundary ZB1 and a lower conical boundary ZB2. α1 may denote the angle between the upper boundary ZB1 and a horizontal plane REF1. α2 may denote the angle between the lower boundary ZB2 and the horizontal plane REF1. The angle αMAX may be e.g. in the range of 10° to 20°. The angle αMIN may be e.g. in the range of -20° to +5°. The difference αMAX − αMIN may be called e.g. the vertical field of view. The horizontal field of view of the imaging device CAM1 may be e.g. substantially equal to 360°. The horizontal field of view of the apparatus 500 may be e.g. substantially equal to 360°.

Referring to Fig. 10, the apparatus 500 may be installed e.g. to a vehicle 1000. The vehicle 1000 may be e.g. an automobile, a bus, a train, or a tram. The apparatus 500 may be installed e.g. to a ship (i.e. a boat). The apparatus may be installed to a moving and/or stationary body 1000. An object OBJ1 may be located in the viewing region of the apparatus 500. The apparatus 500 may be arranged to detect the position of one or more objects OBJ1 with respect to the input element 100 of the apparatus 500. The apparatus 500 may be arranged to provide position information POS1. The position information POS1 may comprise information about the position of the object OBJ1 with respect to the apparatus 500.

For example, a corner of an object OBJ1 may be used as an object point P0. For example, one or more corners, edges and/or surfaces of the object OBJ1 may be used as detectable features, which may be detected by an image recognition algorithm, so as to determine a distance L1. The annular image may comprise partial images SUB1a, SUB1b, SUB1c of a detectable feature of the object. The apparatus 500 may determine the distance L1 from the angular separation φ1, φ2 of the partial images SUB1a, SUB1b, SUB1c.
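
Under the same idealized thin-grating geometry sketched earlier (a first-order ray that leaves the cylindrical grating along the local normal, hypothetical values for the grating radius, line density and wavelength), the measured azimuthal separation can be inverted in closed form to recover the distance L1:

```python
import math

def distance_from_angle(phi, r_g1, wavelength, line_density):
    """Invert phi = theta_i - asin((r_g1 / L1) * sin(theta_i)) for L1,
    where theta_i = asin(wavelength * line_density) is the incidence
    angle that sends the first-order ray exactly towards the axis.

    Solving the sine rule for the triangle (axis, grating point, object
    point) gives L1 = r_g1 * sin(theta_i) / sin(theta_i - phi)."""
    theta_i = math.asin(wavelength * line_density)
    return r_g1 * math.sin(theta_i) / math.sin(theta_i - phi)
```

This is only a sketch of the triangulation principle; a practical implementation would also calibrate for the finite grating thickness, the spectral bandwidth, and the mapping from sensor pixels to azimuth angles.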

The apparatus 500 may operate in an environment which comprises one or more objects OBJ1.

The position information POS1 may be used e.g. for avoiding collision of the vehicle 1000 with the object OBJ1. The position information POS1 may be used e.g. for optimizing a route of the vehicle 1000 with respect to the object OBJ1. The position information POS1 may be used e.g. for tracking the position of the vehicle 1000. The position information POS1 may be used e.g. for predicting the position of the vehicle 1000.

The method may comprise controlling the velocity and/or direction of movement of the vehicle 1000 based on the measured position of the object OBJ1.

The positions of one or more objects OBJ1 may be measured by using the apparatus 500. The distance between an object OBJ1 and the apparatus 500 may be measured by using the apparatus 500. The distance L1 between the object OBJ1 and the apparatus 500 may be monitored by using the apparatus 500. The apparatus 500 may be arranged to measure the velocity of the object OBJ1 with respect to the apparatus 500. The apparatus 500 may be arranged to measure the velocity of the apparatus 500 with respect to the object OBJ1. The apparatus 500 may be arranged to detect a change of distance between the object OBJ1 and the apparatus 500. An object or obstacle OBJ1 may comprise a surface portion R1a and/or R1b. The apparatus 500 may be attached to a vehicle 1000. The vehicle may be moving at a non-zero velocity with respect to an obstacle OBJ1. A vehicle 1000 may comprise the apparatus 500. The position of the vehicle 1000 may be monitored by using the apparatus 500. The position of the vehicle 1000 with respect to one or more obstacles may be monitored by using the apparatus 500. The velocity of the vehicle 1000 may be monitored by using the apparatus 500. A collision between the vehicle 1000 and an obstacle OBJ1 may be avoided by using position information provided by the apparatus 500. A route for the vehicle 1000 may be selected based on information about the positions of the obstacles. The vehicle may be e.g. a ground vehicle, an airborne vehicle, or a boat. The vehicle may be e.g. a car, a bus, a train, a motorcycle, a helicopter, or a flying device.
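
Monitoring the distance L1 over time directly yields a relative velocity estimate. The sketch below is a hypothetical illustration (the function names, the (time, distance) sample format, and the naive constant-velocity extrapolation are assumptions): it differentiates consecutive distance measurements and derives a simple time-to-contact figure for collision avoidance.

```python
def radial_velocity(samples):
    """Rate of change of the measured distance L1 (m/s), estimated from
    the two most recent (time, distance) samples; a negative value means
    the object OBJ1 and the apparatus 500 are approaching each other."""
    (t0, d0), (t1, d1) = samples[-2], samples[-1]
    return (d1 - d0) / (t1 - t0)

def time_to_contact(samples):
    """Naive time (s) until L1 would reach zero at the current closing
    rate, or None if the distance is not decreasing."""
    v = radial_velocity(samples)
    if v >= 0.0:
        return None
    return -samples[-1][1] / v
```

A real system would smooth the distance track (e.g. with a Kalman filter) before differentiating, since the finite-difference estimate amplifies measurement noise.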

Referring to Fig. 11, a position monitoring apparatus 500 may comprise the grating G1 and the imaging device CAM1. The imaging device CAM1 may comprise the image sensor DET1 for capturing the annular image IMG1. The apparatus 500 may comprise a data processing unit CNT1 for performing data processing operations. The processing unit CNT1 may be configured to determine a distance L1 by analyzing the captured image IMG1.

The data processing unit CNT1 may comprise one or more data processors. The unit CNT1 may be configured to process image data. The memory MEM3 may comprise the computer program PROG1. The computer program code PROG1 may be configured to, when executed on at least one processor CNT1, cause the apparatus 500 to capture the annular image IMG1. The computer program code PROG1 may be configured to, when executed on at least one processor CNT1, cause the apparatus 500 to determine a distance from the captured image IMG1.

The image sensor DET1 may convert the annular optical image IMG1 into a digital image DIMG1. The apparatus 500 may comprise a memory MEM1 for storing the digital image DIMG1.

The apparatus 500 may comprise a memory MEM2 for storing determined position data POS1. The position data POS1 may comprise e.g. the coordinates of one or more objects. The apparatus 500 may provide position information POS1.

The apparatus 500 may comprise a memory MEM3 for storing the computer program PROG1. The computer program PROG1 may comprise computer program code configured to, when executed on at least one processor CNT1, cause the apparatus 500 to measure the positions of the objects OBJ1.

The apparatus 500 may comprise a communication unit RXTX1 to send measured position data POS1. The apparatus 500 may send the position data POS1 e.g. to a control unit of a traffic control system. The apparatus 500 may send the position data POS1 e.g. to a surveillance system. The apparatus 500 may send the position data POS1 e.g. to a control system of a vehicle 1000. COM1 denotes a communication signal. The communication unit RXTX1 may be arranged to send the data e.g. by wireless communication, by an electrical cable, or by an optical cable. The communication unit RXTX1 may be arranged to send the data to the Internet and/or to a mobile communications network.

The apparatus 500 may optionally comprise e.g. a user interface UIF1. The user interface UIF1 may comprise e.g. a display for displaying information to a user. The user interface UIF1 may be arranged to display e.g. distance information L1.

The apparatus 500 may be arranged to provide information about the presence of objects e.g. for controlling lighting. The apparatus 500 may be arranged to provide information about the movements of objects e.g. for controlling lighting. The apparatus 500 may be arranged to provide information about the presence of objects e.g. for stopping operation of an industrial robot. The apparatus 500 may be arranged to provide information for a surveillance and/or security system. The apparatus 500 may be arranged to provide information about the presence of objects e.g. for initiating an alarm. The apparatus 500 may be arranged to provide information about the movements of objects e.g. for initiating an alarm.

Referring to Fig. 12, the apparatus 500 may comprise an omnidirectional imaging device CAM1 and a substantially cylindrical diffraction grating G1. The imaging device CAM1 may comprise an input element 100, a focusing unit 300, and an image sensor DET1. The input element 100 may comprise e.g. a curved reflective surface SRF5. The input element 100 may comprise a single curved reflector SRF5. The reflector SRF5 may be e.g. a paraboloid, hyperboloid, conical, spherical, or ellipsoidal reflector. A single paraboloid, hyperboloid, conical, spherical, or ellipsoidal reflector may be used as the input element 100.

The grating G1 may provide diffracted light LB2 by diffracting light LB1 received from an object OBJ1. The surface SRF5 may reflect diffracted light towards the focusing optics 300. The focusing optics 300 may form the annular image IMG1 on the image sensor DET1 by focusing the light reflected from the surface SRF5. The distance L1 to the object OBJ1 may be determined from the angular distance between the partial images of the image IMG1.

Referring to Fig. 13, the apparatus 500 may comprise an omnidirectional imaging device CAM1 and a substantially cylindrical diffraction grating G1. The imaging device CAM1 may comprise an input element 100, an aperture stop AS1, a focusing unit 300, and an image sensor DET1. The imaging device CAM1 may optionally comprise a wavefront modifying unit 200.

The distance L1 to the object OBJ1 may be determined from the angular distance between the partial images of the image IMG1.

The apparatus 500 may be arranged to form the annular optical image IMG1 on the image sensor DET1, by diffracting and focusing light of several light beams LB11, LB12. For example, a first light beam LB11 may be received from a first object OBJ1, and a second light beam LB12 may be received from a second object. The first light beam LB11 may propagate in a first direction DIR1. The second light beam LB12 may propagate in a second different direction DIR2. The apparatus 500 may be arranged to form a first image point P11 on the image sensor DET1 by diffracting and focusing light of the first light beam LB11. The apparatus 500 may be arranged to form a second image point P12 on the image sensor DET1 by diffracting and focusing light of the second light beam LB12. The input element 100, the optical elements of the (optional) modifying unit 200, the aperture stop AS1, and the optical elements of the focusing unit 300 may be substantially axially symmetric with respect to the axis AX1.

The input element 100 may be substantially axially symmetric about the axis AX1. The optical components of the imaging apparatus 500 may be substantially axially symmetric about the axis AX1. The axis AX1 may be called e.g. the symmetry axis.

The input surface SRF1 of the input element 100 may provide a first refracted light beam LB3 by refracting light of a diffracted light beam LB2. The first reflective surface SRF2 may provide a first reflected beam LB4 by reflecting light of the first refracted beam LB3. The second reflective surface SRF3 may provide a second reflected beam LB5 by reflecting light of the first reflected beam LB4. The output surface SRF4 may provide an output beam LB10 by refracting light of the second reflected beam LB5.

The input element 100 may be arranged to operate e.g. such that the second reflected beam LB5 formed by the second reflective surface SRF3 does not intersect the first refracted light beam LB3 formed by the input surface SRF1.

The input element 100 may comprise or consist of substantially transparent material, e.g. glass or plastic. The light may propagate from the input surface SRF1 to the output surface SRF4 in a substantially homogeneous material without propagating in a gas. The reflective surfaces SRF2, SRF3 of the input element 100 may be arranged to reflect light by total internal reflection (TIR).
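
Total internal reflection at the surfaces SRF2, SRF3 requires the internal angle of incidence to exceed the critical angle of the element material. A short sketch (the refractive index 1.5 is an assumed typical value for glass or plastic, not a figure from the document):

```python
import math

def critical_angle_deg(n_material, n_surround=1.0):
    """Critical angle for total internal reflection at the interface
    between the element material and its surroundings (air by default).
    Rays hitting the surface beyond this angle are reflected without
    any mirror coating."""
    return math.degrees(math.asin(n_surround / n_material))
```

For n ≈ 1.5 the critical angle is roughly 42°, so the reflective surfaces only need to be oriented such that all imaging rays arrive at steeper internal angles than this, which is a reason the monolithic catadioptric element can work without metallized coatings.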

The focusing unit 300 may comprise one or more lenses 301, 302, 303, 304.

The aperture stop AS1 may be positioned between the input element 100 and the focusing unit 300. The center of the aperture stop AS1 may substantially coincide with the axis AX1. The aperture stop AS1 may be substantially circular. The aperture stop AS1 may limit transverse dimensions of light beams which propagate from the input element 100 to the focusing unit 300. The aperture stop AS1 may limit transverse dimensions of light beams which pass through the aperture stop AS1. The aperture stop AS1 may define the entrance pupil EPU1 of the imaging device CAM1 (Fig. 8).

The imaging device CAM1 may optionally comprise a wavefront modifying unit 200 to modify the wavefront of light beams, which are coupled out of the input element 100. The wavefront modifying unit 200 may form an intermediate beam by modifying the wavefront of an output beam which is coupled out of the element 100 through the output surface SRF4. The wavefront modifying unit 200 may form e.g. a substantially collimated light beam. The aperture stop AS1 may be arranged to limit the transverse dimensions of the intermediate beam. The light of the intermediate beam may be focused on the image sensor DET1 by the focusing unit 300. The focusing unit 300 may be arranged to form a focused beam by focusing light of the intermediate beam. The input element 100 may also be arranged to operate such that the wavefront modifying unit 200 is not needed. The output beam of the input element 100 may be directly coupled to the focusing unit 300.

For the person skilled in the art, it will be clear that modifications and variations of the devices and the methods according to the present invention are perceivable. The figures are schematic. The particular embodiments described above with reference to the accompanying drawings are illustrative only and not meant to limit the scope of the invention, which is defined by the appended claims.