

Title:
AN OPTICAL APPARATUS AND METHOD
Document Type and Number:
WIPO Patent Application WO/2019/170920
Kind Code:
A1
Abstract:
An apparatus for projecting structured light, said structured light comprising at least one pattern to be projected where the intensity varies smoothly, the apparatus comprising a digital micromirror device "DMD", said DMD comprising an array of pixels to produce said structured light, the apparatus further comprising an optical element arranged to soften pixelation in said structured light introduced by the pixelation of the DMD.

Inventors:
BUSET HALVARD (NO)
ULFENG ARILD (NO)
Application Number:
PCT/EP2019/056057
Publication Date:
September 12, 2019
Filing Date:
March 11, 2019
Assignee:
ZIVID LABS AS (NO)
International Classes:
G01B11/25; G02B5/30; G02B27/09; G03B21/00; G03B21/20; H01S5/42; H04N9/31
Domestic Patent References:
WO2017125507A1 2017-07-27
Foreign References:
US20100171927A1 2010-07-08
US20160033758A1 2016-02-04
US8047660B2 2011-11-01
Other References:
DAN DAN ET AL: "DMD-based LED-illumination Super-resolution and optical sectioning microscopy", SCIENTIFIC REPORTS, vol. 3, 23 January 2013 (2013-01-23), XP055277118, DOI: 10.1038/srep01116
O. SKOTHEIM; F. COUWELEERS: "Structured light projection for accurate 3D shape determination", PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON EXPERIMENTAL MECHANICS, 2004
G. SANSONI; S. CORINI; S. LAZZARI; R. RODELLA; F. DOCCHIO: "Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications", APPLIED OPTICS, vol. 36, no. 19, 1997, XP002387175, DOI: 10.1364/AO.36.004463
C. REICH; R. RITTER; J. THESING: "3-D shape measurement of complex objects by combining photogrammetry and fringe projection", OPTICAL ENGINEERING, vol. 39, no. 1, 2000, XP002495170, DOI: 10.1117/1.602356
G. SANSONI; M. CAROCCI; R. RODELLA: "Three-dimensional vision based on a combination of Gray-code and phase-shift light projection: analysis and compensation of the systematic errors", APPLIED OPTICS, vol. 38, 1999, pages 6565 - 6573
O. SKOTHEIM; H. SCHUMANN-OLSEN; M. LACOLLE; K. HAUGHOLT; J. THORSTENSEN; A. KIM; T. BAKKE: "A real-time 3D range image sensor based on a novel tip-tilt-piston micromirror and dual frequency phase shifting", IS&T/SPIE ELECTRONIC IMAGING, 2015
J. SALVI; J. PAGES; J. BATLLE: "Pattern codification strategies in structured light systems", PATTERN RECOGNITION, vol. 37, no. 4, 2004, XP004491495, DOI: 10.1016/j.patcog.2003.10.002
Attorney, Agent or Firm:
GRANLEESE, Rhian (GB)
Claims:
CLAIMS:

1. An apparatus for projecting structured light, said structured light comprising at least one pattern to be projected where the intensity varies smoothly, the apparatus comprising a digital micromirror device“DMD”, said DMD comprising an array of pixels to produce said structured light, the apparatus further comprising an optical element arranged to soften pixelation in said structured light introduced by the pixelation of the DMD.

2. An apparatus according to claim 1, wherein the optical element is an optical low pass filter.

3. An apparatus according to claim 2, wherein the optical filter comprises a birefringent plate.

4. An apparatus according to claim 2, wherein the birefringent plate comprises a natural or synthetic compound.

5. An apparatus according to any preceding claim wherein the optical filter is provided to receive light reflected from the DMD.

6. An apparatus according to claim 1, wherein the optical element is a wavefront modulator for light reflected from the DMD.

7. An apparatus according to claim 5, wherein the optical element comprises a periodic variation in its thickness or refractive index.


8. An apparatus according to either of claims 5 or 6, wherein the optical element comprises an undulating surface.

9. An apparatus according to any of claims 6 to 8, the apparatus further comprising a projection optic, the optical element being provided at an aperture stop of the projection optic.

10. An apparatus for projecting structured light, said structured light comprising at least one pattern to be projected where the intensity varies smoothly, the apparatus comprising a digital micromirror device "DMD", said DMD comprising an array of pixels to produce said structured light, the apparatus further comprising a VCSEL arranged to reflect light off the DMD.

11. An apparatus according to claim 10, further comprising a prism and a projection optic, said prism being arranged to direct light from the VCSEL to the DMD and direct light reflected from the DMD into a projection optic.

12. An apparatus according to claim 11, comprising a second prism placed next to the said prism.

13. An apparatus according to claim 12, wherein the said prism is a 60, 30, 90 degree prism that is used to facilitate the entrance of light into the second prism that is a 45 degree prism.


14. An apparatus according to claim 10, further comprising a diffractive optical element, said diffractive optical element being provided to reshape light from the VCSEL to the shape of the DMD.

15. An apparatus according to any preceding claim, wherein the intensity varies sinusoidally.

16. A system for capturing a 3D image, the system comprising:

a projector adapted to project a pattern towards a scene, the pattern having a temporal variation, the projector comprising an apparatus according to any preceding claim;

an image sensor having a plurality of pixels, said image sensor adapted to capture a plurality of video frames of a scene illuminated by said projector; and

a processor adapted to receive image data from said image sensor,

the processor being further adapted to derive 3D shape information of said scene by calculating depth information by determining how the captured image varies over time with the temporally varying projected pattern,

the system further comprising an output for a 3D video point cloud of the scene derived from said 3D shape information.

17. A method for projecting structured light, said structured light comprising at least one pattern to be projected with a smooth variation in intensity, the method comprising using a digital micromirror device "DMD" to reflect incident light to produce said structured light, said DMD comprising a plurality of pixels, the method further comprising softening the pixelation in said structured light due to the pixelation of the DMD.


18. A method for projecting structured light, said structured light comprising at least one pattern to be projected with a smooth variation in intensity, the method comprising using a digital micromirror device “DMD” to reflect incident light to produce said structured light, said DMD comprising a plurality of pixels, wherein the light reflected off the DMD is provided by a VCSEL.

19. An apparatus for projecting structured light, said structured light comprising at least one pattern to be projected where the intensity varies smoothly, the apparatus comprising a spatial light modulator "SLM", said SLM comprising an array of pixels to produce said structured light, the apparatus further comprising an optical element arranged to soften pixelation in said structured light introduced by the pixelation of the SLM.


Description:
An Optical Apparatus and Method

FIELD

Embodiments of the invention relate to an optical apparatus and method for projecting a pattern.

BACKGROUND

Three-dimensional surface imaging (3D surface imaging) is a fast-growing field of technology. The term "3D surface imaging" as used herein can be understood to refer to the process of generating a 3D representation of the surface(s) of an object by capturing spatial information in all three dimensions - in other words, by capturing depth information in addition to the two-dimensional spatial information present in a conventional image or photograph. This 3D representation can be visually displayed as a "3D image" on a screen, for example.

A number of different techniques can be used to obtain the data required to generate a 3D image of an object's surface. One of these techniques is to use structured light. Here, one or more light patterns are projected onto the scene with a projector (e.g., video projector, slide projector, laser with diffractive optical element etc.) and observed with a camera placed at an angle to the projector, as discussed in more detail in the article by O. Skotheim, F. Couweleers, "Structured light projection for accurate 3D shape determination", Proceedings of the 12th International Conference on Experimental Mechanics, Bari, Italy (2004). Due to the angle between the camera and the projector, the light patterns will appear distorted by the surfaces of the objects in the light path. By analyzing these distortions in software or hardware, the three-dimensional shape of the surfaces can be calculated very accurately and represented, for example, as a point cloud or a polygonal surface mesh. There are several approaches to structured light imaging, depending on the type and number of projected patterns used. The number of projected patterns used is typically related to the system design and affects its complexity and cost.

For instance, if only one static pattern is used, information is extracted from a small neighborhood around each pixel in order to determine 3D depth for the neighborhood. Using neighborhoods of pixels has significant disadvantages if the objective is high-resolution imaging, as it implies that depth is representative for a neighborhood of pixels rather than for each pixel by itself (i.e. the resolution is lower). If more than one projected pattern is available, one can avoid the use of a spatial neighborhood altogether and instead utilize temporal information only, meaning that 3D depth is calculated independently at each pixel, by sampling information specific to one pixel at different instances in time. Such approaches will be referred to as time-multiplexed structured light methods. Examples of two popular methods for time-multiplexed structured light are: Gray code (based on binary patterns, see G. Sansoni, S. Corini, S. Lazzari, R. Rodella and F. Docchio, "Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications", Applied Optics Vol. 36 No. 19 (1997)) and phase stepping (based on sinusoidal patterns, see e.g. C. Reich, R. Ritter, J. Thesing, "3-D shape measurement of complex objects by combining photogrammetry and fringe projection", Optical Engineering Vol. 39 (1), 2000). Various other methods also exist that either combine Gray code and phase stepping, as discussed in G. Sansoni, M. Carocci and R. Rodella, "Three-dimensional vision based on a combination of Gray-code and phase-shift light projection: analysis and compensation of the systematic errors", Applied Optics, 38, 6565-6573, 1999, or use phase stepping with two or more spatial frequencies as discussed in O. Skotheim, H. Schumann-Olsen, M. Lacolle, K. Haugholt, J. Thorstensen, A. Kim and T. Bakke, "A real-time 3D range image sensor based on a novel tip-tilt-piston micromirror and dual frequency phase shifting", IS&T/SPIE Electronic Imaging 2015. See J. Salvi, J. Pages, J. Batlle, "Pattern codification strategies in structured light systems", Pattern Recognition Vol. 37 (4), 2004 for a survey of different structured light methods.

BRIEF DESCRIPTION OF THE FIGURES

Figure 1 (a) is a schematic of an imaging system;

Figure 1 (b) is a picture showing an external view of the system;

Figure 2 shows a sequence of images that may be projected by the system of figure 1(a);

Figure 3 shows a further sequence of images that may be projected by the system of figure 1 (a);

Figure 4 shows an image of a DMD;

Figure 5(a) shows a sinusoidal variation in intensity profile and figure 5(b) shows a stepwise approximation to a sinusoidal variation in intensity profile produced using the DMD of figure 4;

Figure 6(a) shows a projection system in accordance with an embodiment of the invention;

Figure 6(b) shows an optical element which is used in the system of figure 6(a);

Figure 7(a) is a schematic of a projection lens that may be used in accordance with an embodiment of the invention and figure 7(b) shows a wavefront modulator positioned in the projection lens of figure 7(a);

Figure 8 is a simulation of the profile for a wavefront modulator of the type described with reference to figure 7(b);

Figure 9 is a schematic of an array of VCSELs;

Figure 10(a) is a system in accordance with an embodiment of the present invention with VCSELs and a double prism arrangement and figure 10(b) is a plot of the intensity at the DMD;

Figure 11 (a) is a system in accordance with an embodiment of the present invention with VCSELs and a single prism arrangement and figure 11 (b) is a plot of the intensity at the DMD;

Figure 12 is a plot showing the variation achievable by a diffractive optical element; and

Figure 13(a) is a diagram for understanding the configuration of an optical element as a wavefront modulator; Figure 13(b) is a diagram showing an undulating surface; figure 13(c) is a ray diagram with the projection lens represented by its thin lens equivalent and figure 13(d) shows the output of figure 13(c); figure 13(e) is a ray diagram with the projection lens with the optical element and figure 13(f) shows the output of figure 13(e); figure 13(g) is a ray diagram with the projection lens with the optical element; and figure 13(h) is a ray diagram with the projection lens represented by its thin lens equivalent;

Figure 14 (a) is a simulation of the DMD pixels with a sinusoidal brightness distribution and figure 14(b) is the horizontal plot intensity of the brightness distribution of the DMD pixels;

Figure 15(a) is a simulation of the DMD pixels with a sinusoidal brightness distribution and an optical low pass filter to soften the pixelation;

Figure 15(b) is the horizontal plot intensity of the brightness distribution of the DMD pixels with the low pass filter to soften the pixelation; and

Figure 16 is a system in accordance with an embodiment of the invention with the optical low pass filter.

DETAILED DESCRIPTION

In a first aspect, the present invention provides an apparatus for projecting structured light, said structured light comprising at least one pattern to be projected where the intensity varies smoothly, the apparatus comprising a digital micromirror device“DMD”, said DMD comprising an array of pixels to produce said structured light, the apparatus further comprising an optical element arranged to soften pixelation in said structured light introduced by the pixelation of the DMD.

In one embodiment, the optical element is a so-called optical low pass filter, whereby light with a frequency greater than a cut-off frequency is attenuated.

The low pass optical filter can be positioned to receive light reflected from the DMD. For example, the low pass optical filter can be provided between a prism used to direct light from a source to the DMD and the DMD. In a further embodiment, the low pass optical filter is provided in the optical path between the said prism and a projection optic that directs radiation towards an object to be illuminated. The source may be any type of source, for example an LED, a VCSEL etc. In an embodiment, the filter has a working frequency range within the visible spectrum. In an embodiment, the visible spectrum is taken to be 420 nm - 650 nm. Other electromagnetic frequency ranges may include, but are not limited to, infrared, near-infrared and ultraviolet.

An example of a suitable filter comprises a birefringent plate. In an embodiment, the birefringent plate, which splits the incoming light into two components, is provided in the optical path between the prism and the projection optic. The polarisation of light has two perpendicular vibrational components (ordinary and extraordinary). The birefringent plate splits the light to produce two beams propagating in parallel.

The optical element may be a wavefront modulator for light reflected from the DMD. For example, the optical element may comprise a periodic variation in its thickness or refractive index. In some embodiments, the variation does not have to be periodic. The optical element may comprise an undulating surface.

The apparatus may further comprise a projection optic, the optical element being provided at an aperture stop of the projection optic.

In a further aspect, an apparatus for projecting structured light is provided, said structured light comprising at least one pattern to be projected where the intensity varies smoothly, the apparatus comprising a digital micromirror device "DMD", said DMD comprising an array of pixels to produce said structured light, the apparatus further comprising a VCSEL arranged to reflect light off the DMD.

The apparatus may further comprise a prism and a projection optic, said prism being arranged to direct light from the VCSEL to the DMD and direct light reflected from the DMD into a projection optic. Due to the rotationally symmetric and narrow divergence angle of a VCSEL, the optical arrangement can be simplified and in one embodiment, just a single prism is provided between the source and the DMD. A diffractive optical element may be used to reshape light from the VCSEL to the shape of the DMD.

In embodiments of the above, the intensity of the structured light may vary sinusoidally.

In a further embodiment, a system for capturing a 3D image is provided, the system comprising:

a projector adapted to project a pattern towards a scene, the pattern having a temporal variation, the projector comprising an apparatus as described above;

an image sensor having a plurality of pixels, said image sensor adapted to capture a plurality of video frames of a scene illuminated by said projector; and

a processor adapted to receive image data from said image sensor,

the processor being further adapted to derive 3D shape information of said scene by calculating depth information by determining how the captured image varies over time with the temporally varying projected pattern,

the system further comprising an output for a 3D video point cloud of the scene derived from said 3D shape information.

In a further aspect, a method for projecting structured light is provided, said structured light comprising at least one pattern to be projected with a smooth variation in intensity, the method comprising using a digital micromirror device“DMD” to reflect incident light to produce said structured light, said DMD comprising a plurality of pixels, the method further comprising softening the pixelation in said structured light due to the pixelation of the DMD.

In a yet further aspect, a method for projecting structured light is provided, said structured light comprising at least one pattern to be projected with a smooth variation in intensity, the method comprising using a digital micromirror device "DMD" to reflect incident light to produce said structured light, said DMD comprising a plurality of pixels, wherein the light reflected off the DMD is provided by a VCSEL.

In the above embodiment, the temporal pattern can be any of those described in the background section above.

The above has mentioned a DMD. However, any type of spatial light modulator "SLM" could be used. Therefore, in a further aspect, an apparatus for projecting structured light is provided, said structured light comprising at least one pattern to be projected where the intensity varies smoothly, the apparatus comprising a spatial light modulator "SLM", said SLM comprising an array of pixels to produce said structured light, the apparatus further comprising an optical element arranged to soften pixelation in said structured light introduced by the pixelation of the SLM.

Figure 1(a) is a schematic showing an apparatus for capturing a 3D image using structured light. The apparatus comprises a camera 1 and a projection system 3. The camera 1 and the projection system 3 are housed within a box of the type shown in figure 1(b). In the box of figure 1(b), light is emitted from the box via projector output 5. The image is then captured using the camera 1 which is located behind lens 7.

As this application is mainly concerned with the projection side, only the projection side will be described in detail. For further details on the image capture, the inventors refer to published patent application WO2017/125507. The projector 3 is adapted to project structured light images. Many different types of patterns are used for 3D image capture using structured light, for example, phase shifted patterns, Gray codes or combinations thereof.

Figure 2 illustrates four phase-shifted cosine patterns where the intensity varies horizontally as a cosine shifted, from left to right, by 0, 90, 180 and 270 degrees. At the bottom of Figure 2, a horizontal cross section of the grey level is shown, which illustrates the sinusoidal variation in intensity. Each cosine pattern is laterally displaced by introducing a phase shift, Δφn. The intensity distribution in such a pattern can be represented as

In(x, y) = R(x, y) · [A(x, y) + cos(φ(x, y) + Δφn)],   Δφn = 2πn/N, n = 0, ..., N − 1

where A(x,y) corresponds to the background illumination, R(x,y) is the object reflectance in each point and N is the number of patterns.
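
As an illustration only, the following Python sketch shows how such a set of phase-shifted cosine patterns could be generated for display; the image dimensions and fringe period are arbitrary example values and are not taken from this disclosure.

import numpy as np

def phase_shift_patterns(width=1280, height=720, period_px=64, n_patterns=4):
    # Generate N laterally displaced cosine fringe patterns with values in [0, 1].
    x = np.arange(width)
    patterns = []
    for n in range(n_patterns):
        delta = 2 * np.pi * n / n_patterns            # phase shifts of 0, 90, 180, 270 degrees for N = 4
        row = 0.5 + 0.5 * np.cos(2 * np.pi * x / period_px + delta)
        patterns.append(np.tile(row, (height, 1)))    # the pattern is constant in the vertical direction
    return patterns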

The distortion of the light patterns is encoded in the phase of the cosine, φ(x, y). For four patterns shifted by 0, 90, 180 and 270 degrees, this phase can be calculated from the recorded intensities by the following equation:

φ(x, y) = arctan[(I4(x, y) − I2(x, y)) / (I1(x, y) − I3(x, y))]
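
A minimal sketch of this four-step phase recovery is given below; I1 to I4 denote the four images captured under the 0, 90, 180 and 270 degree patterns, and the names are illustrative only.

import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    # Four-step phase recovery; background illumination and reflectance cancel in the ratio.
    i1, i2, i3, i4 = (np.asarray(i, dtype=float) for i in (i1, i2, i3, i4))
    phi = np.arctan2(i4 - i2, i1 - i3)
    return np.mod(phi, 2 * np.pi)                     # wrapped phase in [0, 2*pi)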

A problem with phase stepping as a method for measuring stripe displacement and hence 3D depth is that the absolute phase cannot be recovered, but rather a so-called wrapped phase which will always be in the interval [0, 2π]. This means that every time the phase exceeds 2π it drops to zero, and the 3D image that is obtained will be only piecewise continuous. An alternative structured light method makes use of binary, so-called Gray code patterns as described in the article by G. Sansoni, S. Corini, S. Lazzari, R. Rodella and F. Docchio, "Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications", Applied Optics Vol. 36 No. 19 (1997). The idea is to "tag" each individual stripe with a temporal binary code which gives us the ability to distinguish individual light stripes from each other. The code can be generated by illuminating the scene with a series of binary patterns and by thresholding the sequence of transitions that occur between dark and bright at each location. An example of a Gray code pattern sequence is shown in Figure 3, where two distinct locations in a planar scene are chosen and the transitions that occur here are followed as a sequence of six binary patterns is projected onto it. The upper line indicates the transitions that occur at position A, while the lower line indicates the transitions at position B. If darkness is denoted with 0 and brightness with 1, the transitions that occur at position A can be described as the binary word 001100. Likewise, the binary word for position B becomes 101110. Since the code consists of six binary digits, it is possible to distinguish 2^6 = 64 separate locations in the pattern.

In an embodiment, a Gray code coding scheme is used which is composed in such a way that successive numbers differ by at most one digit in their binary words. This coding scheme ensures that the error is minimized if one of the transitions is erroneously detected; see the articles by G. Sansoni, S. Corini, S. Lazzari, R. Rodella and F. Docchio, "Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications", Applied Optics Vol. 36 No. 19 (1997) and O. Skotheim, F. Couweleers, "Structured light projection for accurate 3D shape determination", Proceedings of the 12th International Conference on Experimental Mechanics, Bari, Italy (2004) for a more thorough explanation.
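
As an illustration, a sketch of decoding a six-bit Gray code word, such as those read off at positions A and B in figure 3, into a stripe index is given below; the thresholding of the captured sequence into 0s and 1s is assumed to have been done already.

def gray_to_index(bits):
    # Convert a Gray-code bit sequence (most significant bit first) to a stripe index.
    # Neighbouring stripes differ in only one bit, so a thresholding error at a stripe
    # boundary shifts the decoded index by at most one stripe.
    value = 0
    for bit in bits:
        value = (value << 1) | (bit ^ (value & 1))
    return value

print(gray_to_index([0, 0, 1, 1, 0, 0]))   # word observed at position A in figure 3
print(gray_to_index([1, 0, 1, 1, 1, 0]))   # word observed at position B in figure 3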

Phase shifting yields only piecewise continuous 3D images, while Gray code patterns yield absolute distance but with relatively poor resolution. When the two methods are combined, such that both a set of phase shifted sinusoidal patterns and a set of Gray code patterns are projected, one can obtain 3D images that are both highly accurate (due to the phase shifting), and do not contain any discontinuities or ambiguities (due to the Gray code).
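
A sketch of how the two results could be combined is given below, under the illustrative assumption that one Gray-code stripe corresponds to exactly one period of the sinusoidal pattern; this assumption is not stated above and other mappings are possible.

import numpy as np

def absolute_phase(wrapped, stripe_index):
    # Remove the 2*pi ambiguity of the wrapped phase using the Gray-code stripe index,
    # assuming one stripe spans one fringe period (illustrative assumption).
    return wrapped + 2 * np.pi * np.asarray(stripe_index, dtype=float)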

Thus, the projector may be adapted to project two different types of temporally varying patterns, wherein one of the types of patterns is adapted to extract data with a high resolution relating to the depth value attributed to a pixel and the other type of pattern is adapted to extract data concerning the exact location of the pixel.

In a further embodiment, the type of pattern adapted to extract data with a high resolution is a pattern that has a continuous variation in its intensity such as a phase shift pattern. Patterns which are adapted to extract data concerning the exact location of the pixel, can be, for example, patterns that assign a digital code to each pixel, for example, binary or Gray code patterns.

In a yet further embodiment, the first type of temporally varying pattern is a Gray code pattern and the second type of temporally varying pattern is a phase shift pattern. However, other types of pattern may be used.

The projector 3 of figure 1 is used to project structured light images of the types described above with reference to figures 2 and 3. Referring back to figure 1(a), the projector 3 comprises a light source 11 which projects light through optical arrangement 13.

Optical arrangement 13 comprises convex lens 15 which collimates the 20° diverging radiation and directs it into fly's eye arrangement 17; the fly's eye shapes the radiation from a round beam into a rectangle with the same shape as the DMD that will be described later. The light output by fly's eye arrangement 17 is then collected by second convex lens 19, which sends the radiation into prism arrangement 21.

In this embodiment, the prism arrangement comprises a first prism 23 which is placed next to a second prism 25. In an embodiment, the first prism is a 60, 30, 90 degree prism that is used to facilitate the entrance of light into the second prism 25 that is a 45 degree prism.

Then, the light enters second prism 25 and exits through first surface 27 to form an evenly lit rectangle that is shaped similarly to the DMD 29. First surface 27 is provided with a DMD (digital micromirror device) 29. The DMD 29 is shown in more detail in figure 4. The DMD 29 comprises an array of micromirrors arranged as an array of pixels. The micro-mirrors can be individually controlled to allow reflection or absorption of certain parts of the incoming beam to allow a pattern to be introduced to the beam. The DMD 29 is controlled in order to provide the structured light patterns described above. The DMD consists of many small mirrors with small, dark openings. To make a greyscale, each mirror switches on and off in a binary time-sequence so that one grey level consists of e.g. an 8-bit sequence to constitute a byte. The human eye/camera then integrates over a sufficient time to record the correct grey level.
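
Purely as an illustration of this binary time-sequencing, the sketch below splits an 8-bit grey level into eight bitplanes whose display times are weighted in powers of two; a camera integrating over the full sequence then records the intended grey level. The example image values are hypothetical.

import numpy as np

def bitplanes(grey_8bit):
    # Split an 8-bit grey image into eight binary mirror states; bitplane k is displayed
    # for a time proportional to 2**k, so the time-averaged intensity gives the grey level.
    planes = [(grey_8bit >> k) & 1 for k in range(8)]
    weights = [2 ** k for k in range(8)]
    return planes, weights

img = np.array([[0, 127, 255]], dtype=np.uint8)       # hypothetical grey levels
planes, weights = bitplanes(img)
print(sum(w * p for w, p in zip(weights, planes)))    # reconstructs [[0, 127, 255]]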

The DMD 29 in turn reflects the radiation at an angle substantially perpendicular to the first surface 27; the radiation proceeds to the hypotenuse 31 of the second prism 25 for total internal reflection and finally exits through the second leg, where it enters the projection lens, which is a telecentric lens 33.

Returning now to the DMD 29, figure 4 shows the structure of a DMD, which can be considered to be an array of mirror elements. However, as shown in figure 5(a), the desired beam shape for producing a structured light pattern has a sinusoidal cross-section. The relatively large pixels of the DMD do not allow such a pattern to be produced exactly. Figure 5(b) shows a closer approximation to the actual output of the beam produced by the DMD. Here, as is shown, the cross-section of the intensity of the beam is a plurality of columns that generally follow a sinusoidal shape.

Figure 6(a) shows a system in accordance with an embodiment of the invention. To avoid unnecessary repetition, like reference numerals will be used to denote like features with that of figure 1(a). However, here, an optical element 41 is provided before DMD 29, such that light passes through the optical element to the DMD and then is reflected by the DMD back once again through optical element 41. In this embodiment, the optical element 41 is an optical low pass filter. The structure of this element is shown in more detail in figure 6(b). Here, the DMD 61 is shown arranged in parallel with a quartz crystal birefringence plate 63. Any birefringent material that is transparent and can withstand the high irradiation can be used for the plate, for example SiO2. In an embodiment, so-called "two-way" and "four-way" variants can be used.

The pixel pitch 65 between the mirror elements of the DMD is shown. The optical axis 67 of the birefringence plate is provided at an angle to its upper surface 69. Birefringent plate 63 splits an incoming optical beam into two components. The polarisation of an incoming beam has two perpendicular vibrational components (ordinary and extraordinary); the birefringent plate 63 splits these components such that the output is two parallel propagating beams. In an embodiment, the width of the birefringence plate 63 can be selected so that the separation of the two beams equals half of the pixel pitch 65. In a further embodiment, the separation width required of the birefringence plate is chosen depending on the level of softening required.

It should be noted that, due to the wide beam impinging on the birefringence plate, the output of the birefringence plate is not two distinct beams, as the divided components from each ray impinging on the birefringence plate overlap with one another, causing the softening of the pixelation.

Thus, the birefringence plate 63 in turn serves to soften the pixelation from the output of the DMD 61.
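
The softening can be illustrated with the following sketch, in which a stepwise sinusoidal profile of the kind shown in figure 5(b) is superposed with a copy of itself displaced by half a pixel pitch, approximating the action of a two-way birefringent plate; the pixel count, fringe period and sub-pixel sampling factor are arbitrary choices for the illustration.

import numpy as np

def dmd_profile(n_pixels=36, period_px=12, subsample=10):
    # Stepwise approximation of a sinusoid: each DMD pixel holds one constant level,
    # sampled here with 'subsample' points per pixel to represent the optical field.
    levels = 0.5 + 0.5 * np.sin(2 * np.pi * np.arange(n_pixels) / period_px)
    return np.repeat(levels, subsample)

def soften(profile, subsample=10):
    # Two-way birefringent plate: superpose two copies of the profile separated by
    # half a pixel pitch (subsample // 2 samples) and average them.
    shifted = np.roll(profile, subsample // 2)
    return 0.5 * (profile + shifted)

softened = soften(dmd_profile())                      # smoothed version of the figure 5(b)-style profile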

In figure 6(a), the optical element 41 is positioned between the DMD 61 and the second prism 25. However, the optical element can be provided at any position in the light path after the light has been reflected from the DMD 61. In a further embodiment, it is positioned between the second prism 25 and the projection lens 33. The advantages of positioning the filter between the second prism and the projection lens 33 are that 1) it results in a smaller loss of light, since the light passes through the filter only once, and 2) it allows a reduction in prism size, since the prism can be closer to the DMD.

A further arrangement to reduce the effect of pixelation caused by the DMD will be described with reference to figures 7(a) and 7(b). Figure 7(a) shows the telecentric lens 33 of figure 1(a) in more detail. As explained above, the telecentric lens is a compound lens that has its entrance pupil at infinity. This means that the rays that pass through the centre of the aperture are parallel to the optical axis in front of or behind the system, respectively. The aperture stop of the telecentric lens is positioned between a pair of facing concave lenses 51 and 53.

Figure 7(b) shows an optical element 55 which is designed to reduce the effect of pixelation caused by the DMD. The optical element is provided in the aperture stop.

The aperture of the lens is defined by the aperture stop. It is usually inside the lens somewhere. Light approaching the lens will see an image of the stop and only the light that passes that image will reach the sensor. In an embodiment, the wavefront modulator is positioned in the aperture stop.

Although a telecentric lens is described here, this is just one example of an arrangement of possible projection optics. However, regardless of the projection optics used, in an embodiment, the optical element 55 is provided at the aperture stop of the optics.

The optical element 55 comprises a wavefront modulator that has the effect of widening the point spread function and hence softening the pixelated image produced by the DMD 61. It can be made in different ways, e.g. as an undulating surface, a phase plate with a modulated refractive index, or otherwise. The modulation index (or modulation depth) will depend on how much softening is needed.

The above systems allow the use of a monochrome light source. In one embodiment this can be a VCSEL. A VCSEL is an efficient source both in terms of conversion of electric current to photons and in its radiation properties, which are quite directional. This directional behaviour allows for a high F-number (~4 and possibly even >6) and thus simplifies the optics.

Further, this monochromatic property allows for a narrowband optical filter to pass only the laser light and reject ambient light reaching the camera 1 (of figure 1(a)). The higher F-number of the source allows an increased depth of field (in proportion). Further, it can help reduce the pixelation introduced by the DMD.

Figure 8 shows a simulation of the profile for the wavefront modulator. The modulator has an effect of softening the pixelated image by widening the point spread function of the incident light.

Figure 9 shows an image of an array of vertical cavity surface emitting lasers (VCSELs). An array of VCSELs is used that helps to reduce speckle and facilitates eye safety. The light from a VCSEL is close to collimated. In an embodiment, this allows the removal of the first prism 23.

Figure 10(a) is a schematic showing a simplified arrangement of the apparatus discussed in relation to figure 1(a). As can be seen in figure 10(a), a scale bar indicating 50 mm is shown. Referring now to figure 11(a), due to the use of the VCSEL, the first prism 23 has been removed, which allows a reduction in the size of the device and a simplification of the prism system. This reduction in size is achieved since it allows the removal of the first glass prism and the angle is reduced between the illumination axis and the projection axis. Also, the use of the VCSEL can additionally reduce the size due to the directionality of its light. VCSELs have a much lower etendue than LEDs.

However, the removal of the first prism can also introduce distortion and an uneven focus of the light onto the DMD. In a further embodiment, the fly's eye lens of figure 10(a) is replaced with a diffractive optical element (DOE) 81. The DOE allows monochromatic light to be shaped into virtually any shape, as shown in figure 12.

Figure 10(b) shows the radiance at the position of the DMD for the arrangement of figure 10(a). It can be seen that there is an even distribution of the radiance across the area. However, contrasting this with figure 11(b), it can be seen that the distribution shown in figure 11(b) has less speckle than that shown in figure 10(b); however, the distribution is not as even as that of figure 10(b).

As explained with reference to figures 7(b) and 8, in an embodiment the optical element is a wavefront modulator; this will now be explained with reference to figures 13(a) to (h).

Figure 13(a) is a simplified ray diagram showing the optical path from the DMD 203 via a projection optic 201 to the projected image plane 205. The diagram has been simplified to indicate a single projection optic. This is purely to allow the principles to be shown clearly. In practice, the projection optic will have many components and be a compound optic, for example the telecentric lens as described above. The use of a more advanced projection optic allows a sharp image to be produced from components such as the DMD.

Referring to figure 13, the following definitions will be used:

f - Focal length

Fno - F-number

Wave front - a surface of constant phase centred around the chief ray coming from a point on the SLM. Its area changes depending on position along the chief ray.

pp - Pixel Pitch (distance between pixels) in the light modulator (SLM)

SLM - Spatial Light Modulator (this is referred to as a DMD above)

Conjugate Distance (CD) - Distance from the principal plane to the SLM. Since the image space normally is far from the projector, CD is approximately equal to the focal length.

In this embodiment, a phase modulator, for instance in the form of an undulating surface or an index modulated element, is used to reduce sharp edges from the pixelated and digitally modulated SLM (DMD).

A representation of the phase modulator is shown in figure 13(b). Here, the phase modulator is shown as a sinusoidal surface 207, which is provided at the widest part of the beam.

In a perfect lens, wave-fronts have a spherical shape. Their radii of curvature and diameters change as they propagate through the optical system. Positions on wavefronts will be referred to as Altitude (a) and Azimuth (b).

Since the desired modulation from the modulator depends on the pixel pitch of the DMD, it is convenient to describe the phase modulation in terms of variation around an angle. The main direction is defined by a and b and the variation is determined by the pixel pitch divided by the conjugate distance (CD). At the principal plane, the aperture diameter is determined by the ratio f/Fno.

In the design of a modulator, there are two parameters to consider:

1) the amount of modulation

2) the period of modulation

In an embodiment, the amount of modulation should be between 0 and about pp/CD radians, the exact limit depending on total system considerations. This ensures that rays belonging to one point on the SLM propagate within expected limits.

Period of modulation: In an embodiment, the modulation will have several periods over the aperture, to avoid the possibility of finding a partial focus somewhere else in the projected image space. The period should be defined both in Altitude and Azimuth. In an embodiment, it is done by a sinusoidal function:

Where a and b should stay inside the aperture (√(a² + b²) < f/Fno), Ø = f/Fno and n, m are the number of periods in each direction. The irradiance variation inside the blurred spot should be reasonably even.
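
The exact functional form of the sinusoidal modulation is not reproduced in this text; the sketch below is only an illustrative possibility consistent with the parameters given above (amplitude within 0 to pp/CD radians, n and m periods across an aperture of diameter Ø = f/Fno), and all numerical values are hypothetical.

import numpy as np

def modulator_phase(a, b, pp=7.6e-6, cd=0.05, f=0.05, fno=4.0, n=5, m=5, fraction=0.5):
    # Illustrative sinusoidal wavefront modulation over the aperture stop.
    # a, b: altitude/azimuth coordinates (metres); pp, cd, f, fno, n, m and 'fraction'
    # are hypothetical example values, with the amplitude kept within 0 .. pp/CD radians.
    diameter = f / fno                                 # aperture diameter at the principal plane
    amplitude = fraction * pp / cd
    phase = amplitude * np.sin(2 * np.pi * n * a / diameter) * np.sin(2 * np.pi * m * b / diameter)
    inside = np.sqrt(a ** 2 + b ** 2) < diameter       # keep a, b inside the aperture
    return np.where(inside, phase, 0.0)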

Figure 13(c) shows the ray diagram in the absence of the optical element 207. Figure 13(e) shows the same diagram but with the presence of the optical element. It can be seen here that at the imaging plane in figure 13(e), the beam is not focused to a single point. Figure 13(d) is the image at the image plane of figure 13(c) where the focused point is seen in the center of the image. Figure 13(f) shows the image at the image plane of figure 13(e) where the effect of the optical element can be seen.

Figure 13(g) shows the ray diagram here with the optical element 207 and figure 13(h) shows the ray diagram in the absence of the optical element 207. In these diagrams, the rays are shown beyond the imaging plane. It can be seen that here, at the focus 209, in the diagram with the optical element 207, a sharp focus is not reached. This is in contrast to the focus shown in figure 13(h). The above-described wavefront modulator can be achieved either using a variation in the thickness of optical element 207 or a variation in its refractive index.

In an embodiment, the wavefront modulator is provided at the aperture stop. However, the wavefront modulator could be provided almost anywhere along the optical path. Since the size of the features of the wavefront modulator corresponds to the size of the beam, in an embodiment the wavefront modulator is provided at the widest part of the beam. If the wavefront modulator were provided at a narrower part of the beam, for example immediately after the DMD, then the wavefront modulator would require finer features, which increases manufacturing complexity.

Figure 14(a) show a simulation of the beam distribution from the DMD of 5 x 36 pixels. A sinusoidal brightness distribution is applied but with no pixel softening. The profile of the brightness intensity in the horizontal direction can be seen on Figure 14(b). It can be seen that with the optical element which is the birefringence plate described above positions such that the beams only traverse it once, beam distribution from the DMD is softened as shown in Figure 15(a). The profile of the horizontal brightness intensity after softening the pixelation is shown in Figure 15(b).

Figure 16 is a diagram of a system in accordance with an embodiment.

The LED array 101 provides the light source. This is directed through lenses 103 and 105 onto Fly's eye 107.

The light passes through Fly's eye lens 107 and lens 109. The light then passes through RTIR 111 and is directed towards DMD 113. DMD 113 then reflects the light back through RTIR 111 and through low-pass filter 113 into the telecentric lens arrangement which was described with reference to figures 7(a) and 7(b) and comprises lenses 115, 117, 119, 121 and 123.

It can be seen that the arrangement of lenses allows the beam emitted by LED 101 to be steered through an angle of 90° or greater. In the specific example shown in figure 16, the angle is 95.1°.

The arrangement of lenses is provided within a housing. The housing comprises retainer ring 125 that holds in place the final lens 123 of the telecentric lens. Lens housing 127 serves to hold the lenses of the telecentric lens in position.

The term illumination interface 129 is used to describe the housing that supports the lenses and the arrangement from the LED 101 to the DMD 113. The lens housing 127 and illumination interface are joined together by the engine housing 131.

The birefringent plate can be provided in a number of locations. Although it can be mounted between the DMD and the RTIR, if it is mounted here, it is necessary to increase the distance between the DMD and the RTIR. This will increase the size of the prism and thus the cost. It will also create a 2x loss of light compared to a position between the RTIR and the telecentric (projection) lens.

Thus, in the embodiment of figure 16, it is provided between the RTIR and the telecentric lens. It is also provided at the join between the illumination interface 129 and lens housing 127, which allows easy removal and maintenance. In a further embodiment, a filter may be provided that will operate in two dimensions. For example, to achieve this, a quarter-wave plate can be provided between two orthogonally oriented birefringent filters.

In a further embodiment, a rejection filter is provided on the birefringent plate.

The above embodiments have referred to a DMD. However, a DMD is a type of spatial light modulator. Other types of spatial light modulators could be used, for example LCoS or other electrically addressable SLMs. Different variations have been described above. However, it should be noted that the arrangements of each of the embodiments can be combined with one another. For example, the VCSEL described with reference to figures 9 to 12 can be used in combination with one or both of the optical element 41 of figures 6(a) and (b) and the optical element of figure 7(b). Also, the optical element 41 of figures 6(a) and (b) may be used in combination with the optical element of figure 7(b) with or without a VCSEL.
