

Title:
DEPTH IMAGING
Document Type and Number:
WIPO Patent Application WO/2006/087682
Kind Code:
A2
Abstract:
A method for measuring the distance of an element from a defined point comprises obtaining first and second images of the element, with the object planes of the first and second images differing from one another. The element (which could be a point or an object in the image) is located in each image, and the distance of the element from the defined point is calculated based upon the change in position of the element between the first and second images and information relating to the different object planes of the first and second images.

Inventors:
BRUEKERS ALPHONS A M L (NL)
SCHLEIPEN JOHANNES J H B (NL)
Application Number:
PCT/IB2006/050506
Publication Date:
August 24, 2006
Filing Date:
February 16, 2006
Assignee:
KONINKL PHILIPS ELECTRONICS NV (NL)
BRUEKERS ALPHONS A M L (NL)
SCHLEIPEN JOHANNES J H B (NL)
International Classes:
G06T7/00
Other References:
SUBBARAO M ET AL: "Depth from defocus: a spatial domain approach" INTERNATIONAL JOURNAL OF COMPUTER VISION NETHERLANDS, [Online] vol. 13, no. 3, December 1994 (1994-12), pages 271-294, XP002406148 ISSN: 0920-5691 Retrieved from the Internet: URL:http://citeseer.ist.psu.edu/cache/papers/cs/13051/http:zSzzSzwww.ee.sunysb.eduzSz~cvlzSzPublicationszSzSurya_IJCV1994.pdf/subbarao94depth.pdf> [retrieved on 2006-11-07]
WATANABE M ET AL: "Minimal operator set for passive depth from defocus" PROCEEDINGS 1996 IEEE COMPUTER SOCIETY CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CAT. NO.96CB35909) IEEE COMPUT. SOC. PRESS LOS ALAMITOS, CA, USA, 1996, pages 431-438, XP002406149 ISBN: 0-8186-7258-7
HIURA S ET AL: "Depth measurement by the multi-focus camera" COMPUTER VISION AND PATTERN RECOGNITION, 1998. PROCEEDINGS. 1998 IEEE COMPUTER SOCIETY CONFERENCE ON SANTA BARBARA, CA, USA 23-25 JUNE 1998, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 23 June 1998 (1998-06-23), pages 953-959, XP010291663 ISBN: 0-8186-8497-6
TSAI D-M ET AL: "A moment-preserving approach for depth from defocus - Explorations in the Microstructures of Cognition" PATTERN RECOGNITION, ELSEVIER, KIDLINGTON, GB, vol. 31, no. 5, 1 March 1998 (1998-03-01), pages 551-560, XP004115772 ISSN: 0031-3203
RAYALA J ET AL: "Estimation of depth from defocus as polynomial system identification" IEE PROCEEDINGS-VISION, IMAGE AND SIGNAL PROCESSING IEE UK, vol. 148, no. 5, October 2001 (2001-10), pages 356-362, XP006017597 ISSN: 1350-245X
Attorney, Agent or Firm:
WHITE, Andrew, G. et al. (Cross Oak Lane, Redhill Surrey RH1 5HA, GB)
Claims:

CLAIMS

1. A method for measuring the distance (v) of an element (16) from a defined point (19) comprising obtaining (32) first and second images of the element (16), the object planes (20) of the first and second images differing from one another, locating (34) the element (16) in each image, and calculating (36) the distance (v) of the element (16) from the defined point (19) based upon the change in position of the element (16) between the first and second images and information relating to the different object planes (20) of the first and second images.

2. A method according to claim 1, wherein the step of obtaining the first and second images of the element (16) is achieved by using a lens (12) and image recording device (14).

3. A method according to claim 2, wherein the step of obtaining first and second images of the element (16), with the object planes (20) of the first and second images differing from one another, is achieved by moving the lens (12) relative to the image recording device (14), between the obtaining of the first and second images.

4. A method according to claim 2, wherein the step of obtaining first and second images of the element (16), with the object planes (20) of the first and second images differing from one another, is achieved by changing the focal length of the lens, between the obtaining of the first and second images.

5. A method according to claim 3, wherein the information relating to the different object planes (20) of the first and second images comprises the distance (δx) that the lens (12) has moved relative to the image recording device (14), between the obtaining of the first and second images.

6. A method according to any preceding claim, wherein the change in position of the element (16) between the first and second images is measured as a change in the size of the element (16).

7. A method according to any one of claims 1 to 5, wherein the change in position of the element (16) between the first and second images is measured as a movement of the element (16) relative to a defined origin.

8. A method according to any preceding claim, wherein the element (16) being observed is a single point.

9. A method according to any one of claims 1 to 7, wherein the element (16) being observed is an object (16).

10. A method according to any preceding claim, wherein the method is repeated for multiple elements (16) within the first and second images.

11. Apparatus (8) for measuring the distance (v) of an element (16) from a defined point (19) comprising a detector (10) for obtaining first and second images of the element (16), the detector (10) arranged so that the object planes (20) of the first and second images differ from one another, and a processor (22) arranged to locate the element (16) in each image, and to calculate the distance (v) of the element (16) from the defined point (19) based upon the change in position of the element (16) between the first and second images and information relating to the different object planes (20) of the first and second images.

12. Apparatus according to claim 11, wherein the detector (10) comprises a lens (12) and image recording device (14).

13. Apparatus according to claim 12, wherein the detector (10) being arranged so that the object planes (20) of the first and second images differ from one another is achieved by moving the lens (12) relative to the image recording device (14), between the obtaining of the first and second images.

14. Apparatus according to claim 12, wherein the detector (10) being arranged so that the object planes (20) of the first and second images differ from one another is achieved by changing the focal length of the lens, between the obtaining of the first and second images.

15. Apparatus according to claim 13, wherein the information relating to the different object planes (20) of the first and second images comprises the distance (δx) that the lens (12) has moved relative to the image recording device (14), between the obtaining of the first and second images.

16. Apparatus according to any one of claims 11 to 15, wherein the change in position of the element (16) between the first and second images is measured as a change in the size of the element (16).

17. Apparatus according to any one of claims 11 to 15, wherein the change in position of the element (16) between the first and second images is measured as a movement of the element (16) relative to a defined origin.

18. Apparatus according to any one of claims 11 to 17, wherein the element (16) being observed is a single point.

19. Apparatus according to any one of claims 11 to 17, wherein the element (16) being observed is an object (16).

20. Apparatus according to any one of claims 11 to 19, wherein the apparatus is arranged to repeat the distance measurement for multiple elements (16) within the first and second images.

21. A computer program product on a computer readable medium for controlling a data processing system, the computer program product comprising instructions for measuring (32) the distance (v) of an element (16) from a defined point (19) comprising obtaining first and second images of the element (16), the object planes (20) of the first and second images differing from one another, locating (34) the element (16) in each image, and calculating (36) the distance (v) of the element (16) from the defined point (19) based upon the change in position of the element (16) between the first and second images and information relating to the different object planes (20) of the first and second images.

22. A computer program product according to claim 21, wherein the step of obtaining the first and second images of the element (16) is achieved by using a lens (12) and image recording device (14).

23. A computer program product according to claim 22, wherein the step of obtaining first and second images of the element (16), with the object planes (20) of the first and second images differing from one another, is achieved by moving the lens (12) relative to the image recording device (14), between the obtaining of the first and second images.

24. A computer program product according to claim 22, wherein the step of obtaining first and second images of the element (16), with the object planes (20) of the first and second images differing from one another, is achieved by changing the focal length of the lens, between the obtaining of the first and second images.

25. A computer program product according to claim 23, wherein the information relating to the different object planes (20) of the first and second images comprises the distance (δx) that the lens (12) has moved relative to the image recording device (14), between the obtaining of the first and second images.

26. A computer program product according to any one of claims 21 to 25, wherein the change in position of the element (16) between the first and second images is measured as a change in the size of the element (16).

27. A computer program product according to any one of claims 21 to 25, wherein the change in position of the element (16) between the first and second images is measured as a movement of the element (16) relative to a defined origin.

28. A computer program product according to any one of claims 21 to 27, wherein the element (16) being observed is a single point.

29. A computer program product according to any one of claims 21 to 27, wherein the element (16) being observed is an object (16).

30. A computer program product according to any one of claims 21 to 29, wherein the method is repeated for multiple elements (16) within the first and second images.

Description:

DESCRIPTION

DEPTH IMAGING

This invention relates to depth imaging using an object plane scan for extracting three-dimensional information.

Object detection and identification in images is becoming more and more important. An example of an application is found in the field of biometrics where face recognition is used for security or convenience. Obtaining information relating to the distance of objects from the camera makes the process of face detection more robust. Furthermore, it becomes possible to measure the exact three-dimensional surface-profile of someone's face, instead of a two-dimensional contour only, making the identification process more selective.

One known method for obtaining depth information uses the changes in the blurring of an image caused by changing the position of the lens or changing the focal length. In "Depth Through Focus" by T. Bailey and A. Eliazar, available at http://www.cs.duke.edu/~eliazar/vision/, a method is described that uses a sequence of images of the same scene, each with a different setting of the lens, to determine the depth. For each pixel in each of the images taken a measure of sharpness is calculated, and the sharpness of a pixel across the images is used to estimate the distance of that pixel. So, for example, if a particular pixel is found to be sharpest in an image with an object plane of 0.5 m, then this is the estimated depth of that pixel.
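
A minimal sketch of this kind of depth-through-focus estimation, assuming a stack of grey-scale images captured at known object-plane distances (the array shapes, the distance list and the simple Laplacian sharpness measure are illustrative assumptions, not details taken from the cited work):

    import numpy as np

    def depth_through_focus(stack, object_planes):
        # stack: array of shape (n_images, height, width), one grey-scale image per
        # lens setting; object_planes: object-plane distance for each image.
        sharpness = []
        for img in stack:
            # Laplacian magnitude as a crude per-pixel sharpness measure
            # (wrap-around at the image borders is ignored for simplicity).
            lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                   np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
            sharpness.append(np.abs(lap))
        sharpness = np.stack(sharpness)         # (n_images, height, width)
        best = np.argmax(sharpness, axis=0)     # index of the sharpest image per pixel
        return np.asarray(object_planes)[best]  # estimated depth for every pixel

The depth estimate is restricted to the discrete object-plane settings, and every pixel of every image has to be processed, which leads to the weaknesses discussed below.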

This method of calculating the depth of an object has a number of weaknesses. The method can only deliver an estimate of the depth of an object, with a resolution limited by the number of images that are obtained. If a large number of images are obtained, then a heavy processing load results, since every pixel in every image requires processing. In environments where the object being imaged is not stationary, there is also a limit on the number of images that can be obtained before errors creep into the method, caused by the movement of objects within the area being imaged.

It is therefore an object of the invention to improve upon the known art.

According to a first aspect of the present invention, there is provided a method for measuring the distance of an element from a defined point comprising obtaining first and second images of the element, the object planes of the first and second images differing from one another, locating the element in each image, and calculating the distance of the element from the defined point based upon the change in position of the element between the first and second images and information relating to the different object planes of the first and second images.

According to a second aspect of the present invention, there is provided an apparatus for measuring the distance of an element from a defined point comprising a detector for obtaining first and second images of the element, the detector arranged so that the object planes of the first and second images differ from one another, and a processor arranged to locate the element in each image, and to calculate the distance of the element from the defined point based upon the change in position of the element between the first and second images and information relating to the different object planes of the first and second images.

According to a third aspect of the present invention, there is provided a computer program product on a computer readable medium for controlling a data processing system, the computer program product comprising instructions for measuring the distance of an element from a defined point comprising obtaining first and second images of the element, the object planes of the first and second images differing from one another, locating the element in each image, and calculating the distance of the element from the defined point based upon the change in position of the element between the first and second images and information relating to the different object planes of the first and second images.

Owing to the invention, it is possible to provide a method of measuring the distance of an element from, for example, an imaging device, in a simple and efficient manner. The distance can be measured accurately in a relatively short period of time, thereby reducing errors caused by any movement of the element being imaged. The accuracy of the depth measurement is limited by the number of pixels in the images obtained. Since detectors are readily available with very large numbers of effective pixels, in most practical situations the depth can be measured with an accuracy that does not limit the application. The depth of several different elements can be calculated with only two different images being acquired. Changes in magnification are used instead of blurring, and only a single image sensor and no prisms are needed.

The distance of the element (such as an eye in a face), for which data is being obtained, is measured from a defined point. This defined point may be an absolute physical point, such as the lens or image recording device of the detector that acquires the images, or it can be a relative point, such as another element within the image being processed (such as a chin in a face). Relative distances between multiple elements can also be measured.

The apparatus incorporates a scanning of the object plane by an imaging system, and a corresponding change of the principal planes of the imaging lens, in order to extract depth information of an object positioned in front of the camera's objective lens. Such a scanning of the object plane can be achieved by using a fast-responding lens, such as a lens that uses fluids for focussing. Such lenses are described in, for example, WO 03/0693890. Since the settings of such a lens can be changed very fast (of the order of 1 ms), the scene can be considered stationary. The extracted depth information can be used to facilitate the detection and/or identification process of objects, in, for example, face recognition in a biometric system. Other applications for the invention are possible in the fields of robotics and vision.

Preferably, the step of obtaining the first and second images of the element is achieved by using a lens and image recording device. This is the simplest method of obtaining the two images, which can be captured as digital images and processed to obtain the depth information.

Advantageously, the step of obtaining first and second images of the element, with the object planes of the first and second images differing from one another, is achieved by moving the lens relative to the image recording device, between the obtaining of the first and second images. In systems using a lens made from a solid material such as glass, the change in the object plane between the capturing of the images is achieved by moving the lens relative to the image recording device.

Ideally, the information relating to the different object planes of the first and second images comprises the distance that the lens has moved relative to the image recording device, between the obtaining of the first and second images. This measurement of the distance moved by the lens relative to the image recording device is used in the ultimate calculation of the distance of the observed element from the defined point (normally the image recording device). Alternatively, the step of obtaining first and second images of the element, with the object planes of the first and second images differing from one another, is achieved by changing the focal length of the lens, between the obtaining of the first and second images. In those systems that use a fluid focus lens, the change in the object plane between the capturing of the images is achieved by a change in the focal length of the lens.

Preferably, the change in position of the element between the first and second images is measured as a change in the size of the element, or the change in position of the element between the first and second images is measured as a movement of the element relative to a defined origin. The element in the images that is being processed (to ascertain its distance from the defined point) will change in appearance between the two images. This change can either be considered as a change in size or as a movement relative to a defined point, which, if necessary, can be used to calculate the change in size of the element between the images. Ideally, the element being observed is a single point or the element being observed is an object. The element being processed in the images could be a single point identified by shading or colour, or could be a specific object, such as, for example, an eye in a face, which can be easily identified. Using either a point or an object, an accurate measure of distance can be obtained from the processing of the two images.
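
As an illustration of the size-change measurement, the sketch below derives the relative size change of an element from its measured size, in pixels, in the two images (the element-location step itself is assumed to be available, for example from a feature detector, and is not shown):

    def relative_size_change(size_in_first_image, size_in_second_image):
        # Sizes (in pixels) of the same element as located in the first and second
        # images, i.e. y and y + delta_y in the notation used later in the text.
        delta_y = size_in_second_image - size_in_first_image
        return delta_y / size_in_first_image    # the relative change delta_y / y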

In a preferred embodiment of the system, the distance measuring method is repeated for multiple elements within the first and second images. A device or system operating the distance measuring method will be able to repeat the distance measurement for multiple elements within the two images, without the need for obtaining any further images, and therefore build a distance map for different elements within the images, which can be useful in many applications.

Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:-

Figure 1 shows three schematic side views of an apparatus for measuring the distance of an object from a point,

Figure 2 is a schematic view of the apparatus of Figure 1,

Figure 3 is a flowchart of a method for measuring the distance of an element from a point,

Figure 4 is a schematic view showing the change in position of an object between a first and second image, and

Figure 5 is a schematic view similar to Figure 4, showing the change in position of a second object between a first and second image.

In Figure 1, a detector 10 comprises a lens 12 and an image recording device 14, which form part of an apparatus for measuring the distance of an object 16 from the lens 12 (other components of the apparatus, such as a processor and user interface, are omitted for reasons of clarity). An optical axis 18 is shown in the Figure, which passes from the object 16 through the lens 12 and image recording device 14 (such as a conventional CCD used in a digital camera). The apparatus is designed to measure the distance of the object 16 from a defined point 19, which is the intersection of the optical axis 18 with the lens 12.

The relationship between the lens 12 and image recording device 14 defines an object plane 20, and the three views shown in Figure 1 each have a different object plane 20. In the top representation, the object 16 lies on the object plane 20 and the object 16 is correctly imaged on the image recording device 14 (in focus). The middle and lower representations have object planes 20 that result in the object 16 being out of focus at the image recording device 14, although the object is still detectable by the device 14. The method uses a variable focus system, where the change in position of the object 16 at the image recording device 14 is measured as a function of the principal plane displacement.

Although Figure 1 shows the object 16 being imaged at the image recording device 14 three different times, only two images are needed by the image recording device 14 to make a calculation of the distance of the object 16 from the defined point 19. It is also the case that there is no requirement for the object 16 to be in focus in either of the two images captured by the image recording device 14. Two out-of-focus images of the object 16 could be used, with the object plane 20 not coinciding with the object 16 in either image. It is sufficient that the object plane 20 is different between the two images to enable an accurate calculation of the distance of the object 16 from the lens 12 to be made.

By scanning the object plane through changing the position of the lens 12, the object 16 is imaged out of focus at the image recording device 14. However, by measuring the change in magnification of the image, as a function of the object plane shift, it is possible to obtain information on the distance of the object 16 with respect to the detector 10.

In the top and middle representations of the apparatus shown in Figure 1, various measurements are shown on the views. The variable v is the unknown distance of the object 16 from the lens 12, which is to be calculated. Distance b is the displacement between the lens 12 and the image recording device 14, and y is the actual size of the object 16 as detected by the image recording device 14. In addition, on the middle representation, δx is the distance that the lens 12 has moved in between the acquiring of the first and second images of the object 16, and δy is the change in size of the object 16 as recorded by the image recording device 14. These measurements can be expressed in the following formulae:

(y + δy) / y = v (b − δx) / [ b (v + δx) ]

δy / y = − δx (v + b) / [ b (v + δx) ]

Rearranging the formula to express it in terms of the object depth v (which is the unknown to be calculated), the formula can be expressed as:

v = − b δx (y + δy) / ( y δx + b δy )

By plotting the measured quantity δy/y as a function of δx, it is possible to determine a value for v, where b is the nominal distance from the lens to the detector. However, this shortcut becomes inaccurate for large object distances v. The formulae have been derived using paraxial Gaussian optics. In practice the optical design of the imaging system is known, and its parameters can be used to obtain a more accurate relation between the change in magnification and the corresponding change in principal plane position.
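
A minimal sketch of this calculation, assuming the paraxial relation given above (the function name and the use of plain scalar values are illustrative; a practical system would substitute the calibrated parameters of the actual lens design, as noted in the preceding paragraph):

    def depth_from_magnification(dy_over_y, delta_x, b):
        # dy_over_y: measured relative size change delta_y / y (negative when the
        # lens moves towards the image recording device).
        # delta_x: displacement of the lens between the two images.
        # b: nominal lens-to-detector distance, in the same units as delta_x.
        # Paraxial relation: delta_y / y = -delta_x (v + b) / (b (v + delta_x)),
        # rearranged for the object depth v.
        return -b * delta_x * (1.0 + dy_over_y) / (delta_x + b * dy_over_y)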

As mentioned above, this method only works when the position of the principal planes (the object plane 20) of the imaging system is being changed. This can be achieved either by a direct displacement of the lens along the optical axis, or by a change in focal length (by changing the radii of curvature of the lens surfaces in a fluid focus lens system).

Figure 2 shows a schematic view of the apparatus 8 for measuring the distance of an element from the apparatus 8. The detector 10 is connected to a processor 22. The processor 22 is arranged to receive the first and second images from the image recording device 14 and to locate the object 16 in those images. The processor 22 is also arranged to calculate the distance of the object 16 from the apparatus, based on the formula for v listed above. The processor 22 is connected to a user interface 24, which will comprise a keypad and display.

The arrangement of the apparatus 8 shown in Figure 2 is of a self-contained device, although in many situations, the apparatus 8 will contain only the detector 10 and will be connected (wired or wirelessly) to a remote station that will comprise the processor 22 and user interface 24. In such an arrangement the processor 22 and user interface 24 could be a standard PC connected to a plurality of sensors.

Figure 3 summarises the method carried out by the apparatus 8, being the steps of obtaining 32 the first and second images of the element 16, the object planes 20 of the first and second images differing from one another, locating 34 the element 16 in each image, and calculating 36 the distance (v) of the element 16 from a defined point 19. The calculation of the distance (v) is based upon the change in position of the element 16 between the first and second images and information relating to the different object planes 20 of the first and second images. It is not essential that the processing of the images waits until both the first and second images have been acquired. Once the first image has been obtained, processing such as the identification of the element in the image can begin.
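
Tying these steps together, the sketch below outlines one possible processing loop; the camera object with its capture_image, move_lens and locate_element calls is a hypothetical stand-in for the camera control and element-detection parts of a real system, and depth_from_magnification is the sketch given earlier:

    def measure_depths(camera, elements, delta_x, b):
        # Step 32: obtain first and second images with different object planes.
        first = camera.capture_image()
        camera.move_lens(delta_x)               # move the lens towards the detector
        second = camera.capture_image()
        depths = {}
        for element in elements:
            # Step 34: locate the element in each image (hypothetical detector call).
            size1 = camera.locate_element(first, element).height
            size2 = camera.locate_element(second, element).height
            # Step 36: calculate the distance v from the size change, delta_x and b.
            depths[element] = depth_from_magnification((size2 - size1) / size1,
                                                       delta_x, b)
        return depths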

Figures 4 and 5 give examples of elements as they would be located in the two different images captured by the apparatus. In Figure 4, the element 16 is shown in a first position located within the first image and then shown again as the element 16' as it would be located in the second image. As can be seen from Figure 1, δx is a positive value when the lens 12 is moved towards the image recording device 14, and this results in a smaller image of the object 16 being received by the image recording device 14. Therefore δy is a negative value and y + δy < y. This is illustrated in Figure 4, where it can be seen that the object 16 (y) is larger than the object 16' (y + δy).

It will be appreciated by the skilled reader that the choice of δx as a positive or negative value is a matter of convention, and, for example, movement of the lens 12 away from the image recording device 14 could be defined as creating a positive δx. However, the formulae in this document are consistent with a positive δx being a movement of the lens 12 towards the image recording device 14, which creates a negative δy in the change in size of the object 16 on the image recording device 14.

Figure 5 shows a view similar to Figure 4 of an object that can be seen in the first image as the object 16 and then in the second image as the object 16'. In this example the object 16 is located across the optical axis of Figure 1 (which would be a z axis through the origin in Figure 5). However, the change in position of the object 16 does not change the calculation of the values of y and δy.

If δx = 1 mm and b = 3 mm, then for some sample values of δy/y, v can be calculated as follows:
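
As an illustrative sketch only (the δy/y values below are assumed for illustration and are not taken from the original table), the hypothetical depth_from_magnification function introduced earlier can be evaluated at these settings:

    # delta_x = 1 mm and b = 3 mm; a few assumed values of delta_y / y.
    for dy_over_y in (-0.40, -0.36, -0.34, -0.335):
        v = depth_from_magnification(dy_over_y, delta_x=1.0, b=3.0)
        print(f"delta_y/y = {dy_over_y:+.3f}  ->  v = {v:.0f} mm")
    # prints v of roughly 9, 24, 99 and 399 mm respectively, showing how
    # delta_y / y crowds towards -delta_x / b as the distance v grows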

For large values of v there is an asymptotic behaviour of δy/y, as illustrated in the graph below:

[Graph not reproduced: δy/y plotted against the object distance v, approaching the asymptote −δx/b for large v.]

As can be appreciated, this places limits on the distances that can be measured effectively, but in a range of 5 to 30 cm the apparatus described above will function as a very accurate and fast device for measuring the depth of elements in an image.