
Patent Searching and Data


Title:
OPTICAL SYSTEM
Document Type and Number:
WIPO Patent Application WO/2012/038601
Kind Code:
A1
Abstract:
A first camera and a second camera are integrated together in a common case. Each camera comprises a detecting component and an omnidirectional optical component for directing optical radiation from the environment towards the detecting component by four optical surfaces, each omnidirectional optical component having a first side, a second side and a circumferential side. The detecting components transform optical images formed through the omnidirectional optical components on the detecting components into electrical signals and feed the electrical signals to the image processing unit. The image processing unit forms stereoscopic information on the environment on the basis of the electrical signals from the different detecting components. The user interface presents information to a user on the basis of the stereoscopic information.

Inventors:
AIKIO, Mauri (Sadepisara 4, Kempele, FI-90450, FI)
NEDEVSCHI, Sergiu (C. Daicoviciu no. 15, Cluj-Napoca, R-400020, RO)
Application Number:
FI2011/050812
Publication Date:
March 29, 2012
Filing Date:
September 21, 2011
Assignee:
TEKNOLOGIAN TUTKIMUSKESKUS VTT (Vuorimiehentie 3, Espoo, FI-02150, FI)
AIKIO, Mauri (Sadepisara 4, Kempele, FI-90450, FI)
NEDEVSCHI, Sergiu (C. Daicoviciu no. 15, Cluj-Napoca, R-400020, RO)
International Classes:
G02B13/06; G02B17/08; G03B35/08; G03B37/00
Domestic Patent References:
WO2004042428A22004-05-21
Foreign References:
KR20090095761A2009-09-10
EP2059058A22009-05-13
US20100110565A12010-05-06
EP2172798A12010-04-07
JP2009015253A2009-01-22
CN101414054A2009-04-22
Attorney, Agent or Firm:
KOLSTER OY AB (Iso Roobertinkatu 23, P.O.Box 148, Helsinki, FI-00121, FI)
Claims:

1. An apparatus comprising: a first camera and a second camera integrated together in a common case, an image processing unit and a user interface;

each camera comprising a detecting component and an omnidirectional optical component for directing optical radiation coming from environment of the omnidirectional optical component towards the detecting component;

each omnidirectional optical component having a shape of a disc with a first side, a second side and a circumferential side between the first side and the second side being integrated together such that the first sides, the second sides or the first side and the second side of the omnidirectional optical components face each other;

each omnidirectional optical component having four optical surfaces of which a receiving surface in the circumference side and an outputting surface in the middle of the second side having a concave curvature for refracting optical radiation, and two remaining optical surfaces being configured to reflect optical radiation inside the omnidirectional component;

detecting components of the cameras being configured to transform optical images formed through the omnidirectional optical components on the detecting components into electrical signals and feed the electrical signals to the image processing unit;

the image processing unit being configured to form stereoscopic information on the environment on the basis of the electrical signals from different detecting components; and

the user interface being configured to present information on the basis of the stereoscopic information.

2. The apparatus of claim 1, wherein a first reflecting surface of the optical surfaces on the first side in the middle of the omnidirectional optical component having a convex curvature, and a second reflecting surface of the optical surfaces being placed on the second side around the outputting surface.

3. The apparatus of claim 1, wherein the concave receiving surface and/or the first convex reflecting surface having freeform shapes.

4. The apparatus of claim 1, wherein an optical axis of one omnidirectional optical component being configured to pass through a center of another omnidirectional optical component.

5. The apparatus of claim 1, wherein the cameras are configured to form video image of the environment.

6. The apparatus of claim 1, wherein the second side of one omnidirectional optical component being configured to face the second side of another omnidirectional optical component.

7. A vehicle, the vehicle comprising the apparatus of claim 1, the image processing unit being configured to form on the basis of the stereoscopic information at least one of the following information: a distance between the vehicle and an object in the environment, a speed of the vehicle, a speed of the object, and the user interface being configured to present the information.

8. The vehicle of claim 7, wherein the image processing unit being configured to estimate on the basis of the stereoscopic information a future state of at least one of the following: a distance between the vehicle and an object in the environment, a speed of the vehicle, a speed of the object, and the user interface being configured to present the information.

9. The vehicle of claim 7, wherein the vehicle comprising at least one actuator for controlling the movement of the vehicle, and a controller, the controller being configured to receive the stereoscopic information from the image processing unit and control the at least one actuator on the basis of the stereoscopic information.

10. A method comprising: directing,

- in two cameras each having an omnidirectional optical component of a shape of a disc with a first side, a second side and a circumferential side between the first side and the second side,

- optical radiation from environment through the omnidirectional optical components, the cameras being integrated together such that the first sides, the second sides or the first side and the second side of the omnidirectional optical components face each other,

- towards detecting components, - by four optical surfaces of which a receiving surface in the circumference side and an outputting surface in the middle of the second side having a concave curvature for refracting optical radiation, and two remaining optical surfaces being configured to reflect optical radiation inside the omnidirectional component;

transforming, by the detecting components of the cameras, optical images formed through the omnidirectional optical components on the detecting components into electrical signals;

feeding the electrical signals from the detecting components to the image processing unit;

forming, by the image processing unit, stereoscopic information on the environment on the basis of the electrical signals from different detecting components; and

presenting, by the user interface, information on the basis of the stereoscopic information.

Description:
Optical system

Field

The exemplary and non-limiting embodiments of this invention relate generally to an optical system for forming wide field of view stereoscopic images.

Background

Stereoscopic images may be formed by two cameras which are apart from each other. To have a stereoscopic image of a wide field of view, each of the two cameras should have suitable optics for such a purpose. Although an omnidirectional stereoscopic image would be desirable, current fisheye cameras cannot have a wider field of view than about 120° to 160°. Many applications require the entire 360° to be observable, which means that at least three stereoscopic camera systems have to be used or the cameras have to rotate.

A plurality of cameras or a rotating system is large and complicated, produces a lot of data, and requires a lot of image processing capacity.

Brief description

According to an aspect of the present invention, there is provided an apparatus as specified in claim 1.

According to another aspect of the present invention, there is provided a method as specified in claim 10.

Preferred embodiments of the invention are disclosed in the dependent claims.

The invention provides advantages. The optical system is simple and it may be made compact. The optical system provides image data efficiently, and the data is easy to process.

List of drawings

Embodiments of the present invention are described below, by way of example only, with reference to the accompanying drawings, in which

Figure 1 shows the general architecture of the optical system;

Figure 2 shows a camera;

Figure 3 presents the optical surfaces of the omnidirectional optical component;

Figure 4 presents two omnidirectional optical components having second sides facing each other;

Figure 5 shows the optical system placed in a vehicle;

Figure 6 shows a vehicle with the optical system, actuator and a controller; and

Figure 7 presents a flow chart of the method.

Detailed description of some embodiments

Exemplary embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Although the specification may refer to "an", "one", or "some" embodiment(s) in several locations, this does not necessarily mean that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments. Therefore, all words and expressions should be interpreted broadly and they are intended to illustrate, not to restrict, the embodiment.

A general architecture of an optical system of the apparatus is illustrated in Figure 1 which is a simplified representation and shows only some elements and functional entities. The implementation of the apparatus may differ from what is shown.

The apparatus may comprise a first camera 100 and a second camera 102, an image processing unit 106 and a user interface 108. The user interface 108 may also comprise or may be coupled to a control means similar to those illustrated in Figure 6 by reference number 602. The control means may include a brake pedal and/or a steering wheel, or the control means may be automatic. The cameras may be integrated together and they may reside in a common case 104. The case may be made of plastic or metal, for example. The case 104 may have an optical window 110 for imaging. The cameras 100, 102 may have an at least nearly similar structure and they may operate at least nearly identically.

Figure 2 shows a camera. A camera 100, 102 may comprise a detecting component 222, which may comprise a matrix of pixels, and an omnidirectional optical component 220, which may direct optical radiation coming from the environment of the omnidirectional optical component 220 towards the detecting component 222. In general, a pixel is an element in a matrix of elements which may be parts of a detecting element or a screen. Between the omnidirectional optical component 220 and the detecting component 222 there may be an imaging lens system 223 which may be used to further correct aberrations in the image. The imaging lens 223 may be aspheric. The detecting component 222 may be a CCD cell (Charge Coupled Device) or a CMOS cell (Complementary Metal Oxide Semiconductor), for instance.

Optical radiation is a band of electromagnetic radiation from about 10 nm to about 50,000 nm, i.e. it comprises the bands of ultraviolet light, visible light and infrared light. However, even a single wavelength or any band in said range can be called optical radiation.

The omnidirectional optical component 220 may be firmly attached to the frame and/or case 104 of the optical system. In that way, the omnidirectional optical component 220 may protect the detecting component 222 and other parts of the optical system from dust and dirt, and hence the optical system does not necessarily need any external protection.

The omnidirectional optical component 220 may generally have a shape resembling a disc which has a first side 226, a second side 228 and a circumferential surface 230 between the first side 226 and the second side 228. The shape of the disc may be understood topologically and hence the disc may be curved and/or stretched. The shape may resemble a disc curved into a form resembling the letter V, for example. An optical axis 232 of the optical radiation directed towards the detecting component 222 may go through a center 236 of the omnidirectional optical component 220.

The cameras 100, 102 may be integrated together in the direction of the optical axes 232, 234 such that an optical axis 232 of one omnidirectional optical component 220 may pass through another omnidirectional optical component 218. Since the cameras 100, 102 may be identical, the components in the different cameras 100, 102 have been numbered using the same numbers.

Figure 3 presents the optical surfaces of the omnidirectional optical component. The omnidirectional optical component 220 may have four optical surfaces 300, 302, 304 and 306. A receiving surface 300 in the circumference surface 230 and an outputting surface 302 in the middle of the second side 228 may have a concave curvature for refracting optical radiation. The two remaining optical surfaces 304, 306 may reflect optical radiation inside the omnidirectional component 220. A first reflecting surface 304 may be on the first side 226 in the middle of the omnidirectional optical component 220. The first reflecting surface 304 may have a convex curvature, and its reflection may be based on specular reflection. A second reflecting surface 306 is placed on the second side 228 around the outputting surface 302.

The omnidirectional optical components 218, 220 may be identical with identical sides and optical surfaces. The omnidirectional optical component 218, 220 may be made of plastic or glass, and it may have a diameter from less than one centimeter to a few centimeters, and its thickness may be from a few millimeters to a couple of centimeters, for example. The potential for a small size enables an extraordinarily compact size for the whole optical system.

When beams of optical radiation approach the omnidirectional optical component 218, 220, they may pass through the receiving surface 300, which may diverge the beams, the divergence taking place in the direction of the optical axis 232, 234 due to the concave curvature of the surface. The direction of the optical axis 232, 234 may be considered the same as the direction of a normal of a flat disc corresponding to the omnidirectional optical component 218, 220. The angular aperture α in which the omnidirectional optical component 220 may receive optical radiation may be about 20°. More generally, the angular aperture may be 10° to 60°, for instance. In the transverse direction with respect to the optical axis 232, 234, the beams converge due to the convex curvature of the surface 300, since the surface 300 forms a circumference of a disc.
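The 360° azimuth band admitted through the angular aperture forms an annular image around the optical axis on the detecting component. As an illustration of how such an image could be post-processed, the sketch below unwraps an annulus into a rectangular panorama using polar coordinates; the function name, parameters and nearest-neighbour sampling are assumptions for illustration, not something specified in this description.

```python
import math

def unwrap_annulus(img, cx, cy, r_in, r_out, out_w, out_h):
    """Unwrap the annular image formed by the omnidirectional optic
    into a rectangular panorama.

    img          -- 2D list of pixel values (the raw sensor image)
    cx, cy       -- centre of the annulus (where the optical axis hits)
    r_in, r_out  -- inner/outer radii of the annulus in pixels; the radial
                    span corresponds to the ~20 deg angular aperture
    out_w, out_h -- panorama size: out_w spans 360 deg of azimuth,
                    out_h spans the angular aperture
    """
    pano = [[0] * out_w for _ in range(out_h)]
    for row in range(out_h):
        # radius grows linearly from r_in to r_out across the panorama height
        r = r_in + (r_out - r_in) * row / max(out_h - 1, 1)
        for col in range(out_w):
            theta = 2 * math.pi * col / out_w          # azimuth angle
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            if 0 <= y < len(img) and 0 <= x < len(img[0]):
                pano[row][col] = img[y][x]             # nearest-neighbour sample
    return pano
```

A production implementation would interpolate between pixels instead of taking the nearest neighbour, but the coordinate mapping is the essential step.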

When the beams of optical radiation have entered the omnidirectional optical component 218, 220, they may reflect from the second reflecting surface 306 towards the first reflecting surface 304 inside the omnidirectional optical component 218, 220. The beams of optical radiation reflected from the second reflecting surface 306 may diverge in the transverse direction with respect to the optical axis 232, 234, thus at least partly compensating for the convergence at the receiving surface 300. Similarly, the reflection from the second reflecting surface 306 diverges the beams in the direction of the optical axis 232, 234 due to the convex curvature of the second reflecting surface 306.

The first reflecting surface 304 diverges the beams due to the convex curvature of the surface. The beams of optical radiation reflected from the first reflecting surface 304 may travel toward the outputting surface 302, which may pass them towards the detecting component 222. The outputting surface 302 may diverge the beams in the direction of the optical axis 232, 234 due to the concave curvature of the surface. The concave receiving surface 300 and/or the first convex reflecting surface 304 may have a freeform shape. Freeform means a form or shape deviating from the spherical or even aspheric curvature which has been conventional in optics. The freeform shape may be asymmetrical.
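A common way to describe a freeform surface in optical design is a conic base with added XY-polynomial terms, which may be asymmetric in x and y. The sketch below evaluates the sag of such a surface; this parametrisation is a generic design convention, not the actual surface prescription of the optical components described here, and all coefficient values are illustrative.

```python
import math

def freeform_sag(x, y, c, k, coeffs):
    """Sag (surface height) of an XY-polynomial freeform surface: a
    conic base plus polynomial terms that may break rotational symmetry.

    c      -- base curvature (1/radius)
    k      -- conic constant (k = 0 gives a sphere)
    coeffs -- dict mapping (i, j) -> a_ij for the term a_ij * x**i * y**j
    """
    r2 = x * x + y * y
    base = c * r2 / (1 + math.sqrt(1 - (1 + k) * c * c * r2))
    poly = sum(a * x ** i * y ** j for (i, j), a in coeffs.items())
    return base + poly
```

With different coefficients for the x and y terms the surface is asymmetrical, which is what distinguishes a freeform from a rotationally symmetric asphere.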

The detecting components 222 of the cameras 100, 102 may transform optical images formed through the omnidirectional optical components 218, 220 on the detecting components 222 into electrical signals, and they may feed the electrical signals to the image processing unit 106. The cameras 100, 102 may form still images of the environment at a regular rate. The rate may be one image per second, for example. Alternatively, the cameras 100, 102 may form a video image of the environment, the rate of which may be a conventional video rate. The frame rate of the video images may be from a few images per second to thousands of images per second, for example. The frame rate of the video images is not limited to that, however.

The image processing unit 106 may form stereoscopic information on the environment on the basis of the electrical signals from the different detecting components 222. The stereoscopic information may also be called 3D (three dimensional) information, and the potential images thus formed may be called stereoscopic images or 3D images. Because there is a physical distance L between the omnidirectional optical components 218, 220, the images of the different cameras 100, 102 are taken at different angles with respect to an object in the environment. The difference in angle may result in a shift of objects with respect to each other in the images of the different cameras 100, 102, and such deviations between the images may be used to form stereoscopic information on the environment.
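The geometric basis of this is triangulation: the shift (disparity) of an object between the two images, together with the baseline L and the focal length of the optics, determines the object's distance. A minimal sketch under a simplified pinhole-camera model, with illustrative names and values not taken from this description:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic stereo triangulation: with a baseline L (metres) between
    the two optics and a focal length f (pixels), an image disparity d
    (pixels) maps to depth Z = f * L / d.
    """
    if disparity_px <= 0:
        return float('inf')  # no measurable shift -> object effectively at infinity
    return focal_px * baseline_m / disparity_px

# Example: 10 px disparity, 500 px focal length, 0.1 m baseline -> 5 m depth
```

Note the trade-off visible in the formula: a larger baseline L or focal length gives larger disparities and therefore finer depth resolution at long range.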

The user interface 108 may comprise a screen for presenting information on the basis of the stereoscopic information to a user. The stereoscopic information may comprise information on the environment in three dimensions.

Figure 4 presents an embodiment where the integration of the cameras 100, 102 together may be realized by making the second side 228 of one omnidirectional optical component 220 face the second side 228 of another omnidirectional optical component 218. In general, the cameras 100, 102 may be integrated together such that the first sides 226, the second sides 228, or the first side 226 and the second side 228 of the omnidirectional optical components 218, 220 face each other.

Figure 5 presents an embodiment where the stereo camera 500 explained above may be placed in a vehicle 502. The cameras may be placed on the roof of the vehicle, for instance. The cameras may be on top of each other such that the cameras 100, 102 have a vertical displacement, which is different from the usual horizontal displacement. The vertical displacement makes it possible for the cameras to have true omnidirectional imaging, since the cameras do not see each other. Instead of calling the displacement vertical, it may be expressed that the cameras 100, 102 are physically separated in the direction of the optical axis 232, 234.

The image processing unit 106 may form, on the basis of the stereoscopic information, at least one of the following pieces of information: a distance D between the vehicle 502 and an object 504 in the environment, a velocity V1 of the vehicle 502, a velocity V2 of the object 504. The user interface 108 may present the information. The distance D may be expressed in a scalar form or in a vector form giving information on the direction between the stereo camera 500 and the object 504. Similarly, the velocities V1, V2 may include a speed in scalar form and/or a direction of the speed. That is, the velocities V1, V2 may be formed and presented in a vector form or in a scalar form.
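One simple way to obtain such a velocity, in either vector or scalar form, is to difference two successive 3D positions delivered by the stereoscopic information. The sketch below is an illustrative finite-difference estimator; the function names and the finite-difference approach are assumptions, as the description does not specify how the velocities are computed.

```python
def object_velocity(pos_prev, pos_curr, dt):
    """Estimate an object's velocity vector (vx, vy, vz) from two
    successive 3D positions, taken dt seconds apart.
    """
    return tuple((c - p) / dt for p, c in zip(pos_prev, pos_curr))

def speed(v):
    """Scalar speed: the Euclidean norm of a velocity vector."""
    return sum(x * x for x in v) ** 0.5

# Example: an object 10 m ahead moves to 8 m ahead in 0.5 s
# -> closing along z at 4 m/s
```

In practice the raw positions would be filtered (e.g. averaged over several frames) before differencing, since stereo depth estimates are noisy.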

The image processing unit 106 may estimate, on the basis of the stereoscopic information, a future state of at least one of the following: a distance Df between the vehicle 502 and an object 504 in the environment, a velocity V1f of the vehicle 502, a velocity V2f of the object 504. The user interface 108 may present the information.
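A future state can be estimated in many ways; the simplest is constant-velocity extrapolation of the current measurements. The sketch below shows this for the future distance Df; it is an illustrative model only, as the description does not prescribe an estimator.

```python
def predict_future_distance(d_now, v_rel, horizon_s):
    """Constant-velocity extrapolation of the vehicle-object distance:
    Df = D + v_rel * t, where v_rel is the relative velocity along the
    line between vehicle and object (negative when the gap is closing).
    Clamped at zero, since a negative distance has no physical meaning.
    """
    return max(d_now + v_rel * horizon_s, 0.0)

# Example: gap of 20 m closing at 5 m/s -> 10 m predicted in 2 s
```

More elaborate estimators (e.g. a Kalman filter over position and velocity) would reduce to this model when acceleration is zero.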

Figure 6 presents an embodiment where the vehicle 502 comprises at least one actuator 600 for controlling the movement of the vehicle 502, and a controller 602. The controller 602 may receive the stereoscopic information from the image processing unit 106 and control the at least one actuator 600 on the basis of the stereoscopic information. The actuator 600 may be a braking system, for example. Alternatively or additionally, the actuator 600 may be a steering system or the like. Hence, if there is a danger that a person 504 will be run over by the vehicle 502, the controller 602 may command the brakes to stop the vehicle 502 before contact between the person 504 and the vehicle 502, without input from the user. The communication between the stereo camera 500, the image processing unit 106, the controller 602 and the actuator 600 may be performed through a wire or wirelessly.
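The braking decision described above can be sketched as a comparison of the current gap against the distance needed to stop (reaction distance plus braking distance v²/(2a)). The function, the reaction time and the deceleration figure below are illustrative assumptions, not values from this description.

```python
def should_brake(distance_m, speed_ms, reaction_s=0.2, decel_ms2=7.0):
    """Decide whether the controller should command the brakes: brake
    when the current gap to the object is no larger than the stopping
    distance, i.e. reaction distance plus braking distance v^2 / (2a).

    reaction_s -- assumed system reaction time in seconds
    decel_ms2  -- assumed achievable deceleration in m/s^2
    """
    stopping = speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)
    return distance_m <= stopping

# Example: at 14 m/s (~50 km/h) the stopping distance is about 16.8 m,
# so a pedestrian 5 m ahead triggers braking while one 50 m ahead does not.
```

A real automatic emergency braking system would add hysteresis and confidence checks on the stereo measurement before actuating the brakes.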

The image processing unit 106 may form stereoscopic information on the basis of images from different cameras by a suitable computer program. Such a computer program may be based on the Semi-Global Matching (SGM) method or some modification thereof, for instance.
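The core of any such matching method is finding, for each pixel, the disparity at which the two images agree best. The sketch below is a naive local matcher minimising the sum of absolute differences (SAD) over a window on one scanline; it stands in for SGM, which additionally aggregates matching costs along several image paths to regularise the result, and all names and parameters are illustrative.

```python
def match_disparity(left_row, right_row, x, window, max_disp):
    """Find the disparity at column x of a rectified scanline pair by
    minimising the sum of absolute differences (SAD) over a window of
    half-width `window`, searching disparities 0..max_disp.
    """
    best_d, best_cost = 0, float('inf')
    for d in range(min(max_disp, x - window) + 1):
        cost = sum(abs(left_row[x + k] - right_row[x - d + k])
                   for k in range(-window, window + 1))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Local SAD matching fails in textureless regions; the path-wise cost aggregation of SGM is precisely what makes the disparity map dense and smooth there.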

Figure 7 presents a flow chart of the method. In step 700, optical radiation is directed towards the detecting components 222. The directing takes place in two cameras 100, 102, each having an omnidirectional optical component 218, 220 of a shape of a disc with a first side 226, a second side 228 and a circumferential side 230 between the first side 226 and the second side 228. The optical radiation comes from the environment through the omnidirectional optical components 218, 220, the cameras 100, 102 being integrated together such that the first sides 226, the second sides 228, or the first side 226 and the second side 228 of the omnidirectional optical components 218, 220 face each other. The directing is performed by four optical surfaces 300, 302, 304, 306, of which a receiving surface 300 in the circumference side 230 and an outputting surface 302 in the middle of the second side 228 have a concave curvature for refracting optical radiation, and the two remaining optical surfaces are configured to reflect optical radiation inside the omnidirectional component 218, 220.

In step 702, optical images formed through the omnidirectional optical components 218, 220 on the detecting components 222 are transformed into electrical signals by the detecting components 222 of the cameras 100, 102.

In step 704, the electrical signals are fed from the detecting components 222 to the image processing unit 106.

In step 706, stereoscopic information on the environment is formed on the basis of the electrical signals from the different detecting components 222 by the image processing unit 106.

In step 708, information is presented on the basis of the stereoscopic information to a user by the user interface 108.

The image processing unit 106 and the controller 602 may comprise a processor and memory. The memory may include volatile and/or non-volatile memory and typically stores content, data, or the like. For example, the memory may store computer program code such as software applications or operating systems, information, data, content, or the like for the processor to perform steps associated with operation of the apparatus in accordance with embodiments. The memory may be, for example, random access memory (RAM), a hard drive, or other fixed data memory or storage device. Further, the memory, or part of it, may be removable memory detachably connected to the apparatus.

The techniques described herein may be implemented by various means. The software codes may be stored in any suitable processor/computer-readable data storage medium(s) or memory unit(s) or article(s) of manufacture and executed by one or more processors/computers. The data storage medium or the memory unit may be implemented within the processor/computer or external to the processor/computer, in which case it can be communicatively coupled to the processor/computer via various means as is known in the art.

The presented solution may be used as a part of a surveillance system indoors or outdoors.

The presented solution may be (a part of) a system helping a driver of a vehicle, such as a car, to notice obstacles on the road, find a way through obstacles, determine where the road or a drivable way is, detect a lane in the absence of lane markings, etc. The optical system may alert the user to a danger or to problems in the environment.

The optical system may be placed or moved in a pipe or a duct for inspecting the pipe or the duct.

It will be obvious to a person skilled in the art that, as technology advances, the inventive concept can be implemented in various ways. The invention and its embodiments are not limited to the examples described above but may vary within the scope of the claims.