Title:
TIME-TO-COLLISION ESTIMATION METHOD AND SYSTEM
Document Type and Number:
WIPO Patent Application WO/2015/010320
Kind Code:
A1
Abstract:
A time-to-collision estimation method is provided. The method includes: obtaining a first image captured by a camera at a first time point and a second image captured by the camera at a second time point; selecting a first point on an object in the first image; identifying the first point in the second image; and calculating a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image. No extrinsic parameter of the camera is required, which improves efficiency.

Inventors:
JIANG RUYI (CN)
ZHANG YANKUN (CN)
HONG CHUYANG (CN)
ZHOU YUN (CN)
Application Number:
PCT/CN2013/080182
Publication Date:
January 29, 2015
Filing Date:
July 26, 2013
Assignee:
HARMAN INT IND (US)
JIANG RUYI (CN)
ZHANG YANKUN (CN)
HONG CHUYANG (CN)
ZHOU YUN (CN)
International Classes:
G08G1/16
Foreign References:
US20100305857A12010-12-02
US7711147B22010-05-04
US20060008120A12006-01-12
CN101488290A2009-07-22
Attorney, Agent or Firm:
UNITALEN ATTORNEYS AT LAW (Scitech Place, No. 22, Jian Guo Men Wai Ave., Chao Yang District, Beijing 4, CN)
Claims:
We Claim:

1. A time-to-collision estimation method, comprising: obtaining a first image captured by a camera at a first time point and a second image captured by the camera at a second time point; selecting a first point on an object in the first image; identifying the first point in the second image; and calculating a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

2. The method according to claim 1, wherein the first collision time is calculated based on the time interval, a distance from the first point to a first vanishing line in the first image and a distance from the first point to a second vanishing line in the second image.

3. The method according to claim 1, further comprising: selecting a second point in the first image; identifying the second point in the second image; calculating a second collision time based on the time interval and vertical positions of the second point in the first and second images; and calculating an average value of the first and second collision times as a final collision time.

4. The method according to claim 2, wherein the first collision time is calculated based on Equation (1):

T = d1 * Δt / (d1 - d2)   Equation (1), where T represents the first collision time, d1 represents the distance from the first point to the first vanishing line in the first image, d2 represents the distance from the first point to the second vanishing line in the second image, and Δt represents the time interval.

5. The method according to claim 1, wherein selecting the first point in the first image comprises: identifying an object in the first image; and selecting a point on the identified object as the first point.

6. The method according to claim 1, further comprising: comparing the calculated first collision time with a predetermined threshold; and generating an alarm if the calculated first collision time is less than the threshold.

7. A time-to-collision estimation system, comprising a processing device configured to: select a first point on an object in a first image captured by a vehicle mounted camera at a first time point; identify the first point in a second image captured by the camera at a second time point; and calculate a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

8. The system according to claim 7, wherein the processing device is configured to calculate the first collision time based on the time interval, a distance from the first point to a first vanishing line in the first image and a distance from the first point to a second vanishing line in the second image.

9. The system according to claim 7, wherein the processing device is further configured to: select a second point in the first image; identify the second point in the second image; calculate a second collision time based on the time interval and vertical positions of the second point in the first and second images; and calculate an average value of the first and second collision times as a final collision time.

10. The system according to claim 8, wherein the processing device is configured to calculate the first collision time based on Equation (1):

T = d1 * Δt / (d1 - d2)   Equation (1), where T represents the first collision time, d1 represents the distance from the first point to the first vanishing line in the first image, d2 represents the distance from the first point to the second vanishing line in the second image, and Δt represents the time interval.

11. The system according to claim 7, wherein the processing device is configured to select the first point in the first image by identifying an object in the first image and selecting a point on the identified object as the first point.

12. The system according to claim 7, further comprising an output device, wherein the processing device is further configured to: compare the calculated first collision time with a predetermined threshold; and control the output device to generate an alarm if the calculated first collision time is less than the threshold.

13. A time-to-collision estimation system, comprising: means for selecting a first point on an object in a first image captured by a camera at a first time point; means for identifying the first point in a second image captured by the camera at a second time point; and means for calculating a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

Description:
TIME-TO-COLLISION ESTIMATION METHOD AND SYSTEM

TECHNICAL FIELD

[0001] The present disclosure generally relates to a time-to-collision estimation method and system.

BACKGROUND

[0002] Nowadays, time-to-collision (TTC) estimation methods have emerged in driving assistance systems to improve driving safety. In some solutions, TTC is estimated based on the scale variation of a lead vehicle's width. In other solutions, TTC is estimated based on the distance from an object to the camera, where the distance is calculated using the camera's extrinsic parameters.

SUMMARY

[0003] According to one embodiment, a time-to-collision (TTC) estimation method is provided. The method may include: obtaining a first image captured by a camera at a first time point and a second image captured by the camera at a second time point; selecting a first point on an object in the first image; identifying the first point in the second image; and calculating a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

[0004] In some embodiments, the first collision time may be calculated based on the time interval, a distance from the first point to a first vanishing line in the first image and a distance from the first point to a second vanishing line in the second image.

[0005] In some embodiments, the method may further include: selecting a second point in the first image; identifying the second point in the second image; calculating a second collision time based on the time interval and vertical positions of the second point in the first and second images; and calculating an average value of the first and second collision times as a final collision time.

[0006] In some embodiments, a plurality of collision times may be calculated, and the final collision time may be an average value of the plurality of collision times.

[0007] In some embodiments, the first collision time may be calculated based on Equation (1):

T = d1 * Δt / (d1 - d2)   Equation (1), where T represents the first collision time, d1 represents the distance from the first point to the first vanishing line in the first image, d2 represents the distance from the first point to the second vanishing line in the second image, and Δt represents the time interval.

[0008] In some embodiments, selecting the first point in the first image may include: identifying an object in the first image; and selecting a point on the identified object as the first point. The object may be moving or still and may vary with the specific scenario; for example, it may be a vehicle, a pedestrian, or a static obstacle. There are various solutions for identifying such an object in an image, such as structure-from-motion based methods and optical flow based methods.

[0009] In some embodiments, the method may further include: comparing the calculated first collision time with a predetermined threshold; and generating an alarm if the calculated first collision time is less than the threshold.

[0010] According to one embodiment, a time-to-collision estimation system is provided. The system may include a processing device configured to: select a first point on an object in a first image captured by a vehicle mounted camera at a first time point; identify the first point in a second image captured by the camera at a second time point; and calculate a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

[0011] In some embodiments, the processing device may be configured to calculate the first collision time based on the time interval, a distance from the first point to a first vanishing line in the first image and a distance from the first point to a second vanishing line in the second image.

[0012] In some embodiments, the processing device may be further configured to: select a second point in the first image; identify the second point in the second image; calculate a second collision time based on the time interval and vertical positions of the second point in the first and second images; and calculate an average value of the first and second collision times as a final collision time.

[0013] In some embodiments, the processing device may be configured to calculate a plurality of collision times, and the final collision time may be an average value of the plurality of collision times.

[0014] In some embodiments, the processing device may be configured to calculate the first collision time based on Equation (1):

T = d1 * Δt / (d1 - d2)   Equation (1), where T represents the first collision time, d1 represents the distance from the first point to the first vanishing line in the first image, d2 represents the distance from the first point to the second vanishing line in the second image, and Δt represents the time interval.

[0015] In some embodiments, the processing device may be configured to identify an object in the first image and select a point on the identified object as the first point.

[0016] In some embodiments, the system may further include an output device. The processing device may be further configured to: compare the calculated first collision time with a predetermined threshold; and control the output device to generate an alarm if the calculated first collision time is less than the threshold.

[0017] According to one embodiment, a non-transitory computer readable medium, which contains a computer program for time-to-collision estimation, is provided. When the computer program is executed by a processor, it will instruct the processor to: obtain a first image captured by a camera at a first time point and a second image captured by the camera at a second time point; select a first point on an object in the first image; identify the first point in the second image; and calculate a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

[0018] According to one embodiment, a time-to-collision estimation system is provided. The system may include: means for selecting a first point on an object in a first image captured by a camera at a first time point; means for identifying the first point in a second image captured by the camera at a second time point; and means for calculating a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

BRIEF DESCRIPTION OF THE DRAWINGS

[0019] The foregoing and other features will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict only several embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings.

[0020] FIG. 1 schematically illustrates a flow chart of a time-to-collision estimation method according to one embodiment;

[0021] FIG. 2 schematically illustrates two vehicles running on a road plane;

[0022] FIG. 3 schematically illustrates projection of a point onto an image plane of a camera at different time points;

[0023] FIG. 4 schematically illustrates a distance from a point to a first vanishing line in a first image and a distance from the point to a second vanishing line in a second image; and

[0024] FIG. 5 illustrates a time-to-collision estimation system according to one embodiment.

DETAILED DESCRIPTION

[0025] In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the Figures, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and make part of this disclosure.

[0026] FIG. 1 schematically illustrates a flow chart of a time-to-collision (TTC) estimation method 100 according to one embodiment.

[0027] In S101, obtaining a first image captured by a camera at a first time point and a second image captured by the camera at a second time point.

[0028] In some embodiments, the two images may be obtained from a frame sequence captured by the camera. In some embodiments, the camera may be mounted on a first vehicle, facing either toward or away from the driving direction. In some embodiments, the two images may be two adjacent frames in a frame sequence. In some embodiments, the two images may be obtained at a predetermined time interval, for example, every 1/30 second.
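
The following is a minimal sketch of this step, assuming OpenCV and a camera exposed as video device 0 (both assumptions, not stated in the description), that grabs two adjacent frames and derives the time interval from the reported frame rate:

```python
import cv2

# A minimal sketch; the device index and the 30 fps fallback are assumptions.
cap = cv2.VideoCapture(0)

ok1, first_image = cap.read()    # frame captured at the first time point
ok2, second_image = cap.read()   # the adjacent frame, captured at the second time point

if ok1 and ok2:
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back to 30 fps if the driver reports 0
    delta_t = 1.0 / fps                      # predetermined time interval between the frames
```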

[0029] In S103, identifying an object in the first image.

[0030] In various scenarios, the object may be different. For example, when the first vehicle is running on a highway, a second vehicle in front of it may be the object; in a parking environment, an obstacle like a wall may be the object. There are various solutions for identifying such an object in an image, such as structure from motion based methods and optical flow based methods, which are well known in the art and will not be described in detail here. In some embodiments, the identified object may be enclosed by a frame on a display.
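
As one illustrative option (an assumption, not the detector prescribed by the description), an optical-flow-based segmentation can flag the image region that moves relative to the background; the sketch below uses OpenCV's Farneback dense flow and takes the largest moving blob as the object. The helper name identify_object and the flow threshold are hypothetical.

```python
import cv2
import numpy as np

def identify_object(first_image, second_image, flow_threshold=2.0):
    """Hypothetical optical-flow-based detector: return a bounding box (x, y, w, h)
    around the largest region of significant apparent motion, or None."""
    g1 = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(second_image, cv2.COLOR_BGR2GRAY)
    # Dense Farneback optical flow between the two frames.
    flow = cv2.calcOpticalFlowFarneback(g1, g2, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    mask = (np.linalg.norm(flow, axis=2) > flow_threshold).astype(np.uint8)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)  # may be drawn as a frame around the object on a display
```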

[0031] In S105, selecting a point on the identified object in the first image.

[0032] Many methods may be used to select the point. In some embodiments, the point may be selected randomly. In some embodiments, the point may be selected using a feature point identifying method such as Harris Corner Detection. In some embodiments, a plurality of points may be selected on the identified object.
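
A sketch of Harris-based point selection restricted to the detected object's bounding box; the helper name, the use of cv2.goodFeaturesToTrack with the Harris detector enabled, and the parameter values are illustrative assumptions:

```python
import cv2
import numpy as np

def select_points(first_image, box, max_points=10):
    """Select up to max_points feature points on the identified object using
    Harris corner responses; returns an (N, 1, 2) float32 array or None."""
    x, y, w, h = box
    gray = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)
    mask = np.zeros(gray.shape, dtype=np.uint8)
    mask[y:y + h, x:x + w] = 255  # restrict detection to the identified object
    return cv2.goodFeaturesToTrack(gray, maxCorners=max_points, qualityLevel=0.01,
                                   minDistance=5, mask=mask, useHarrisDetector=True)
```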

[0033] In S107, identifying the point in the second image.

[0034] The point may be identified in the second image in many ways, for example by point tracking methods such as the KLT method, which are well known in the art.
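
A sketch of the tracking step using OpenCV's pyramidal Lucas-Kanade (KLT) implementation; the helper name and window/pyramid parameters are illustrative assumptions:

```python
import cv2
import numpy as np

def track_points(first_image, second_image, points):
    """Track the selected points from the first image into the second image
    with the pyramidal Lucas-Kanade (KLT) tracker; returns matched point pairs."""
    g1 = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(second_image, cv2.COLOR_BGR2GRAY)
    new_points, status, _err = cv2.calcOpticalFlowPyrLK(
        g1, g2, points.astype(np.float32), None, winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1            # keep only successfully tracked points
    return points[good], new_points[good]
```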

[0035] The point may have different positions in the first and second images because of relative movement between the first vehicle, i.e., the camera, and the identified object from the first time point to the second time point. Hereunder, the relation between the position change of the point and the distance from the point to the camera will be illustrated with reference to the accompanying drawings.

[0036] FIG. 2 schematically illustrates a first vehicle 201 having a camera 203 mounted thereon and a second vehicle 209 in front of the first vehicle 201. The first vehicle 201 and the second vehicle 209 are running on a road plane 205. An optical axis of the camera 203 is parallel to the road plane 205. The second vehicle 209 may be projected onto an image plane 207 of the camera 203 and identified in the images captured by the camera 203. A point P may be selected on the second vehicle 209, and its position change will be used to calculate a collision time, i.e., a period from the present time point to a future time point at which the vehicle carrying the camera may collide with an object such as another vehicle or an obstacle. The collision time may also be referred to as the time-to-collision (TTC) in the art.

[0037] FIG. 3 schematically illustrates projections of the selected point P onto the image plane 207 at different time points. The point P has a first position P1 at the first time point and a second position P2 at the second time point. Based on intrinsic transformation, the first position P1 and the second position P2 may be transformed into a first image position P1' and a second image position P2' in the image coordinate system, respectively.

[0038] When an object and a camera get closer to each other, a point on the object moves downward in the camera's image plane, and vice versa. Further, the faster the object gets closer to the camera and the longer the time interval between the first and second time points is, the greater the difference between the vertical positions of the point in the image plane may be. Therefore, the change in the horizontal distance from the object to the camera may be reflected by the difference between the vertical positions of the point in the image plane. It is well known in the art that a collision time is defined as the ratio of a relative distance to a relative velocity. Therefore, a collision time may be calculated based on the vertical positions of the point in the image plane and the time interval. Hereunder, a specific example is given to illustrate calculating a collision time based on the time interval and the vertical positions.

[0039] In S109, calculating a collision time based on vertical positions of the point in the first and second images and the time interval between the first and second time points.

[0040] Referring still to FIG. 3, based on projection principles, at the first time point, assuming the camera's pitch angle is configured to be zero or close to zero, in the camera coordinate system, we have: d1 / H1 = f / Z1   Equation (1), where f represents a focal length of the camera, d1 represents a first distance from the first image position P1' to a first reference plane, Z1 represents a horizontal distance from the first position P1 to the camera's optical center, and H1 represents a distance from the first position P1 to the first reference plane.

[0041] The first reference plane is defined as a plane parallel with the road plane 205 that passes through an optical center of the camera 203 at the first time point. Therefore, the first reference plane may have a height above the road plane, which may equal the installation height of the camera.

[0042] Since the first vehicle is running on the road plane 205, the camera 203 mounted thereon may have a fixed installation height. Therefore, at the second time point, a second reference plane, defined on the same principle as described above, may almost coincide with the first reference plane. As a result, the two reference planes are illustrated as the same "reference plane" in FIG. 3 for convenience.

[0043] At the second time point, since the pitch angle may not change substantially, based on the projection principles, we have: d2 / H2 = f / Z2   Equation (2), where d2 represents a second distance from the second image position P2' to the second reference plane, Z2 represents a horizontal distance from the second position P2 to the camera's optical center, and H2 represents a distance from the second position P2 to the second reference plane.

[0044] As described above, the installation height of the camera 203 may not change substantially, so the first and second reference planes may have the same height above the road plane 205. Further, the vertical distance from the selected point to the road plane 205 may not change substantially. In conclusion, we have:

H1 = H2   Equation (3),

[0045] Based on Equations (1), (2) and (3), we obtain:

d1 / d2 = Z2 / Z1   Equation (4),

[0046] Since the predetermined time interval Δt between the first time point and the second time point is very small, the relative velocity between the camera and the selected point can be assumed to be a constant v; then we obtain:

Z2 = Z1 + v * Δt   Equation (5),

[0047] Dividing both sides of Equation (5) by Z2, we obtain:

1 = Z1 / Z2 + v * Δt / Z2   Equation (6),

[0048] As a collision time T is defined as the ratio of the relative distance to the relative velocity, i.e., T = Z2 / v, based on Equations (4) and (6), we obtain:

T = d1 * Δt / (d1 - d2)   Equation (7),

[0049] It can be concluded from Equation (7) that the collision time T depends only on the first distance d1 between the first image position P1' and the first reference plane, the second distance d2 between the second image position P2' and the second reference plane, and the time interval Δt.
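
Expressed as a small helper, a direct sketch of Equation (7); the distances d1 and d2 are the pixel distances to the vanishing line introduced in the following paragraph, and returning None when d1 equals d2 is an added safeguard, not part of the description:

```python
def time_to_collision(d1, d2, delta_t):
    """Equation (7): T = d1 * Δt / (d1 - d2).

    d1      -- distance (pixels) from the point to the vanishing line in the first image
    d2      -- distance (pixels) from the point to the vanishing line in the second image
    delta_t -- time interval between the two frames, in seconds
    Returns None when d1 == d2, i.e. when there is no measurable vertical change."""
    if d1 == d2:
        return None
    return d1 * delta_t / (d1 - d2)
```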

[0050] FIG. 4 schematically illustrates a distance from the selected point P to a first vanishing line in the first image and a distance from the selected point P to a second vanishing line in the second image. On the image plane, the distance d1 from the first image position P1' to the first reference plane may be represented as a distance from the first image position P1' to the first vanishing line. That is because the first vanishing line, defined as the set of vanishing points in the first image, coincides with the intersecting line of the first reference plane and the image plane, which is well known in the art. Similarly, the distance d2 from the second image position P2' to the second reference plane may be represented as a distance from the second image position P2' to the second vanishing line. As described above, the first and second reference planes may almost coincide with each other, as may the first and second vanishing lines. Therefore, a single "vanishing line" is illustrated in FIG. 4, representing both the first and second vanishing lines. From FIG. 4, it can be concluded that the distance d1 and the distance d2 can be calculated by: detecting the first and second vanishing lines in the first and second images, respectively; calculating the distance from the first vanishing line to the first image position P1' in the first image as the first distance d1; and calculating the distance from the second vanishing line to the second image position P2' as the second distance d2.
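
Once the vanishing line's image row is known, each distance reduces to a row difference. A sketch, under the assumption that camera roll is negligible so the vanishing line can be treated as a horizontal image row:

```python
def distance_to_vanishing_line(point, vanishing_line_row):
    """Vertical pixel distance from an image point (u, v) to the vanishing line,
    taking the vanishing line as the horizontal image row `vanishing_line_row`
    (an assumption that holds when camera roll is negligible)."""
    _u, v = point
    return v - vanishing_line_row  # positive for points below the vanishing line
```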

[0051] Detecting a vanishing line in an image may be based on detecting vanishing points in the image, which is well known in the art. In some embodiments, the vanishing points may be calculated by methods based on the Gaussian sphere. In some embodiments, the vanishing points may be detected using a polar parameter space.
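
As one hedged illustration of a polar-parameter-space approach, the sketch below finds strong straight edges (e.g., lane markings) with a probabilistic Hough transform and takes the median row of their pairwise intersections as the vanishing line; the helper name and all thresholds are assumptions, not values from the description.

```python
import itertools

import cv2
import numpy as np

def estimate_vanishing_line_row(image, max_lines=20):
    """Rough vanishing-line estimate: detect strong straight edges with a
    probabilistic Hough transform and take the median row of their pairwise
    intersections as the vanishing-line row."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=10)
    if lines is None:
        return None
    segments = [l[0].astype(float) for l in lines[:max_lines]]
    rows = []
    for (x1, y1, x2, y2), (x3, y3, x4, y4) in itertools.combinations(segments, 2):
        # Intersect the two infinite lines a*x + b*y = c through the segments.
        a1, b1 = y2 - y1, x1 - x2
        c1 = a1 * x1 + b1 * y1
        a2, b2 = y4 - y3, x3 - x4
        c2 = a2 * x3 + b2 * y3
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-6:
            continue  # near-parallel segments give no stable intersection
        rows.append((a1 * c2 - a2 * c1) / det)  # y-coordinate of the intersection
    return float(np.median(rows)) if rows else None
```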

[0052] Since the time interval is already known, the collision time may be obtained based on the above-described calculation. In some scenarios, the calculated collision time may be less than 0, which means the first vehicle and the second vehicle are moving away from each other.

[0053] In some embodiments, a plurality of points may be selected on the object. Accordingly, a plurality of collision times may be calculated. In some embodiments, a final collision time may be calculated, which may be an average value of the plurality of collision times.
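
A sketch of the averaging step over the per-point estimates; skipping entries for which Equation (7) was undefined is an added assumption:

```python
def final_collision_time(per_point_times):
    """Average the valid per-point collision times into a final estimate;
    None entries (undefined per Equation (7)) are skipped, which is an
    assumption not spelled out in the description."""
    valid = [t for t in per_point_times if t is not None]
    return sum(valid) / len(valid) if valid else None
```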

[0054] In some embodiments, the calculated collision time may be presented to a driver.

[0055] In S111, comparing the calculated collision time with a predetermined threshold and generating an alarm if the calculated collision time is less than the threshold. In some embodiments, if the calculated collision time is less than the predetermined threshold, which means an object is within a dangerous distance and a collision will take place soon, an alarm may be generated to warn the driver.

[0056] From the above, the collision time can be conveniently estimated. No extrinsic parameter of the camera is required, thereby improving the efficiency.

[0057] FIG. 5 schematically illustrates a time-to-collision estimation system 400 according to one embodiment. The system 400 may include a camera 401, a processing device 403, a memory device 405 and a sound alert generator 407. The system 400 may be mounted on a vehicle.

[0058] The camera 401 is configured to capture images. The processing device 403 may be configured to: identify an object in a first image captured by the camera at a first time point; select a point on the object in the first image; identify the point in a second image captured by the camera at a second time point; calculate a collision time based on vertical positions of the point in the first and second images and a time interval between the first and second time points; and compare the calculated collision time with a predetermined threshold. Detailed configurations of the processing device 403 may be obtained by referring to the above descriptions of the method 100 and will not be described in detail here. The memory device 405 may store an operating system and program instructions.
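
A minimal end-to-end sketch tying the earlier fragments together; the helpers identify_object, select_points, track_points, estimate_vanishing_line_row, distance_to_vanishing_line, time_to_collision and final_collision_time are the hypothetical functions sketched above, not components named by the patent, and the audible alert is reduced to a print statement. The 10-second default threshold follows paragraph [0059].

```python
def estimate_ttc(first_image, second_image, delta_t, threshold_s=10.0):
    """End-to-end sketch of method 100 using the hypothetical helpers above;
    returns (final collision time or None, whether an alarm should be raised)."""
    box = identify_object(first_image, second_image)            # S103: identify the object
    if box is None:
        return None, False
    pts1 = select_points(first_image, box)                      # S105: select points on it
    if pts1 is None:
        return None, False
    pts1, pts2 = track_points(first_image, second_image, pts1)  # S107: find them again
    row1 = estimate_vanishing_line_row(first_image)
    row2 = estimate_vanishing_line_row(second_image)
    if row1 is None or row2 is None:
        return None, False
    times = []
    for p1, p2 in zip(pts1.reshape(-1, 2), pts2.reshape(-1, 2)):
        d1 = distance_to_vanishing_line(p1, row1)
        d2 = distance_to_vanishing_line(p2, row2)
        times.append(time_to_collision(d1, d2, delta_t))        # S109: per-point TTC
    ttc = final_collision_time(times)
    # S111: per paragraph [0052], a non-positive result is treated as "moving apart".
    alarm = ttc is not None and 0 < ttc < threshold_s
    if alarm:
        print("Collision warning: estimated TTC = %.2f s" % ttc)  # stand-in for the sound alert
    return ttc, alarm
```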

[0059] When the calculated collision time is less than the predetermined threshold, the processing device 403 may send an instruction to control the sound alert generator 407 to generate a sound alert to warn the driver. In some embodiments, the predetermined threshold may be 10 seconds or less.

[0060] According to one embodiment, a time-to-collision estimation system is provided. The system may include: means for selecting a first point on an object in a first image captured by a camera at a first time point; means for identifying the first point in a second image captured by the camera at a second time point; and means for calculating a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

[0061] According to one embodiment, a non-transitory computer readable medium, which contains a computer program for time-to-collision estimation, is provided. When the computer program is executed by a processor, it will instruct the processor to: obtain a first image captured by a camera at a first time point and a second image captured by the camera at a second time point; select a first point on an object in the first image; identify the first point in the second image; and calculate a first collision time based on a time interval between the first and second time points, a vertical position of the first point in the first image and a vertical position of the first point in the second image.

[0062] There is little distinction left between hardware and software implementations of aspects of systems; the use of hardware or software is generally a design choice representing cost vs. efficiency tradeoffs. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware.

[0063] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.