

Title:
DRIVER ASSISTANCE SYSTEM FOR A VEHICLE AS WELL AS METHOD FOR ASSISTING A DRIVER OF A VEHICLE
Document Type and Number:
WIPO Patent Application WO/2016/087900
Kind Code:
A1
Abstract:
The invention relates to a driver assistance system for a vehicle (10), the driver assistance system comprising a sensor system (12) for observing at least a portion of the vehicle's surroundings (14), the sensor system (12) comprising: - at least one camera (16, 18) configured to take images of at least the portion of the surroundings (14) and provide image data characterizing the images; - at least one sensor (20) being different from the camera (16, 18) and configured to capture at least one parameter characterizing at least the portion of the surroundings (14), the sensor (20) being configured to provide sensor data characterizing the parameter; and - a data fusion unit configured to merge the image data and the sensor data thereby creating virtual three-dimensional images of at least the portion of the surroundings (14).

Inventors:
KAUSCH CARSTEN (CN)
Application Number:
PCT/IB2014/066619
Publication Date:
June 09, 2016
Filing Date:
December 05, 2014
Assignee:
AUDI AG (DE)
International Classes:
G01S7/06; G01S13/86; G01S13/93
Foreign References:
US 2007/0106440 A1 (2007-05-10)
US 2012/0154592 A1 (2012-06-21)
Other References:
ANONYMOUS: "Fujitsu Develops World's First 3D Image Synthesis Technology to Display Vehicle Exterior without Distortion - Fujitsu Global", 9 October 2013 (2013-10-09), XP055197255, Retrieved from the Internet [retrieved on 20150622]
AMIR ILIAIFAR: "cars/lidar-lasers-and-beefed-up-computers-the-intricate-anatomy-of-an-autonomous-vehicle", 6 February 2013 (2013-02-06), XP055197393, Retrieved from the Internet [retrieved on 20150622]
SEBASTIAN THRUN ET AL: "Stanley: The robot that won the DARPA Grand Challenge", JOURNAL OF FIELD ROBOTICS, JOHN WILEY & SONS, INC, US, vol. 23, no. 9, 1 September 2006 (2006-09-01), pages 661 - 692, XP009156447, ISSN: 1556-4959, [retrieved on 20060925], DOI: 10.1002/rob.20147
Claims:
What is claimed is:

1. A driver assistance system for a vehicle (10), the driver assistance system comprising a sensor system (12) for observing at least a portion of the vehicle's surroundings (14), the sensor system (12) comprising:

- at least one camera (16, 18) configured to take images of at least the portion of the surroundings (14) and provide image data characterizing the images;

- at least one sensor (20) being different from the camera (16, 18) and configured to capture at least one parameter characterizing at least the portion of the surroundings (14), the sensor (20) being configured to provide sensor data characterizing the parameter; and

- a data fusion unit configured to merge the image data and the sensor data thereby creating virtual three-dimensional images of at least the portion of the surroundings (14).

2. The driver assistance system according to claim 1,

wherein the driver assistance system comprises at least one display unit (32) configured to present the three-dimensional images to the driver of the vehicle (10).

3. The driver assistance system according to claim 2,

wherein the display unit (32) is configured to present objects (34, 36, 38, 40) in different colors on the basis of at least one predetermined criterion, the objects (34, 36, 38, 40) being part of the three-dimensional images and being different from the vehicle (10), arranged in the vehicle's surroundings (14) and detected by the sensor system (12).

4. The driver assistance system according to any one of the preceding claims,

wherein the camera (16) is configured as an ultra-high definition camera or a light fidelity camera.

5. The driver assistance system according to any one of the preceding claims,

wherein the sensor is arranged above a roof (24) of the vehicle.

6. The driver assistance system according to any one of the preceding claims,

wherein the sensor and/or the camera (16, 18) is mounted on a roof (24) of the vehicle (10).

7. The driver assistance system according to any one of the preceding claims,

wherein the sensor (20) is configured as an electro-magnetic sensor or an ultrasonic sensor.

8. The driver assistance system according to any one of the preceding claims,

wherein the sensor system (12) comprises at least one sensor band (26) mounted on an outside of a body (28) of the vehicle (10), the sensor band (26) extending across at least a major portion of the length of the vehicle (10).

9. The driver assistance system according to any one of the preceding claims,

wherein the sensor system (12) comprises:

- at least one sensor box (22) arranged above a roof (24) of the vehicle (10); and

- a plurality of sensors arranged in the sensor box (22).

10. A method for assisting a driver in driving a vehicle (10), the method comprising:

- observing at least a portion of the vehicle's surroundings (14) by means of a sensor system (12) comprising at least one camera (16, 18), at least one sensor (20) being different from the camera (16, 18), and a data fusion unit;

- taking images of at least the portion of the surroundings (14) by means of the camera (16, 18);

- providing image data characterizing the images by the camera (16, 18);

- capturing at least one parameter characterizing at least the portion of the surroundings (14) by means of the sensor (20);

- providing sensor data characterizing the parameter by the sensor (20); and

- merging the image data and the sensor data by means of the data fusion unit thereby creating virtual three-dimensional images of at least the portion of the surroundings (14).

Description:
Driver Assistance System for a Vehicle as well as Method for Assisting a Driver of a Vehicle

Field of the Invention

The invention relates to a driver assistance system for a vehicle as well as a method for assisting a driver in driving a vehicle.

Background Art

US 2007/0106440 A1 shows a vehicle parking assisting system comprising camera means for taking picture images outside a vehicle. The vehicle parking assisting system further comprises image generating means for generating a bird's-eye view image by subjecting the picture images to a bird's-eye view conversion. The vehicle parking assisting system further comprises motion detecting means for detecting a motion of the vehicle and combined image generating means for generating a combined bird's-eye view image by combining, based on the motion of the vehicle, a past bird's-eye view image generated previously by the image generating means or a past combined bird's-eye view image generated previously with a present bird's-eye view image generated at present. The vehicle parking assisting system further comprises obstacle detecting means for detecting an obstacle in a fan-shaped detection area around the vehicle, and display means for displaying, on the combined bird's-eye view image, both a present detection point and a past detection point for the obstacle in an overlapping manner. The display means displays the present detection point as a series of marks extending from a predetermined position in the fan-shaped detection area in an arcuate shape along the detection area of the obstacle detecting means, and the past detection point as a mark located at the predetermined position in the fan-shaped detection area.

Moreover, US 2012/0154592 A1 shows an image-processing system operative to process image data obtained by capturing images outside a periphery of a vehicle, the image-processing system comprising a plurality of image-capturing units that is fixed to the vehicle and that generates image-data items by capturing images outside the periphery of the vehicle. The image-processing system further comprises a bird's-eye-view-image drawing unit configured to generate a bird's-eye-view image by determining a viewpoint above the vehicle for each of the image-data items generated by the image-capturing units based on the image-data item so that end portions of real spaces corresponding to two adjacent bird's-eye-view images overlap each other.

Summary of the Invention

Technical problem to be solved

It is an object of the present invention to provide a driver assistance system as well as a method by means of which a driver of a vehicle can be assisted in driving the vehicle particularly advantageously.

This object is achieved by a driver assistance system having the features of patent claim 1 as well as a method having the features of patent claim 10. Advantageous embodiments with expedient developments of the invention are indicated in the other patent claims.

Technical solution

A first aspect of the invention relates to a driver assistance system for a vehicle, the driver assistance system comprising a sensor system for observing at least a portion of the vehicle's surroundings.

The sensor system comprises at least one camera configured to take images of at least the portion of the surroundings. Moreover, the camera is configured to provide image data characterizing the images. The sensor system further comprises at least one sensor being different from the camera. The sensor is configured to capture at least one parameter characterizing at least the portion of the surroundings. For example, objects arranged in the portion of the surroundings can be detected by the sensor and characterized by the parameter, wherein the sensor is configured to provide sensor data characterizing the parameter. In other words, said sensor data represent the parameter and, thus, objects detected by the sensor, for example.

Moreover, the sensor system comprises a data fusion unit configured to merge the image data and the sensor data thereby creating virtual three-dimensional images of at least the portion of the surroundings. For example, the image data and the sensor data are transmitted to the data fusion unit which receives the image data and the sensor data. The data received by the data fusion unit are combined in such a way that three-dimensional images of at least the portion of the surroundings are created on the basis of the received data. By creating three-dimensional images of the surroundings of the vehicle the driver of the vehicle can be assisted in driving particularly effectively. For example, the driver can be assisted in driving through heavy traffic so that the risk of a collision between the vehicle and other objects, in particular other traffic participants, can be kept particularly low. Moreover, the driver can be assisted in parking the vehicle so that the driver can park the vehicle in particularly narrow parking places without colliding with objects bounding the parking places. Moreover, the driver assistance system according to the present invention can help the driver to keep calm and stay in control in chaotic traffic situations.
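The merging step performed by the data fusion unit can be sketched as follows. This is a minimal illustration only, not part of the application: the function name, the per-pixel data layout and the simple pinhole back-projection model are all assumptions made for the example.

```python
# Illustrative sketch of a data fusion unit: camera pixels (color) are
# combined with range-sensor measurements (depth) into colored 3-D points.
# All names and the pinhole model are assumptions for illustration only.

def fuse(image_data, sensor_depths, focal_length=1.0):
    """Merge image data with per-pixel depth into a colored 3-D point cloud.

    image_data    -- dict mapping (u, v) pixel coordinates to an RGB tuple
    sensor_depths -- dict mapping the same (u, v) coordinates to a depth in metres
    """
    points = []
    for (u, v), color in image_data.items():
        z = sensor_depths.get((u, v))
        if z is None:
            continue  # no range measurement available for this pixel
        # Back-project the pixel through a simple pinhole camera model.
        x = u * z / focal_length
        y = v * z / focal_length
        points.append(((x, y, z), color))
    return points

# Two pixels with known depths yield two colored 3-D points.
cloud = fuse({(0, 0): (255, 0, 0), (1, 0): (0, 255, 0)},
             {(0, 0): 2.0, (1, 0): 4.0})
```

A renderer could then draw such a point cloud as the virtual three-dimensional image presented to the driver.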

In an advantageous embodiment of the invention the driver assistance system comprises at least one display unit configured to present the three-dimensional images to the driver of the vehicle. Thus, the driver can visually perceive at least the portion of the surroundings in a three-dimensional manner by means of the display unit showing the three-dimensional images so that current traffic situations can be visualized and the driver's view can be extended by means of the driver assistance system. Thus, the risk of collision between the vehicle and other traffic participants can be kept particularly low. For example, the driver can drive the vehicle through very narrow gaps bounded by other traffic participants without colliding with the other traffic participants on the basis of the three-dimensional images shown by the display unit.

In a further advantageous embodiment of the invention the display unit is configured to present objects in different colors on the basis of at least one predetermined criterion. Said objects are part of the three-dimensional images displayed by the display unit. Moreover, said objects are different from the vehicle, arranged in the vehicle's surroundings and detected by the sensor system. For example, said criterion can be a distance between the detected objects and the vehicle and/or a relative speed between the objects and the vehicle and/or respective directions of movement of the vehicle and the objects. For example, objects approaching the vehicle are shown in red since the vehicle can potentially collide with said objects. Moreover, for example, objects moving away from the vehicle and/or objects with a constant distance from the vehicle are shown in grey since the risk of a collision between the vehicle and these objects is particularly low. In other words, for example, the display unit is configured to show objects in the surroundings of the vehicle in different colors on the basis of a determined hazardous situation.
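The color criterion described above can be sketched in a few lines. The concrete threshold value and the function name are illustrative assumptions; the application itself leaves the criterion open (distance and/or relative speed and/or direction of movement).

```python
# Illustrative sketch of the color criterion: an object that is approaching
# the own vehicle and is already close is shown in red as a potential
# collision object; all other objects are shown in grey.
# The 10 m threshold is an assumption made for this example.

def object_color(distance_m, relative_speed_mps, threshold_m=10.0):
    """Pick a display color for a detected object.

    distance_m         -- current distance between object and vehicle
    relative_speed_mps -- rate of change of that distance;
                          negative means the object is approaching
    """
    if relative_speed_mps < 0 and distance_m < threshold_m:
        return "red"
    return "grey"
```

For instance, an object 4 m away and closing in would be colored red, while an object moving away, or one still 20 m distant, would be colored grey.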

In a further advantageous embodiment of the invention the camera is configured as an ultra-high definition camera (UHD camera) or a light fidelity camera (LiFi camera). Thus, at least the portion of the surroundings can be detected and observed particularly precisely since the camera has a high resolution. A light fidelity (LiFi) camera uses an optical data transmission system also referred to as visible light communication.

In a further advantageous embodiment of the invention the sensor is arranged above a roof of the vehicle so that a particularly large portion of the surroundings can be detected by the sensor.

Alternatively or additionally, the sensor and/or the camera is mounted on a roof of the vehicle so that the surroundings of the vehicle can be detected particularly effectively.

In a further advantageous embodiment of the invention the sensor is configured as an electro-magnetic sensor or an ultrasonic sensor. Thereby, objects being arranged very closely to the vehicle can be detected precisely. Moreover, a distance between such an object and the vehicle can be detected particularly precisely.

Preferably, the sensor system is configured as a near-range sensor system having a detection range of 15 meters at the most. Thereby, the near surroundings of the vehicle can be observed particularly precisely so that the risk of collisions between the vehicle and other objects can be kept particularly low. Moreover, a near-range surrounding overview assistant can be realized to present a three-dimensional virtual overview of the near-range surroundings of the vehicle to the driver. Thus, overtaking actions and/or parking actions and/or low range steering can be assisted by the driver assistance system.
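The near-range restriction described above amounts to keeping only detections within the stated 15 m range; a minimal sketch follows, in which the function name and the detection data layout are assumptions for illustration.

```python
# Illustrative sketch of the near-range restriction: only detections inside
# the 15 m range stated in the description are kept for the virtual overview.

MAX_RANGE_M = 15.0  # detection range of the near-range sensor system

def near_range(detections):
    """Keep only detections inside the near-range sensor coverage.

    detections -- iterable of (object_id, distance_m) pairs
    """
    return [(obj, d) for obj, d in detections if d <= MAX_RANGE_M]

# A distant truck is dropped; the cyclist and pedestrian remain in view.
visible = near_range([("cyclist", 3.2), ("truck", 42.0), ("pedestrian", 14.9)])
```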

In a further embodiment of the invention the sensor system comprises at least one sensor band mounted on an outside of a body of the vehicle, the sensor band extending across at least a major portion of the length of the vehicle, said length extending in the longitudinal direction of the vehicle. Thereby, for example, the surroundings of the vehicle can be detected over 360° around the vehicle since, preferably, the sensor band extends completely around the body, i.e. the vehicle.

In a further advantageous embodiment of the invention the sensor system comprises at least one sensor box arranged above a roof of the vehicle. Moreover, the sensor system comprises a plurality of sensors arranged in the sensor box so that the surroundings of the vehicle can be observed particularly precisely.

The invention also relates to a method for assisting a driver in driving a vehicle, the method comprising observing at least a portion of the vehicle's surroundings by means of a sensor system comprising at least one camera, at least one sensor being different from the camera, and a data fusion unit. The method further comprises taking images of at least the portion of the surroundings by means of the camera. Image data characterizing the images are provided by the camera. At least one parameter characterizing at least the portion of the surroundings is captured by means of the sensor. Moreover, sensor data characterizing the parameter are provided by the sensor. Furthermore, the image data and the sensor data are merged by means of the data fusion unit thereby creating virtual three-dimensional images of at least the portion of the surroundings. Advantageous embodiments and advantages of the driver assistance system according to the present invention are to be regarded as advantageous embodiments and advantages of the method according to the present invention and vice versa.

Further advantages, features and details of the invention derive from the following description of a preferred embodiment as well as from the drawing. The features and feature combinations previously mentioned in the description as well as the features and feature combinations mentioned in the following description of the figures and/or shown in the figures alone can be employed not only in the respective indicated combination but also in any other combination or taken alone without leaving the scope of the invention.

Brief Description of the Drawings

Fig. 1 a schematic side view of a vehicle in the form of a passenger vehicle comprising a driver assistance system for assisting a driver of the vehicle in driving; and

Fig. 2 a schematic front view of a display unit of the driver assistance system, the display unit showing a three-dimensional image of at least a portion of the surroundings of the vehicle.

Detailed Description of Embodiments

Fig. 1 shows a vehicle 10 in the form of a passenger vehicle. The vehicle 10 comprises a driver assistance system having a sensor system 12 for observing at least a portion of the surroundings 14 of the vehicle 10. The sensor system 12 comprises a plurality of cameras 16 and 18. For example, the camera 16 is configured as a LiFi camera or an ultra-high definition camera (UHD camera), wherein the cameras 18 are configured as conventional cameras. Moreover, the sensor system 12 comprises a plurality of sensors 20 which are configured as ultrasonic sensors. Moreover, the sensor system 12 comprises a plurality of sensors which are arranged in a sensor box 22 arranged above a roof 24 of the vehicle 10, the sensor box 22 being mounted on the roof 24. Moreover, the sensor system 12 comprises a sensor band 26 which extends around at least a major portion of the circumference of the vehicle 10. In the present case the sensor band 26 extends completely around the body 28 of the vehicle 10 and, thus, the vehicle 10 itself, wherein the sensor band 26 is mounted on an outside 30 of the body 28. The sensor band 26 comprises a plurality of sensors which are configured as, for example, electro-magnetic sensors and/or cameras and/or ultrasonic sensors. Said sensors of the sensor system 12 are different from the cameras 16 and 18. The cameras 16 and 18 are configured to take images of at least a portion of the surroundings 14, wherein the cameras 16 and 18 provide image data characterizing the taken images. Said sensors being different from the cameras 16 and 18 are configured to capture at least one parameter characterizing at least the portion of the surroundings 14. For example, said sensors are configured to detect objects arranged in the surroundings 14 and respective distances between the detected objects and the vehicle 10. Moreover, said sensors are configured to provide sensor data characterizing said parameter which in turn characterizes the detected objects and distances.
Moreover, the sensor system 12 comprises a data fusion unit which receives the image data and the sensor data. The data fusion unit is configured to merge the received image data and the received sensor data thereby creating virtual three-dimensional images of at least the portion of the surroundings 14. As can be seen from Fig. 2, the sensor system 12 comprises at least one display unit 32 which is arranged in the interior of the vehicle 10. The display unit is configured to present the created three-dimensional images to the driver of the vehicle 10, wherein one of said three-dimensional images can be seen in Fig. 2. Said three-dimensional image presented by the display unit 32 comprises or shows the vehicle 10 itself as well as other traffic participants such as other vehicles 34 and 36, a cyclist 38 and pedestrians 40, each in a three-dimensional manner. Thus, the driver of the vehicle 10 can visually perceive the orientation of the vehicle 10 in relation to said other traffic participants. Thereby, the driver can drive the vehicle 10 in a particularly safe manner without colliding with the other traffic participants.

The sensor system 12 has a detection range of 15 meters at the most so that a near-range overview of the surroundings 14 can be presented to the driver by the display unit 32. Moreover, the display unit 32 and, thus, the sensor system 12 are configured to present objects and, thus, said traffic participants detected by the sensor system 12 in different colors on the basis of at least one predetermined criterion. In the present case, the criterion comprises a distance between the vehicle 10 and the respective other traffic participants as well as a direction of movement of the respective other traffic participant in relation to the own vehicle 10. For example, the vehicle 36 and the pedestrians 40 are shown in grey color in the three-dimensional image since the vehicle 36 and the pedestrians 40 do not approach the own vehicle 10. For example, the vehicle 36 moves away from the vehicle 10 so that the distance between the vehicle 10 and the vehicle 36 increases. Moreover, for example, the pedestrians 40 stand still so that respective distances between the pedestrians 40 and the vehicle 10 are constant or increase. Thus, the risk of collisions between the vehicle 10 and the vehicle 36 and the pedestrians 40 is below a predetermined threshold value so that the vehicle 36 and the pedestrians 40 are shown in grey in the three-dimensional image.

However, the vehicle 34 and the cyclist 38 are, for example, shown in red color since a respective distance between the vehicle 10 and the vehicle 34 and between the vehicle 10 and the cyclist 38 is below a threshold value and the vehicle 34 and the cyclist 38 approach the vehicle 10. Thus, the vehicle 34 and the cyclist 38 are potential collision objects which can potentially collide with the vehicle 10. This means the risk of a collision between the vehicle 10 and the vehicle 34 and a risk of a collision between the vehicle 10 and the cyclist 38 exceed the threshold value so that the vehicle 34 and the cyclist 38 are shown in red in the three-dimensional image.

By presenting the three-dimensional images to the driver the driver can be assisted in driving in heavy traffic so that the risk of collisions between the vehicle 10 and other traffic participants can be kept particularly low. In other words, the driver can avoid collisions between the vehicle 10 and other traffic participants on the basis of the presented three-dimensional images. Additionally or alternatively, the driver can be assisted in parking the vehicle in narrow parking places without colliding with objects bounding the parking places.