


Title:
ON-VEHICLE NIGHT VISION DEVICE AND ON-VEHICLE NIGHT VISION METHOD
Document Type and Number:
WIPO Patent Application WO/2008/078187
Kind Code:
A2
Abstract:
An on-vehicle night vision device includes: image capture means (2) for capturing an image of the vicinity of a vehicle in the dark; in-vicinity-of-vehicle object detection means (24) for detecting an object in the vicinity of the vehicle from the image captured by the image capture means (2); display image generation means (12, 23) for generating a display image based on the captured image; image display means (4) for displaying the display image generated by the display image generation means (12, 23); image capture condition dividing means (25) for dividing image capture conditions of the image capture means (2) into first conditions for in-vicinity-of-vehicle object detection and second conditions for an image to be displayed; and image distribution means (22) for classifying the captured image as one of the image for in-vicinity-of-vehicle object detection and the image to be displayed.

Inventors:
MURANO TAKAHIKO (JP)
Application Number:
PCT/IB2007/004274
Publication Date:
July 03, 2008
Filing Date:
December 19, 2007
Assignee:
TOYOTA MOTOR CO LTD (JP)
MURANO TAKAHIKO (JP)
International Classes:
G02B23/12
Foreign References:
JP 2004-364112 A (2004-12-24)
Claims:
CLAIMS

1. An on-vehicle night vision device characterized by comprising: image capture means for capturing an image of a vicinity of a vehicle in the dark; in-vicinity-of-vehicle object detection means for detecting an object in the vicinity of the vehicle from the image captured by the image capture means; display image generation means for generating a display image based on the captured image; image display means for displaying the display image generated by the display image generation means; image capture condition dividing means for dividing image capture conditions of the image capture means into first conditions for in-vicinity-of-vehicle object detection and second conditions for an image to be displayed; and image distribution means for classifying the captured image as one of an image for in-vicinity-of-vehicle object detection and the image to be displayed.

2. The on-vehicle night vision device according to claim 1, wherein the display image generation means performs a delay process to delay a timing of displaying the image which was classified as the image to be displayed by a time equivalent to a time for the in-vicinity-of-vehicle object detection means to detect an object in the vicinity of the vehicle, and generates the display image by indicating a position of the object in the vicinity of the vehicle detected by the in-vicinity-of-vehicle object detection means in the image to be displayed subjected to the delay process.

3. The on-vehicle night vision device according to claim 1 or 2, characterized by further comprising: identification signal imparting means for imparting an identification signal to the captured image in accordance with the image capture conditions, wherein the image distribution means classifies the captured image as one of the image for in-vicinity-of-vehicle object detection and the image to be displayed according to the identification signal imparted to the captured image.

4. The on-vehicle night vision device according to claim 1 or 2, wherein the image distribution means classifies the captured image as one of the image for in-vicinity-of-vehicle object detection and the image to be displayed by image processing of the captured image.

5. The on-vehicle night vision device according to any one of claims 1 to 4, characterized by further comprising: irradiation means for irradiating the vicinity of the vehicle; and irradiation condition dividing means for dividing irradiation conditions of the irradiation means into first conditions for in-vicinity-of-vehicle object detection and second conditions for the image to be displayed.

6. An on-vehicle night vision method, characterized by comprising: dividing image capture conditions under which to capture an image of a vicinity of a vehicle in the dark, into first conditions for in-vicinity-of-vehicle object detection and second conditions for an image to be displayed; classifying the image captured in accordance with the image capture conditions, as one of an image for in-vicinity-of-vehicle object detection and the image to be displayed; detecting an object in the vicinity of the vehicle from the image for in-vicinity-of-vehicle object detection; generating a display image such that a position of the detected object in the vicinity of the vehicle is indicated in the image to be displayed; and displaying the generated display image.

7. An on-vehicle night vision device comprising: an image capture section for capturing an image of a vicinity of a vehicle in the dark; an in-vicinity-of-vehicle object detection section for detecting an object in the vicinity of the vehicle from the image captured by the image capture section; a display image generation section for generating a display image based on the captured image; an image display section for displaying the display image generated by the display image generation section; an image capture condition dividing section for dividing image capture conditions of the image capture section into first conditions for in-vicinity-of-vehicle object detection and second conditions for an image to be displayed; and an image distribution section for classifying the captured image as one of an image for in-vicinity-of-vehicle object detection and the image to be displayed.

Description:

ON-VEHICLE NIGHT VISION DEVICE AND ON-VEHICLE NIGHT VISION METHOD

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0001] The invention relates to an on-vehicle night vision device and on-vehicle night vision method for capturing an image of the vicinity of a vehicle in the dark.

2. Description of the Related Art

[0002] Night vision devices are generally known that capture and display an image of the vicinity of a vehicle in the dark, including an image of the view ahead of the vehicle, to ensure safety while the vehicle is running at night. The night vision devices detect an object, such as an oncoming vehicle, a pedestrian, or a road sign, from the captured image of the vicinity of the vehicle, and display the captured image and the detected object on a display device.

[0003] Such image capture devices include a vicinity display device for a vehicle disclosed in Japanese Patent Application Publication No. 2004-364112 (JP-A-2004-364112). This vicinity display device for a vehicle displays a visual image from an infrared camera on a display device, and detects a pedestrian or the like in the visual image. The detected pedestrian or the like is enhanced and displayed on the display device.

[0004] In the vicinity display device for a vehicle disclosed in the above publication, an object such as a pedestrian is detected from the captured image and displayed along with the captured image. However, an image that is easily viewable to the driver or another person is in some cases not well suited to detecting an object, while an image well suited to detecting an object is in some cases not easily viewable to the driver or another person. Therefore, when the image capture device captures an image that is easily viewable to the driver or another person, it may be difficult to detect an object such as a pedestrian in that image and to display the object accurately.

SUMMARY OF THE INVENTION

[0005] In view of the foregoing, an object of the invention is to provide an on-vehicle night vision device and an on-vehicle night vision method that can present an image captured in the dark in a manner easily viewable to the driver, and that can precisely detect an object such as a pedestrian in the image.

[0006] In order to achieve the above object, a first aspect of the invention provides an on-vehicle night vision device characterized by including: image capture means for capturing an image of a vicinity of a vehicle in the dark; in-vicinity-of-vehicle object detection means for detecting an object in the vicinity of the vehicle from the image captured by the image capture means; display image generation means for generating a display image based on the captured image; image display means for displaying the display image generated by the display image generation means; image capture condition dividing means for dividing image capture conditions of the image capture means into first conditions for in-vicinity-of-vehicle object detection and second conditions for an image to be displayed; and image distribution means for classifying the captured image as one of an image for in-vicinity-of-vehicle object detection and the image to be displayed.

[0007] In the on-vehicle night vision device according to the invention, the image capture conditions of the image capture means are divided into first conditions for in-vicinity-of-vehicle object detection and second conditions for an image to be displayed, and images captured under these conditions are classified by the image distribution means. Therefore, conditions suited to the specific purpose can be set when capturing either an image for in-vicinity-of-vehicle object detection or an image to be displayed. Thus, it is possible to present an image captured in the dark in a manner easily viewable to the driver, and to precisely detect an object such as a pedestrian in the image.
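Purely as an illustrative sketch (not part of the claimed subject matter), dividing the capture conditions and classifying the resulting images can be expressed as follows; the frame schedule, the ratio of frame types, and all names are assumptions made for this example only:

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Frame:
        pixels: object        # placeholder for the captured image data
        condition: str        # "detection" or "display" capture condition

    def capture_sequence(n_frames: int) -> List[Frame]:
        # Hypothetical schedule: every fourth frame is captured under the
        # detection-oriented conditions, the rest under display-oriented ones.
        return [Frame(None, "detection" if i % 4 == 3 else "display")
                for i in range(n_frames)]

    def distribute(frames: List[Frame]) -> Tuple[List[Frame], List[Frame]]:
        # Classify each captured frame as a detection input or a display input.
        detection = [f for f in frames if f.condition == "detection"]
        display = [f for f in frames if f.condition == "display"]
        return detection, display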

[0008] In the on-vehicle night vision device according to the above first aspect, in addition, the display image generation means may perform a delay process to delay a timing of displaying the image that was classified as the image to be displayed by a time equivalent to a time for the in-vicinity-of-vehicle object detection means to detect an object in the vicinity of the vehicle, and may generate the display image by indicating a position of the object in the vicinity of the vehicle detected by the in-vicinity-of-vehicle object detection means in the image to be displayed subjected to the delay process.

[0009] In addition, the on-vehicle night vision device according to the above first aspect may further include identification signal imparting means for imparting an identification signal to the captured image in accordance with the image capture conditions, and the image distribution means may classify the captured image as one of the image for in-vicinity-of-vehicle object detection and the image to be displayed according to the identification signal imparted to the captured image.

[0010] Since an identification signal has been imparted to the captured image in accordance with the image capture conditions in this way, it is easy for the image distribution means to classify the image as one of the image for in-vicinity-of-vehicle object detection and the image to be displayed.

[0011] In the on-vehicle night vision device according to the above first aspect, in addition, the image distribution means may classify the captured image as one of the image for in-vicinity-of-vehicle object detection and the image to be displayed by image processing of the captured image.

[0012] In addition, the on-vehicle night vision device according to the above first aspect may further include: irradiation means for irradiating the vicinity of the vehicle; and irradiation condition dividing means for dividing irradiation conditions of the irradiation means into first conditions for in-vicinity-of-vehicle object detection and second conditions for the image to be displayed.

[0013] Since the irradiation conditions of the irradiation means are divided in this way as well, it is possible to capture images that are more suitable both for in-vicinity-of-vehicle object detection and for display.

[0014] In order to achieve the above object, a second aspect of the invention provides an on-vehicle night vision method characterized by including: dividing image capture conditions under which to capture an image of a vicinity of a vehicle in the dark, into first conditions for in-vicinity-of-vehicle object detection and second conditions for an image to be displayed; classifying the image captured in accordance with the image capture conditions, as one of an image for in-vicinity-of-vehicle object detection and the image to be displayed; detecting an object in the vicinity of the vehicle from the image for in-vicinity-of-vehicle object detection; generating a display image such that a position of the detected object in the vicinity of the vehicle is indicated in the image to be displayed; and displaying the generated display image.

[0015] With the on-vehicle night vision device and method according to the invention, it is possible to present an image captured in darkness in a manner easily viewable to the driver, and to precisely detect an object such as a pedestrian in the image.

BRIEF DESCRIPTION OF THE DRAWINGS

[0016] The foregoing and/or further objects, features and advantages of the invention will become more apparent from the following description of a preferred embodiment with reference to the accompanying drawings, in which like numerals are used to represent like elements and wherein:

FIG. 1 is a block diagram showing the configuration of a night vision system in accordance with an embodiment of the invention;

FIG. 2 is a flowchart showing the procedure of a process performed by a pedestrian detection section;

FIG. 3 is a flowchart showing the procedure of a process following that of FIG. 2; and

FIG. 4 is a diagram showing the contents of a visual image signal.

DETAILED DESCRIPTION OF EMBODIMENTS

[0017] A description will hereinafter be made of an embodiment of the invention with reference to the accompanying drawings. In the drawings, the same elements are given the same reference numerals, and repetitive descriptions are omitted. For convenience of illustration, the dimensions and proportions in the drawings do not necessarily coincide with those in the description.

[0018] FIG. 1 is a block diagram showing the configuration of a night vision system as the on-vehicle night vision device in accordance with an embodiment of the invention. As shown in FIG. 1, the night vision system in accordance with this embodiment includes a night vision ECU 1. The night vision ECU 1 is a computer for electronically controlling an in-vehicle device, and is composed of a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a main timer, an input/output interface, and so forth. The night vision ECU 1 includes a pedestrian detection section 11, and a drawing section 12 as the display image generation means.

[0019] The pedestrian detection section 11 includes an A/D conversion section 21, a visual image distribution processing section 22 as the image distribution means, a visual image delay processing section 23 as the display image generation means, a pedestrian detection processing section 24 as the in-vicinity-of-vehicle object detection means, and a control section 25. The drawing section 12 includes a pedestrian detection frame activation indicator drawing section 31. Here, the control section 25 constitutes the image capture condition dividing means and the irradiation condition dividing means. To the night vision ECU 1 are connected a near infrared camera 2 as the image capture means, a near infrared emission device 3 as the irradiation means, and a monitor 4 as the image display means.
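Purely for orientation, the functional blocks listed above could be organized as in the following sketch; the class names and structure are illustrative only and do not appear in the patent:

    class ControlSection: pass               # control section 25 (condition dividing means)
    class VisualImageDistribution: pass      # visual image distribution processing section 22
    class VisualImageDelay: pass             # visual image delay processing section 23
    class PedestrianDetection: pass          # pedestrian detection processing section 24
    class DrawingSection: pass               # drawing section 12 / detection frame drawing section 31

    class NightVisionECU:
        """Hypothetical container mirroring the night vision ECU 1."""
        def __init__(self, camera, emitter, monitor):
            self.camera = camera             # near infrared camera 2 (image capture means)
            self.emitter = emitter           # near infrared emission device 3 (irradiation means)
            self.monitor = monitor           # monitor 4 (image display means)
            self.control = ControlSection()
            self.distributor = VisualImageDistribution()
            self.delay = VisualImageDelay()
            self.detector = PedestrianDetection()
            self.drawer = DrawingSection()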

[0020] The near infrared camera 2 is provided inside the inner rearview mirror of the vehicle, for example, to capture an image of the view ahead of the vehicle. The near infrared camera 2 outputs the captured image to the A/D conversion section 21 of the night vision ECU 1 as a visual image signal. The near infrared emission device 3 is provided at left and right positions of the front bumper of the vehicle to irradiate the area to be photographed by the near infrared camera 2 ahead of the vehicle. The near infrared camera 2 captures an image of an object in the vicinity of the vehicle by receiving light emitted from the near infrared emission device 3 and reflected from the object.

[0021] The near infrared camera 2 is provided with a camera controller, while the near infrared emission device 3 is provided with an emitter controller, so that their image capture conditions and operating conditions (irradiation conditions) can be adjusted (divided) based on an instruction signal output from the control section 25. For the image capture conditions of the near infrared camera 2 and the operating conditions of the near infrared emission device 3, two types of conditions are respectively provided: those suitable for detecting an object in the vicinity of the vehicle, and those suitable for displaying a captured image. The image capture conditions of the near infrared camera 2 include, for example, the division of internal parameters and the number of shutter releases. The operating conditions of the near infrared emission device 3 include, for example, switching between a filter that passes components of a particular wavelength and a filter that cuts them, the emission angle, and whether or not near infrared light is emitted.

[0022] The monitor 4 is provided at a position in the cabin viewable to the driver to display an image output from the drawing section 12 of the night vision ECU 1. While driving the vehicle at night, the driver can check the vicinity of the vehicle to ensure safety by viewing the image of the vicinity of the vehicle in the dark displayed on the monitor 4.

[0023] The A/D conversion section 21 provided in the pedestrian detection section 11 of the night vision ECU 1 converts the analog image obtained from the visual image signal output from the near infrared camera 2 into a digital image by A/D conversion. The A/D conversion section 21 outputs the digital image obtained by the conversion to the visual image distribution processing section 22. The visual image distribution processing section 22 detects a visual image identification signal imparted to the digital image output from the A/D conversion section 21, and classifies the output digital image as one of an image for visual image display, which is the image to be displayed, and an image for pedestrian detection, which is the image for in-vicinity-of-vehicle object detection, based on the imparted visual image identification signal. The output image is classified as an image for visual image display when the imparted visual image identification signal is a visual image display signal, and as an image for pedestrian detection when the visual image identification signal is an object detection signal. Upon classification, the visual image distribution processing section 22 outputs the image for visual image display to the visual image delay processing section 23, and outputs the image for pedestrian detection to the pedestrian detection processing section 24.
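As a minimal sketch of the routing just described, assuming each digitized frame carries the imparted identification signal as a simple tag (the tag values below are hypothetical):

    def route_frame(identification_signal, frame, display_queue, detection_queue):
        # Visual image display signal -> delay path; object detection signal -> detector.
        if identification_signal == "VISUAL_IMAGE_DISPLAY":
            display_queue.append(frame)     # to the visual image delay processing section 23
        elif identification_signal == "OBJECT_DETECTION":
            detection_queue.append(frame)   # to the pedestrian detection processing section 24
        else:
            raise ValueError(f"unknown visual image identification signal: {identification_signal!r}")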

[0024] The visual image delay processing section 23 performs a delay process to delay the timing of displaying the image for visual image display output from the visual image distribution processing section 22 by a predetermined unit time t1. Here, the unit time t1 is set to be equivalent to the time for the pedestrian detection processing section 24 to detect a pedestrian from the image for pedestrian detection. The visual image delay processing section 23 outputs the display image subjected to the delay process to the pedestrian detection frame activation indicator drawing section 31 of the drawing section 12.
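A minimal sketch of such a delay process, assuming a software queue that holds each display frame for roughly the detection latency t1 (the timing value and data structure are assumptions for illustration):

    import collections
    import time

    class DelayBuffer:
        def __init__(self, t1_seconds=0.1):
            self.t1 = t1_seconds
            self.queue = collections.deque()   # entries of (release_time, frame)

        def push(self, frame):
            # Hold the display frame until the detection result for the same
            # unit time is expected to be available.
            self.queue.append((time.monotonic() + self.t1, frame))

        def pop_ready(self):
            # Return the frames whose delay has elapsed, oldest first.
            ready = []
            while self.queue and self.queue[0][0] <= time.monotonic():
                ready.append(self.queue.popleft()[1])
            return ready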

[0025] The pedestrian detection processing section 24 detects a pedestrian from the image for pedestrian detection output from the visual image distribution processing section 22. Upon detection of a pedestrian, the pedestrian detection processing section 24 outputs the position of the pedestrian in the image for pedestrian detection to the pedestrian detection frame activation indicator drawing section 31 of the drawing section 12.

[0026] For the image capture conditions of the near infrared camera 2 and the operating conditions of the near infrared emission device 3, the control section 25 provides two modes: the image recognition mode and the visual image display mode. An instruction signal on the camera parameters of the near infrared camera 2 and the operating conditions of the near infrared emission device 3 in accordance with the present mode is output to the near infrared camera 2 and the near infrared emission device 3. The camera controller of the near infrared camera 2 imparts a synchronization signal to the visual image signal to be output to the pedestrian detection section 11. The near infrared camera 2 (camera controller) constitutes the identification signal imparting means. In addition, the camera controller imparts a visual image identification signal in accordance with the present mode to the header portion immediately after the synchronization signal. The interval at which the synchronization signal is imparted is shorter than the unit time t1, and several synchronization signals are imparted in each unit time t1.
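For illustration only, a visual image signal carrying a synchronization signal followed by a header-borne identification signal might be packed as below; the byte layout is entirely hypothetical, since the patent describes the signal only functionally:

    SYNC = b"\xAA\x55"          # hypothetical synchronization signal (SS) marker
    ID_DISPLAY = b"\x01"        # visual image display signal
    ID_DETECTION = b"\x02"      # object detection signal

    def pack_signal(image_bytes: bytes, for_detection: bool) -> bytes:
        # Synchronization signal, then the identification signal in the header,
        # then the visual image data (VD).
        header = SYNC + (ID_DETECTION if for_detection else ID_DISPLAY)
        return header + image_bytes

    def unpack_signal(signal: bytes):
        assert signal[:2] == SYNC, "missing synchronization signal"
        tag = "detection" if signal[2:3] == ID_DETECTION else "display"
        return tag, signal[3:]  # (classification, visual image data)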

[0027] The pedestrian detection frame activation indicator drawing section 31 of the drawing section 12 generates a display image obtained by indicating the position of the pedestrian output from the pedestrian detection processing section 24 in the image for visual image display output from the visual image delay processing section 23. The pedestrian detection frame activation indicator drawing section 31 outputs the generated display image to the monitor 4.
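A sketch of this overlay step, using the Pillow imaging library purely for illustration (the patent does not specify how the pedestrian detection frame is drawn):

    from PIL import Image, ImageDraw

    def draw_detection_frame(display_image: Image.Image, box):
        # box = (left, top, right, bottom) position reported by the detector.
        out = display_image.copy()
        ImageDraw.Draw(out).rectangle(box, outline="yellow", width=3)
        return out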

[0028] A headlight switch (not shown) for turning on and off the headlight of the vehicle is connected to the night vision ECU 1. Turning on the headlight switch to turn on the headlight starts the night vision ECU 1.

[0029] A description will now be made of the procedure of a process performed by the night vision system in accordance with this embodiment. FIGS. 2 and 3 are flowcharts showing the procedure of a process performed by the pedestrian detection section 11 of the night vision system.

[0030] As shown in FIG. 2, first, the pedestrian detection section 11 determines whether or not the night vision system is activated with the night vision ECU 1 started (S1). If it is determined that the night vision system is not activated, camera control by the control section 25 is stopped and the main timer of the night vision ECU 1 is stopped (S2), before the process returns to step S1 to repeat the same process.

[0031] If it is determined that the night vision system is activated, it is determined whether or not the operation of the near infrared camera 2 is under the control of the control section 25 (S3). If it is determined that the operation of the near infrared camera 2 is not under the control of the control section 25, the control section 25 starts controlling the near infrared camera 2 (S4), before the process returns to step S3.

[0032] If it is determined that the operation of the near infrared camera 2 is under the control of the control section 25, it is determined whether or not the main timer of the night vision ECU 1 is in operation (S5). If it is determined that the main timer of the night vision ECU 1 is not in operation, the main timer of the night vision ECU 1 is started (S6), before the process returns to step S5.

[0033] If it is determined that the main timer of the night vision ECU 1 is in operation, it is determined whether or not the operating conditions of the near infrared emission device 3 are met (S7). If it is determined that the operating conditions of the near infrared emission device 3 are met, it is determined whether or not the near infrared emission device 3 is turned on (S8). If it is determined that the near infrared emission device 3 is not turned on, the near infrared emission device 3 is turned on (S9). If the near infrared emission device 3 is determined to be turned on, the process proceeds to step S12 (see FIG. 3) without any change. On the other hand, if it is determined that the operating conditions of the near infrared emission device 3 are not met, it is determined whether or not the near infrared emission device 3 is turned on (S10). If it is determined that the near infrared emission device 3 is turned on, the near infrared emission device 3 is turned off (S11). If the near infrared emission device 3 is determined to be not turned on, the process proceeds to step S12 (see FIG. 3) without any change.
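The setup portion of this routine (roughly steps S1 to S11) can be condensed into the sketch below; the boolean inputs stand in for the checks described above, and the loops back to earlier steps are simplified away for brevity:

    def setup_step(system_active, camera_controlled, timer_running,
                   irradiation_conditions_met, emitter_on):
        if not system_active:                 # S1: system not activated
            # S2: stop camera control and the main timer, then retry.
            return dict(camera_controlled=False, timer_running=False,
                        emitter_on=emitter_on, proceed=False)
        if not camera_controlled:             # S3
            camera_controlled = True          # S4: start camera control
        if not timer_running:                 # S5
            timer_running = True              # S6: start the main timer
        if irradiation_conditions_met:        # S7
            emitter_on = True                 # S8/S9: turn the emitter on if it is off
        else:
            emitter_on = False                # S10/S11: turn the emitter off if it is on
        return dict(camera_controlled=camera_controlled, timer_running=timer_running,
                    emitter_on=emitter_on, proceed=True)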

[0034] Subsequently, with reference to FIG. 3, it is determined whether or not the main timer of the night vision ECU 1 is indicating an integer n times the unit time t1 (S12). If it is determined that the main timer of the night vision ECU 1 is indicating the integer n times the unit time t1, the control section 25 is set to the image recognition mode, and set to the visual image retrieval state (S13). In the image retrieval state, images can be retrieved from the visual image distribution processing section 22 to the pedestrian detection processing section 24. If it is determined that the main timer of the night vision ECU 1 is not indicating the integer n times the unit time t1, the process proceeds to step S14 without any change.

[0035] After that, it is determined whether or not retrieval of visual images to the pedestrian detection section 11 has been completed (S14). If it is determined that retrieval of visual images has been completed, the control section 25 is set to the visual image display mode, and the visual image retrieval state from the near infrared camera 2 is canceled (S15). If it is determined that retrieval of visual images has not been completed, the process proceeds to step S16 without any change until retrieval of visual images has been completed.
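A simplified sketch of the mode handling around steps S12 to S15, assuming the main timer is read as an integer tick count and the retrieval status is available as a flag (both assumptions for illustration):

    def select_mode(timer_ticks, t1_ticks, retrieval_done, state):
        # S12/S13: at every integer multiple of the unit time t1, switch to the
        # image recognition mode and enter the visual image retrieval state.
        if timer_ticks % t1_ticks == 0:
            state["mode"] = "image_recognition"
            state["retrieving"] = True
        # S14/S15: once retrieval has completed, return to the visual image
        # display mode and cancel the retrieval state.
        if retrieval_done:
            state["mode"] = "visual_image_display"
            state["retrieving"] = False
        return state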

[0036] Then, it is determined whether or not the present mode (current mode) of the control section 25 coincides with the mode in the preceding routine (previous mode) (S16). If it is determined that the current mode does not coincide with the previous mode, an instruction signal in accordance with the present mode is output to the near infrared camera 2 and the near infrared emission device 3 as an instruction to change the settings of the image capture conditions of the near infrared camera 2 and the operating conditions of the near infrared emission device 3 (S17). Specifically, an instruction to change the settings of the camera parameters is given to the near infrared camera 2, and an instruction on the operation settings is given to the near infrared emission device 3.

[0037] The near infrared camera 2 outputs the captured image to the pedestrian detection section 11 as a visual image signal. Here, the camera controller of the near infrared camera 2 generates a visual image signal based on the captured image. This visual image signal includes a visual image identification signal AS in the header portion following a synchronization signal SS, as shown in FIG. 4. The header portion is followed by visual image data VD. As the visual image identification signal AS, an object detection signal is recorded when the image is captured under conditions suitable to detect an object in the vicinity of the vehicle, and a visual image display signal is recorded when the image is captured under conditions suitable to display a captured image.

[0038] Then, when an instruction signal in accordance with the present mode is output to the near infrared camera 2 and the near infrared emission device 3 in step S17, or if it is determined that the current mode coincides with the previous mode, the current mode is stored as the previous mode (S18), before the process by the pedestrian detection section 11 is terminated. After that, the A/D conversion section 21 converts the visual image signal output from the near infrared camera 2 into a digital image, and the visual image distribution processing section 22 classifies the converted digital image as one of an image for visual image display and an image for pedestrian detection (S19). The visual image distribution processing section 22 verifies whether the visual image identification signal imparted to the header portion of the visual image signal output from the near infrared camera 2 is a visual image display signal or an object detection signal. If the imparted visual image identification signal is a visual image display signal, the image is output to the visual image delay processing section 23. If the imparted visual image identification signal is an object detection signal, the image is output to the pedestrian detection processing section 24. Specifically, with reference to the example shown in FIG. 4, in the case where four synchronization signals are imparted in each unit time t1, three visual image display signals VAS and one object detection signal OAS are imparted as the visual image identification signals while the four synchronization signals are imparted.

[0039] After the lapse of the time t1 for the pedestrian detection processing section 24 to perform processing (the time equivalent to the unit time), the visual image delay processing section 23 outputs a display image obtained by performing a delay process on the image output from the visual image distribution processing section 22 to the pedestrian detection frame activation indicator drawing section 31 of the drawing section 12 (S20). The pedestrian detection processing section 24 detects a pedestrian or the like captured in the image output from the visual image distribution processing section 22 by image processing. If a pedestrian or the like is detected, a pedestrian detection frame is output to the pedestrian detection frame activation indicator drawing section 31 of the drawing section 12 to emphatically indicate the existence of the pedestrian or the like by displaying a detection frame at the corresponding position (S20). Then, the process by the pedestrian detection section 11 is terminated.
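The FIG. 4 example (three visual image display signals VAS and one object detection signal OAS imparted per unit time t1) can be sketched as below; the ordering of the frames within the unit time is an assumption made for illustration:

    def tag_schedule(frame_index: int) -> str:
        # Assumed order: three display-tagged frames, then one detection-tagged
        # frame, repeating every unit time t1 (four synchronization signals).
        return "OAS" if frame_index % 4 == 3 else "VAS"

    def distribute_unit_time(frames):
        display, detection = [], []
        for i, frame in enumerate(frames):
            (detection if tag_schedule(i) == "OAS" else display).append(frame)
        return display, detection

    # Example: 8 frames span two unit times -> 6 display frames, 2 detection frames.
    disp, det = distribute_unit_time(list(range(8)))
    assert len(disp) == 6 and len(det) == 2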

[0040] When the process by the pedestrian detection section 11 is finished, the pedestrian detection frame activation indicator drawing section 31 of the drawing section 12 generates an image by adding the pedestrian detection frame to the image output from the visual image delay processing section 23 at the position output from the pedestrian detection processing section 24, and displays the generated image on the monitor 4.

[0041] Upon activation of the night vision system in accordance with this embodiment having the above configuration, the near infrared camera 2 captures an image of the vicinity of the vehicle in darkness, and a visual image signal based on the captured visual image is output from the near infrared camera 2 to the pedestrian detection section 11 of the night vision ECU 1. A visual image identification signal has been imparted to the visual image signal output from the near infrared camera 2 to the pedestrian detection section 11 of the night vision ECU 1. The visual image distribution processing section 22 of the pedestrian detection section 11 verifies the visual image identification signal imparted to the visual image signal, and distributes the output visual image signal to one of the visual image delay processing section 23 and the pedestrian detection processing section 24.

[0042] Here, the control section 25 outputs to the near infrared camera 2 an instruction signal to divide the image capture conditions of the near infrared camera 2. The near infrared camera 2 divides the image capture conditions based on the instruction signal output from the control section 25. Of the visual image signals output from the near infrared camera 2, those containing a visual image captured under conditions suitable to detect an object in the vicinity of the vehicle have been imparted with an object detection signal, and those containing a visual image captured under conditions suitable to display a captured image have been imparted with a visual image display signal. Therefore, the visual image delay processing section 23 receives an image for visual image display captured under conditions suitable to display a captured image, while the pedestrian detection processing section 24 receives an image for pedestrian detection captured under conditions suitable to detect an object in the vicinity of the vehicle. Thus, it is possible to present an image captured in the dark in a manner easily viewable to the driver, and to precisely detect an object such as a pedestrian in the image.

[0043] In addition, since an identification signal has been imparted to the visual image signal output from the near infrared camera 2, the visual image distribution processing section 22 can classify images as one of those for pedestrian detection and those for visual image display by just verifying the identification signal. Thus, it is easy to classify images.

[0044] Further, the control section 25 outputs to the near infrared emission device 3 an instruction signal to divide the operating conditions. Therefore, the operating conditions of the near infrared emission device 3 can be set to one of those suitable to detect an object in the vicinity of the vehicle and those suitable to display a captured image. Thus, it is possible to capture images that are more suitable for in-vicinity-of-vehicle object detection and for display.

[0045] Although a preferred embodiment of the invention has been described above, the invention is not limited to the above embodiment. For example, although a visual image identification signal has been imparted to a visual image signal to classify an image as one of an image for pedestrian detection and an image to be displayed in the above embodiment, an image may instead be classified by image processing of the output visual image signal or the like. In addition, although a pedestrian is treated as the object in the vicinity of the vehicle in the above embodiment, other objects such as a traffic signal or a sign in the vicinity of the road may be treated as the object.

[0046] While the invention has been described with reference to example embodiments thereof, it should be understood that the invention is not limited to the example embodiments or constructions. To the contrary, the invention is intended to cover various modifications and equivalent arrangements. In addition, while the various elements of the example embodiments are shown in various combinations and configurations, which are exemplary, other combinations and configurations, including more, less or only a single element, are also within the spirit and scope of the invention.