

Title:
METHOD AND CONTROL UNIT FOR VEHICLE SELF-DIAGNOSIS
Document Type and Number:
WIPO Patent Application WO/2017/160201
Kind Code:
A1
Abstract:
Method (400) and control unit (310) in a vehicle (100) for vehicle diagnosis by visual inspection. The method (400) comprises comparing (401) an image (220) of at least a part of the vehicle (100) with a previously stored image (210), depicting a normal state of the vehicle (100); detecting (402) an anomaly (230) in the image (220), in comparison with the previously stored image (210); and executing (403) a measurement in order to at least reduce the impact of the detected (402) anomaly (230).

Inventors:
CLAEZON FREDRICH (SE)
LINDBERG MIKAEL (SE)
Application Number:
PCT/SE2017/050196
Publication Date:
September 21, 2017
Filing Date:
March 02, 2017
Assignee:
SCANIA CV AB (SE)
International Classes:
B60W50/02; G07C5/08; G01M17/00
Domestic Patent References:
WO2013098980A12013-07-04
Foreign References:
KR100962408B12010-06-11
US20090290757A12009-11-26
Attorney, Agent or Firm:
ELLIOT, Douglas (SE)
Claims:
PATENT CLAIMS

1. A method (400) in a vehicle (100) for vehicle diagnosis by visual inspection, comprising:

comparing (401) an image (220) of at least a part of the vehicle (100) with a previously stored image (210), depicting a normal state of the vehicle (100);

detecting (402) an anomaly (230) in the image (220), in comparison with the previously stored image (210); and

executing (403) a measurement in order to at least reduce the impact of the detected (402) anomaly (230).

2. The method (400) according to claim 1, wherein the executed measurement comprises outputting an alert for informing a person responsible for the vehicle (100) concerning the detected (402) anomaly (230).

3. The method (400) according to any of claim 1 or claim 2, wherein the anomaly (230) results from a malfunctioning physical vehicle part.

4. The method (400) according to any of claims 1-3, wherein the anomaly (230) comprises a deviation from an expected condition of the vehicle (100), under current driving conditions.

5. The method (400) according to any of claims 1-4, wherein the anomaly (230) comprises a plausibility deviation of information in a digital map, based on at least a part of the image (220), depicting vehicle surroundings, and wherein the executed (403) measurement comprises outputting an alert for informing a person responsible for the digital map concerning the detected (402) anomaly (230).

6. The method (400) according to any of claims 1-5, wherein the anomaly (230) comprises a deviation from an expected condition of another vehicle (350), under current driving conditions, and wherein the executed (403) measurement comprises outputting an alert for informing a person responsible for the other vehicle (350) concerning the detected (402) anomaly (230).

7. The method (400) according to any of claims 1-6, further comprising:

storing (404) images (220) of the detected anomaly (230) in a vehicle inspection file.

8. A control unit (310) in a vehicle (100), for vehicle self-diagnosis by visual inspection, configured to:

compare an image (220) of at least a part of the vehicle (100) with a previously stored image (210), depicting a normal state of the vehicle (100);

detect an anomaly (230) in the image (220), in comparison with the previously stored image (210); and

execute a measurement in order to at least reduce the impact of the detected anomaly (230).

9. The control unit (310) according to claim 8, wherein the executed measurement comprises outputting an alert for informing a person responsible for the vehicle (100) concerning the detected anomaly (230).

10. The control unit (310) according to any of claim 8 or claim 9, wherein the anomaly (230) results from a malfunctioning physical vehicle part.

11. The control unit (310) according to any of claims 8-10, wherein the anomaly (230) comprises a deviation from an expected condition of the vehicle (100), under current driving conditions.

12. The control unit (310) according to any of claims 8-11, wherein the anomaly (230) comprises a plausibility deviation of information in a digital map, based on at least a part of the image (220), depicting vehicle surroundings, and wherein the executed measurement comprises outputting an alert for informing a person responsible for the digital map concerning the detected anomaly (230).

13. The control unit (310) according to any of claims 8-12, wherein the anomaly (230) comprises a deviation from an expected condition of another vehicle (350), under current driving conditions, and wherein the executed measurement comprises outputting an alert for informing a person responsible for the other vehicle (350) concerning the detected anomaly (230).

14. The control unit (310) according to any of claims 8-13, further configured to store images of the detected anomaly (230) in a vehicle inspection file in a database (315).

15. A computer program comprising program code for performing a method (400) according to any of claims 1-7 when the computer program is executed in a processor in a control unit (310) according to any of claims 8-14.

16. A system (500) for vehicle self-diagnosis by visual inspection, which system (500) comprises:

a control unit (310) according to claims 8-14;

at least one sensor (110, 120, 130, 140) of the vehicle (100), for capturing an image (210, 220) of at least a part of the vehicle (100);

a data storage device (315) for storing an image (210) depicting a normal state of the vehicle (100).

Description:
METHOD AND CONTROL UNIT FOR VEHICLE SELF-DIAGNOSIS

TECHNICAL FIELD

This document relates to a method and a control unit in a vehicle. More particularly, a method and a control unit for vehicle self-diagnosis by visual inspection are described.

BACKGROUND

A regular check of the vehicle's health is important from a traffic safety point of view, but also for keeping the vehicle in an operable state. An example may be to walk around the vehicle with the engine running and the lights on, before setting off, in order to check that the lights are working, that no lamp glass is broken or cracked (which may affect the light distribution), that the rear view mirrors are intact and clean, that there are no oil spots under the vehicle, etc.

There are fortunately electrical tests in many vehicles, e.g. for detecting when a lamp is broken and presenting a visual indication to the driver. However, none of the other above-mentioned defects can be detected by such known electrical tests. Further, in case the display of the vehicle (where the visual indication is presented) is malfunctioning, the driver may not notice the emitted visual indication. The vehicle as herein discussed may comprise a means for transportation in a broad sense such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or other similar manned or unmanned means of conveyance. Due to lack of time and/or interest, many vehicle drivers may not check their vehicle on a regular basis, at least not as thoroughly as may be desired.

Further, some vehicles may be unmanned, so-called autonomous vehicles. Thus, there is no driver present to make any check at all concerning the condition of the vehicle.

In the case of a negligent driver and/or an unmanned vehicle, when the vehicle is finally taken to a workshop, it would be a great help for the vehicle mechanic and/or the vehicle manufacturer to know, for example, how long the error has existed, how it has emerged, whether there are any consequential errors, etc. The negligent driver may be of little help to the mechanic in such cases, if no regular vehicle inspections are performed.

Another problem concerns digital maps used in a navigator of the vehicle. Such maps are often based on collected data which may be obsolete due to road re-/constructions. This is dangerous for a driver trusting the navigator blindly. For an autonomous vehicle, a trustworthy digital map is crucial for successful driving; however, updating digital maps requires an extensive work effort.

It may be easier to detect various errors of another vehicle while driving. However, there is no convenient way of informing another vehicle's driver of an anomaly on his/her vehicle. Repeated honking is an option, which, however, is likely to be misinterpreted by the other driver. Looking up the vehicle registration plate in the vehicle register (if at all available), searching for the vehicle owner's cell phone number (if public) and calling him/her is an option which is not only illegal in many jurisdictions, but also rather hazardous, as it requires the driver to divert his/her attention to various internet searching and user equipment manipulation activities.

Document US20050062615 describes a system for driver assistance of a vehicle, including a self-diagnosis of camera sensors of the vehicle having overlapping fields of detection. By comparing the overlapping regions of two or more cameras, it may be determined whether one of them is defective. However, the document does not discuss how any other anomaly or defect of any vehicle part may be detected. Document US7991583 presents a method for detecting errors in a component of an Advanced Driver Assistance System (ADAS), by measuring electrical currents and detecting anomalies from an expected value. Based on the detected anomaly and a comparison with a model, the error may be localised. However, errors or anomalies besides electrical errors of the vehicle ADAS cannot be detected.

Document US20100066526 also illustrates error detection in a vehicle ADAS, similar to the previously discussed system, and unfortunately with the same disadvantages concerning detection of other anomalies of the vehicle. Document US20060170427 presents a method for cycling vehicle lamps on and off to allow direct sight inspection, by one person, of the operability of the lamp bulbs during a vehicle walkaround. Thereby the problem of moving back and forth between a position outside the vehicle, where the respective lamp can be seen, and the lamp switches at the driver's seat is avoided. However, the method does not provide any error detection at all in case the driver does not make the visual inspection him/herself.

It would thus be desired to improve error detection of a vehicle.

SUMMARY

It is therefore an object of this invention to solve at least some of the above problems and increase traffic safety by improving anomaly detection.

According to a first aspect of the invention, this objective is achieved by a method in a vehicle for vehicle diagnosis by visual inspection. The method comprises comparing an image of at least a part of the vehicle with a previously stored image, depicting a normal state of the vehicle. The method further comprises detecting an anomaly in the image, in comparison with the previously stored image. In addition, the method also comprises executing a measurement in order to at least reduce the impact of the detected anomaly.

According to a second aspect of the invention, this objective is achieved by a control unit in a vehicle. The control unit aims at providing vehicle self-diagnosis by visual inspection. The control unit is configured to compare an image of at least a part of the vehicle with a previously stored image, depicting a normal state of the vehicle. Furthermore, the control unit is configured to detect an anomaly in the image, in comparison with the previously stored image. The control unit is additionally configured to execute a measurement in order to at least reduce the impact of the detected anomaly.

Thanks to the described aspects, by capturing images with various on-board sensors and comparing them with a previously stored image covering at least partly the same part of the vehicle, or the vehicle surroundings, an anomaly such as a malfunctioning vehicle part can be detected, even when there is no driver present in the vehicle, or at least no observant driver present. Thereby, various problems that may occur on a vehicle, which normally are detected by an attentive driver, may be detected and the vehicle driver/owner may be informed. Also, various errors in digital maps may be detected, and the maps continuously updated, in some embodiments, leading to more reliable digital maps, which is crucial for automated vehicle navigation, e.g. in autonomous vehicles. Thus, increased traffic safety is achieved.

Other advantages and additional novel features will become apparent from the subsequent detailed description.

FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1 illustrates a vehicle according to an embodiment of the invention;

Figure 2A illustrates an example of a traffic scenario and an embodiment of the invention;

Figure 2B illustrates an example of an anomaly detection in an image, according to an embodiment of the invention;

Figure 3A illustrates an example of an anomaly detection in an image, according to an embodiment of the invention;

Figure 3B illustrates an example of an anomaly detection in an image, according to an embodiment of the invention;

Figure 3C illustrates an example of an anomaly detection in an image, according to an embodiment of the invention;

Figure 3D illustrates an example of an anomaly detection in an image, according to an embodiment of the invention;

Figure 3E illustrates an example of an anomaly detection in an image, according to an embodiment of the invention;

Figure 4 is a flow chart illustrating an embodiment of the method;

Figure 5 is an illustration depicting a system according to an embodiment.

DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as a method and a control unit, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.

Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.

Figure 1 illustrates a scenario with a vehicle 100. The vehicle 100 is driving on a road in a driving direction 105.

The vehicle 100 may comprise a means for transportation in broad sense such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or other similar manned or unmanned means of conveyance.

The vehicle 100 may be driver controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the vehicle 100 is subsequently described as having a driver.

The vehicle 100 comprises at least one sensor, typically a plurality of sensors. Such sensors may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.

In some embodiments, the sensors may comprise e.g. a motion detector and/or be based on a Passive Infrared (PIR) sensor, sensitive to a person's skin temperature through emitted black-body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; or emit a continuous wave of microwave radiation and detect motion through the principle of Doppler radar; or emit an ultrasonic wave and detect and analyse the reflections; or comprise a tomographic motion detection system based on detection of radio wave disturbances, to mention some possible implementations. The sensors may be of the same or different types in various embodiments, and may be situated in the vehicle 100 for various purposes other than providing vehicle self-diagnosis, such as e.g. an Advanced Driver Assistance System (ADAS).

Such sensors may be situated in the vehicle 100 and be directed out from the vehicle 100, e.g. for detecting an obstacle in front of the vehicle 100. However, a part of the image, video sequence or other information captured by the sensor will cover the vehicle 100 itself.

By using the on-board sensors to collect an image, video sequence or other sensor information concerning the vehicle 100 itself, and analysing this information, e.g. by comparing it with a stored ideal image, video sequence or other sensor information of the vehicle 100, or a part thereof, an anomaly which may result from a broken component on the vehicle 100 can be detected. This solution acts like a second pair of eyes (or the only pair of eyes in the case of an autonomous vehicle), constantly looking for anomalies which could indicate a failure on the vehicle 100.
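The comparison step can be sketched as a simple per-pixel difference against the stored reference image. This is only a minimal illustration of the idea; the function name, threshold values and image sizes below are assumptions for the example, not taken from the patent, which leaves the comparison algorithm open.

```python
import numpy as np

def detect_anomaly(reference: np.ndarray, current: np.ndarray,
                   threshold: float = 30.0, min_region: int = 50) -> bool:
    """Compare a current sensor image against the stored reference image.

    Returns True when a sufficiently large region deviates from the
    reference, which may indicate e.g. a broken or dirty vehicle part.
    """
    if reference.shape != current.shape:
        raise ValueError("images must be aligned to the same size")
    # Absolute per-pixel difference between the two grayscale frames.
    diff = np.abs(reference.astype(np.float64) - current.astype(np.float64))
    # Count pixels whose deviation exceeds the noise threshold.
    deviating = int(np.count_nonzero(diff > threshold))
    return deviating >= min_region

# Example: a uniform reference frame and a frame with a simulated "crack".
ref = np.full((64, 64), 200, dtype=np.uint8)
cur = ref.copy()
cur[20:30, 10:40] = 40           # dark streak, e.g. a crack in the glass
print(detect_anomaly(ref, ref))  # False: no change
print(detect_anomaly(ref, cur))  # True: anomaly detected
```

In practice the two frames would first need to be aligned (same sensor pose, compensated lighting) before such a direct difference is meaningful, which is why the patent also contemplates learned recognition.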

Some arbitrary examples of anomalies detected on the vehicle 100 may be: changes in light distribution, implying broken or misfitted lamps, or dirty or broken lamp glass; lack of direction indication light when using the direction indicator, implying dirty or broken indicator lamps; changes in cab roll, yaw or pitch angle, implying broken cab suspension; detection of errors in the high beam area, which may trigger adjustment of the light; detection of errors in the Adaptive Main Beam function (i.e. that the high beam is turned down when meeting another vehicle or road user); detection of unsecured cargo; and error detection for the rain sensor.
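The anomaly-to-cause pairs above can be thought of as a simple rule table mapping a detected anomaly type to a likely cause and a suggested measure. The anomaly identifiers and recommendations below are illustrative assumptions for the sketch, not terminology from the patent.

```python
# Hypothetical rule table: detected anomaly -> (likely cause, measure).
DIAGNOSIS_RULES = {
    "changed_light_distribution": ("broken/misfitted lamp or dirty lamp glass",
                                   "alert driver"),
    "missing_indicator_light":    ("dirty or broken indicator lamp",
                                   "alert driver"),
    "changed_cab_angle":          ("broken cab suspension",
                                   "alert driver"),
    "high_beam_area_error":       ("misadjusted high beam",
                                   "self-adjust light"),
    "unsecured_cargo":            ("displaced or loose cargo",
                                   "alert driver"),
}

def diagnose(anomaly: str) -> tuple:
    """Return (likely cause, suggested measure) for a detected anomaly;
    unknown anomalies are stored for later workshop inspection."""
    return DIAGNOSIS_RULES.get(anomaly,
                               ("unknown anomaly", "store image for workshop"))

print(diagnose("high_beam_area_error"))
# → ('misadjusted high beam', 'self-adjust light')
```

A production system would of course derive such diagnoses from the image analysis itself rather than a fixed table; the table only makes the mapping in the paragraph above concrete.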

Further, error detection of other vehicles may be made, e.g. of malfunctioning lamps, etc. The other vehicle is then informed via wireless communication. Also, error detection may be made of map data not consistent with the surroundings, such as e.g. a wrong number of lanes in the road.

By using existing sensors, anomalies and problems may be detected automatically and, in some cases, fixed by online adjustments. For example, if the light distribution for the Adaptive Main Beam is incorrect, the sensor-based self-diagnosis will enable the function to self-adjust, according to an embodiment.

As another example, in case the vehicle body panels, vehicle glass and/or sensor lenses are dirty, a recommendation may be made to drive to a car wash, in some embodiments.

The sensors, or a subset of the on-board sensors, may be situated in the cab, directed towards the driver, in the cargo space and/or in the passenger compartment (e.g. in case of a bus or another mass transportation vehicle). Thereby, for example, a fire in the passenger compartment, displaced cargo in the cargo space or a driver having a heart attack may be detected and appropriate measures may be taken, starting with slowing down and parking the vehicle 100.

Examples of anomalies and functions will later be discussed in more detail; however, firstly some examples of sensors of the vehicle 100 will be presented.

The vehicle 100 comprises at least one sensor, such as e.g. a forwardly directed sensor 110, in some embodiments. In the illustrated embodiment, which is merely an arbitrary example, the forwardly directed sensor 110 may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100.

Mounting the forwardly directed sensor 110 behind the windscreen has some advantages compared to externally mounted camera systems. These advantages include the possibility to use the windscreen wipers for cleaning, and to use the light from the headlights to illuminate objects in the camera's field of view. It is also protected from dirt, snow, rain and, to some extent, from damage, vandalism and/or theft. Such a sensor 110 may also be used for a variety of other tasks, such as detecting an in-front object or vehicle, detecting road signs, detecting road lane markings, estimating the distance to an in-front vehicle, etc.

The sensor 110 may be directed towards the front of the vehicle 100, in the driving direction 105. The sensor 110 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.

Further, the vehicle 100 may comprise one or two side view sensors 120. The side view sensors 120 may be situated at the left/right sides of the vehicle 100 (as regarded in the driving direction 105), arranged to detect objects at the respective side of the vehicle 100. The side view sensor 120 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.


Instead of using traditional rear view mirrors on the vehicle 100, side view sensors 120 may be utilised in combination with one or more presentational devices intended to display objects outside the driver's direct field of vision. Such a presentational device may comprise e.g. a display, a projector, a Head-Up Display, a transparent display being part of the windscreen, intelligent glasses of the driver, etc., which output an image, or a stream of images, captured by a corresponding sensor 110, 120. Typically, the sensor 120 on the left side of the vehicle 100 may be associated with a presentational device on the left side of the cabin, while the sensor on the right side of the vehicle 100 may be associated with a presentational device on the right side of the cabin, even if other combinations are possible.


The sensors 110, 120 may be turned and/or re-directed in different directions, and the devices intended to display objects outside the driver's direct field of vision may present the adjusted view of the associated sensor 110, 120.

Further, in some embodiments, a detected object around the vehicle 100 may be indicated on an overview (bird's-eye view) presentation, e.g. on a display in the cabin, or in any of the presentational devices.

Figure 2A schematically illustrates a scenario, similar to the previously discussed scenario illustrated in Figure 1, but with the vehicle 100 seen from an above perspective, wherein a plurality of sensors 110, 120, 130, 140 are depicted.

The reverse sensor 130 may be utilised for detecting objects behind the vehicle 100, and the left side sensor 140 may function and detect objects, etc., in much the same or a similar manner as has been described above, in their respective directions.

The sensors 110, 120, 130, 140 comprise, or are connected to, a control unit configured for image recognition/computer vision and object recognition.

Computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.

The image data of the sensors 110, 120, 130, 140 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner.

Computer vision may comprise e.g. scene reconstruction, event detection, video tracking, object recognition, object pose estimation, learning, indexing, motion estimation, and image restoration, just to mention some examples.

Figure 2B illustrates an example of vehicle diagnosis by visual inspection according to an embodiment. A presentational device 200 may be used for presenting a previously stored image 210, e.g. captured at a first period of time t1. The previously stored image 210 may depict an original or ideal image of a part of the vehicle 100, or the state when the owner configures the stored image 210 at a first period of time. The sensors 110, 120, 130, 140 may then, continuously or at predetermined or configurable time intervals, collect information, e.g. by capturing an image 220 at a second period of time t2.

In the illustrated scenario, an anomaly 230 is detected in the image 220, in comparison with the previously stored image 210. In this case the anomaly 230 is a crack in a glass, e.g. the windscreen or the head light glass of the vehicle 100. However, the anomaly 230 may comprise e.g. a broken or lost vehicle part; holes, scars or stains in the vehicle paint; malfunctioning vehicle instruments, etc.

In some embodiments, the anomaly 230 may be recognised in the image 220 by deep learning (sometimes also referred to as deep structured learning, hierarchical learning and/ or deep machine learning); a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using multiple processing layers with complex structures, or otherwise composed of multiple non-linear transformations. Deep learning is based on learning representations of data. An observation (e.g., the image 220) may be represented in many ways such as a vector of intensity values per pixel, or in a more abstract way as a set of edges, regions of particular shape, etc.

Deep learning typically uses a cascade of many layers of nonlinear processing units for feature extraction and transformation. Each successive layer uses the output from the previous layer as input. The algorithms may be supervised or unsupervised, and applications may comprise pattern analysis (unsupervised) and classification (supervised). Further, deep learning may be based on the (unsupervised) learning of multiple levels of features or representations of the data. Higher-level features may be derived from lower-level features to form a hierarchical representation. By deep learning, multiple levels of representations that correspond to different levels of abstraction are learned; the levels form a hierarchy of concepts. The composition of a layer of nonlinear processing units used in a deep learning algorithm depends on the problem to be solved, i.e. recognising the anomaly 230.
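The cascade of nonlinear processing layers described above can be sketched as a minimal forward pass, where each layer transforms the previous layer's output. The layer sizes, random weights and the two-class output are illustrative assumptions; a real anomaly recogniser would use trained weights and a far larger architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x: np.ndarray) -> np.ndarray:
    """Nonlinear processing unit: rectified linear activation."""
    return np.maximum(0.0, x)

# Hypothetical layer sizes: a flattened 8x8 image patch passes through
# three successive layers, yielding two scores ("normal" vs "anomaly").
sizes = [64, 32, 16, 2]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(sizes[:-1], sizes[1:])]

def forward(patch: np.ndarray) -> np.ndarray:
    """Cascade of layers: each successive layer uses the output of the
    previous layer as its input (linear transform + nonlinearity)."""
    h = patch.reshape(-1)      # lowest level: raw pixel intensities
    for w in weights:
        h = relu(h @ w)        # each layer forms a higher-level representation
    return h

scores = forward(rng.random((8, 8)))
print(scores.shape)            # two class scores
```

This mirrors the hierarchy described in the text: pixel intensities at the bottom, progressively more abstract features at each layer, and a final layer whose composition depends on the recognition problem.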

In the illustrated embodiment, the presentational device 200 is a mobile device of the vehicle driver, the vehicle owner or another person responsible for the vehicle, who may not necessarily be situated in the vehicle 100.

Figure 3A illustrates an example of a vehicle interior of the vehicle 100 and depicts how the previously discussed scenario in Figure 2B may be perceived by the driver of the vehicle 100, or a passenger, as the case may be when the vehicle 100 is autonomous.

The vehicle 100 comprises a control unit 310 for vehicle diagnosis by visual inspection, by collecting information from the sensors 110, 120, 130, 140 of the vehicle 100 and comparing a later captured image 220 with a previously stored image 210. The previously stored image 210 may be stored in a data storage device/database 315. The control unit 310 may communicate with the other vehicle-internal units, such as the sensors 110, 120, 130, 140, via e.g. a communication bus. The communication bus may comprise e.g. a Controller Area Network (CAN) bus, a Media Oriented Systems Transport (MOST) bus, or similar. However, the datalink may alternatively be made over a wireless connection comprising, or at least being inspired by, any wireless communication technology such as e.g. Wi-Fi, Bluetooth, etc.

In the illustrated embodiment, a message is displayed on the presentational device 200, for making the driver attentive to the observed and detected anomaly 230. In some embodiments, a recommendation may also be presented for the driver/owner concerning what to do. In the case of a detected crack in the windscreen, the driver/owner may be advised to cover the crack from the outside with a piece of transparent tape and then drive to a glass repair workshop. Via a wireless internet connection, a search may be made for such workshops close to the geographical position of the vehicle 100, and/or in the driving direction 105 of the vehicle 100, and a recommendation may be made based on e.g. a price comparison, an instant service availability check and/or the satisfaction of previous customers, if such information is available.

Figure 3B illustrates yet another example of a vehicle interior of the vehicle 100 and depicts how a scenario with a broken head lamp glass may be perceived from inside an autonomous vehicle 100, and by a vehicle owner (or other person being responsible for the vehicle 100) via a presentational device 200 at a distance from the vehicle 100. The vehicle 100, besides the already presented control unit 310, data storage device 315 and sensors 110, 120, 130, 140, comprises a wireless transmitter or transceiver 320. The transmitter 320 may communicate wirelessly with the presentational device 200 of the vehicle owner. Communication may be made over a wireless communication interface, such as e.g. Vehicle-to-Vehicle (V2V) communication, or Vehicle-to-Structure (V2X) communication.

In some embodiments, the communication between the transmitter 320 and the presentational device 200 may be performed via V2V communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices. DSRC works in the 5.9 GHz band, with a bandwidth of 75 MHz and an approximate range of 1000 m, in some embodiments. The wireless communication may be made according to any IEEE standard for wireless vehicular communication, like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the IEEE 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.

Such wireless communication interface may comprise, or at least be inspired by wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), optical communication such as Infrared Data Association (IrDA) or infrared transmission to name but a few possible examples of wireless communications in some embodiments.

The communication may alternatively be made over a wireless interface comprising, or at least being inspired by, radio access technologies such as e.g. 3GPP LTE, LTE-Advanced, E-UTRAN, UMTS, GSM, GSM/EDGE, WCDMA, Time Division Multiple Access (TDMA) networks, Frequency Division Multiple Access (FDMA) networks, Orthogonal FDMA (OFDMA) networks, Single-Carrier FDMA (SC-FDMA) networks, Worldwide Interoperability for Microwave Access (WiMax), Ultra Mobile Broadband (UMB), High Speed Packet Access (HSPA), Evolved Universal Terrestrial Radio Access (E-UTRA), Universal Terrestrial Radio Access (UTRA), GSM EDGE Radio Access Network (GERAN), 3GPP2 CDMA technologies, e.g. CDMA2000 1x RTT and High Rate Packet Data (HRPD), or similar, just to mention some few options, via a wireless communication network.

In the illustrated embodiment, the image comparison has resulted in detecting that the vehicle 100 has a broken head lamp glass on the left side. The detected anomaly 230 thus results from a malfunctioning physical vehicle part. As driving with malfunctioning head lights is illegal (in at least some jurisdictions), the vehicle 100 has been parked at the road side and the geographical position of the vehicle 100 is sent to the presentational device 200.

Thereby, the owner/corresponding responsible person is informed about the situation and may take appropriate measures, such as bringing correct spare parts and tools, informing a transportation receiver (if any) about the delay, and driving to the vehicle 100 and repairing it. The geographical position of the vehicle 100 may be determined by the positioning unit 330 in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like. The determination of the geographical position of the positioning unit 330 (and thereby also of the vehicle 100) may be made continuously, at certain predetermined or configurable time intervals, according to various embodiments.


Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 340-1, 340-2, 340-3, 340-4. In this example, four satellites 340-1, 340-2, 340-3, 340-4 are depicted, but this is merely an example. More than four satellites 340-1, 340-2, 340-3, 340-4 may be used for enhancing the precision, or for creating redundancy. The satellites 340-1, 340-2, 340-3, 340-4 continuously transmit information about time and date (for example, in coded form), identity (which satellite 340-1, 340-2, 340-3, 340-4 broadcasts), status, and where the satellites 340-1, 340-2, 340-3, 340-4 are situated at any given time. The GPS satellites 340-1, 340-2, 340-3, 340-4 send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA). This allows information from an individual satellite 340-1, 340-2, 340-3, 340-4 to be distinguished from the others' information, based on a unique code for each respective satellite 340-1, 340-2, 340-3, 340-4. This information can then be transmitted to be received by the appropriately adapted positioning device comprised in the vehicle 100.

Distance measurement can, according to some embodiments, comprise measuring the difference in the time it takes for each respective satellite signal, transmitted by the respective satellites 340-1, 340-2, 340-3, 340-4, to reach the positioning unit 330. As the radio signals travel at the speed of light, the distance to the respective satellite 340-1, 340-2, 340-3, 340-4 may be computed by measuring the signal propagation time.
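The propagation-time computation described above can be sketched as follows. This is an illustrative example only; the constant and function names are hypothetical helpers, not part of the patent:

```python
# Sketch: distance to a satellite obtained from the measured signal
# propagation time. As noted above, the radio signals travel at the
# speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_to_satellite_m(propagation_time_s):
    """Return the satellite distance in metres for a given propagation time."""
    return SPEED_OF_LIGHT * propagation_time_s
```

For instance, a propagation time of about 70 ms corresponds to roughly 21 000 km, which is the order of magnitude of a GPS satellite's distance from a receiver on the ground.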


The positions of the satellites 340-1, 340-2, 340-3, 340-4 are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 340-1, 340-2, 340-3, 340-4 through triangulation. For determination of altitude, signals from four satellites 340-1, 340-2, 340-3, 340-4 may be used according to some embodiments.
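As a simplified illustration of the triangulation step, a position can be obtained from three known satellite positions and the measured distances by linearising the circle equations. This is a two-dimensional sketch only; real satellite positioning works in three dimensions and must also solve for the receiver clock error, which is not detailed here:

```python
# Sketch (hypothetical helper, not prescribed by the patent): estimate a
# 2-D position from distances to three reference points. Subtracting the
# circle equations pairwise yields a linear system in (x, y).
def triangulate_2d(sats, dists):
    """sats: three (x, y) positions; dists: measured distances to each."""
    (x1, y1), (x2, y2), (x3, y3) = sats
    d1, d2, d3 = dists
    # Linearised equations: 2*(xi - x1)*x + 2*(yi - y1)*y = bi  (i = 2, 3)
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # assumes the satellites are not collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

Using more than the minimum number of reference points, as the text notes, allows the corresponding over-determined system to be solved in a least-squares sense, improving precision.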

The geographical position of the vehicle 100 may alternatively be determined, e.g. by having transponders positioned at known positions around the route of the vehicle 100 and a dedicated sensor in the vehicle 100, for recognising the transponders and thereby determining the position; by detecting and recognising WiFi networks (WiFi networks along the route may be mapped with certain respective geographical positions in a database); by receiving a Bluetooth beaconing signal associated with a geographical position; or by other signal signatures of wireless signals, such as e.g. by triangulation of signals emitted by a plurality of fixed base stations with known geographical positions. The position may alternatively be entered by a passenger in the vehicle 100.


Having been determined by the positioning unit 330 (or in another way), the geographical position may be presented on the presentational device 200, e.g. on a map where the position of the vehicle 100 may be marked, in some embodiments.

Figure 3C illustrates yet another example of a vehicle interior of the vehicle 100 and depicts a scenario wherein the anomaly 230 comprises a deviation from an expected condition of the vehicle 100, under current driving conditions.

In this example, it is presumed that the vehicle 100 comprises a rain sensor which activates the wipers when it is raining. Rain is detected in the image 220, but the wipers are not active, as may be detected by the forwardly directed sensor 110. The reason may be that the rain sensor is defective, or alternatively that the wiper motor is not working properly.

Such a discovered anomaly 230 may be presented to the driver on the presentational device 200, possibly together with information concerning which measures to take.

Other examples of anomalies 230 comprising a deviation from an expected condition of the vehicle 100, under current driving conditions, may be that a door of the vehicle 100 is open while driving; a crane is in an upright position while driving; a pair of glasses (or another arbitrary object) has been left on the vehicle roof when the drive is about to commence; a piece of the driver's clothes has been jammed in the door; a piece of a passenger's clothes has been jammed in the door when exiting (e.g. in a mass transportation vehicle); the vehicle doors are unlocked when driving into a harsh suburban neighbourhood, frequently visited by carjackers; a person under 18 years old (or a non-authorised person) is trying to start the vehicle 100, etc.

Figure 3D illustrates another example of a vehicle interior of the vehicle 100 and depicts a scenario wherein the anomaly 230 comprises a plausibility deviation of information in a digital map, based on at least a part of the image 220, depicting vehicle surroundings.


According to some embodiments, the image 220 may be compared with a map state. Such a comparison may result in a detection of a different number of lanes in reality than according to the map data, due to a recent roadwork etc., or simply an error in the stored map data. This is merely an arbitrary example of such a possible deviation between the reality as captured by the vehicle sensors 110, 120, 130, 140 and the stored map data. Other examples may be a new road/entrance/exit; a new speed limit on an existing road; that a road has been made unidirectional, etc. The sensors 110, 120, 130, 140 may capture and collect information from the road, traffic signs etc. continuously while driving, and a comparison may be made with map data and information associated with the geographical position of the vehicle 100, such as speed limits and other restrictions.
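The plausibility comparison described above might, as a rough sketch, look like the following. The attribute names and data model are invented for illustration; the patent does not specify any particular representation of map data:

```python
# Sketch: compare sensor-derived observations for the current road segment
# with the stored map data, and report any contradicting attributes.
def find_map_deviations(observed, map_data):
    """observed / map_data: dicts such as {'lanes': 3, 'speed_limit': 70}.
    Returns {attribute: (map_value, observed_value)} for each mismatch."""
    return {key: (map_data.get(key), value)
            for key, value in observed.items()
            if map_data.get(key) != value}
```

For the lane example above, comparing an observation of three lanes against map data holding two lanes would report a single deviation on the `lanes` attribute, which could then trigger the map-maintenance alert discussed below.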

In case an anomaly 230 between the reality and the stored map data is discovered, an alert may be outputted for informing a person responsible of the digital map concerning the detected anomaly 230. Such alert may also be provided to the driver, if any, on the presentational device 200.

By receiving information continuously from a plurality of vehicles 100 in various traffic situations, map data may be updated in a convenient manner. For driver assistance systems and (to an even larger extent) autonomous vehicles, it is important that map data can be trusted for navigation, as there may not be any driver present to notice and react to deviations from the map data. In some embodiments, the on-board sensors 110, 120, 130, 140 may detect indications of an accident, or a hazardous situation on, or in the vicinity of, the road, such as e.g. a stationary vehicle on a highway; a reversing vehicle on a highway; a vehicle driving against the driving direction on a road; a vehicle driving in a pedestrian zone or bicycle path; people or animals lying on the road, etc. In such a case, besides slowing down and/or stopping the own vehicle 100, information concerning the detected accident indication may be sent to a police department, a traffic surveillance centre, an emergency centre or a similar entity.

Figure 3E illustrates another example of a vehicle interior of the vehicle 100, comprising all, or at least some, of the previously discussed components. However, in the illustrated example, the anomaly 230 comprises a deviation from an expected condition of another vehicle 350, under current driving conditions.

Such an anomaly 230 may comprise any of the previously mentioned possible anomalies 230 for the own vehicle 100, such as e.g. that the left rear position lamp 370 is not working. In such a case, an alert may be outputted for informing a person responsible of the other vehicle 350 concerning the detected anomaly 230, e.g. via a wireless signal transmitted by the transmitter 320, according to any of the previously mentioned wireless communication technologies. The other vehicle 350 may comprise a receiver 360 for receiving such transmitted information.

Thereby, the other driver/owner of the other vehicle 350 may be informed about anomalies 230 of his/her vehicle 350 which otherwise may be difficult for the driver (if any) of the vehicle 350 to notice, such as various flaws or defects, i.e. anomalies 230 on the back side of the other vehicle 350, for example.

Thus, various potentially dangerous traffic situations due to errors or anomalies 230 of the other vehicle 350 may be detected and rectified, leading to a safer traffic environment.

Figure 4 illustrates an example of a method 400 according to an embodiment. The flow chart in Figure 4 shows the method 400 for use in a vehicle 100. The method 400 aims at providing vehicle diagnosis by visual inspection.


The vehicle 100 may be e.g. a truck, a bus, a car, or similar means of conveyance as previously mentioned, autonomous or non-autonomous.

The vehicle 100 may comprise a plurality of sensors 110, 120, 130, 140, pointable in various different directions around the vehicle 100, and each having a respective surveillance area which at least partly covers a part of the own vehicle 100.

In order to correctly be able to make a visual inspection via the sensors 110, 120, 130, 140, for providing vehicle diagnosis, the method 400 may comprise a number of steps 401-404. However, some of these steps 401-404 may be performed in various alternative manners. Some method steps may only be performed in some optional embodiments, such as e.g. steps 403-404. Further, the described steps 401-404 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:


Step 401 comprises comparing an image 220 of at least a part of the vehicle 100 with a previously stored image 210, depicting a normal state of the vehicle 100.

Step 402 comprises detecting an anomaly 230 in the image 220, in comparison with the previously stored image 210. The anomaly 230 may result from a malfunctioning physical vehicle part in some embodiments. Further, in some embodiments, the anomaly 230 may comprise a deviation from an expected condition of the vehicle 100, under current driving conditions. Furthermore, the anomaly 230 may comprise a plausibility deviation of information in a digital map, based on at least a part of the image 220 depicting vehicle surroundings, in some embodiments.

According to some embodiments, the anomaly 230 may comprise a deviation from an expected condition of another vehicle 350, under current driving conditions.

The anomaly 230 may be detected, based on image recognition.
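Steps 401-402 could, purely as an illustrative sketch, be realised as a naive per-pixel comparison of the current image with the stored reference image. The patent mentions image recognition but prescribes no particular algorithm, and the thresholds below are arbitrary:

```python
# Sketch: flag an anomaly when too large a fraction of pixels differs
# between the stored reference image and the current image.
def detect_anomaly(reference, current, pixel_tol=10, max_changed_ratio=0.01):
    """reference / current: equally sized lists of rows of grayscale values.
    Returns True if more than max_changed_ratio of the pixels deviate by
    more than pixel_tol from the reference."""
    total = changed = 0
    for ref_row, cur_row in zip(reference, current):
        for ref_px, cur_px in zip(ref_row, cur_row):
            total += 1
            if abs(ref_px - cur_px) > pixel_tol:
                changed += 1
    return changed / total > max_changed_ratio
```

A production system would of course need to compensate for lighting, viewpoint and dirt before such a comparison is meaningful; the sketch only illustrates the compare-then-threshold structure of steps 401-402.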

Step 403 comprises executing a measurement in order to eliminate or at least reduce the impact of the detected 402 anomaly 230.

The executed measurement comprises outputting an alert for informing a person responsible of the vehicle 100 concerning the detected 402 anomaly 230. The executed measurement may in some embodiments comprise outputting an alert for informing a person responsible of the digital map concerning the detected 402 anomaly 230.

Further, the executed measurement may comprise outputting an alert for informing a person responsible of the other vehicle 350 concerning the detected 402 anomaly 230, according to some embodiments.

Step 404, which may be performed only in some particular embodiments, comprises storing images 220 of the detected anomaly 230 in a vehicle inspection file. The vehicle inspection file may be situated in the data storage device 315, or at any other convenient location, in different embodiments. Thus a vehicle inspection file is provided, noting various errors and/or defects of the vehicle 100.

Thereby, information may be saved and provided to a mechanic, who, thanks to the vehicle inspection file, may understand what has occurred with the vehicle 100 and what problems have occurred.
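Step 404 could be sketched as follows. The JSON-lines layout and the field names are assumptions made for illustration; the patent leaves the format of the vehicle inspection file open:

```python
import json
import time

# Sketch: append one detected-anomaly record to a vehicle inspection file,
# one JSON object per line, so a mechanic's tool can replay the history.
def log_anomaly(path, description, image_ref):
    record = {"time": time.time(), "anomaly": description, "image": image_ref}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```

An append-only, line-oriented file keeps each write cheap on an embedded control unit and survives partial writes better than rewriting a single structured document.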

Figure 5 illustrates an embodiment of a system 500 in a vehicle 100 for vehicle self-diagnosis by visual inspection. The system 500 may perform at least some of the previously described steps 401-404 according to the method 400 described above and illustrated in Figure 4.

The system 500 comprises at least one control unit 310 in the vehicle 100, for vehicle self-diagnosis by visual inspection. The control unit 310 is configured to compare an image 220 of at least a part of the vehicle 100 with a previously stored image 210, depicting a normal state of the vehicle 100. Further, the control unit 310 is configured to detect an anomaly 230 in the image 220, in comparison with the previously stored image 210. Also, the control unit 310 is configured to execute a measurement in order to at least reduce the impact of the detected anomaly 230.

Further, the control unit 310 may be configured to store images of the detected anomaly 230 in a vehicle inspection file in a data storage device 315.

Further, in some embodiments, the control unit 310 may be configured to execute the measurement by outputting an alert for informing a person responsible of the vehicle 100 concerning the detected anomaly 230.

The anomaly 230 may result from a malfunctioning physical vehicle part in some embodiments. Further, according to some embodiments, the anomaly 230 may comprise a deviation from an expected condition of the vehicle 100, under current driving conditions. In some particular embodiments, the anomaly 230 may comprise a plausibility deviation of information in a digital map, based on at least a part of the image 220 depicting vehicle surroundings, wherein the executed measurement comprises outputting an alert for informing a person responsible of the digital map concerning the detected anomaly 230. In addition, according to some embodiments, the anomaly 230 may comprise a deviation from an expected condition of another vehicle 350, under current driving conditions, wherein the executed measurement comprises outputting an alert for informing a person responsible of the other vehicle 350 concerning the detected anomaly 230.


Further, the system 500 also comprises at least one sensor 110, 120, 130, 140 of the vehicle 100, for capturing an image 210, 220 of at least a part of the vehicle 100.

The system 500 in addition comprises a data storage device 315 for storing an image 210 depicting a normal state of the vehicle 100.

The control unit 310 comprises a receiving circuit 510 configured for receiving a signal from the sensors 110, 120, 130, 140, and/or from the data storage device 315. Further, the control unit 310 comprises a processor 520 configured for performing at least some steps 401-404 of the above described method 400, according to some embodiments. Such a processor 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processor" may thus represent processing circuitry comprising a plurality of processing circuits, such as e.g. any, some or all of the ones enumerated above.

Furthermore, the control unit 310 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e. sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data, such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc., in different embodiments.

Further, the control unit 310 may comprise a signal transmitter 530 in some embodiments. The signal transmitter 530 may be configured for transmitting a signal to e.g. the presentational device 200 and/or the transmitter 320. The above described steps 401-404 to be performed in the vehicle 100 may be implemented through the one or more processors 520 within the control unit 310, together with a computer program product for performing at least some of the functions of the steps 401-404. Thus a computer program product comprising instructions for performing the steps 401-404 in the control unit 310 may perform the method 400, comprising at least some of the steps 401-404 for vehicle diagnosis by visual inspection, when the computer program is loaded into the one or more processors 520 of the control unit 310.

Further, some embodiments of the invention may comprise a vehicle 100, comprising the control unit 310, for vehicle diagnosis by visual inspection, according to at least some of the steps 401-404.

The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-404 according to some embodiments, when being loaded into the one or more processors 520 of the control unit 310. The data carrier may be e.g. a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium, such as a disk or tape, that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 310 remotely, e.g. over an Internet or an intranet connection.

The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 400, the control unit 310, the computer program, the system 500 and/or the vehicle 100. Various changes, substitutions and/or alterations may be made without departing from invention embodiments as defined by the appended claims. As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein is to be interpreted as a mathematical OR, i.e. as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or another wired or wireless communication system.