


Title:
METHOD AND CONTROL ARRANGEMENT FOR VISUALISATION OF OBSTRUCTED VIEW
Document Type and Number:
WIPO Patent Application WO/2020/111999
Kind Code:
A1
Abstract:
A method (400) in a control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c). The method (400) comprises collecting (401) environmental data with at least one sensor (130); identifying (402) an object (200), which is considered relevant; extracting (403) data related to the object (200) from the environmental data; converting (405) the data into information (210); determining (406) position of the object (200) based on the collected (401) environmental data; and providing (407) the information (210) and the determined (406) position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c). Also, a method (600) in a control arrangement (230) of an information receiving unit (100b, 100c), is provided.

Inventors:
LIMA PEDRO (SE)
CIRILLO MARCELLO (SE)
MÁRKUS ATTILA (SE)
PETTERSSON HENRIK (SE)
HEDSTRÖM LARS-GUNNAR (SE)
Application Number:
PCT/SE2019/051145
Publication Date:
June 04, 2020
Filing Date:
November 12, 2019
Assignee:
SCANIA CV AB (SE)
International Classes:
G06F3/14; G06T7/00; G06V10/10; G06V20/58; G08G1/00; G08G1/16; G05D1/02
Foreign References:
DE102015105784A1, 2016-10-20
US20180253899A1, 2018-09-06
GB2545571A, 2017-06-21
US20180105176A1, 2018-04-19
GB2562018A, 2018-11-07
GB2524385A, 2015-09-23
DE102013220312A1, 2015-04-09
Attorney, Agent or Firm:
YOUSSEF, Maikel (SE)
Claims:
PATENT CLAIMS

1. A method (400) in a control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c), wherein the method (400) comprises the steps of:

collecting (401) environmental data with at least one sensor (130);

identifying (402) an object (200) in the environment of the information transmitting unit (100a), which is considered relevant for the information receiving unit (100b, 100c);

extracting (403) data related to the identified (402) object (200) from the collected (401) environmental data;

converting (405) the extracted (403) data into information (210);

determining (406) position of the object (200) based on the collected (401) environmental data; and

providing (407) the converted (405) information (210) and the determined (406) position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).

2. The method (400) according to claim 1, wherein the conversion (405) of the extracted (403) data into information (210) comprises selecting a prestored representation (330) of the object (200); and wherein

the provided (407) information (210) comprises the selected prestored representation (330).

3. The method (400) according to any one of claim 1 or claim 2, wherein the conversion (405) of the extracted (403) data into information (210) comprises:

selecting a prestored representation (330) of the identified (402) object (200) in a table (320a, 320b) stored in both a memory (300) of the information transmitting unit (100a) and a memory (310) of the information receiving unit (100b, 100c); and

determining a reference to the selected prestored representation (330) in the table (320a, 320b); and wherein the provided (407) information (210) comprises the determined reference.

4. The method (400) according to claim 3, further comprising the step of:

coordinating (404) the tables (320a, 320b) comprising the prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the converted (405) information (210) is provided (407).

5. The method (400) according to any one of claims 1-4, wherein the provided (407) information (210) comprises data in object form.

6. A control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c), wherein the control arrangement (220) is configured to:

collect environmental data with at least one sensor (130);

identify an object (200) in the environment of the information transmitting unit (100a), which is considered relevant for the information receiving unit (100b, 100c);

extract data related to the identified object (200) from the collected environmental data;

convert the extracted data into information (210); and

determine position of the object (200) based on the collected environmental data; and

provide the converted information (210) and the determined position of the object

(200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).

7. The control arrangement (220) according to claim 6, further configured to:

convert the extracted data into information (210) by selecting a prestored representation (330) of the object (200); and to

provide, via the wireless transmitter (140a), information comprising the selected prestored representation (330).

8. The control arrangement (220) according to any one of claim 6 or claim 7, further configured to convert the extracted data into information (210) by

selecting a prestored representation (330) of the identified object (200) in a table (320a, 320b) stored in both a memory (300) of the information transmitting unit (100a) and a memory (310) of the information receiving unit (100b, 100c); and

determining a reference to the selected prestored representation (330) in the table (320a, 320b); and wherein the provided information (210) comprises the determined reference.

9. The control arrangement (220) according to claim 8, further configured to:

coordinate the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the converted information (210) is provided.

10. The control arrangement (220) according to any one of claims 6-9, further configured to provide information (210) comprising data in object form.

11. A computer program comprising program code for performing a method (400) according to any of claims 1-5, when the computer program is executed in a control arrangement (220) according to any one of claims 6-10.

12. A method (600) in a control arrangement (230) of an information receiving unit (100b, 100c), for outputting a representation (330) of an object (200) detected by at least one sensor (130) of an information transmitting unit (100a), based on information (210) obtained from the information transmitting unit (100a), wherein the method (600) comprises the steps of:

receiving (601) information (210) concerning the object (200) and position of the object (200) from the information transmitting unit (100a) via a wireless receiver (140b);

converting (603) the received (601) information (210) concerning the object (200) into a representation (330) of the object (200); and

outputting (604) the representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).

13. The method (600) according to claim 12, wherein the conversion (603) of the received (601) information (210) into the representation (330) of the object (200) comprises selecting the representation (330) of the object (200) based on the received (601) information (210).

14. The method (600) according to any one of claim 12 or claim 13, wherein the conversion (603) comprises:

extracting a reference to a prestored representation (330) in a table (320a, 320b) stored in both a memory (310) of the information receiving unit (100b, 100c) and a memory (300) of the information transmitting unit (100a), from the received (601) information (210); and

selecting the prestored representation (330) of the object (200) in the table (320a, 320b) stored in the memory (310) of the information receiving unit (100b, 100c), based on the extracted reference.

15. The method (600) according to claim 14, further comprising the step of:

coordinating (602) the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the information (210) concerning the object (200) is received (601).

16. The method (600) according to any one of claims 12-15, wherein the representation (330) of the object (200) in the table (320b) is configurable by a user of the information receiving unit (100b, 100c).

17. A control arrangement (230) of an information receiving unit (100b, 100c), for outputting a representation (330) of an object (200) detected by at least one sensor (130) of an information transmitting unit (100a), based on information (210) obtained from the information transmitting unit (100a), wherein the control arrangement (230) is configured to:

receive information (210) concerning the object (200) and position of the object (200) from the information transmitting unit (100a) via a wireless receiver (140b);

convert the received information (210) concerning the object (200) into a representation (330) of the object (200); and

output the representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).

18. The control arrangement (230) according to claim 17, further configured to convert the received information (210) into the representation (330) of the object (200) by selecting the representation (330) of the object (200) based on the received information (210).

19. The control arrangement (230) according to any one of claim 17 or claim 18, further configured to convert the received information (210) into the representation (330) of the object (200) by

extracting a reference to a prestored representation (330) in a table (320a, 320b) stored in both a memory (310) of the information receiving unit (100b, 100c) and a memory (300) of the information transmitting unit (100a), from the received information (210); and

selecting the prestored representation (330) of the object (200) in the table (320a, 320b) stored in the memory (310) of the information receiving unit (100b, 100c), based on the extracted reference.

20. The control arrangement (230) according to claim 19, further configured to:

coordinate the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the information (210) concerning the object (200) is received.

21. The control arrangement (230) according to any one of claims 17-20, further configured to enable a user of the information receiving unit (100b, 100c) to configure the representation (330) of the object (200) in the table (320b).

22. A computer program comprising program code for performing a method (600) according to any of claims 12-16, when the computer program is executed in a control arrangement (230) according to any one of claims 17-21.

23. A vehicle (100a, 100b, 100c) comprising a control arrangement (220) according to any one of claims 6-10, or a control arrangement (230) according to any one of claims 17-21.

Description:
METHOD AND CONTROL ARRANGEMENT FOR VISUALISATION OF OBSTRUCTED VIEW

TECHNICAL FIELD

This document discloses a control arrangement of an information transmitting unit and a method therein, and a control arrangement of an information receiving unit and a method therein. More particularly, methods and control arrangements are described, for providing information detected by the information transmitting unit, to the information receiving unit.

BACKGROUND

Grouping vehicles into platoons is an emerging technology, leading to reduced fuel consumption and increased capacity of the roads. A number of vehicles, e.g. 2-25 or more, may be organised in a platoon or vehicle convoy, wherein the vehicles are driving in coordination after each other with only a small distance between the vehicles, such as some decimetres or some meters, such as e.g. 20 meters, or at a distance that is dependent on the speed of the platoon (e.g., the vehicles may be about 2 or 3 seconds apart during transportation). Thereby air resistance is reduced, which is important for reducing energy consumption, in particular for trucks, buses and goods vehicles or other vehicles having a large frontal area. In principle it may be said that the shorter the distance is between the vehicles, the lower the air resistance becomes, which reduces energy consumption for the vehicle platoon.


The distance between the vehicles in the platoon may be reduced as the vehicles are enabled to communicate wirelessly with each other and thereby coordinate their velocity by e.g. accelerating or braking simultaneously. Thereby the reaction distance needed for human reaction during normal driving is eliminated.


Platooning brings a multitude of advantages, such as improved fuel economy due to reduced air resistance, and also reduced traffic congestion leading to increased capacity of the roads and enhanced traffic flow. Also, platoons can readily exploit advancements in automation, for example by letting only the lead vehicle be human-driven, while the others follow autonomously. This would enable a reduction in the number of drivers (that is, one or two per platoon), or prolonged continuous driving, as the drivers in all but the first truck can rest.

However, the short distance between the vehicles in the platoon obstructs the field of view of the following vehicles. This problem also emerges for any vehicle in a queue, standing/ driving behind another vehicle, in particular a large vehicle such as a truck, a bus, etc. Only the driver of the leading vehicle has an open field of view and can see what is ahead of the vehicle platoon/ vehicle queue. The following vehicles drive very close to each other and, therefore, drivers have their field of view obstructed. It has been proposed that the leading vehicle broadcasts to the following vehicles a video stream containing what is in front of the platoon leader. However, this can be quite complicated to do in real time, and a delayed video is useless, or even dangerous, as it may provide a false sense of security to the recipient of the video stream.

Another emerging technology is remotely controlled vehicles, for example in mining, construction, forestry, military applications, rescuing operations, extraterrestrial explorations, etc. The driver could then be safely situated in a control room, protected from a possibly hostile environment of the vehicle. For the driver to become aware of the current situation of the vehicle, a video image or other sensor data captured by a sensor of the vehicle may be provided to the driver in the control room.

However, providing environmental data e.g. by streaming video data leads to a substantial time delay, which may become crucial for the traffic/ operational safety of the vehicle. In case the remotely situated driver receives the images with a time delay, the driving commands of the driver may be reactions to an obsolete situation, and / or arrive too late to the vehicle, which may compromise vehicle safety and / or cause an accident.

Vehicles are sometimes autonomous. Thereby the driver is omitted, superseded by an onboard control logic enabling the vehicle to drive and manage various appearing traffic situations, based on sensor data captured by sensors on the vehicle. However, various undefined, non-predicted situations may occur, which cannot be handled by the onboard control logic alone. A human operator in a remote monitoring room may then be alerted and sensor data documenting the appeared situation may be transmitted to the operator. However, as already mentioned, streaming video data may cause an unfortunate time delay. Again, the driving commands of the operator may be reactions to an obsolete situation, and / or arrive too late to the vehicle, which may compromise vehicle safety and / or cause an accident.

Document US2017158133 discloses a vehicle vision system with compressed video transfer. The vehicle utilises one or more cameras to capture image data and transmits compressed video images to another vehicle. This system can be used in a platooning group of vehicles. The compressed images captured by a forward viewing camera of the lead vehicle are communicated to following vehicles. The compressed video images may be processed by a machine vision processor.

This solution reduces the transfer time of the video stream by compressing the information to be transferred. However, the process of compression/ decompression takes time. Also, even a very small delay may be hazardous and cause an accident.

Document CN102821282 discloses video communication in a vehicular network. The video captured by each vehicle is shared among the vehicles of the whole vehicle fleet. The captured video is coded and compressed. This solution shares the same or similar problems as the previously described solution.

Document US2015043782 discloses a method for detecting and displaying obstacles and data associated with the obstacles. A digital device displays both the captured image and the related information. For example, the distance of each person is overlaid on their images. The document mentions that the invention can solve the view problem of vehicle fleets.

As these described scenarios, and similar variants of them, may lead to general inconvenience and discomfort for the passengers and / or the driver in a platoon or dense traffic environment, it would be desired to find a solution.

SUMMARY

It is therefore an object of this invention to solve at least some of the above problems and improve the visibility of a vehicle.

According to a first aspect of the invention, this objective is achieved by a method in a control arrangement of an information transmitting unit. The method aims at providing information to an information receiving unit. The method comprises the steps of: collecting environmental data with at least one sensor. Further, the method comprises identifying an object in the environment of the information transmitting unit, which is considered relevant for the information receiving unit. The method also comprises extracting data related to the identified object from the collected environmental data. Also, the method furthermore comprises converting the extracted data into information. The method in addition comprises determining the position of the object based on the collected environmental data. Furthermore, the method comprises providing the converted information and the determined position of the object to the information receiving unit via a wireless transmitter, thereby enabling output of a representation of the object on an output device of the information receiving unit.

According to a second aspect of the invention, this objective is achieved by a control arrangement of an information transmitting unit, for providing information to an information receiving unit. The control arrangement is configured to collect environmental data with at least one sensor; identify an object in the environment of the information transmitting unit, which is considered relevant for the information receiving unit; extract data related to the identified object from the collected environmental data; convert the extracted data into information; determine the position of the object based on the collected environmental data; and provide the converted information and the determined position of the object to the information receiving unit via a wireless transmitter, thereby enabling output of a representation of the object on an output device of the information receiving unit.

According to a third aspect of the invention, this objective is achieved by a method in a control arrangement of an information receiving unit, for outputting a representation of an object detected by at least one sensor of an information transmitting unit, based on information obtained from the information transmitting unit. The method comprises receiving information concerning the object and position of the object from the information transmitting unit via a wireless receiver. Also, the method further comprises converting the received information concerning the object into a representation of the object. The method in addition comprises outputting the representation of the object on an output device of the information receiving unit.

According to a fourth aspect of the invention, this objective is achieved by a control arrangement of an information receiving unit, for outputting a representation of an object detected by at least one sensor of an information transmitting unit, based on information obtained from the information transmitting unit. The control arrangement is configured to receive information concerning the object and position of the object from the information transmitting unit via a wireless receiver. The control arrangement is also configured to convert the received information concerning the object into a representation of the object. Further, the control arrangement is configured to output the representation of the object on an output device of the information receiving unit.

Thanks to the described aspects, by converting the extracted data into information about an object which is considered relevant for the information receiving unit, information may be provided with low time latency to the information receiving unit. Thereby, the information receiving unit may obtain information in real time or with a very low time delay, enabling output of a representation of the object on an output device of the information receiving unit. Thereby, the driver of another vehicle/ the information receiving unit may be informed about the object detected by the first vehicle/ information transmitting unit without any substantial time delay, enabling the driver to prepare for an appropriate action due to the detected object. It is thereby avoided that the driver of the other vehicle/ the information receiving unit is surprised by a suddenly occurring event such as a hard brake, speed bump, etc., which in a worst-case scenario may cause an accident. Thus, traffic safety is enhanced.

Other advantages and additional novel features will become apparent from the subsequent detailed description.

FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1A illustrates an embodiment of a group of vehicles.

Figure 1B illustrates vehicles transmitting information between each other.

Figure 1C illustrates a vehicle transmitting information to a control tower.

Figure 2A illustrates a vehicle interior according to an embodiment of the invention.

Figure 2B illustrates a vehicle interior according to an embodiment of the invention.

Figure 2C illustrates a vehicle interior according to an embodiment of the invention.

Figure 3A illustrates vehicles transmitting information between each other.

Figure 3B illustrates vehicles transmitting information between each other.

Figure 3C illustrates vehicles transmitting information between each other.

Figure 4 is a flow chart illustrating an embodiment of a first method.

Figure 5 is an illustration depicting a control arrangement of an information transmitting unit according to an embodiment.

Figure 6 is a flow chart illustrating an embodiment of a second method.

Figure 7 is an illustration depicting a control arrangement of an information receiving unit according to an embodiment.

DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as control arrangements and methods in control arrangements, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.

Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.

Figure 1A illustrates a scenario wherein a number of vehicles 100a, 100b, 100c are driving in a driving direction 105, with inter-vehicular distances d1, d2. The vehicles 100a, 100b, 100c may be coordinated and organised in a group 110 of vehicles, which may be referred to as a platoon. However, the vehicles 100a, 100b, 100c may be non-coordinated, for example standing sequentially after each other in a traffic congestion, or just driving/ standing in the vicinity of each other. The involved vehicles 100a, 100b, 100c may not necessarily be driving in the same direction 105, and / or in the same file; or even be driving at all, i.e. one or more vehicles 100a, 100b, 100c may be stationary. In fact, one or more of the vehicles 100a, 100b, 100c in the group 110 may be referred to as a structure rather than a vehicle, such as e.g. a building, a control tower, a lamp post, a traffic sign, etc. However, at least one vehicle 100a, 100b, 100c in the group 110 is blocking at least a part of the view of at least one other vehicle 100a, 100b, 100c in the group 110.

In embodiments wherein the vehicle group 110 comprises a platoon, it may be described as a chain of coordinated, inter-communicating vehicles 100a, 100b, 100c travelling at given inter-vehicular distances d1, d2 and velocity.

The inter-vehicular distances d1, d2 may be fixed or variable in different embodiments. Thus, the distances d1, d2 may be e.g. some centimetres, some decimetres, some meters or some tens of meters in some embodiments. Alternatively, each vehicle 100a, 100b, 100c in the group 110 may have a different distance d1, d2 to the following, or leading, vehicle 100a, 100b, 100c than all other vehicles 100a, 100b, 100c in the coordinated group 110.

The vehicles 100a, 100b, 100c in the group 110 may comprise vehicles of the same, or different, types in different embodiments, such as trucks, multi-passenger vehicles, trailers, cars, etc.; and / or structures such as buildings, road infrastructures, etc.

The vehicles 100a, 100b, 100c may be driver controlled or driverless, autonomously controlled vehicles in different embodiments. However, for enhanced clarity, the vehicles 100a, 100b, 100c are subsequently described as having a driver.

The vehicles 100a, 100b, 100c in the group 110 may be coordinated, or communicate, via wireless signals. Such wireless signals may comprise, or at least be inspired by, wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), or optical communication such as Infrared Data Association (IrDA) or infrared transmission, to name but a few possible examples of wireless communication in some embodiments.

In some embodiments, the communication between vehicles 100a, 100b, 100c in the group 110 may be performed via vehicle-to-vehicle (V2V) communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices. DSRC operates in the 5.9 GHz band with a bandwidth of 75 MHz and an approximate range of 1000 m in some embodiments.

The wireless communication may be made according to any IEEE standard for wireless vehicular communication like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.

The communication may alternatively be made over a wireless interface comprising, or at least being inspired by, radio access technologies such as e.g. Third Generation Partnership Project (3GPP) 5G/ 4G, 3GPP Long Term Evolution (LTE), LTE-Advanced, Groupe Special Mobile (GSM), or similar, just to mention a few options, via a wireless communication network.

In some embodiments, when the vehicles 100a, 100b, 100c in the group 110 are coordinated and are communicating, the driver of the first vehicle 100a drives the own vehicle 100a and the other vehicles 100b, 100c in the group 110 merely follow the driving commands of the first vehicle 100a.

A non-leading vehicle 100b, 100c driving in the platoon 110 has a restricted sight as the leading vehicle 100a obstructs the field of view of the following vehicles 100b, 100c. This phenomenon may occur also in a vehicle queue, a traffic congestion, a parking lot, etc.

According to embodiments disclosed herein, one vehicle 100a in the group 110, typically the first vehicle 100a of the group 110, comprises one or several sensors 130, of the same or different types. The sensor 130 may be a forwardly directed sensor 130 in some embodiments. In the illustrated embodiment, which is merely an arbitrary example, the forwardly directed sensor 130 may be situated e.g. at the front of the first vehicle 100a of the group 110, behind the windscreen of the vehicle 100a.

The sensor 130 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.

In some embodiments, the sensor 130 may comprise e.g. a motion detector and / or be based on a Passive Infrared (PIR) sensor sensitive to a person's skin temperature through emitted black body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; or by emitting a continuous wave of microwave radiation and detecting motion through the principle of Doppler radar; or by emitting an ultrasonic wave and detecting and analysing the reflections; or by a tomographic motion detection system based on detection of radio wave disturbances, to mention some possible implementations.

Figure 1B illustrates a scenario wherein the sensor 130 of one/ the first vehicle 100a of the group 110 is detecting an object 200. The object 200 in the illustrated example is a human, but it may be any kind of object which may be considered relevant for another vehicle 100a, 100b, 100c in the group 110, such as an obstacle on the road, an animal at or in the vicinity of the road, another vehicle, a speed barrier, a structure at or close to the road such as a road sign, a traffic light, a crossing road and vehicles thereupon, etc.

The sensor 130 may comprise or be connected to a control arrangement configured for image recognition/ computer vision and object recognition. Computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.

The sensor data of the sensor/-s 130 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner, data of a lidar, radar, etc; or a combination thereof.

When an object 200 is detected by the sensor 130 of the first vehicle/ information transmitting unit 100a, which is considered relevant for the other vehicles/ information receiving units 100b, 100c in the group 110 by the control arrangement, information 210 comprising a simplified representation of the object 200 may be provided to the other vehicles/ information receiving units 100b, 100c.

Thus, instead of transmitting the whole data of the sensor 130, information 210 concerning the perceived environment may be transmitted as a simplified cartoon with standard images saved a priori, e.g., traffic light, pedestrian, car, bicycle, bus, etc. This data can then be used by the following vehicles to create a "cartooned" image of the obstructed field of view.
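By way of illustration only (the following Python sketch is not part of the application; the field layout, class references and values are assumptions chosen for the example), the information 210 for one detected object could be packed into a payload of roughly a dozen bytes, rather than the raw sensor data:

import struct

# Hypothetical references to prestored "cartoon" representations; the actual
# contents of the tables 320a, 320b are not specified here.
OBJECT_CLASS_REFS = {"traffic_light": 1, "pedestrian": 2, "car": 3, "bicycle": 4, "bus": 5}

# One object: class reference (uint8) + distance, lateral offset and speed (float32) = 13 bytes.
PAYLOAD_FORMAT = "<Bfff"

def encode_object(obj_class, distance_m, lateral_m, speed_mps):
    """Transmitter side: convert extracted object data into a compact payload."""
    return struct.pack(PAYLOAD_FORMAT, OBJECT_CLASS_REFS[obj_class], distance_m, lateral_m, speed_mps)

def decode_object(payload):
    """Receiver side: unpack the payload back into object data."""
    ref, d, lat, v = struct.unpack(PAYLOAD_FORMAT, payload)
    return {"class_ref": ref, "distance_m": d, "lateral_m": lat, "speed_mps": v}

message = encode_object("pedestrian", distance_m=23.5, lateral_m=-1.2, speed_mps=1.4)
print(len(message), decode_object(message))  # 13 bytes instead of a stream of video frames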

The transmission of this information is much simpler and faster, allowing for real-time communication of what is in front of the vehicle/ information transmitting unit 100a having detected the object 200. The drivers of the following vehicles/ information receiving units 100b, 100c are informed about what is happening in front of the leading vehicle/ information transmitting unit 100a and can react accordingly, e.g. by preparing for a brake, for slowing down, for a speed bump, etc.

By being able to receive real-time information without time lag, or with only a negligible time lag, the drivers of the other vehicles/ information receiving units 100b, 100c could become aware of the environmental traffic situation and prepare accordingly. Also, certain functions in the vehicles/ information receiving units 100b, 100c could be activated in some embodiments, triggered by the detected obstacle 200, such as tightening the safety belt when a hard brake could be expected, etc., or the vehicles 100a, 100b, 100c could activate some automatic functions to avoid the impact, e.g. slowing down or changing trajectory. Thereby traffic safety is increased, and / or the impact of any traffic accident is eliminated or at least decreased.

Figure 1C illustrates an embodiment wherein the group 110 comprises a vehicle 100a driving on a road 120, and a control room 100b. In the illustrated embodiment, the vehicle 100a is an information transmitting unit while the control room 100b is an information receiving unit.

The vehicle/ information transmitting unit 100a may be an unmanned vehicle, remotely controlled by a human driver in the control room/ information receiving unit 100b. In other embodiments, the vehicle/ information transmitting unit 100a may be an autonomous vehicle, while the human driver in the control room/ information receiving unit 100b is only alerted when the autonomous vehicle is experiencing an unknown/ undefined problem.

However, the situation may in other embodiments be the opposite, i.e. the control room 100b may be the information transmitting unit while the vehicle 100a may be the information receiving unit. In this embodiment, the vehicle 100a may be a manned vehicle and a sensor 130 in the control room 100b may detect information which may be provided to the driver of the vehicle 100a, such as for example map/ directional information; working instructions for mining/ agriculture/ forestry, etc.

The sensor 130 of the information transmitting unit 100a may detect an object 200. The control arrangement of the information transmitting unit 100a may then determine that the detected object 200 is relevant, and information 210 comprising a simplified representation of the object 200 may be provided to the information receiving unit 100b. At the information receiving unit 100b, the information 210 may be received via the wireless receiver 140b and outputted on an output device such as a display or similar.

Thereby, a human monitoring the vehicle 100a, or a plurality of vehicles comprised in the group 110, may become aware of the object 200 detected by the sensor 130 of the vehicle/ information transmitting unit 100a in real time, or almost real time, without risking the time delay that would result if all the sensor data of the object 200 were transferred. The human may thereby react and determine an appropriate action of the vehicle/ information transmitting unit 100a, e.g. by sending a command or instructions on how to handle the situation due to the detected object 200.

Figure 2A illustrates an example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c.

The vehicle/ information receiving unit 100b comprises a control arrangement 230 for outputting a representation of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a. The information transmitting unit 100a comprises a transmitter 140a, transmitting wireless data to be received by a receiver 140b in the information receiving unit 100b.

Further, the information receiving unit 100b comprises an output device 240 in the form of a display, loudspeaker and / or a tactile device. Alternatively, the output device 240 may comprise a pair of intelligent glasses, i.e. an optical head-mounted display that is designed in the shape of a pair of eyeglasses; a set of portable head-up displays; or a device for illustrating Augmented Reality (AR).

By receiving the simplified information 210 of the object 200 via the wireless communication from the transmitter 140a of the information transmitting unit 100a, and outputting a representation of the object 200 on the output device 240 in the form of a cartooned object, the driver of the vehicle/ information receiving unit 100b becomes aware of the object 200 in real time, or almost real time.

Figure 2B illustrates yet an example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c, similar to the scenario illustrated in Figure 2A.

In this case, the information concerning the detected object 200, in this case an animal, triggers the output of a prestored image of an animal on the output device 240. The image may be prestored at a memory device of the first vehicle/ information transmitting unit 100a and transmitted to the other vehicles/ information receiving unit 100b, 100c. Alternatively, in some embodiments, the image may be prestored at a memory device of the other vehicles/ information receiving unit 100b, 100c and only a reference to the image is transferred from the first vehicle/ information transmitting unit 100a to the other vehicles/ information receiving unit 100b, 100c.

The position of the detected object 200 relative to the sensor 130, or vehicle 100a, may also be determined and provided to the information receiving unit 100b.

Figure 2C illustrates yet an example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c, similar to the scenario illustrated in Figures 2A-2B.

In this case, the information concerning the detected object 200, in this case an animal, triggers the output of a highly stylized image, which may be prestored, of a detected obstacle on the output device 240. The image may be prestored at a memory device of the first vehicle/ information transmitting unit 100a and transmitted to the other vehicles/ information receiving unit 100b, 100c. Alternatively, in some embodiments, the image may be prestored at a memory device of the other vehicles/ information receiving unit 100b, 100c and only a reference to the image is transferred from the first vehicle/ information transmitting unit 100a to the other vehicles/ information receiving unit 100b, 100c.

Figure 3A illustrates an example of information transfer, in some embodiments, and the vehicles 100a, 100b as regarded from above.

The sensor/-s 130 of the first vehicle/ information transmitting unit 100a may determine the distance D and / or the lateral displacement L of the object 200 in relation to the sensor 130 and / or the vehicle 100a (or some other reference point). The distance/ lateral displacement may for example be determined by radar, lidar, etc., by triangulation of sensor signals captured by sensors 130 situated at different locations on the vehicle 100a, or by capturing an image and performing an image analysis.

The determined information concerning the relative position of the detected object 200, such as e.g. D and L, may then be provided to the information receiving unit 100b in some embodiments, together with information representing the object 200. In other embodiments, an absolute position of the detected object 200 may be calculated, based on the determined relative position of the object 200 and an absolute geographical position of the vehicle 100a.

By determining the position of the object 200 in relation to the sensor 130 and / or the vehicle 100a, it becomes possible to recreate, at the receiver side, the position of the object 200 in relation to the vehicle 100a, and also in relation to other detected objects 200, if any.

At the information receiving unit 100b, calculations may be made for outputting a simplified representation 330 of the object 200 at a relative position on the output device 240, thereby providing a view of the environmental situation of the vehicle 100a, comprising the objects 200 considered important/ relevant. The human driver/ operator at the information receiving unit 100b may thereby obtain a view of the situation which gives him/ her an intuitive understanding of the scenario detected by the sensors 130 of the first vehicle/ information transmitting unit 100a.

Thereby, information concerning the detected object 200 is provided without (relevant) time delay, in a way which is immediately possible to interpret and understand by the human driver/ operator. Thus, traffic safety is enhanced.

Figure 3B illustrates an example of information transfer, in some embodiments.

In the illustrated embodiment, the information transmitting unit 100a and the information receiving unit 100b comprise a common table 320a, 320b wherein some different examples of object representations are stored, each associated with a reference, such as e.g. a number. The table 320a of the information transmitting unit 100a may be stored in a memory 300 of the information transmitting unit 100a, while the table 320b of the information receiving unit 100b may be stored in a memory 310 of the information receiving unit 100b.

When the sensor 130 of the information transmitting unit 100a detects the object 200, in this case a pedestrian in front of the information transmitting unit 100a, the control arrangement 220 of the information transmitting unit 100a may determine that the object 200 is relevant for the information receiving unit 100b and that the object 200 is categorised as a pedestrian/ human. A reference number, in this case "1", referring to the representation of the object 200 in the table 320a may then be transmitted to the information receiving unit 100b, via the transmitter 140a of the information transmitting unit 100a.

At the information receiving unit 100b, upon receiving the information, in this case the reference "1", by the receiver 140b, a search may be made in the table 320b of the memory 310, to determine which representation corresponds to the received reference. In this case, the reference "1" corresponds to a representation 330 of a human, which then may be output on the output device 240 of the information receiving unit 100b.

Figure 3C illustrates yet an example of information transfer, in some embodiments rather similar to the embodiment disclosed in Figure 3B. The difference between the embodiment of Figure 3C and the previously discussed embodiment of Figure 3B is that the prestored representations 330 stored in the respective tables 320a, 320b may be user-selected. Thereby, the output representation 330 is personalised according to personal preferences of the users/ drivers. The output representation 330 may thereby become easier for the user to identify.
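As a purely illustrative sketch of the table mechanism of Figures 3B and 3C (the categories, references and file names below are assumptions, not contents prescribed by this description), the lookup on both sides may be as simple as:

# Table 320a at the transmitter: object category -> reference number.
TABLE_320A = {"pedestrian": 1, "animal": 2, "car": 3}

# Table 320b at the receiver: reference number -> prestored (possibly user-selected) representation.
TABLE_320B = {1: "icons/pedestrian.png", 2: "icons/animal.png", 3: "icons/car.png"}

def to_reference(object_category):
    """Transmitter side: select the reference that is sent over the wireless link."""
    return TABLE_320A[object_category]

def to_representation(reference):
    """Receiver side: resolve the received reference to the prestored representation 330."""
    return TABLE_320B[reference]

ref = to_reference("pedestrian")      # only this small integer is transmitted
print(to_representation(ref))         # representation shown on the output device 240

For such references to be meaningful, the two tables must of course agree; the coordination steps 404/602 described later could, for example, amount to exchanging a table version identifier before any references are used.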

Figure 4 illustrates an example of a method 400 in a control arrangement 220 of an information transmitting unit 100a, according to an embodiment. The flow chart in Figure 4 shows the method 400 for providing information 210 to an information receiving unit 100b, 100c.

The information transmitting unit 100a may comprise a vehicle in a group 110 of vehicles, comprising also an information receiving unit 100b, 100c.

In order to be able to correctly provide information 210 to the information receiving unit 100b, 100c, the method 400 may comprise a number of steps 401-407. However, some of the described method steps 401-407, such as e.g. step 404, may be performed only in some embodiments. The described steps 401-407 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:

Step 401 comprises collecting environmental data with at least one sensor 130. The sensor 130, or plurality of sensors (of the same or different types) as may be the case, may be comprised onboard the information transmitting unit 100a, i.e. on board the vehicle.

The sensor/ -s 130 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.

Step 402 comprises identifying an object 200 in the environment of the information transmitting unit 100a, which is considered relevant for the information receiving unit 100b, 100c and / or the information transmitting unit 100a. The object 200 may be detected based on sensor data obtained from the at least one sensor 130, e.g. sensor data fused from a plurality of sensors 130 in some embodiments.

The object 200 may be considered relevant when comprised in a list of entities predetermined to be relevant, comprising e.g. any arbitrary object situated on the road 120 in front of the vehicle/ information transmitting unit 100a within a predetermined distance; a traffic sign associated with the road 120, within a predetermined distance; a traffic light associated with the road 120, within a predetermined distance (the information including the concurrent colour of the traffic light); road structure such as bends, curves, crossings; marks on the road 120 indicating a pedestrian crossing, a speed bump, a hole or other irregularity in the road surface; a building or other structure in the vicinity of the road 120, etc.

Thereby, a filtering of information is made, neglecting irrelevant information around the vehicle/ information transmitting unit 100a. Thus, only relevant information is communicated to the information receiving unit 100b, 100c, leading to a reduced communication delay. It also becomes easier for the human driver/ operator at the information receiving unit 100b, 100c to immediately detect the representation 330 of the relevant object 200, as disturbing non-relevant objects have been filtered out.

Step 403 comprises extracting data related to the identified 402 object 200 from the collected 401 environmental data.

The extracted environmental data may comprise e.g. type of object 200, relative/ absolute position of the object 200, direction of motion, speed, size of the object 200, distance D between the sensor 130 and the object 200, colour of the object 200, etc.
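A minimal sketch of steps 402-403, assuming a whitelist of relevant object types and a simple distance criterion (both chosen here only for illustration, not prescribed by the description), could look as follows:

from dataclasses import dataclass

RELEVANT_TYPES = {"pedestrian", "animal", "vehicle", "traffic_sign", "traffic_light", "speed_bump"}
MAX_DISTANCE_M = 150.0  # assumed relevance horizon

@dataclass
class Detection:
    obj_type: str       # classified type of the object 200
    distance_m: float   # distance D between the sensor 130 and the object
    lateral_m: float    # lateral displacement L relative to the sensor
    speed_mps: float
    colour: str

def extract_relevant(detections):
    """Keep only relevant objects (step 402) and extract a compact record per object (step 403)."""
    records = []
    for det in detections:
        if det.obj_type in RELEVANT_TYPES and det.distance_m <= MAX_DISTANCE_M:
            records.append({"type": det.obj_type,
                            "distance_m": det.distance_m,
                            "lateral_m": det.lateral_m,
                            "speed_mps": det.speed_mps})
    return records

print(extract_relevant([Detection("pedestrian", 23.5, -1.2, 1.4, "n/a"),
                        Detection("building", 80.0, 15.0, 0.0, "grey")]))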

Step 404, which may be performed only in some embodiments, comprises coordinating the tables 320a, 320b comprising the prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c.

This method step may only be performed in embodiments wherein the prestored representations 330 are stored in tables 320a, 320b in respective memories 300, 310 of the information transmitting unit 100a and the information receiving unit 100b. The prestored representations 330 in the respective memories 300, 310 may be identical in some embodiments, or only representing the same kind of object in other embodiments. However, the prestored representations 330 are associated with the same references. Thereby, only the reference has to be communicated from the information transmitting unit 100a to the information receiving unit 100b, leading to a further reduced time delay during the information transfer.

Step 405 comprises converting the extracted 403 data into information 210.

The conversion 405 of the extracted 403 data into information 210 may comprise selecting a prestored representation 330 of the object 200 in a memory 300 of the information transmitting unit 100a.

The conversion 405 of the extracted 403 data into information 210 may in some embodiments comprise selecting a prestored representation 330 of the identified 402 object 200 in a table 320a, 320b stored in both a memory 300 of the information transmitting unit 100a and a memory 310 of the information receiving unit 100b, 100c. Further, the conversion 405 of the extracted 403 data into information 210 may also comprise determining a reference to the selected prestored representation 330 in the table 320a, 320b.

Step 406 comprises determining position of the object 200 based on the collected 401 environmental data.

The position of the object 200 may be related to the sensor 130/ vehicle 100a, comprising e.g. the distance D between the sensor 130/ vehicle 100a and the object 200; the lateral displacement L in relation to the sensor 130/ vehicle 100a; the position in height of the object 200 (above the road surface), etc.

However, the position of the object 200 may also be an absolute position in some embodiments, determined based on the absolute geographical position of the vehicle 100a, as determined by a positioning unit of the vehicle 100a, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.

The determination of the geographical position of the positioning unit (and thereby also of the vehicle 100a) may be made continuously at certain predetermined or configurable time intervals according to various embodiments.

The absolute position of the object 200 may be determined based on the geographical position of the vehicle 100a, in addition to the relative position of the object 200 in relation to the vehicle 100a.
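As a hedged numerical illustration (the flat-earth approximation and the sign conventions below are assumptions suited to short sensor ranges, not a method prescribed here), the absolute position could be estimated roughly as:

import math

EARTH_RADIUS_M = 6_371_000.0

def absolute_position(vehicle_lat, vehicle_lon, heading_deg, distance_m, lateral_m):
    """Flat-earth estimate of the object's absolute position from the vehicle's own positioning fix.

    distance_m is measured along the vehicle's heading, lateral_m to the right of it;
    heading_deg is measured clockwise from north.
    """
    heading = math.radians(heading_deg)
    north = distance_m * math.cos(heading) - lateral_m * math.sin(heading)
    east = distance_m * math.sin(heading) + lateral_m * math.cos(heading)
    lat = vehicle_lat + math.degrees(north / EARTH_RADIUS_M)
    lon = vehicle_lon + math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(vehicle_lat))))
    return lat, lon

# Example: an object 25 m ahead and 2 m to the right of a vehicle heading due east.
print(absolute_position(59.3293, 18.0686, 90.0, 25.0, 2.0))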

Step 407 comprises providing the converted 405 information 210 and the determined 406 position of the object 200 to the information receiving unit 100b, 100c via a wireless transmitter 140a, thereby enabling output of a representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.

In embodiments wherein the conversion 405 of the extracted 403 data into information 210 comprises selecting a prestored representation 330 of the object 200, the provided 407 information 210 may comprise the selected prestored representation 330.

The provided 407 information 210 may comprise the determined reference to the selected prestored representation 330 in the table 320a, 320b. Furthermore, the provided 407 information 210 may comprise various data defining the object 200, such as e.g. type, direction of motion, speed, size of the object 200, distance D between the sensor 130 and the object 200, colour of the object 200, etc., in various embodiments. The provided 407 information 210 may in some embodiments comprise data in object form.

Figure 5 illustrates an embodiment of a control arrangement 220 of an information transmitting unit 100a. The control arrangement 220 aims at performing at least some of the method steps 401-407 according to the above described method 400 for providing information 210 to an information receiving unit 100b, 100c.

The control arrangement 220 is configured to collect environmental data with at least one sensor 130. Further, the control arrangement 220 is configured to identify an object 200 in the environment of the information transmitting unit 100a, which is considered relevant for the information receiving unit 100b, 100c. Also, the control arrangement 220 is further configured to extract data related to the identified object 200 from the collected environmental data. The control arrangement 220 is in addition also configured to convert the extracted data into information 210. Furthermore, the control arrangement 220 is configured to determine position of the object 200 based on the collected environmental data. The control arrangement 220 is configured to provide the converted information 210 and the determined position of the object 200 to the information receiving unit 100b, 100c via a wireless transmitter 140a, thereby enabling output of a representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.

The control arrangement 220 may in some embodiments be configured to convert the extracted data into information 210 by selecting a prestored representation 330 of the object 200. Further, the control arrangement 220 may be configured to provide, via the wireless transmitter 140a, information comprising the selected prestored representation 330.

In some embodiments, the control arrangement 220 may be configured to convert the extracted data into information 210 by selecting a prestored representation 330 of the identified object 200 in a table 320a, 320b stored in both a memory 300 of the information transmitting unit 100a and a memory 310 of the information receiving unit 100b, 100c. The control arrangement 220 may be configured to determine a reference to the selected prestored representation 330 in the table 320a, 320b, wherein the provided information 210 comprises the determined reference.

The control arrangement 220 may also in some embodiments be configured to coordinate the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the converted information 210 is provided.

Also, the control arrangement 220 may be configured to provide information 210 comprising data in object form, in some embodiments.

The control arrangement 220 comprises a receiving circuit 510 configured for collecting information from a sensor 130.

The control arrangement 220 further comprises a processing circuitry 520 configured for providing information 210 to an information receiving unit 100b, 100c by performing the described method 400 according to at least some of the steps 401-407.

Such processing circuitry 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processing circuitry" may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.

Furthermore, the control arrangement 220 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.

Further, the control arrangement 220 may comprise a signal transmitting circuit 530. The signal transmitting circuit 530 may be configured for transmitting information 210 to at least some information receiving unit 100b, 100c.

The previously described method steps 401-407 to be performed in the control arrangement 220 may be implemented through the one or more processing circuits 520 within the control arrangement 220, together with a computer program product for performing at least some of the functions of the steps 401-407. Thus, a computer program product, comprising instructions for performing the steps 401-407 in the control arrangement 220, may perform the method 400 comprising at least some of the steps 401-407 for providing information 210 to the information receiving unit 100b, 100c, when the computer program is loaded into the one or more processing circuits 520 of the control arrangement 220. The described steps 401-407 may thus be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuits 520 in the control arrangement 220.

The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-407 according to some embodiments when being loaded into the one or more processing circuits 520 of the control arrangement 220. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 220 remotely, e.g., over an Internet or an intranet connection.

Figure 6 illustrates an example of a method 600 in a control arrangement 230 of an information receiving unit 100b, 100c, according to an embodiment. The flow chart in Figure 6 shows the method 600 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.

The information receiving unit 100b, 100c may comprise a vehicle in a group 110 of vehicles, comprising also the information transmitting unit 100a.

In order to be able to correctly output the object representation 330, the method 600 may comprise a number of steps 601-604. However, some of the described method steps 601-604 may be performed in a somewhat different chronological order than the numbering suggests. The method 600 may comprise the subsequent steps:

Step 601 comprises receiving information 210 concerning the object 200 and position of the object 200 from the information transmitting unit 100a via a wireless receiver 140b.

Step 602, which may be performed only in some embodiments, comprises coordinating the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the information 210 concerning the object 200 is received 601.

Step 603 comprises converting the received 601 information 210 concerning the object 200 into a representation 330 of the object 200. The conversion 603 of the received 601 information 210 into the representation 330 of the object 200 may comprise selecting the representation 330 of the object 200 based on the received 601 information 210, in some embodiments.

The conversion 603 may optionally comprise extracting a reference to a prestored representation 330 in a table 320a, 320b stored in both a memory 310 of the information receiving unit 100b, 100c and a memory 300 of the information transmitting unit 100a, from the received 601 information 210. Further, the conversion 603 may comprise selecting the prestored representation 330 of the object 200 in the table 320a, 320b stored in the memory 310 of the information receiving unit 100b, 100c, based on the extracted reference.
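A corresponding receiving-side sketch of steps 601, 603 and 604, under the same illustrative assumptions about the message format as in the transmitting-side sketch above (the table contents, LOCAL_TABLE, ConsoleOutputDevice and handle_received_information are all hypothetical names):

LOCAL_TABLE = {1: "pedestrian_icon", 2: "cyclist_contour", 3: "car_contour"}  # table 320b

class ConsoleOutputDevice:
    """Stand-in for the output device 240, e.g. a head-up display."""
    def show(self, representation: str, position) -> None:
        print(f"Showing '{representation}' at {position}")

def handle_received_information(message: dict, output_device) -> None:
    ref = message["representation_ref"]                       # step 603: extract the reference
    representation = LOCAL_TABLE.get(ref, "generic_warning")  # select the prestored representation 330
    output_device.show(representation, message["position"])   # step 604: output the representation

# Example: the receiver obtains only a reference and a position, never the raw sensor data.
handle_received_information(
    {"representation_ref": 1, "position": (12.5, 3.0)}, ConsoleOutputDevice()
)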

The representation 330 of the object 200 may be a simplified or cartooned version of the object 200 in some embodiments. The representation 330 may for example comprise only a contour of the object 200, a geometric figure, a stylised illustration, a colour, a sound, a text, a tactile signal, etc., or possibly a combination thereof. By only transferring and outputting a reduced amount of information, in comparison with transferring and outputting a streamed video or full image, any transfer delay may be avoided, minimised or at least reduced. Thereby, the driver is informed about the object 200 in real time, or almost real time. The driver thereby has time to react to the upcoming situation, so that an accident may be avoided and / or the impact of the accident may be reduced.
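As a rough, assumed illustration of the size difference: a table reference plus a two-dimensional position can be packed into a handful of bytes, whereas even a single compressed video frame is typically tens of kilobytes. The packing format below is an assumption, not part of the application.

import struct

# One byte for the table reference, two 32-bit floats for the position (assumed format).
message = struct.pack("<Bff", 1, 12.5, 3.0)
print(len(message))   # 9 bytes of information 210, versus roughly 50 000 bytes for one video frame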

The representation 330 of the object 200 in the table 320b may be configurable by a user of the information receiving unit 100b, 100c, in some embodiments.
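Configurability could, as one assumption among several, simply mean letting the user overwrite an entry in the local table 320b (continuing the hypothetical LOCAL_TABLE from the receiving-side sketch above):

def configure_representation(table: dict, ref: int, new_representation: str) -> None:
    """Let the user replace the prestored representation 330 behind a given reference."""
    table[ref] = new_representation

configure_representation(LOCAL_TABLE, 3, "truck_silhouette")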

Step 604 comprises outputting the representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.

The output device 240 may comprise a visual output device such as a screen; a head-up display; a projector projecting the image onto either the road 120 or the back of the vehicle ahead; a set of close-to-eye displays / intelligent glasses / lenses, i.e. an optical head-mounted display; a loudspeaker; a tactile device; and / or a combination thereof. The output device 240 may in some embodiments be configured for Augmented Reality (AR) and / or Virtual Reality (VR).

Figure 7 illustrates an embodiment of a control arrangement 230 of an information receiving unit 100b, 100c, for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a. The control arrangement 230 is configured to perform at least some of the above described method steps 601 -604 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.

The control arrangement 230 is configured to receive information 210 concerning the object 200 and the position of the object 200 from the information transmitting unit 100a via a wireless receiver 140b. Also, the control arrangement 230 is configured to convert the received information 210 concerning the object 200 into a representation 330 of the object 200. The control arrangement 230 is furthermore configured to output the representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.

In some embodiments, the control arrangement 230 may be configured to convert the received information 210 into the representation 330 of the object 200 by selecting the representation 330 of the object 200 based on the received information 210. The control arrangement 230 may be further configured to convert the received information 210 into the representation 330 of the object 200 by extracting a reference to a prestored representation 330 in a table 320a, 320b stored in both a memory 310 of the information receiving unit 100b, 100c and a memory 300 of the information transmitting unit 100a, from the received information 210. Also, the control arrangement 230 may be configured to select the prestored representation 330 of the object 200 in the table 320a, 320b stored in the memory 310 of the information receiving unit 100b, 100c, based on the extracted reference.

In some embodiments, the control arrangement 230 may also be configured to coordinate the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the information 210 concerning the object 200 is received.

The control arrangement 230 may in some embodiments be further configured to enable a user of the information receiving unit 100b, 100c to configure the representation 330 of the object 200 in the table 320b.

Also, the control arrangement 230 may be configured to receive information 210 comprising data in object form, in some embodiments.

The control arrangement 230 comprises a receiving circuit 710 configured for collecting information 210 from a wireless transmitter 140a of an information transmitting unit 100a.

The control arrangement 230 further comprises a processing circuitry 720 configured for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a, by performing the described method 600 according to at least some of the steps 601-604.

Such processing circuitry 720 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processing circuitry" may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.

Furthermore, the control arrangement 230 may comprise a memory 725 in some embodiments. The optional memory 725 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 725 may comprise integrated circuits comprising silicon-based transistors. The memory 725 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.

Further, the control arrangement 230 may comprise a signal transmitting circuit 730. The signal transmitting circuit 730 may be configured for providing the representation 330 to the output device 240 at the information receiving unit 100b, 100c.

The previously described method steps 601-604 to be performed in the control arrangement 230 may be implemented through the one or more processing circuits 720 within the control arrangement 230, together with a computer program product for performing at least some of the functions of the steps 601-604. Thus, a computer program product, comprising instructions for performing the steps 601-604 in the control arrangement 230, may perform the method 600 comprising at least some of the steps 601-604 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a, when the computer program is loaded into the one or more processing circuits 720 of the control arrangement 230. The described steps 601-604 may thus be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuits 720 in the control arrangement 230.

The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 601-604 according to some embodiments when being loaded into the one or more processing circuits 720 of the control arrangement 230. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 230 remotely, e.g., over an Internet or an intranet connection.

The solution may further comprise a vehicle 100a, 100b, 100c, comprising a control arrangement 220 as illustrated in Figure 5 and / or a control arrangement 230 as illustrated in Figure 7.

The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described methods 400, 600, control arrangements 220, 230, computer program and / or vehicle 100a, 100b, 100c. Various changes, substitutions and / or alterations may be made, without departing from invention embodiments as defined by the appended claims.

As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and / or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and / or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and / or groups thereof.

A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored / distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via the Internet or other wired or wireless communication systems.