
Title:
TRANSFER OF FILE CONTENT AS PART OF A VIDEO STREAM
Document Type and Number:
WIPO Patent Application WO/2022/096261
Kind Code:
A1
Abstract:
A video processing system (2) comprises a processing unit (7), a video output interface (10) and at least one memory interface (12, 14). The processing unit (7) is configured to receive file content, to determine first pixel values for a first pixel according to a predefined color space based on the file content, wherein the first pixel values encode a bit string of the file content, and to determine second pixel values for a second pixel according to the color space independently of the file content. The processing unit (7) is configured to generate a frame for a video stream, the frame comprising the first pixel within a region of interest (22) of the frame and the second pixel outside of the region of interest (22). The processing unit (7) is configured to provide the video stream at the video output interface (10). (Fig. 2)

Inventors:
WONG CHUP CHUNG (IE)
SOJAN THOMAS (IE)
FARIA RODRIGO (IE)
Application Number:
PCT/EP2021/079030
Publication Date:
May 12, 2022
Filing Date:
October 20, 2021
Assignee:
CONNAUGHT ELECTRONICS LTD (IE)
International Classes:
H04N19/17; B60R1/20; G06F3/14; G06F11/10; H04N19/184; H04N19/186; H04N19/46; H04N19/65
Foreign References:
US 2018/0333646 A1 (2018-11-22)
US 2011/0187860 A1 (2011-08-04)
US 2004/0145661 A1 (2004-07-29)
US 2019/0181982 A1 (2019-06-13)
Other References:
ROBERT PRANDOLINI: "Scalable Video Coding Requirements for Video Surveillance Systems", no. M10822, 1 July 2004 (2004-07-01), XP030039632, retrieved from the Internet [retrieved on 2010-08-27]
Attorney, Agent or Firm:
JAUREGUI URBAHN, Kristian (DE)
Claims:

1. Video processing system for transfer of file content, wherein the video processing system (2) comprises a processing unit (7), a video output interface (10) for coupling the processing unit (7) to a receiving unit (11) and at least one memory interface (12, 14) for coupling the processing unit (7) to at least one memory unit (13, 15); and the processing unit (7) is configured to

- receive the file content via the at least one memory interface (12, 14);

- determine first pixel values for a first pixel according to a predefined color space based on the file content, wherein the first pixel values encode a bit string of the file content;

- determine second pixel values for a second pixel according to the color space independently of the file content;

- generate a frame for a video stream, the frame comprising the first pixel within a predefined region of interest (22) of the frame and the second pixel outside of the region of interest (22); and

- provide the video stream including the frame via the video output interface (10);

wherein a number of channels (C1, C1', C2) of the color space is given by a channel number, which is equal to or greater than two, and, for determining the first pixel values, the processing unit (7) is configured to divide the bit string into a number of substrings (S1, S1', S2) equal to the channel number; and use a first substring (S1, S1') of the number of substrings (S1, S1', S2) for a respective number of least significant bits of a first channel (C1, C1') of the color space.

2. Video processing system according to claim 1, characterized in that the processing unit (7) is configured to receive a binary file via the at least one memory interface (12, 14) and to divide the binary file into a plurality of portions depending on a size of the region of interest (22), wherein the file content corresponds to one of the plurality of portions.

3. Video processing system according to one of the preceding claims, characterized in that the processing unit (7) is configured to receive image data via the at least one memory interface (12, 14) and to determine the second pixel values depending on the image data.

4. Video processing system according to claim 1, characterized in that the processing unit (7) is configured to use a second substring (S2) of the number of substrings (S1, S1', S2) and a parity bit (P) for a respective number of least significant bits of a second channel (C2) of the color space.

5. Video processing system according to claim 1 or 4, characterized in that the processing unit (7) is configured to distribute the number of substrings (S1, S1', S2) among the channels (C1, C1', C2) such that each of the channels (C1, C1', C2) comprises at least one bit of the bit string.
6. Video processing system according to one of claims 1 to 3, characterized in that the color space comprises two or more channels (C1, C1', C2) and, for determining the first pixel values, the processing unit (7) is configured to divide a value of the bit string of the file content by a number of the two or more channels (C1, C1', C2) of the color space to obtain a quotient value and a remainder value (R); use the quotient value for a respective number of least significant bits of a first channel of the color space; and use a sum of the quotient value and the remainder value (R) for a respective number of least significant bits of a second channel (C2) of the color space.

7. Video processing system according to one of the preceding claims, characterized in that, for determining the first pixel values, the processing unit (7) is configured to use a bit string of a predefined bitmap image (26) for a respective number of most significant bits of the first channel (C1, C1').

8. Video processing system according to one of the preceding claims, characterized in that the processing unit (7) is configured to generate the frame such that the frame comprises a portion of pixels outside of the region of interest (22), wherein metadata are encoded by respective pixel values of the portion of pixels.

9. Video processing system according to one of the preceding claims, characterized in that the video output interface (10) comprises a serializer (10) coupled to the processing unit (7).

10. Video processing system according to claim 9, characterized in that the video processing system (2) comprises a deserializer (16) connected to the serializer (10), wherein the deserializer (16) is configured to be coupled to the receiving unit (11).

11. Motor vehicle comprising a video processing system (2) according to one of the preceding claims.
12. Method for transfer of file content as part of a video stream, wherein the method comprises using a processing unit (7) to receive the file content from at least one memory unit (13, 15); determine first pixel values for a first pixel according to a predefined color space based on the file content, wherein the first pixel values encode a bit string of the file content; determine second pixel values for a second pixel according to the color space independently of the file content; generate a frame for a video stream, the frame comprising the first pixel within a predefined region of interest (22) of the frame and the second pixel outside of the region of interest (22); and provide the video stream including the frame to a receiving unit (11); wherein a number of channels (C1, C1', C2) of the color space is given by a channel number, which is equal to or greater than two, and, for determining the first pixel values, the processing unit (7) is configured to divide the bit string into a number of substrings (S1, S1', S2) equal to the channel number; and use a first substring (S1, S1') of the number of substrings (S1, S1', S2) for a respective number of least significant bits of a first channel (C1, C1') of the color space.

13. Method according to claim 12, characterized in that a number of channels (C1, C1', C2) of the color space is given by a channel number, which is equal to or greater than two, and, for determining the first pixel values, the processing unit (7) is used to divide the bit string into a number of substrings (S1, S1', S2) equal to the channel number; and use a first substring (S1, S1') of the number of substrings (S1, S1', S2) for a respective number of least significant bits of a first channel (C1, C1') of the color space.

14. Computer program product comprising instructions, which, when executed by a computing system, in particular by a video processing system (2) according to one of claims 1 to 10, cause the computing system to carry out a method according to one of claims 12 or 13.

Description:
Transfer of file content as part of a video stream

The present invention is directed to a video processing system for transfer of file content, to a motor vehicle comprising such a video processing system and to a corresponding method for transfer of file content as part of a video stream.

Electronic control units, ECUs, in motor vehicles may transfer video streams via a video link. Therein, a first ECU may receive corresponding image data from a video source, such as a camera, or from a storage medium and transmit a corresponding video stream to a second ECU, which may for example correspond to an infotainment ECU of the vehicle. It may also be desirable to transfer files, in particular binary files, stored on the first ECU to the second ECU. The binary file may for example correspond to a stored video or to other content.

The different ECUs of a motor vehicle are typically connected via a vehicle network or a vehicle bus system, such as CAN. In principle, the file content could therefore be transmitted via the vehicle network. However, the data transfer rate via such vehicle networks is rather low, which leads to an unacceptably long transfer time when transferring large files. Alternatively, an ethernet connection could be established between the ECUs to improve the data transfer rate. However, this implies additional complexity of the system as well as additional costs, which may also not be acceptable.

Document US 2018/0333646 A1 describes a method for file transfer via a video port. Therein, a sending computer device generates a plurality of bitmap images based on the file and scans the bitmap images as frames to the video port. A receiving computer device receives the video frames via a video-in port and converts each of the frames back to a bitmap image including a header portion and a data portion. The receiving computer device reconstructs the file based on the reconstructed bitmap images according to the header portions.

A drawback of this approach is that, when viewed on a display, the transmitted video frames appear as random noise. On the one hand, this impairs the user experience in case the video frames are nevertheless displayed. On the other hand, it may be desirable to display additional information during the file transfer, for example a progress of the file transfer and so forth.

It is therefore an object of the present invention to provide an improved concept for file transfer via video link, which allows transfer of file content and displaying a video stream in parallel with reduced visual disturbances.

This object is achieved by the respective subject-matter of the independent claims. Further implementations and preferred embodiments are subject-matter of the dependent claims.

The improved concept is based on the idea to encode a part of the file content as pixel values for a first pixel, which is located in a predefined region of interest of a frame.

According to the improved concept, a video processing system for transfer of file content is provided. The video processing system comprises a processing unit, a video output interface for coupling the processing unit to a receiving unit and at least one memory interface for coupling the processing unit to at least one memory unit. The processing unit is configured to receive the file content via the at least one memory interface, in particular from the at least one memory unit, when the memory unit is coupled to the processing unit via the at least one memory interface. The processing unit is configured to determine first pixel values for a first pixel according to a predefined color space based on the file content, wherein the first pixel values encode a bit string of the file content. The processing unit is configured to determine second pixel values for a second pixel according to the color space independent of the file content. The processing unit is configured to generate a frame for a video stream, such that the frame comprises the first pixel within a predefined region of interest of the frame and the frame comprises the second pixel outside of the region of interest. The processing unit is configured to provide a video stream including the frame via the video output interface, in particular to the receiving unit when the receiving unit is coupled to the processing unit via the video output interface.

The video processing system may for example be designed as a video processing system for a motor vehicle. In other words, the video processing system may comprise one or more electronic control units, ECUs, wherein one of the ECUs may for example comprise the processing unit. The processing unit may for example be implemented as or comprise one or more integrated circuits, microcontrollers, systems-on-a-chip, SoC, and so forth. The receiving unit may, in particular, be a part of the video processing system or may be external to the video processing system. In some implementations, the video processing system may partially comprise the receiving unit. The receiving unit may be considered as a further processing unit and may comprise one or more further integrated circuits, microcontrollers, SoCs and so forth. The receiving unit may for example be comprised by a further ECU of the vehicle, for example by an infotainment ECU. For coupling the receiving unit to the processing unit via the video output interface, the receiving unit may comprise a respective video input interface and the video output interface may be coupled or connected to the video input interface, for example via a coaxial cable or via another video connection.

The at least one memory unit may be part of the video processing system or may be external to the video processing system. For example, the at least one memory unit may comprise a flash memory unit, a random-access memory unit, in particular a DDR-RAM, a hard disc or another storage medium.

The predefined color space may for example be characterized by a number of channels and a respective bit depth for each of the channels. The channels may for example correspond to color channels, an alpha channel characterizing transparency, et cetera. In case of a color channel, the bit depth may also be denoted as color depth of the respective channel. Pixel values according to the color space therefore correspond to a respective string of binary values or bit string with a length given by the corresponding bit depth for each channel. For example, considering an RGB888 color space, the pixel values of a given pixel comprise eight binary values for each of a red color channel, a green color channel and a blue color channel. Consequently, each pixel may carry 24 bits of information.
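As an illustrative sketch of this bookkeeping (the function names are ours, not from the application), the 24 bits carried by one RGB888 pixel can be packed into and recovered from a single word:

```python
def pack_rgb888(r: int, g: int, b: int) -> int:
    """Pack three 8-bit channel values into one 24-bit word."""
    assert all(0 <= v <= 0xFF for v in (r, g, b))
    return (r << 16) | (g << 8) | b

def unpack_rgb888(word: int) -> tuple[int, int, int]:
    """Recover the three 8-bit channel values from a 24-bit word."""
    return (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF
```

Each pixel thus provides exactly three bytes of capacity, whether those bytes encode a display color or, within the region of interest, a bit string of the file content.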

The information stored by the second pixel values is used, for example exclusively used, to encode a respective color and/or transparency of the pixel to be displayed according to image data. On the other hand, in case of the first pixel, at least a part of the first pixel values is used to store the bit string of the file content, which may not necessarily be related to a color to be displayed.

The file content may be a part of a file, in particular a binary file, received via the at least one memory interface. The file content comprises the bit string and may comprise one or more further bit strings. Explanations and steps described with respect to the bit string may also be transferred to the further bit strings of the file content.

A bit string is a string or series of respective binary values of a certain length. If the length of the bit string is 8, the bit string may also be denoted as a byte. However, a bit string may also have a different length.

The frame for the video stream represents a plurality of pixels including the first and the second pixel, wherein the plurality of pixels may for example be arranged according to a rectangular pixel grid. When displayed by means of a display unit, which may be comprised by or coupled to the receiving unit, each pixel is shown as a respective portion on a screen of the display unit having a corresponding color defined by the respective pixel values. The region of interest corresponds to one or more subsets of the plurality of pixels of the frame or the pixel grid, respectively, in particular to one connected subset, which may therefore be displayed in respective portions of the screen.

In particular, the region of interest is smaller than the pixel grid. In other words, the number of pixels comprised by the region of interest is smaller than the total number of the plurality of pixels of the frame. In particular, apart from the second pixel, a plurality of further second pixels outside of the region of interest may be generated by the processing unit in the same way as described for the second pixel. Analogously, apart from the first pixel, a plurality of further first pixels within the region of interest may be generated by the processing unit in the same way as described for the first pixel, in particular by encoding respective further bit strings of the file content by means of respective further first pixel values.

The first pixel values may not necessarily be identical to the respective binary values of the bit string. In other words, the first pixel values may, in addition to the bit string, encode further information. The further information may be static or predefined and may therefore, in particular, be used to reduce the impact of the visual noise or to camouflage the visual noise, respectively.

By means of the video processing system according to the improved concept, it is therefore possible to achieve a high-speed transfer of file content or of binary files, while at the same time the second pixel and the further second pixels may be displayed. In other words, the file content is wrapped in a regular video frame, enabling file transfer in parallel to video display. Considering, for example, an RGB888 color space, a region of interest of one megapixel and a frame rate of 20 frames/s, a bit rate of up to 480 Mbit/s may be achieved for transferring the binary file. As a comparison, conventional vehicle networks like CAN busses may achieve bit rates of only several Mbit/s.
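The quoted figure follows directly from the stated parameters; as a quick check:

```python
# Payload capacity per second = bits per pixel x ROI pixels x frame rate.
bits_per_pixel = 24        # RGB888: three channels of 8 bits each
roi_pixels = 1_000_000     # one-megapixel region of interest
frame_rate = 20            # frames per second

bit_rate = bits_per_pixel * roi_pixels * frame_rate
print(bit_rate)  # 480000000 bit/s, i.e. 480 Mbit/s
```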

The video processing system may also comprise an input interface for coupling the processing unit to a video source. The video source may for example comprise a camera of the vehicle or one of the at least one memory unit. Also the camera may be part of the video processing system or may be external to the video processing system.

The video processing system may for example be configured to operate in a normal mode and in a file transfer mode. When operating in the file transfer mode, the processing unit may carry out the steps described above, in particular generating the frame comprising the first and the second pixel. When operating in the normal mode, the processing unit may for example generate a further video stream comprising a plurality of consecutive frames without transferring file content. In the normal operating mode, the processing unit may for example receive image data via the input interface from the video source and generate the further video stream based on the image data. In the following, the video processing system is considered to operate in the file transfer mode, if not stated otherwise.

According to several implementations of the video processing system, the processing unit is configured to receive the binary file via the at least one memory interface and to divide the binary file into a plurality of portions depending on a predefined size of the region of interest, wherein the file content corresponds to one of the plurality of portions.

The size of the region of interest may, in particular, be given by a total number of pixels of the region of interest. By dividing the binary file into the plurality of portions depending on the size of the region of interest, each portion can be transferred or encoded by a single frame. In particular, a total number of bits of a respective portion of the binary file is equal to or smaller than the total number of bits storable by means of all available pixels of the region of interest. For example, the number of storable bits may be limited by the number of pixels of the region of interest multiplied by the number of bits storable per pixel according to the color space.

To this end, all portions of the plurality of portions may have the same size. However, other conventions for the division are also possible. For example, all portions except one may have the maximum possible size and the remaining portion may include the remaining part of the binary file.

By dividing the binary file accordingly, the transfer rate for transferring the file content may be adapted to the predefined color space and the size of the region of interest. Thus, an increased flexibility is achieved.
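A minimal sketch of this division, assuming every pixel of the region of interest carries the full color-space payload (the function name and the convention that only the last portion may be shorter are illustrative assumptions, not from the application):

```python
def split_into_portions(data: bytes, roi_pixels: int,
                        bits_per_pixel: int = 24) -> list[bytes]:
    """Split file content into portions that each fit one frame's region of interest."""
    bytes_per_frame = (roi_pixels * bits_per_pixel) // 8
    return [data[i:i + bytes_per_frame]
            for i in range(0, len(data), bytes_per_frame)]
```

Under this convention all portions but the last have the maximum possible size, and concatenating the portions reproduces the original binary file.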

In contrast to the first pixel values, the second pixel values are determined independently of the binary file.

According to several implementations, the processing unit is configured to receive image data via the at least one memory interface and to determine the second pixel values depending on the image data.

The image data may for example be identical or equivalent to the respective second pixel values of the second pixel. Consequently, by generating the frame comprising the second pixel and, for example, the further second pixels in an analogous way, the image data may be transferred by means of the video stream from the processing unit to the receiving unit. In parallel, however, the file content or at least a part of the file content may be transmitted from the processing unit to the receiving unit, encoded by means of the first pixel and, for example, the further first pixels.

Since at least a part of the first pixel values is defined by the bit string of the file content, the region of interest may, when displayed by means of the display unit, comprise a certain amount of visual noise. However, this noise is restricted to the region of interest such that the user experience is still acceptable. Furthermore, the visual noise may for example be camouflaged by selecting specific portions of the frame or pixel grid for the frame, for which a distraction of the user by means of the visual noise is not significant or not relevant.

According to several implementations, the number of channels of the color space is given by a channel number, which is equal to or greater than two, and, for determining the first pixel values, the processing unit is configured to divide the bit string into a number of substrings equal to the channel number. The processing unit is configured to use a first substring of the number of substrings for a respective number of least significant bits of a first channel of the number of channels of the color space. The substrings of the plurality of substrings do not necessarily all have the same length. In particular, the lengths of the substrings may all be smaller than a minimum bit depth of the channels. Depending on the length of the bit string and the channel number, all substrings may have the same length or all substrings but one may have the same length. Considering a bit string length of 8 bits and a channel number of three as a non-limiting example, the three corresponding substrings may have lengths of 3 bits, 3 bits and 2 bits, respectively. Considering a bit string length of 8 bits and a channel number of four as a non-limiting example, the corresponding substrings may all have lengths of 2 bits, et cetera.

In particular, the least significant bits used for the first substring correspond to a number of bits having the least significance, which are required to represent the substring. For example, in case the first channel has a bit depth of 8 bits and the length of the first substring is three, the three least significant bits of the first channel are used to store the first substring, leaving 5 bits for storing the further information. Instead of or in addition to storing the further information, the first substring may also be stored redundantly in the first channel.
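For the 8-bit/three-channel example above, the splitting and the least-significant-bit placement could look as follows (a sketch; the names and the 3/3/2 split with the most significant substring first are our assumptions):

```python
def split_byte_3ch(byte: int) -> tuple[int, int, int]:
    """Divide one 8-bit string into 3-, 3- and 2-bit substrings, most significant first."""
    return (byte >> 5) & 0b111, (byte >> 2) & 0b111, byte & 0b11

def embed_lsb(channel_value: int, substring: int, nbits: int) -> int:
    """Overwrite the nbits least significant bits of a channel value with a substring."""
    mask = (1 << nbits) - 1
    return (channel_value & ~mask) | (substring & mask)
```

The upper bits of each channel remain untouched and thus stay available for static further information.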

According to several implementations, the processing unit is configured to, for determining the first pixel values, use a bit string of a predefined bitmap image for a respective number of most significant bits of the first channel for the first pixel.

The bitmap image may for example be stored in a buffer of the processing unit or the video processing system or in the at least one memory unit. The number of most significant bits corresponds to the bits according to the first channel required to store the bit string of the bitmap image and having the highest significance. The bit string of the bitmap image corresponds to the further information.

Since the first substring is stored in the least significant bits, leaving the most significant bits for the bit string of the bitmap image, the visual impact of the transferred file content is reduced. When the frame is displayed by the display unit, the visual appearance of the region of interest may be dominated by the predefined bitmap image, and the significance of the visual noise caused by the encoded file content is reduced, since the most significant bits are used for the bitmap image. In this way, user experience may be greatly enhanced by reducing negative effects of the visual noise.

According to several implementations, the processing unit is configured to use a second substring of the number of substrings and a parity bit for a respective number of least significant bits of a second channel of the color space.

The value of the parity bit may be chosen by the processing unit to ensure that the parity of the values stored by the second channel is always even or is always odd, depending on the actual implementation. The receiving unit may check the integrity of the transferred frame by determining the parity of the received values stored by the second channel, which should be even in case of agreed even parity for the transfer and odd in case of agreed odd parity.
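Assuming agreed even parity, such a check can be sketched with two small helpers (illustrative functions, not from the application):

```python
def even_parity_bit(value: int) -> int:
    """Parity bit chosen so that the total number of set bits, including itself, is even."""
    return bin(value).count("1") & 1

def passes_even_parity(value_with_parity: int) -> bool:
    """Integrity check on the receiving side: the total number of set bits must be even."""
    return bin(value_with_parity).count("1") % 2 == 0
```

A single flipped bit in the transferred channel value then makes the check fail, allowing the receiving unit to detect the corrupted frame.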

According to several implementations, the processing unit is configured to distribute all substrings of the number of substrings among the channels such that each of the channels comprises at least one bit of the bit string.

In other words, each channel stores one of the substrings, wherein the substrings are stored to the respective least significant bits of the respective channel. Therein, in implementations making use of the parity bit as described, the second substring is stored together with the parity bit in the least significant bits of the second channel.

By distributing the substrings and therefore the bit string of the file content among the channels, the impact of visual noise is reduced.

According to several implementations, the color space comprises two or more channels. For determining the first pixel values, the processing unit is configured to divide a value of the bit string of the file content by a number of the two or more channels of the color space to obtain a quotient value and a remainder value. The processing unit is configured to use the quotient value for a respective number of least significant bits of the first channel of the two or more channels of the color space and to use a sum of the quotient value and the remainder value for a respective number of least significant bits of the second channel.

In other words, the division of the bit string corresponds to a division with remainder, also denoted as Euclidean division. Therefore, the quotient value as well as the remainder value are integer numbers. In particular, the total number of the two or more channels is equal to or smaller than a minimum bit depth of the two or more channels. For example, in case the bit depths of all of the two or more channels are equal, the number of the two or more channels is equal to or smaller than this bit depth. Consequently, the word length of the quotient value is at least one bit less than the minimum bit depth. In other words, at least one bit, namely the most significant bit, of the first channel is not used by the quotient value and, therefore, can be used to encode the further information.

For reconstructing the bit string, the receiving unit needs to know only the quotient value, the sum and the number of the two or more channels. In this way, the amount of data to be transferred may be reduced or redundancy may be implemented.
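A sketch of this encoding and its inversion (the function names are illustrative; the application only fixes the quotient/sum arithmetic):

```python
def encode_quotient_sum(value: int, n_channels: int) -> tuple[int, int]:
    """Euclidean division: channel 1 carries the quotient, channel 2 the quotient plus remainder."""
    quotient, remainder = divmod(value, n_channels)
    return quotient, quotient + remainder

def decode_quotient_sum(quotient: int, channel_sum: int, n_channels: int) -> int:
    """Receiving side: remainder = sum - quotient, value = quotient * n + remainder."""
    return quotient * n_channels + (channel_sum - quotient)
```

For example, the 8-bit value 185 with three channels yields quotient 61 and sum 63, from which the receiving unit recovers 61 * 3 + 2 = 185.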

According to several implementations, for determining the first pixel values, the processing unit is configured to use a further bit string of the bitmap image for a respective number of most significant bits of the second channel.

In particular, respective further bit strings of the bitmap image may be encoded by the various channels of the two or more channels in order to exploit all available channels to represent the bitmap image.

According to several implementations, the processing unit is configured to generate the frame such that the frame comprises a portion of pixels outside of the region of interest, wherein metadata are encoded by respective pixel values of the portion of pixels.

The portion of pixels may for example correspond to a portion, which is not displayed by the display unit or not visible for the user. The portion of pixels may for example be arranged at an edge of the pixel grid of the respective frame.

In this way, an efficient possibility to transfer the metadata in parallel to the file content is achieved. In particular, this has the advantage that it is not necessary to store the metadata or other type of header information within the region of interest. Consequently, the rate of transfer for the file content may be increased.

According to several implementations, the video output interface comprises a serializer coupled to the processing unit.

According to several implementations, the video processing system, in particular the video input interface of the receiving unit, comprises a deserializer connected to the serializer of the video output interface, wherein the deserializer is comprised by or configured to be coupled to the receiving unit.

The connection between the serializer and the deserializer may for example be implemented by means of a coaxial cable or another suitable cable for transferring video data. In other words, the processing unit may be coupled to the receiving unit via the serializer, the deserializer and the coaxial cable. Using the serializer-deserializer architecture may allow for a hardware-based integrity check.

However, other types of video links or video ports, such as HDMI et cetera, may also be used for the video output interface and the video input interface, respectively.

According to several implementations, the video processing system comprises the receiving unit or the video input interface.

According to several implementations, the input interface for coupling the processing unit to the video source comprises a further deserializer.

In this way, one or more cameras of the video processing system or the vehicle may be coupled to the processing unit via the input interface.

According to several implementations, the video processing system comprises a first ECU for a motor vehicle, the first ECU comprising the processing unit.

According to several implementations, the first ECU comprises the video output interface and/or the at least one memory interface and/or the input interface and/or the at least one memory unit.

According to several implementations, the video processing system comprises a second ECU for a motor vehicle, the second ECU comprising the receiving unit.

According to several implementations, the second ECU comprises the video input interface and/or a display interface for coupling the display unit to the receiving unit. According to several implementations, the video processing system comprises the display interface and/or the display unit.

According to the improved concept, also a motor vehicle comprising a video processing system according to the improved concept is provided.

According to the improved concept, also a method for transfer of file content as part of a video stream is provided. According to the method, a processing unit is used to receive the file content from at least one memory unit. The processing unit is used to determine first pixel values for a first pixel according to a predefined color space based on the file content, wherein the first pixel values encode a bit string of the file content. The processing unit is used to determine second pixel values for a second pixel according to the color space independent of the file content and to generate a frame for a video stream, such that the frame comprises the first pixel within a predefined region of interest of the frame and the second pixel outside of the region of interest. The processing unit is used to provide the video stream including the frame to a receiving unit.

According to several implementations of the method, a number of channels of the color space is given by a channel number, which is equal to or greater than two. For determining the first pixel values, the processing unit is used to divide the bit string into a number of substrings equal to the channel number, and to use a first substring of the number of substrings for a respective number of least significant bits of a first channel of the color space.

Further implementations of the method according to the improved concept follow directly from the various implementations of the video processing system according to the improved concept and vice versa. In particular, a video processing system according to the improved concept may be configured to or programmed to carry out a method according to the improved concept or the video processing system carries out such a method.

According to the improved concept, also a computer program comprising instructions is provided. When the instructions or the computer program, respectively, are executed by a computing system, in particular by a video processing system according to the improved concept, for example by the processing unit, the instructions cause the computing system or the video processing system, respectively, to carry out a method according to the improved concept. According to the improved concept, also a computer-readable storage medium is provided, which stores a computer program according to the improved concept.

The computer program as well as the computer-readable storage medium according to the improved concept may be considered as respective computer program products comprising the instructions.

Further features of the invention are apparent from the claims, the figures and the description of figures. The features and feature combinations mentioned above in the description as well as the features and feature combinations mentioned below in the description of figures and/or shown in the figures alone may not only be encompassed by the improved concept in the respectively specified combination, but also in other combinations. Thus, implementations of the improved concept are encompassed and disclosed, which may not explicitly be shown in the figures or explained, but arise from and can be generated by separate feature combinations from the explained implementations. Implementations and feature combinations, which do not have all features of an originally formulated claim, may be encompassed by the improved concept. Moreover, implementations and feature combinations, which extend beyond or deviate from the feature combinations set out in the relations of the claims, may be encompassed by the improved concept.

In the figures,

Fig. 1 shows schematically an exemplary implementation of a motor vehicle according to the improved concept;

Fig. 2 shows schematically an exemplary implementation of a video processing system according to the improved concept;

Fig. 3 shows a schematic representation of a frame displayed on a screen according to an exemplary implementation of a method according to the improved concept;

Fig. 4 shows schematically pixel values of color channels according to a color space;

Fig. 5 shows schematically a sequence flow according to an exemplary implementation of a method according to the improved concept;

Fig. 6 shows a flow diagram of an encryption process according to a further exemplary implementation of a method according to the improved concept; and

Fig. 7 shows a flow diagram of a further encryption process according to a further exemplary implementation of a method according to the improved concept.

In Fig. 1, an exemplary implementation of a motor vehicle 1 according to the improved concept is shown schematically. The motor vehicle 1 comprises an exemplary implementation of a video processing system 2 according to the improved concept.

The video processing system 2 may comprise a first ECU 3, for example, a camera ECU, and a second ECU 4, for example, an infotainment ECU. The second ECU 4 may be coupled to a display unit 5 of the vehicle 1 or of the video processing system 2. The motor vehicle 1 may comprise a vehicle network 6, for example a CAN bus, a CAN-FD bus, a LIN network or a Flexray network, and the ECUs 3, 4 may be connected to each other via the vehicle network 6.

In addition, the ECUs 3, 4 are connected via a video link, which allows a video stream to be sent from the first ECU 3 to the second ECU 4. The second ECU 4 is able to display the video stream accordingly on the display unit 5. The video link can be any suitable interface for video applications in an automotive environment. These include but are not limited to serialization-deserialization links, such as GMSL, GMSL2, GMSL3, FPD-Link, FPD-Link II or FPD-Link III. In addition to the ECUs 3, 4, there may be additional ECUs (not shown) comprised by the vehicle 1, which may also be connected to the vehicle network 6.

The video processing system 2 may also be operated in a file transfer mode, which is explained in more detail with respect to the remaining figures in the following.

Fig. 2 shows schematically a block diagram of an exemplary implementation of the video processing system 2 according to the improved concept, which may for example be used in a motor vehicle 1 as described with respect to Fig. 1. The video processing system 2, for example the first ECU 3, comprises a processing unit 7, which may be implemented as an SoC. The video processing system further comprises an input interface 8 for coupling one or more cameras 9 of the vehicle 1 or of the video processing system 2 to the processing unit 7, wherein the input interface 8 may for example comprise a deserializer 27. Furthermore, the video processing system 2 comprises one or more memory interfaces 12, 14 for coupling one or more respective memory units 13, 15 to the processing unit 7. For example, the memory unit 13 may be implemented as a DDR-RAM unit and the memory unit 15 may be implemented as a flash memory unit. The video processing system 2 also comprises a video output interface 10, which may be implemented as a serializer, coupling the processing unit 7 to a connection element 33. For example, the first ECU 3 may comprise the memory units 13, 15, the memory interfaces 12, 14, the video output interface 10 and/or the input interface 8.

The vehicle 1, for example the video processing system 2, comprises a receiving unit 11, which may also be implemented as an SoC. The vehicle 1 or the video processing system 2 further comprises a video input interface 16, which may for example be implemented as a deserializer, for coupling the receiving unit 11 to the connection element 33, as well as a display interface 5’ for coupling the display unit 5 to the receiving unit 11. Furthermore, the vehicle 1 or the video processing system 2 may comprise further memory units 20, 21 coupled to the receiving unit 11 and/or a mass storage interface 19 for coupling an external mass storage device (not shown) to the receiving unit 11. The receiving unit 11, the video input interface 16, the display interface 5’, the further memory units 20, 21 and/or the mass storage interface 19 may for example be comprised by the second ECU 4. The connection element 33 connects the video input interface 16 to the video output interface 10 and may for example be designed as a coaxial cable, in particular a high-frequency coaxial cable.

In addition, the vehicle 1, for example the first and the second ECUs 3, 4, respectively, may comprise respective network transceivers 17, 18, for example CAN transceivers, for connecting the ECUs 3, 4 to the vehicle network 6.

In a normal operation mode, the processing unit 7 generates a video stream and provides it via the video output interface 10 and the video input interface 16 to the receiving unit 11. For example, the deserializer of the video input interface 16 converts the serialized data into a video interface format that may be interfaced to the receiving unit 11. The vehicle network 6 may for example be used to exchange commands and responses between the ECUs 3, 4 and/or further ECUs of the vehicle 1.

In case a large data file, in particular a binary file, such as a compressed video stored in one of the memory units 13, 15 of the first ECU 3, shall be sent from the first ECU 3 to the second ECU 4, this may in principle be done via the vehicle network 6 as well. However, due to the limited bit rate for data transfer achievable by typical vehicle networks 6, the file transfer would be very slow.

Therefore, the video processing system 2 according to the improved concept may operate in a data transfer mode in order to send the data file via the video link given by the video output interface 10, the video input interface 16 and the connection element 33. In this way, high-speed file transfer may be achieved without increasing hardware costs.

To this end, the processing unit 7 may receive image data as well as file content of a binary file via the memory interfaces 12, 14 from the memory units 13, 15. The processing unit 7 may for example divide the binary file into a plurality of portions including the file content. For generating a frame for a video stream in the file transfer mode, a region of interest 22 is defined within the frame, as indicated schematically in Fig. 3, which shows an example for a screen content of the display unit 5 during the file transfer mode. In order to determine pixel values for a pixel within the region of interest 22 according to a predefined color space, a bit string of the file content is encoded according to pixel values of the color space. In this way, each pixel in the region of interest 22 encodes one or more respective bit strings of the file content. Furthermore, the image data are used by the processing unit 7 to define pixel values for respective pixels outside of the region of interest 22. Then, the processing unit 7 provides the video stream including the frame at the video output interface 10 to the video input interface 16 and the receiving unit 11, respectively.
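The frame generation described above can be sketched as follows. The function name, the row-major pixel layout and the packing of up to three file bytes per RGB888 pixel are illustrative assumptions and are not specified by the application itself:

```python
def build_frame(file_chunk, width, height, roi, background):
    """Build one frame: file content inside the ROI, image data outside.

    The frame is a list of rows of (R, G, B) tuples; roi = (x0, y0, w, h).
    Each ROI pixel carries up to 3 bytes of file content (one per channel),
    remaining ROI pixels are zero-padded; pixels outside the ROI take the
    file-independent background value (standing in for the image data).
    """
    x0, y0, rw, rh = roi
    it = iter(file_chunk)
    frame = []
    for y in range(height):
        row = []
        for x in range(width):
            if x0 <= x < x0 + rw and y0 <= y < y0 + rh:
                # encode a bit string of the file content as pixel values
                r, g, b = next(it, 0), next(it, 0), next(it, 0)
                row.append((r, g, b))
            else:
                row.append(background)  # determined independently of the file
        frame.append(row)
    return frame
```

For example, a 4x4 frame with a 2x2 ROI at (1, 1) places the first twelve file bytes into the four ROI pixels and the background value everywhere else.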

For example, the processing unit 7 may read a certain range of the memory units 13, 15 periodically, according to the frame size and frame rate. For example, a video ring buffer may be used to allow the processing unit 7 to read the memory units 13, 15, while video processors of the processing unit 7, for example a graphics processing unit, GPU, or an application processing unit, APU, may prepare the video frame in the background. For sending the binary file via the video output interface 10, the file content may be copied to the range of the memory units 13, 15 which is to be read by the video ring buffer. Fig. 3 shows schematically an exemplary screen layout for the display unit 5 and a respective frame. During transfer of the file, a progress bar 23 may for example be displayed outside of the region of interest 22 and updated accordingly, for example based on the image data.

In some implementations, one or more ranges of pixels denoted as embedded lines 24, 25 may be arranged outside of the region of interest 22, for example at respective edges of the frame. The embedded lines 24, 25 may be utilized for passing additional information such as metadata from the processing unit 7 or the first ECU 3 to the receiving unit 11 or the second ECU 4, respectively. Depending on design constraints in relation to other video applications, the information may be spread or distributed amongst the embedded lines 24, 25. The metadata may include one or more of the following types of information without being limited to those: a frame counter, a horizontal display size, a vertical display size, a flag indicating an operation mode of the video processing system, a file name, a file size and/or a file content transfer counter et cetera. The metadata may also include information concerning the region of interest 22, in particular a size, location and/or shape of the region of interest 22. In case of a rectangular region of interest 22, the metadata may for example include a horizontal and vertical start location of the region of interest 22 and/or a horizontal and vertical size of the region of interest 22.
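A possible serialization of such metadata into the bytes of an embedded line is sketched below. The application names the fields but does not fix a binary encoding; the little-endian layout, field widths and function names here are assumptions for illustration only:

```python
import struct

# Hypothetical fixed little-endian layout for one embedded line:
# frame counter (u32), horizontal/vertical display size (u16 each),
# mode flag (u8), file size (u32), ROI x0/y0/width/height (u16 each).
META = struct.Struct("<IHHBIHHHH")

def pack_metadata(frame_ctr, h_size, v_size, mode_flag, file_size, roi):
    """Serialize the metadata fields into the leading bytes of an embedded line."""
    x0, y0, w, h = roi
    return META.pack(frame_ctr, h_size, v_size, mode_flag, file_size, x0, y0, w, h)

def unpack_metadata(line_bytes):
    """Recover the metadata on the receiving side; trailing line bytes are padding."""
    f, hs, vs, mode, size, x0, y0, w, h = META.unpack(line_bytes[:META.size])
    return {"frame": f, "display": (hs, vs), "mode": mode,
            "file_size": size, "roi": (x0, y0, w, h)}
```

Because the layout is fixed and compact, the receiving unit can parse the embedded line without touching the region of interest, matching the parallel-transfer advantage described above.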

During the transfer of the binary file, the contents 26 within the region of interest 22 will in general change according to the loaded contents from the binary file. The improved concept allows the display application and file transfer application to run in parallel.

The receiving unit 11 may for example copy the information stored in the region of interest 22 via the mass storage interface 19 to the mass storage device or to one of the further memory units 20, 21. Depending on the processing capabilities of the second ECU 4 or the receiving unit 11, respectively, the screen contents may be modified in order to satisfy end user demands. For example, the region of interest 22 may be copied and the content of the region of interest may be replaced by a bitmap stored in the further memory units 20, 21 to achieve a more convenient visual appearance. In particular, random mosaicking display effects within the region of interest 22 may be reduced in this way. Alternatively, the whole content of the screen may be constructed by the receiving unit 11 to replace the display content received from the processing unit 7. The file transfer flow may be controlled by various methods. For example, the video processing system 2 may implement a frame rate-based unidirectional flow control, a software-based bidirectional flow control or a hardware-based bidirectional flow control. Depending on the actual capabilities of the ECUs 3, 4, one of the flow control methods may be selected accordingly.

According to a frame rate-based unidirectional flow control, the ECUs 3, 4 agree on a frame rate that both ECUs 3, 4 are able to handle. The data transfer rate for transferring file content is given by the product of the data transfer frame rate, the number of pixels used within the region of interest 22 and the number of channels used for distributing the file content. The data transfer frame rate is therefore in general different from the video frame rate. For example, a video frame rate of 30 frames/s and a data transfer frame rate of 10 frames/s may be implemented such that the video interface hardware of the first ECU 3 will read the video ring buffer memory at 30 frames/s. The first ECU 3 may modify the contents of the region of interest 22 and the information in the embedded lines 24, 25 accordingly at 10 frames/s in this example. Consequently, the second ECU 4 will receive the video stream at 30 frames/s, while the file contents in the region of interest 22 are updated at 10 frames/s only.
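The stated rate product can be written as a small helper. The assumption that each used channel carries a configurable number of payload bits per pixel (8 in the plain case, fewer in the camouflage mode described later) is mine, not the application's:

```python
def file_transfer_rate(data_frame_rate, roi_pixels, channels_used, bits_per_channel=8):
    """Payload rate in bytes/s: data-transfer frame rate x ROI pixels x channels,
    with bits_per_channel payload bits carried in each channel of each pixel."""
    return data_frame_rate * roi_pixels * channels_used * bits_per_channel // 8
```

With the example values of a 10 frames/s data transfer rate, a hypothetical 640x400 region of interest and all three RGB888 channels carrying full bytes, this yields 7,680,000 bytes/s of file content, independent of the 30 frames/s video rate.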

According to software-based bidirectional flow control, either the vehicle network 6 or a backchannel communication available via the serializer of the video output interface 10 and the deserializer of the video input interface 16 may be used. When the first ECU 3 is ready to send the file content, it may send a message to the receiving unit 11. When the receiving unit 11 cannot process the received contents due to resource limitations, it may send a message to the processing unit 7 to suspend the transfer.

According to hardware-based bidirectional flow control, general purpose input-output pins, GPIO pins, available at the video output interface 10 and the video input interface 16 may be used. Similarly, hardware pins that are used for the flow control on the video input interface 16 may be mirrored from the video input interface 16 to the video output interface 10. This allows the hardware pin signal from the first ECU 3 to be transmitted to the second ECU 4 and vice versa, without additionally utilizing the resources of the vehicle network 6.

The integrity of the file content transferred to the receiving unit 11 may be checked by a built-in error checking functionality of the serializer and deserializer. The built-in error checking mechanisms may include, but are not limited to, a CRC checksum and an ECC-based error detection and correction method. The implementation then does not require a software routine to evaluate the checksum of the file or of sections of the received file.
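On GMSL or FPD-Link devices this check runs in the serializer/deserializer hardware; the software sketch below merely illustrates the principle of comparing a CRC-32 checksum over a received chunk, using Python's standard `zlib.crc32`:

```python
import zlib

def verify_chunk(payload: bytes, expected_crc: int) -> bool:
    # The serdes hardware performs an equivalent per-packet check; this
    # software version exists only to illustrate the integrity test.
    return (zlib.crc32(payload) & 0xFFFFFFFF) == expected_crc
```

A corrupted chunk yields a different checksum and fails the comparison, which is exactly the condition the hardware flags without any software routine.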

Fig. 5 shows an exemplary software design or sequence flow, respectively, for an exemplary implementation of a method according to the improved concept. Modules 28, 29, 30, 31, 32 are implemented on the processing unit 7. In particular, a renderer module 29 may be provided, which is responsible for generating the video frames. It may generate the video frame structure and also copy the file content to the video frame. It may be implemented on a specific hardware module or on a common GPU of the processing unit 7.

A video output server module 30 may be responsible for distributing and collecting the frames from and to the renderer module 29, respectively. A video controller output module 31 may be responsible for configuring, controlling and synchronizing with hardware display subsystems on one side and the video output service on the other side.

A file handler module 28 may be responsible for providing the file content that is to be stored in each of the video frames. Furthermore, the file handler module 28 may be responsible for managing and generating the metadata or header information associated with each block of file content. A hardware peripheral driver layer 32 may provide the drivers to communicate with the hardware peripherals, mainly the display subsystem and the serializer.

In step A, the video controller output module 31 may register an output video vertical synchronization signal, Vsync. In step B, the renderer module 29 registers as a client for the video output server module 30. In steps C and D, the video output server module 30 is notified of the Vsync. In step E, the frame buffer is made available to the renderer module 29. In step F, the renderer module 29 may generate the video frames that need to be transmitted. In steps K and G, the renderer module 29 gets the file content that has to be stored into the current frame along with the respective metadata and fills the region of interest 22. In steps H, I and J, the complete video frame is then propagated across the video output server module 30 and the video controller output module 31 according to the agreed frame rate depending on the flow control scheme.

As described above, while the file transfer is in progress, the area within the region of interest 22 is updated according to the file contents. Therefore, rapid changes of color and brightness may visually impact the region of interest 22. In order to reduce this visual impact, a visual camouflage mode may be enabled. In this mode, each pixel within the region of interest 22 is updated with minimum impact on the change of color and brightness due to the contents of the file.

As an example implementation, a predefined bitmap may be previously prepared and stored in the first ECU 3, in particular in a memory buffer or one of the memory units 13, 15. As a non-limiting example, an RGB888 color space is considered with 8 bits per color or, in other words, 24 bits per pixel. The bitmap is copied to the region of interest 22 for being displayed. In Fig. 4, the 24 bit organization of a pixel for color channels C1, C1′ and C2 is shown, wherein C1, C1′ and C2 may for example correspond to red, green and blue, respectively. In case an 8 bit string of the file content shall be transferred, it is split up into three substrings S1, S1′, S2, one for each color channel C1, C1′, C2. In the example of RGB888, the substrings S1, S1′ may for example comprise three bits each, while the substring S2 comprises the remaining 2 bits of the bit string of the file content.

The substrings S1 and S1′ are then stored into the three least significant bits of the channels C1 and C1′, respectively. The substring S2 may be stored to the two least significant bits of the channel C2. The remaining 5 bits of the channels C1, C1′ and the remaining 6 bits of the channel C2 store the respective content of the bitmap image. Alternatively, a parity bit P may be stored together with the substring S2 to the three least significant bits of the channel C2. In this case, each of the channels C1, C1′, C2 stores respective 5 bits of the bitmap image.
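The 3-3-2 embedding without the parity-bit variant can be sketched as below. Which bits of the data byte map to S1, S1′ and S2 is not fixed by the application; the high-to-low assignment here is an assumption:

```python
def camouflage_pixel(bitmap_pixel, data_byte):
    """Embed one file byte into the least significant bits of an RGB888
    bitmap pixel: 3 bits each into C1 and C1', 2 bits into C2."""
    r, g, b = bitmap_pixel
    s1 = (data_byte >> 5) & 0b111   # S1: top 3 bits (assumed mapping)
    s1p = (data_byte >> 2) & 0b111  # S1': next 3 bits
    s2 = data_byte & 0b11           # S2: last 2 bits
    return ((r & ~0b111) | s1, (g & ~0b111) | s1p, (b & ~0b11) | s2)

def extract_byte(pixel):
    """Receiving side: reassemble the file byte from the channel LSBs."""
    r, g, b = pixel
    return ((r & 0b111) << 5) | ((g & 0b111) << 2) | (b & 0b11)
```

Since only the 2-3 least significant bits per channel change, the displayed bitmap pixel shifts by at most 7 of 256 levels per channel, which is the intended camouflage effect.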

In this way, the visual impact observable by the end user is reduced, at the expense that each pixel can transfer only one byte of file content instead of three bytes in the exemplary case of the RGB888 color space. However, the screen layout can be designed such that the region of interest 22 is enlarged, theoretically up to the size of the screen resolution, to compensate for the reduced number of bits per pixel for the file transfer.

Due to concerns regarding privacy and security of the file being transferred, an encryption process may also be comprised by a method according to the improved concept to prevent the file content from being available to unauthorized users. The file content to be transferred may for example be encrypted on the side of the first ECU 3 and decrypted on the side of the second ECU 4. The profiles used for encryption and decryption are agreed on by the ECUs 3, 4 before the file transfer begins, through a protocol over the video link as described in the following. Exemplary flow charts to illustrate the flow of the encryption protocol are shown in Figs. 6 and 7. The second ECU 4 may check the transferred metadata for a file transfer mode flag. If encryption is required, the second ECU 4 may request encryption scheme details from the first ECU 3.

In particular, different approaches for secret key selection may be employed. For example, a pre-shared key, PSK, may be used. In this case, the ECUs 3, 4 agree to utilize a secret key known by both beforehand, for example made available during production of the vehicle 1 or stored by means of a secure storage during the production of the ECUs 3, 4. In case of a key exchange, at each initialization of a transfer session, the ECUs 3, 4 will agree on a common secret key by means of a key agreement protocol, for example an elliptic curve Diffie-Hellman, ECDH, protocol.

For the encryption algorithm, the ECUs 3, 4 agree on the encryption algorithm to be employed, for example an advanced encryption standard, AES, algorithm, in particular in Galois/counter mode, GCM. The encryption mode intrinsics, for example an initialization vector, are generated by the first ECU 3 by means of a random number generator with enough entropy, for example according to TRNG AIS-31. The encryption/decryption process may then be performed by means of hardware acceleration or in software.

In the flow chart of Fig. 6, the second ECU 4 is the key agreement protocol initiator. The protocol starts in step S61. In step S62, the second ECU 4 requests a specific file to be transferred by the first ECU 3. In step S63, the first ECU 3 verifies the request and determines whether the file is encrypted. If the file requested by the second ECU 4 is encrypted, the first ECU 3 sets an encryption mode in step S64 and replies to the request with an encryption flag and an encryption scheme to be utilized.

In step S65, the second ECU 4 receives the encryption flag and the scheme to be utilized. In step S66, the second ECU 4 loads the pre-shared key to be utilized at the cryptographic engines. In step S67, the second ECU 4 requests the initiation of the encrypted file transfer and in step S68, the first ECU 3 starts the encrypted file transfer.

As depicted in the flow chart of Fig. 7, in step S71 the file, which may be pre-stored or generated on-the-fly, is provided by the first ECU 3. In step S72, the file may be split into smaller pieces and loaded to a high-speed volatile memory buffer to be prepared for the encryption. In step S73, the encryption engine, which has access to the secret keys required for the encryption process, is utilized to encrypt the data from the buffer. In step S74, the encrypted data is allocated to a first further buffer to be prepared for transmission to the second ECU 4. In step S75, the data from step S74 is transmitted to the second ECU 4. In step S76, the transmitted data is received by the second ECU 4. In step S77, a second further buffer is used to load the received encrypted data to be prepared for decryption. In step S78, a decryption engine, which has access to the secret keys required for the decryption process, is utilized to decrypt the data.

In step S79, the decrypted data are allocated in a third further buffer to be prepared for reconstruction. In step S710, the file split in step S72 is reconstructed at the second ECU 4 from the multiple pieces dynamically allocated in step S79. In step S711, the file is forwarded to the storage system of the second ECU 4 or to another application.
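The split, encrypt, transmit, decrypt and reassemble flow of steps S71 to S710 can be sketched end to end as follows. A toy XOR keystream stands in for the agreed AES/GCM engine purely to keep the sketch self-contained; it is NOT secure and not what the application prescribes:

```python
def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for the hardware or software crypto engine -- NOT secure,
    # used only to make the buffer flow runnable end to end.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def sender_prepare(file_bytes: bytes, chunk_size: int, key: bytes):
    # Steps S72-S74: split the file into pieces and encrypt each piece
    # into a buffer prepared for transmission.
    chunks = [file_bytes[i:i + chunk_size]
              for i in range(0, len(file_bytes), chunk_size)]
    return [xor_stream(c, key) for c in chunks]

def receiver_reassemble(encrypted_chunks, key: bytes) -> bytes:
    # Steps S77-S710: decrypt each received piece and reconstruct the file.
    return b"".join(xor_stream(c, key) for c in encrypted_chunks)
```

Replacing `xor_stream` with a real authenticated cipher, keyed via the pre-shared key or the ECDH agreement described above, preserves the same buffer structure.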

As described, according to the improved concept, a possibility for high-speed data transfer via a video link is provided, which allows for a parallel file transfer and video display. In particular, the improved concept is characterized by a very low computational complexity and low computational costs. Furthermore, additional hardware costs may be avoided.