Title:
VIDEO OUTPUT CHECKER
Document Type and Number:
WIPO Patent Application WO/2014/139773
Kind Code:
A1
Abstract:
A video output checker (23) is described. The video output checker is configured to receive incoming video data (20) for rendering an image on a display (17). The video data comprises pixel data which comprise, for each pixel (42), a set of colour component values (571, 572, 573) for a given colour model. The video output checker can be configured to compare each colour component for a pixel in a selected area (24) of the image with a corresponding test range of values (511L, 511U, 512L, 512U, 513L, 513U) and, if a component value falls outside the range, to measure a deviation. The video output checker can be configured to sum deviations for the selected area to provide an error value (58) for the selected area, and to compare the error value against a threshold number (59). Additionally or alternatively, the video output checker can be configured to determine whether each colour component for a pixel in a selected area (24) of the image falls within a corresponding test range of values (511L, 511U, 512L, 512U, 513L, 513U) so as to determine whether the pixel is valid or invalid and to count the number (58) of valid or invalid pixels in the selected area. The video output checker is configured to compare the number of valid or invalid pixels against a threshold number (59). (Figure 6)

Inventors:
FIEDLER PETER (DE)
GRUNDMANN SVEN (DE)
HENNIG TOBYAS (DE)
Application Number:
PCT/EP2014/053369
Publication Date:
September 18, 2014
Filing Date:
February 20, 2014
Assignee:
RENESAS ELECTRONICS EUROP LTD (GB)
International Classes:
G09G5/14; G06F11/10; G09G5/02; G09G5/06
Foreign References:
EP2493189A12012-08-29
JPH04328420A1992-11-17
US20120036418A12012-02-09
US20120050612A12012-03-01
Attorney, Agent or Firm:
PIOTROWICZ, Pawel et al. (Byron HouseCambridge Business Park,Cowley Road, Cambridge Cambridgeshire CB4 0WZ, GB)
Claims:
Claims

1. A video output checker (23) which is configured to receive incoming video data (20) for rendering an image on a display (17), the video data comprising pixel data which comprise, for each pixel (42), a set of colour component values (571, 572, 573) for a given colour model, the video output checker configured to compare each colour component for a pixel in a selected area (24) of the image with a corresponding test range of values (511L, 511U, 512L, 512U, 513L, 513U) and, if a component value falls outside the range, to measure a deviation, the video output checker further configured to sum deviations for the selected area to provide an error value for the selected area, and to compare the error value against a threshold number (59).

2. A video output checker (23) which is configured to receive incoming video data (20) for rendering an image on a display (17), the video data comprising pixel data which comprise, for each pixel (42), a set of colour component values (571, 572, 573) for a given colour model, the video output checker configured to determine whether each colour component for a pixel in a selected area (24) of the image falls within a corresponding test range of values (511L, 511U, 512L, 512U, 513L, 513U) so as to determine whether the pixel is valid or invalid and to count the number (58) of valid or invalid pixels in the selected area, and to compare the number of valid or invalid pixels against a threshold number (59).

3. A video output checker (23) according to claim 1 or 2 which is configured to set each range of values to be full range for preselected pixels such that the preselected pixels are determined to be valid.

4. A video output checker (23) according to any preceding claim which is configured to store test data (46) comprising test pixel data (45) which comprise, for each pixel, an index identifying one of a predetermined number of colours or whether colour is unimportant, and to convert the index into a set of ranges of values (511L, 511U, 512L, 512U, 513L, 513U).

5. A video output checker (23) according to any preceding claim, wherein the set of colour component values (571, 572, 573) comprises a set of three colour component values.

6. A video output checker (23) according to any preceding claim, wherein the given colour model is the RGB colour model and the colour components comprise red, green and blue.

7. A video output checker (23) according to any preceding claim, wherein the test values (511L, 511U, 512L, 512U, 513L, 513U) are programmable.

8. A video output checker (23) according to any preceding claim, wherein the threshold number is programmable.

9. A video output checker (23) according to any preceding claim comprising an interface (41) to receive instructions for setting the test range values (511L, 511U, 512L, 512U, 513L, 513U) and/or the threshold.

10. A video output checker (23) according to any preceding claim, wherein the selected area (24) contains a telltale.

11. A video output checker (23) according to any preceding claim, which is configured, after checking the selected area (24), to generate a signal (65) at the end of a frame and to determine whether the signal is generated within a given time window.

12. A video output checker (23) according to any one of claims 1 to 11, which is configured to check a first selected area (24) in a first frame and to check a second selected area (24) in a second frame.

13. A video output checker (23) according to any one of claims 1 to 11, which is configured to check a first selected area (24) on a first display and to check a second selected area (24) on a second display.

14. A system comprising:

at least two video output checkers (23) according to any one of claims 1 to 13, each video output checker configured to check a respective selected area (24) of the image.

15. A system comprising: a display controller (16) for generating video data (20) for rendering on a display (17); and

at least one video output checker (23) according to any one of claims 1 to 13 configured to receive and check the video data in the (respective) area(s) (24).


16. An integrated circuit comprising at least one video output checker (23) according to any one of claims 1 to 13 or a system according to claim 14 or 15.

17. A method of checking a video output, the method comprising:

receiving incoming video data (20) for rendering an image on a display (17), the video data comprising pixel data which comprise, for each pixel (42), a set of colour component values (571, 572, 573) for a given colour model;

determining whether each colour component for a pixel in a selected area (24) of the image falls within a corresponding test range of values (511L, 511U, 512L, 512U, 513L, 513U) so as to determine whether the pixel is valid or invalid and to count the number (58) of valid or invalid pixels in the selected area; and

comparing the number of valid or invalid pixels against a threshold number (59).

18. A method of checking a video output, the method comprising:

receiving incoming video data (20) for rendering an image on a display (17), the video data comprising pixel data which comprise, for each pixel (42), a set of colour component values (571, 572, 573) for a given colour model;

comparing each colour component for a pixel in a selected area (24) of the image with a corresponding test range of values (511L, 511U, 512L, 512U, 513L, 513U) and, if a component value falls outside the range, measuring a deviation;

summing deviations for the selected area to provide an error value for the selected area; and

comparing the error value against a threshold number (59).

19. A method according to claim 17 or 18 which is implemented in hardware.

Description:
Video output checker

The present invention relates to a video output checker for particular, but not exclusive, use with a display for a vehicle instrumentation cluster.

Warning messages in an instrumentation cluster are often displayed as warning lights (usually referred to as "telltales"). Examples of warning messages include an airbag control telltale indicating that the airbag is not functioning or an anti-lock braking system (ABS) telltale indicating that ABS is not functioning.

It is usually necessary to ensure that a warning message, if displayed, can in fact be seen by the driver. Proper display of warning messages in an instrumentation cluster can be checked. For example, if light emitting diodes (LEDs) are used to display a warning message, then an LED current monitor or even a photo detector can be used to confirm that a warning message is in fact being displayed.

With the introduction of liquid crystal display thin film transistor (LCD-TFT) screens in instrumentation clusters, warning messages can also be displayed on LCD-TFT screens as so-called "virtual telltales". Proper display of virtual telltales is usually also required. It may not be necessary to monitor proper operation of the whole display, but just of an area where virtual telltales are displayed.

One solution is to employ external cameras or image sensors to monitor the relevant area of the LCD-TFT screen. However, such dedicated monitoring requires additional resources for capturing and data processing.

Another solution is to employ hardware which reads back display content sent to an area of the TFT screen as close as possible to the input pins of the screen, calculates a checksum for the content over this area and compares the checksum with a software-generated expected checksum. If the calculated checksum does not match the expected checksum, a message is passed to application software so that the application software can take appropriate action, such as issuing an audio warning. Typical checksum algorithms, such as CRC32, generate a different checksum value even if a single bit in a data stream is different. Therefore, such checksum algorithms may not be particularly suited to some forms of virtual telltale.
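
To illustrate this sensitivity (an illustrative sketch only, not part of the application; the buffer contents and sizes are arbitrary), the following C snippet computes a CRC-32 (the widely used reflected polynomial 0xEDB88320) over a small block of pixel data and again after a single bit has been flipped, as dithering or noise might do; the two checksums differ, so a bit-exact comparison would report an error even though the image is visually unchanged.

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Bitwise CRC-32 using the reflected polynomial 0xEDB88320. */
static uint32_t crc32(const uint8_t *data, size_t len)
{
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            crc = (crc & 1u) ? (crc >> 1) ^ 0xEDB88320u : (crc >> 1);
    }
    return ~crc;
}

int main(void)
{
    uint8_t frame[16];
    memset(frame, 0x3C, sizeof frame);          /* nominal pixel data              */
    uint32_t expected = crc32(frame, sizeof frame);

    frame[7] ^= 0x01;                           /* one bit altered by dither/noise */
    uint32_t actual = crc32(frame, sizeof frame);

    printf("expected %08lX, actual %08lX -> %s\n",
           (unsigned long)expected, (unsigned long)actual,
           expected == actual ? "match" : "mismatch (error reported)");
    return 0;
}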

Examples of situations when checksum algorithms may not be adequate will now be described with reference to Figures 1 to 3.

Referring to Figure 1, an image 1 comprising a single colour 2 and a background colour 3 may be modified by using patterning to generate a modified image 4 comprising additional virtual colour(s) 51, 52, 53.

Dithering or unintentionally-added noise can mean that a checksum algorithm cannot be used without continually generating error messages.

Referring to Figure 2, a telltale 6 (which in this case takes the form of a triangle warning symbol) can be laid over live video 7 (which in this case takes the form of an image from a camera showing a car on a road).

It may be difficult to check whether the telltale 6 is properly displayed using a checksum algorithm because the background image 7 is unpredictable.

Referring to Figure 3, a speedometer 8 is shown. Part of a telltale 9 is hidden, for example, by a virtual instrument needle 10.

Again, the checksum is likely to generate an error even though the displayed content is correct.

Examples of video output checkers which employ CRC algorithms are described in US 2012/0036418 A1 and US 2012/0050612 A1.

The present invention seeks to provide an improved video output checker for use with, for example, an LCD-TFT screen and other types of displays.

According to a first aspect of the present invention there is provided a video output checker. The video output checker is configured to receive incoming video data for rendering an image on a display, the video data comprising pixel data which comprise, for each pixel, a set of colour component values for a given colour model (such as RGB).

The video output checker can be configured to compare each colour component for a pixel in a selected area of the image with a corresponding test range of values and, if a component value falls outside the range, to measure a deviation. The video output checker can be configured to sum deviations for the selected area to provide an error value for the selected area and to compare the error value against a threshold number.

The threshold number may be set to a value of at least 1% and no more than 10% of a maximum possible error for the selected area (i.e. the maximum deviation per component multiplied by the number of components multiplied by the number of pixels in the selected area). For example, the threshold number may be set to 5% of a maximum possible error for the selected area.
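
As an illustrative calculation (assuming the RGB444 format and the 100 by 100 pixel test area used elsewhere in this description, neither of which is mandated here): the maximum deviation per component is 0xF = 15, so the maximum possible error for the selected area is 15 × 3 × (100 × 100) = 450 000, and a 5% threshold would be 22 500.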

Additionally or alternatively, the video output checker can be configured to determine whether each colour component for a pixel in a selected area of the image falls within a corresponding test range of values so as to determine whether the pixel is valid or invalid and to count the number of valid or invalid pixels in the selected area. The video output checker can be configured to compare the number of valid or invalid pixels against a threshold number.

The threshold number may be at least 50% of the number of pixels in the selected area. For example, the threshold value may be 90% or more of the number of pixels in the selected area. Thus, in either case, the video output checker can tolerate a greater variety in colour when checking the content of an image in the selected area, which may be caused by, for example, dithering, laying content over video or partial obscuration of the content by other content. The video output checker may be configured to set each range of values to be full range for preselected pixels, thereby forcing the preselected pixels to be deemed to fall within range or be valid. Thus, the video output checker can more easily ignore certain parts of the image and can thereby tolerate a greater variety in the shape and/or colour of content of interest.

The video output checker may be configured to store test data comprising test pixel data which comprise, for each pixel, an index identifying one of a predetermined number of colours or whether colour is unimportant (i.e. indicating "don't care"), and to convert the index into a set of ranges of values.

This can be used to help reduce the amount of memory needed to store test data. For example, an index may be used which may take four values. This can be used to indicate one of three colours or that the colour of a pixel is unimportant (i.e. "don't care").
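
As an illustrative calculation (assuming a 100 by 100 pixel test area and an RGB444 format, as used elsewhere in this description): storing full 12-bit reference pixels would require 100 × 100 × 12 bits = 15 000 bytes, whereas storing a 2-bit index per pixel requires only 100 × 100 × 2 bits = 2 500 bytes, plus a small look-up table of range values.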

The set of colour component values may comprise a set of three colour component values. The given colour model may be the RGB colour model and so the colour components may comprise red, green and blue. The component values may take a value lying between 0x0 and 0xF. The component values may take a value lying between 0x00 and 0xFF.

The test values may be programmable and/or the threshold number may be programmable. Thus, the test values and/or threshold number can change and so the video output checker can adapt according to different conditions.

The video output checker may comprise a first unit (or "data comparator") configured to determine whether each colour component for a pixel in a selected area of the image falls within a corresponding test range of values so as to determine whether the pixel is valid or invalid and to count the number of valid or invalid pixels in the selected area.

The video output checker may comprise a second unit (or "discriminator") configured to compare the number of valid or invalid pixels against a threshold number. The data comparator and discriminator may be separate units or may be integrated into a single unit.

The video output checker may comprise a third unit (or "data expander") configured to receive test data comprising test pixel data which comprise, for each pixel, an index identifying one of a predetermined number of colours or whether colour is unimportant, and to convert the index into a set of ranges of values.

The video output checker may comprise an interface to receive instructions for setting the test range values and/or the threshold.

The video output checker may be configured, after checking the selected area, to generate a signal at the end of a frame indicating that the check has been completed and to determine whether the signal is generated within a given time window.

This can be used to check whether the incoming video output and/or checking process is/are operating correctly.

The video output checker may be configured to check a first selected area in a first frame and to check a second selected area in a second frame.

This can be used to help reduce the amount of hardware needed to check multiple telltales. The video output checker may be configured to check a first selected area on a first display and to check a second selected area on a second display.

The video output checker may be configured to receive and operate synchronously with a video output pixel clock.

The video output checker is preferably implemented as hardware logic. The selected area may comprise an image or partial image of a telltale.

The image may be a frame. The frame may be a frame in a sequence of frames. The image may have an RGB444 image format. The image may be a static image.

According to a third aspect of the present invention there is provided a system comprising at least two video output checkers, each video output checker configured to check the video data in respective selected areas of the image.

According to a fourth aspect of the present invention there is provided a system comprising a display controller for generating video data for rendering on a display and at least one video output checker configured to receive and check the video data in the (respective) selected area(s) of the image.

According to a fifth aspect of the present invention there is provided an integrated circuit comprising one video output checker, at least one video output checker or at least two video output checkers.

According to a sixth aspect of the present invention there is provided an integrated circuit comprising a display controller for generating video data for rendering on a display and at least one video output checker configured to receive and check the video data in the (respective) area(s).

According to a seventh aspect of the present invention there is provided a method of checking a video output. The method comprises receiving incoming video data for rendering an image on a display, the video data comprising pixel data which comprise, for each pixel, a set of colour component values for a given colour model, determining whether each colour component for a pixel in a selected area of the image falls within a corresponding test range of values so as to determine whether the pixel is valid or invalid and to count the number of valid or invalid pixels in the selected area, and comparing the number of valid or invalid pixels against a threshold number.

According to an eighth aspect of the present invention there is provided a method of checking a video output. The method comprises receiving incoming video data for rendering an image on a display, the video data comprising pixel data which comprise, for each pixel, a set of colour component values for a given colour model, comparing each colour component for a pixel in a selected area of the image with a corresponding test range of values and, if a component value falls outside the range, measuring a deviation, summing deviations for the selected area to provide an error value for the selected area and comparing the error value against a threshold number. The method may be implemented in hardware.

Certain embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:

Figure 1 illustrates dithering to generate virtual colours;

Figure 2 illustrates a telltale overlaying live video;

Figure 3 illustrates a partially-hidden telltale;

Figure 4 illustrates some examples of telltales;

Figure 5 is a schematic block diagram of a system for generating video data for an instrumentation display;

Figure 5a illustrates an example of an expected image;

Figure 6 is a schematic block diagram of a video output checker;

Figure 7 illustrates monitored screen content;

Figure 8 illustrates test data including a set of pixel test data;

Figure 9 illustrates test data comprising pixel data taking the form of indices representing expected pixel colours;

Figure 10 shows a list of indices and corresponding expected pixel colour;

Figure 11 is a schematic diagram of a look-up table for converting an index into a set of ranges of acceptable colour component values;

Figure 12 is a schematic block diagram of a data expander;

Figure 13 schematically shows comparison of each colour component value for a pixel of incoming video data with a corresponding range of colour component values;

Figure 14 is a process flow diagram of a method of comparing each colour component value for a pixel of incoming video data with a corresponding range of colour component values to identify valid pixels and of counting valid pixels;

Figure 15 is a process flow diagram of a method of determining whether an image has been correctly displayed based on a counted number of valid pixels;

Figure 16 illustrates a set of colour component values showing values which fall out of range and corresponding error values;

Figure 17 is a process flow diagram of a method of comparing each colour component value for a pixel of incoming video data with a corresponding range of colour component values to determine an error value for a value falling outside the range and to accumulate error values to obtain an error value for the image;

Figure 18 is a process flow diagram of a method of determining whether an image has been correctly displayed based on an error value;

Figure 19 illustrates an active area of a screen and displayable telltales;

Figure 20 illustrates time multiplexing whereby different test areas are monitored in different frames;

Figure 21 is a schematic block diagram of a window watchdog timer; and

Figure 22 is a timing chart of a process for confirming proper operation of a video output checker.

Referring to Figure 4, some examples of telltales 111, 112, ..., 1112 which may be displayed in an automotive instrumentation cluster are shown. In Figure 4, line and filled regions 12 may be coloured, for example green, red, yellow or blue. The unfilled and background regions 13 may be transparent or black.

For example, a first telltale 111 which indicates that cruise control has been set may be green. A second telltale 112 which indicates a battery problem may be red. A third telltale 113 which indicates a problem with anti-lock brakes may be yellow. An eleventh telltale 1111 which indicates full (or "high") beam may be blue.

Referring to Figure 5, an automotive instrumentation system 14 is shown. The system 14 includes a main controller 15, a display controller 16 and a display 17. The main controller 15 may generate or receive signals relating to automotive conditions and may forward instructions 18 and/or data 19 to the display controller 16. The display controller 16 generates and outputs video data 20 to the display 17 for rendering (i.e. displaying). The video data 20 takes the form of a video data stream comprising a sequence of frames. The video data 20 may include vertical sync, horizontal sync, data and clock signals. The video data 20 may conform to video graphics adapter (VGA) standard, although other formats can be used. In this case, an RGB444 image format is used. However, other image formats can be used, such as RGB888 or Y'UV444.

The display 17 takes the form of a liquid crystal display thin-film transistor (LCD-TFT) screen having an active area 21. However, the display can take other forms, such as an organic light-emitting diode (OLED) screen, or a direct projector. From time to time, one or more telltales 22 may be rendered on the display 17. The display controller 16 generates video data 20 which includes telltale image data. Herein, telltales 22 may also be referred to as "warning symbols" or simply "symbols". A telltale 22 may be the same, similar or dissimilar to the telltales 111, 112, ..., 1112 shown in Figure 4. A video output checker 23 (herein also referred to as "image checking device") is used to check whether the telltale(s) 22 is (are) properly displayed in an area 24 being monitored (herein also referred to as a "test area") by comparing the image displayed in the test area 24 with a corresponding expected image 25 (Figure 5a). As will be explained later, a single video output checker 23 can be used to monitor two or more test areas 24 by inspecting different test areas 24 in different frames. Two or more video output checkers 23 can operate in parallel to monitor two or more test areas 24. Two or more test areas 24 may overlap or abut. More than one display 17 may be used and a single video output checker 23 can be used to monitor test areas 24 on different displays.

Referring to Figure 6, the video output checker 23 is shown in more detail.

The video output checker 23 receives video data 20 and video synchronisation data 30 from the display controller 16 (Figure 5) and outputs status 31 and messages 32 to the main controller 15 (Figure 5). The video synchronisation data 30 typically includes pixel clock (not shown), horizontal synchronisation (HSync) and vertical synchronisation (VSync) signals.

The video output checker 23 includes an address generator 35, test data memory 36, data expander 37, a data comparator 39, a decision unit 40 (herein also referred to as "a discriminator") and a control interface 41.

Referring also to Figure 7, the address generator 35 identifies the area 24 of the active display content 21 to be monitored. The test area 24 is rectangular and comprises m by n pixels 42. Typically, the test area 24 comprises 100 by 100 pixels 42.

Referring also to Figure 8, a set of test pixels 43 corresponding to the test area 24 is defined for an expected image 25 (Figure 5a). The address generator unit 35 generates (absolute) addresses 44 for pixel test data 45 forming a set of test data 46 for the test pixels 43, for checking the test area 24 (Figure 5) to determine whether a telltale 22 (Figure 5) is displayed correctly. Herein, the set of test data 46 is also referred to as "symbol test data" or "telltale test data".

The address generator 35 may store data 47 specifying the position and size of the test area 24. This data 47 can be programmed through the control interface unit 41.

Pixel test data 45 comprises information about the expected colour of a pixel 42 in the test area 24. The pixel test data 45 can be stored as indexed colours. N different colours can be stored for a set of test data, namely N−1 values each indicating a different colour and one value indicating that the colour of a pixel is not important (herein referred to as "don't care"). As will be explained in more detail hereinafter, if the pixel test data 45 for a pixel is set to "don't care", then that test pixel can be ignored for the purposes of making a decision. This can make it easier to check for non-rectangular (or other irregularly-shaped) symbols.

Referring in particular to Figure 9, a table 48 is shown which illustrates the relation between pixel test data in the form of an index and the corresponding colour. Here N = 4 and so two bits of data can be used to identify first, second and third colours (such as green, red and yellow) and "don't care". However, N may take other power-of-two values (such as 2 or 8). N may also take values which are not powers of two. Using only two bits to identify colour can help to save data memory. However, if telltales 22 (Figure 5) having more than three colours are required (for example, green, red, yellow and blue), then an additional video output checker 23 can be provided so as to handle the additionally-coloured telltales 22 (Figure 5). Two video output checkers 23 can be used to check overlapping test areas 24 (Figure 5). One video output checker 23 can be used to check overlapping test areas 24 (Figure 5), for example, by checking one test area 24 in one frame and checking another test area in a different frame.

Referring again to Figure 6, test data 45 is read out from test data memory 36 and is supplied to the data expander 37.

The data expander 37 converts the test data 46 into expanded test data 50. Referring also to Figure 10, the data expander 37 maps index values 45 (Figure 8) into pairs 511, 512, 513 of component values 511L, 511U, 512L, 512U, 513L, 513U for a colour model (or "direct mapped colour") which is the same as that of the incoming video data 20. The component value pairs 511, 512, 513 define lower and upper limits for the component value of each component.

In this example, an RGB model is used and so the colour components comprise red, green and blue components. However, other colour models can be used, such as YUV, Y'UV or YCbCr. The index values are converted into pairs of red, green and blue component values 511L, 511U, 512L, 512U, 513L, 513U. In this example, 12 bits are used to encode pixel data. However, fewer or more bits, for example 24 bits, may be used to encode data. Thus, in this case each value 511L, 511U, 512L, 512U, 513L, 513U can take a value between 0x0 and 0xF. Referring to Figure 11, a look-up table 52 is used to map each index value into a set of component values when N = 4. As shown in Figure 11, the lower and upper limits for the first, second and third colour components for a given index, i, take the values (C1i,L, C1i,H), (C2i,L, C2i,H) and (C3i,L, C3i,H) respectively. For example, the colour may be green and so the component values (C1i,L, C1i,H), (C2i,L, C2i,H) and (C3i,L, C3i,H) may take values (0x0, 0x4), (0x8, 0xF) and (0x0, 0x4) respectively. The colour may be red and so the component values (C1i,L, C1i,H), (C2i,L, C2i,H) and (C3i,L, C3i,H) may take values (0x8, 0xF), (0x0, 0x4) and (0x0, 0x4) respectively. A colour may be yellow and so the component values (C1i,L, C1i,H), (C2i,L, C2i,H) and (C3i,L, C3i,H) may take values (0xD, 0xF), (0xA, 0xF) and (0x0, 0x4) respectively.

The component values (C1i,L, C1i,H), (C2i,L, C2i,H) and (C3i,L, C3i,H) for "don't care" are set to full range, in this case (0x0, 0xF), (0x0, 0xF) and (0x0, 0xF) respectively. Thus, as will be explained in more detail later, when checking whether a component value of a pixel 42 (Figure 7) falls within the respective test range, the component value will fall within the test range for "don't care" regardless of its value.

A corresponding full range can be set for "don't care" for other colour models. The upper and lower values 511L, 511U, 512L, 512U, 513L, 513U for a given entry index can be stored as a pair of values, one storing the values of the lower limits 511L, 512L, 513L and the other storing the values of the upper limits 511U, 512U, 513U. For example, for an entry index which corresponds to the colour yellow, assuming an RGB444 image format is used, the values stored in the look-up table are 0xDA0 and 0xFF4. Thus, the lower and upper limits 511L, 511U for the first component are 0xD and 0xF respectively, the lower and upper limits 512L, 512U for the second component are 0xA and 0xF respectively and the lower and upper limits 513L, 513U for the third component are 0x0 and 0x4 respectively.
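
By way of illustration only, a C sketch of this expansion is given below; the structure and function names, and the exact packing (one 12-bit word of lower limits and one of upper limits per index), are assumptions based on the RGB444 example above rather than a definitive description of the hardware.

#include <stdint.h>

/* Packed per-index limits: one 12-bit word of lower limits (C1, C2, C3 nibbles)
   and one 12-bit word of upper limits, e.g. yellow = 0xDA0 / 0xFF4.            */
typedef struct { uint16_t lower; uint16_t upper; } packed_limits_t;

/* Expanded limits: lower/upper value per colour component. */
typedef struct { uint8_t lo[3]; uint8_t hi[3]; } expanded_limits_t;

/* Illustrative look-up table for N = 4 (three colours plus "don't care"). */
static const packed_limits_t lut[4] = {
    { 0x080, 0x4F4 },   /* index 0: green        (0x0-0x4, 0x8-0xF, 0x0-0x4) */
    { 0x800, 0xF44 },   /* index 1: red          (0x8-0xF, 0x0-0x4, 0x0-0x4) */
    { 0xDA0, 0xFF4 },   /* index 2: yellow       (0xD-0xF, 0xA-0xF, 0x0-0x4) */
    { 0x000, 0xFFF },   /* index 3: "don't care" -> full range               */
};

/* Expand a 2-bit test-pixel index into per-component lower/upper limits. */
static expanded_limits_t expand_index(uint8_t index)
{
    packed_limits_t p = lut[index & 0x3u];
    expanded_limits_t e;
    for (int c = 0; c < 3; c++) {
        int shift = 8 - 4 * c;             /* nibble 2 = C1, 1 = C2, 0 = C3 */
        e.lo[c] = (uint8_t)((p.lower >> shift) & 0xF);
        e.hi[c] = (uint8_t)((p.upper >> shift) & 0xF);
    }
    return e;
}

With this packing, the "don't care" entry expands to the full range (0x0, 0xF) for every component, so any pixel value is accepted for such test pixels.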

Referring to Figure 12, the data expander 37 may store the look-up table 52 in the form of a programmable register. The register 52 comprises elements 54 storing component values 53 in pairs for different indices 55 (one of which corresponds to "don't care") and different components 56. The values 53 in the register 52 can be programmed through the control interface unit 41.

Referring again to Figures 6 and 13, the data comparator 39 compares, component-wise, the component values 571, 572, 573 of pixel data 57 for a pixel 42 (Figure 7) of incoming image data 20 within the test area 24 (Figure 7) with the pairs of component values 511L, 511U, 512L, 512U, 513L, 513U of pixel data 50 for a corresponding test pixel 43 (Figure 10) to judge whether the pixel is valid.

Determining correct display of a telltale based on counting valid or invalid pixels

Figure 14 is a flow diagram of a comparing and counting process carried out by the data comparator 39.

Referring to Figures 6, 13 and 14, when a new frame of video data is received, a valid pixel counter is reset (step S1). The data compare unit 39 inspects pixel data 57 for a next (in this case, the first) pixel 42 (Figure 7) in the test area 24 (Figure 7) (step S2).

The data comparator 39 reads the first component value 571 (for example, the red component) of the pixel data 57 for the incoming image data 20 (step S3) and compares it against the lower and upper limits 511L, 511U for the corresponding component (steps S4 & S5). If the first component value 571 falls within range, then the data comparator 39 reads the second component value 572 (for example, the green component) of the pixel data 57 for the incoming image data 20 (step S6) and compares it against the lower and upper limits 512L, 512U for the corresponding component (steps S7 & S8).

If the second component value 572 falls within range, then the data comparator 39 reads the third component value 573 (for example, the blue component) of the pixel data 57 for the incoming image data 20 (step S9) and compares it against the lower and upper limits 513L, 513U for the corresponding component (steps S10 & S11).

If each component value 571, 572, 573 falls in range, then the pixel is considered to be valid (step S12) and the valid pixel counter is incremented by one (step S13).

If, however, one or more of the component values 571, 572, 573 falls outside range, then the pixel is considered to be invalid (step S14). The valid pixel counter is not incremented. Optionally, an invalid counter may be used to count invalid pixels.

The data comparator 39 determines whether all the pixels 42 in the test area 24 (Figure 7) have been checked (step S15). If not, the data comparator 39 proceeds by inspecting the pixel data 57 for the next pixel 42 (Figure 7) in the test area 24 (Figure 7) (step S2) and repeating the process (steps S3 to S11 and either S12 & S13 or S14).

Once all the pixels 42 in the test area 24 (Figure 7) have been checked, the data comparator 39 outputs the value 58 of the valid pixel counter to the discriminator 40 (step S16).

Referring to Figure 6, the discriminator 40 decides whether or not the telltale has been correctly displayed.

Figure 15 is a flow diagram of a decision and action process carried out by the discriminator 40.

Referring to Figures 6 and 15, once a frame has been compared, the value 58 of the valid pixel counter is compared to a threshold value 59 (step S17). The threshold value may be at least 50% of the tested pixels. For example, the threshold value may be 90% of the tested pixels or more. The discriminator 40 may store the threshold value 59 in the form of a programmable register. The value 59 can be programmed through the control interface unit 41. If the measured value 58 equals or exceeds the threshold value 59, the discriminator 40 determines that the telltale has been correctly displayed (step S18). The discriminator 40 may set a flag 31 in a status register (not shown) and/or output a message 32 (steps S19 & S20). However, if the measured value 58 falls below the threshold value 59, then the discriminator 40 determines that the telltale has been incorrectly displayed (step S21). The discriminator 40 may set a flag 31 in a status register (not shown) and/or output a message 32 (steps S22 & S23). The message 32 (no error/error) can be programmed through the control interface unit 41.
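
The first process (steps S1 to S23) can be summarised by the following C sketch. It is illustrative only: the application implements the checker as hardware logic, and all function and type names here are assumptions.

#include <stdint.h>
#include <stddef.h>

/* One RGB444 pixel and its expanded per-component test limits (illustrative). */
typedef struct { uint8_t c[3]; } pixel_t;                   /* component values 0x0-0xF */
typedef struct { uint8_t lo[3]; uint8_t hi[3]; } limits_t;  /* expanded test ranges     */

/* Steps S1-S16: count the pixels of the test area whose every component
   lies within its test range.                                            */
static uint32_t count_valid_pixels(const pixel_t *area,
                                   const limits_t *test,
                                   size_t num_pixels)
{
    uint32_t valid = 0;                                /* step S1: reset counter  */
    for (size_t i = 0; i < num_pixels; i++) {          /* steps S2, S15           */
        int in_range = 1;
        for (int c = 0; c < 3 && in_range; c++)        /* steps S3-S11            */
            if (area[i].c[c] < test[i].lo[c] || area[i].c[c] > test[i].hi[c])
                in_range = 0;                          /* step S14: pixel invalid */
        if (in_range)
            valid++;                                   /* steps S12-S13           */
    }
    return valid;                                      /* step S16                */
}

/* Steps S17-S23: the telltale is deemed correctly displayed when the
   valid-pixel count reaches the programmable threshold 59.             */
static int telltale_displayed_correctly(uint32_t valid_count, uint32_t threshold)
{
    return valid_count >= threshold;
}

With a 100 by 100 pixel test area and a 90% threshold, for example, telltale_displayed_correctly would be called with threshold = 9000.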

Referring again to Figure 6, the control interface 41 provides an interface to a main controller 15 (Figure 5) or other external controller. Through the interface 41 it is possible to program the test data 46 (Figure 8), the values 53 (Figure 12) in the look-up table (Figure 12), the data 47 (Figure 6) regarding the position and size of the test area 24 (Figure 7), the threshold value 59 for the result judgement and/or the messages 32, as well as other control information. The control interface 41 may also provide an interface to inspect the status 31 stored in the discriminator 40.

In the process hereinbefore described (hereinafter referred to as the "first process"), the data comparator 39 counts the number of valid (or invalid) pixels and the discriminator 40 compares this number against a threshold. Additionally or alternatively, another process (or "second process") can be used in which the data comparator 39 determines a so-called "pixel difference" for each pixel, in other words a measure of how much the colour of an actual screen pixel differs from a test pixel, and sums the pixel differences over a test area 24, and the discriminator 40 compares the sum against a threshold. The second process allows narrower ranges for the test colour components to be used, but provides some latitude if many pixels are invalid (i.e. fall outside range) but only fall just outside range. This can afford greater control when determining whether or not a symbol has been displayed correctly.

Determining correct display of a telltale based on measuring pixel difference

Referring to Figure 16, the data comparator 39 (Figure 6) can be used to measure, for each pixel i, a degree of deviation (or "pixel difference") of an actual screen pixel colour component value outside the respective colour component range. The deviation (or "difference") is expressed as an error value, ERROR(i, CX), where X = 1, 2 or 3. For example, C1 may be red, C2 may be green and C3 may be blue. If the actual screen pixel component value falls within range, then ERROR(i, CX) is zero. If the actual screen pixel component value falls outside range, then ERROR(i, CX) is equal to the absolute difference between the actual screen pixel component value and the closest threshold limit, which may be the upper or lower limit, i.e. ERROR(i, CX) = abs(CX − CXU) or abs(CXL − CX).

Error values ERROR(i, CX) for each colour component are added to determine an error value, ERROR(i), for a pixel, i.e. ERROR(i) = ERROR(i, C1) + ERROR(i, C2) + ERROR(i, C3).

Error values for all the pixels, i.e. pixels 0, 1, ..., i, ..., (m×n)−1, are added to find a total error value, ERROR, for the test area. The discriminator 40 (Figure 6) compares this total error value against an error value threshold.
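
Expressed compactly (in the notation used above, for an m by n test area):

ERROR = ERROR(0) + ERROR(1) + ... + ERROR((m×n)−1)

where each ERROR(i) is the per-pixel sum of the three component error values.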

Using this approach, the error value threshold can be set so as to allow, on average, a certain degree of deviation by a certain number of pixels. Thus, a symbol which might be recognisable to the user but which would otherwise be deemed to be incorrectly displayed by the first process can be judged to be correctly displayed.

Figure 17 is a flow diagram of the comparing and counting steps carried out by the data comparator 39 (Figure 6) in the second process. In this example, it is assumed that there are three colour components (C1, C2, C3), each component encoded with 4 bits, i.e. RGB444. However, other image formats can be used. Referring to Figures 6, 13 and 17, when a new frame of image data is received, an ERROR value register is reset (step S25).

A pixel ERROR value register is reset (step S26). The data compare unit 39 inspects pixel data 57 for a next (in this case, the first) pixel 42 (Figure 7) in the test area 24 (Figure 7) (step S27).

The data comparator 39 reads the first component value 571 (for example, the red component) of the pixel data 57 for the incoming image data 20 (step S28) and compares it against the lower and upper limits 511L, 511U for the corresponding component (steps S29 & S30).

If the first component value 571 falls below the lower limit 511L, then the data comparator 39 determines ERROR(i, C1) by calculating an absolute value of the difference between the first component value 571 and the lower limit 511L (step S31).

If the first component value 571 is above the upper limit 511U, then the data comparator 39 determines ERROR(i, C1) by calculating an absolute value of the difference between the first component value 571 and the upper limit 511U (step S32).

If the first component value 571 falls between the lower and upper limits 511L, 511U, then the data comparator 39 sets ERROR(i, C1) to 0x0 (step S33).

The same process is repeated for the second component value 572 (for example green component) of the pixel data 57 (steps S34 to S39) and the third component value 573 (for example blue component) of the pixel data 57 (steps S40 to S45).

The difference values ERROR(i, CX) for each component (C1, C2, C3) are summed to yield a difference value, ERROR(i), for the pixel (step S46).

For example, the lower limits 511L, 512L, 513L may be represented by a value 0x000 and the upper limits 511U, 512U, 513U may be represented by a value 0x9FF. Thus, the lower and upper limits 511L, 511U for the red component are 0x0 and 0x9 respectively, the lower and upper limits 512L, 512U for the green component are 0x0 and 0xF respectively and the lower and upper limits 513L, 513U for the blue component are 0x0 and 0xF respectively. If a screen image pixel has a value 0xB44 (i.e. the red component is 0xB, the green component is 0x4 and the blue component is 0x4), the difference values ERROR(i, CX) for the red, green and blue components are 0x2, 0x0 and 0x0 respectively and the difference value, ERROR(i), for the pixel is 0x2. Steps S29 to S46 can be executed using a suitable function such as, for example:

ERROR(i) = max(cropneg(C1 - C1U), cropneg(C1L - C1))

         + max(cropneg(C2 - C2U), cropneg(C2L - C2))

         + max(cropneg(C3 - C3U), cropneg(C3L - C3))

where

cropneg(x) is { x for x >= 0

              { 0 for x < 0

and

max(x, y) is { x for x >= y

              { y for x < y
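
For illustration, the same per-pixel calculation can be written in C as follows (a sketch only; the application implements this in hardware logic, and the function and variable names are assumptions):

#include <stdint.h>

/* cropneg(x): x when x >= 0, otherwise 0 (clamp negative differences). */
static int cropneg(int x) { return x >= 0 ? x : 0; }

static int max2(int x, int y) { return x >= y ? x : y; }

/* Per-pixel error ERROR(i) for one RGB444 pixel (c1, c2, c3) against
   lower/upper limits (c1l..c3u); zero when all components are in range. */
static int pixel_error(int c1, int c2, int c3,
                       int c1l, int c1u, int c2l, int c2u, int c3l, int c3u)
{
    return max2(cropneg(c1 - c1u), cropneg(c1l - c1))
         + max2(cropneg(c2 - c2u), cropneg(c2l - c2))
         + max2(cropneg(c3 - c3u), cropneg(c3l - c3));
}

/* Example from the text: pixel 0xB44 against limits 0x000/0x9FF yields 0x2. */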

The data comparator 39 determines whether all the pixels 42 in the test area 24 (Figure 7) have been checked (step S48). If not, the data comparator 39 proceeds by resetting the pixel ERROR register (step S26), inspecting the pixel data 57 for the next pixel 42 (Figure 7) in the test area 24 (Figure 7) (step S27) and repeating the process (steps S28 to S47).

Once all the pixels 42 in the test area 24 (Figure 7) have been checked, the data comparator 39 outputs the ERROR value 58 to the discriminator 40 (step S49).

The discriminator 40 (Figure 6) decides whether or not the telltale has been correctly displayed.

Figure 18 is a flow diagram of a decision and action process carried out by the discriminator 40 in the second process.

Referring to Figures 6 and 18, once a frame has been compared, the ERROR value 58 is compared to a threshold ERROR value 59 (step S50). The threshold ERROR value 59 may be set to, for example, a value closest to 5% of the maximum possible error for a frame. The discriminator 40 may store the ERROR threshold value 59 in the form of a programmable register. The value 59 can be programmed through the control interface unit 41. If the measured value 58 falls below the threshold value 59, the discriminator 40 determines that the telltale has been correctly displayed (step S51). The discriminator 40 may set a flag 31 in a status register (not shown) and/or output a message 32 (steps S52 & S53).

However, if the accumulated ERROR value 58 is equal to or exceeds the ERROR threshold value 59, then the discriminator 40 determines that the telltale has been incorrectly displayed (step S54). The discriminator 40 may set a flag 31 in a status register (not shown) and/or output a message 32 (step S55 & S56). The discriminator 40 may issue an interrupt and store the index number of the failed monitoring area in the status register (not shown).

The message 32 (no error/error) can be programmed through the control interface unit 41.

Multiple video output checkers 23

Referring again to Figure 5, more than one video output checker 23 may be provided.

For example, up to four video output checkers 23 can be provided sharing a common single-port RAM macro. This can help to reduce chip size, although this might limit the ability to monitor overlapping test areas.

If more than one checker 23 is used, then they can operate in parallel. Parallel checkers 23 can monitor overlapping areas. For example, four modules each running four checkers 23 can monitor up to 16 telltales in parallel.

Time multiplexing

Although multiple video output checkers 23 can be used to check multiple telltales, a single video output checker 23 can also be used to monitor multiple telltales.

Figure 19 shows an active area 21 of a screen 17 (Figure 5) in which multiple illustrative telltales 221, 222, 223, 224 can be displayed as-and-when required. The telltales 221, 222, 223, 224 are on a black background. Each telltale 221, 222, 223, 224 is different and, when displayed, is displayed in a different part of the active area 21. Each telltale 221, 222, 223, 224 has a corresponding test area 241, 242, 243, 244. A first telltale 221 is a yellow check engine symbol, a second telltale 222 is a dark blue high-beam-on indicator symbol, a third telltale 223 is a green cruise control indicator symbol and a fourth telltale 224 is a red brake warning indicator symbol.

Referring also to Figure 20, a single video output checker 23 (Figure 5) running one checking process per frame can be used to monitor multiple telltales 221, 222, 223, 224 (for example up to 16 telltales) using time multiplexing.

The address generator 35 (Figure 6) handles time-multiplexed monitoring of multiple (enabled) test areas 241, 242, 243, 244. One test area 241, 242, 243, 244 is checked each frame 60n, 60n+1, 60n+2, 60n+3. Thus, after checking a first monitoring area 241 in a first frame 60n, the video output checker 23 checks a second monitoring area 242 in the next frame 60n+1 and so on. After all active monitoring areas 241, 242, 243, 244 have been processed, the video output checker 23 repeats the operation starting again with the first monitoring area 241.

As shown in Figure 20, the video output checker 23 can inspect different test areas 241, 242, 243, 244 containing different telltales 221, 222, 223, 224 (or the same test area in which different telltales can be displayed) in different frames 60n, 60n+1, 60n+2, 60n+3 to see if the corresponding expected image 251, 252, 253, 254 is displayed.

If the video data 20 has a frame rate of at least 80 fps (i.e. each frame is displayed for no more than 12.5 ms) and the reaction time of a driver is considered to be, for example, 200 ms, then it is possible to monitor up to 16 different telltales sequentially. A combination of two or more video output checkers 23 and time multiplexing can be used. Thus, a combination of parallel checking and sequential checking can be used.

The process can be implemented as a hardware state machine which is reconfigured for each image, i.e. each frame. For example, a test area in a frame can be checked using a set of registers which define the colour look up table, geometry and test pixels. When the next frame is output, the registers can be switched to a new set of registers.
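
A minimal C sketch of this per-frame reconfiguration is given below; it is illustrative only (the application describes a hardware state machine), and the structure, field and function names are assumptions.

#include <stdint.h>

#define NUM_AREAS 4u   /* enabled test areas 241 to 244 (illustrative) */

/* Per-area register set switched in at each new frame: geometry of the
   test area, the colour look-up table and the decision threshold.      */
typedef struct {
    uint16_t x, y, width, height;   /* position and size of the test area */
    uint16_t lut_lower[4];          /* packed lower limits per index      */
    uint16_t lut_upper[4];          /* packed upper limits per index      */
    uint32_t threshold;             /* valid-pixel or error threshold     */
} area_config_t;

static area_config_t area_config[NUM_AREAS];
static unsigned current_area;

/* Called at the start of each frame (e.g. on VSync): select the register
   set for the next enabled test area in round-robin order. At 80 fps
   (12.5 ms per frame), 16 areas can each be revisited within a 200 ms
   driver reaction time.                                                  */
static const area_config_t *next_frame_config(void)
{
    const area_config_t *cfg = &area_config[current_area];
    current_area = (current_area + 1u) % NUM_AREAS;
    return cfg;
}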

Accordingly, it is possible to test several overlapping objects frame by frame.

The process can be used to check two or more video outputs. Input data can be switched from one video output to another video output.

For example, telltales can be sequentially tested on a first screen, a second screen and then the first screen again, and so on.

Window watchdog timer

Referring to Figure 5, the video output checker 23 automatically checks the timing of VSync and HSync because, if the timing of VSync and/or HSync is/are incorrect, then the test area 24 will be incorrectly positioned and the telltale checking process will yield an error result.

Referring to Figure 21, the video output checker 23 can include a window watchdog timer 61 to supervise the video output 20 (Figure 5) and the video output checking process to confirm that the video output checking process of a frame is completed within a certain time window and, thus, verify that the video output 20 (Figure 5) and checking process are operating properly.

The window watchdog timer 61 includes a window watchdog controller 62 and a counter 63 having a counter value 64. Referring also to Figure 22, for a given frame 60q, after the video checking process for a test area 24r has been completed, the discriminator 40 outputs a check complete signal 65 at the end of the complete frame 60q. The address generator 35 (Figure 6) detects the end of a complete frame (based on sync data and by counting all the pixels in the frame) and signals the end of the complete frame to the discriminator 40. When the window watchdog controller 62 receives the check complete signal 65 from the discriminator 40, it provides a reset signal 66 to reset the counter 63. The video checking process for the test area 24r+1 in the next frame 60q+1 is then carried out, and so on.

Prior to resetting, when the window watchdog controller 62 receives the check complete signal 65 from the discriminator 40, it compares the counter value 64 against lower and upper values 67L, 67H so as to check whether the check complete signal 65 is received within a preset window of time, t1 to t2. The lower and upper values 67L, 67H can be programmed.

If the check complete signal 65 is received before the start of the window (i.e. before t1) or after the end of the window (i.e. after t2), then the window watchdog controller 62 generates and transmits a failure interrupt 68 to the main controller 15.

The window can be used to check whether the process is completed within, for example, ±10% of an expected completion time.

To give an example, a video output checking process can be performed every 16.6 ms for a display refreshing with a frame rate of 60 fps. Accordingly, the window watchdog timer 61 can be programmed to have a lower limit 67L of 15 ms and an upper limit 67H of 18 ms.

Therefore, if after 18 ms no frame complete signal 65 is received (for example, as shown in the last frame 60q+3 in Figure 22), the window watchdog timer 61 determines that video output checking is not properly operating. This might occur, for example, if a video output checking process does not finish checking a test area 24.

If the window watchdog timer 61 receives a frame complete signal 65 within 15 ms of a previously-transmitted frame complete signal 65, then the window watchdog timer 61 determines that the timing of the video output 20 (Figure 5) is not set correctly or that the video output checking process could not check a complete image because, for example, the image has not yet been completely output.
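
The window check itself can be summarised by the following C sketch (illustrative only; the counter units, field and function names are assumptions, with the 15 ms and 18 ms limits taken from the 60 fps example above):

#include <stdint.h>

/* Programmable window limits 67L and 67H, here expressed in milliseconds. */
typedef struct {
    uint32_t lower_ms;   /* t1, e.g. 15 ms for a 60 fps output */
    uint32_t upper_ms;   /* t2, e.g. 18 ms for a 60 fps output */
} watchdog_window_t;

/* Called when the check complete signal 65 arrives; elapsed_ms is the
   counter value 64 (time since the previous check complete signal).
   Returns 1 if the signal falls inside the window, 0 if a failure
   interrupt 68 should be raised (signal too early or too late).        */
static int check_complete_in_window(uint32_t elapsed_ms,
                                    const watchdog_window_t *window)
{
    return elapsed_ms >= window->lower_ms && elapsed_ms <= window->upper_ms;
}

/* The case in which no signal arrives at all must be detected separately
   by the controller once the counter exceeds upper_ms.                   */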

The window watchdog timer can be used whether or not time multiplexing is used.

The video output checker 23 using the first and/or second process can have one or more advantages.

A human observer may not distinguish between small differences in colour and/or amplitude. However, if an overly-sensitive image checking process (such as one employing CRC) is used and there are slight, but imperceptible (to the human observer), differences in colour and/or amplitude, then an otherwise acceptably-displayed telltale may be dismissed as being improperly displayed. Therefore, by checking ranges of colours and/or by designating certain pixels as being ones whose colour is unimportant, the sensitivity of the video output checker 23 (Figure 6) to images which, for example, employ dithering or which blend with background data can be reduced. Moreover, the ranges can be adjusted and/or the number and/or position of "don't care" pixels can be adjusted (even on a frame-by-frame basis) to (dynamically) alter sensitivity.

Moreover, the threshold value 59 can also independently be adjusted (optionally, on a frame-by-frame basis) to alter sensitivity.

Thus, the video output checker 23 can tolerate a greater variety in colour and shape of telltales and/or the degree of tolerance can be varied.

It will be appreciated that many modifications may be made to the embodiments hereinbefore described.

For example, the video output checker 23 can be used in other, non-automotive applications, such as industrial control.

Although the video output checker 23 described is based on an RGB colour model, other colour models can be used, such as YUV, Y'UV and other similar colour models. Pixels in the video data 20 may be encoded using 12 bits, 24 bits or other numbers of bits. Although the video output checker 23 is based on processing three component values, more (e.g. 4) component values may be used.

The video output checker 23 can run in the video output pixel clock domain. This means that the control interface 41 and result are re-synchronised.