Title:
SYSTEMS AND METHODS FOR MONITORING PLAYBACK OF RECORDED CONTENT
Document Type and Number:
WIPO Patent Application WO/2018/152630
Kind Code:
A1
Abstract:
The various embodiments disclosed herein relate to a system and method for monitoring playback of recorded content. The system comprises a playback monitor controller that can be placed behind a cinema screen or in front of the screen. The system can be configured to detect the presence of light and sound or to compare the light and sound detected to a content verification record for audit or verification purposes. The method uses the playback monitor controller to obtain light and sound data, process that data, and compare it to the content verification record.

Inventors:
BLUMENFELD STEVEN (US)
ARCHER ROD (US)
RELACION DOMINGO (CA)
CHIU JOSEPH (US)
Application Number:
PCT/CA2018/050198
Publication Date:
August 30, 2018
Filing Date:
February 21, 2018
Assignee:
TIMEPLAY INC (CA)
International Classes:
H04N21/442; H04N21/40; H04N21/4425
Foreign References:
US 8957972 B2 (2015-02-17)
US 9794641 B2 (2017-10-17)
US 2011/0090482 A1 (2011-04-21)
US 2017/0048609 A1 (2017-02-16)
Attorney, Agent or Firm:
BERESKIN & PARR LLP / S.E.N.C.R.L. (CA)
Claims:
We claim:

1. A playback monitoring system located at a venue, the playback monitoring system being communicably coupled to a content playback device configured to play audio and video associated with a content file at the venue, the video being displayed on a display screen and the audio being provided within the venue, the playback monitoring system comprising:

- a memory unit;

- a processing unit coupled to the memory unit;

- at least one video sensor coupled to the memory unit and the processing unit, the at least one video sensor being configured to detect presence of light corresponding to the video being displayed on the display screen within the venue; and

- at least one audio sensor coupled to the memory unit and the processing unit, the at least one audio sensor being configured to detect presence of sound corresponding to the audio being provided within the venue,

- the processing unit being configured to generate a light presence signal based on the detection of presence of light by the at least one video sensor, and a sound presence signal based on the detection of sound by the at least one sound sensor.

2. The playback monitoring system of claim 1, wherein the at least one video sensor is further configured to detect light intensity associated with the video being displayed on the display screen, and generate a corresponding light intensity signal comprising a plurality of light intensity values.

3. The playback monitoring system of claim 2, further comprising a content verification record corresponding to the video associated with the content file being played using the content playback device, wherein the processing unit is configured to:

- receive the content verification record from an external server, the content verification record comprising a plurality of brightness values associated with the video being displayed on the display screen;

- compare the plurality of brightness values within the content verification record to the plurality of light intensity values within the light intensity signal; and

- generate a correlation index for the video associated with the content file based on the comparison.

4. The playback monitoring system of claim 1, wherein the at least one audio sensor is further configured to sample sound associated with the audio being provided within the venue, and generate a corresponding sound signal comprising a plurality of sampled sound values.

5. The playback monitoring system of claim 4, further comprising a content verification record corresponding to the audio associated with the content file being played using the content playback device, wherein the processing unit is configured to:

- receive the content verification record from an external server, the content verification record comprising a plurality of sound levels associated with the audio being played within the venue;

- compare the plurality of sound levels within the content verification record to the plurality of sampled sound values within the sound signal; and

- generate a correlation index for the sound associated with the content file based on the comparison.

6. The playback monitoring system of claim 1, wherein the at least one video sensor is a single cell light intensity sensor.

7. The playback monitoring system of claim 1, wherein the at least one video sensor is located behind the display screen such that it is on the opposite side of the display screen from the content playback device.

8. The playback monitoring system of claim 1, wherein the at least one video sensor is located on the same side of the display screen as the content playback device.

9. The playback monitoring system of claim 1, wherein the content file is stored on film.

10. The playback monitoring system of claim 1, wherein the content file is a digital file.

11. The playback monitoring system of claim 1, wherein the venue is a movie theater and the content playback device is a projector.

12. A method of monitoring playback of a content file within a venue, the venue including a content playback device configured to play audio and video associated with the content file, the video associated with the content file being displayed on a display screen and the audio associated with the content file being provided within the venue, the method comprising:

- detecting, by at least one video sensor, presence of light corresponding to the video being displayed on the display screen within the venue;

- detecting, by at least one audio sensor, presence of sound corresponding to the audio being provided within the venue; and

- generating, by a processing unit coupled to the at least one video sensor and the at least one audio sensor, a light presence signal based on the detection of presence of light by the at least one video sensor, and a sound presence signal based on the detection of sound by the at least one sound sensor.

13. The method of claim 12, further comprising:

- detecting, by the at least one video sensor, light intensity associated with the video being displayed on the display screen; and

- generating, by the processing unit, a corresponding light intensity signal comprising a plurality of light intensity values.

14. The method of claim 13, further comprising:

- receiving a content verification record from an external server, the content verification record corresponding to the video associated with the content file being played using the content playback device, the content verification record comprising a plurality of brightness values associated with the video being displayed on the display screen;

- comparing the plurality of brightness values within the content verification record to the plurality of light intensity values within the light intensity signal; and

- generating a correlation index for the video associated with the content file based on the comparison.

15. The method of claim 12, further comprising:

- sampling, by the at least one audio sensor, the sound associated with the audio being provided within the venue; and

- generating a corresponding sound signal comprising a plurality of sampled sound values.

16. The method of claim 15, further comprising:

- receiving a content verification record from an external server, the content verification record corresponding to the audio associated with the content file being played using the content playback device, the content verification record comprising a plurality of sound levels associated with the audio being played within the venue;

- comparing the plurality of sound levels within the content verification record to the plurality of sampled sound values within the sound signal; and

- generating a correlation index for the sound associated with the content file based on the comparison.

17. A computer-readable medium storing computer-executable instructions, the instructions for causing a processor to perform a method of monitoring playback of a content file within a venue, the venue including a content playback device configured to play audio and video associated with the content file, the video associated with the content file being displayed on a display screen and the audio associated with the content file being provided within the venue, the method comprising:

- detecting, by at least one video sensor, presence of light corresponding to the video being displayed on the display screen within the venue;

- detecting, by at least one audio sensor, presence of sound corresponding to the audio being provided within the venue; and

- generating, by a processing unit coupled to the at least one video sensor and the at least one audio sensor, a light presence signal based on the detection of presence of light by the at least one video sensor, and a sound presence signal based on the detection of sound by the at least one sound sensor.

18. The computer-readable medium of claim 17, wherein the method further comprises:

- detecting, by the at least one video sensor, light intensity associated with the video being displayed on the display screen;

- generating, by the processing unit, a corresponding light intensity signal comprising a plurality of light intensity values;

- receiving a content verification record from an external server, the content verification record corresponding to the video associated with the content file being played using the content playback device, the content verification record comprising a plurality of brightness values associated with the video being displayed on the display screen;

- comparing the plurality of brightness values within the content verification record to the plurality of light intensity values within the light intensity signal; and

- generating a correlation index for the video associated with the content file based on the comparison.

19. The computer-readable medium of claim 17, wherein the method further comprises:

- sampling, by the at least one audio sensor, the sound associated with the audio being provided within the venue;

- generating a corresponding sound signal comprising a plurality of sampled sound values;

- receiving a content verification record from an external server, the content verification record corresponding to the audio associated with the content file being played using the content playback device, the content verification record comprising a plurality of sound levels associated with the audio being played within the venue;

- comparing the plurality of sound levels within the content verification record to the plurality of sampled sound values within the sound signal; and

- generating a correlation index for the sound associated with the content file based on the comparison.

20. The computer-readable medium of claim 17, wherein the method is further defined according to any one of claims 12 to 16.

Description:
Title: SYSTEMS AND METHODS FOR MONITORING PLAYBACK OF RECORDED CONTENT

Cross-reference to Related Application

[1] This application claims the benefit of U.S. Provisional Patent Application No. 62/461,753, filed February 21, 2017. The entire contents of U.S. Provisional Patent Application No. 62/461,753 are incorporated by reference herein.

Field

[2] The described embodiments relate to systems and methods for monitoring playback of recorded content, and in particular monitoring presence of light and sound during playback of recorded content.

Background

[3] Conventional systems and methods used to monitor playback of recorded content at a venue typically rely on intrusive devices, such as an internet camera. During playback, an internet camera may be aimed at a display surface to capture images and sound of the recorded content as seen and heard by an audience. However, use of such intrusive devices tends to be expensive, complex, and prone to security and piracy concerns, including concerns of malicious use.

Summary

[4] In a first broad aspect, in at least one embodiment described herein, there is provided an In Venue Light Sound Meter (IVLSM) system and method for capturing the presence or absence of light and sound.

[5] In some embodiments, the IVLSM is a device that is placed behind a movie screen, measures the presence or absence of sound and light, and transmits that data to a central processor where the data is analyzed and aggregated for monitoring and alerting purposes.

[6] In some other embodiments, the IVLSM device is placed in front of the movie screen, such as in the theater or projection booth. In some cases, the device may also include a filter to distort the image or block a portion or portions of the image so that the light data it captures has no real value for any purpose other than monitoring playback.

[7] In another aspect, in at least one embodiment described herein, there is provided a playback monitoring system located at a venue, the playback monitoring system being communicably coupled to a content playback device configured to play audio and video associated with a content file at the venue, the video being displayed on a display screen and the audio being provided within the venue, the playback monitoring system comprising: a memory unit; a processing unit coupled to the memory unit; at least one video sensor coupled to the memory unit and the processing unit, the at least one video sensor being configured to detect presence of light corresponding to the video being displayed on the display screen within the venue; and at least one audio sensor coupled to the memory unit and the processing unit, the at least one audio sensor being configured to detect presence of sound corresponding to the audio being provided within the venue, the processing unit being configured to generate a light presence signal based on the detection of presence of light by the at least one video sensor, and a sound presence signal based on the detection of sound by the at least one sound sensor.

[8] In some embodiments, the at least one video sensor is further configured to detect light intensity associated with the video being displayed on the display screen, and generate a corresponding light intensity signal comprising a plurality of light intensity values.

[9] In some embodiments, the system comprises a content verification record corresponding to the video associated with the content file being played using the content playback device, and the processing unit is configured to: receive the content verification record from an external server, the content verification record comprising a plurality of brightness values associated with the video being displayed on the display screen; compare the plurality of brightness values within the content verification record to the plurality of light intensity values within the light intensity signal; and generate a correlation index for the video associated with the content file based on the comparison.

[10] In some embodiments, the at least one audio sensor is further configured to sample sound associated with the audio being provided within the venue, and generate a corresponding sound signal comprising a plurality of sampled sound values.

[11] In some embodiments, the system comprises a content verification record corresponding to the audio associated with the content file being played using the content playback device, wherein the processing unit is configured to: receive the content verification record from an external server, the content verification record comprising a plurality of sound levels associated with the audio being played within the venue; compare the plurality of sound levels within the content verification record to the plurality of sampled sound values within the sound signal; and generate a correlation index for the sound associated with the content file based on the comparison.

[12] In some embodiments, the at least one video sensor is a single cell light intensity sensor.

[13] In some embodiments, the at least one video sensor is located behind the display screen such that it is on the opposite side of the display screen from the content playback device.

[14] In some other embodiments, the at least one video sensor is located on the same side of the display screen as the content playback device.

[15] In some embodiments, the content file is stored on film.

[16] In some other embodiments, the content file is a digital file.

[17] In some embodiments, the venue is a movie theater and the content playback device is a projector.

[18] In another aspect, in at least one embodiment described herein, there is provided a method of monitoring playback of a content file within a venue, the venue including a content playback device configured to play audio and video associated with the content file, the video associated with the content file being displayed on a display screen and the audio associated with the content file being provided within the venue, the method comprising: detecting, by at least one video sensor, presence of light corresponding to the video being displayed on the display screen within the venue; detecting, by at least one audio sensor, presence of sound corresponding to the audio being provided within the venue; and generating, by a processing unit coupled to the at least one video sensor and the at least one audio sensor, a light presence signal based on the detection of presence of light by the at least one video sensor, and a sound presence signal based on the detection of sound by the at least one sound sensor.

[19] In some embodiments, the method comprises detecting, by the at least one video sensor, light intensity associated with the video being displayed on the display screen; and generating, by the processing unit, a corresponding light intensity signal comprising a plurality of light intensity values.

[20] In some embodiments, the method comprises receiving a content verification record from an external server, the content verification record corresponding to the video associated with the content file being played using the content playback device, the content verification record comprising a plurality of brightness values associated with the video being displayed on the display screen; comparing the plurality of brightness values within the content verification record to the plurality of light intensity values within the light intensity signal; and generating a correlation index for the video associated with the content file based on the comparison.

[21] In some embodiments, the method comprises sampling, by the at least one audio sensor, the sound associated with the audio being provided within the venue; and generating a corresponding sound signal comprising a plurality of sampled sound values.

[22] In some embodiments, the method comprises receiving a content verification record from an external server, the content verification record corresponding to the audio associated with the content file being played using the content playback device, the content verification record comprising a plurality of sound levels associated with the audio being played within the venue; comparing the plurality of sound levels within the content verification record to the plurality of sampled sound values within the sound signal; and generating a correlation index for the sound associated with the content file based on the comparison.

[23] In another aspect, in at least one embodiment described herein, there is provided a method of monitoring playback of content, where a content server initiates playback of a particular item of content. The content server transmits a message to a playback monitor controller identifying the particular content and indicating that the content is about to be played back. In response, the playback monitor controller retrieves a content verification record corresponding to the particular content. The content verification record contains a series of expected brightness values that are temporally correlated with the video content. The playback monitor controller receives a light intensity signal from a sensor, and produces a corresponding series of average light intensity values. The playback monitor controller compares the expected brightness values with the average light intensity values to provide a video correlation index. The playback monitor controller provides the video correlation index to the content server, which makes use of it, for example, to verify that an advertisement or other video content has been played back at a venue.

[24] In another aspect, in at least one embodiment described herein, there is provided a computer-readable medium storing computer-executable instructions. The instructions cause a processor to perform a method of monitoring playback of a content file within a venue, the venue including a content playback device configured to play audio and video associated with the content file, the video associated with the content file being displayed on a display screen and the audio associated with the content file being provided within the venue, the method comprising: detecting, by at least one video sensor, presence of light corresponding to the video being displayed on the display screen within the venue; detecting, by at least one audio sensor, presence of sound corresponding to the audio being provided within the venue; and generating, by a processing unit coupled to the at least one video sensor and the at least one audio sensor, a light presence signal based on the detection of presence of light by the at least one video sensor, and a sound presence signal based on the detection of sound by the at least one sound sensor.

[25] In some embodiments, the instructions cause the processor to perform the method comprising: detecting, by the at least one video sensor, light intensity associated with the video being displayed on the display screen; generating, by the processing unit, a corresponding light intensity signal comprising a plurality of light intensity values; receiving a content verification record from an external server, the content verification record corresponding to the video associated with the content file being played using the content playback device, the content verification record comprising a plurality of brightness values associated with the video being displayed on the display screen; comparing the plurality of brightness values within the content verification record to the plurality of light intensity values within the light intensity signal; and generating a correlation index for the video associated with the content file based on the comparison.

[26] In some embodiments, the instructions cause the processor to perform the method comprising sampling, by the at least one audio sensor, the sound associated with the audio being provided within the venue; generating a corresponding sound signal comprising a plurality of sampled sound values; receiving a content verification record from an external server, the content verification record corresponding to the audio associated with the content file being played using the content playback device, the content verification record comprising a plurality of sound levels associated with the audio being played within the venue; comparing the plurality of sound levels within the content verification record to the plurality of sampled sound values within the sound signal; and generating a correlation index for the sound associated with the content file based on the comparison.

[27] In some embodiments, the instructions cause the processor to perform the methods as described above or other methods in accordance with the teachings herein.

[28] Other features and advantages of the present application will become apparent from the following detailed description taken together with the accompanying drawings. It should be understood, however, that the detailed description and the specific examples, while indicating preferred embodiments of the application, are given by way of illustration only, since various changes and modifications within the spirit and scope of the application will become apparent to those skilled in the art from this detailed description.

Brief Description of the Drawings

[29] For a better understanding of the various embodiments described herein, and to show more clearly how these various embodiments may be carried into effect, reference will be made, by way of example, to the accompanying drawings, which show at least one example embodiment. The figures will now be briefly described.

[30] FIG. 1A is an example of a block diagram of a content playback system;

[31] FIG. 1B is another example of a block diagram of a content playback system;

[32] FIG. 1C is another example of a block diagram of a content playback system;

[33] FIG. 2A is an example of a block diagram of a content playback system illustrating the location of various components with respect to each other;

[34] FIG. 2B is another example of a block diagram of a content playback system illustrating the location of various components with respect to each other;

[35] FIG. 3 is an example of a graph of a content verification record;

[36] FIG. 4A is an example of a graph of a light intensity signal;

[37] FIG. 4B is another example of a graph of a light intensity signal;

[38] FIG. 5 is an example of a block diagram of a playback monitoring system;

[39] FIG. 6 is an example of a method of monitoring playback of a content file that may be used by the playback monitoring system;

[40] FIG. 7 is another example of a method of monitoring playback of a content file that may be used by the playback monitoring system; and

[41] FIG. 8 is another example of a method of monitoring playback of a content file that may be used by the playback monitoring system.

[42] The drawings, described below, are provided for purposes of illustration, and not of limitation, of the aspects and features of various examples of embodiments described herein. For simplicity and clarity of illustration, elements shown in the drawings have not necessarily been drawn to scale. The dimensions of some of the elements may be exaggerated relative to other elements for clarity. It will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the drawings to indicate corresponding or analogous elements or steps.

Description of Exemplary Embodiments

[43] Various systems or methods will be described below to provide an example of an embodiment of the claimed subject matter. No embodiment described below limits any claimed subject matter and any claimed subject matter may cover methods or systems that differ from those described below. The claimed subject matter is not limited to systems or methods having all of the features of any one system or method described below or to features common to multiple or all of the apparatuses or methods described below. It is possible that a system or method described below is not an embodiment that is recited in any claimed subject matter. Any subject matter disclosed in a system or method described below that is not claimed in this document may be the subject matter of another protective instrument, for example, a continuing patent application, and the applicants, inventors or owners do not intend to abandon, disclaim or dedicate to the public any such subject matter by its disclosure in this document.

[44] Furthermore, it will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein.

[45] It should also be noted that the terms "coupled" or "coupling" as used herein can have several different meanings depending on the context in which these terms are used. For example, the terms coupled or coupling may be used to indicate that an element or device can electrically, optically, or wirelessly send data to another element or device as well as receive data from another element or device.

[46] It should be noted that terms of degree such as "substantially", "about" and "approximately" as used herein mean a reasonable amount of deviation of the modified term such that the end result is not significantly changed. These terms of degree may also be construed as including a deviation of the modified term if this deviation would not negate the meaning of the term it modifies.

[47] Furthermore, any recitation of numerical ranges by endpoints herein includes all numbers and fractions subsumed within that range (e.g. 1 to 5 includes 1, 1.5, 2, 2.75, 3, 3.90, 4, and 5). It is also to be understood that all numbers and fractions thereof are presumed to be modified by the term "about" which means a variation of up to a certain amount of the number to which reference is being made if the end result is not significantly changed.

[48] The example embodiments of the systems and methods described herein may be implemented in hardware, in software, or in a combination of the two. In some cases, the example embodiments described herein may be implemented, at least in part, by using one or more computer programs, executing on one or more programmable devices comprising at least one processing element, and a data storage element (including volatile memory, non-volatile memory, storage elements, or any combination thereof). These devices may also have at least one input device (e.g. a pushbutton keyboard, mouse, a touchscreen, and the like), and at least one output device (e.g. a display screen, a printer, a wireless radio, and the like) depending on the nature of the device.

[49] It should also be noted that there may be some elements that are used to implement at least part of one of the embodiments described herein that may be implemented via software that is written in a high-level computer programming language such as object oriented programming. Accordingly, the program code may be written in C, C++ or any other suitable programming language and may comprise modules or classes, as is known to those skilled in object oriented programming. Alternatively, or in addition thereto, some of these elements implemented via software may be written in assembly language, machine language or firmware as needed. In either case, the language may be a compiled or interpreted language.

[50] At least some of these software programs may be stored on a storage medium (e.g. a computer readable medium such as, but not limited to, ROM, magnetic disk, optical disc) or a device that is readable by a general or special purpose programmable device. The software program code, when read by the programmable device, configures the programmable device to operate in a new, specific and predefined manner in order to perform at least one of the methods described herein.

[51] Furthermore, at least some of the programs associated with the systems and methods of the embodiments described herein may be capable of being distributed in a computer program product comprising a computer readable medium that bears computer usable instructions for one or more processors. The medium may be provided in various forms, including non-transitory forms such as, but not limited to, one or more diskettes, compact disks, tapes, chips, and magnetic and electronic storage.

[52] The various embodiments disclosed herein generally relate to systems and methods for monitoring playback of recorded content, and in particular monitoring presence of light and sound during playback of recorded content.

[53] The term "content" is used herein to describe moving video images or audio. Some content may include both video and audio that are correlated to one another. Content may be stored digitally, on film, or otherwise fixed in any tangible medium of expression.

[54] When content is stored digitally, it is typically encoded using a selected content encoding format and the resulting encoded data is recorded in a data file. For example, a movie may include video content and audio content (which may include one or more audio tracks) that is temporally correlated to the video content. The video and audio content may be encoded using any appropriate video encoding format such as MP4, WMV, FLV, AVI, Quicktime™, 3GP, H.264 or other formats that will be known to a skilled person. The resulting encoded audio data and video data are recorded in one or more data files, according to the selected format, which may be referred to as audio files or video files. Some encoding formats allow audio data to be recorded together with video data in a single data file, which may typically be referred to as a video file. Data files are typically stored in a file system or data store.

[55] Reference is first made to FIG. 1A, which illustrates a block diagram of a content playback system 100A in accordance with an example embodiment. In the illustrated embodiment of FIG. 1A, the content is digitally stored.

[56] System 100A includes a content server 102, a data storage system 104, a video playback system 106, an audio playback system 108 and a playback monitoring system 110. Video playback system 106 may also be referred to as a video reproduction system. Audio playback system 108 may also be referred to as an audio reproduction system.

[57] Data storage system 104 is a data recording device that is accessible to content server 102. Data storage system 104 may be located locally or remotely to content server 102 and may be accessed through any type of cable or communication system, such as a universal serial bus (USB) cable, a local area network, a wide area network or any combination of such networks, including the Internet. In some embodiments, the content server 102 may be able to access more than one data storage system, each analogous to data storage system 104, using an access cable or a wireless connection appropriate to each data storage system.

[58] In some cases, the data storage system 104 is a server system with storage capabilities, located either locally within the venue of system 100A or remotely from the venue of system 100A. In some other cases, data storage system 104 is coupled to the playback monitoring system 110 via a memory card, such as an SD card, a micro SD card, etc.

[59] Data storage system 104 contains a content data store 120, which is used to store a variety of data files 122, including video files 124 and audio files 126. As noted above, video files 124 include encoded video content and may optionally include encoded audio content that is correlated to the video content, e.g. temporally correlated.

[60] In addition, data storage system 104 includes a content verification database 128. Content verification database 128 includes one or more content verification records 150 corresponding to particular content stored in the data storage system 104. For example, content verification record 150a corresponds to video file 124a stored in the content data store 120.

[61] Content server 102 is coupled to data storage system 104 to retrieve data files 122. Content server 102 transmits the data files to the appropriate content playback system 106 or 108. For example, video files are transmitted to video playback system 106 and audio files are transmitted to audio playback system 108. Video files 124 containing both video and audio data may be transmitted to both video playback system 106 and audio playback system 108.

[62] Video playback system 106 includes a video decoder 130, a projector 132 and a display screen 134. Video decoder 130 receives video files from content server 102, extracts and decodes video content from the video files and provides the video content to projector 132. Projector 132 projects the video data onto display screen 134, producing a visible image that can be observed by viewers (not shown) situated between the projector 132 and the display screen 134. Typically, system 100A is installed in a video viewing facility such as a movie theatre, television room, shopping mall, restaurant, bar or other location where viewers may view the displayed video content. In various embodiments, the video playback system 106 may include display panels, such as LCD or LED display panels, instead of or in addition to a projector/display screen combination.

[63] In system 100A, display screen 134 is a perforated cinema display screen. Display screens are commonly perforated or made from a woven fabric to permit sound from speakers placed behind the display screen (i.e., on the opposite side of the display screen 134 from projector 132) to pass through the display screen. The perforations in the display screen will typically allow a small amount of light projected by projector 132 to pass through the display screen.

[64] In system 100A, the audio playback system 108 includes an audio decoder 160 and one or more loudspeakers 162. Audio content (which may be part of a video file 124 or a separate audio file 126) is transmitted by the content server 102 to the audio decoder 160. Audio decoder 160 decodes the audio content and reproduces it through loudspeaker 162.

[65] As illustrated in system 100A, the playback monitoring system 110 includes a playback monitor controller 140, a video playback sensor 142 and an audio playback sensor 144. The playback monitor controller 140 is coupled to content server 102 and data storage system 104.

[66] Video playback sensor 142 is positioned behind display screen 134. In this example embodiment, video playback sensor 142 is a single cell light intensity sensor 148. For example, sensor 148 may be a CdS photocell, a light dependent resistor, a pin photo diode, a photo transistor, or any other device capable of providing a signal corresponding to the intensity of light incident on the sensor. In various embodiments, the sensor may be a linear or two-dimensional sensor array.

[67] Sensor 142 is responsive to the total light energy incident upon it and provides a time varying light intensity signal 146 corresponding to the total light energy. Controller 140 receives the light intensity signal 146.

[68] When content server 102 initiates the playback of a particular item of content from content data store 120 by the video playback system 106 or the audio playback system 108 or both, the content server transmits a message 158 to playback monitor controller 140 identifying the particular content and indicating that the content is about to be played back. In response, playback monitor controller 140 retrieves a target record, which can be a content verification record 150 corresponding to that particular content, from the data storage system 104.
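
As an illustration of the exchange described above, the following sketch shows a content server notifying the playback monitor controller and the controller retrieving the matching content verification record. It is a minimal sketch, not the disclosed implementation: the message fields and the get_record() interface on the data storage system are hypothetical names introduced here for clarity.

```python
# Illustrative sketch only: the content server sends a message (standing in
# for message 158) naming the content, and the playback monitor controller
# retrieves the corresponding content verification record (150) from the
# data storage system. Names and interfaces are assumptions.

from dataclasses import dataclass


@dataclass
class PlaybackMessage:        # stands in for message 158
    content_id: str           # identifies the particular item of content
    about_to_play: bool = True


class PlaybackMonitorController:
    def __init__(self, data_storage):
        self.data_storage = data_storage   # e.g. data storage system 104
        self.target_record = None

    def on_playback_message(self, msg: PlaybackMessage) -> None:
        # Retrieve the target record (content verification record)
        # corresponding to the content that is about to be played back.
        self.target_record = self.data_storage.get_record(msg.content_id)
```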

[69] Playback monitor controller 140 compares the light intensity signal 146 received from the sensor 142 to the expected brightness values recorded in the corresponding content verification record 150. Based on the comparison, the playback monitor controller 140 generates a video correlation index 156. The video correlation index corresponds to the extent to which the expected brightness values recorded in the corresponding content verification record 150 and average light intensity values within the light intensity signal 146 received from the sensor 142 correlate for the entire length of the video content being played back.

[70] Reference is next made to FIG. 1B, which illustrates a block diagram of a content playback system 100B in accordance with another example embodiment. In the embodiment of FIG. 1B, the content is stored on film instead of being stored digitally. In such embodiments, the video content may be stored in the form of images on film print frames with audio content stored on tracks adjacent to the images on the film print frames.

[71] As illustrated, system 100B includes a content server 102, a data storage system 104', a film playback system 106' and a playback monitoring system 110. Content server 102 is analogous to content server 102 of FIG. 1A, and playback monitoring system 110 is analogous to playback monitoring system 110 of FIG. 1A.

[72] In the illustrated embodiment of system 100B, the data storage system 104' is analogous to data storage system 104 of FIG. 1A. Data storage system 104' includes a content verification database 128', which includes one or more content verification records 150' corresponding to one or more films.

[73] In such embodiments, the content server 102 is coupled to the data storage system 104' to coordinate the playback of a film from the data storage system 104'. In particular, the content server 102 coordinates the playback of a film by triggering the appropriate content playback system 106' to begin playing the content on the film. The content server 102 is also configured to transmit a message 158 to playback monitor controller 140 as discussed above.

[74] In some cases, the content playback system 106' includes a projector with a built-in speaker, where the projector is configured to play a film. In some other cases, the content playback system 106' includes a projector and an external speaker system, where the projector is configured to play the video images provided on a film, and the external speaker system is configured to play the sound associated with the video images on the film in a coordinated manner. The content playback system 106' also includes a display screen 134 analogous to display screen 134 of FIG. 1A.

[75] In the embodiment of FIG. 1B, the playback monitor controller 140 is analogous to the playback monitor controller 140 of FIG. 1A.

[76] Reference is next made to FIG. 1C, which illustrates a block diagram of a content playback system 100C in accordance with another example embodiment. System 100C is generally analogous to systems 100A of FIG. 1A and 100B of FIG. 1B.

[77] System 100C differs from previously disclosed embodiments in that the playback monitoring system 110 is a stand-alone system, which is not coupled to a data storage system (such as the data storage system 104 of FIG. 1A and data storage system 104' of FIG. 1B), and may be optionally coupled to a content server, such as the content server 102 of FIGS. 1A and 1B.

[78] In the embodiment of FIG. 1C, the video sensor 142 is responsive to the total light energy incident upon it and provides a signal 146' that indicates whether or not some light is being generated from the display screen 134. The audio sensor 144 also detects the presence of audio within the venue, and generates another signal 146" that indicates whether or not audio is being generated in the venue. In this embodiment, only the presence or absence of video and/or audio is detected and reported by the playback monitoring system 110, without any further step, such as comparison with a content verification record.

[79] Reference is made to FIG. 2A, which illustrates a block diagram of a content playback system 200A in accordance with another example embodiment. In the embodiment of FIG. 2A, sensor 142 is positioned behind display screen 134. The amount of light reaching sensor 142 depends on various factors, such as the light transmission characteristics of display screen 134, the field of view of sensor 142 and the distance of the sensor from the display screen 134. Various display screens may permit greater or lesser amounts of light to pass through them due to number and size of perforations, the opaqueness of the material from which the display screen is made and the extent to which the display screen diffuses the transmitted light. Typically, the amount of incident light energy 202 reaching display screen 134 from projector 132 will be substantially greater than the amount of transmitted light 204 that is transmitted through screen 134. The transmitted light is generally diffused.

[80] Sensor 142 may be placed at any position behind screen 134. Typically, the field of view of a single cell sensor may range from a few degrees up to about 170-180 degrees, particularly if lenses are provided between a light sensing element in sensor 142 and the display screen 134. Typically, the sensitivity of a sensor falls off at higher angles from the center of the field of view. In this example embodiment, sensor 142 may have a field of view that allows light from circle 206 (as shown on screen 134 of FIGS. 1A-1C) to reach sensor 142. Sensor 142 will typically be more sensitive to light energy transmitted near the center of circle 206 (and thus transmitted at a more direct angle) than to light energy transmitted at the edges of circle 206 (and thus transmitted at a more oblique angle). The radius of circle 206 will depend on the field of view of sensor 142 and the distance between sensor 142 and display screen 134.
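
As a rough geometric sketch, and under the added assumption that the sensor faces the screen squarely with a symmetric field of view θ at a distance d from the screen, the radius of circle 206 is approximately:

$$ r \approx d \cdot \tan(\theta / 2) $$

For example, a sensor with a 90-degree field of view placed 0.5 m behind the screen would, under these assumptions, collect light from a circle of radius of about 0.5 m.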

[81] In some embodiments, sensor 142 generates a light intensity signal 146 that is comprised of a series of light intensity values corresponding to all of the light energy that is incident upon it at any point in time. In some other embodiments, sensor 142 generates a light presence signal 146 that is an indication of whether or not any light is reaching the sensor during the playback of the content.

[82] Reference is next made to FIG. 2B, which illustrates a block diagram of a content playback system 200B in accordance with another example embodiment. In the embodiment of FIG. 2B, sensor 142 is positioned in front of display screen 134. The amount of light reaching sensor 142 will depend on the light reflection characteristics of display screen 134, the field of view of sensor 142 and the distance of the sensor from the display screen 134. Various display screens may permit greater or lesser amounts of light to reflect off of them due to number and size of perforations, the opaqueness of the material from which the display screen is made and the extent to which the display screen diffuses the transmitted light. Typically, the amount of incident light energy 202 reaching display screen 134 from projector 132 will be substantially greater than the amount of reflected light 204' that is reflected off of screen 134. The reflected light is generally not diffused.

[83] In some embodiments, sensor 142 generates a light intensity signal 146 that is comprised of a series of light intensity values corresponding to all of the light energy that is incident upon it at any point in time. In some other embodiments, sensor 142 generates a light presence signal 146 that is an indication of whether or not any light is reaching the sensor during the playback of the content.

[84] In some cases, in the embodiments where the sensor 142 is placed in front of the display screen 134, various filtering mechanisms are used on the display screen 134 to obscure the actual images that can be observed by the sensor 142, without affecting the light that is incident on the sensor 142. For example, the display screen 134 or the sensor 142 may be provided with filters, such as an image blurring filter, a shadow masking filter or a combination filter capable of blurring and shadow masking. If the screen 134 is provided with such filters, then the screen 134 is configured to only obscure the images of the content being played back with respect to the sensor 142, and not the audience viewing the screen.

[85] Reference is next made to FIG. 3, which illustrates a graph 300 of a content verification record 150a according to an example embodiment. As discussed before, the content verification record 150a corresponds to a particular item of video content in a video file 124a in the content data store 120.

[86] As seen in graph 300, content verification record 150a contains a series of expected brightness values that are temporally correlated with the video content. For example, if the video content is 150 seconds long, then content verification record 150a contains a series of expected brightness values corresponding to a 150 second time period. The expected brightness values may be expressed in any form such as: a range between any two numbers, such as 0 and 63, where 0 indicates no light and 63 indicates bright light; a range of lux values (e.g. luminance and luminous emittance etc.) or any other values. The expected brightness values may have any temporal resolution. For example, a content verification record with high resolution expected brightness values may have a resolution equal to the frame rate of the video content. A content verification record with relatively low resolution expected brightness values may have a single light intensity value corresponding to a series of frames in the video content. For example, the light intensity value may correspond to the average expected brightness of a series of frames extending over a fraction of a second, a second, multiple seconds or any other time period. Typically, the content verification record will include an indication of the time period (which may be expressed in frames or any other suitable unit) over which the expected brightness values are averaged.

[87] FIG. 3 illustrates a content verification record 150a in which expected brightness values 152 are provided for successive one second time periods, beginning at the start of the video content (in the video file 124a) to which the content verification record 150a corresponds.
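
A content verification record of this kind can be pictured as a simple list of expected brightness values, one per averaging period. The following sketch assumes the 0-63 brightness scale and one-second periods used in the examples above; the field names are hypothetical rather than a defined file format.

```python
# Illustrative sketch of a content verification record such as 150a.
# Field names and sample values are assumptions for clarity only.

from dataclasses import dataclass
from typing import List


@dataclass
class ContentVerificationRecord:
    content_id: str                  # identifies the video file (e.g. 124a)
    averaging_period_s: float        # period over which brightness is averaged
    expected_brightness: List[int]   # one value per period, 0 (dark) to 63 (bright)


# A 150-second item of content with one expected brightness value per second
# would carry 150 values:
record = ContentVerificationRecord(
    content_id="video-124a",
    averaging_period_s=1.0,
    expected_brightness=[12, 14, 9, 40, 63] + [20] * 145,  # illustrative values
)
assert len(record.expected_brightness) == 150
```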

[88] Reference is next made to FIGS. 3, 4A and 4B. FIG. 4A illustrates a graph 400A of a light intensity signal 146 generated by the sensor 142 according to an example embodiment. As discussed above, the playback monitor controller 140 receives the light intensity signal 146 from the sensor 142; the light intensity signal 146 is formed of a series of light intensity values 148.

[89] FIG. 4B illustrates a graph 400B of an average light intensity signal 146B based on light intensity signal 146. As shown, the average light intensity signal 146B includes a series of average light intensity values 154 corresponding to the average light intensity value during each successive second of the video content.
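
The reduction from the raw light intensity signal 146 to the average light intensity values 154 can be sketched as a windowed average. The sample rate and chunking below are assumptions; the description specifies only that one average value is produced per second of video content.

```python
# Illustrative sketch: average successive one-second windows of raw light
# intensity samples to obtain the values 154 of FIG. 4B.

from statistics import mean
from typing import List


def average_per_second(samples: List[float], sample_rate_hz: int) -> List[float]:
    """Return one average light intensity value per second of signal."""
    averages = []
    for start in range(0, len(samples), sample_rate_hz):
        window = samples[start:start + sample_rate_hz]
        if window:
            averages.append(mean(window))
    return averages
```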

[90] In various embodiments, the playback monitor controller 140 compares the expected brightness values 152 and the average light intensity values 154 and provides a video correlation index 156. The video correlation index corresponds to the extent to which the expected brightness values 152 and average light intensity values correlate for the entire length of the video content.

[91] In this example, playback monitor controller 140 determines the ratio between the expected brightness value 152 and the corresponding average light intensity value 154 for each segment of the video content. In this example, each segment is one second long. One ratio is calculated for each segment. The series of ratios may then be analyzed to determine the degree of correlation between the expected brightness values and the average light intensity values. For example, if the series of ratios exhibits a fairly constant value with a relatively low standard deviation, the correlation of the expected brightness values and the average light intensity values will be considered high. Conversely, if the standard deviation of the series of ratios is relatively high, then the expected brightness values and the average light intensity values will be considered to have a low correlation.
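
Under the approach described in this example, one ratio is computed per one-second segment and the spread of those ratios serves as the video correlation index 156 (a low spread indicating high correlation). A minimal sketch follows; the epsilon guard against dark segments is an added assumption.

```python
# Illustrative sketch: per-segment ratio of expected brightness (152) to the
# corresponding average light intensity (154), with the standard deviation of
# the ratios used as the video correlation index (156).

from statistics import pstdev
from typing import List


def video_correlation_index(expected_brightness: List[float],
                            average_intensity: List[float],
                            eps: float = 1e-6) -> float:
    """Return the standard deviation of the per-segment ratios."""
    n = min(len(expected_brightness), len(average_intensity))
    ratios = [
        expected_brightness[i] / (average_intensity[i] + eps)
        for i in range(n)
    ]
    # A fairly constant ratio (low standard deviation) indicates that the
    # played-back video tracks the content verification record.
    return pstdev(ratios)
```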

[92] By calculating a series of ratios and then analyzing the series of ratios in this example embodiment, various factors can be fully or partially abstracted out. For example, the light output of the projector may vary over time and between different locations; the light transmission characteristics of a display screen may vary between different locations; and the placement of the sensor 142 behind display screen 134 may vary between different locations. Each of these variations may tend to have a relatively consistent proportional effect on the amount of light that is incident on the sensor in a particular installation or embodiment, as illustrated in systems 100A - 100C. Such variations may uniformly affect the ratio of the expected brightness values and the average light intensity values, allowing the series of ratios to be analyzed to obtain a useful video correlation index.

[93] Playback monitor controller 140 provides the video correlation index 156 to the content server 102. The video correlation index 156 corresponds to the degree of correlation between the expected brightness values and the average light intensity values. For example, the video correlation index may be the standard deviation of the series of ratios. In this case, a low video correlation index will correspond to a high degree of correspondence and vice versa.

[94] A high degree of correspondence indicates that the video content played back using the video playback system 106 is in fact the correct video content that corresponds to the content verification record. The video correlation index may be used to verify that a particular item of video content has been successfully played back. For example, the video correlation index may be used for control or audit purposes to verify that an advertisement or other video content has been played back at a venue.

[95] Referring again to FIG. 1A, the audio playback system includes an audio decoder 160 and one or more loudspeakers 162. Audio content (which may be part of a video file or an audio file) is transmitted by the content server 102 to the audio decoder 160. Audio decoder 160 decodes the audio content and reproduces it through loudspeaker 162.

[96] Content server 102 transmits a message 158 to playback monitor controller 140 to indicate that a particular item of audio content is being reproduced. Playback monitor controller 140 retrieves a content verification record 150 corresponding to the particular audio item. The playback monitor controller 140 samples audio using audio sensor 144. Playback monitor controller 140 compares the sampled audio to the content verification record 150 to provide an audio correlation signal.

[97] Playback monitor controller 140 can sample audio by taking sound measurement levels from audio sensor 144. Playback monitor controller 140 can perform sound level computations at each of one or more sampling windows. Playback monitor controller 140 can track sound level changes for audience noise rejection. Playback monitor controller 140 can determine if sound levels indicate audio from loudspeaker 162. Audio sensor 144 can operate at a sound sampling rate of 22 kHz, at which playback monitor controller 140 can obtain pulse density data, compute a filtered waveform, and store the resulting data.
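
One plausible form of the per-window sound level computation is an RMS level per window, sketched below. The RMS measure, the window length, and the decibel conversion are assumptions; the description states only that sound level computations are performed for one or more sampling windows at a 22 kHz sampling rate.

```python
# Illustrative sketch: compute one relative sound level per sampling window
# from raw audio samples. Window length and dB conversion are assumptions.

import math
from typing import List


def sound_levels(samples: List[float],
                 sample_rate_hz: int = 22_000,
                 window_s: float = 0.5) -> List[float]:
    """Return one relative sound level (dB-like) per sampling window."""
    window_len = int(sample_rate_hz * window_s)
    levels = []
    for start in range(0, len(samples), window_len):
        window = samples[start:start + window_len]
        if not window:
            continue
        rms = math.sqrt(sum(x * x for x in window) / len(window))
        levels.append(20.0 * math.log10(rms + 1e-12))
    return levels
```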

[98] In one embodiment, playback monitor controller 140 can detect sound from loudspeaker 162 using a volume-level approach. This approach rejects background noise in order to eliminate false positive reporting of sound playing in a venue that could be triggered by background audience noise in the pre-show environment. Playback monitor controller 140 can look for a slowly ramping noise level to set a noise floor, as might happen when an audience enters a venue and creates a steady background chatter. Playback monitor controller 140 identifies when this chatter has a sharp fall-off as the show starts, allowing it to reset the expected noise floor used to reject the background noise. In addition, the lighting condition of the venue can be used to adjust or gate sound measurements.
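
The noise-floor behaviour described above (a slowly ramping floor while the audience chatters, reset on the sharp fall-off at show start) could be sketched as follows; the smoothing constant and the decibel thresholds are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative sketch of the volume-level approach: track a slowly ramping
# noise floor and reset it on a sharp fall-off when the show starts.


class NoiseFloorTracker:
    def __init__(self, ramp_alpha: float = 0.02, dropoff_db: float = 10.0):
        self.ramp_alpha = ramp_alpha      # slow adaptation to rising chatter
        self.dropoff_db = dropoff_db      # sharp fall-off marking show start
        self.noise_floor = None

    def update(self, level_db: float) -> bool:
        """Feed one windowed sound level; return True if loudspeaker audio is likely."""
        if self.noise_floor is None:
            self.noise_floor = level_db
            return False
        if level_db < self.noise_floor - self.dropoff_db:
            # Sharp drop: the audience quietens as the show starts, so reset
            # the expected noise floor used to reject background noise.
            self.noise_floor = level_db
        elif level_db > self.noise_floor:
            # Slowly ramp the floor upward with growing background chatter.
            self.noise_floor += self.ramp_alpha * (level_db - self.noise_floor)
        return level_db > self.noise_floor + self.dropoff_db
```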

[99] When a video file containing both video and audio content is reproduced, both the video correlation signal and the audio correlation signal may be used to verify the successful playback or reproduction of the video and audio content.

[100] Reference is next made to FIG. 5, which illustrates a block diagram of a playback monitoring system 110 in accordance with an example embodiment. The playback monitoring system 110 may be stand-alone equipment comprising the various components required to monitor the playback of recorded content as discussed in the various embodiments herein.

[101] The system 110 comprises a memory unit 518 and a processing unit 504 coupled to the memory unit 518. The processing unit 504 is analogous to playback monitor controller 140 of FIGS. 1A-1C.

[102] The system 110 comprises a video sensor 142 and an audio sensor 144, which communicate with processing unit 504. The structure and functionality of the video sensor 142 and the audio sensor 144 are analogous to the various examples of video sensors 142 and audio sensors 144 discussed herein.

[103] As shown, an input/output interface 524 can be present to receive, for example, a content verification record 150 from a data storage system 104 and communicate it to memory unit 518.

[104] The system 110 can have an output unit 506 that provides the video correlation index 156 and/or an audio correlation signal, as generated by the processing unit 504. A wireless unit 514 can optionally be connected to processing unit 504 in order to wirelessly send data or signals to, or receive them from, any other inputs, outputs, units, interfaces, or sensors.

[105] The system 110 is provided as an example and there can be other embodiments of the system 110 with different components or a different configuration of the components described herein. The system 110 further includes several power supplies (not shown) connected to various components of the system 110, as is commonly known to those skilled in the art.

[106] Reference is next made to FIG. 6, illustrating an example embodiment of a method 600 of monitoring playback of a content file in accordance with the teachings herein. Method 600 is carried out by the various modules of the playback monitoring system as discussed herein.

[107] In the various embodiments disclosed herein, the playback monitoring system is located at a venue. Also located at the venue is a content playback device, such as a projector, a loudspeaker, or a display screen (e.g. LCD, LED, monitor, etc.), that is configured to play the audio and video associated with a content file. The video of the content file is displayed on a display screen, such as the display screen 134 of FIGS. 1A - 1C, in the venue such that it can be viewed by the audience at the venue. Similarly, audio associated with the content file is provided within the venue so that it is accessible to the audience at the venue.

[108] At 605, presence of light corresponding to the video associated with the content file being played back on the display screen 134 at the venue is detected. One or more video sensors, such as video sensors 142 of FIGS. 1A - 1C, are configured to detect the presence or absence of light being received from the display screen 134. In some cases, the video sensor or sensors are located behind the display screen 134, as discussed in the embodiment of FIG. 2A. In some other cases, the video sensor or sensors are located in front of the display screen 134, as discussed in the embodiment of FIG. 2B.

[109] At 610, presence of sound corresponding to the audio associated with the content file is detected. The presence or absence of sound is detected by one or more audio sensors located at the venue. In some cases, the audio sensor or sensors may be located at the same location as the video sensor. In some other cases, the audio sensor or sensors are located away from the video sensor. In some further cases, multiple audio sensors are located at different locations throughout the venue.

[110] At 615, a light presence signal is generated by a processing unit which is coupled to the one or more video sensors and one or more audio sensors. The light presence signal is based on the detection of presence of light by the one or more video sensors.

[111] At 620, a sound presence signal is generated by the processing unit. The sound presence signal is based on the detection of sound by the one or more sound sensors within the venue.

[112] In various embodiments, the one or more video sensors and the one or more audio sensors transmit the detected presence or absence of light and sound information to a playback monitor controller, such as the playback monitor controller 140 of FIGS. 1A - 1C. The playback monitor controller 140 is configured to generate the associated light presence signal and the sound presence signal based on the received information. The light presence signal thus generated may be expressed in any form, for example as a value in a range between two numbers, such as 0 and 1, where 0 indicates absence of light and 1 indicates presence of light. In some other cases, the range may be between 0 and 63, where 0 indicates absence of light and 63 indicates presence of light. The sound presence signal may be similarly represented, where a lower value indicates absence of sound and a higher value indicates presence of sound.
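As a hedged illustration of expressing the presence signal on such a numeric range, the following sketch maps an assumed raw sensor reading onto a 0 to 63 scale; the reading, its full-scale value, and the function name are assumptions made only for this example.

    def light_presence_value(sensor_reading, sensor_max, scale_max=63):
        # Clamp the raw reading and map it onto the chosen presence scale,
        # where 0 indicates absence of light and scale_max indicates presence.
        clamped = max(0.0, min(sensor_reading, sensor_max))
        return round(clamped / sensor_max * scale_max)

    print(light_presence_value(0, 1024))     # 0  -> absence of light
    print(light_presence_value(1024, 1024))  # 63 -> presence of light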

[113] The light and sound presence signals thus generated may be stored locally, or transmitted to an external server for further analysis and processing. For example, an operator may be able to view the light and sound presence signals and determine whether the entire content playback, or parts of it, was unsuccessful. The operator can then address the problem accordingly.

[114] In some cases, the playback monitor controller 140 is configured to provide alerts, such as alarms or reports, to indicate the absence of light and sound at the venue based on the light and sound presence signals. For example, in some cases, the playback monitor controller 140 is configured to compare the light presence signal to a corresponding predetermined threshold value. If the light presence signal is below the predetermined threshold, the playback monitor controller 140 may be configured to trigger an alert.
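A minimal sketch of such a threshold check is shown below; the threshold value and the textual alert are assumptions used only to illustrate the comparison.

    LIGHT_PRESENCE_THRESHOLD = 8   # assumed value on a 0..63 presence scale

    def check_light_presence(presence_value):
        # Trigger an alert when the light presence signal falls below the
        # predetermined threshold; otherwise report normal operation.
        if presence_value < LIGHT_PRESENCE_THRESHOLD:
            return "ALERT: light presence below threshold"
        return "OK"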

[115] Reference is next made to FIG. 7, illustrating an example embodiment of a method 700 of monitoring playback of a content file in accordance with the teachings herein. Method 700 is carried out by the various modules of the playback monitoring system as discussed herein. The embodiment of FIG. 7 relates to a content file that is a video file.

[116] At 705, light intensity associated with the video being displayed on the display screen 134 is detected. One or more video sensors, such as video sensors 142 of FIGS. 1A - 1C, are configured to detect the light intensity associated with the video being displayed on the display screen 134. In some cases, the video sensor 142 is configured to determine the luminosity or brightness value associated with the video being displayed on the display screen 134.

[117] At 710, a light intensity signal corresponding to the detected light intensity is generated. The light intensity signal is generated by a playback monitor controller, such as controller 140 of FIGS. 1A - 1C, and comprises a plurality of light intensity values. Each light intensity value at a given time corresponds to the light intensity associated with the image on the display screen 134 at that time.

[118] At 715, the playback monitor controller 140 is configured to receive a content verification record. In some cases, the content verification record is stored locally within the playback monitoring system 110. In some other cases, the content verification record is received from a data storage server that may be located either locally within the venue or remotely from the venue.

[119] The content verification record corresponds to the content file being played back at the venue. In particular, in this embodiment, the content verification record includes a plurality of target brightness values associated with the video being displayed on the display screen 134. In most cases, the target brightness values are generated on a scale that may be based on the frame rate, time, or some other unit. For example, the target brightness values may be generated on a time scale such that each target brightness value corresponds to the time at which the corresponding portion of the video is displayed. In another example, the target brightness values may be generated on a frame scale such that each target brightness value corresponds to a frame of the corresponding video.

[120] In most cases, the playback monitor controller 140 generates the light intensity signal on the same scale as the content verification record. In some other cases, where the light intensity signal is generated on a different scale or unit than the content verification record, one of the two is converted to the scale or unit of the other. In some further cases, where the light intensity signal is generated on a different scale or unit than the content verification record, both are converted to another common scale or unit.
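One way such a scale conversion could be done is sketched below, converting per-frame target brightness values onto an assumed time scale; the frame rate, sampling interval, and function name are illustrative assumptions rather than parameters of the embodiments.

    FRAME_RATE = 24.0        # frames per second (assumed)
    SAMPLE_INTERVAL = 0.5    # seconds between light intensity samples (assumed)

    def per_frame_to_time_scale(frame_values, duration_seconds):
        # Pick, for each sampling instant, the target brightness value of the
        # frame that is on screen at that time.
        if not frame_values:
            return []
        converted = []
        t = 0.0
        while t < duration_seconds:
            frame_index = min(int(t * FRAME_RATE), len(frame_values) - 1)
            converted.append(frame_values[frame_index])
            t += SAMPLE_INTERVAL
        return converted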

[121] In most cases, the content verification record is generated ahead of the playback. For example, the content verification record may be generated by a supervised playback of the content item at the venue. Alternatively, the supervised playback may be carried out at a different location and adjusted for characteristics of the venue where the subsequent playback is carried out.

[122] In some other cases, the content verification record is estimated algorithmically, and adjusted for characteristics of different venues. Such an estimated content verification record may be additionally adjusted based on time of day or other occasions to take into account any unusual yet predictable sound or light imbalances within the venue.

[123] At 720, the playback monitor controller 140 is configured to compare the plurality of brightness values within the content verification record to the plurality of light intensity values within the light intensity signal, as discussed herein.

[124] At 725, the playback monitor controller 140 is configured to generate a correlation index for the video associated with the content file based on the comparison. The correlation index is an indicator of how closely the actual brightness values of the video being displayed on the display screen 134 map to the target brightness values contained in the content verification record.
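As one illustrative comparison metric, shown here only as a sketch and not prescribed by the embodiments (which may instead use the ratio-based index described earlier), a normalized correlation coefficient between the measured and target brightness values could be computed as follows.

    import math

    def brightness_correlation(measured, target):
        # Pearson-style correlation: values near 1 indicate that the measured
        # light intensity values track the target brightness values closely.
        n = min(len(measured), len(target))
        if n == 0:
            return 0.0
        m, t = measured[:n], target[:n]
        mean_m, mean_t = sum(m) / n, sum(t) / n
        cov = sum((a - mean_m) * (b - mean_t) for a, b in zip(m, t))
        var_m = sum((a - mean_m) ** 2 for a in m)
        var_t = sum((b - mean_t) ** 2 for b in t)
        if var_m == 0 or var_t == 0:
            return 0.0
        return cov / math.sqrt(var_m * var_t)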

[125] Reference is next made to FIG. 8, illustrating an example embodiment of a method 800 of monitoring playback of a content file in accordance with the teachings herein. Method 800 is carried out by the various modules of the playback monitoring system as discussed herein. The embodiment of FIG. 8 relates to a content file that is an audio file.

[126] At 805, the sound associated with the audio file being played back within the venue is detected. The sound is detected by one or more audio sensors, such as audio sensor 144 of FIGS. 1A - 1C. In addition to detecting the sound, the audio sensor or sensors are configured to sample the sound being detected. In some cases, a sampling rate of 22 kHz is used. In some other cases, a sampling rate matching the frame rate of the video that is correlated to the audio is used. The sampled sound values reflect the volume or sound level of the audio being played within the venue.
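A hedged sketch of reducing raw audio samples to one sampled sound value per video frame (the frame-rate-matched case mentioned above) is shown below; the 22 kHz capture rate, the 24 fps frame rate, and the use of an RMS level are assumptions.

    import math

    AUDIO_RATE = 22_000   # audio samples per second (assumed)
    FRAME_RATE = 24       # video frames per second (assumed)

    def per_frame_sound_values(samples):
        # Group the raw samples into frame-sized chunks and use the RMS of
        # each chunk as that frame's sampled sound value (volume level).
        per_frame = AUDIO_RATE // FRAME_RATE
        values = []
        for start in range(0, len(samples) - per_frame + 1, per_frame):
            chunk = samples[start:start + per_frame]
            values.append(math.sqrt(sum(s * s for s in chunk) / len(chunk)))
        return values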

[127] At 810, a sound signal corresponding to the audio being played within the venue is generated, where the sound signal comprises a plurality of sampled sound values. The sound signal is generated by a playback monitor controller, such as controller 140 of FIGS. 1A - 1C.

[128] At 815, a content verification record corresponding to the audio being played within the venue is received. The content verification record may be stored locally within the playback monitoring system, or may be received from an external server, such as a data storage system 104 of FIGS. 1A - 1C. The content verification record includes a plurality of sound levels associated with the audio being played within the venue.

[129] As discussed above, in some cases the plurality of sound levels contained within the content verification record may be based on a time scale or a different scale, and may be converted to the same scale as the sound signal generated by the playback monitor controller 140.

[130] At 820, the plurality of sound levels within the content verification record is compared to the plurality of sampled sound values within the sound signal generated by the playback monitor controller 140, as discussed herein.
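Mirroring the video case, the comparison and the resulting audio correlation index could be sketched as follows; the ratio-based form and the variable names are assumptions made only for illustration.

    import statistics

    def audio_correlation_index(sampled_levels, record_levels):
        # Ratio of each sampled sound value to the corresponding sound level
        # in the content verification record, skipping zero record levels.
        ratios = [
            s / r
            for s, r in zip(sampled_levels, record_levels)
            if r != 0
        ]
        # A small spread in the ratios indicates a close match between the
        # measured audio and the content verification record.
        return statistics.pstdev(ratios) if ratios else float("inf")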

[131] At 825, the playback monitor controller 140 is configured to generate a correlation index for the sound associated with the audio being played within the venue based on the comparison. The correlation index is an indicator of how closely the actual sound of the audio being played in the venue maps to the target sound levels contained in the content verification record.

[132] The present invention has been described here by way of example only, while numerous specific details are set forth herein in order to provide a thorough understanding of the exemplary embodiments described herein. However, it will be understood by those of ordinary skill in the art that these embodiments may, in some cases, be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the description of the embodiments. Various modifications and variations may be made to these exemplary embodiments without departing from the spirit and scope of the invention, which is limited only by the appended claims.