Title:
SYNCHRONIZED MEDIA CONTENT ON A PLURALITY OF DISPLAY SYSTEMS IN AN IMMERSIVE MEDIA SYSTEM
Document Type and Number:
WIPO Patent Application WO/2017/172514
Kind Code:
A1
Abstract:
An immersive media system is disclosed. The immersive media system includes a master display system and a plurality of slave display systems. The master display system synchronizes the display of a video with the plurality of slave display systems using a synchronization signal. The synchronization signal is sequentially transmitted from the master display system to each of the slave display systems, wherein the display systems are serially connected to one another. Sub-frame video synchronization is achieved using the sequentially chained display systems.

Inventors:
GOCKE ALEXANDER WILLIAM (US)
DUYVEJONCK DIEGO (US)
STREMPLE SCOTT (US)
DELVAUX JEROME (US)
CAPPON EMMANUEL (US)
Application Number:
PCT/US2017/024003
Publication Date:
October 05, 2017
Filing Date:
March 24, 2017
Assignee:
BARCO INC (US)
International Classes:
H04N5/04
Foreign References:
US20120159026A1 (2012-06-21)
US6122000A (2000-09-19)
US20160080710A1 (2016-03-17)
US20030179782A1 (2003-09-25)
US20090036159A1 (2009-02-05)
US20140152784A1 (2014-06-05)
US20150348558A1 (2015-12-03)
Attorney, Agent or Firm:
ALTMAN, Daniel, E. (US)
Claims:
WHAT IS CLAIMED IS:

1. An immersive media system comprising:

a master display system comprising a master media server and a master screen, the master media server configured to:

generate a synchronization signal comprising a waveform encoding synchronization information;

transmit the synchronization signal over a first synchronization cable electrically coupled to the master display system;

provide a master video; and

send the master video to the master screen for display;

a first slave display system comprising a first slave media server and a first slave screen, the first slave media server configured to:

receive the synchronization signal from the master display system over the first synchronization cable electrically coupled to the first slave display system;

transmit the synchronization signal over a second synchronization cable electrically coupled to the first slave display system;

extract the synchronization information;

provide a first slave video; and

send the first slave video to the first slave screen for display;

a second slave display system comprising a second slave media server and a second slave screen, the second slave media server configured to:

receive the synchronization signal from the first slave display system over the second synchronization cable electrically coupled to the second slave display system;

extract the synchronization information;

provide a second slave video; and

send the second slave video to the second slave screen for display; and

a networked connection communicably coupling the master display system to the first slave display system and to the second slave display system;

wherein the master video, the first slave video, and the second slave video are synchronized based at least in part on the synchronization information.

2. The immersive media system of Claim 1, wherein the synchronization signal is encoded using a biphase mark code.

3. The immersive media system of Claim 1, wherein the synchronization signal comprises a data word of 64 bits.

4. The immersive media system of Claim 1, wherein the first and second synchronization cables are coaxial cables.

5. The immersive media system of Claim 1, wherein a frame rate of each of the master video, the first slave video, and the second slave video is the same.

6. The immersive media system of Claim 5, wherein the frame rate is 30 fps.

7. The immersive media system of Claim 1, wherein the master video is extracted from a digital cinema package.

8. The immersive media system of Claim 7, wherein the first slave video is extracted from the digital cinema package.

9. The immersive media system of Claim 7, wherein the first slave video is extracted from a second digital cinema package.

10. The immersive media system of Claim 7, wherein the second slave video is extracted from the digital cinema package.

11. The immersive media system of Claim 1, further comprising a network connection communicably coupling the master display system to the first slave display system.

12. The immersive media system of Claim 11, wherein the network connection further communicably couples the master display system to the second slave display system.

13. The immersive media system of Claim 1, wherein a data rate of the synchronization signal is about 2 Mbps.

14. The immersive media system of Claim 1, wherein a latency between the master video and the second slave video is less than 100 μs.

15. The immersive media system of Claim 1, wherein a latency between the master video and the second slave video is less than 0.01% of a frame rate of the master video.

16. A slave display system in an immersive media system, the slave display system comprising:

a media module configured to provide a video comprising a plurality of video frames;

a synchronization in connector configured to receive a synchronization signal from a first synchronization cable electrically coupled to the synchronization in connector; and

a synchronization out connector electrically coupled to the synchronization in connector, the synchronization out connector configured to transmit the synchronization signal to a second synchronization cable electrically coupled to the synchronization out connector;

a synchronization module electrically coupled to the synchronization in connector, the synchronization module configured to

extract synchronization information from the received synchronization signal; and

provide the synchronization information to the media module, wherein the media module provides a video frame of the video based at least in part on the synchronization information,

wherein the provided video frame is synchronized with a video of another display system in the immersive media system.

17. The slave display system of Claim 16, wherein the synchronization in connector is electrically coupled to the synchronization out connector through active electronics.

18. The slave display system of Claim 16, further comprising a screen configured to display the provided video frame.

19. The slave display system of Claim 16, wherein the synchronization signal is formatted according to the AES3 standard.

20. The slave display system of Claim 16, wherein the media module is further configured to extract the video from a digital cinema package.

21. The slave display system of Claim 20, wherein at least a portion of the digital cinema package is stored on the slave display system.

22. An immersive media system, comprising:

a server configured to store cinema content;

a server node connected to the server, wherein the server node is configured to receive the cinema content and distribute the cinema content; and

a plurality of display systems, each display system comprising an Integrated Cinema Media Processor configured to receive the cinema content from the server node, and each display system configured to project a video based on at least the received cinema content.

23. The immersive media system of Claim 22, further comprising a network attached storage coupled to the server.

24. The immersive media system of Claim 22, wherein the plurality of display systems comprises a master display system, a first slave display system, and a second slave display system, wherein the master display system is connected by a first cable to the first slave display system, and the first slave display system is connected by a second cable to the second slave display system.

25. The immersive media system of Claim 24, wherein the master display system is configured to receive commands from a user interface.

Description:
SYNCHRONIZED MEDIA CONTENT ON A PLURALITY OF DISPLAY SYSTEMS

IN AN IMMERSIVE MEDIA SYSTEM

BACKGROUND

[0001] The present disclosure generally relates to systems and methods of an immersive media system that provides synchronized video on a plurality of display systems.

[0002] Media content can be delivered to homes and other venues for viewing. For example, in many households, content is delivered via cable, the internet, and/or antenna.

SUMMARY

[0003] Example embodiments described herein have innovative features, no single one of which is indispensable or solely responsible for their desirable attributes. Without limiting the scope of the claims, some of the advantageous features will now be summarized.

[0004] An immersive media system can include a plurality of display systems arranged to provide immersive viewing of video. For example, an immersive media system can comprise a plurality of television screens arranged around a viewer and/or audience. In this way, the viewer/audience can experience a sense of immersion into the environment depicted in the video. Synchronized video provided by the plurality of television screens can create a unified video presentation. Such immersive media systems are capable of generating audiovisual presentations with a relatively high level of realism due at least in part to video being simultaneously presented to the viewer from many directions.

[0005] Typically, media content is provided for a single display system for viewing. Such single-screen display systems are not configured to provide multi-view content (e.g., media streams designed to be shown on a plurality of screens). Combining a plurality of such single-screen display systems to enable presentation of multi-view content to create an immersive media system presents a number of challenges. For example, to provide an immersive audiovisual environment it can be important to reduce or eliminate issues that may destroy the immersive quality of the experience for the viewer. In particular, if video from different screens is not synchronized, a viewer can become disoriented, distracted, or can otherwise lose a sense of immersion in the environment. Combining single-screen display systems can result in video synchronization issues because such display systems may not be configured to synchronize video with other display systems. Thus, attempts to convert single-screen display systems to be part of an immersive media system can result in out-of-sync video on different displays, reducing viewer enjoyment and satisfaction.

[0006] Moreover, in certain venues, such as in a home, media content can be streamed to a single receiving device (e.g., a set-top box) via cable, internet, antenna, etc. As such, another challenge of providing media content to the plurality of display systems is distributing the data to each of them.

[0007] Accordingly, systems and methods are provided herein for providing synchronized media streams from a plurality of display systems. In particular, a master display system can generate a synchronization signal based at least in part on the media stream provided to the master display system and transmit the synchronization signal serially to a plurality of slave display systems (e.g., creating a chain of displays). In succession, each slave display system in the chain receiving the synchronization signal can (1) pass the signal to a subsequent slave display system and (2) process the synchronization signal to determine when to display a video frame so that it is synchronized with the video of the master display system. The displays can thus be connected in serial, or chained together, to synchronize the media streams of each, the synchronization signal being provided by the master display system.

[0008] In some implementations, an immersive media system comprises at least 3 displays that are sequentially chained. Media content can be downloaded to each display. A master display creates and transmits a synchronization signal to enable synchronous projection of video content by all display systems with sub-frame accuracy. The synchronization signal gets passed sequentially among the at least 3 chained displays.

[0009] Advantageously, a sequentially chained display system can utilize a wireless connection (e.g., using a wireless networking protocol such as IEEE 802.11n) to transmit a synchronization signal derived from standard timecodes used in media environments. Using sequentially chained display systems can also simplify the generation and transmission of the synchronization signal. For example, the serial wireless connection design can reduce or eliminate the need for signal amplification relative to an immersive media system employing a parallel connection infrastructure. Similarly, the serial connection design reduces or eliminates a need for a centralized distribution system to distribute the synchronization signal to each display system relative to an immersive media system employing a parallel connection infrastructure. Moreover, the serial connection design enables flexibility in the number of display systems in the immersive media system because the addition of another display simply entails adding another link in the display system chain. This can provide an advantage over a parallel connection design as a maximum number of display systems can be reached in a parallel system when the synchronization signal distribution system runs out of available connection points. The serial connection design also can result in relatively small latency between display systems. The synchronization signal can also enable synchronization of video with different frame rates, aspect ratios, codecs, or the like due at least in part to the synchronization signal being independent of these parameters. For example, a master display system can generate a synchronization signal based at least in part on a signal coming from its framebuffer and a slave display system can synchronize its video based at least in part on regulating the signal output of its framebuffer according to information in the synchronization signal.

[0010] In some implementations, multi-view content can be packaged for digital delivery and ingested by a receiving device at the venue, wherein the package comprises a plurality of channels of video to be displayed by a corresponding plurality of display systems. In some embodiments, each of the video channels can conform to standard digital content packaging formats (e.g., MKV, MP4, XVID, AVI, WMV, MOV, and/or other media formats). A receiving device (e.g., a set-top box) can receive the package, extract the digital files, and distribute the video channels to the master display system and/or other display systems in the immersive media system. In some embodiments, the master display system can selectively distribute video data to the display intended to play the video data. In some embodiments, each display in the immersive media system ingests the entire package and is configured to determine the appropriate video channel to decode and display. In some embodiments, the master display system is configured to automatically determine the appropriate slave display system for each video channel in the package and to transmit the video channel data to the slave display, where transmission can occur prior to presentation of the multi-view content, during playback with the slave display system buffering the video channel data, or in real time as the video channel data is delivered and displayed.
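
By way of illustration, and without limiting embodiments of this disclosure, the following sketch (in Python, for concreteness) shows one way the channel-to-display routing described above could be represented. The manifest layout and all names are hypothetical; the disclosure does not prescribe a package format.

```python
# Hypothetical manifest mapping each video channel in a multi-view
# package to the display system intended to play it.
package = {
    "center": "feature_center.mp4",   # master display system
    "left":   "feature_left.mp4",     # first slave display system
    "right":  "feature_right.mp4",    # second slave display system
}

display_roles = {"master": "center", "slave_1": "left", "slave_2": "right"}

def channel_for(display_id: str) -> str:
    """Return the only channel a given display should ingest, mirroring
    the selective distribution described in the paragraph above."""
    return package[display_roles[display_id]]

assert channel_for("slave_1") == "feature_left.mp4"
```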

[0011] In some embodiments, the master display system includes hardware and/or software components that distinguish it from slave display systems. For example, a slave display system can include a synchronization module or card that allows it to frame-lock the video presentation based at least in part on the synchronization signal originating from the master display. In some embodiments, the master display system and slave display system contain the same hardware and/or software components, but the master display system is configured to act as the master while other display systems are configured to act as slaves. In some embodiments, a display system comprises a media server and display screen integrated into a single physical unit. In some embodiments, a display system comprises a media server and a display screen that are physically separate and communicatively coupled to one another (e.g., through a wired or wireless connection).

BRIEF DESCRIPTION OF THE DRAWINGS

[0012] Various embodiments are depicted in the accompanying drawings for illustrative purposes, and should in no way be interpreted as limiting the scope of the inventions. In addition, various features of different disclosed embodiments can be combined to form additional embodiments, which are part of this disclosure. Any feature or structure can be removed or omitted. Throughout the drawings, reference numbers can be reused to indicate correspondence between reference elements.

[0013] FIG. 1 illustrates an example method of generating media content, delivering the media content to a home or other venue, receiving and unpacking the content, sending the content to a first display system, and then sending the content to display systems in a chained sequence.

[0014] FIG. 2 illustrates an example immersive media system having a plurality of display systems.

[0015] FIG. 3 illustrates a plurality of example display systems ingesting digital content for viewing.

[0016] FIG. 4 illustrates a block diagram of an example media server for a display system.

[0017] FIG. 5 illustrates a flow chart of an example method of synchronizing multiple media streams in serially connected media servers in an immersive media system.

[0018] FIG. 6 illustrates a flow chart of an example method of synchronizing a slave video with a master video based at least in part on a synchronization signal from a master display system.

DETAILED DESCRIPTION

[0019] Although certain embodiments and examples are disclosed herein, inventive subject matter extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses, and to modifications and equivalents thereof. Thus, the scope of the claims appended hereto is not limited by any of the particular embodiments described below. For example, in any method or process disclosed herein, the acts or operations of the method or process can be performed in any suitable sequence and are not necessarily limited to any particular disclosed sequence. Various operations can be described as multiple discrete operations in turn, in a manner that can be helpful in understanding certain embodiments; however, the order of description should not be construed to imply that these operations are order dependent. Additionally, the structures described herein can be embodied as integrated components or as separate components. For purposes of comparing various embodiments, certain aspects and advantages of these embodiments are described. Not necessarily all such aspects or advantages are achieved by any particular embodiment. Thus, for example, various embodiments can be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other aspects or advantages as can also be taught or suggested herein.

[0020] FIG. 1 illustrates an example method of generating media content, delivering the media content to a home or other venue, receiving and unpacking the content, sending the content to a first display system, and then sending a synchronization signal to display systems in a chained sequence.

[0021] Block 100 shows generating media content. The media content can be generated by a service provider (e.g., a cable company, movie distributor, internet company, streaming service, etc.). The media content can be in a variety of formats including MKV, MP4, XVID, AVI, WMV, MOV, and/or other media formats. The media content can also be formatted for display in a variety of resolutions including 4K (e.g., 3636x2664, 3996x2160, 3840x2160, 4096x2160, etc.), 2K (e.g., 1828x1332, 1998x1080), HD (e.g., 1920x1080, 1280x720), SD (640x480) or the like. The media content can also be in a variety of frame rates including, for example and without limitation, 24 fps, 30 fps, 60 fps, 120 fps, etc.

[0022] The media content can also undergo additional compression during generation. For example, and without limitation, color information in 4K video can be encoded using chroma subsampling and other techniques in order to enable transmission of the video using lower bandwidths.

[0023] In some embodiments, compression of the media content can be achieved by utilizing a lower resolution signal (e.g., lower resolution than 4K resolution). As an illustrative example, the lower resolution signal can be 2K, HD, SD, or the like. The lower resolution signal can comprise video frames, where each video frame comprises pixels for displaying the video frame. When viewed in a sequence, the video frames can appear as a movie. Each pixel can have a color. For example, and without limitation, each entry can comprise three values representing the relative red, green, and blue composition of a pixel. In other cases, each entry can be a scaled number between zero (0) and one (1), reflecting a color and/or color intensity. In other cases, each entry can be a number in any range of numbers reflecting different colors. In other cases, the entries can be letters and/or numbers that correspond to particular colors and/or color intensities. A person having ordinary skill in the art should appreciate that the color of a pixel can be represented in any number of ways. In this disclosure, any way of representing the color, color intensity, and/or any other attribute of a pixel (e.g., hue, contrast, brightness, shade, grayscale, etc.) will be called the "color" of the pixel. The visual combination of the pixel colors can create the image perceived by a viewer.

[0024] In some embodiments, using 2K signals as the lower resolution signal may be desirable because 2K signals require less upscaling to achieve 4K resolution and because the lower resolution signal can be smaller in size (e.g., smaller in bytes, megabytes, gigabytes, terabytes, and the like to transmit or store the signal) than a 4K signal. An up-conversion signal can be transmitted along with the lower resolution signal in order to upscale the lower resolution signal to 4K resolution. The up-conversion signal can be a separate signal or integrated into the lower resolution signal itself. In some embodiments, the up-conversion signal can contain instructions for each pixel and/or groups of pixels in each frame to upscale the signal, such as by interpolating additional pixels between pixels of the lower resolution signal. By way of illustration, and without limiting embodiments of this disclosure, a 100 x 200 pixel frame can be upscaled by a factor of two, creating a 200 x 400 pixel frame. In this example, between each adjacent pixel of the original 100 x 200 pixel frame, an additional pixel can be interpolated so that the frame enlarges to 200 x 400 pixels.
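
By way of illustration, and without limiting embodiments of this disclosure, the following sketch implements the factor-of-two upscale in the example above (a 100 x 200 frame enlarged to 200 x 400 by interpolating pixels between neighbors). The function names are illustrative, and edge pixels wrap for brevity.

```python
import numpy as np

def upscale_2x_nearest(frame: np.ndarray) -> np.ndarray:
    """Nearest-neighbor 2x upscale: each source pixel is repeated once
    in both dimensions (the simplest technique named in [0026])."""
    return frame.repeat(2, axis=0).repeat(2, axis=1)

def upscale_2x_bilinear(frame: np.ndarray) -> np.ndarray:
    """Bilinear 2x upscale: interpolated pixels are averages of adjacent
    source pixels along each axis (edges wrap for brevity)."""
    h, w = frame.shape[:2]
    out = np.zeros((2 * h, 2 * w) + frame.shape[2:], dtype=np.float32)
    out[::2, ::2] = frame  # keep the original pixels on the even grid
    out[1::2, ::2] = (out[::2, ::2] + np.roll(out[::2, ::2], -1, axis=0)) / 2
    out[:, 1::2] = (out[:, ::2] + np.roll(out[:, ::2], -1, axis=1)) / 2
    return out

lowres = np.random.rand(100, 200, 3).astype(np.float32)  # 100 x 200 RGB frame
hires = upscale_2x_bilinear(lowres)
assert hires.shape == (200, 400, 3)
```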

[0025] In some embodiments, the lower resolution signal can be upscaled by a receiving device. The receiving device processes the up-conversion signal, which instructs the receiving device to determine the color of an interpolated pixel based on pixels in other frames and/or other locations within the same frame as the interpolated pixel.

[0026] Different methods of interpolation can be used as desired. Some examples of techniques are nearest-neighbor interpolation, bilinear interpolation, bi-cubic interpolation, and/or directional upscaling. In some embodiments, more holistic methods can be implemented, such as, without limitation, comparing blocks (e.g., sections of adjacent pixels in a frame) or pixels along visual lines (e.g., edges) in a frame to find the color of an interpolated pixel.

[0027] In some implementations, the lower resolution signal can have frames of higher resolution mixed in. For example, and without limitation, the lower resolution signal can comprise 100 frames. The majority of the frames can have 2K resolution; however, some subset of the frames can have 4K resolution. For example, and without limitation, every fifth frame can have 4K resolution. A frame with 2K resolution can interpolate pixels based, at least in part, on pixel values of the frames that have 4K resolution. For example, and without limitation, the interpolated pixel value can be calculated relative to the corresponding pixel in the sequentially closest preceding 4K resolution frame and/or sequentially closest succeeding 4K resolution frame. The up-conversion signal can comprise instructions including how to adjust the pixel color relative to those corresponding pixels. The instructions can be pixel specific (e.g., instructions on how to change the color of the particular interpolated pixel) or for groups of pixels. For example, and without limitation, the instructions can specify how to change blocks of pixels.

[0028] Such instructions can comprise mathematical transformations as desired. For example, and without limitation, the instructions can comprise mathematical functions comprising addition, multiplication, division, subtraction, etc.
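
By way of illustration, and without limiting embodiments of this disclosure, the following sketch shows one possible form of the keyframe-relative reconstruction described above, with the up-conversion instruction reduced to a simple multiply-add. The blend formula and all names are assumptions for the sketch.

```python
import numpy as np

def reconstruct_pixel(prev_key: np.ndarray, next_key: np.ndarray,
                      t: float, gain: float, offset: float) -> np.ndarray:
    """Blend the corresponding pixels of the sequentially closest
    preceding/succeeding 4K keyframes, then apply the up-conversion
    instruction (here assumed to be a multiply-add transformation)."""
    blended = (1.0 - t) * prev_key + t * next_key   # temporal interpolation
    return gain * blended + offset                  # instruction from signal

# With a 4K keyframe every fifth frame, frame 2 sits halfway between
# keyframes 0 and 4 (t = 0.5).
px = reconstruct_pixel(np.array([0.2, 0.4, 0.6]),
                       np.array([0.4, 0.6, 0.8]),
                       t=0.5, gain=1.0, offset=0.0)
```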

[0029] The media content can be made available in any number of mediums including DVD, Blu-Ray, Redray, hard drive/media drive, internet, broadcast signal, etc. The content can be formatted to be played by a receiving device, such as a set-top box.

[0030] The video content can further encode content for a plurality of display systems, including 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, or more display systems.

[0031] Block 101 shows delivering media content to home or other venue. Delivery can be accomplished by physical delivery of a disk (e.g., DVD, Blu-Ray, Redray), hard drive/media drive, or other medium. It can also be accomplished by sending information over telecommunication lines, cable lines, wirelessly, and/or through a broadcast signal.

[0032] In block 102, the delivered media content is received and unpacked by a receiving device. For example, a set-top box can be configured to decode, decompress, and/or ingest the media content for viewing.

[0033] In block 103, the media content is sent to the plurality of display systems. In some embodiments, the media content can be sent to just a master display system. The master display system can then connect to one or more slave display systems to distribute all or a portion of the media content to each of those connected display systems. The media content can be distributed over cables that have signal lines and ground lines, such as coaxial cables, Ethernet cables, HDMI cables, and the like. It is to be understood that other cabling options are within the scope of this disclosure including, for example and without limitation, serial cables, twisted pair cables, USB cables, and the like. Moreover, data can be transferred using a 1000BASE-T Gigabit Ethernet transceiver and/or any cable and/or component conforming to IEEE's Gigabit Ethernet standard. Additionally, cables can be replaced by wireless transmission (e.g., using a wireless networking protocol such as IEEE 802.11n). In some embodiments, the receiving device can distribute all or a portion of the media content to each of the display systems.

[0034] Each display system can optionally include a media server configured to receive and transmit media content. The media server can also be configured to receive and transmit synchronization signals.

[0035] In block 104, a synchronization signal is sent from the first display system to the second display system. The signal can be sent by wireless transmission using a wireless networking protocol such as IEEE 802.11n. It can also be sent over cables that have signal lines and ground lines, such as coaxial cables, Ethernet cables, HDMI cables, and the like.

[0036] FIG. 2 illustrates an example immersive media system having a plurality of display systems. The immersive media system 200 comprises a plurality of display systems 204a-c, configured to display media content for providing an immersive media experience. For example and without limitation, display systems 204a-c can comprise multiple direct-view displays, multiple rear-projection displays, multiple front-projection displays, liquid-crystal displays (LCDs), light-emitting diode (LED) displays, LED LCD displays, in-plane switching panels (IPSs), cathode ray tubes, plasma displays, ultra high definition (UHD) panels, 4K displays, retina displays, organic LED displays, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation. There can be gaps between adjacent displays. For example, displays 204a-c can have gaps between them as depicted in FIG. 2. In some embodiments, the gaps can be relatively small, close to zero, or zero. The immersive media system 200 can include a plurality of flat or curved displays or screens or it can include a single curved display or screen. The displays can be rotated relative to one another. Display systems 204a-c can also have respective inclinations relative to one another. The screens 204a-c of the immersive media system 200 can include flat screens, curved screens, or a combination of both.

[0037] The viewer can be positioned so as to view displays 204a-c. For example, viewers can be located at seating 201. The example immersive media system 200 includes display systems 204a-c configured to show video. Sound systems can be included in display systems 204a-c and/or otherwise positioned to provide an immersive experience.

[0038] In some embodiments, a media server is physically separate from the display and is communicably coupled (e.g., through wired or wireless connections) to the display. In some embodiments, the display comprises an integrated media server. The media server can include hardware and software components configured to receive, store, and decode media content. The media server can include hardware and software configured to ingest and decode digital content files, to produce a media stream (e.g., video and audio), and to send image data to the display. The media server can include modules for ingesting digital content, decoding ingested content, generating video from the decoded content, generating audio from the decoded content, providing security credentials to access secure content, generating or interpreting synchronization signals to provide a synchronized presentation, and the like.

[0039] As illustrated in FIG. 2, the display system 204a can be configured to be the master display system. As used herein, the master display system or the master media server provides the synchronization signal to which the slave display systems synchronize their output. The master display system 204a ingests, decodes, and/or provides the main audiovisual presentation in the immersive media system 200. Display systems 204b and 204c are slave display systems. As used herein, a slave display system or slave media server provides images synchronized to the master system wherein synchronization is based at least in part on the synchronization signal provided by the master display system. A slave display system can provide video that is projected peripheral, adjacent, near, and/or otherwise complementary to the video provided by the master display system.

[0040] The master display system 204a transmits a synchronization signal over a cable and/or wirelessly. The synchronization signal is the same or substantially the same for all display systems to enable globally synchronized video in the immersive media system. Accordingly, due at least in part to display systems 204a-c displaying video based on the synchronization signal, a synchronized video presentation is provided. As used herein, synchronized video includes video shown from different display systems having corresponding frames that are displayed within a sufficiently small time window from one another so as to be displayed substantially simultaneously. In some embodiments, synchronized video includes video wherein corresponding frames are displayed such that a time between the display of the synchronized frames is less than or equal to about 1 ms, less than or equal to about 500 μs, less than or equal to about 350 μs, less than or equal to about 250 μs, or less than or equal to about 200 μs. Such synchronization can be referred to as having sub-frame accuracy in its synchronization. For example, for a video that has a frame rate of 30 fps (or 60 fps), each frame of video is displayed for about 33.3 ms (or 16.7 ms). Videos that are synchronized to within a fraction of the time a video frame is displayed can be said to have sub-frame accuracy. For example, sub-frame accuracy can include synchronization that has a latency between corresponding frames that is less than about 10% of the frame time, less than about 5% of the frame time, less than about 1% of the frame time, or less than about 0.1% of the frame time.
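
By way of illustration, the arithmetic behind the figures above can be checked directly. This sketch reads the percentages as fractions of the frame display time, per the preceding sentence.

```python
# Frame display times and sub-frame latency budgets quoted above.
for fps in (30, 60):
    frame_time_ms = 1000.0 / fps      # 33.3 ms at 30 fps, 16.7 ms at 60 fps
    for pct in (10, 5, 1, 0.1):
        budget_us = frame_time_ms * 1000 * pct / 100
        print(f"{fps} fps: {pct}% of frame time = {budget_us:.0f} us")
```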

[0041] In some embodiments, the master display system 204a can control display of a video in units of frames and synchronize the video frames from display systems 204b and 204c using a time code for each frame, the time code being carried by the synchronization signal, as described in greater detail herein with reference to FIG. 3. Accordingly, the display systems 204a-c can accurately synchronize the video based at least in part on the time code for each frame in the synchronization signal.

[0042] FIG. 2 illustrates 3 display systems 204a-c. However, the immersive media system 200 can include a different number of display systems. For example, immersive media system 200 can include 2, 3, 4, 5, 6, 7, 8, 9, 10, or more than 10 display systems. The immersive media system 200 can be configured such that more than one display system provides video on a single screen, such that the images substantially overlap. The immersive media system 200 can be configured such that display systems provide video on a single display screen wherein the videos from display systems minimally overlap, are adjacent to one another, or are near one another to provide a substantially unitary video presentation.

Example Display Systems

[0043] FIG. 3 illustrates a plurality of example display systems 302a-c ingesting digital content 301 for display in an immersive media system 300. The digital content 301 can be any collection of digital files that include content data and metadata that make up a composition to be displayed by the immersive media system 300. The digital content 301 can be received by receiving device 306. Receiving device 306 can process and ingest digital content 301. Receiving device 306 can also pass digital content 301 through network connection 350 in non-ingested form to the plurality of display systems 302a-c, where the digital content can be ingested at one or more of the plurality of display systems 302a-c. The media servers 310a-c can be configured to extract the appropriate data files from the ingested digital content 301 and to decode the appropriate video content to send to the corresponding display systems 302a-c. The master display system 302a can generate a first synchronization signal 304a to send over router 303. Router 303 can relay the synchronization signal to the first slave display system 302b. The first slave display system 302b can then send the synchronization signal onward, as synchronization signal 305a, which can be relayed to the second slave display system 302c as synchronization signal 305b. This method and system of using router 303 to relay synchronization signals can repeat for each display system in immersive media system 300. In this way, immersive media system 300 can display synchronized video from a plurality of display systems.

[0044] In some embodiments, the master and slave display systems 302a-c can be configured to ingest only portions of the digital content 301 intended for that particular display system. For example, a display system 302a-c can download portions of a digital package that contain the data sufficient to provide the content intended for that particular display system. In some embodiments, the master display system 302a ingests the entire digital content 301 and distributes a copy of that digital content 301 to the slave display systems 302b-c. In some implementations, after ingesting the digital content 301, the master display system 302a distributes a portion of the digital content 301 to each slave display system 302b-c wherein the portion transmitted to the slave display system contains the data sufficient for that particular slave display system to provide its audiovisual presentation. Transmission of digital content 301 can occur over the network connection 350 which can be a wired connection, a wireless connection, or a combination of both wired and wireless connections.

[0045] The master display system 302a can transmit copies of the digital content 301 or copies of portions of the digital content 301 to the slave display systems 302b-c over the network connection 350. In such circumstances, the master display system 302a can transmit the digital content 301 to the slave display systems 302b-c prior to presentation of the composition contained in the digital content 301, during presentation of the composition by buffering the data in each slave display system 302b-c, and/or during presentation of the composition in real time. In some implementations, the master display system 302a can transmit information to the slave display systems 302b-c indicating which portion of the digital content 301 that each slave display system should ingest. Based at least in part on this information, each slave display system 302b, 302c can ingest a portion of the digital content 301.

[0046] The digital content 301 can include data that is encoded and/or compressed. The digital content 301 can conform to a variety of formats and/or specifications. For example, and without limitation, digital content 301 can comprise files encoded using WMV3, WMA, VC-1 Advanced Profile, high-efficiency video coding (HEVC) (e.g., H.265), H.264, MPEG-4, VP8, VP9, Daala, Theora, and the like. The files can include Digital Rights Management ("DRM"). In some cases, the files can include video and sound for the plurality of display systems (e.g., 1, 2, 3, 4, 5, 6, 7, 8, or more display systems) in immersive media system 300. The files can be display-system specific (e.g., content for master display system 302a, slave display system 302b, or slave display system 302c) or can be run on any display system.

[0047] In some embodiments, digital content 301 can include metadata or other files that can be used to identify and/or designate files for playing on a particular display system in the immersive media system 300. In some embodiments, a playlist or directory can be used to designate files for playing on the particular display system.

[0048] In some embodiments, the metadata, other files, playlist, and/or directory can be used at least in part to implement a smart ingest function that limits ingestion of digital content 301 to the relevant portions of the digital content 301 for its intended display system. This smart ingestion can occur at the receiving device 306, wherein the appropriate content is sent to the correct display system, or can occur at each or any of the media servers of the display systems (e.g., media servers 310a-c). In certain implementations, the immersive media system 300 displays content from the display systems 302a-c blended together to accommodate a curved display screen.

[0049] In some embodiments, the immersive media system 300 can display other content synchronized to digital content 301. This can allow for dynamic content (e.g., feeds from social media, advertisements, news feeds, etc.) to be displayed along with a main video presentation (e.g., a feature film). In some embodiments, one or more of the display systems 302a-c provides the dynamic, synchronized content overlaid on the display screens of the display systems.

[0050] The systems and methods described herein can advantageously allow the synchronization of video from a plurality of display systems when the digital content 301 conforms to a single specification, multiple specifications, or a combination of a single specification and no specification. This advantage is realized due at least in part to the master display system 302a generating the synchronization signal after the media content has been decoded and/or ingested. The master display system 302a can generate an appropriate timeline and metadata independent of the format of the digital content 301 and encode that information into the synchronization signal. In some embodiments, the synchronization can be done between the video frames (e.g., line-based synchronization). For example, the master display system 302a can generate the synchronization signal after the frame buffer output in the display, prior to the showing of the frame on the display screen. Each slave display 302b-c can receive the synchronization signal and control its display of video based on the timeline and metadata in the signal. For example, slave display systems 302b-c can frame-lock to the synchronization signal at a similar hardware level to the master display system 302a (e.g., after the frame buffer and prior to the modulation chip). Thus, in some embodiments, the display systems 302a-c can be synchronized on a frame basis, frame-locking content wherein timing is linked on a frame-by-frame basis. Accordingly, the immersive media system 300 can synchronize display systems 302a-c with each other for content playback with sub-frame accuracy, even when each media server of each display system has files in a different format and/or following a different specification.

[0051] The immersive media system 300 can also synchronize video having different aspect ratios, different content formats, and/or different frame rates. For example, side screens can have a frame rate that is higher than a frame rate of the main screen or vice versa. In some embodiments, synchronization of different frame rates can occur where the differing frame rates are multiples of one another (e.g., 30 fps and 60 fps), multiples of a common frame rate (e.g., 24 fps and 60 fps are both multiples of 12), or where the data rate of the synchronization signal allows for synchronization at differing frame rates (e.g., where the base frequency of the synchronization signal is a multiple of possible frame rates). The immersive media system 300 can also synchronize video that is stereoscopic, not stereoscopic, or a combination of both.
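
By way of illustration, and without limiting embodiments of this disclosure, the common-base condition described above can be checked with a greatest-common-divisor computation; the function name is illustrative.

```python
from math import gcd

def common_base_rate(fps_a: int, fps_b: int) -> int:
    """Highest rate of which both frame rates are integer multiples;
    e.g., 24 fps and 60 fps share a common base rate of 12."""
    return gcd(fps_a, fps_b)

assert common_base_rate(24, 60) == 12   # multiples of a common rate
assert common_base_rate(30, 60) == 30   # one rate a multiple of the other
```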

[0052] The master display system 302a and the slave display systems 302b-c can be substantially identical devices. In some implementations, the user can configure the devices to assume the roles of master and slave. In certain implementations, the content ingested by the devices determines, at least in part, the role of the display system (e.g., master or slave). The immersive media system 300 can thus be configured to not include any specific main server that controls the slave display systems.

[0053] The master display system 302a can include hardware and/or software components that differentiate it from the slave display systems 302b-c. For example, the master display system 302a can include hardware and/or software specific to generating the synchronization signal. Similarly, the slave display systems 302b-c can include hardware and/or software specific to synchronizing video output based on the synchronization signal.

Example Media Server

[0054] FIG. 4 illustrates a block diagram of an example media server system 410. The media server system 410 can be a master media server or a slave media server. The media server system 410 can be configured to generate a synchronization signal (e.g., when it is a master media server system), transmit the synchronization signal (e.g., over a synchronization link such as a coaxial cable), receive a synchronization signal (e.g., when it is a slave media server system), synchronize presentation of a video based at least in part on the synchronization signal, send and receive communications over a network connection, process digital files to generate a video, provide security credentials to extract video, and the like. The media server system 410 can include hardware and software sufficient to accomplish the functionality described herein.

[0055] The media server system 410 includes a controller 401, such as a computer processor, and a data store 402, such as a non-transitory computer storage. Controller 401 can be configured to provide computational power and to direct and coordinate the execution of functions sufficient to provide the targeted and desired functionality of the media server system 410. The data store 402 can be used to store digital files, e.g., software, executable instructions, configuration settings, calibration information, and the like. In some embodiments, the media server 410 provides a user interface or a control program accessible over a network connection that allows a user or other system to provide commands to the media server system 410, to monitor a status of the media server system, and/or to request information from the media server system 410. In some embodiments, a user or other system can communicate with the master media server in an immersive media system to control all of the media servers in the immersive media system.

[0056] The media server system 410 includes a communication module 403 configured to process, send, receive, construct, and/or interpret information over a network connection, such as the network connection 350 described herein with reference to FIG. 3. For example, the communication module 403 can be configured to ingest digital content for display by an associated display. As described herein, the communication module 403 can be configured to perform a smart ingest function wherein data necessary for displaying content on the associated display is ingested and other data is not ingested. The communication module 403 can be configured to send commands to be performed by connected media servers. For example, a master media server can command one or more slave media servers to control their associated display systems by, for example, dousing the shutter or performing other similar functionality. The communication module 403 in a slave display system can be configured to receive and interpret commands received from a master display system.

[0057] The media server system 410 includes a media module 404 configured to process digital data to generate a video presentation. The media module 404 can be configured to extract packaged files from a standard format, such as a DCP package, and to provide an appropriate signal to a display screen so that the display screen displays intended video. For example, to display a feature film, the media module 404 can decompress digital files, identify an appropriate playlist file, decode associated image essence files, decode associated audio essence files, and produce a video signal that is sent to a display screen for display.

[0058] The media server system 410 includes a security module 405 configured to provide appropriate security functionality to access secure and/or encrypted digital files. The security module 405 can provide appropriate security credentials and decrypt the digital files so that the media module 404 can access the files. The security module can also provide security functionality when the video signal generated by the media module 404 is to be sent over a cable to the display screen, such as when the display screen is physically separate from the media server system 410.

[0059] The media server system 410 includes a synchronization module 406 configured to generate a synchronization signal (e.g., when the media server 410 is part of a master display system), transmit the synchronization signal (e.g., wirelessly and/or over a cable), and/or process the synchronization signal (e.g., when the media server 410 is part of a slave display system). The synchronization signal can be generated independent of synchronization information provided in the digital files related to the composition (e.g., video presentation). For example, the synchronization signal can be generated based at least in part on the video signal generated by the media module. The synchronization signal can be generated based at least in part on the output of a frame buffer in the display screen, prior to (or in parallel with) the video signal being input to the display screen.

[0060] The synchronization signal can be a waveform having information that is encoded therein. The waveform can utilize a biphase mark code to encode data (e.g., as used in AES3 and S/PDIF signals). The synchronization signal encoded with biphase mark code can be polarity insensitive which can be advantageous in an immersive media system with a plurality of display systems. The waveform can be divided into words, or groups of bits, with information encoded at particular places within the words. The waveform can have one or more encoded words. The synchronization waveform can be a modified linear time code or a modified AES3 signal. The waveform can encode SMPTE timecode data to enable synchronization of slave display systems to the master display system. The waveform can also encode commands or other information (e.g., metadata) addressed to or intended for one or more display systems.
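
By way of illustration, and without limiting embodiments of this disclosure, the following is a minimal model of biphase mark coding, the polarity-insensitive line code named above: the level toggles at every bit boundary and toggles again mid-cell for a 1, so a 0 yields two equal half-cells and a 1 two unequal ones. This sketches the coding scheme, not the full AES3 framing.

```python
def bmc_encode(bits, level=0):
    """Return (first_half, second_half) line levels for each bit."""
    cells = []
    for b in bits:
        level ^= 1                  # transition at every cell boundary
        first = level
        if b:
            level ^= 1              # extra mid-cell transition encodes a 1
        cells.append((first, level))
    return cells

cells = bmc_encode([1, 0, 1])       # [(1, 0), (1, 1), (0, 1)]
# Inverting every level yields the same bit pattern on decode, which is
# why the code is insensitive to polarity.
assert bmc_encode([1, 0, 1], level=1) == [(a ^ 1, b ^ 1) for a, b in cells]
```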

[0061] As an illustrative and non-limiting example, the synchronization signal can include two 64-bit words. The first word can include a 24-bit frame number, a valid status bit, a channel status bit, a user data bit, and a parity bit (e.g., to validate a received word). In certain implementations, an active edge in the user data bit can be used to indicate that the master display system will start the next frame. The second word can contain a command structure used by a master display system to provide commands to connected slave display systems. Additional data, such as metadata, can be included in the first or second word. For example, the second word can include a 24-bit instruction from the master display system to connected slave display systems. The metadata can be used to provide information to the slave display systems to modify their functionality. For example, metadata can be used to indicate that the master display system is paused. The slave display systems can then pause their playback until another signal is received indicating that playback on the master display system has resumed. The synchronization signal can be a modification of standard signals, such as the linear time code or AES3 signal. This can allow existing display systems, hardware, and/or software to incorporate elements sufficient to implement the synchronization signal in a relatively straightforward fashion.
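
By way of illustration, and without limiting embodiments of this disclosure, the following sketch packs the fields named above into a 64-bit word. The disclosure names the fields but not their bit positions, so the layout here (frame number in the low 24 bits, parity in bit 63) is an assumption for the sketch.

```python
FRAME_SHIFT, VALID_BIT, CSTAT_BIT, UDATA_BIT, PARITY_BIT = 0, 24, 25, 26, 63

def pack_sync_word(frame_no: int, valid: int = 1, cstat: int = 0,
                   udata: int = 0) -> int:
    word = (frame_no & 0xFFFFFF) << FRAME_SHIFT          # 24-bit frame number
    word |= (valid << VALID_BIT) | (cstat << CSTAT_BIT) | (udata << UDATA_BIT)
    word |= (bin(word).count("1") & 1) << PARITY_BIT     # force even parity
    return word

def frame_number(word: int) -> int:
    assert bin(word).count("1") % 2 == 0, "parity check failed"
    return (word >> FRAME_SHIFT) & 0xFFFFFF

w = pack_sync_word(frame_no=123456, udata=1)  # user-data edge: next frame starts
assert frame_number(w) == 123456
```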

[0062] The synchronization module 406 can include look-up tables, data structures, data tables, data bases, or the like to interpret the signals encoded into the synchronization signal. For example, the synchronization module 406 can include a command table that correlates commands with numbers encoded into the synchronization signal.

[0063] As an illustrative and non-limiting example, the synchronization signal can have a data rate of about 2 Mbps. When the synchronization signal is encoded using 64-bit words, each word takes about 32 μs to transmit. Where the synchronization signal includes two words, the time to transmit a packet (e.g., the two words) is about 64 μs.
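
The timing in the example above follows directly from the data rate, as this short check shows.

```python
DATA_RATE_BPS = 2_000_000                 # about 2 Mbps
US_PER_WORD = 64 / DATA_RATE_BPS * 1e6    # 64 bits -> 32.0 us per word
US_PER_PACKET = 2 * US_PER_WORD           # two-word packet -> 64.0 us
print(US_PER_WORD, US_PER_PACKET)         # 32.0 64.0
```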

[0064] The synchronization module 406 can be configured to adjust display of a video frame based at least in part on the synchronization signal. For example, the synchronization module 406 can wait for a matching frame id received in the synchronization signal. When the matching frame id is received, the synchronization module 406 can indicate to the display system to display the appropriate video frame.
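
By way of illustration, and without limiting embodiments of this disclosure, the frame-gating behavior described above can be sketched as follows. `next_sync_word` and `show_frame` are hypothetical stand-ins for the hardware interfaces, and the 24-bit frame-id field mirrors the assumed word layout in the earlier sketch.

```python
def run_slave(frames, next_sync_word, show_frame):
    """Hold each decoded frame until the synchronization signal carries
    a matching frame id, then release it for display."""
    for frame_id, frame in enumerate(frames):
        # Block until the master's synchronization signal reaches this frame.
        while (next_sync_word() & 0xFFFFFF) != frame_id:
            pass                  # keep waiting; non-matching ids pass by
        show_frame(frame)         # release the frame, now in sync
```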

[0065] In some embodiments, the synchronization module 406 generates the synchronization signal based at least in part on audio provided by the media module 404. For example, sound can be generated by the master display system and the timing of the audio can drive the video synchronization chain. The audio can be processed by the media module 404 in real time and the video frames can be specified in terms of the number of clock cycles relative to the audio clock domain. This can enable automatic alignment of audio and video during playback. In some embodiments, continuous or substantially continuous adjustments to video playback can be performed during the video blanking time slot (e.g., using a back-pressure algorithm). Accordingly, the master display system can play audio in real time and display the video synchronized to the audio using the media module 404. The master display system also provides a synchronization signal via the synchronization module 406 to connected slave display systems. The slave display systems can then synchronize their video to this synchronization signal provided by the master display system, so that their video is synchronized with the master video and not necessarily to their audio.
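
By way of illustration, and without limiting embodiments of this disclosure, tying frame display times to the audio clock domain can be sketched by scheduling each video frame at an audio sample count rather than a wall-clock time. The 48 kHz sample rate is an assumption for the sketch.

```python
AUDIO_RATE_HZ = 48_000
VIDEO_FPS = 30

def frame_start_sample(frame_no: int) -> int:
    """Audio sample index at which video frame `frame_no` should appear;
    1600 samples per frame at 48 kHz / 30 fps, so audio and video stay
    aligned automatically during playback."""
    return frame_no * AUDIO_RATE_HZ // VIDEO_FPS

assert frame_start_sample(1) == 1600   # frame 1 starts 1600 samples in
```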

[0066] In some embodiments, media server 410 can be configured to allow a display system to be upgraded from a single-display system to a display system that can be part of an immersive media system, such as the immersive media system described herein with reference to FIGS. 1 and 2. For example, it can be attachable to an existing display system.

Media Stream Synchronization Method

[0067] FIG. 5 illustrates a flow chart of an example method 500 of synchronizing multiple media streams in serially connected media servers in an immersive media system. The method 500 can be performed by a plurality of display systems and/or media servers in an immersive media system. One or more media servers, such as the media servers described herein with reference to FIGS. 3 or 4, can perform one or more of the steps of the method 500. In addition, one or more modules of the media server, such as those described herein with reference to FIG. 4, can perform one or more of the steps of the method 500. Furthermore, a single step of the method 500 can be performed by more than one module and/or display system.

[0068] In block 505, a master display system and/or a receiving device extracts a composition for presentation. As described herein, the composition can include video and/or audio to be presented to an audience. The composition can include video to be displayed by the master display system. In some implementations, the composition can include video to be displayed by two or more slave display systems. In such scenarios, the master display system can transmit the data sufficient to display the video to the respective slave display systems. In some embodiments, two or more slave display systems each extract a composition for presentation by the respective slave display system.

[0069] In block 510, the master display system generates a synchronization signal based at least in part on the extracted composition. The synchronization signal can encode data words into a synchronization waveform. The encoded data words can include synchronization information in the form of a timecode. In some embodiments, the master display system generates the synchronization signal based at least in part on audio in the composition for presentation by the master display system.

[0070] In block 515, the master display system transmits the synchronization signal to a first slave display system. The master display system can transmit the synchronization signal over a coaxial cable or other cable with a signal line and a ground. In block 520, the first slave display system receives the synchronization signal and retransmits the synchronization signal to a second slave display system. The first slave display system can receive the synchronization signal at an input synchronization connector and transmit the synchronization signal at an output synchronization connector.

[0071] In block 525, the master display system displays a video frame from the extracted composition. In block 530, the first slave display system and the second slave display system display video frames synchronized with the video frame displayed by the master display system wherein the displayed video frames are synchronized based at least in part on the synchronization signal generated by the master display system. Each of the first and second slave display systems can process the received synchronization signal to extract synchronization information. In addition, each of the first and second slave display systems can control playback of its video (e.g., control timing of when to display a video frame) based at least in part on the extracted synchronization information.

[0072] FIG. 6 illustrates a flow chart of an example method 600 of synchronizing a slave video with a master video based at least in part on a synchronization signal from a master display system. The method can be performed by a slave display system in an immersive media system, such as the slave display systems described herein with reference to FIGS. 1-5. The slave display system can include hardware and software configured to perform the steps in the method 600, and each step in the method can be performed by one or more components and/or one or more modules of the slave display system. Similarly, one or more steps in the method 600 can be performed by any combination of hardware and software of the slave display system. The method 600 can allow a slave display system to synchronize a slave video with a master video. In some embodiments, the slave display system can comprise a modified single display system. For example, a display system can be retrofitted with a module, such as the module 410 described herein with reference to FIG. 4, that is configured to receive a synchronization signal and to synchronize its video based at least in part on the received synchronization signal. In some implementations, the synchronization signal can be generated by a master display system that has not been specifically designed to be part of an immersive media system. For example, a display system configured for use in a single-display system can generate a synchronization signal based on standards such as LTC or AES3. The slave display system can receive the synchronization signal and synchronize its video based on that generated synchronization signal. In this way, an immersive media system can be created using existing hardware and retrofitting one or more display systems (e.g., with the module 410) to act as slave display systems.

[0073] In block 605, the slave display system receives a synchronization signal. The synchronization signal can be generated by a master display system or another system configured to generate the synchronization signal. The synchronization signal can be based on standard synchronization signals (e.g., LTC, AES3, etc.) or it can conform to a format that the slave display system can process and from which it can extract synchronization information. The synchronization signal can be received wirelessly and/or over a cable that has a signal line and a ground line, such as a coaxial cable. It is to be understood that other cabling options are within the scope of this disclosure including, for example and without limitation, serial cables, twisted pair cables, USB cables, and the like.
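
A decoder matching the encoder sketched above is shown below; it assumes the same two-half-cells-per-bit sampling. A useful property of biphase mark coding, visible in the sketch, is that decoding depends only on transitions, not on absolute level, so an inverted signal decodes identically:

    def biphase_mark_decode(halves: list) -> list:
        # Inverse of the encoder sketch: two half-cell samples per bit.
        # A mid-cell transition (differing halves) decodes to '1'; a
        # constant cell decodes to '0'. Polarity inversion is harmless
        # because only transitions carry information.
        bits = []
        for i in range(0, len(halves) - 1, 2):
            bits.append(1 if halves[i] != halves[i + 1] else 0)
        return bits

    # Round trip with the encoder sketch above:
    # biphase_mark_decode(biphase_mark_encode(word)) recovers the 64 bits.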

[0074] In block 610, the slave display system transmits the received synchronization signal to another slave display system via wireless transmission or over another cable (e.g., a cable different from the cable used to receive the synchronization signal). The slave display system can include active electronics configured to receive the synchronization signal and pass that signal to the next slave display system in the chain. In some embodiments, the slave display system includes amplifiers, filters, and/or other electronics configured to reduce degradation of the synchronization signal as it is passed from one slave display system to the next.

[0075] In block 615, the slave display system extracts synchronization information from the received synchronization signal. This extraction can occur in parallel with the transmission of the synchronization signal in block 610, which can reduce or minimize latency in the immersive media system. The synchronization information can include information sufficient for the slave display system to provide a video frame synchronized with a video provided by another display system (e.g., a master display system and/or other slave display systems). The synchronization information can include, for example and without limitation, frame numbers, timestamps, timecodes, metadata, command(s) for the slave display system, and the like, as described in greater detail herein.
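
The forward-then-decode pattern of blocks 610 and 615 can be sketched in Python as follows. Here sync_in and sync_out are hypothetical file-like transports and extract is a decoder callback; a hardware implementation would pass the signal through electrically rather than in software, so this is only a software analogy of the ordering (forward first, decode off the hot path):

    import queue
    import threading

    def pass_through(sync_in, sync_out, extract):
        # Hypothetical transports: sync_in.read() yields raw signal chunks,
        # sync_out.write() retransmits them to the next slave in the chain.
        pending = queue.Queue()

        def worker():
            while True:
                chunk = pending.get()
                if chunk is None:
                    return
                extract(chunk)             # local extraction, in parallel

        threading.Thread(target=worker, daemon=True).start()
        while True:
            chunk = sync_in.read(64)
            if not chunk:
                pending.put(None)          # tell the worker to stop
                break
            sync_out.write(chunk)          # forward first to minimize latency
            pending.put(chunk)             # then queue for local decoding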

[0076] In block 620, the slave display system provides a video frame synchronized with a video provided by another display system. The slave display system can synchronize the video frame at the framebuffer or, alternatively, at a point in the processing chain prior to the framebuffer, such as at the video decoding stage. The synchronized video frame can be displayed on a screen along with video from other display systems to provide an immersive viewing experience for a viewer.
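
One way a slave might time the hand-off to the framebuffer, sketched under the assumption that the synchronization information carries a frame number, is shown below. The function present_when_due and all of its parameter names are hypothetical:

    import time

    def present_when_due(frame, frame_number, sync_frame, sync_time, fps, show):
        # sync_frame is the frame number carried by the most recently
        # extracted synchronization word; sync_time is the local monotonic
        # clock reading at extraction. The frame is held until its deadline,
        # then handed to show (e.g., a framebuffer swap).
        due = sync_time + (frame_number - sync_frame) / fps
        delay = due - time.monotonic()
        if delay > 0:
            time.sleep(delay)              # wait for the display deadline
        show(frame)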

Conclusion

[0077] In some embodiments, a computing system that has components including a central processing unit (CPU), input/output (I/O) components, storage, and memory can be used to execute the display system, or specific components of the display system. The executable code modules of the display system can be stored in the memory of the computing system and/or on other types of non-transitory computer-readable storage media. In some embodiments, the display system can be configured differently than described above.

[0078] Each of the processes, methods, and algorithms described in the preceding sections can be embodied in, and fully or partially automated by, code modules executed by one or more computers, computer processors, or machines configured to execute computer instructions. The code modules can be stored on any type of non-transitory computer-readable medium or tangible computer storage device, such as hard drives, solid state memory, optical disc, and/or the like. The systems and modules can also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and can take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The processes and algorithms can be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps can be stored, persistently or otherwise, in any type of non-transitory computer storage such as, e.g., volatile or non-volatile storage.

[0079] The various features and processes described above can be used independently of one another, or can be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks can be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described tasks or events can be performed in an order other than that specifically disclosed, or multiple tasks or events can be combined in a single block or state. The example tasks or events can be performed in serial, in parallel, or in some other manner. Tasks or events can be added to or removed from the disclosed example embodiments. The example systems and components described herein can be configured differently than described. For example, elements can be added to, removed from, or rearranged compared to the disclosed example embodiments.

[0080] Conditional language used herein, such as, among others, "can," "could," "might," "may," "e.g.," and the like, is not generally intended to imply that features, elements and/or steps are required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms "comprising," "including," "having," and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term "or" is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term "or" means one, some, or all of the elements in the list. Conjunctive language such as the phrase "at least one of X, Y and Z," unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. can be either X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present. The terms "about" or "approximate" and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range can be ±20%, ±15%, ±10%, ±5%, or ±1%. The term "substantially" is used to indicate that a result (e.g., measurement value) is close to a targeted value, where close can mean, for example, the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value.

[0081] While certain example embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions disclosed herein. Thus, nothing in the foregoing description is intended to imply that any particular feature, characteristic, step, module, or block is necessary or indispensable. Indeed, the novel methods and systems described herein can be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein can be made without departing from the spirit of the inventions disclosed herein.