Title:
SATELLITE TELECOMMUNICATIONS USING A COMPUTER SYSTEM ADAPTED FOR HARSH ENVIRONMENTS
Document Type and Number:
WIPO Patent Application WO/2012/142042
Kind Code:
A1
Abstract:
A satellite communication system is disclosed that allows for increased bandwidth. In one embodiment, at least two satellite antennas are used. The satellite antennas are typically portable and can be carried by a user to remote areas. A portable computer system can be used and coupled to the satellite antennas via wired or wireless connection. The portable computer system can be adapted for news coverage and can, therefore, receive a live audio and video feed. A multiplexer can be used to divide the audio and video data streams into multiple data streams for transmission over the satellite antennas. The audio and video feeds are transmitted with increased bandwidth due to the use of multiple satellite antennas transmitting in parallel.

Inventors:
LEWIS MICHAEL NORMAN (US)
HARVILLE JEFFREY KEITH (US)
Application Number:
PCT/US2012/032928
Publication Date:
October 18, 2012
Filing Date:
April 10, 2012
Assignee:
ENVELOPING PROSPECTS INC (US)
LEWIS MICHAEL NORMAN (US)
HARVILLE JEFFREY KEITH (US)
International Classes:
H04B7/185
Foreign References:
US5915020A 1999-06-22
US5542104A 1996-07-30
US20110067082A1 2011-03-17
US20090034656A1 2009-02-05
US20100265129A1 2010-10-21
Attorney, Agent or Firm:
SCOTTI, Robert F. (LLP, One World Trade Center, Suite 1600, 121 SW Salmon Street, Portland, OR, US)
Claims:
We claim:

1. A system for communicating via satellite, comprising:

at least two portable satellite antennas; and

a portable computer system coupled to the antennas;

the portable computer system including at least the following:

an input port for receiving an audio data stream and a video data stream;

a multiplexer for dividing the audio data stream and video data stream into multiple data streams for transmission over the portable satellite antennas;

an output port for transmitting the multiple data streams over the portable satellite antennas.

2. The system of claim 1, wherein the portable computer system further includes a demultiplexer for receiving at least an audio signal from the portable satellite antennas to allow for two-way communication over the portable satellite antennas.

3. The system of claim 1, wherein the portable computer system is housed in a waterproof case.

4. The system of claim 1, wherein the portable computer system monitors bandwidth of transmission over the portable satellite antennas, and, if bandwidth allows, transmits duplicate packets associated with at least the audio data stream.

5. The system of claim 1, wherein the portable satellite antennas and portable computer system are sized for carrying by a single user.

6. The system of claim 1, further including a server computer for receiving the multiple data streams from the Internet and demultiplexing the data streams into the audio data stream and the video data stream.

7. The system of claim 1, further including a touch screen display associated with the portable computer system for allowing configuration of the portable satellite antennas.

8. The system of claim 1, further including storing synchronization keys in a packet header for synchronizing the audio and video data streams.

9. A method of transmission via satellite, comprising:

receiving an audio data stream and a video data stream from a live user transmission;

splitting the data streams into at least first and second multiplexed data streams; and

transmitting the multiplexed data streams to at least two portable satellite antennas for parallel transmission of the at least first and second multiplexed data streams to a satellite.

10. The method of claim 9, further including monitoring bandwidth of the transmission to the satellite and, if the bandwidth is adequate, duplicating the audio data stream so that the first and second multiplexed data streams include duplicated audio packets.

11. The method of claim 9, further including storing audio/video synchronization data in an Internet Protocol header for synchronizing the audio and video.

12. The method of claim 9, wherein the splitting of the data streams is performed by a computer in a water-tight box.

13. The method of claim 9, further including receiving at least an audio stream from the portable satellite antennas while transmitting the multiplexed data streams.

14. The method of claim 9, further including configuring the portable satellite antennas from a user interface.

15. The method of claim 9, further including sending server-side configuration data over the parallel transmission and configuring one or more of the following transmission parameters: bandwidth, frame size, and/or frame rate.

16. The method of claim 9, further including receiving the multiplexed data streams over the Internet and demultiplexing the data streams.

17. The method of claim 9, further including using synchronization information to reconstruct the audio and video data streams.

18. A system for communicating via satellite, comprising:

at least two portable satellite antennas;

a portable, waterproof computer system coupled to the antennas;

a video camera and a microphone for capturing video and audio streams and for sending the video and audio streams to the waterproof computer system;

the computer system for generating at least two streams of combined video and audio and transmitting the streams in parallel over the portable satellite antennas;

a server computer for receiving the streams from an Internet connection and reconstructing the video and audio streams.

19. The system of claim 18, wherein the server computer is configurable from the portable, waterproof computer system or via a local HTTP interface.

20. The system of claim 18, wherein the server computer can communicate bandwidth information back to the portable, waterproof computer system.

Description:
SATELLITE TELECOMMUNICATIONS USING A COMPUTER SYSTEM ADAPTED FOR HARSH ENVIRONMENTS

Cross Reference to Related Application

This application claims priority to U.S. Provisional Patent Application No. 61/473,784, filed on April 10, 2011, which is incorporated by reference herein in its entirety.

Field

The present disclosure generally relates to telecommunications, and more particularly to a computer system adapted for satellite telecommunications.

Background

In the news media, attention is shifting to cable, broadcast, Internet, blogs, mobile devices, and other vehicles for delivering rich video content.

Irrespective of medium, the distribution of content related to news and current events is a multi-billion dollar industry. By all measures, the news media is becoming considerably more robust as consumers are growing accustomed to, and expect more from, broadband connections, mobile device access, high-resolution images, faster connections, richer data, greater searchability, and other factors. The news media is one of the most rapidly evolving e-commerce industries as a result of advancing mobile technologies that support broadcast-quality image capture over high-speed wireless. Today, media organizations must dispatch electronic news gathering crews to breaking news events, using expensive camera decks, satellite trucks, and costly labor to capture time-critical content.

Such crews are well suited to city streets in U.S. cities. However, reporters also travel to other countries or to remote areas where cell phone communication is on a different network or is not available at all. In such cases, reporters can communicate via satellite, but the bandwidth is limited. Additionally, current satellite communication devices are not well adapted to rugged locations.

Summary

The present application describes a satellite communication system that allows for increased bandwidth.

In one embodiment, at least two portable satellite antennas are used. The antennas can be carried by a user to remote areas. A portable computer system can be coupled to the satellite antennas via a wired or wireless connection. The portable computer system can be adapted for news coverage and can, therefore, receive live audio and video feeds. A multiplexer can be used to divide the audio and video data streams into multiple data streams for transmission over the satellite antennas. In another embodiment, the portable computer system can include a demultiplexer for receiving an audio signal, with or without an associated video signal, from the two or more satellite antennas. Thus, two-way communication can be implemented to allow news reporters to broadcast and receive questions from remote areas.

In another embodiment, the portable computer system can be designed for rugged conditions. For example, the portable computer system can be housed in a waterproof case. Additionally, a flash drive can be used instead of a hard drive to reduce or eliminate mechanical motion in the computer system.

In another embodiment, duplicate audio packets can be transmitted if it is determined that there is adequate bandwidth. In news reporting, the audio feed is more important than the video feed. With redundant packets sent, if one packet is corrupted, the other can be used.

The foregoing and other objects, features, and advantages of the invention will become more apparent from the following detailed description, which proceeds with reference to the accompanying figures.

Brief Description of the Drawings

FIG. 1 is a system diagram showing multiple satellite antennas used to transmit an audio/video signal from a client to a server computer.

FIG. 2 is an example client computer of FIG. 1.

FIG. 3 is an example server computer of FIG. 1.

FIG. 4 is a flowchart of an embodiment for transmitting data streams over multiple satellite antennas in parallel.

FIG. 5 is a flowchart of a method for configuring antennas and the server computer from the client computer.

FIG. 6 is a flowchart of a method for transmitting and receiving dual data streams over multiple satellite antennas.

FIG. 7 is a flowchart of a method for reconstructing the audio and video streams.

FIG. 8 is a flowchart of a method for transmitting redundant audio streams.

FIG. 9 shows an exemplary IP header.

FIG. 10 shows an exemplary RTP header.

FIG. 11 shows an exemplary client-side software model.

FIG. 12 shows an exemplary server-side software model.

Detailed Description of Exemplary Embodiments

FIG. 1 is a system diagram of a client computer 110 communicating with a server computer 112 via a satellite 114. Two satellite antennas 116, 118 can be used for parallel communication to the satellite 114. Although two satellite antennas are shown, additional antennas can be used. Parallel communication from the client computer over the antennas 116, 118 increases bandwidth and provides a more stable and reliable communication path, as further described below. The client computer can be housed in a water-proof case to allow transportation to remote locations where harsh environmental conditions can exist. The client computer 110 can be adapted for receiving audio/video signals from a portable news camera 120 that can capture a live news broadcast from a news person 122. The news person 122 can also receive audio 126 in return so as to establish two-way communication via satellite with another news person 130 in a newsroom. As further described below, the audio/video signals can be split into two data streams for parallel communications over the antennas 116, 118.

The antennas 116, 118 can be any desired antennas for communicating with a satellite. Example antennas are broadband global area network (BGAN) antennas. Such antennas are normally used to connect a portable computer (e.g., a laptop) to broadband Internet in remote locations, although as long as line of sight to the satellite exists, the terminal can be used anywhere. The BGAN terminal is about the size of a laptop (i.e., it is sized to be handheld) and can be easily carried into remote areas, unlike other satellite Internet services, which require bulky and heavy satellite dishes.

The satellite 114 can receive the parallel communications from the antennas 116, 118 and transmit the same to an Internet server 140, which then transmits the parallel communications over the Internet 142 to the destination server 112. The Internet server 140 is typically a server controlled by a company that owns the satellite 114. It is understood that the Internet server 140 can establish two-way communication with the satellite 114 via a fixed antenna system (not shown). As further described below, the server 112 receives the parallel signals, reconstructs the transmitted audio and video streams, and synchronizes them for display on the television monitor 150. A microphone 152 can receive audio signals from the news person 130 and transmit the audio data stream to the server 112. The server can split the data stream into parallel streams for transmission over the Internet 142, through the satellite 114, for receipt by the portable antennas 116, 118. The parallel audio signals can then be reconstructed in the client computer 110 and transmitted 126 to the news person 122.

The system provides independent multi-path, multi-medium data flows by multiplexing and de-multiplexing streams, implementing a virtual bonded data path. Back-channel support is provided to allow for feedback loops, such as those utilized by live news feeds. The field apparatus is packaged in a small (e.g., 12.5" x 10.1" x 6") and lightweight (e.g., < 10 lbs.) form factor for ease of portability and remote use. The field apparatus is ruggedized to provide IP67 ingress protection, as well as high shock and vibration protection, to allow operation in the most extreme weather and environmental conditions. The field apparatus is designed to operate at high temperatures (e.g., 158 degrees Fahrenheit) and runs for longer than 6.5 hours on a single battery charge. The field apparatus uses global satellite communications as the primary transmission medium, thereby generally providing broader coverage than cellular, Wi-Fi, and other local area or land-based technologies. Alternatively, the field apparatus may employ other land-based or wireless communication known in the art, including cellular and Wi-Fi.

FIG. 2 is an exemplary hardware diagram of the client computer 110. The water-tight case 202 is shown generically and is generally constructed to withstand an impact without damage. The client computer 110 can include a battery 206, a processor 208, a flash memory 210, a touch screen 212, a touch screen controller 214, a multiplexer 216, a demultiplexer 218, and I/O ports 220, 222. Other components can be used. The components are generally coupled together, although not all connections are shown for purposes of clarity.

The battery 206 allows the computer 110 to operate autonomously in remote locations and is used to power the other components. An example battery 206 can include two Energizer Energi To Go XP 18000 (18000 mAh @ 5 V power capacity) batteries connected in series. When fully charged, the battery pack can provide over 6.5 hours of operation time. Other batteries can be used.

The processor 208 can be any type of desired controller, as is well understood in the art. Generally, the processor 208 receives instructions from the flash memory 210 and executes the instructions to perform the transmission and reception of data streams. The flash memory 210 can be used instead of a hard drive to increase durability. Processing of data streams can be performed by the processor to decide how to split the data streams for transmission over the antennas 116, 118 through control of the multiplexer 216. Furthermore, processing of parallel received data streams can be used to reconstruct the data streams for transmission 126 through control of the demultiplexer 218.

The touch screen 212 allows the user to input commands to configure the antennas 116, 118 and to configure the server 112 remotely. When configuring the server 112, configuration data can be sent as a parallel transmission over the antennas 116, 118 and can be used to configure one or more of the following transmission parameters: bandwidth, frame size, and/or frame rate. The touch screen controller 214 interprets user input to the touch screen 212 and sends the input signals to the processor 208 for further action or to the USB interface.

The multiplexer 216 is used to combine the audio/video data streams and create two data streams A/V1 and A/V2 for transmission over I/O port 222. The client computer 110 can be directly coupled to the antennas 116, 118 or wirelessly connected, as is well understood in the art. The demultiplexer 218 can be used to reconstruct the audio and/or video signals received from the antennas 116, 118. The reconstructed data streams can be transmitted through I/O port 220 for receipt by the user 122.
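As a rough illustration of the configuration data mentioned above (bandwidth, frame size, and frame rate), the fragment below sketches a small record that a client such as the one in FIG. 2 might serialize and send to the server. The structure name, field names, and widths are assumptions made for illustration and are not taken from the disclosure.

#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>

/* Hypothetical configuration record sent from the client to the server.
   Field names and widths are illustrative assumptions, not the disclosed format. */
struct remote_config {
    uint32_t bandwidth_kbps;   /* target transmission bandwidth */
    uint16_t frame_width;      /* video frame size, in pixels */
    uint16_t frame_height;
    uint16_t frame_rate;       /* video frames per second */
};

/* Pack the record into a network-byte-order buffer so it can be carried over
   the parallel satellite links.  Returns the number of bytes written; the
   caller must supply a buffer of at least 10 bytes. */
static size_t pack_remote_config(const struct remote_config *cfg, uint8_t *buf)
{
    uint32_t bw = htonl(cfg->bandwidth_kbps);
    uint16_t w  = htons(cfg->frame_width);
    uint16_t h  = htons(cfg->frame_height);
    uint16_t r  = htons(cfg->frame_rate);
    size_t off = 0;

    memcpy(buf + off, &bw, sizeof bw); off += sizeof bw;
    memcpy(buf + off, &w,  sizeof w);  off += sizeof w;
    memcpy(buf + off, &h,  sizeof h);  off += sizeof h;
    memcpy(buf + off, &r,  sizeof r);  off += sizeof r;
    return off;
}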

FIG. 3 shows further details of an exemplary server 112. The server 112 can include an I/O port 302, a processor 304, a multiplexer 306, a demultiplexer 308, an I/O port 310, and memory 312 for storing configuration data. The server 112 can include other components, as is well understood in the art. The processor 304 can control the demultiplexer 308 and receive the parallel data streams from the I/O port 302 for further processing. The server-side processing can use configuration data previously received from the client computer 110 and stored in the memory 312. Configuration data can include various transmission parameters, such as bandwidth, frame size, frame rate, etc. Audio and/or video signals can be received through I/O port 310, processed, and transmitted via the multiplexer 306 as parallel data streams to the Internet for subsequent satellite transmission to the client computer.

FIG. 4 is a flowchart of a method for communicating via satellite. In process block 410, audio and video data streams are received. Typically, such data streams are received from a live user transmission, such as a news broadcaster performing a live story. In process block 420, the data streams are split into multiplexed data streams. Splitting the data streams can take a variety of forms and can change based on the desired implementation. In any event, the multiplexer 216 accepts the audio and video data streams and splits the streams into independent streams to be sent over the multiple antennas. The multiplexer can accept configuration requests from the user interface, such as setting the burst rate parameters for transmitting stream parts. The streams can be split into an audio-only stream and a video-only stream. Alternatively, the audio and video can be combined into each stream to better even out the packet size transmitted over each antenna in parallel (see the sketch following the discussion of FIG. 6 below). In process block 430, the multiplexed data streams are transmitted to the portable satellite antennas for parallel transmission. In the event that there are more than two antennas, the multiplexer can be designed to generate additional data streams to match the number of antennas.

FIG. 5 is a flowchart of a method for configuring the portable antennas and the server computer. In process block 510, user input data is received from the user interface. The user interface has touch screen commands that guide a user through commands for configuration. In process block 520, the portable satellite antennas are configured based on the received input. The configuration depends on the antennas used. In process block 530, the server is configured remotely by sending the user interface commands in parallel over the antennas 116, 118 to the server via satellite. The server can reconstruct the commands by receiving the parallel data streams, demultiplexing the streams, and placing the packets in proper order.

FIG. 6 is a flowchart of a method for transmitting and receiving data streams via satellite over the multiple antennas. In process block 610, an audio/video data stream is received from a camera and microphone. In process block 620, the data streams are split and transmitted in parallel. Each data stream can include packets of audio and video data. In process block 630, while transmitting the audio/video data streams, multiple data streams are received in parallel. The multiple streams can include just audio or audio and video.
In process block 640, the audio/video streams are re-joined to construct individual audio and/or video streams so that the user 122 can listen and/or watch the news person 130 positioned at the server location. Thus, the system allows for two-way communication over the portable satellite antennas.
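To make the splitting alternatives of FIG. 4 concrete, the following sketch assumes a simple per-packet dispatch: in a separation mode, audio packets go to one antenna queue and video packets to the other, while in a combined mode packets rotate across all available antenna queues to even out packet load. The mode names, types, and queue numbering are illustrative assumptions rather than the disclosed implementation.

#include <stdbool.h>

/* Illustrative split modes corresponding to the two alternatives described
   for FIG. 4: separate audio-only/video-only streams, or combined streams
   interleaved across antennas. */
enum split_mode { SPLIT_SEPARATE, SPLIT_COMBINED };

/* Choose the antenna/queue index (0 .. num_queues-1) for the next packet.
   'is_audio' identifies audio packets; 'counter' is a running packet count
   used to rotate combined traffic across all available antennas. */
static int select_queue(enum split_mode mode, bool is_audio,
                        unsigned counter, int num_queues)
{
    if (mode == SPLIT_SEPARATE && num_queues >= 2)
        return is_audio ? 0 : 1;                    /* audio-only and video-only streams */
    return (int)(counter % (unsigned)num_queues);   /* interleave across all antennas */
}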

FIG. 7 is a flowchart of a method for reordering and synchronizing packet data. In process block 710, on the server side, the multiplexed signals are received from the Internet. In process block 720, the signals are demultiplexed into the audio and video streams. Once the signals are demultiplexed, sequence numbers can be obtained from the packet headers. The sequence numbers can be used to place the packets of data in the correct order, which corresponds to the originating audio/video signals received from the client computer. In process block 740, synchronization information is obtained from the IP header in order to synchronize the audio and video.

FIG. 8 is a flowchart of a method for generating duplicate audio streams. In process block 810, the bandwidth is monitored. Typically, such monitoring is done on the server side based on the packet receipt rate. Alternatively, monitoring can be done on the client side based on received packets. The bandwidth information can then be passed back to the client 110. In process block 820, if the available bandwidth exceeds a predetermined threshold, the client can submit duplicate audio streams to the multiplexer 216. Redundant audio streams provide a higher probability that one of the audio data streams is received at the server without corruption. In process block 830, the video and duplicate audio streams are transmitted in parallel.

FIG. 9 shows an example IP header that can be used for sequencing the audio and video streams and for synchronizing the streams. The IP header includes a 32-bit sequence number. The sequence number can be used for ordering the audio stream and the video stream at the server. However, each stream can have its own independent sequence, so to synchronize the two, the options field shown at 910 can be used. The synchronization information stored can include a packet number of the audio stream and a packet number of the video stream so that the two corresponding packets can be aligned. Once synchronized and ordered, the audio and video streams are sent over I/O port 310 to be broadcast on the monitor 150.

FIG. 10 shows an example RTP header. RTP provides end-to-end network transport functions suitable for applications transmitting real-time data, such as audio or video data.
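As a minimal sketch of the FIG. 8 decision, the fragment below assumes the server-reported bandwidth figure and a configured threshold are available on the client side before each transmission interval. The function name, units, and the headroom rule are assumptions made only for illustration.

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical decision made before each transmission interval: duplicate the
   audio stream only when the bandwidth measured (and reported back from the
   server side) leaves headroom above a configured threshold for a second
   audio copy. */
static bool should_duplicate_audio(uint32_t measured_kbps,
                                   uint32_t audio_kbps,
                                   uint32_t threshold_kbps)
{
    return measured_kbps > threshold_kbps + audio_kbps;
}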

FIGs. 11 and 12 show exemplary top-level software and data flow diagrams for the client and server. First, for FIG. 11, the shared data region provides a common system resource that is shared between all tasks. This data region is used to store global system operating parameters, system status information, system run-time data and statistics, and system configuration parameters. The shared data region is also used to provide a mechanism for inter-task communication and signaling. Access to the shared data region is managed through an Application Programming Interface (API) library that provides controlled access to the region's functions and data.

The A/V encoder encodes data in a well-known manner. The encoded audio and video streams are transmitted to the multiplexer task via the local loopback address at predetermined ports for the audio stream and the video stream.

The multiplexer process accepts the audio and video streams from the A/V encoder, splits the audio and video streams, and re-transmits the stream parts to the server de-multiplexer via the satellite terminals. In single terminal mode, streams are transmitted to a single I/O interface. In dual terminal mode, stream packets are tagged and multiplexed between the two I/O interfaces to take advantage of non-coupled available bandwidth. The multiplexer process accepts configuration requests from the user interface manager process. Configuration requests can be used to set transmission and stream processing parameters. Multiplexer status is maintained in the shared data segment.

The system monitor and control process is used to monitor system temperature and to control the A/V encoder. System temperature is accessed via the bus of the computer system. A/V encoder control is accomplished via an interface provided by the A/V encoder application. The system monitor and control process accepts control requests from the user interface manager process for controlling the A/V encoder process. System monitor and control status is maintained in the shared data segment.

The configuration manager process is used to manage system configuration settings. System configuration data includes IP address to receiver mappings and transmit operation modes. System configuration data is stored to, and retrieved from, the solid state compact flash drive of the computer system. The configuration manager process accepts configuration requests from the user interface manager process. Requests include editing of configuration parameters and storage/retrieval of configuration data. Operational configuration parameters are maintained in the shared data segment.

The terminal interface manager process is used to manage the configuration and operation of the satellite terminals. The terminal interface manager uses the AT command specification for the satellite terminals to perform initial configuration and setup of the terminals, perform control and monitoring of terminal initialization, perform control and monitoring of terminal broadcast operation, perform monitoring of terminal status, and restore terminal configuration. Initial terminal configuration includes gathering a restore point of the terminal configuration, enabling the terminal's wireless option, and modifying the terminal's IP access parameters. Control and monitoring of the terminal includes monitoring terminal pointing status, terminal ready status, terminal GPS status, terminal battery status, terminal temperature status, terminal connection status, and terminal transmit status.
The terminal interface manager accepts terminal requests from the user interface manager process. Requests include connection commands and setup/restore commands. Terminal operation and status data is maintained in the shared data segment.

The user interface manager process provides operational screens and menus for display on the touch screen, manages user input requests, and displays system operational parameters and status on the touch screen. The user interface manager process issues requests to the multiplexer process, the system monitor and control process, the configuration manager process, and the terminal interface manager process. The user interface manager accesses system operational parameters and status in the shared data segment. A variety of satellite antennas can be used, such as the Hughes 9201 BGAN terminal.

The de-multiplexer process accepts the single or dual IFB audio stream(s) from the server. The de-multiplexer provides a single IFB audio stream to the audio decoder process. The audio decoder accepts a single audio stream from the de-multiplexer task and decodes the stream for analog output. The decoded audio stream is provided to the audio line-out of the computer system. This is provided for IFB audio support.

The server interface manager provides a control and status interface between the client and server systems. This interface is used to generate and process server interface commands (such as bandwidth test requests/results, broadcast start/stop, progress status reports, etc.) and to maintain status and progress of the server-side system processes and functions. The server interface manager, along with the caster interface manager on the server-side system, provides a tightly-coupled, closed-loop control and monitoring capability for the client and server systems. The bandwidth test function executes the bandwidth test with the corresponding function on the server-side system. The control and status of the test is managed via the server interface manager.

The watchdog task is used to monitor the health and execution of the client software system. A watchdog timer can be maintained for each system task, and health/status information is monitored via the shared data region. Expired watchdog timers indicate task-level failures. The watchdog task provides a system level of fail-safe and error recovery capabilities required to support reliable operation.

The updater provides the capability for field updates of the client system via the external USB interface. The updater task executes prior to any other task in the system and scans for software system updates on removable media. Updates for the updater task are processed first, followed by any other software system updates. The updater task also provides a controlled startup environment for the client system. Software system tasks are prioritized and sequenced for startup and execution.
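One way to realize the per-task watchdog timers described above is to keep a heartbeat timestamp for each task in the shared data region and have the watchdog task flag any task whose heartbeat is older than its allowed timeout. The structure layout and field names below are assumptions sketched for illustration, not the disclosed software.

#include <stdbool.h>
#include <time.h>

/* Hypothetical per-task health entry kept in the shared data region. */
struct task_health {
    time_t last_heartbeat;    /* updated periodically by the task itself */
    time_t timeout_seconds;   /* allowed silence before the task is declared failed */
    bool   failed;            /* set by the watchdog when the timer expires */
};

/* One watchdog pass: flag tasks whose heartbeat has expired.  Returns the
   number of task-level failures detected on this pass. */
static int watchdog_check(struct task_health tasks[], int count)
{
    time_t now = time(NULL);
    int failures = 0;

    for (int i = 0; i < count; i++) {
        if (now - tasks[i].last_heartbeat > tasks[i].timeout_seconds) {
            tasks[i].failed = true;   /* expired timer indicates a task-level failure */
            failures++;
        }
    }
    return failures;
}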

FIG. 12 shows the server-side top-level software and data flow. The shared data region provides a common system resource that is shared between all tasks. This data region is used to store global system operating parameters, system status information, system run-time data and statistics, and system configuration parameters. The shared data region is also used to provide a mechanism for inter-task communication and signaling. Access to the shared data region is managed through an Application Programming Interface (API) library that provides controlled access to the region's functions and data.

The audio encoder is similar to the client-side encoder already described. The encoded IFB audio stream is transmitted to the multiplexer task via the local loopback address at a predetermined port.

The multiplexer process accepts the audio stream from the audio encoder, duplicates the stream if configured, and re-transmits the stream parts to the client-side de-multiplexer task. The multiplexer process accepts configuration requests from the user interface manager process. Configuration requests can be used to set redundancy parameters used by the multiplexer process for transmitting the audio stream. Multiplexer status is maintained in the shared data segment.

The system monitor and control process is used to control the A/V encoder. A/V encoder control is accomplished via the COM/OLE interface provided by the encoder application. The system monitor and control process accepts control requests from the client interface manager process for controlling the encoder process. System monitor and control status is maintained in the shared data segment.

The configuration manager process is used to manage system configuration settings. System configuration data includes the system IP address setting. System configuration data is stored to, and retrieved from, the hard drive of the computer system. The configuration manager process accepts configuration requests from the user interface manager process. Requests include editing of configuration parameters and storage/retrieval of configuration data. Operational configuration parameters are maintained in the shared data segment.

The de-multiplexer process accepts the A/V streams from the client-side multiplexer. The streams are "re-bonded" to form complete single streams. Error correction is applied if necessary. The de-multiplexer provides complete A/V streams to the A/V decoder process. The A/V decoder process accepts the audio and video streams from the de-multiplexer process and generates a composite audio/video signal for output.

The interface manager provides a control and status interface between the server and client systems. This interface is used to process and respond to interface commands (such as bandwidth test requests/results, broadcast start/stop, progress status reports, etc.) and to maintain status and progress of the server-side system processes and functions. The interface manager, together with the server interface manager on the client-side system, provides a tightly-coupled, closed-loop control and monitoring capability for the server and client systems. The bandwidth test function executes the bandwidth test with the corresponding function on the client-side system. The control and status of the test is managed via the client-side interface manager.

The updater provides the capability for field updates of the server system via the external USB interface.
The updater task executes prior to any other task in the system and scans for software system updates on removable media. Updates for the updater task are processed first, followed by any other software system updates. The updater task also provides a controlled startup environment for the server system. Software system tasks are prioritized and sequenced for startup and execution.

In a particular implementation, the multiplexer can split the audio and video streams received from the A/V encoder process. Splitting is accomplished by alternating one or more TCP/IP data packets between transmit queues connected to each of the BGAN terminals. In one embodiment, the multiplexing alternates each packet as it is received. The user interface manager process provides requests to alter this scheme by selecting various burst rates. In burst rate mode, the multiplexer queues the specified number of packets in sequence to a transmit queue before alternating to the opposite transmit queue, thus creating a "burst" of packets through each queue. In either mode, the audio stream and video stream are initialized to begin transmission on opposite transmission queues. Audio/video synchronization keys are stored in the options field of the IP header utilizing unused and experimental option values. The split streams are transmitted using the Real-Time Transport Protocol (RTP).
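The burst-rate behavior described above can be pictured as a counter that sends a fixed number of consecutive packets to one transmit queue before switching to the other. The sketch below assumes two queues (one per terminal) and a caller-supplied burst size; it is an illustration under those assumptions, not the disclosed code.

/* Hypothetical burst-mode selector for two transmit queues, one per BGAN
   terminal.  With burst_size == 1 this degenerates to per-packet alternation. */
struct burst_state {
    int      current_queue;   /* 0 or 1 */
    unsigned sent_in_burst;   /* packets already queued in the current burst */
};

/* Returns the queue (0 or 1) that should carry the next packet, switching
   queues after 'burst_size' consecutive packets. */
static int next_transmit_queue(struct burst_state *s, unsigned burst_size)
{
    if (s->sent_in_burst >= burst_size) {   /* burst complete: switch queues */
        s->current_queue ^= 1;
        s->sent_in_burst = 0;
    }
    s->sent_in_burst++;
    return s->current_queue;
}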

Table 1.1

The de-multiplexer reconstructs the split audio and video streams transmitted by the multiplexer. Reconstruction is accomplished by using the sequence number field in the RTP header to ensure that the order of the packets is restored for each data stream. Additionally, synchronization between the audio and video streams is achieved by using keying information stored by the multiplexer in the IP header options field. The reconstruction process uses a ring buffer and a user-specified buffer delay time to store and order the incoming stream data packets. Packets for a reconstruction sequence not received within the buffer delay window are considered lost. A polarity algorithm is used to manage sequences that incur rollover of the RTP header sequence number. When rollover occurs, packets are assigned polarity by comparing the sequence number of the packet to the largest allowed sequence number (0xffff). The following code demonstrates the calculation of polarity:

IPid_WindowStartElement =
    ntohs(p_WindowStartElement->ipPacket.rtpHeader.PayloadCounter);

dif_1 = 0xffff - IPid_WindowStartElement;
dif_2 = IPid_WindowStartElement;

y = (dif_1 < dif_2 ? dif_1 : dif_2);         // y = distance from the sequence number wrap point
polarity_2 = (dif_1 < dif_2 ? LEFT : RIGHT); // polarity indicates which side of the wrap this element is on

Reconstructed streams are transmitted to the A/V decoder process.
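To show how such polarity values might be applied during reordering, the fragment below compares two buffered packets near a sequence-number rollover: packets assigned LEFT polarity (sequence numbers just below the 0xffff wrap) are treated as earlier than packets assigned RIGHT polarity (sequence numbers just past the wrap), and packets on the same side fall back to ordinary numeric order. The structure and helper function are assumptions used only for illustration.

#include <stdint.h>

enum polarity { LEFT, RIGHT };   /* which side of the sequence-number wrap a packet falls on */

struct seq_entry {
    uint16_t      sequence;      /* RTP sequence number of the buffered packet */
    enum polarity side;          /* polarity assigned while a rollover window is active */
};

/* Returns nonzero if packet 'a' should be ordered before packet 'b' while a
   rollover window is active.  LEFT-side packets (just below the wrap at 0xffff)
   precede RIGHT-side packets (just after the wrap). */
static int comes_before(const struct seq_entry *a, const struct seq_entry *b)
{
    if (a->side != b->side)
        return a->side == LEFT;
    return a->sequence < b->sequence;   /* same side: ordinary numeric order */
}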

In view of the many possible embodiments to which the principles of the disclosed invention may be applied, it should be recognized that the illustrated embodiments are only preferred examples of the invention and should not be taken as limiting the scope of the invention. Rather, the scope of the invention is defined by the following claims. We therefore claim as our invention all that comes within the scope of these claims.