


Title:
PORTABLE DATA TRANSMISSION SYSTEM FOR GLOBAL AND LOCAL COMPUTER NETWORKS
Document Type and Number:
WIPO Patent Application WO/1999/056457
Kind Code:
A2
Abstract:
The present invention provides a data transmission system (100), which encodes and transmits video, audio and text information across a global computer network without the use of a personal computer, workstation or video capture card. The data transmission system (100) can also receive transmissions from other like systems and decode and display or play the information on the video, audio and text outputs included in each system. The hardware, protocol and driving software components of the disclosed system provide a method for transmitting live, real-time and stored video and audio signals across the global computer network. The method includes the steps of: receiving analog video signals from a video input source (10) at a video preprocessor (110), which converts the received analog video signals (114) into alternative video signals that are useful for further processing; storing the preprocessed video signals in video memory (142) for further processing, transmission and display; storing system software, including an operating system, in program memory (144); processing the converted video signals using a system processor (130) running the system software for processing the converted video signals for direct streaming over a global computer network; transmitting the processed video signals directly over the global computer network using a digital signal processor-implemented communications means (172); and displaying the video signals being transmitted on a display means (180).

Inventors:
SMITH JOSEPH J
GOECKLE HANSPETER
Application Number:
PCT/US1999/009261
Publication Date:
November 04, 1999
Filing Date:
April 29, 1999
Assignee:
ZULU BROADCASTING LLC (US)
International Classes:
G06F5/00; H04H20/00; H04N7/10; H04N7/14; H04N; (IPC1-7): H04N/
Foreign References:
US5612732A (1997-03-18)
US5793413A (1998-08-11)
US5748786A (1998-05-05)
US5657028A (1997-08-12)
Attorney, Agent or Firm:
Bourque, Daniel J. (P.A. Suite 303 835 Hanover Street Manchester, NH, US)
Claims:
CLAIMS
1. An information transmission system comprising: (a) a video preprocessor, responsive to received analog video signals for converting raw video signals into alternative forms that are useful for further processing; (b) a memory unit, including data storage memory storing the converted video signals and flash memory for storing system software; (c) a system processor running said system software for processing the converted video signals for direct streaming over a global computer network; (d) a communications means implemented in a digital signal processor (DSP) for connecting said system to the global computer network; and (e) a display means for displaying the converted video signals being streamed over the global computer network.
2. The information transmission system of claim 1, wherein the video preprocessor means comprises a video input processor, which receives and processes said raw analog video input signals and provides a digital video data stream, and a video data encoder/compressor, which is responsive to said digital video data stream and compresses said digital video data stream.
3. The information transmission system of claim 2, wherein said video data encoder/compressor is a wavelet compressor, which executes a wavelet function on said digital video data stream.
4. The information transmission system of claim 1, further comprising an audio preprocessor, responsive to received analog audio signals for converting raw audio signals into alternative forms that are useful for further processing.
5. The information transmission system of claim 4, wherein the audio preprocessor comprises a stereo coder/decoder, responsive to received analog audio signals, for converting the received analog audio signals into a digital audio data stream.
6. The information transmission system of claim 5, wherein the audio preprocessor further comprises an audio encoder/compressor for performing real-time signal synthesis.
7. The information transmission system of claim 6, wherein said audio encoder/compressor is implemented in a digital signal processor (DSP).
8. The information transmission system of claim 1, wherein said communications means comprises a DSP implemented modem.
9. The information transmission system of claim 1, wherein said communications means comprises a PC/MCIA card implemented communications device.
10. The information transmission system of claim 9, wherein said PC/MCIA card-implemented communications device is a wireless communications device.
11. A method of transmitting information directly over a global computer network comprising the steps of: (a) receiving analog video signals from a video input source at a video preprocessor, said video preprocessor converting said received analog video signals into alternative video signals that are useful for further processing; (b) storing said preprocessed video signals in video memory for further processing, transmission and display; (c) storing system software, including an operating system, in program memory; (d) processing said converted video signals using a system processor running said system software for processing the converted video signals for direct streaming over a global computer network; (e) transmitting said processed video signals directly over said global computer network using a digital signal processor-implemented communications means; and (f) displaying the video signals being transmitted on a display means.
Description:
TITLE
PORTABLE DATA TRANSMISSION SYSTEM FOR GLOBAL AND LOCAL COMPUTER NETWORKS
RELATED APPLICATIONS
Provisional Application No. 60/083,516, filed April 29, 1998.

FIELD OF THE INVENTION The disclosed invention provides a portable video, audio and text transmission system for encoding and decoding video, audio and text signals for either real-time or delayed transmissions across a global or local computer network.

BACKGROUND OF THE INVENTION Video and audio compression, encoding, transmission and reconstruction devices have historically been designed to work inside workstation and personal computer (PC) architectures. These methods require the use of a computer to compress, transmit and decompress video and audio signals on both ends of the transmission. Typically, these methods require a video and audio capture card to be installed inside a PC or workstation, thereby relying on the architecture and functionality of a typical computer. These methods of video/audio transmission are constrained by and fixed to the physical location, network connection, and power supply that typical computers require.

These constraints and the reliance on computer architectures have many disadvantages. First, a person who wishes to use such a product must have a PC or workstation, and must either know how to install and configure all of the required hardware in the computer, or have a technician do so, before the computer can perform the video/audio compression and transmission.

Secondly, computers are not portable in nature. The recommended operating environments described by computer and workstation manufacturers state that computers should not be moved during operation. Portable computers (i.e., laptops) can by their nature be moved more easily, but it is still not recommended to move them during operation.

Portable computers are also not designed to accommodate video/audio compression cards.

The global computer network, commonly known as the Internet, provides a new communication medium which can carry the transmission of many forms of information to computers and other viewing devices around the world. This global computer network is based on an open, non-proprietary communications protocol for transmitting and receiving data.

The data can represent many types of information such as text, still images as well as audio and video signals.

SUMMARY OF THE INVENTION A principal object of the invention is, therefore, to provide a single-purpose multi-media device, which is unencumbered by business software products and multi-purpose generic computing devices.

The present invention provides such a device, namely, a data transmission system, which encodes and transmits video, audio and text information across the global computer network without the use of a PC, workstation or video capture cards. The data transmission system can also receive transmissions from other like systems and decode and display or play the information on the video, audio and text outputs included in each system. The hardware, protocol and driving software components of the disclosed system provide a method for transmitting live, real-time and stored video and audio signals across the global computer network. The invention also provides a method of receiving video, audio and text information from the global communications network and displaying the received information on integrated output devices.

DESCRIPTION OF THE DRAWINGS These and other features and advantages of the present invention will be better understood by reading the following detailed description, taken together with the drawings wherein:
Figure 1 shows the major components required to use the disclosed system and how the system interrelates with additional components and systems to transmit information over the global computer network;
Figure 2 is a functional block diagram of the disclosed information transmission system;
Figure 3 is a side view of one embodiment of the disclosed information transmission system showing the input and output connections included thereon;
Figure 4 is a front view of one embodiment of the disclosed information transmission system showing the display screen and input buttons;
Figure 5 is a circuit board diagram showing a first level of a circuit board used in the disclosed system and the inputs and outputs included thereon, which are used for attaching peripheral components to the system and for operating the system;
Figure 6 is a circuit board diagram showing a second level of a circuit board used in the disclosed system and additional inputs and outputs, which are used for attaching additional peripheral components to the system.

Figure 7 is a functional block diagram showing global positioning system components, which are included in one embodiment of the disclosed system; and
Figure 8 is a functional block diagram showing a plurality of the disclosed information transmission systems linked using wireless connections to relay information to the global computer network.

DETAILED DESCRIPTION OF THE INVENTION Turning now to the figures and, in particular, figure 1, an overview of the major components required to use the disclosed system and how the system interrelates with additional components and systems to transmit information over the global computer network is shown. The disclosed information transmission system 100 receives video and audio information from a video/audio source 10, via either a hardwired input cable 12 or via a wireless transmission means (not shown).

Once the system 100 receives the information, it processes and encodes the information as will be more fully described below and establishes a data connection 14 to the global computer network 20. The data connection 14 may be any one of a number of available means of transmitting data from the system 100 to the global computer network 20, including standard dial-up or leased line telephone communications 14a, ethernet communications links 14b, cellular telephone links 14c, satellite communications links 14d, radio frequency (RF) communications links 14e, fiber-optic links or other similar means of hardwired and wireless communications links, such as infrared and wireless ethernet links.

The information is then transmitted over the global computer network 20 to at least one broadcasting host computer 30 and, in the preferred embodiment, a farm of host broadcasting computers 30a-30d, which receive the information from system 100 and retransmit the same to viewers. The farm of host broadcasting computers 30a-30d transmits the received information via one or more routers 32, again over the global computer network 20, to one or more viewers, who view the information on one or more viewing apparatus 40, which may include a multimedia PC 40a, a laptop computer 40b, a MACINTOSH® computer 40c or any other like viewing apparatus. Another information transmission system 100 may also serve as a viewing apparatus 40.

Turning now to figure 2, information transmission system 100 is shown. System 100 includes a video preprocessor 110, which receives video information in the form of analog video signals 114 from video input source 10 via video input 112. In the preferred embodiment, video preprocessor 110 comprises an integrated video encoding/compression chip set. In one embodiment of the invention, the chip set is the Analog Devices ADV601 (model ADV601LC_TQFP), HM514265DLTT-6_44 and SAA7111A wavelet compression chip set. The chip set accepts NTSC/SECAM and PAL video signals and encodes the video signal into digital data in a form that is represented by a wavelet compression algorithm. The video preprocessor 110 takes the raw data and converts it into alternative forms that are useful for directly streaming over the global computer network, or for use in video editing and production software programs which run on workstations and computers.

In one embodiment, the system 100 also includes an audio preprocessor 120, which receives analog audio signals 124 from an audio input source 10 via audio input 122. The audio preprocessor 120 converts the raw, analog audio signals into a format which can be directly streamed over the global computer network or used in audio editing software programs that run on workstations and computers. In the preferred embodiment, the audio preprocessor comprises the ADSP2181 and AD1847 chips manufactured by Analog Devices. However, additional embodiments may not include task-specific audio chips or preprocessors.

The controlling functionality of the system is provided by system processor 130, which, in the preferred embodiment, is based on the National Semiconductor NS486SXF chip. Of course, other semiconductor chips and chip sets, such as the Cyrix Media GX 266 or 300, are considered equivalents.

The system processor 130 provides for the basic component integration. The above-mentioned National Semiconductor NS486SXF chip is a low-power, low-cost processor with built-in support for a UART, parallel port, LCD, PCMCIA, general purpose IO (input/output), and many other features such as timers, a real-time clock and DMA, which makes it especially suitable for use as the system processor 130 for information transmission system 100. The processor 130 is compatible with the Intel 486 processor and can run standard Intel platform software. It also has some useful features for embedded system development, such as user-definable IO lines and an LCD controller. The Cyrix chip set mentioned above provides additional enhancements, including color touch screen interface capabilities as well as a universal serial bus (USB).

As in any PC, there is an operating system (OS) running on the processor. The operating system is software which provides all the low-level basic functions which any embedded application will need to use. In the preferred embodiment, the system processor 130 runs the PharLap embedded operating system. The PharLap operating system is a true real-time, protected mode operating system, which runs C++ code developed and tested in a PC environment.

PharLap software includes a full networking package, providing full TCP/IP support for multiple sockets, and Ethernet support. It is very flexible, so that it is easy to add device-dependent functionality or extensions to its internal functions. PharLap software is also low-cost and easy to use. Furthermore, the PharLap OS is a true multi-threaded 32-bit operating system, just like Windows 95 or NT. A thread is software which runs independently from other possible software threads. The operating system manages these threads, and determines when they can run and what they can do.

The system 100 also includes system memory 140, including data storage memory 142, which provides data storage for video and audio information. Preferably, data storage memory comprises DRAM SIMM (model SIMM4X36) memory in 1, 4, 16 or 32 megabyte capacities.

Additionally, system memory 140 preferably includes flash memory 144 for on-unit program storage. The programs stored in flash memory 144 include both proprietary programs used to drive the specific functionality of the various chips for video and audio encoding and transmission as well as non-proprietary embedded operating system software, such as the operating system software manufactured by PharLap Software, Inc. In the preferred embodiment, flash memory comprises 74ACTQ244 and 74ACTQ16244 chips.

Transmission of the information is achieved by using a Digital Signal Processor (DSP)-implemented communications means, which comprises one or more re-programmable chips that can store programs. One implementation of the DSP-implemented communications means is a DSP, which runs software designed to configure the DSP to serve as a built-in modem 172. The DSP processes the stored information and forwards the same to data output 170. Alternatively, the output DSP may be implemented on a plug-in PC/MCIA communications card 174, which may be included with or added to the system at a later date. In the disclosed embodiment, the information transmission system 100 uses the Analog Devices ADSP2181 and AD1843 chips to store a modem communication program. The basic communications protocol is provided by the PharLap embedded operating system.

The information transmission system 100 also includes a display 180, which, in the preferred embodiment, is an LCD screen. Display 180 displays information about the operation of the system. The display screen is a 320x240 pixel LCD display. It is controlled by the built-in LCD controller in the system processor 130. A section of the system memory is reserved as a display buffer. The display buffer contains the values for the pixels on the screen. The LCD controller then uses a system DMA (Direct Memory Access) controller to transfer the data from the memory buffer to the LCD screen.

This hardware function works in the background of the system software. Thus, the system software simply places the proper pixel pattern in the buffer, and the display will be automatically updated.
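
The buffer-then-DMA flow described above can be illustrated with a small Python simulation (the 320x240 geometry comes from the description; the one-byte-per-pixel layout and all function names are assumptions made only for illustration):

```python
WIDTH, HEIGHT = 320, 240

# Stands in for the section of system memory reserved as the display buffer.
display_buffer = bytearray(WIDTH * HEIGHT)

def set_pixel(x, y, value):
    """System software writes pixel values into the memory buffer only."""
    display_buffer[y * WIDTH + x] = value

def dma_transfer(buffer):
    """Stand-in for the DMA controller: delivers a snapshot of the buffer
    to the 'screen' without further involvement of the system software."""
    return bytes(buffer)

set_pixel(0, 0, 255)
set_pixel(319, 239, 128)
screen = dma_transfer(display_buffer)
```

In the real system the software only performs the equivalent of set_pixel; the LCD controller's DMA engine carries out the transfer in the background.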

The system 100 is operated with four push buttons 190a-190d, which are provided for user input. These buttons are connected to input lines to the system processor. The system software must poll the status of the buttons to see when they are pushed. These buttons implement virtual functions. This means that the action of a button depends on the software that is currently being executed. In order to let the user know the current functionality of buttons 190a-190d, the buttons are placed along the bottom edge of the LCD display 180 and the system software writes the button function text to the LCD adjacent to the location of the physical buttons 192a-192d (Fig. 4). The user then knows which button to push for any current function. It is the responsibility of the system software to keep track of the current button functions and to properly write out the function to the LCD screen.
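
The virtual-button scheme, with its per-screen function look-up table, might be organized as in the sketch below (a hypothetical simulation: the labels and handlers are invented, and the real system polls processor input lines rather than receiving a button index):

```python
def make_menu(entries):
    """entries: list of (label, handler) pairs, one per physical button.
    Returns the label row to draw along the bottom of the LCD, plus a
    dispatch function invoked when a button press is detected."""
    labels = [label for label, _ in entries]
    def on_press(button_index):
        _, handler = entries[button_index]
        return handler()
    return labels, on_press

# Hypothetical top-level menu: each button's meaning is defined here,
# not in hardware, so other screens can reuse the same four buttons.
labels, on_press = make_menu([
    ("Start", lambda: "transmitting"),
    ("Stop",  lambda: "idle"),
    ("Setup", lambda: "setup-menu"),
    ("Info",  lambda: "status-screen"),
])
```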

Alternatively, if the system utilizes a processor that provides touch screen interface capabilities, such as the Cyrix chip set mentioned above, then the push buttons 190a-190d may be replaced with virtual push buttons that appear directly on the LCD display 180 as a touch screen interface.

As described above, the processor has a built-in LCD controller which automatically moves data from a memory buffer to the LCD display. It is the responsibility of software to make sure the contents of that buffer are correct. The PharLap operating system (OS) provides software hooks to put in a custom display driver. The system software hooks into the display driver to provide support for standard software display commands.

The system utilizes standard bitmap fonts. Since, in a basic embodiment of the invention, there is no keyboard, a simple method of alphanumeric input is provided. The user can cycle through available display characters by pushing buttons 190a-190d (or the touch screen-provided virtual buttons) until the desired character appears. Pushing another one of the buttons then selects the displayed character. Alternatively, in an additional embodiment, characters may be entered directly using an optional keyboard input (not shown). In addition, special commands useful for displaying menus and prompts are provided. These commands make it easy to position text on the screen. There is also a set of function look-up tables which make it easy to invoke actions based on button pushes.
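
The cycle-and-select input method can be sketched as follows (the two-button interaction and the character set are assumptions of this sketch; the patent specifies neither):

```python
import string

# Assumed character set; the actual set of display characters is not specified.
CHARSET = string.ascii_uppercase + string.digits

class CharEntry:
    """Keyboardless text entry driven by two of the push buttons."""
    def __init__(self):
        self.index = 0
        self.text = ""

    def cycle(self):
        # One button advances through the available display characters.
        self.index = (self.index + 1) % len(CHARSET)
        return CHARSET[self.index]

    def select(self):
        # Another button accepts the character currently displayed.
        self.text += CHARSET[self.index]
        return self.text

entry = CharEntry()
entry.cycle()   # advances from 'A' to 'B'
entry.cycle()   # advances to 'C'
entry.select()  # entered text is now "C"
```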

Analog video and audio signals 114 and 124, respectively, are received by the system 100 from video/audio input source 10. Video/audio input source 10 may be any one of a number of common sources of analog video and audio signals, including camcorders, professional video equipment, video cassette recorders (VCRs), stereo equipment, microphones, audio-visual equipment and the like. The video and audio signals are then processed by a video preprocessor 110 and an audio preprocessor 120, respectively.

There are two primary components to the video preprocessor 110. The first is the video input processor 115, which, in one embodiment of the invention, comprises a Philips SAA7111a processor chip. This chip receives a raw analog video signal 114 from a camera, VCR, etc. as its input. The video input processor 115 processes the analog video signal 114 and provides a digital video data stream 116 as its output. The Philips SAA7111a processor chip can process any of the standard video formats, such as NTSC, PAL, and SECAM. This chip is controlled over a special serial bus called I2C. The system processor 130 has a built-in I2C controller, allowing system software to easily write to the video input processor chip.

The second component of the video preprocessor 110 is the video data encoder/compressor 117, which, in one embodiment of the invention, also comprises a DSP, such as the Analog Devices ADV601LC wavelet compressor. The encoder/compressor 117 receives the digital video data signal 116 from the video input processor 115 and executes a wavelet function on the data. The wavelet function provides a method of compressing the video data. It is a very scaleable, error tolerant compression method. This means that the output can be used in many different situations: high quality broadcast, local networks, and Internet delivery.

The ADV601LC wavelet compressor works with the ADSP-2181 digital signal processing chip to perform a bi-orthogonal wavelet transform of each video image in a stream and then quantizes and compresses the images sufficiently so that they can be transmitted over the Internet at a low enough bandwidth to allow them to be received and displayed on a viewer's computer screen at an acceptable rate.

The encoder/compressor 117 sits on the processor bus as a standard IO device. The system software can write parameters, such as compression factors, directly to the chip. Hardware DMA can be used to transfer compressed data to the computer memory, and the chip has an interrupt line to the processor, which is used to signal time-critical events.

The encoder/compressor 117 is designed to run continuously while the video input is active. Thus all pixels of every video frame are compressed and sent to the output. A continuous output is acceptable if the system is connected to a fast network, which can handle the data rate.

However, this will not be the case most of the time. Even with fast connections, Internet transmissions and software playback codecs (encoders/decoders) cannot support the high data rate of full video playback. Therefore, only a portion of the video data will be utilized by the system. (However, it should be understood that in the future, as connectivity and processing speeds increase due to hardware and software advances, the disclosed system will support full frame television quality playback at rates of at least 30 fps at a resolution of 720x586 pixels.) The encoder/compressor 117 uses direct memory access (DMA) to transfer data to the data storage memory 142. DMA is a hardware technique that moves data as it becomes available regardless of what the system software is currently doing. The encoder/compressor 117 is programmed to issue an interrupt whenever the video signal 116 reaches the end of a video field. The interrupt causes the system to access a video software handler, regardless of what process the software is currently executing. Top priority at that point is to keep the DMA data transfer running, so that nothing is lost.

The system software must decide if the last video data received will be utilized or ignored. The software then sets up the DMA controller appropriately for the data from the next field. If the previous data is not to be utilized, the DMA is set up to overwrite it. Otherwise, if the previous data is to be utilized, then the DMA will be set up to ensure that the new data is saved in a different memory location. Since a full flow of video data can quickly fill up memory, this process of determining what data is required and where to save that data is important to efficient operation.
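
The keep-or-overwrite decision made at each field-end interrupt can be modeled in a few lines (slot indices stand in for DMA buffer addresses; the structure is an illustrative assumption, not the patented implementation):

```python
class FieldCapture:
    """Models the decision made by the video interrupt handler: either
    reuse the current DMA buffer (field dropped) or advance to a fresh
    one so the kept field is not clobbered by incoming data."""
    def __init__(self, num_slots):
        self.num_slots = num_slots
        self.slot = 0    # slot where DMA writes the next field
        self.kept = []   # slots holding fields queued for transmission

    def on_field_end(self, keep):
        if keep:
            self.kept.append(self.slot)
            # Advance so new data lands in a different memory location.
            self.slot = (self.slot + 1) % self.num_slots
        # If not kept, self.slot is unchanged: DMA simply overwrites it.
        return self.slot

cap = FieldCapture(num_slots=4)
cap.on_field_end(keep=False)   # field dropped, slot 0 is reused
cap.on_field_end(keep=True)    # field kept, DMA moves on to slot 1
```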

The final step in handling the video interrupt is to set software flags and pointers. These values are used by software threads running independently of the interrupt handler. For example, the data transmission thread monitors a software flag to identify when more video data is available, and the pointer will tell the transmission thread where the data is located.

There is also some setup involved with the wavelet compressor 117. While wavelet theory is complex, it is well known to those skilled in the art and will not be discussed in detail here. Suffice it to say that there are 42 compression control registers corresponding to 42 frequency components of the video image. The compression values are called bin widths. Software must set these bin widths to control the amount of compression, which relates to video quality. Access to higher bandwidth for compressed video transmission means that the bin widths can be changed to allow higher quality video.

There is also another factor involved in bin width control. Video on the Internet is not usually played back in a full screen. Rather, it is usually played back in a window of the screen. Therefore, the playback image must be squeezed down from the original source. This is accounted for with the frequency-based nature of bin widths.

Squeezing video down tends to eliminate the high frequency components of a video image. Therefore, the high frequency components of the compressed video can be zeroed out, since they will be lost in playback anyway. This provides greater compression ratios, which is important when dealing with the Internet. Bin widths need only be set once per transmission, unless there is some need to adjust to bandwidth changes.
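
One plausible way to derive the 42 bin widths from the available bandwidth, including zeroing of high-frequency bands for windowed playback, is sketched below (the scaling rule and the use of a zero width to mean "discard the band" are assumptions of this sketch, not ADV601 register semantics):

```python
NUM_BINS = 42  # one compression control register per frequency component

def make_bin_widths(base_width, bandwidth_factor, zero_top=0):
    """Wider bins mean coarser quantization and more compression.
    bandwidth_factor > 1 models a faster link, so bins are narrowed
    for higher quality. zero_top discards the highest-frequency
    components, which are lost in small-window playback anyway."""
    width = max(1, round(base_width / bandwidth_factor))
    widths = [width] * NUM_BINS
    for i in range(NUM_BINS - zero_top, NUM_BINS):
        widths[i] = 0   # in this sketch, a zeroed band is dropped entirely
    return widths

# Double the bandwidth, and zero the top 8 frequency bands.
widths = make_bin_widths(base_width=16, bandwidth_factor=2, zero_top=8)
```

As the text notes, these values need only be written once per transmission unless the link bandwidth changes mid-stream.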

The concept of matching compression to bandwidth via bin width changes, and dropping of video frames, is very important to the system. This makes it highly scaleable.

With high bandwidths available, the system can encode video at broadcast quality. As bandwidths decrease, the system can gracefully increase compression and reduce video frame rates as needed.

The NS486SXF chip allows post-processing to be performed on the video images that are produced by the ADV601 wavelet compressor. Post-processing allows an even further reduced bandwidth requirement for acceptable image transmission over the Internet. Specifically, two types of post-processing strategies are utilized. The first is known as "delta frames" and the second is known as "motion vectors." Instead of simply sending a video stream as a series of compressed complete video frames, delta frames allow the disclosed system and method to send an initial compressed complete frame and, subsequently, only the compressed difference between the current frame and the previous frame.

This technique alone affords a significant reduction in bandwidth and, consequently, higher frame-rate speeds and improved perception of motion by a viewer at a desktop computer.

In one implementation of delta frames, small differences from image frame to image frame are totally ignored. This technique, which is known as "thresholding", improves frame-rate speeds. Also useful in providing higher frame-rate speeds is a technique known as "pixel averaging." With pixel averaging, small groups of digital elements of an image (i.e., pixels) are averaged, with virtually no loss in video quality.
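
Thresholded delta frames and pixel averaging can be demonstrated on a toy one-dimensional frame (real frames are two-dimensional and wavelet-compressed; the threshold value and group size here are arbitrary choices for illustration):

```python
def delta_frame(prev, curr, threshold):
    """Send only per-pixel differences; differences at or below the
    threshold are treated as zero ('thresholding')."""
    return [c - p if abs(c - p) > threshold else 0
            for p, c in zip(prev, curr)]

def pixel_average(frame, group=2):
    """Average small groups of neighbouring pixels (1-D for brevity)."""
    return [sum(frame[i:i + group]) // group
            for i in range(0, len(frame), group)]

prev = [10, 10, 50, 50]
curr = [11, 10, 90, 50]
deltas = delta_frame(prev, curr, threshold=2)  # the +1 change is ignored
halved = pixel_average(curr)                   # half as many values to send
```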

The motion vector technique performs motion prediction and compensation functions. This feature divides a video image frame into a grid of small squares called "macroblocks". For each macroblock in the current image, the system locates that macroblock in the previous image. A motion vector is then determined, which describes the difference in location of a macroblock. The system uses motion vectors in conjunction with an initial compressed full frame image to compute a motion vector predicted frame.

Once an initial compressed full frame is transmitted, the system only needs to transmit the motion vectors and the compressed difference between a current frame and a motion vector predicted frame over the Internet. In one preferred embodiment, the disclosed system computes macroblocks beginning in the center of an image and spiraling outward. With this ordering, motion vectors are computed faster and more accurately than with line-by-line ordering methods.
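
The center-outward ordering can be approximated by sorting macroblocks by ring distance from the image center (a ring-by-ring sort, which visits the same blocks as a true spiral walk, though not necessarily in the same within-ring order; the function name is an invention of this sketch):

```python
def spiral_order(cols, rows):
    """Macroblock visit order: the centre of the grid first, then
    outward ring by ring, approximating the spiral described above."""
    cx, cy = (cols - 1) / 2, (rows - 1) / 2
    blocks = [(x, y) for y in range(rows) for x in range(cols)]
    # Chebyshev distance groups blocks into concentric square rings.
    return sorted(blocks, key=lambda b: max(abs(b[0] - cx), abs(b[1] - cy)))

order = spiral_order(3, 3)   # centre block (1, 1) is visited first
```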

Like the video preprocessor 110, the audio preprocessor 120 also consists of two primary components. The first is a stereo coder/decoder (codec) 125, which, in one embodiment of the invention comprises an Analog Devices AD1847 stereo codec. The codec 125 receives an incoming analog audio signal 124 and converts it to a digital audio data stream 126.

This digital audio data stream is then provided to the second component of the audio preprocessor 120, audio encoder/compressor 127. The audio encoder/compressor is preferably a digital audio signal processor (DSP) and in one embodiment of the invention comprises an Analog Devices ADSP-2181 DSP.

A DSP is a computer optimized to process real-time analog signals or perform real-time signal synthesis. The audio DSP 127 can be reprogrammed to perform a number of different audio encoding techniques. The audio DSP 127 also looks like an IO port to the system processor 130, and has an interrupt line to the processor to signal time-sensitive events.

There are two aspects of the system software 132, which are specifically related to audio processing. The first is the programming of the audio DSP 127. The other is the software that controls the retrieval of audio data from the audio DSP. Audio must be treated differently from video.

With video, it is possible to drop out some frames and still convey the basic video image. However, this is not the case with audio, since the ear is very sensitive to loss of audio data, or any other glitches in the audio. Therefore, the system must give high priority to capturing and transmitting all audio data. Audio compression techniques are also not as scaleable as video, so the system cannot simply change audio compression factors.

Before the audio DSP 127 begins audio encoding, it is necessary to first pick an audio compression algorithm based on the final use. For live transmissions with low bandwidth, an algorithm such as G.723.1, which is a common videoconferencing standard, must be selected. Trying to use a higher quality method with low bandwidth transmission guarantees that the system will lose data. Once the user informs the system what type of audio algorithm will work, the audio DSP 127 can be programmed with that algorithm.

Programs for a number of common audio compression algorithms are available from the DSP vendor, and commercially. These DSP programs are stored in flash memory 144, and loaded into the audio DSP 127 as needed. This reloading of DSP programs provides enhanced system flexibility. In the future, the system can even implement audio algorithms not currently available by downloading the DSP code into the system over the Internet, along with the control software to use it.

Different audio algorithms may require slight differences in the control software, but the basic concept remains the same. Audio data bandwidth is much lower than video, so more flexibility is available.

Overall, the concept is always the same. The audio DSP 127 prepares a buffer of digital audio data, signals the processor, and then the system software retrieves the audio data. Different audio DSP programs may have different signaling and communication protocols, so the system's audio driver must respond accordingly. Usually an interrupt will signal the processor that audio data is available. However, sometimes it may be necessary for software to look for a status flag instead of a hardware interrupt. In that case, the software audio driver will poll the DSP at regular intervals with a timer, instead of waiting for the DSP.
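The two retrieval styles described above, interrupt-driven and timer-polled, can be sketched as follows. The flag, counter, and function names are hypothetical; on real hardware the flag would be a memory-mapped DSP status register rather than a plain variable.

```c
#include <stdbool.h>

/* Hypothetical model of the audio DSP interface. */
volatile bool dsp_data_ready = false;  /* set by the DSP program      */
int buffers_retrieved = 0;             /* count of buffers fetched    */

/* Fetch one buffer of audio data and acknowledge the DSP. On real
 * hardware a DMA or software copy would go here. */
void retrieve_audio_buffer(void)
{
    buffers_retrieved++;
    dsp_data_ready = false;
}

/* Interrupt path: runs when the DSP asserts its interrupt line. */
void audio_dsp_isr(void)
{
    retrieve_audio_buffer();
}

/* Polling path: a periodic timer tick checks the status flag instead
 * of waiting for a hardware interrupt. */
void audio_timer_tick(void)
{
    if (dsp_data_ready)
        retrieve_audio_buffer();
}
```

Either entry point feeds the same retrieval routine, so the rest of the audio driver is unchanged whichever signaling style the loaded DSP program uses.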

Once audio data is available, the system software will use DMA or software-controlled data transfers from the DSP to data storage memory 142. The software handler then sets the correct flags and pointers for the regular process threads to take appropriate action. The audio software and the transmission software must always make sure that no data is lost.
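The handoff of flags and pointers between the capture handler and the regular process threads can be sketched as a small ring of buffer descriptors. The structure and names are illustrative, not from the disclosure; a full ring reports an error rather than overwriting, reflecting the requirement that no data be lost.

```c
#define RING_SLOTS 8

/* Single-producer (capture handler) / single-consumer (process
 * thread) ring of buffer identifiers. */
typedef struct {
    int head, tail;
    int slots[RING_SLOTS];
} audio_ring;

/* Called from the DMA-complete/ISR context: publish a filled buffer.
 * Returns 0 on success, -1 if the ring is full (data would be lost). */
int ring_put(audio_ring *r, int buf_id)
{
    int next = (r->head + 1) % RING_SLOTS;
    if (next == r->tail)
        return -1;               /* full: consumer has fallen behind */
    r->slots[r->head] = buf_id;
    r->head = next;
    return 0;
}

/* Called from the regular process thread: take the next buffer.
 * Returns 0 on success, -1 if the ring is empty. */
int ring_get(audio_ring *r, int *buf_id)
{
    if (r->tail == r->head)
        return -1;               /* empty */
    *buf_id = r->slots[r->tail];
    r->tail = (r->tail + 1) % RING_SLOTS;
    return 0;
}
```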

The stored video and audio signals are then further processed using system software 132. The extraction of data from the video and audio compression hardware is a time-critical process. While the system hardware, including system processor 130 and system memory 140, provides some buffering of the data, the system software 132 must retrieve data from the hardware in a timely manner. If data retrieval is not timely enough, data could be irretrievably lost. Therefore, the highest priority is given to the video and audio capture process.

The system hardware components signal the system processor when data is ready through the use of interrupts. These are dedicated signal lines which force the processor to switch execution to the software responsible for handling that event. The system software 132 then takes the steps necessary to manipulate the data available at that time.

Like data retrieval, system communications through modems or network connections are also time critical and interrupt driven. However, unlike data retrieval, system communications are not as sensitive and short delays will not result in data loss. Rather, short delays will merely delay data transmission.

Besides these and other hardware events, the system software is also responsible for running multiple processes.

The main process always looks for and responds to user input. Another process transmits the encoded data. Another process handles command communication with the server, and another handles text transmissions. These processes are all controlled by the operating system which switches between them as needed.

The system software is multithreaded, just like Windows 95 or NT. This allows multiple software processes to be run simultaneously. The operating system takes care of allocating resources to each thread as needed. The system can be instructed how to allocate resources to threads by defining what activates threads, and what priorities the threads receive. For example, an event can be defined, such as a button being pushed or data being received, to activate a software thread which handles that event.
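The idea of defining which events activate which threads, and at what priority, can be sketched as a dispatch table. The event names and priority values below are assumptions for illustration, not part of the disclosure.

```c
/* Hypothetical event identifiers; EV_NONE marks "nothing pending". */
typedef enum { EV_NONE = -1, EV_BUTTON = 0, EV_DATA_RX, EV_VIDEO_IRQ } event_id;

/* Table mapping each event to the priority of the thread it wakes
 * (higher number = more urgent). Values are illustrative. */
typedef struct {
    event_id ev;
    int priority;
} thread_def;

static const thread_def table[] = {
    { EV_BUTTON,    1 },   /* user-interface thread       */
    { EV_DATA_RX,   2 },   /* network/chat thread         */
    { EV_VIDEO_IRQ, 3 },   /* time-critical capture path  */
};

/* Given a bitmask of pending events, return the one the scheduler
 * should service first. */
event_id next_event(unsigned pending_mask)
{
    event_id best = EV_NONE;
    int best_pri = 0;
    for (unsigned i = 0; i < sizeof table / sizeof table[0]; i++)
        if ((pending_mask & (1u << table[i].ev)) && table[i].priority > best_pri) {
            best = table[i].ev;
            best_pri = table[i].priority;
        }
    return best;
}
```

In the real system the operating system performs this arbitration; the table simply makes the priority ordering explicit.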

Time-critical events, such as the interrupts from the system's video encoder 117, can occur during any thread. These interrupts take over program execution and, when done, generate an event recognizable by the other threads if necessary.

The main thread of the system software is the one that handles menus and monitors the buttons or touch screen. It handles the navigation to other menus, inputting of user data, setup of hardware, etc. Whenever networking is needed, the main thread creates the threads which handle networking.

More than one thread is used when encoding and transmitting data. There is, of course, the thread which transmits data to the server. This thread establishes a link to the server, and then transmits data to the server as necessary. In the case of a live transmission, this thread lets the video handler know when it is ready to accept more data, thereby controlling the capture rate of the video based on the transmission rate. For delayed or buffered transmissions, the video handler just keeps storing data for transmission, and the transmission thread transmits it at the network bandwidth.
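The capture-pacing rule described above reduces to a single decision per frame. The mode and function names below are hypothetical.

```c
#include <stdbool.h>

/* Live: the transmission thread's readiness throttles capture.
 * Buffered: capture always proceeds and data queues for later sending. */
typedef enum { MODE_LIVE, MODE_BUFFERED } tx_mode;

bool should_capture_next_frame(tx_mode mode, bool tx_ready_for_data)
{
    if (mode == MODE_BUFFERED)
        return true;              /* keep storing; transmit later    */
    return tx_ready_for_data;     /* live: pace capture to the link  */
}
```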

Besides the transmission thread, there is also a thread which controls commands between the embedded software and the server. These commands are used to identify a unit and transmission event, and define actions, such as start, pause, and stop. This is a different process from data transmission, and making it a separate thread makes the software structure much more straightforward. This control thread also signals the transmission thread when to start and stop.

An additional thread is used to implement a chat feature. Chat is when Internet users send messages back and forth. Although the majority of chat messages are text-based messages, chat messages also include audio as well as audio/video messages, using Voice over Internet and other standard protocols. Thus, a chat message can be sent to a broadcaster and can be played back on the system.

Thus, the chat feature can be used to allow someone accessing the Internet to send a message to a transmitter of a live event. That message would then appear on the LCD screen of the transmitter's system. Since this process is totally independent of the rest of the network functions, it is implemented as a separate thread. If an incoming message is detected, the thread is activated, and a message is displayed.

The system software 132 runs on system processor 130 and generates output data 162 in the form of video frames and audio packets, which are forwarded to at least one built-in modem 172 or PCMCIA communications card 174 for transmission over the global communications network in the form of final video/audio/text signal 166. Modem 172 and PCMCIA communications card 174 are selected to provide a variety of communications methods to access the global communications network using a variety of communications means, including but not limited to standard telephone lines, cellular communications links, ISDN lines, ADSL, XDSL or HDSL lines, cable modems, Ethernet cards, wireless Ethernet, satellite telephone communications, RF communications and other wireless connections.

As mentioned earlier, the internal modem 172 comprises a DSP. In one embodiment, the modem DSP comprises an Analog Devices ADSP-2187 digital signal processor. The ADSP-2187 is essentially a faster version of the processor chip used as the system's audio DSP 127. Using a DSP instead of a dedicated modem chip increases system flexibility, since the DSP can be reprogrammed. The internal modem 172 can be programmed to operate either as a standard V.34/V.90 modem, or for connection to an ISDN line. However, since the interface components between the DSP and the phone line are different for V.34/V.90 modem and ISDN applications, a replaceable module is required to allow connection to either a standard phone line or an ISDN line.

Networking is controlled by the PharLap OS. The PharLap OS comes with built-in support for standard network protocols, and standard modems and Ethernet adapters.

Efficient, interrupt-driven hardware drivers are also provided. The operating system takes care of all the details of establishing and maintaining TCP/IP (networking standard) links. A standard Windows Sockets library allows system networking applications to be written exactly as under Windows, without regard to the underlying hardware.

For non-standard hardware, source code and provisions are available to integrate that hardware into the standard system. For example, the built-in modem 172 used by the system requires integration. While the internal modem 172 uses the same commands as any other modem, it does not quite look like a serial port to the system. The functionality remains the same, but it uses different ports and registers.

To integrate the modem 172, a standard serial port driver must be modified so that it uses the required registers.

The modem still looks like a serial port to the system software. However, the lowest level of code has been modified. Since it is another DSP, the modem must also be loaded with a program before it can be used. It is also reprogrammable to act in different ways.
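One way to sketch the driver modification described above is to route all register accesses through a table, so that higher layers still see an ordinary serial port while only the lowest-level addresses change. The structure and addresses below are placeholders, not the system's actual ports.

```c
/* Register map a serial-port driver needs; only this table changes
 * between the standard UART and the internal modem DSP. */
typedef struct {
    unsigned data_reg;     /* TX/RX data register address */
    unsigned status_reg;   /* line-status register address */
} uart_regs;

const uart_regs std_serial = { 0x3F8, 0x3FD };  /* COM1-style ports     */
const uart_regs modem_dsp  = { 0x240, 0x242 };  /* placeholder values   */

/* The rest of the driver is written against uart_regs only, so the
 * modem "still looks like a serial port" to the system software. */
unsigned uart_data_address(const uart_regs *u)   { return u->data_reg; }
unsigned uart_status_address(const uart_regs *u) { return u->status_reg; }
```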

Finally, video output data 164 is provided to on-board display 180 to provide a video preview capability on the system itself.

The disclosed system can be used in many configurations of sources, connectivity and operational modes. The system includes a plurality of input and output connectors, which allow for the various configurations. The input and output jacks and plugs as well as the controls of the system are shown in figures 3 and 4. Controls for the system include an On/Off button 200 and the four control buttons 190a-190d described above. Also included is a DC In connection 202, which allows the system to be connected to a DC power source (not shown). The DC power source provides sufficient voltage to power the system's on-board battery, which, in turn, powers the components of the system.

The inputs include Video Input 112 and left and right Audio Inputs 122L and 122R. Also included is a Microphone Input 204.

The outputs include Line 1 and Line 2 outputs 302 and 304, respectively, from the built-in modem 172 (figure 2).

Also included is Serial Output Port 306, PCMCIA slot 308 and Headphone output 310.

The serial port 306 is a standard 9-pin serial port or USB connector. The system processor 130 (figure 2) includes a standard PC-compatible port on the processor chip. While serial port 306 is not needed for normal use, it does provide expansion options. For example, the serial port 306 could be used to attach a keyboard to the system. Alternatively, an external modem could be attached, which would allow the system to transmit data faster by using both the internal modem 172 and an external modem (not shown) simultaneously.

The serial port can also be used for direct connection to another computer, allowing video and audio data to be directly transferred.

PCMCIA slot 308 is a standard, single or dual PC type of connector commonly found in portable computers. It provides a common mechanical and electrical specification for external boards to be plugged into the system. A large number of commercially available PCMCIA-compatible boards, which provide many different functions, are available. All PCMCIA boards also contain information which identifies the board and its functions and provides that information to the host processor.

PCMCIA card slot 308 provides a great deal of expansion options. One option is an additional modem, which would provide even faster transmission capabilities.

One of the most important options that will be supported by the system is an Ethernet card. Ethernet is an industry standard for networking computers that provides extremely fast data transfer rates. With an Ethernet card plugged in, video and audio data can be transferred at high rates, and with very high quality, directly into existing computer networks. The PCMCIA slot 308 could also be used for connection to wireless networks as the technology becomes more readily available and affordable. Additionally, RF cards, flash memory storage cards and the like may be interfaced to the system using the PCMCIA slot 308.

Additional system components, which are not visible from the outside of the unit, are shown in Figures 5 and 6.

Power for the system is provided from either DC power input 202, from an optional on-board battery 320 or from an external battery pack (not shown), which powers the system from battery DC output 324 to circuit board DC In 322 via power cable 323. The battery 320 can be recharged via its DC input 326 and power cable 334 from DC Out 336 of on-board DC power supply 330. The DC power supply 330 can be plugged into any standard 110 or 220 volt AC power source via its AC input 332 and AC power cord 338.

The processor circuit board also includes LCD ribbon cable output connector 350, which provides outputs to LCD display 180 (not shown) via ribbon cable 352. The input buttons 190a-190d are connected to button cable input connector 196 by button cable 194. Finally, a parallel port 340, which is not utilized by the present embodiments of the invention, is provided. This parallel port provides expansion capabilities to the system.

One typical use of the system will be to connect a video camera, camcorder, VCR, or professional broadcast quality video equipment to the Video In jack 112 on the system 100. Alternatively, the system may include an integrated miniature camera (not shown) which would be fixed to the unit such that a separate video source would not be required.

The audio source will typically be from a video camera, camcorder, VCR, or professional broadcast equipment. However, the audio source may also be from a stereo, microphone, or any audio device that has an output jack, such as a radio or portable stereo unit (such as a Sony Walkman). The audio source may be connected to either the Mic In jack 204 or one or both of the Audio In Left and Audio In Right jacks 122L and 122R, respectively.

The system transmits and receives its data over the global computer network using either the TCP/IP communications protocol or the MSBD communication protocol.

The MSBD protocol is a communication language which has been developed by the Microsoft Corporation. The apparatus provides several methods of connecting to the global network for transmission.

Connection to the global network can be established with the modem(s) 172 (figure 2), which is/are integrated in the system or which interface with the system using the PCMCIA slot 308. The modem 172 can provide up to two simultaneous connections, meaning that a single modem can accommodate and act like two modems. This provides the operator of the unit the ability to connect to one or two standard analog telephone lines simultaneously. The physical connections are made with RJ11 telephone (or other industry standard) jacks for Line 1 and Line 2, 302 and 304, respectively. Conversion jacks are available from other vendors for international phone connections. The apparatus is also designed to support ISDN telephone connections. However, an additional component, which is independent of the disclosed invention, would be required for the connection.

Connection to the global network can also be established with the system's support for Ethernet connections. The system provides this ability by using the integrated PCMCIA slot 308 on the system. An Ethernet card (not shown) can be placed in the slot and connected to the operator's local area network. The local area network can, in turn, be connected to the global computer network.

Ethernet cards are available from a large vendor pool.

Certain Ethernet technologies have emerged that permit a cordless connection to an Ethernet network. The technology is designed to fit into a PCMCIA card which is placed into the apparatus' PCMCIA card slot. This method allows for the remote use of the apparatus.

The system 100 may also be connected to the global computer network by means of a cellular or satellite telephone communications link. Many PCMCIA modems support the connection to standard analog telephones. Digital cellular phones will also connect the system to the global computer network. The cellular telephones connect to a PCMCIA modem by way of a cord and a special adapter that fits into the RJ11 jack of the modem. This method allows for the remote use of the system.

Cordless computer connectors are essentially cordless phones that are designed to connect a modem to a telephone line without the need for a push-button dialing device. These units operate within the 900 megahertz frequency band and transmit the data to and from the apparatus over the air. This method allows for the remote use of the apparatus.

Through the use of a radio frequency PCMCIA card, the apparatus may be connected to the global computer network. A radio frequency device transmits the data to and from the apparatus to a receiving radio frequency base station. The radio frequency base station is typically connected to an analog telephone line, but many other means of connecting the base station to the telephone, cellular, or satellite phone systems are available.

Additional features of another embodiment of the invention are shown in figure 7. In this embodiment, the system 100 incorporates the ability to report global positions through its use of integrated global positioning system (GPS) chips and software.

In this embodiment, GPS signals 400 are received by the system using GPS antenna 402 and GPS receiver 404. The GPS signals are then processed by GPS location processor 406, which calculates GPS data 408 in the form of an absolute GPS position of the system 100. The GPS data is then stored in system memory 140 and is further processed by system software 132, which converts the data into an information packet 162a, which can be transmitted over the global computer network as GPS location data 166a using either built-in modem 172 or any one of the various PCMCIA cards 174 mentioned above. Like the video information displayed on display 180, the system software may also be configured to send GPS location data 164a to display 180, where it too would be displayed.
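A minimal sketch of packaging a GPS fix into a transmittable information packet might look like the following. The field names and text layout are assumptions for illustration; the disclosure does not specify the packet format.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical representation of one GPS fix. */
typedef struct {
    double latitude;         /* decimal degrees, +N / -S     */
    double longitude;        /* decimal degrees, +E / -W     */
    unsigned long fix_time;  /* seconds since some epoch     */
} gps_fix;

/* Format a fix as a text payload for transmission alongside the
 * video/audio streams. Returns the number of characters written. */
int gps_format_packet(const gps_fix *f, char *out, int out_len)
{
    return snprintf(out, (size_t)out_len, "GPS,%.6f,%.6f,%lu",
                    f->latitude, f->longitude, f->fix_time);
}
```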

Thus, this embodiment of the invention detects its global position and reports the location to applications that require such information. The GPS feature of the system makes the invention serve as a GPS camera, which would be useful for military, security, emergency rescue and other applications.

Turning now to figure 8, an information transmission system 100 may be used as a relay station to further transmit originating signals to a final destination. A plurality of systems 100a-100d may be configured in a chain-like layout using any of the supported connection and communication methods. As shown in figure 8, one particular usage of the system is to place a series of units linked together using wireless connections 500a-500c (e.g. wireless PCMCIA cards) along a series of points from the originating source 100a of the signal to a final connection point 100d which may be connected to the global computer or local network. The final connection point 100d may access the global computer network using any of the communications methodologies mentioned above.

A further embodiment of the invention includes motion detection capabilities. The detection algorithms are based on changes in the video scene being received by the system 100. The invention detects the instantaneous change in video frames which indicates movement in the scene. This embodiment of the invention is highly suitable for security and surveillance applications or bandwidth-constrained broadcasting.
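A frame-differencing detector of the kind described can be sketched as follows, assuming 8-bit grayscale frames. The two threshold parameters, the per-pixel difference and the fraction of the frame that must change, are illustrative; the disclosure does not specify the algorithm's constants.

```c
#include <stdlib.h>

/* Compare two consecutive frames. A pixel counts as "changed" if it
 * differs by more than pixel_thresh; motion is flagged if the
 * percentage of changed pixels exceeds area_thresh_pct. Returns 1 if
 * motion is detected, 0 otherwise. */
int motion_detected(const unsigned char *prev, const unsigned char *cur,
                    int n_pixels, int pixel_thresh, int area_thresh_pct)
{
    int changed = 0;
    for (int i = 0; i < n_pixels; i++)
        if (abs((int)cur[i] - (int)prev[i]) > pixel_thresh)
            changed++;
    /* Integer form of: changed / n_pixels > area_thresh_pct / 100 */
    return (100 * changed) > (area_thresh_pct * n_pixels);
}
```

Requiring a minimum changed area, rather than any single changed pixel, keeps sensor noise from triggering false alarms.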

The invention may also be configured to provide a means of automatically notifying operators and/or remote third parties when operator selected events have been triggered.

Such events include motion within the environment, sound within the environment, temperature changes (provided thermal input capabilities are included), changes in the unit's basic status, such as power supply, and movement of the unit (possibly due to tampering).

The methods of notification are widespread and based on the message receiving application software. Such applications receive the notification from the system and forward the notification in formats such as e-mail, pagers, automatic telephone calling, and faxing. Depending on the triggered event, the information which is sent to the interested party may include a picture of the environment, an audio clip of the sounds, status information about the operational mode of the unit, the unit's location, temperature data, and the time and date of the triggered event.

The system provides a method for receiving video, audio, and text information from other units. The system receives the data and displays the video and text information and plays back the audio portions of the data.

Due to the countless applications of the system, the system is designed to be manufactured in several enclosures and configurations. One configuration includes a portable, battery-operated system.

Another configuration includes a stationary table top system. The table top system is targeted to the videoconference/video kiosk industry, whereby the unit can be placed on a table or in a kiosk and connected to the global computer network for receiving the data. Viewers can watch and listen to the broadcasts coming from either portable or stationary units.

The system also provides a means of connection to robotic arms and maneuvering apparatuses such that the system can instruct the robot control system to reposition the monitoring video, audio and temperature controls. This allows the robotics systems to alter the viewing direction, angle of view, magnification levels, audio sensory levels and directions.

In addition to the numerous configurations available for the variety of uses of the system, the system may be operated in several modes. The system can switch between any of the modes at any time alternately performing the functions that each mode provides.

In a first mode of operation, the system provides a means for encoding, compressing and transmitting video and audio data in real-time across the global computer network.

The operator can connect the video and audio sources and establish the transmission connection. The apparatus is turned on, the transmission connection is made, and the video/audio data is sent to the host computer for re-broadcasting on the global network.

In a second mode of operation, the system provides a means of storing video and audio content for later transmission. The operator may connect the video and audio source and begin recording the information. The video/audio data is stored in the data storage memory 142 (figure 2), which is included in the system 100. When the operator is ready to transmit the data, a transmission connection can be made and the information is transferred to the host computer for re-broadcasting on the global computer network.

In another mode of operation, the system is placed in a standby monitoring mode for detecting changes in the state of the environment in which the apparatus is placed. This mode allows for the detection of motion, audio, temperature, movement of the apparatus and any of the basic operating states of the apparatus. The monitoring mode triggers the notification system when any of the monitored events occur.

In another mode of operation, the system is placed in a reception mode whereby the unit can receive video, audio, and text information from other apparatuses and can render and display the information on the apparatus' LCD and sound speaker.

The system may also be placed in relay mode. In relay mode, the system receives transmitted data from one system and transmits the information to another system. The originating signals may come from one system which is connected via any of the supported transmission methods across a chain of other systems, as shown in figure 8 (connected to the chain by any supported transmission method), to a final destination apparatus or broadcasting server.

Finally, the system can be placed in a remote operation mode whereby the system can be controlled through the signals sent across the global or local computer network.

All of the operations of the system can be remotely established, including the changing of any operational mode, basic unit configuration, and positional changes via the robotics control system.

Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present invention which is not to be limited except by the claims which follow.

What is claimed is: