Title:
A PATTERN DRIVEN MESSAGE ORIENTED COMPRESSION APPARATUS AND METHOD
Document Type and Number:
WIPO Patent Application WO/2004/100420
Kind Code:
A2
Abstract:
A compression and decompression method and apparatus comprising an at least one data source (100) providing a stream of data to an at least one data destination (150), employing an at least one pattern classifier (110) processing the stream of data of the at least one data source into a single stream of messages and generating an at least one pattern event, a messages encoder (120) and a messages decoder (130) changing an internal state in response to the at least one pattern event (160, 170).

Inventors:
HELFMAN NADAV BINYAMIN (IL)
KEREN GUY (IL)
DROBINSKY ALEX (IL)
Application Number:
PCT/IL2004/000377
Publication Date:
November 18, 2004
Filing Date:
May 06, 2004
Assignee:
VIRTUAL LOCALITY LTD (IL)
HELFMAN NADAV BINYAMIN (IL)
KEREN GUY (IL)
DROBINSKY ALEX (IL)
International Classes:
H03M5/00; H03M7/00; H03M7/30; H04L; (IPC1-7): H04L/
Foreign References:
US5606317A1997-02-25
US6480123B22002-11-12
US6445313B22002-09-03
Other References:
See references of EP 1620951A4
Attorney, Agent or Firm:
Agmon, Jonathan (Advocates & Patent Attorneys Nolton Hous, 14 Shenkar Street Herzliya Pituach, IL)
Claims:
CLAIMS
I/We claim:
1. Within a computerized environment having an at least one central processing unit, a compression or decompression method comprising an at least one data source providing a stream of data to an at least one data destination, employing an at least one pattern classifier processing the stream of data of the at least one data source into a single stream of messages and generating an at least one pattern event, a messages encoder and a messages decoder changing an internal state in response to the at least one pattern event.
2. The method of claim 1 wherein the stream of messages comprises continuous content segments in time, application layer or proximity.
3. The method of claim 1 further comprising the step of matching messages from a store for most recent messages within the encoder internal data structure with strings stored in a strings dictionary.
4. The method of claim 1 further comprising the step of the pattern classifier detecting a pattern event in the data stream.
5. The method of claim 4 wherein the pattern event comprises a silence in session event or an end session event.
6. The method of claim 1 further comprising the step of processing a badly compressed message segments store within the encoder or decoder into new dictionary strings in response to a silence in session event.
7. The method of claim 3 wherein the step of matching comprises the matching of a hash value of a fixed size prefix within the matched context.
8. Within a computerized environment having an at least one central processing unit, a compression or decompression apparatus comprising an at least one data source for providing a stream of data to an at least one data destination; an at least one pattern classifier for processing the stream of data of the at least one data source into a single stream of messages and for generating an at least one pattern event, a messages encoder and a messages decoder for changing an internal state in response to the at least one pattern event.
9. The apparatus of claim 8 wherein the stream of messages comprises continuous content segments in time, application layer or proximity.
10. The apparatus of claim 8 wherein the encoder internal data structure comprises an at least one strings dictionary, and a store for most recent messages comprising an at least one most recent message.
11. The apparatus of claim 10 wherein the at least one most recent message is matched with an at least one string within the at least one strings dictionary.
12. The apparatus of claim 8 further comprising a pattern classifier for detecting a pattern event in the data stream.
13. The apparatus of claim 12 wherein a pattern event is a silence in session event or an end session event.
14. The apparatus of claim 8 wherein the encoder or decoder further comprises a badly compressed message segments store for processing an at least one badly compressed message into an at least one new dictionary string in response to a silence in session event.
Description:
A PATTERN DRIVEN MESSAGE ORIENTED COMPRESSION APPARATUS AND METHOD

BACKGROUND OF THE INVENTION

FIELD OF THE INVENTION
The present invention relates generally to lossless data compression.

More particularly, the present invention relates to repeating compression tasks on data generated by similar sources, and to possible implementations of universal data compression that utilize the attributes of such sources.

DISCUSSION OF THE RELATED ART
The performance of data compression depends on what can be determined about the characteristics of the source. When given an incoming data stream, its characteristics can be used to devise a model for better prediction of forthcoming strings. If such characteristics are determined prior to compression, a priori knowledge of the source characteristics can be obtained, providing a significant advantage and allowing for greater efficiency in the compression algorithm. However, in most cases a priori knowledge of the source characteristics cannot be determined. This often occurs in real-world applications where the properties of a source are dynamic. In particular, the symbol probability distribution of a source usually changes along the time axis. Some substitutional compression algorithms can be used to compress such data, since they do not require a priori knowledge of the source properties. Such algorithms can adaptively learn the source characteristics on the fly during the coding phase. Moreover, the decoder can regenerate the source characteristics during decoding, so that the characteristics are not required to be transmitted from the encoder to the decoder. These compression algorithms can be applied to universal data content and are sometimes called universal data compression algorithms.

The LZ compression algorithm is a universal compression algorithm that is based on substitutional compression. The main reason the LZ compression algorithm works universally is the adaptability of the dictionary to the incoming stream. In general, the LZ compression algorithm processes the input data stream and adaptively constructs two identical copies of a dictionary, one at the encoder and one at the decoder. Without explicit transmission of the dictionary, this building process is performed during the coding and decoding of the stream, and the dictionary is continually updated to adapt to the input stream.
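
As a rough illustration of this adaptive behaviour (not part of the patent text), the following LZ78-style sketch shows how an encoder and a decoder can build identical dictionaries from the stream alone, so the dictionary itself never has to be transmitted; the function names and the token layout are illustrative assumptions.

```python
def lz78_encode(data: bytes):
    """LZ78-style encoding: emit (phrase index, next byte) pairs while the
    dictionary grows adaptively from the input itself."""
    table = {b"": 0}                      # phrase -> index
    phrase, tokens = b"", []
    for value in data:
        candidate = phrase + bytes([value])
        if candidate in table:
            phrase = candidate            # keep extending the current match
            continue
        tokens.append((table[phrase], value))
        table[candidate] = len(table)     # adapt: learn the new phrase
        phrase = b""
    if phrase:                            # flush a trailing, already-known phrase
        tokens.append((table[phrase], None))
    return tokens


def lz78_decode(tokens):
    """The decoder rebuilds the very same dictionary while decoding, so nothing
    but the tokens needs to cross the channel."""
    phrases = [b""]                       # index -> phrase (mirror of the encoder table)
    out = bytearray()
    for index, value in tokens:
        phrase = phrases[index] if value is None else phrases[index] + bytes([value])
        out.extend(phrase)
        if value is not None:
            phrases.append(phrase)        # same update rule as the encoder
    return bytes(out)


assert lz78_decode(lz78_encode(b"abababababab")) == b"abababababab"
```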

Matching procedures using this adapted dictionary are expected to give the desirable compression result, since the dictionary reflects the incoming statistics quite accurately. Many applications which may benefit from data compression have repeating usage patterns. Examples of such applications are a client/server application working session which repeats frequently, or a periodic remote backup process. There is therefore a need for exploiting a priori knowledge about the source data.

SUMMARY OF THE PRESENT INVENTION
The present invention regards a compression apparatus that includes a usage pattern classifier, an encoder, a decoder and a signaling mechanism of classified usage patterns between the encoder and the decoder. The input stream is delivered to the encoder as messages, which are detected by the classifier. The encoder matches each message with (a) a dictionary of previously detected streams and (b) a buffer of the N most recent messages. This matching results in: (a) detection of new repeating strings; (b) a collection of "badly compressed message segments" for future "off-line" analysis; and (c) encoded messages in which content is replaced with a token which includes (a) references to existing strings in the dictionary with the length used from the beginning of the stream or (b) a location in the N most recent messages buffer. A location in the N most recent messages buffer is also considered as the declaration of a new string in the dictionary. Offline learning is triggered by a break in the transmitted data detected by the classifier. A pause in the current session results in an "internal session redundancy analysis": matching all "badly compressed message" segments from the current session, resulting in (a) new strings in the dictionary and (b) a remainder of message segments saved for future "cross session" redundancy analysis. During the process the dictionary is "aged": strings are removed to make room for new items using an "aging policy" algorithm. The end of the current session results in a cross session redundancy analysis which resolves the remainder segments left from the internal session redundancy analysis process. Several versions of the data structure may co-exist to enable analysis in the background. In this case an identifier of the data structure version used is added to the format of the encoded message. An actual realization of the mechanism may also include the exchange of state structure signatures between the encoder and the decoder, and data structure disk persistency for initialization and recovery.
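
The two kinds of ESCAPE tokens described above, plus a literal fallback, could be represented as follows. This is only a sketch of the token layout; the class names and exact fields are assumptions made for illustration, not definitions from the patent.

```python
from dataclasses import dataclass
from typing import Union


@dataclass
class DictionaryRef:
    """ESCAPE token referencing an existing string in the shared dictionary."""
    string_id: int      # identifier of the dictionary string
    length: int         # number of bytes of that string actually used


@dataclass
class RecentMessageRef:
    """ESCAPE token referencing the buffer of the N most recent messages; emitting
    it also implicitly declares the referenced span as a new dictionary string."""
    relative_message_id: int
    offset: int
    length: int


@dataclass
class Literal:
    """Raw bytes that could not be matched (candidates for off-line analysis)."""
    data: bytes


Token = Union[DictionaryRef, RecentMessageRef, Literal]
```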

In accordance with one aspect of the present invention there is provided a compression or decompression apparatus comprising an at least one data source for providing a stream of data to an at least one data destination; an at least one pattern classifier for processing the stream of data of the at least one data source into a single stream of messages and for generating an at least one pattern event, a messages encoder and a messages decoder for changing an internal state in response to the at least one pattern event. The stream of messages comprises continuous content segments in time, application layer or proximity. The encoder internal data structure comprises an at least one strings dictionary, and a store for most recent messages comprising an at least one most recent message. The at least one most recent message is matched with an at least one string within the at least one strings dictionary. The apparatus can further comprise a pattern classifier for detecting a pattern event in the data stream. The pattern event is a silence in session event or an end session event. The encoder or decoder further comprises a badly compressed message segments store for processing an at least one badly compressed message into an at least one new dictionary string in response to a silence in session event.

In accordance with another aspect of the present invention there is provided a compression or decompression method comprising an at least one data source providing a stream of data to an at least one data destination, employing an at least one pattern classifier processing the stream of data of the at least one data source into a single stream of messages and generating an at least one pattern event, a messages encoder and a messages decoder changing an internal state in response to the at least one pattern event. The method can further comprise the step of matching messages from a store for most recent messages within the encoder internal data structure with strings stored in a strings dictionary. The method can further comprise the step of the pattern classifier detecting a pattern event in the data stream. The method can further comprise the step of processing a badly compressed message segments store within the encoder or decoder into new dictionary strings in response to a silence in session event. The step of matching can comprise the matching of a hash value of a fixed size prefix within the matched context.

BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a block diagram that illustrates a data compression system with a pattern classifier, an encoder, a decoder and a classified pattern events signaling mechanism, in accordance with a preferred embodiment of the present invention;
Fig. 2 is a block diagram that illustrates the internal structure of the encoder and operation scenarios, in accordance with a preferred embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

DEFINITIONS
Message: a continuous content segment with time, application layer or other proximity, as detected by the classifier. A message is the basic unit of processing and is encoded and decoded as an atomic operation.

Session: a stream of Messages with time, application layer or other proximity, as detected by the classifier, all generated by the same collection of data sources. A session is associated with begin session and end session events.

Session silent: a time, application layer or other pause in the stream of messages in a session.

The present invention provides a method and apparatus which adds a prior assumption of the existence of usage patterns to universal data compression methods. Two identical data structures are maintained in the encoder and the decoder, based on the content of the stream, with the addition of signals of detected patterns sent from the encoder to the decoder. The history covered by the data structures of the current invention extends from the initial usage of the application. The compression ratio achieved may be on the order of one to three orders of magnitude larger than in common universal data compression, while the present mechanism is highly efficient and suitable for real-time communication.

Referring to Fig. 1, a collection of similar or logically related data sources 100 produces data streams, which are processed by a pattern classifier 110 into a stream of messages. Each message is processed by the Encoder 120 as an atomic unit, providing a stream of encoded messages. The Decoder 130 decodes each message, providing the original stream of messages, which is processed by the messages to streams unit 140 into streams flowing to the associated collection of destinations 150. The pattern classifier 110 also detects the Session Silent and End Session events, which are signaled both to the Encoder 120 and the Decoder 130, triggering a modification of the encoder/decoder mutual data structure, named context in this text. The constituent components of the present invention, as described in this figure and the figure which follows, operate within a computerized system having one or more central processing units. Persons skilled in the art will appreciate that the present invention can be operated and applied in many computerized systems, including such systems associated with personal and business computers, network environments, and the like.

Referring now to Fig. 2, the internal structure of the Encoder is described. The Encoder operates as follows: (a) upon activation of the system (0), the most recent context is loaded into a memory structure 220 from a contexts store 260. The context includes a Dictionary of strings 230 and a store of message segments 250. The dictionary 230 is indexed by two methods: (a) a "fingerprint", a hash value calculated on a fixed size prefix of the string by any (efficient) hashing method such as UHASH; and (b) an identifier. The identifier can be a sequential number, a randomly generated identifier or any other like identifier suitable for indexing a dictionary. In actual realizations of the invention an ID (and context signature) may be associated with each Context data structure, to be matched with the remote instance for synchronization validation.
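
One possible in-memory shape for such a context is sketched below. The prefix length, the use of BLAKE2 as a stand-in for UHASH, and all names are assumptions made for illustration only.

```python
import hashlib

PREFIX_LEN = 16     # assumed fixed prefix size used for fingerprinting


def fingerprint(data: bytes, pos: int = 0) -> int:
    """Hash of the fixed-size prefix starting at `pos` (stand-in for UHASH)."""
    prefix = data[pos:pos + PREFIX_LEN]
    return int.from_bytes(hashlib.blake2b(prefix, digest_size=8).digest(), "big")


class Context:
    """Mutual encoder/decoder state: a strings dictionary plus a segments store."""

    def __init__(self, context_id: int = 0):
        self.context_id = context_id      # version identifier, usable for synchronization validation
        self.strings = {}                 # string identifier -> bytes
        self.by_fingerprint = {}          # prefix fingerprint -> list of string identifiers
        self.segment_store = []           # badly compressed segments awaiting off-line analysis
        self.next_id = 0                  # sequential identifiers for new strings

    def add_string(self, data: bytes) -> int:
        string_id = self.next_id
        self.next_id += 1
        self.strings[string_id] = data
        self.by_fingerprint.setdefault(fingerprint(data), []).append(string_id)
        return string_id
```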

When a message is handed to the encoder (1.0) it is stored in the current message data structure. For every location in the message, a fingerprint value is calculated in a manner identical to the dictionary string fingerprints described above. The fingerprint values are used to query (1.1) the Dictionary 230 and, optionally for this message, (1.2) the N most recent messages store 215, using a one to many cross redundancy analyzer 210. Any match with the dictionary 230 is used as an ESCAPE code (string id, length) token in the encoded message. Any match with a recent message of at least a defined minimal length is used as an ESCAPE code (relative message id in the store, location in message, length) token.

In addition, the matched segment is added to the dictionary (1.2.1) on both sides.
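
A highly simplified rendering of this per-message matching pass is given below. It builds on the Context and token classes sketched above; the minimal match length and the greedy matching strategy are assumptions, and the query of the recent-messages store is only indicated by a comment.

```python
MIN_MATCH = 32      # assumed minimal useful match length


def common_prefix_length(a: bytes, b: bytes) -> int:
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n


def encode_message(message: bytes, context: Context, recent: list) -> list:
    """Scan the message position by position, preferring dictionary matches over
    literals; a real encoder would also consult the N most recent messages."""
    tokens, pos = [], 0
    while pos < len(message):
        fp = fingerprint(message, pos)
        best = None
        for string_id in context.by_fingerprint.get(fp, []):
            n = common_prefix_length(message[pos:], context.strings[string_id])
            if n >= MIN_MATCH and (best is None or n > best[1]):
                best = (string_id, n)
        if best is not None:
            tokens.append(DictionaryRef(*best))     # ESCAPE (string id, length)
            pos += best[1]
            continue
        # A match against `recent` messages would be emitted here as a
        # RecentMessageRef, and the matched segment added to the dictionary.
        tokens.append(Literal(message[pos:pos + 1]))
        pos += 1
    return tokens
```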

Badly compressed segments larger than a given threshold (1.4) are added to a message segments store data structure. A session silent event (2.0) signaled by the pattern classifier activates a many to many cross redundancy analyzer 240. This analyzer is handed the current active context (2.1) and, after the analysis, replaces said active context with a new active context (2.2), while also saving the new context to the contexts store (2.4). An end session event (3.0) signaled by the pattern classifier activates the many to many cross redundancy analyzer unit to read the previous (3.1) and current (3.2) contexts and to resolve items in the message segments store into a new context (3.3) with disk persistency (3.4).

Both the one to many 210 and the many to many 240 analyzers operate by mapping each fingerprint value into a list of its instances.
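
The fingerprint-to-instances mapping, applied to the segments store, might look like the sketch below. The promotion rule (a segment repeating at least twice becomes a dictionary string) follows the embodiment described later; everything else is an illustrative assumption.

```python
from collections import defaultdict


def analyze_segment_store(context: Context) -> None:
    """Many-to-many cross redundancy analysis: group stored segments by the
    fingerprint of their prefix and promote repeating ones into the dictionary."""
    instances = defaultdict(list)            # fingerprint -> list of segment instances
    for segment in context.segment_store:
        instances[fingerprint(segment)].append(segment)
    leftovers = []
    for group in instances.values():
        if len(group) >= 2:                  # repeats -> new dictionary string on both sides
            context.add_string(group[0])
        else:
            leftovers.extend(group)          # kept for later cross-session analysis
    context.segment_store = leftovers
```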

The process in the Decoder is similar, in the opposite direction. The Decoder holds the same context data structures, which are used to resolve tokens back into content segments using methods known to persons skilled in the related art.
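
A mirror-image decoding step, again only a sketch over the illustrative token types and Context above:

```python
def decode_message(tokens: list, context: Context, recent: list) -> bytes:
    """Resolve tokens back into content using the same context as the encoder."""
    out = bytearray()
    for token in tokens:
        if isinstance(token, DictionaryRef):
            out.extend(context.strings[token.string_id][:token.length])
        elif isinstance(token, RecentMessageRef):
            span = recent[token.relative_message_id][token.offset:token.offset + token.length]
            out.extend(span)
            context.add_string(span)         # same implicit declaration as on the encoder side
        else:                                # Literal
            out.extend(token.data)
    return bytes(out)
```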

One embodiment of the present invention is provided as follows. A data Source of a web-based application, running on a computer system with one CPU, generates replies (in response to requests from a web client application). The stream of communication has the following pattern: packets are transmitted contiguously with a delay of less than 50 milliseconds between packets, until the content the web-based application "wishes" to transmit to the Destination (the web client) is entirely transmitted. A pattern classifier module running on the network gateway computer captures the stream from the Source to the Destination via a method such as redirecting the traffic to a local listening TCP port using DNAT (Destination Network Address Translation), which is a well known networking method. After a period of more than 50 milliseconds from the previous packet, the classifier receives the content of a new packet of the stream and starts buffering the content until the flow of packets stops for more than 50 milliseconds. Then the content is packed, together with meta-information regarding the original stream, into a message data structure and delivered to the Encoder. The Encoder matches the message with a dictionary of previously detected strings using a method such as comparing signatures of fixed length segments in the message. Then the encoder matches the message with a buffer of the N previously transmitted messages for repeating strings. Any segment whose size is more than 10% of the message and which is not covered by the dictionary or previous messages is added into a segments store. Every matched string is replaced with an escape character and an index value. The encoded message is transmitted to the other side and handled by the Decoder, which is running on the gateway computer to the Destination's network. The Decoder replaces every escape character and index with the original string and transmits the content to the destination using a local TCP connection which matches the meta-information in the Message. In addition, it adds segments which are larger than 10% of the message into a segments store. This process repeats for every reply message from the web-based application to the web client. When the user "takes a break" of more than 120 seconds and stops generating new requests, the web-based application eventually also stops generating new replies. A software timer in the classifier, which is reset and retriggered to generate an event within 120 seconds after generating every message, eventually triggers a "stream-silence" event. The event is delivered to both the Encoder and the Decoder. In reaction to the event, both the Encoder and the Decoder analyze the content of the segments store. Each string which is larger than 32 bytes and repeats at least twice is added to the strings dictionary. Having a new version of the strings dictionary, the internal state of both the encoder and the decoder is changed in reaction to the stream-silence event.
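
The timing rules of this embodiment (a gap of more than 50 milliseconds closes a message, 120 seconds of inactivity raises a stream-silence event) could be captured in a classifier such as the sketch below; the class, callbacks and timer model are assumptions made for illustration.

```python
MESSAGE_GAP = 0.05      # seconds: > 50 ms without a packet closes the current message
SILENCE_GAP = 120.0     # seconds: > 120 s without a message signals stream silence


class PatternClassifier:
    """Groups captured packets into messages and raises a stream-silence event."""

    def __init__(self, on_message, on_silence, now: float = 0.0):
        self.on_message = on_message    # callback: bytes -> None (hands a message to the encoder)
        self.on_silence = on_silence    # callback: () -> None (delivered to encoder and decoder)
        self.buffer = bytearray()
        self.last_packet = now
        self.last_message = now

    def packet_arrived(self, payload: bytes, now: float) -> None:
        if self.buffer and now - self.last_packet > MESSAGE_GAP:
            self._flush(now)            # the previous message ended at the > 50 ms gap
        self.buffer.extend(payload)
        self.last_packet = now

    def tick(self, now: float) -> None:
        """Called periodically by a software timer to detect message gaps and silence."""
        if self.buffer and now - self.last_packet > MESSAGE_GAP:
            self._flush(now)
        if now - self.last_message > SILENCE_GAP:
            self.on_silence()           # triggers analysis of the segments store on both sides
            self.last_message = now

    def _flush(self, now: float) -> None:
        self.on_message(bytes(self.buffer))
        self.buffer = bytearray()
        self.last_message = now
```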