Title:
INTERACTIVE TELEVISION SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2001/013206
Kind Code:
A1
Abstract:
A system and method are provided for generating, transmitting and presenting navigable video frames in an interactive broadcast television environment. Authoring tools (110) are utilized to define the content and visual appearance of a multiplicity of linked scenes, and to create a program structure. The scenes thus generated are encoded as video frames (140) and stored in a broadcast file system for cyclical transmission by a transmission apparatus (165). A user terminal (200), coupled to a television monitor (210), is configured to receive the cyclically transmitted frames via a broadcast channel and is additionally configured to receive and process user input, which may be advantageously provided through rudimentary controls located on a conventional remote control unit (220).

Inventors:
SEVERIN LARS
TRAMMEL TROY
FISHER JEFF
Application Number:
PCT/US2000/022006
Publication Date:
February 22, 2001
Filing Date:
August 11, 2000
Assignee:
AT HOME CORP (US)
International Classes:
H04N7/16; H04N5/445; (IPC1-7): G06F3/00; G06F13/00
Foreign References:
US6047317A2000-04-04
US5682511A1997-10-28
Attorney, Agent or Firm:
Henkhaus, John (CA, US)
Claims:
CLAIMS
What is claimed is:
1. An interactive television system, comprising: authoring tools for generating a multiplicity of linked scenes and describing a structure of the multiplicity of linked scenes; an interactive channel engine for converting the scenes and the structure into a collection of video frames and a navigation map, the video frames and the navigation map being stored in a memory for subsequent cyclical transmission over a transmission medium; and a user terminal for receiving the cyclically transmitted frames, the user terminal being configured to select a frame for display based on a user input and the navigation map.
2. The interactive television system of claim 1, wherein the authoring tools comprise a scene editor for defining layout and content of individual scenes, and a program editor for assembling the individual scenes into a program.
3. The interactive television system of claim 2 wherein at least a portion of the scene content is retrieved from a remote source specified in the scene editor.
4. The interactive television system of claim 3 wherein the remote source is accessible via the Internet.
5. The interactive television system of claim 1, wherein the transmission medium comprises a hybrid fiber-coax (HFC) network.
6. The interactive television system of claim 1, wherein the video frames generated by the channel engine are encoded in MPEG format.
7. The interactive television system of claim 1, wherein the memory comprises a broadcast file system.
8. A user terminal for an interactive television system, comprising: a receiver for capturing video frames cyclically transmitted by a transmission apparatus over a broadcast channel; a user input interface for receiving a user input; an interactive television application, coupled to the receiver and to the user input interface, for selecting a video frame for display from the cyclically transmitted video frames based on the user input and a navigation map; and a display interface for outputting, in a predefined format, the selected video frame to a display device.
9. The user terminal of claim 8, wherein the selected video frame includes a set of navigable hotspots, each hotspot corresponding to a link to another video frame.
10. The user terminal of claim 8, wherein the user input is supplied by selectively engaging keys on a remote control device coupled to the user input interface through a wireless communication channel.
11. The user terminal of claim 8, wherein the interactive television application is configured to default to a passive viewing mode when no user input is received for a predetermined period of time.
12. The user terminal of claim 8, wherein the captured video frames comprise an MPEG-encoded data stream, and wherein the selected video frame is directed as output to the display device in NTSC format.
13. A system for providing interactive video content to a remote user terminal, comprising: authoring tools for generating a multiplicity of linked scenes and describing a structure of the multiplicity of linked scenes; and a channel engine for converting the scenes and the structure into a set of video frames and a navigation map, the video frames being stored in a frame memory for cyclical transmission over a transmission medium; wherein the remote user terminal is configured to select a video frame for display based on received user input and the navigation map.
14. The system of claim 13, wherein the authoring tools include a scene editor for defining content and layout of the multiplicity of scenes.
15. The system of claim 14, wherein at least a portion of the content is retrieved from a remote source.
16. The system of claim 13, wherein the frame memory is located at a transmission facility remote from the channel engine.
17. The system of claim 16, wherein the frame memory comprises a data carousel.
18. A method for interactively providing information over a broadcast network, comprising the steps of: generating a multiplicity of linked scenes and describing a structure of the multiplicity of linked scenes; converting the scenes and the structure into a sequenced set of video frames and a navigation map; cyclically transmitting the sequenced set of video frames and the navigation map to a user terminal; receiving a user input at the user terminal comprising a navigation command; and displaying a selected one of the set of video frames based on the navigation map and the user input.
19. The method of claim 18, wherein the step of cyclically transmitting comprises transmitting the video frames and the navigation map to a plurality of user terminals over a broadcast network.
20. The method of claim 18, wherein the user input is transmitted by a remote control device responsive to a key being engaged.
21. The method of claim 18, wherein the step of generating comprises defining content for each of the linked scenes.
22. The method of claim 21, wherein the step of generating comprises designating a remote source of the content.
Description:
INTERACTIVE TELEVISION SYSTEM AND METHOD
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application No. 60/148,476, entitled "Interactive Television System and Method", filed on August 12, 1999, and U.S. Patent Application No. 09/461,513, entitled "Interactive Television System and Method", filed on December 14, 1999.

BACKGROUND
1. Field of the Invention
The present invention relates generally to broadcast television systems, and more particularly to an interactive television system and method for generating, transmitting, and presenting navigable video frames in a broadcast television environment.

2. Background of the Prior Art
Conventionally, television viewing has been a substantially passive experience, wherein the viewer has virtually no control (beyond the ability to select specific channels) over the information being viewed. In recent years, there has been great commercial interest in developing interactive television systems which give the viewer greater degrees of control over the information presented. Such systems advantageously enable tailoring of the information presented according to the preferences and needs of the viewer.

A specific focus of activity in this area has been the development of interactive television systems which allow the viewer to access the immense amount of information and services available on the World Wide Web (the "Web"). One well-known vendor of such systems is WebTV Networks, Inc. of Mountain View, California. These systems typically include an embedded web browser enabling the viewer to view selected web pages. In this manner, the viewer is presented with information of the viewer's choosing and may utilize various services, e.g., home shopping, in connection with the web pages.

However, the use and commercial success of interactive television systems has been historically limited by a number of factors. First, most prior art systems require the viewer to familiarize himself or herself with a new and non-intuitive user interface. Television viewers of limited technical sophistication may easily be dissuaded from purchasing interactive systems if they are forced to learn specialized and/or complex commands to successfully operate the system.

A related problem of prior art interactive television devices is latency, or the delay associated with delivery of the selected information to the user. Interactive television users may be discouraged from future use if they are forced to wait for extended periods of time while the selected information is transmitted from the information provider, e.g., a web page server, to the user terminal. This problem may be exacerbated when the information is being transmitted over a highly trafficked and/or relatively low bandwidth network. In order to appeal to most users, an interactive television system should provide a "television-like" experience, wherein the selected information is presented to the user within a short (<2 seconds) time after the user makes a selection.

Another factor limiting the acceptance of prior art interactive television systems is that most systems of this description require a return data path to send data from the viewer to the information provider. The return data path is typically implemented as a modem/telephone line connection or as an upstream channel in a cable television distribution network. Either of these methods necessitates adding complex and expensive circuitry to the user terminal (e.g., a set-top box), and makes setup and configuration processes more difficult. Further, such systems are rendered inoperative if the return data path is unavailable. For example, if a single residential telephone line is employed for both voice communication and as a return data path, the associated interactive television system will be inoperative while the telephone line is being used to place and receive telephone calls.

Finally, many interactive television systems require the information provider, e.g., a cable operator, to purchase and install special equipment. Cable operators may be understandably reluctant to invest substantial sums of money in the special equipment necessary to provide interactive television services, particularly where anticipated demand for such services is unknown. Additionally, prior art systems have tended to not be highly scalable, i.e., system sizes and configurations are not widely adjustable in order to accommodate a lesser or greater number of receiving devices.

Further, if the system is managed locally, personnel must be trained in system operation (or experienced personnel must be hired), thereby significantly raising the cost to the cable operator.

In view of the foregoing discussion, there is a need in the art for an interactive television system which utilizes a simple and familiar interface, does not require a return data path, and can be implemented by a cable operator or similar information provider without having to install additional equipment or incur other costs.

SUMMARY OF THE INVENTION
The present invention comprises an interactive television system wherein collections of navigable video frames are cyclically transmitted to a user terminal over a broadcast network. The user terminal is additionally configured to receive and process user input, which is preferably provided through a conventional remote control device. The user terminal selects a video frame for display based on the user input and a navigation map transmitted with the video frames.

The content, appearance and navigation structures of the video frames are defined using a set of software tools. The software tools preferably include a scene-authoring tool for creating the layout and content of individual scenes. A scene may contain one or more embedded links, which point to selected others of the scenes. The scene-authoring tool also enables the author to specify content sources for the scene and set the frequency at which the content is updated. For example, a scene that presents weather information may be configured to retrieve the information at hourly intervals from a National Weather Service web site. In addition to the embedded links, the author may also embed various applets in the scene in order to present the information in a more visually appealing manner. A scene inventory, comprising a multiplicity of previously authored scenes, may be stored in a database for later retrieval.
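The scene model just described (a scene with layout, embedded links pointing to other scenes, an optional remote content source, and an update frequency) can be sketched as a simple data structure. This is an illustrative sketch only; the class and field names below are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Link:
    label: str           # text shown in the scene's hotspot
    target_scene: str    # identifier of the pointed-to scene

@dataclass
class Scene:
    scene_id: str
    layout: str                          # e.g., a layout template name
    links: List[Link] = field(default_factory=list)
    content_url: Optional[str] = None    # remote source for dynamic content
    update_interval_s: int = 3600        # refresh hourly by default

# A weather scene configured to refresh hourly from a (hypothetical) remote source
weather = Scene("weather", "detail",
                content_url="https://example.org/forecast",
                update_interval_s=3600)

# A menu scene whose hotspots link to other scenes
menu = Scene("main_menu", "menu",
             links=[Link("Weather", "weather"), Link("News", "news")])
```

A scene inventory, as described above, would then simply be a collection of such objects indexed by `scene_id` in a database.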

The software tools additionally include a program design tool for assembling individual scenes into a program comprising a multiplicity of linked scenes. The program design tool is employed to select scenes from the scene inventory, to set a duration for each scene (for use when the system switches into a passive slideshow mode), and to specify a structural framework for the program (e.g., a menu hierarchy). Program information generated by the program design tool may then be stored in a database.

An interactive channel engine is configured to read the scene and program information stored in the database and generate therefrom a sequenced collection of digitally encoded video frames and a navigation map describing a structural relationship between and among the various video frames. In one mode of the invention, the video frames are encoded in MPEG format. It is to be appreciated that the interactive channel engine periodically regenerates the video frames to reflect updated content and/or program scheduling.

The collection of video frames is then stored in a data carousel for cyclical transmission to the user terminal over a conventional broadcast network (which may comprise, for example, a hybrid optical fiber-coax network of the type commonly used by cable television systems to deliver conventional non-interactive content). A user terminal, such as a set top box, receives and processes the cyclically transmitted video frames. The user terminal has associated therewith a user input device through which the user navigates menu-type links displayed in the video frames. The receiving device identifies a frame for display from the cyclically transmitted sequence of frames based on the user input and the navigation map. For example, the receiving device may initially identify for display a frame comprising a first menu having links to different topics (e.g., news, weather, traffic, and entertainment). Each of the links points to a frame having content related to the corresponding topic.

The user navigates through the menu links by engaging the appropriate keys or buttons located on the remote control device, such as the up/down and left/right directional keys.

Upon selection of a menu link, the receiving device is configured to identify which frame is pointed to by the selected menu link, and to cause the pointed-to frame to be displayed to the viewer upon the subsequent transmission of the frame. The number and size of the frames stored in the carousel are optimized in view of the available transmission bandwidth in order to minimize latency. In other words, the cycle period (the time it takes for transmission of a complete sequence of the collected frames) is sufficiently short so that the delay between selection of a menu link and display of the associated video frame is substantially imperceptible. If the user terminal does not receive any user input for a predetermined time period, the user terminal may switch to a passive or slideshow mode, wherein designated frames are cyclically presented to the viewer.
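The cycle-period constraint described above lends itself to back-of-the-envelope arithmetic: the worst-case wait for a selected frame is one full carousel cycle, which is the total number of bits in the cycle divided by the channel bandwidth. The frame sizes and bandwidth below are illustrative assumptions, not figures from the patent.

```python
def cycle_period_s(frame_count, avg_frame_bytes, bandwidth_bps):
    """Time to transmit one complete carousel cycle, in seconds."""
    total_bits = frame_count * avg_frame_bytes * 8
    return total_bits / bandwidth_bps

# Assuming, say, 100 MPEG I-frames of ~60 KB each on a 27 Mbit/s channel:
period = cycle_period_s(100, 60_000, 27_000_000)   # ≈ 1.78 s
```

Under these assumed numbers the worst-case delay stays below the roughly two-second "television-like" threshold mentioned earlier; doubling the frame count would push the cycle past it, which is why the patent notes that frame count and size must be tuned to the available bandwidth.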

The interactive television system may be practiced in connection with a variety of system architectures. According to one example, a regional data center serves several transmission facilities (e.g., cable headends) with video frame feeds (each headend having a different interactive channel engine, located at the regional data center, associated therewith). This configuration advantageously enables central management of plural headends and relieves individual cable operators of the need to manage their own systems.

The present invention thus provides an interactive television experience without requiring the implementation of a data return path from the viewer to the information provider. The invention also enables the viewer to navigate video frames by using a simple and familiar user interface.

Other advantages of the invention will occur to those of ordinary skill in the art upon review of the following detailed description of preferred embodiments and the accompanying figures.

BRIEF DESCRIPTION OF THE FIGURES
In the accompanying drawings:
FIG. 1 is a block diagram depicting an exemplary architecture of an interactive television system;
FIG. 2 is a block diagram depicting components of a set of authoring tools;
FIG. 3 illustrates an exemplary menu scene created by a scene editor of FIG. 2;
FIG. 4 illustrates an exemplary detail scene created by the FIG. 2 scene editor;
FIG. 5 is a block diagram depicting the operation of an interactive channel engine of FIG. 1;
FIG. 6 is a flowchart depicting steps of a method for generating a set of frames and a navigation map;
FIG. 7 is a block diagram of a set top box of FIG. 1;
FIG. 8 illustrates a set of keys of a remote control device used for navigating through the frames;
FIG. 9 is a flowchart generally illustrating the operation of an interactive television application resident in the set top box, wherein the interactive application executes an event loop;
FIG. 10 is a flowchart illustrating a procedure for handling a timeout event;
FIG. 11 is a flowchart illustrating a procedure for handling a new navigation map event; and
FIG. 12 is a flowchart illustrating a procedure for handling a key press event.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
FIG. 1 depicts an exemplary architecture of one embodiment of an interactive television system 100.

Interactive television system 100 generally includes a set of authoring tools 110 employed to create an interactive television program comprising a multiplicity of scenes, each scene having predefined content. At least a portion of the scenes are provided with one or more embedded links which point to others of the scenes. Scene and program information generated by authoring tools 110 is subsequently stored in a relational database, or depot 120. As will be discussed in further detail below in connection with FIG. 2, depot 120 is advantageously coupled to a network (which may comprise a public or private network) to enable periodic retrieval of program content from remote content sources, such as Internet web sites.

Interactive channel engine (ICE) 130 is configured to read scene and program information from depot 120 and generate therefrom a corresponding collection of video frames 140 and a navigation map 150. The operation of ICE 130 is discussed in further detail below in connection with FIG. 5.

Video frames 140 and associated navigation map 150 (and optionally a set of animated advertisements for display in connection with video frames) are subsequently conveyed to a broadcast facility or headend 160, which is operative to cyclically broadcast video frames 140 and navigation map 150 to a plurality of user terminals over a broadcast medium, such as an HFC network 170. Headend 160 is typically provided with a multiplexer/encoder 165 to combine, for transmission, interactive television program 180 (comprising video frames 140 and navigation map 150) with additional audiovisual content representative of conventional (non-interactive) channel content 190 (e.g., news, business, sports, entertainment and the like). The conventional channel content is typically conveyed to headend 160 through a satellite link or equivalent distribution network.

The multiplexed broadcast signal is received by the user terminal (an exemplary one of a large number of user terminals coupled in communication to the broadcast facility). The user terminal typically comprises a set top box (STB) 200, operative to decode and process the broadcast signal, coupled to a television monitor 210 for presenting images and audio information to the user. STB 200 is further provided with a remote control device 220, coupled to STB 200 via an IR or radio-frequency communication channel. Remote control device 220 is operable, inter alia, to enter user input in connection with interactive television program 180.

The user input is received by STB 200, which selects a frame for display from the cyclically transmitted frames 140 based on the user input and navigation map 150. According to a typical implementation of the invention (which is discussed in further detail below), the user is initially presented with a first menu screen having several content categories, each category being linked to one or more video frames which display information relating to the selected category.

It is to be appreciated that the various components of interactive television system 100, including authoring tools 110, depot 120, ICE 130, and headend 160, may be physically remote from each other and may be operated and managed independently. This feature advantageously allows interactive television system 100 to be implemented in a variety of system architectures and physical configurations in accordance with the needs and preferences of the system's operators and users. It may be desirable, for example, to provide a central management facility where interactive television programs, optionally tailored to specific headends, are generated and then distributed to plural headends over a computer network linking the central management facility to the headends. In this manner, authoring tools 110, depot 120 and ICE 130 may be located at the central management facility and operated and maintained by persons located at the facility, thereby obviating the need for owner/operators of headends to install additional equipment and/or train or hire personnel with knowledge of the use and operation of interactive television system 100.

FIG. 2 depicts exemplary components of authoring tools 110. Authoring tools 110 include a scene editor 230 and a program editor 240, and may additionally include various administrative and management tools such as billing tool 250 and alarm monitor 260 (for monitoring the operation of the interactive television system and alerting system administrators in the event of a fault or problem). The various authoring tools may be integrated into one or more software packages, or accessed through a web browser. Depot 120, which preferably comprises a relational database management system, is operative to store and index information generated by the several authoring tools for subsequent retrieval or review.

Depot 120 is additionally programmed to use stored procedures and scheduled tasks to invoke content agents 270, and to generate event logs and alarms.

Scene editor 230 enables an operator to specify the layout and content of individual scenes, as well as designating links or inter-relationships between two or more scenes. Typically, each scene will comprise several elements, each element containing text, graphics or other indicia. One or more of these elements may be designated as a hotspot or link. Each link has associated therewith a pointer to another scene identified by the operator. The link, when selected by the user, causes the pointed-to scene to be displayed, in a manner similar to selection of a hyperlink in a web browser environment.

Scene editor 230 may be additionally employed to designate content sources. In some implementations of interactive television system 100, it may be desirable to identify remote sources for dynamic information such as news, weather, sports scores, etc., to enable the information presented by the associated scene to be periodically updated without the intervention of the operator. For example, a scene including an element that describes or depicts weather information may retrieve, on a pre-specified periodic basis, weather information from an appropriate website, such as the National Weather Service. The operator enters identification of the remote content source (e.g., a URL of a web page) through the user interface of the scene-authoring tool, and may further enter instructions regarding when and how often the content is to be retrieved from the remote source (e.g., hourly, daily, weekly, etc.). Content agents 270 are created by depot 120 in accordance with the instructions entered by the operator to effect the periodic retrieval of the updated information from the remote content source.
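The content-agent behavior described above (re-fetch from a remote source whenever the configured interval has elapsed) might be sketched as a simple due-time check. The class name, the injected `fetch` callable, and the URL are all illustrative assumptions; a real agent would use an HTTP client and the depot's scheduling machinery.

```python
import time

class ContentAgent:
    """Periodically re-fetches content from a remote source (illustrative sketch)."""

    def __init__(self, source_url, interval_s, fetch):
        self.source_url = source_url
        self.interval_s = interval_s
        self.fetch = fetch        # injected fetcher; stands in for an HTTP client
        self.last_run = None      # no fetch has happened yet
        self.content = None

    def poll(self, now=None):
        """Re-fetch if the update interval has elapsed; return current content."""
        now = time.time() if now is None else now
        if self.last_run is None or now - self.last_run >= self.interval_s:
            self.content = self.fetch(self.source_url)
            self.last_run = now
        return self.content

# An hourly agent: the first poll fetches, a poll 30 minutes later does not.
agent = ContentAgent("https://example.org/wx", 3600, fetch=lambda url: "sunny")
agent.poll(now=0)
agent.poll(now=1800)
```

Passing `now` explicitly keeps the sketch deterministic; in production the depot's stored procedures and scheduled tasks would drive `poll` on a timer.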

Scene editor 230 may also allow an operator to embed various applets (which are subsequently interpreted and executed by STB 200) to enhance visual interest associated with the scene's elements. For example, the author may embed a "shuffling" applet, whereby different pieces of information (e.g., news headlines) are sequentially displayed to the user.

FIGS. 3 and 4 depict examples of scenes created using scene editor 230. FIG. 3 depicts a menu scene 300, which includes several elements containing links to other scenes.

These elements include a set of category links 310 (labeled as "Finance", "Weather", etc.), and a set of sub-category links 320 (labeled as "World", "National", "Local" and "Sports") relating to the selected category ("News") of menu scene 300. Menu scene 300 may also include an element inserted by the operator (not shown in the figure) which displays a banner advertisement containing a link to an animation.

FIG. 4 depicts an exemplary detail scene 400 presenting information relating to the "World" sub-category 320 of FIG. 3. It is noted that each sub-category 320 may be associated with a plurality of detail scenes; selection of the sub-category will typically link to a first of the plurality of detail scenes. Exemplary detail scene 400 may also be provided with a banner advertisement element (not shown), as was described above in connection with FIG. 3.

Program editor 240 is utilized by the operator to select previously created scenes, such as menu scene 300 and associated detail scene 400, from a scene inventory stored in depot 120 and assemble the scenes into an interactive television program. The interactive television program may be tailored, for example, to a particular time of day, or to a geographical region in which the receiving user resides. The operator constructs the interactive television program by indicating, typically through a graphical user interface, the identities and locations of the scenes to be included in the program. Using program editor 240, the operator may also designate certain scenes, such as menu scene 300, to be the default or top-level scene displayed upon initially selecting the interactive program channel on STB 200. Further, program editor 240 allows the operator to include selected ones of the scenes in a "slideshow", which is cyclically displayed by television monitor 210 when no user input has been received for a predetermined period.

FIG. 5 illustrates the operation of ICE 130. ICE 130 is generally configured to read scene and program information (generated using scene editor 230 and program editor 240) stored in depot 120, and to process this information to generate as output a collection of video frames 140, and a navigation map 150 enabling the user to navigate through video frames 140 to effect display of the desired information. Video frames 140 and navigation map 150 may be stored as files on a conventional broadcast file system 540 located at headend 160 for subsequent cyclical transmission over HFC network 170. ICE 130 may additionally copy animated advertisements 530 (typically GIF animations) to broadcast file system 540. ICE 130 may be configured to periodically re-generate interactive television program 180 at pre-specified intervals, or upon the occurrence of particular events (such as a scheduled program change).

FIG. 6 is a flowchart illustrating the process of constructing video frames 140 and navigation map 150. In step 602, ICE 130 initializes all variables and creates empty lists for scenes and advertisements contained in the program.

ICE 130 then executes a loop represented by steps 604, 606, 608 and 610 where, for each menu in the program, ICE 130 generates a scene vector having as its elements all scenes associated with the menu.

Next, ICE 130 executes a loop represented by steps 612-628. In this loop, ICE 130 renders, for each menu in the program, an MPEG frame corresponding to the menu screen, and a set of MPEG frames corresponding to each scene in the menu's scene vector. Rendering of the menu and associated scenes as MPEG frames (preferably MPEG I-frames) may be performed using techniques known in the art.

Next, ICE 130 executes a loop represented by steps 630-636, where an ad list is created consisting of all banner ads contained in the program. ICE 130 then copies in steps 638-648 animated advertisements (typically GIF animation files) corresponding to each banner ad to broadcast file system 540, and creates a link to the associated menu. Finally, ICE 130 constructs a navigation map 150 based on the scene and program information, step 650. Navigation map 150 will include a list of files (each MPEG frame 140 or animation 530 comprising a separate file), together with a set of pointers representative of the inter-relationship of the scenes.
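A navigation map of the kind just described (a file list plus pointers encoding the inter-relationship of scenes) could take roughly the following shape. The file names, command names, and the `resolve` helper are invented for illustration; the patent does not specify the map's encoding.

```python
# Illustrative navigation map: a file list plus pointers between frames.
nav_map = {
    "files": ["menu_news.mpg", "world_1.mpg", "banner.gif"],
    "links": {
        "menu_news.mpg": {"select": "world_1.mpg", "ad": "banner.gif"},
        "world_1.mpg": {"up": "menu_news.mpg"},
    },
}

def resolve(nav_map, current_frame, command):
    """Follow one navigation command from the current frame; None if no link."""
    return nav_map["links"].get(current_frame, {}).get(command)
```

The user terminal would apply `resolve` to each navigation command and then wait for the target file's next appearance in the carousel cycle.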

FIG. 7 is a block diagram showing components of STB 200.

STB 200 includes a demultiplexer/tuner 710, an interactive television (ITV) application 720, a central processing unit (CPU) 730, an operating system (OS) 740, a user input interface 750, graphics chip 760, and MPEG decoder 770.

Demultiplexer/tuner 710 receives a broadcast signal over HFC network 170 from headend 160 comprising a plurality of channels having programming content. At least one of the channels is an interactive television channel comprising cyclically transmitted video frames 140, navigation map 150, and (optionally) animations 530, which collectively represent interactive television program 180. Demultiplexer/tuner 710 extracts a data stream representative of a selected channel from the received signal. Assuming the selected channel is the interactive television channel, the data stream is directed to ITV application 720, which is generally operative to select a video frame 140 and/or an animation 530 based on user input received through user input interface 750 and navigation map 150. The processes embodied in ITV application 720 are described in more detail below in connection with FIGS. 9-11. ITV application 720 may additionally interpret and execute applets or scripts embedded in video frames 140.

In some embodiments of interactive television system 100, ITV application 720 may cache selected video frames 140 (for example, the most recently viewed frame) in order to avoid or minimize latency associated with waiting for the selected frame to be received.

STB 200 further includes CPU 730 and operating system 740 for, respectively, interpreting and executing program instructions (such as those contained in ITV application 720) and controlling the allocation and usage of hardware resources. CPU 730 receives user input from remote control device 220 through user input interface 750, which interprets and processes characteristic codes transmitted by remote control device 220. The use of remote control device 220 to navigate through video frames 140 is discussed in further detail below. Graphics chip 760 and MPEG decoder 770 receive output generated by ITV application 720, representative of the selected video frame, and process the output for display as images on television monitor 210.

FIG. 8 depicts keys located on remote control device 220 that are used for navigating through video frames 140 displayed on television monitor 210. The keys include a set of four directional keys, namely an up key 802, a down key 804, a left key 806, and a right key 808, as well as a select key 810. As is known in the remote control art, depressing a key causes remote control device 220 to generate an infrared or RF signal characteristic of the depressed key. The signal transmitted by remote control device 220 is then received and interpreted by user input interface 750. The user selects information he or she wishes to view by engaging the appropriate keys on remote control device 220, which in turn causes ITV application 720 to read navigation map 150 and to output, from the cyclically transmitted video frames, a video frame presenting the information that the user wishes to view.
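The interpretation of characteristic codes by user input interface 750 can be sketched as a simple lookup from received signal codes to key-press events. The numeric codes below are invented for illustration; real codes are device-specific and are not disclosed in the specification.

```python
# Hypothetical mapping from remote-control characteristic codes
# to key-press events; actual codes vary by device.
KEY_CODES = {
    0x01: "UP",      # up key 802
    0x02: "DOWN",    # down key 804
    0x03: "LEFT",    # left key 806
    0x04: "RIGHT",   # right key 808
    0x05: "SELECT",  # select key 810
}

def interpret_key_code(code):
    """Translate a characteristic code received from remote
    control device 220 into a key-press event name, or None
    if the code is not recognized."""
    return KEY_CODES.get(code)
```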

FIG. 9 depicts in flowchart form the operation of ITV application 720, in accordance with an exemplary implementation of interactive television system 100. In step 902, ITV application 720 is initialized. Initialization of ITV application 720 includes loading navigation map 150 and generating internal objects. Next, in step 904, ITV application 720 selects a frame corresponding to a first menu scene (e.g., menu scene 300) for display on television monitor 210. As discussed above in connection with FIG. 3, first menu scene 300 includes a listing of a set of categories 310 and a set of sub-categories 320 associated with a first category selected by default.

In steps 906 and 908, ITV application 720 executes an event loop wherein it awaits the occurrence of an event. The event may be a timeout event, in which no user input has been received for a predetermined period of time; a map file change event, in which a new navigation map 150 has been received from headend 160; or a key press event indicative of user input entered via remote control device 220. Upon detection of an event in step 908, ITV application 720 determines whether the event is a timeout event, step 910. If the event is a timeout event, ITV application 720 proceeds to the timeout event handling process depicted in FIG. 10 and described in further detail hereinbelow.

If the event is not a timeout event, ITV application 720 determines whether the event is a new (i.e., changed) map file event, step 912. A new navigation map 150 will be generated by ICE 130 when, for example, the interactive program has been changed in accordance with scheduling information entered by the operator. If the event is determined to be a new map file event, ITV application 720 jumps to the new map file routine depicted in FIG. 11 and described below. If it is determined in steps 910 and 912 that the detected event is neither a timeout event nor a new map file event (i.e., that the event is a key press event), ITV application 720 proceeds to the key press event handling procedure depicted in FIG. 12 and described below.
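The event loop and dispatch of steps 906-912 can be sketched as follows. The queue-based structure, the thirty-second timeout, and the handler names are assumptions made for illustration, not details taken from the specification.

```python
import queue

TIMEOUT_SECONDS = 30  # assumed "predetermined period" of inactivity

def run_event_loop(events, handle_timeout, handle_new_map, handle_key):
    """Sketch of the event loop of steps 906-912: block until an
    event arrives or the timeout elapses, then dispatch to the
    appropriate handler (FIGS. 10, 11, or 12)."""
    while True:
        try:
            event = events.get(timeout=TIMEOUT_SECONDS)
        except queue.Empty:
            event = ("timeout", None)  # no input for the period
        kind, payload = event
        if kind == "timeout":
            handle_timeout()                 # FIG. 10
        elif kind == "new_map":
            handle_new_map(payload)          # FIG. 11
        elif kind == "key":
            handle_key(payload)              # FIG. 12
        elif kind == "quit":                 # not in the patent;
            return                           # allows a clean exit
```

Each handler returns to the loop, matching the "returns to the event loop of steps 906 and 908" language used throughout the flowchart descriptions.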

FIG. 10 is a flowchart depicting steps of a method for handling a timeout event. In step 1002, ITV application 720 determines if it is currently set in browse (also known as slideshow) mode. As discussed above, ITV application 720 may be programmed to default to browse mode in situations where no user input has been received for a predetermined period.

In browse mode, ITV application 720 is operative to cycle through selected ones of received video frames 140 in accordance with sequencing and display duration information set by an operator using program editor 240. If ITV application 720 determines that it is currently in browse mode, it proceeds to step 1006; otherwise, it sets a flag indicating that it is now operating in browse mode, step 1004.

Next, in step 1006, ITV application 720 selects the next slide to show from the slide list. If it is determined in step 1008 that the previous slide shown was the last entry in the slide list, ITV application 720 selects the first slide in the slide list, step 1010. The selected slide is then displayed on television monitor 210, step 1012.

In step 1014, ITV application 720 schedules the next timeout event (i. e., sets the time at which the next timeout event will occur if no user input is received). ITV application 720 then returns to the event loop represented by steps 906 and 908 of FIG. 9.
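The slide-advance logic of steps 1006-1010 amounts to stepping through the slide list with wraparound. The function below is an illustrative sketch; the list representation and return convention are assumptions.

```python
def next_slide(slide_list, current_index):
    """Steps 1006-1010: advance to the next slide in the slide
    list, wrapping back to the first slide after the last entry.

    Returns the new index and the slide to display (step 1012).
    """
    nxt = current_index + 1
    if nxt >= len(slide_list):  # previous slide was the last entry
        nxt = 0                 # step 1010: select the first slide
    return nxt, slide_list[nxt]
```

Sequencing and per-slide display duration would come from the information set by the operator using program editor 240; here only the ordering is modeled.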

FIG. 11 is a flowchart depicting steps of a method for handling a new map file event. As discussed above, new navigation maps 150 are generated when the program is changed, typically in accordance with scheduling information supplied by the system operator. In step 1102, the new navigation map 150 (reflecting the addition, deletion and/or rearrangement of scenes) is loaded into STB 200. ITV application 720 is then re-initialized, step 1104. Re-initialization of ITV application 720 includes generating new internal objects in accordance with the new navigation map 150. Finally, ITV application 720 selects first menu 300 (which may differ from the first menu of the earlier program) for display on television monitor 210, step 1106.

ITV application 720 then proceeds to the event loop of steps 906 and 908 shown in FIG. 9.
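Steps 1102-1106 can be sketched as a single reload routine. The dictionary layout of the navigation map (a scene list plus a "first_menu" key) is a hypothetical representation chosen for illustration; the actual map format is not specified here.

```python
def handle_new_map_file(new_map):
    """Sketch of FIG. 11: load the new navigation map (step 1102),
    re-initialize by generating new internal objects from it
    (step 1104), and select the possibly changed first menu scene
    for display (step 1106).

    `new_map` is a hypothetical dict with a "scenes" list and a
    "first_menu" entry naming the initial menu scene.
    """
    state = {
        "navigation_map": new_map,                            # step 1102
        "objects": {name: {} for name in new_map["scenes"]},  # step 1104
        "current_scene": new_map["first_menu"],               # step 1106
    }
    return state
```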

FIG. 12 is a flowchart depicting steps of a method for handling a key press event. In step 1202, ITV application 720 determines whether the currently displayed frame represents a menu scene (such as menu scene 300) or a detail scene (such as detail scene 400). If the currently displayed scene is a menu scene, ITV application 720 then determines, step 1204, whether the user input represents a select key 810 press or a navigation key press. As discussed above, navigation keys may include directional keys 802-808 shown in FIG. 8.

If the user input represents a select key 810 press, ITV application 720 examines the type of link which is currently highlighted, step 1206. Link types may include menu or category links 310, detail links 320, and ad links (not shown). If the highlighted link is a menu link, then ITV application 720 causes the frame representative of the associated menu scene to be displayed, step 1208. If the highlighted link is a scene or detail link, then ITV application 720 causes the frame representative of the first detail scene associated with the link to be displayed, step 1210. Finally, if the highlighted link is an ad link, ITV application 720 causes the associated animation to be displayed, step 1212. Following display of the menu or detail scene or ad animation per steps 1208, 1210, or 1212, ITV application 720 returns to the event loop represented by steps 906 and 908 of FIG. 9.

If it is determined in step 1204 that the user input constitutes a navigation key press, then ITV application 720 reads (step 1214) navigation map 150 to determine a scene pointed to by the particular navigation key pressed, and causes the pointed-to scene to be displayed, step 1216. If the navigation key pressed does not correspond to any scene, then ITV application 720 causes an error sound to be played, step 1218. ITV application 720 then returns to the event loop of steps 906 and 908.
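The navigation-key handling of steps 1214-1218 reduces to a lookup in the navigation map. The map fragment and scene names below are invented for illustration; the real navigation map 150 is produced by the authoring tools and its format is not reproduced here.

```python
# Hypothetical navigation map fragment: for each scene, the scene
# that each navigation key points to (absent key = no target).
NAVIGATION_MAP = {
    "menu_300": {"DOWN": "menu_301", "RIGHT": "detail_400"},
    "menu_301": {"UP": "menu_300"},
}

def handle_navigation_key(current_scene, key, nav_map):
    """Steps 1214-1218: read the navigation map to find the scene
    pointed to by the pressed key; if no scene corresponds to the
    key, signal that an error sound should be played."""
    target = nav_map.get(current_scene, {}).get(key)
    if target is None:
        return ("error_sound", current_scene)  # step 1218: stay put
    return ("display", target)                 # step 1216
```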

Upon a determination in step 1202 that the currently displayed frame is representative of a detail scene, ITV application 720 next identifies the key press type, step 1220. If the user pressed select key 810, ITV application 720 causes an ad animation associated with the currently displayed detail scene to be displayed, step 1222. If an up key 802 or a down key 804 was pressed, then ITV application 720 causes a menu scene pointed to by the detail scene to be displayed, step 1224. If left key 806 or right key 808 was pressed, then ITV application 720 causes a previous or subsequent detail scene to be displayed, step 1226. ITV application 720 then advances to the event loop of steps 906 and 908.
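The detail-scene branch of FIG. 12 (steps 1220-1226) can be sketched as a dispatch on the key type. The per-scene entry fields ("ad", "menu", "prev", "next") are a hypothetical encoding of the links the specification describes, not an actual map format.

```python
def handle_detail_key(detail_scene, key, nav_map):
    """Steps 1220-1226 for a detail scene: SELECT plays the
    associated ad animation (step 1222), UP/DOWN display the
    menu scene pointed to by the detail scene (step 1224), and
    LEFT/RIGHT display the previous or subsequent detail scene
    (step 1226)."""
    entry = nav_map[detail_scene]
    if key == "SELECT":
        return ("animation", entry["ad"])
    if key in ("UP", "DOWN"):
        return ("display", entry["menu"])
    if key == "LEFT":
        return ("display", entry["prev"])
    if key == "RIGHT":
        return ("display", entry["next"])
    return None  # unrecognized key: no action
```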

It should be understood that the specific procedure for navigating between and among scenes described above in connection with FIGS. 10-12 is provided by way of example only, and should not be construed to limit the invention to any particular hierarchical arrangement of scenes.

It will be further recognized by those skilled in the art that, while the invention has been described above in terms of preferred embodiments, it is not limited thereto.

Various features and aspects of the above described invention may be used individually or jointly. Further, although the invention has been described in the context of its implementation in a particular environment and for particular applications, those skilled in the art will recognize that its usefulness is not limited thereto and that the present invention can be beneficially utilized in any number of environments and implementations. Accordingly, the claims set forth below should be construed in view of the full breadth and spirit of the invention as disclosed herein.