Title:
VIDEO GAMING DEVICE WITH REMOTE RENDERING CAPABILITY
Document Type and Number:
WIPO Patent Application WO/2015/104849
Kind Code:
A1
Abstract:
A gaming device for playing a video game. The gaming device executes game software to output rendering commands representing a virtual world of the video game and has a local rendering module for processing the rendering commands to generate an image data stream of the virtual world. The gaming device has a remote rendering controller for generating second rendering commands for processing by a remote rendering module to generate a rendered graphics output also depicting the virtual world.

Inventors:
PERRIN CYRIL (FR)
Application Number:
PCT/JP2014/050727
Publication Date:
July 16, 2015
Filing Date:
January 09, 2014
Assignee:
SQUARE ENIX HOLDINGS CO LTD (JP)
International Classes:
G06T15/00; A63F13/86
Domestic Patent References:
WO2013128709A12013-09-06
Foreign References:
JP2007079664A2007-03-29
JP2006101329A2006-04-13
US20090080523A12009-03-26
Other References:
See also references of EP 3092621A4
Attorney, Agent or Firm:
OHTSUKA, Yasunori et al. (KIOICHO PARK BLDG. 3-6, KIOICHO, CHIYODA-KU, Tokyo 94, JP)
Claims:
CLAIMS

1. A gaming device for playing a video game, the gaming device comprising:

(a) game software to output first rendering commands representing a virtual world of the video game;

(b) a local rendering module for processing the rendering commands to generate an image data stream of the virtual world;

(c) a remote rendering controller for generating second rendering commands for processing by a remote rendering module to generate a rendered graphics output depicting the virtual world;

(d) an output for:

(i) releasing the image data stream;

(ii) transmitting the second rendering commands to a data communication network for transmission to the remote rendering module.

2. A gaming device as defined in claim 1, wherein the remote rendering controller derives the second rendering commands from the first rendering commands.

3. A gaming device as defined in claim 2, wherein the first rendering commands include rendering instructions, the local rendering module processing geometry data according to the rendering instructions to generate the image data stream.

4. A gaming device as defined in claim 3, wherein the geometry data include vertex data.

5. A gaming device as defined in claim 2, wherein the second rendering commands include rendering instructions, the remote rendering module processing geometry data according to the rendering instructions to generate the rendered graphics output.

6. A gaming device as defined in claim 2, wherein the remote rendering controller encodes the second rendering commands for transmission to the remote rendering module.

7. A gaming device as defined in claim 1, wherein the game software is a locally executed game software, the output transmits game metrics derived from the locally executed game software to the data communication network for transmission to a remote game hosting server aggregating game metrics from multiple remote gaming devices.

8. A gaming device as defined in claim 1, wherein the second rendering commands include data packetized for transmission to the remote rendering module.

9. A gaming device as defined in claim 1, wherein the remote rendering controller generates identification information to identify one or more spectators to which an output of the remote rendering module depicting the virtual world is to be sent via the data network.

10. A gaming device as defined in claim 1, configured to generate a Graphical User Interface (GUI) on a display displaying the image data stream, the GUI providing a player playing the video game at the gaming device with a plurality of spectator selection options.

11. A gaming device as defined in claim 10, wherein the gaming device is responsive to selection of an option by the player via the GUI to transmit to the remote rendering module identification information associated with the option selected by the player.

12. A server arrangement for performing remote rendering of video game graphics, the server arrangement including:

(a) an input for receiving:

(i) rendering commands transmitted to the input over a data network from a remote location, the rendering commands representing a virtual world of a video game, the video game being played by at least one player controlling a virtual character in the virtual world;

(ii) identification information to identify one or more spectators;

(b) a rendering module for processing the rendering commands to generate an image data stream depicting the virtual world;

(c) an output for transmitting the image data stream to the one or more spectators over the data network.

13. A server arrangement as defined in claim 12, wherein the rendering commands include rendering instructions, the rendering module processing geometry data according to the rendering instructions to generate the image data stream.

14. A server arrangement as defined in claim 13, including a machine-readable storage for holding a list of identifiers of spectators that are to receive the image data stream.

15. A server arrangement as defined in claim 14, wherein in response to reception of the identification information at the input, the server arrangement loading the identification information in the list.

16. A method for allowing a spectator to spectate a video game, the method including:

(a) executing by a CPU game software to generate rendering commands depicting a virtual world of a video game;

(b) processing with a rendering module the rendering commands to generate an image data stream;

(c) outputting the image data stream to a display device;

(d) wherein the rendering commands are first rendering commands, including deriving second rendering commands from the first rendering commands;

(e) transmitting the second rendering commands via a data network to a remote server for rendering and delivery of a rendered output to at least one spectator.

17. A method for performing distributed rendering of a video game, the method including:

(a) processing at a first network node in a data network rendering commands depicting a virtual world of a video game with a first rendering module to generate an image data stream for display to a player that controls a virtual character in the video game;

(b) processing at a second node in the data network that is remote from the first node second rendering commands derived from the first rendering commands with a second rendering module to produce a rendered output depicting the virtual world and the virtual character of the player;

(c) transmitting the rendered output to at least one spectator for viewing.

18. A machine readable storage medium encoded with non-transitory program code for execution by a CPU for generating rendering commands to perform remote rendering of video game scenes of a video game, the program code including:

(a) remote rendering program instructions for processing first rendering commands output by game software for driving a local rendering module, to derive second rendering commands for processing by a remote rendering module to generate an image data stream of the video game scenes;

(b) program instructions for transmitting the second rendering commands to a remote rendering module over a data network.

19. A machine readable storage medium as defined in claim 18, wherein the program code includes program instructions to generate a Graphical User Interface (GUI) on a display displaying an image data stream produced by the local rendering module on the basis of the first rendering commands, the GUI providing a player playing the video game with a plurality of spectator selection options.

20. A machine readable storage medium as defined in claim 19, wherein the program code includes program instructions responsive to selection of an option by the player via the GUI to transmit to the remote rendering module identification information associated with the option selected by the player.

21. A machine readable storage medium as defined in claim 20, wherein the identification information identifies at least one spectator to which the image data stream produced by the remote rendering module is to be transmitted.

Description:
DESCRIPTION

VIDEO GAMING DEVICE WITH REMOTE RENDERING CAPABILITY

FIELD OF THE INVENTION

The invention relates to a video gaming device that interfaces with a remote rendering server to perform image rendering. The invention also relates to a server arrangement that performs image rendering functions under control of the remote video gaming device. The invention also relates to various methods and devices for performing remote rendering operations in the context of video gaming.

BACKGROUND OF THE INVENTION

Many video games today allow spectating, which is the ability of a person to view the game action without being directly involved in it.

Spectating an online video game that is being run over a private network requires the spectator to invest in gaming equipment that can interface with the other gaming devices connected to the private network. This means that a spectator can watch the game only through the intermediary of a gaming device running the same game software that would be required to directly participate in the game. This is an obvious disadvantage in that it limits spectator access to online games.

Therefore, there is a need to develop improved gaming devices and associated systems and techniques for facilitating spectator access to video game action.

SUMMARY OF THE INVENTION

As embodied and broadly described herein, the invention provides a gaming device for playing a video game. The gaming device executes game software to produce first rendering commands representing a virtual world of the video game and has a local rendering module for processing the first rendering commands to generate an image data stream of the virtual world. The gaming device also has a remote rendering controller for generating second rendering commands for processing by a remote rendering module to generate a rendered graphics output also depicting the virtual world. An output releases the image data stream and transmits the second rendering commands to a data communication network for transmission to the remote rendering module.

As embodied and broadly described herein, the invention also provides a server arrangement for performing remote rendering of video game graphics. The server arrangement receives rendering commands transmitted to the server input over a data network from a remote location, the rendering commands representing a virtual world of a video game, the video game being played by players controlling respective virtual characters in the virtual world. The server arrangement also receives identification information to identify one or more spectators. The server arrangement has a rendering module for processing the rendering commands to generate an image data stream depicting the virtual world and an output for transmitting the image data stream to the one or more spectators over a data network.

As embodied and broadly described herein, the invention further provides a method for allowing a spectator to spectate an online video game played over a network. The method includes executing, by a Central Processing Unit (CPU), game software to generate first rendering commands representing a virtual world of a video game, processing the first rendering commands with a rendering module to generate an image data stream, and outputting the image data stream to a display device. The method also includes processing with the CPU the first rendering commands to derive second rendering commands, and transmitting the second rendering commands to a remote server for rendering and delivery of a rendered output to at least one spectator.

As embodied and broadly described herein, the invention also provides a method for performing distributed rendering of a video game. The method includes processing, with a first rendering module at a first network node in a data network, rendering commands representing a virtual world of a video game to generate an image data stream for display to a player that controls a virtual character in the video game. The method also includes processing, at a second node in the data network that is remote from the first node, second rendering commands derived from the first rendering commands with a second rendering module to produce a rendered output depicting the virtual world and the virtual character of the player, and transmitting the rendered output to at least one spectator for viewing.

As embodied and broadly described herein, the invention further includes a machine readable storage medium encoded with non-transitory program code for execution by a CPU for generating rendering commands to perform remote rendering of video game scenes, the program code including remote rendering program instructions for processing first rendering commands output by game software for driving a local rendering module, to derive second rendering commands for processing by a remote rendering module to generate an image data stream of the video game scenes. The program code also includes program instructions for transmitting the second rendering commands to a remote rendering module over a data network.

BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 is a block diagram illustrating a data network and the various network nodes collectively implementing an online video game infrastructure, according to a non-limiting example of implementation of the invention;

Figure 2 is a more detailed block diagram of the gaming device shown in Figure 1, illustrating the various functional modules of the gaming device and their relationship to some external components;

Figure 3 is a block diagram of a computing platform on which the gaming device shown in Figures 1 and 2 is implemented;

Figure 4 is a block diagram illustrating in greater detail the structure of the remote rendering server shown in Figures 1 and 2;

Figure 5 is a flowchart illustrating a process that is performed during execution of the game software by the gaming device shown in Figures 1 and 2;

Figure 6 is a flowchart of a process performed by the remote rendering controller of the gaming device shown in Figures 1 and 2;

Figure 7 is a flowchart of a process performed by the rendering module of the remote rendering server;

Figure 8 illustrates the local display that is associated with the gaming device and shows a graphical user interface allowing the game player to identify a spectator to which the game action can be streamed; and

Figure 9 is a flowchart illustrating the process for selecting the spectator to receive the game action stream by using the graphical user interface shown in Figure 8.

DESCRIPTION OF AN EXAMPLE IMPLEMENTATION OF THE INVENTION

In a specific and non-limiting example of implementation, the invention provides a gaming device that outputs rendering commands that are rendered remotely. The rendered output is available to spectators that wish to observe the game action.

Figure 1 illustrates the data communication infrastructure that allows the video game to be played and that also provides spectators with access to the video game action. The data communication infrastructure includes a data network 10, such as the Internet, that can be used to exchange data between nodes of that network. A gaming device 12, which constitutes an individual node of the data network 10, allows a local player to play a video game. The video game is an online video game; in other words, other players at remote gaming devices also participate in the game. Such multiplayer gaming action is generally considered to provide a superior game experience to single-player action. Alternatively, the video game is a single-player game.

The gaming device 12 outputs image data to a local display 14, such as a television set. The local display 14 shows to the local player images of the virtual world in which the virtual game character of the player evolves. While not shown in the drawings, it is to be understood that the local player interacts with the video game through controls such as game pads or any other suitable device allowing the requisite degree of player input. Such game pads allow the player to control the movement of the virtual game character, change the settings of the gaming device and optionally specify the address or identity of a spectator that is to receive the gaming action stream.

When the online video game is being played, the gaming device sends game metrics to a game-hosting server 16. The game-hosting server 16 receives the game metrics from all the gaming devices involved in the game, aggregates them and broadcasts them back to the individual gaming devices 12. In this fashion, the gaming device 12 can generate the interactive, multiplayer virtual environment.
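By way of a non-limiting illustration, the aggregate-and-rebroadcast role of the game-hosting server 16 could be sketched as follows; the class and field names are assumptions made purely for the sketch, since the specification does not prescribe any particular data format or transport.

```python
# Minimal sketch of the aggregate-and-broadcast role of a game-hosting server.
# Names and data shapes are illustrative assumptions; no wire format is implied.

from typing import Dict, List


class GameHostingServer:
    """Collects per-device game metrics and rebroadcasts the aggregate."""

    def __init__(self) -> None:
        # Latest metrics reported by each gaming device, keyed by device id.
        self.latest_metrics: Dict[str, dict] = {}

    def receive_metrics(self, device_id: str, metrics: dict) -> None:
        """Called when a gaming device reports the state of its virtual character."""
        self.latest_metrics[device_id] = metrics

    def broadcast(self) -> List[dict]:
        """Builds the aggregated view sent back to every participating device."""
        return [{"device": d, **m} for d, m in self.latest_metrics.items()]


if __name__ == "__main__":
    server = GameHostingServer()
    server.receive_metrics("device-1", {"position": (10.0, 0.0, 3.5), "health": 87})
    server.receive_metrics("device-2", {"position": (-4.0, 0.0, 9.1), "health": 100})
    print(server.broadcast())  # each gaming device would receive this aggregate
```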

As the game is being played, the gaming device 12 generates rendering commands that are transmitted to a remote rendering server 18. The rendering commands constitute a representation of the virtual world of the game and they are used as input to a rendering module at the remote rendering server 18. The rendering module processes the rendering commands and generates an image data stream that is transmitted through the data network 10 to spectators 20. The above-described architecture is beneficial in that it does not require the individual spectators to be involved in any way in the exchange of game metrics required when a participant is actively involved in the game and controls a virtual game character. Spectators only need to connect to the remote rendering server 18 to access the image data stream conveying the video game action.

The gaming device 12 is based on a computer platform that is generically illustrated in Figure 3. The computer platform has a Central Processing Unit (CPU) 22 (in practice, multiple CPUs can be used to increase the processing power), a machine-readable storage 24 which is more commonly referred to as "memory", a Graphics Processing Unit (GPU) 26 and an input/output interface 28.

These components are interconnected by a data bus 30 over which data and other signals are exchanged.

The memory 24 globally refers to both volatile and non-volatile storage used for storing program instructions and data on which the program instructions operate. In practice, the memory 24 could be of distributed nature and could include multiple components that may be located at a single node or can be at several different nodes of the network.

The GPU 26 is a hardware accelerator that increases image rendering performance. As is well known to those skilled in the art, the GPU is a specialized processor that can perform image processing functions more efficiently and rapidly than a general-purpose CPU.

The input/output interface 28 refers globally to the various hardware and software components allowing exchange of data with external devices. For example, signals that convey player commands generated by the game pads are directed to the CPU through the input/output interface 28. Similarly, signals and data generated by the internal processing are output and directed to external devices through the input/output interface 28.

Figure 2 illustrates in greater detail the internal structure of the gaming device 12. Note that the internal representation provided is a blend of hardware and software intended to illustrate functional modules rather than specific hardware components.

The gaming device 12 has game software 32 that implements the game logic when the game software 32 is executed by the CPU 22. The game logic runs according to player inputs, in particular those of the local player at the game pads, and also inputs from players at remote gaming devices. It is beyond the scope of this specification to discuss the particulars of the game software 32 as it is highly dependent on the game scenario, game type, etc. Suffice it to say that the execution of the game software 32 outputs data that is supplied to a rendering module to generate the images of the virtual world that are displayed to the local player on the display 14. In addition, the execution of the game software 32 generates game metrics, which describe the actions of the virtual character controlled by the local player. Examples of game metrics include the location of the virtual character in the game map, its direction and speed of movement, actions by the player such as firing a weapon, using a particular tool to achieve a certain task, body postures, and specific character settings such as weapons inventory and degree of health remaining, among others.
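As a non-limiting illustration only, the kind of metrics enumerated above could be carried in a record such as the following; the field names are assumptions, since the actual set of metrics depends entirely on the game.

```python
# Illustrative record for the game metrics enumerated above. The field names
# are assumptions; the actual metrics depend entirely on the game.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class GameMetrics:
    position: Tuple[float, float, float]                # location in the game map
    heading_degrees: float                              # direction of movement
    speed: float                                        # speed of movement
    actions: List[str] = field(default_factory=list)    # e.g. "fire_weapon", "use_tool"
    posture: str = "standing"                           # body posture
    weapons_inventory: List[str] = field(default_factory=list)
    health: float = 100.0                               # degree of health remaining


# Example of the metrics one gaming device might report for a single frame.
metrics = GameMetrics(position=(12.0, 0.0, -3.0), heading_degrees=90.0, speed=2.5,
                      actions=["fire_weapon"], weapons_inventory=["rifle"])
```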

The game metrics are output by the gaming device 12 and sent to the game hosting server 16, which aggregates them and broadcasts them back to the individual gaming devices 12. In this fashion, each gaming device has access to the game metrics of other players in order to create a complete representation of the virtual world and its action, including not only the game character controlled by the local player but also the game characters controlled by the remote players.

The gaming device 12 further includes a local rendering module 34, which receives from the game software 32 the data describing the virtual world of the game and the action. This data, also referred to as "rendering commands", includes instructions telling the local rendering module 34 how to process geometry information, which provides a mathematical representation of the objects making up the virtual world scenes and game characters, in order to generate a visible image that can be displayed to the local player. In a specific form of implementation, the output of the local rendering module 34 is an image data stream that can be fed to a television set or to a video monitor for display. The image data stream can be a succession of still image frames. Alternatively, the image data stream can be organized to provide information on movements, such as, for example, a video stream encoded according to the H.264 standard for video compression.

A remote rendering controller 36 is provided to generate the rendering commands for driving the remote rendering server 18. The remote rendering controller 36, which can be implemented in software, intercepts the rendering commands output as a result of execution of the game software 32 and converts them into a format suitable for transmission and processing by the remote rendering server 18. Note that the processing of the intercepted rendering commands may be minimal or extensive.

For example, the remote rendering controller 36 may simply packetize the intercepted rendering commands for transmission to the remote rendering server 18 through the data network 10. In this case, the rendering commands that are delivered at the remote rendering server 18 (after re-assembly) are essentially identical to the rendering commands that are input in the local rendering module 34.
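A non-limiting sketch of such packetizing is given below, together with the reassembly that the input of the remote rendering server would perform; the eight-byte header (batch identifier, fragment index, fragment count) and the payload size are assumptions made purely for illustration.

```python
# Sketch of packetizing a serialized batch of rendering commands and of the
# matching reassembly. The header layout (batch id, fragment index, fragment
# count) and the payload size are illustrative assumptions.

import struct
from typing import Iterable, List

MAX_PAYLOAD = 1400                  # stay under a typical Ethernet MTU (assumption)
HEADER = struct.Struct("!IHH")      # batch id, fragment index, fragment count


def packetize(batch_id: int, blob: bytes) -> List[bytes]:
    chunks = [blob[i:i + MAX_PAYLOAD] for i in range(0, len(blob), MAX_PAYLOAD)] or [b""]
    return [HEADER.pack(batch_id, i, len(chunks)) + c for i, c in enumerate(chunks)]


def reassemble(packets: Iterable[bytes]) -> bytes:
    """Rebuilds the original batch; performed at the remote rendering server's input."""
    fragments = {}
    count = 0
    for packet in packets:
        _, index, count = HEADER.unpack(packet[:HEADER.size])
        fragments[index] = packet[HEADER.size:]
    return b"".join(fragments[i] for i in range(count))


batch = b"draw_mesh castle; draw_mesh knight; " * 200
assert reassemble(packetize(7, batch)) == batch
```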

In another form of implementation, the remote rendering controller 36 will, in addition to packetizing the data, encode it for more efficient transmission. Different types of encoding can be contemplated according to the intended application. In such instance, the remote rendering controller 36 is provided with a suitable encoder that can be hardware- or software-implemented. As a corollary to the encoder, a complementary decoding function will also be required at the remote rendering server 18.

In a specific and non-limiting example of implementation, the remote rendering controller 36 collects all the rendering commands produced by the game software 32 in one frame (typically 33ms) and sends those commands to the remote rendering server 18 for execution. In this fashion, the rendering commands are being effectively sent to the remote rendering server 18 in batches, where each batch contains image data for a single still image.
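The per-frame batching described above could be realized along the lines of the following non-limiting sketch, in which the roughly 33 ms frame period, the use of zlib as the optional encoder and the batch-sending callback are all assumptions.

```python
# Sketch of collecting one frame's worth of rendering commands (roughly 33 ms
# at 30 frames per second), optionally encoding the batch with zlib, and
# handing it to a packetizing/transmission function. All names are assumptions.

import json
import zlib
from typing import Callable, List


class RemoteRenderingControllerSketch:
    def __init__(self, send_batch: Callable[[int, bytes], None], compress: bool = True) -> None:
        self.send_batch = send_batch        # e.g. packetize and transmit over the network
        self.compress = compress            # whether to apply the optional encoding step
        self.frame_commands: List[dict] = []
        self.frame_id = 0

    def intercept(self, command: dict) -> None:
        """Records one rendering command emitted during the current frame."""
        self.frame_commands.append(command)

    def end_of_frame(self) -> None:
        """Flushes the batch once per frame; the remote side decodes it before rendering."""
        blob = json.dumps(self.frame_commands).encode()
        if self.compress:
            blob = zlib.compress(blob)      # remote server applies zlib.decompress
        self.send_batch(self.frame_id, blob)
        self.frame_commands.clear()
        self.frame_id += 1
```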

In addition to the generation of the rendering commands for processing by the remote rendering server 18, the remote rendering controller 36 can also output spectator identifiers, to identify one or more spectators that are to receive the game action stream, as it will be discussed in greater detail later.

The gaming device 12 also has an output 38 that releases the various internally generated data streams to external devices. For instance, the output of the local rendering module 34, which is the image data stream, is released from the output 38 to the local display 14. In a specific form of implementation, the image stream can be encoded into the well-known HDMI format that includes both video and audio subcomponents. Evidently, other formats can also be used without departing from the spirit of this invention.

The rendering commands generated by the remote rendering controller 36 are also released through the output 38. In this instance, the output connects to the Internet such that the rendering commands can be transmitted to the remote rendering server 18. In this example, the output 38 has the necessary data transmission functions such that it can interface with the data network 10. As will be apparent to the reader skilled in the art, the output 38 is not necessarily a dedicated software or hardware function; rather, it represents the overall functionality allowing the various signals that are generated internally to be released to external devices. Accordingly, the structure of the output 38 may be of a distributed nature and components thereof may be integrated into other functions or modules of the gaming device 12. For example, the output function as it relates to the image stream released from the local rendering module 34 may be integrated into the local rendering module 34, which could, in addition to performing the image rendering operation, encode the rendered output in the HDMI format.

Similarly, the rendering commands transmission function can be integrated into the remote rendering controller 36 that could in addition to performing the data processing on the rendering commands, encode them and then packetize the encoded output in a format suitable for transmission to the data network 10.

Although not shown in the drawings, the gaming device 12 would also include an input function through which external signals are channeled for internal processing. An example of such external signals would be the control signals generated by the game pads on which the player inputs commands to control the movements and actions of the virtual character.

Figure 4 is a more detailed block diagram of the remote rendering server 18 shown in Figures 1 and 2. The remote rendering server 18 includes a rendering module 42 for performing image rendering on the rendering commands received from the gaming device 12. The remote rendering module 42 is based on a GPU (not shown) to more efficiently perform the image rendering operation. In a specific example, the remote rendering module 42 can be the same as the local rendering module 34 in the gaming device 12. However, this is not necessary in all cases as implementations are possible where a different rendering module may be used. For instance, the rendering module 42 may be designed such as to output the image stream in a different format that is better suited for video streaming over the Internet.

MPEG-4 is an example of such a format. Furthermore, the rendering module 42 can be specifically designed according to the resolution, or simply the type, of the displays on which the spectators will be watching the game action, which can be significantly different from the display connected to or directly integrated into the gaming device 12. For example, the rendering module 42 can be optimized to generate an image data stream adapted for mobile devices, instead of the larger displays of the type that the gaming device 12 would normally connect to.

The rendered image data stream generated by the rendering module 42 is transmitted to various spectators through an output 44. The output 44, which broadly designates an output function of the remote rendering server 18, performs a number of tasks, such as managing the list of spectators that are to receive the image data stream, among others. The spectator management function would require maintaining a list of the IP addresses of the spectators that are to receive the image data stream. Accordingly, the image data stream received from the rendering module 42 is broadcast to the IP addresses in the list of spectators.
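As a non-limiting sketch, the spectator-management side of the output 44 could be organized as below; the use of UDP as the transport and the port number are assumptions, and a real stream would fragment frames as in the packetizing sketch above.

```python
# Sketch of the spectator-management function of the output: keep a list of
# spectator addresses and send each encoded video frame to all of them.
# UDP and the default port are assumptions made for brevity.

import socket
from typing import List, Tuple


class SpectatorBroadcaster:
    def __init__(self) -> None:
        self.spectators: List[Tuple[str, int]] = []     # (IP address, port) per spectator
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def add_spectator(self, ip_address: str, port: int = 9000) -> None:
        """Adds an entry to the list of spectators that are to receive the stream."""
        if (ip_address, port) not in self.spectators:
            self.spectators.append((ip_address, port))

    def broadcast_frame(self, encoded_frame: bytes) -> None:
        """Sends one encoded video frame to every spectator in the list."""
        # A real implementation would fragment large frames before sending.
        for address in self.spectators:
            self.sock.sendto(encoded_frame, address)
```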

The remote rendering server 18 also has an input 46 connecting to the Internet that receives the rendering commands generated by the gaming device 12. Accordingly, the rendering commands transmitted from the gaming device 12 are received at the input 46, pre-processed and supplied to the rendering module 42. In its simplest form, the pre-processing function would include re-assembling the data extracted from the packets to produce a continuous and coherent data stream that the rendering module 42 can process. If the gaming device 12 encodes the rendering commands, the input 46 would then apply a suitable decoding function before supplying the rendering commands to the rendering module 42.

Under the optional scenario where the gaming device 12, in addition to generating the rendering commands, also generates one or more identifiers of the spectators that are to receive the rendered output, those identifiers would also be received by the remote rendering server 18 through the input 46. The identifiers of the spectators can be the IP addresses of the terminals at which the spectators would be watching the game action. Those IP addresses can be passed to the IP address list maintained by the output 44. This communication function is shown by the arrow 48.

Figure 5 is a flowchart illustrating the various steps of the process during execution of the game software 32. The process starts substantively at step 50, where the game software 32 receives the player inputs. As discussed earlier, the player inputs are generated by the player operating the game pads, in particular buttons, joysticks and other control devices that are used to indicate desired motions or actions of the virtual game character. Signals from the game pads are input in the gaming device 12 and directed to the game software 32. While not shown explicitly in the flowchart, additional player inputs will also exist when the gaming device is used for playing multiplayer games. In such instance, in addition to the player inputs that are generated locally, the game software 32 will also be receiving game metrics, which inherently convey player inputs, from participating gaming devices.

At step 52, the game software 32 is executed to implement the game logic. The game logic is highly dependent on the particular type of game and scenario and it will not be described in detail. Suffice it to say that the game logic is the set of rules that determine an outcome based on player inputs.

At step 54, the game software 32 outputs rendering commands and game metrics. The rendering commands describe the virtual world of the game which is to be displayed to the local player. The rendering commands are the information that is supplied to the local rendering module 34 and processed to generate the individual pixel values, expressed in terms of color and intensity, that compose the image the local player is to see. Generally, the rendering commands include instructions that tell the local rendering module 34 how to handle or manipulate geometry data. The geometry data constitutes a mathematical description of the various components of the virtual world scenes. An example of geometry data is vertex coordinates that define the location of the vertices making up the image mesh.
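The distinction between rendering instructions and geometry data can be made concrete with the following non-limiting sketch; the field names and the simple triangle example are assumptions made for illustration.

```python
# Illustrative split between a rendering instruction and the geometry data it
# operates on; the vertex positions form the mesh described above. The field
# names are assumptions, not an actual rendering API.

from dataclasses import dataclass
from typing import List, Optional, Tuple

Vertex = Tuple[float, float, float]             # x, y, z position of one mesh vertex


@dataclass
class Geometry:
    vertices: List[Vertex]                      # mathematical description of the object
    indices: List[int]                          # how the vertices form triangles


@dataclass
class RenderingCommand:
    instruction: str                            # e.g. "draw_indexed", "set_transform"
    parameters: dict                            # transforms, visual-effect settings
    geometry: Optional[Geometry] = None         # commands may also carry geometry data


# A single triangle of an object, translated within the virtual world.
command = RenderingCommand(
    instruction="draw_indexed",
    parameters={"transform": "translate(1, 0, 2)"},
    geometry=Geometry(vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0)], indices=[0, 1, 2]),
)
```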

A simple example of a rendering command could be instructions to the GPU 26 to create a certain visual effect or a transformation to move a certain object in the image.

Note that the rendering commands may, in addition to the instructions, convey the geometry data as well.

The rendering commands are passed to the rendering module 34 via its Application Programming Interface (API) 80 (shown in Figure 2). DirectX and OpenGL are examples of APIs that can be used by the game software 32 to interact with the GPU 26 of the rendering module 34, via a device driver 82.
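One non-limiting way to picture the interception of such API calls is a thin proxy that hands a copy of every call to the remote rendering controller before forwarding it to the local rendering module; the sketch below uses placeholder method names and is not DirectX or OpenGL code.

```python
# Sketch of intercepting rendering API calls: a proxy forwards every call to
# the real local rendering API while giving the remote rendering controller a
# copy. Method names are placeholders, not actual DirectX or OpenGL calls.


class RenderingApiProxy:
    def __init__(self, real_api, remote_controller) -> None:
        self.real_api = real_api                    # the local rendering module's API
        self.remote_controller = remote_controller  # receives a copy of every call

    def __getattr__(self, name):
        real_method = getattr(self.real_api, name)

        def wrapper(*args, **kwargs):
            # Record the call for remote rendering, then execute it locally.
            self.remote_controller.intercept({"call": name, "args": args, "kwargs": kwargs})
            return real_method(*args, **kwargs)

        return wrapper
```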

As to the game metrics, as discussed earlier, they represent, in the context of the particular game being played, the actions of the virtual character associated with the game player, such as motion of the character, actions being performed, etc.

The rendering commands and the game metrics are two separate streams of information, which are directed to different components of the gaming device 12. The rendering commands are directed to the local rendering module 34 while the game metrics are output and are transmitted through the Internet to the game hosting server 16.

Figure 6 is a flowchart of the process performed by the remote rendering controller 36. At step 56, the remote rendering controller 36 intercepts and processes the rendering commands generated by the game software 32. The processing function can vary depending on the intended application and it can be as simple as dispatching the rendering commands to the remote rendering server 18. In most applications, however, a more substantive processing will be performed in order to more efficiently transmit the rendering commands. An example of such processing is to encode the rendering commands to reduce the amount of data that needs to be transmitted. Any suitable encoding process can be used without departing from the spirit of the invention. Such encoding is mostly effective with the rendering commands.

Another type of processing that can also be performed by the remote rendering controller 36 is to packetize the rendering commands such that they can be transmitted to the remote rendering server 18 over the Internet.

At step 58, the remote rendering controller 36 outputs the processed rendering commands to the output 38 such that the rendering commands stream can be transmitted to the remote rendering server 18.

Step 60 is an optional step. The remote rendering controller 36 outputs the identifier of the recipient of the remotely rendered image stream, which typically would be a spectator. The identifier can be any type of identification data which can uniquely distinguish the spectator from other spectators. An example of such an identifier is the IP address of the terminal at which the spectator is located.

Figure 7 is a flowchart of a process implemented by the rendering module 42 of the remote rendering server 18. The process starts at step 62 where the rendering commands from the gaming device 12 are input into the rendering module 42. This step assumes that initially the necessary processing has been performed on the data transmitted over the Internet in order to make it suitable for processing by the rendering module 42. Such processing may include decoding the rendering commands in the event they have been encoded prior to transmission.

At step 64, the image based on the rendering commands is rendered. The rendered output can be in any suitable format, the MPEG-4 format being one example. Step 66 is an optional step that describes the scenario where the gaming device 12 also sends the identifier of one or more spectators to which the game action stream is to be delivered. As indicated previously, the identifier can be the IP address of the terminal of the spectator. At step 68, the image data stream is sent to the spectators. The transmission would typically involve placing the individual video frames in suitable packets prior to transmission.
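The server-side steps of Figure 7 can be condensed into one loop pass, as in the non-limiting sketch below; `render_frame` and `encode_video_frame` stand in for the server's actual rendering module and video encoder, and the zlib/JSON decoding mirrors the assumptions of the earlier client-side sketch.

```python
# Sketch of one pass of the remote rendering server's loop (Figure 7): decode
# a reassembled batch of rendering commands, render it, and send the encoded
# video frame to the registered spectators. `render_frame` and
# `encode_video_frame` are placeholders for the server's real components.

import json
import zlib


def serve_one_frame(reassembled_blob: bytes, broadcaster, render_frame, encode_video_frame) -> None:
    commands = json.loads(zlib.decompress(reassembled_blob))   # undo the optional encoding
    image = render_frame(commands)                             # GPU-backed rendering module 42
    encoded = encode_video_frame(image)                        # e.g. one MPEG-4 frame
    broadcaster.broadcast_frame(encoded)                       # output 44, to all spectators
```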

Figure 8 illustrates a graphical user interface that can be used by the local player at the gaming device 12 to select one or more spectators that are to receive the game action stream. This graphical user interface is generated and controlled by the game software 32 or the operating system running the console 12 and includes a control mechanism that presents the game player with a series of options among which the game player can select one or more spectators for receiving the game action stream. In a specific example of implementation, the game software 32 generates on the local display 14 the graphical user interface that includes a list 70 of spectators for selection by the local player. The list of spectators can be a static list in the sense that it would be pre-loaded by the player in the gaming device 12. This static list would include a series of entries where each entry is associated with an individual spectator. The spectators could be friends, relatives or acquaintances of the game player who may be interested in receiving the game action feed when the player is playing the game. Each entry in the list includes a name for the respective spectator, which can be the real name of the person or a pseudonym, and also includes the identifier, such as the IP address of the terminal that is associated with the spectator.

Alternatively, other types of identifiers can be considered that can eventually be resolved into an IP address. For example, an email address can be specified.

Another possibility is to create a dynamic list that does not require the local player to manually input all the list entries. Such a dynamic list requires some sort of communication mechanism between the gaming console 12 and the spectator's terminal 20. For example, the gaming device 12 can send a request to a spectator terminal 20 to accept an invitation for a game action stream and, if the spectator accepts the invitation, a response is sent to the gaming device 12 that is used to create an entry in the list containing all the relevant information necessary to properly identify the spectator. The communication between the gaming device 12 and the spectator terminal 20 can be made by using email, text messaging, etc.
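The invitation handshake that builds such a dynamic list could be as simple as the following non-limiting sketch; the entry fields and the accept/decline mechanism are assumptions, since the specification leaves the communication medium (email, text messaging, etc.) open.

```python
# Sketch of the dynamic-list handshake: the gaming device invites a spectator
# and records an entry only if the invitation is accepted. The entry fields
# and the accept/decline flow are assumptions made for illustration.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class SpectatorEntry:
    display_name: str        # real name or pseudonym shown in the GUI list
    ip_address: str          # identifier later passed to the remote rendering server


def invite_spectator(name: str, ip_address: str,
                     send_invitation: Callable[[str], bool],
                     spectator_list: List[SpectatorEntry]) -> bool:
    """send_invitation delivers the request (email, text message, ...) and
    returns True if the spectator accepted it."""
    if send_invitation(ip_address):
        spectator_list.append(SpectatorEntry(name, ip_address))
        return True
    return False
```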

To perform the spectator selection, the local player uses a pointing device 72 to designate one or more spectators in the list. The output of the selection is sent to the game software 32 and the associated spectator identifiers are sent to the remote rendering server 18, as previously described.

Figure 9 is a flowchart of the process for selecting one or more spectators that are to receive the video game action stream. At step 74, the selection of the spectators is received, the selection being performed as described previously by using the graphical user interface. At step 76, the identifiers of the spectators so selected are sent to the remote rendering server 18.

In a possible variant, the remote rendering controller 36 is responsible for generating the GUI for performing the spectator selection. The remote rendering controller "injects" into the flow of data directed to the local rendering module 34 the data necessary to generate the GUI on the display 14. Player interactions with the GUI on the display 14, such as selections of the spectators, are intercepted and directed to the remote rendering controller 36 for processing. In this fashion, neither the game software 32 nor the operating system of the gaming device 12 is involved in the spectator designation or selection process. The advantage of this form of implementation is that the game software 32 and the operating system of the gaming device 12 do not require modification to implement the spectator selection functionality. The remote rendering controller 36, which can be an add-on software module, performs all the necessary functions to provide spectators with access to the game action. Thus, a gaming device that originally was not designed to provide spectator access to the gaming action can be upgraded with this feature by installing the software module implementing the remote rendering controller 36.
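This "injection" variant can be pictured, in a purely illustrative and non-limiting way, as the controller appending its own overlay commands to each frame's command stream and consuming the player inputs that target the overlay; every name in the sketch below is a placeholder.

```python
# Hypothetical sketch of the "injection" variant: the remote rendering
# controller appends its own GUI overlay commands to the frame's command
# stream and consumes player inputs aimed at the overlay, so neither the game
# software nor the operating system needs to be modified.

from typing import List


class SpectatorSelectionOverlay:
    def __init__(self, spectator_names: List[str]) -> None:
        self.spectator_names = spectator_names
        self.selected: List[str] = []           # identifiers later sent to the server

    def inject_gui_commands(self, frame_commands: List[dict]) -> List[dict]:
        """Adds overlay draw commands on top of the game's own rendering commands."""
        overlay = [{"call": "draw_text", "args": (name,), "kwargs": {"line": i}}
                   for i, name in enumerate(self.spectator_names)]
        return frame_commands + overlay

    def handle_input(self, event: dict) -> bool:
        """Consumes clicks on the overlay; returns False to pass the event to the game."""
        if event.get("target") == "spectator_list":
            self.selected.append(self.spectator_names[event["index"]])
            return True
        return False
```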

The description of the invention above was done in the context of a multi-player gaming device. Note that the invention applies equally well to a single-player gaming device in which the game software is executed and generates rendering commands that are directed to the local rendering module in order to generate an image data stream for the game player to see. The single-player gaming device would also include a remote rendering controller, identical to the remote rendering controller 36, which intercepts the rendering commands and transmits them to the remote rendering server 18. The only distinction from the multi-player gaming device is that no game metrics need to be output since the game logic is constrained to the local gaming device.

Also note that the remote rendering server 18 can work equally well with rendering instructions generated from a multi-player gaming device as with rendering instructions generated from a single-player gaming device.

Persons skilled in the art should appreciate that the above-discussed embodiments are to be considered illustrative and not restrictive. Also it should be appreciated that additional elements that may be needed for operation of certain embodiments of the present invention have not been described or illustrated as they are assumed to be within the purview of the person of ordinary skill in the art. Moreover, certain embodiments of the present invention may be free of, may lack and/or may function without any element that is not specifically disclosed herein.

Those skilled in the art will also appreciate that additional adaptations and modifications of the described embodiments can be made. The scope of the invention, therefore, is not to be limited by the above description of specific embodiments but rather is defined by the claims attached hereto.