Title:
DIRECTING COMPUTER GAME STREAMING
Document Type and Number:
WIPO Patent Application WO/2023/150471
Kind Code:
A1
Abstract:
A game director server receives game-player-generated data of a computer game from game player devices operated by game players respectively. The game director server is separate from a game play server that generates and maintains a first game world for the game player devices to play the computer game. The game-player-generated data is used to generate and maintain, by the game director server, a second game world separate from the first game world. The second game world is not for the game player devices to play the computer game. Game scenes in the second game world are monitored to identify a subset of specific game scenes from among candidate game scenes. Specific video game streams that capture images of the specific game scenes are generated for streaming to game spectator device(s).

Inventors:
NINAN AJIT (US)
Application Number:
PCT/US2023/061403
Publication Date:
August 10, 2023
Filing Date:
January 26, 2023
Assignee:
DOLBY LABORATORIES LICENSING CORP (US)
International Classes:
A63F13/86; A63F13/352; A63F13/5252
Foreign References:
US20090253506A1 (2009-10-08)
US20190262723A1 (2019-08-29)
US20170157512A1 (2017-06-08)
USPP63305636P
Other References:
JENS MÜLLER, SERGEI GORLATCH, TOBIAS SCHRÖTER, STEFAN FISCHER: "Scaling Multiplayer Online Games using Proxy-Server Replication: A Case Study of Quake 2", ACM, 2 Penn Plaza, Suite 701, New York, USA, 25 June 2007 (2007-06-25) - 29 June 2007 (2007-06-29), pages 219-220, XP040063332, ISBN: 978-1-59593-673-8
Attorney, Agent or Firm:
ZHANG, Yiming et al. (US)
Claims:
CLAIMS:

1. A method comprising: receiving, by a game director server, a mirror copy of game-player-generated data of a computer game from one or more game player devices operated by one or more game players respectively, wherein the game director server is separate from a game play server that generates and maintains a first game world for the one or more game player devices to play the computer game; using the game-player-generated data to generate and maintain, by the game director server, a second game world separate from the first game world, wherein the second game world is not for the one or more game player devices to play the computer game; monitoring game scenes in the second game world to identify a subset of one or more specific game scenes from among a plurality of candidate game scenes; generating one or more specific video game streams that capture images of the one or more specific game scenes for streaming to one or more game spectator devices; wherein the game director server autonomously and dynamically updates the second game world without receiving any game play data generated by the game play server after the computer game starts unfolding.

2. The method of claim 1, wherein the second game world represents a mirror copy of the first game world.

3. The method of any of claims 1-2, wherein the game director server receives the game-player-generated data from an aggregation and redirection server that receives the game-player-generated data from the one or more game player devices.

4. The method of any of claims 1-3, wherein the game director server receives the game-player-generated data forwarded by the game play server.
5. The method of any of claims 1-4, wherein the game play server generates game play data based at least in part on the game-player-generated data; wherein the game play server sends at least a part of the game play data to the one or more game player devices to update one or more game views rendered on one or more image displays operating in conjunction with the one or more game player devices.

6. The method of any of claims 1-5, wherein the one or more specific video game streams are streamed to one or more game spectator devices by way of a video game streaming server.

7. The method of any of claims 1-6, wherein the game director server operates with an automated director to generate automated game directing data that selects one or more specific game scenes to be depicted by the one or more specific game streams at one or more specific time points.

8. The method of any of claims 1-7, wherein the game director server communicates with a designated director to generate second game directing data that selects one or more second specific game scenes to be depicted by the one or more specific game streams at one or more second specific time points.

9. The method of any of claims 1-8, wherein the game director server operates with a specific game spectator device of the one or more game spectator devices to generate third game directing data that selects one or more third specific game scenes to be depicted by a specific game stream of the one or more specific game streams at one or more third specific time points; wherein the specific game stream is to be streamed to the specific game spectator device.

10. The method of any of claims 1-9, wherein the one or more specific game streams depict one or more specific game scenes captured by one or more specific cameras in a specific time interval in the computer game; wherein the game director server selects one or more specific camera positions in relation to the second game world for placing the one or more specific cameras in the specific time interval.

11. The method of claim 10, wherein the one or more specific camera positions include one or more of: discrete positions within the second game world, positions on one or more spatial trajectories within the second game world, or positions outside of the second game world.

12. An apparatus performing any of the methods as recited in claims 1-11.

13. A non-transitory computer readable storage medium, storing software instructions, which when executed by one or more processors cause performance of the method recited in any of claims 1-11.

14. A computing device comprising one or more processors and one or more storage media, storing a set of instructions, which when executed by one or more processors cause performance of the method recited in any of claims 1-11.

Description:
DIRECTING COMPUTER GAME STREAMING

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to the following priority applications: US provisional application 63/305,635, filed 01 February 2022, and EP application 22155828.1, filed 09 February 2022, both of which are incorporated by reference in their entirety.

TECHNICAL FIELD

[0002] The present invention relates generally to computer games, and in particular, to directing computer game streaming.

BACKGROUND

[0003] Computer games have become spectator sports. Numerous game enthusiasts or spectators follow popular computer game events and watch live (and even recorded) video streams of tournaments of computer games. These game video streams may provide bird's-eye views or first-person views of a played computer game.

[0004] Under some approaches, a game player may wear a camera to capture a game video stream of screen images rendered on an image display in front of the game player. The captured game video stream can be uploaded to a cloud-based game video streaming server, and broadcast or streamed to game spectators by way of the game video streaming server.

[0005] While the game video stream as captured by the game player might provide a more engaging or immersive view than a bird's-eye view, such a game video stream would probably miss at least half of the actions in the computer game. Moreover, as the game player could make head or body movements from time to time, images in the captured game video stream would become jerky and hard to follow visually. As a result, user experience with game video streaming could deteriorate greatly, as compared with other types of video streaming such as authored/directed video streaming controlled by skilled video artists or professionals.

[0006] The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, issues identified with respect to one or more approaches should not be assumed to have been recognized in any prior art on the basis of this section, unless otherwise indicated.

SUMMARY

[0007] Embodiments are disclosed for directing computer game streaming.

[0008] In some embodiments, there is provided a method comprising: receiving, by a game director server, a mirror copy of game-player-generated data of a computer game from one or more game player devices operated by one or more game players respectively, wherein the game director server is separate from a game play server that generates and maintains a first game world for the one or more game player devices to play the computer game; using the game-player-generated data to generate and maintain, by the game director server, a second game world separate from the first game world, wherein the second game world is not for the one or more game player devices to play the computer game; monitoring game scenes in the second game world to identify a subset of one or more specific game scenes from among a plurality of candidate game scenes; generating one or more specific video game streams that capture images of the one or more specific game scenes for streaming to one or more game spectator devices, wherein the game director server autonomously and dynamically updates the second game world without receiving any game play data generated by the game play server after the computer game starts unfolding.

[0009] In some embodiments, the second game world represents a mirror copy of the first game world.

[00010] In some embodiments, the game director server receives the game-player-generated data from an aggregation and redirection server that receives the game-player-generated data from the one or more game player devices.

[00011] In some embodiments, the game director server receives the game-player-generated data forwarded by the game play server.

[00012] In some embodiments, the game play server generates game play data based at least in part on the game-player-generated data; wherein the game play server sends at least a part of the game play data to the one or more game player devices to update one or more game views rendered on one or more image displays operating in conjunction with the one or more game player devices.

[00013] In some embodiments, the one or more specific video game streams are streamed to one or more game spectator devices by way of a video game streaming server.

[00014] In some embodiments, the game director server operates with an automated director to generate automated game directing data that selects one or more specific game scenes to be depicted by the one or more specific game streams at one or more specific time points.

[00015] In some embodiments, the game director server communicates with a designated director to generate second game directing data that selects one or more second specific game scenes to be depicted by the one or more specific game streams at one or more second specific time points.

[00016] In some embodiments, the game director server operates with a specific game spectator device of the one or more game spectator devices to generate third game directing data that selects one or more third specific game scenes to be depicted by a specific game stream of the one or more specific game streams at one or more third specific time points; wherein the specific game stream is to be streamed to the specific game spectator device.

[00017] In some embodiments, the one or more specific game streams depict one or more specific game scenes captured by one or more specific cameras in a specific time interval in the computer game; wherein the game director server selects one or more specific camera positions in relation to the second game world for placing the one or more specific cameras in the specific time interval.

[00018] In some embodiments, the one or more specific camera positions include one or more of: discrete positions within the second game world, positions on one or more spatial trajectories within the second game world, positions outside of the second game world.

[00019] In some embodiments, there is provided an apparatus performing any of the above methods.

[00020] In some embodiments, there is provided a non-transitory computer readable storage medium, storing software instructions, which when executed by one or more processors cause performance of any of the above methods.

[00021] In some embodiments, there is provided a computing device comprising one or more processors and one or more storage media, storing a set of instructions, which when executed by one or more processors cause performance of any of the above methods.

BRIEF DESCRIPTION OF DRAWINGS

[00022] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:

[00023] FIG. 1A through FIG. 1C illustrate example systems for directing computer game streaming;

[00024] FIG. 2 illustrates an example video game streaming configuration with a game video streaming server;

[00025] FIG. 3A illustrates example automated directed video game streaming with a game director server; FIG. 3B illustrates example directed video game streaming with a game director server operating with an external non-game spectator device; FIG. 3C illustrates example directed video game streaming with a game director server operating with a game spectator device;

[0010] FIG. 4 illustrates an example process flow; and

[0011] FIG. 5 illustrates an example hardware platform on which a computer or a computing device as described herein may be implemented.

DESCRIPTION OF EXAMPLE EMBODIMENTS

[0012] Example embodiments, which relate to directing computer game streaming, are described herein. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are not described in exhaustive detail, in order to avoid unnecessarily occluding, obscuring, or obfuscating the present invention.

[0013] Example embodiments are described herein according to the following outline:

1. GENERAL OVERVIEW

2. GAME PLAYING

3. GAME MIRRORING

4. DIRECTED GAME STREAMING

5. EXAMPLE PROCESS FLOWS

6. IMPLEMENTATION MECHANISMS - HARDWARE OVERVIEW

7. EQUIVALENTS, EXTENSIONS, ALTERNATIVES AND MISCELLANEOUS

1. GENERAL OVERVIEW

[0014] This overview presents a basic description of some aspects of an example embodiment of the present invention. It should be noted that this overview is not an extensive or exhaustive summary of aspects of the example embodiment. Moreover, it should be noted that this overview is not intended to be understood as identifying any particularly significant aspects or elements of the example embodiment, nor as delineating any scope of the example embodiment in particular, nor the invention in general. This overview merely presents some concepts that relate to the example embodiment in a condensed and simplified format, and should be understood as merely a conceptual prelude to a more detailed description of example embodiments that follows below. Note that, although separate embodiments are discussed herein, any combination of embodiments and/or partial embodiments discussed herein may be combined to form further embodiments.

[0015] Under some approaches that do not implement techniques as described herein, while playing a computer game, a game player can wear a camera as a part of a head-mounted device to capture images displayed on an image display with which the game player is playing the computer game. Another camera such as a front camera of a computer placed in front of the game player can be used to capture images of the game player while the computer game is being played. The images captured by these two cameras can be uploaded to a streaming server such as Twitch for streaming to game spectators. However, as the computer game may be a multi-player game, the images captured by these two cameras situated with the game player may miss crucial information of what is going on in the computer game. In addition, images captured by a head-mounted camera can look very jerky and jumpy as the game player makes head or body movements from time to time.

[0016] In contrast, under techniques as described herein, a game director server - which may be cloud based - can receive a mirror copy of contemporaneous game-player-generated data and generate a dedicated game world for streaming purposes, which represents a mirror copy of a (game-play-server-generated) game world for playing purposes maintained or generated with a game play server. The game world maintained and generated with the game director server evolves or develops exactly the same sequences of game scenes and game states as the game-play-server-generated game world for the computer game over a game session. In addition to or in place of cameras disposed with or near a game player as in other approaches, under techniques as described herein, the game director server can concurrently and/or consecutively deploy, position and/or invoke real or virtual cameras at various vantage locations or perspectives in relation to a (game-director-server-generated) game world in which game plays are unfolding and interesting players (as represented by their avatars) are playing. As a result, all interesting game plays or scenes in the game world can be captured (e.g., completely, from multiple locations or perspectives (e.g., orientations, directions, viewpoints, etc.) on a game play or scene, etc.). Notably, as the cameras can be deployed in the mirror copy of the game world at fixed locations or along smooth spatial trajectories or paths, images captured by some or all of the cameras under techniques as described herein can be relatively stable as compared with head-mounted cameras under other approaches that do not implement these techniques. As the cameras can be deployed at any spatial locations in the mirror copy of the game world, images captured by some or all of the cameras under techniques as described herein can capture and convey relatively complete game playing information as compared with head-mounted cameras under other approaches that do not implement these techniques.
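The mirroring principle in paragraph [0016] can be sketched in a few lines of Python. All class and variable names here are illustrative assumptions rather than anything specified in the application; the point is only that two deterministic simulations fed the same mirror copy of player-generated events evolve identical sequences of states, so the director server's world stays a faithful mirror without any data from the play server:

```python
# Minimal sketch (hypothetical names): the same player-generated input events
# drive two independent but deterministic game-world simulations, so the
# director server's world stays an exact mirror of the play server's world.

class GameWorld:
    """A deterministic game world: state evolves only from input events."""

    def __init__(self):
        self.positions = {}  # avatar id -> (x, y)

    def apply_event(self, event):
        # Each event moves one avatar; identical events yield identical states.
        avatar, dx, dy = event["avatar"], event["dx"], event["dy"]
        x, y = self.positions.get(avatar, (0, 0))
        self.positions[avatar] = (x + dx, y + dy)


play_world = GameWorld()      # maintained by the game play server
director_world = GameWorld()  # maintained by the game director server

# A mirror copy of each player-generated event is delivered to both worlds.
events = [
    {"avatar": "p1", "dx": 3, "dy": 0},
    {"avatar": "p2", "dx": 0, "dy": -2},
    {"avatar": "p1", "dx": 1, "dy": 1},
]
for event in events:
    play_world.apply_event(event)
    director_world.apply_event(event)

assert play_world.positions == director_world.positions  # worlds stay identical
```

Determinism is what makes this work: if the simulation consumed any hidden randomness or play-server-only data, the two worlds would drift apart.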

[0017] Images captured by various cameras can be used to generate different video game streams from different angles, different views, different locations, different spatial trajectories (with different positions and/or different motion characteristics relating to camera movement in terms of velocity, acceleration, rotation, etc.), etc. In some operational scenarios, the game director server can also, automatically or incorporating user input, select one or more specific video game streams at any given time for streaming to specific or all game spectators.
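The automated selection step in paragraph [0017] can be sketched as follows. The activity metric here (a count of moving avatars per camera's scene) is a hypothetical stand-in for whatever scene-interest criterion a real director server would use; the camera names are likewise illustrative:

```python
# Hypothetical sketch: given per-camera summaries of the scenes currently in
# view, pick the stream whose scene shows the most activity at a given time.

camera_frames = {
    "overhead": {"moving_avatars": 1},
    "castle": {"moving_avatars": 4},
    "trajectory-1": {"moving_avatars": 2},
}

def select_stream(frames):
    """Return the camera whose scene has the highest activity score."""
    return max(frames, key=lambda cam: frames[cam]["moving_avatars"])

chosen = select_stream(camera_frames)  # "castle": the busiest scene
```

In practice this decision could be re-evaluated at each of the "specific time points" the claims describe, switching the outgoing stream as scenes gain or lose interest.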

[0018] Example embodiments described herein relate to video game streaming. A game director server receives game-player-generated data of a computer game from one or more game player devices operated by one or more game players respectively. The game director server is separate from a game play server that generates and maintains a first game world for the one or more game player devices to play the computer game. The game-player-generated data is used to generate and maintain, by the game director server, a second game world separate from the first game world. The second game world is not for the one or more game player devices to play the computer game. Game scenes in the second game world are monitored to identify a subset of one or more specific game scenes from among a plurality of candidate game scenes. One or more specific video game streams that capture images of the one or more specific game scenes are generated for streaming to one or more game spectator devices.

[0019] In some example embodiments, mechanisms as described herein form a part of a media processing system, including but not limited to any of: cloud-based server, mobile device, virtual reality system, augmented reality system, head up display device, helmet mounted display device, CAVE-type system, wall-sized display, video game device, display device, media player, media server, media production system, camera systems, home-based systems, communication devices, video processing system, video codec system, studio system, streaming server, cloud-based content service system, a handheld device, game machine, television, cinema display, laptop computer, netbook computer, tablet computer, cellular radiotelephone, electronic book reader, point of sale terminal, desktop computer, computer workstation, computer server, computer kiosk, or various other kinds of terminals and media processing units.

[0020] Various modifications to the preferred embodiments and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the disclosure is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.

2. GAME PLAYING

[0021] FIG. 1A illustrates an example system 100 for directing computer game streaming. Some or all devices and components/units thereof in the system (100) may be implemented in software, hardware, or a combination of software and hardware, with one or more of: computing processors such as central processing units or CPUs, audio codecs, video codecs, digital signal processors, graphic processing units or GPUs, etc.

[0022] As illustrated in FIG. 1A, the system (100) includes a game play server 110 that hosts a (e.g., live, video, online, real time, near real time, competitive, tournament, single player, multi-player, team-based, etc.) computer game played by one or more (e.g., human, etc.) game players respectively operating one or more game player devices 102-1, 102-2, ... 102-N, where N is an integer no less than one (1). The computer game may last for a game session or a time interval such as 30 minutes, one hour, one and a half hours, etc. The computer game may be, but is not necessarily limited to only, one of: League of Legends, Counter-Strike, Call of Duty, Dota 2, Fortnite, Overwatch, PlayerUnknown’s Battlegrounds, StarCraft, etc.

[0023] The one or more game players may be respectively (virtually) represented by one or more avatars located in a game world 114. The game world (114) can be (e.g., autonomously, independently, without depending on a game director/streaming server, etc.) generated, defined, dynamically represented and/or maintained in the game play server (110) or a game-player-interacting game engine 112 therein. For reference purposes, the game world (114) may be referred to as the play server game world (114).

[0024] The game-player-interacting game engine (112) interacts with the one or more game player devices (102-1, 102-2, . . . 102-N) in the game session for the game players to play the computer game. The game-player-interacting game engine (112) can communicate during the game session with the one or more game player devices (102-1, 102-2, . . . 102-N) over one or more bidirectional data flows 118-1, 118-2, ... 118-N, respectively.

[0025] For example, the game-player-interacting game engine (112) receives (e.g., live, real time, near real time, etc.) game-player-generated data from the game player devices (102-1, 102-2, ... 102-N) over the bidirectional data flows (118-1, 118-2, ... 118-N). The game-player-generated data may be generated by the game player devices (102-1, 102-2, ... 102-N) from user interactions with the game players through one or more user interfaces. In response to receiving the game-player-generated data, the game-player-interacting game engine (112) determines, identifies and/or generates corresponding game player generated/triggered events represented or encapsulated in the game-player-generated data. The game-player-interacting game engine (112) uses the game-player-generated data, as well as game-server-maintained data such as states of the play server game world (114) and histories of those states, to drive (e.g., live, real time, near real time, etc.) changes in the states of the play server game world (114) over a single or multiple concurrent and/or consecutive game plays during the game session.

[0026] Additionally, optionally or alternatively, the game-player-interacting game engine (112) may (e.g., concurrently, contemporaneously, subsequently, within a relatively strict time budget measured in a few milliseconds or seconds, etc.) generate or trigger (game server generated/triggered) events in response to one or more of: present or past received game player generated/triggered events, concurrent and/or past states of the play server game world (114), changes in the states of the play server game world (114), and so on. The game play server (110), or the game-player-interacting game engine (112) therein, may represent or encapsulate the game-play-server-generated/triggered events into game-play-server-generated data, and send/transmit (e.g., within a strict time budget, in real time, in near real time, etc.) some or all of the game-server-generated data to the game player devices (102-1, 102-2, ... 102-N) over the bidirectional data flows (118-1, 118-2, ... 118-N). The game-server-generated data may be used by the game player devices (102-1, 102-2, ... 102-N) to render or update their respective visual depictions (including any accompanying audio) of the computer game with image displays and/or audio speakers.
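One engine step of the kind paragraph [0026] describes, where the engine applies player-generated events and then derives server-triggered events from the resulting world state, can be sketched like this. The collision rule and all names are hypothetical illustrations, not rules from the application:

```python
# Hypothetical sketch of one engine tick: apply player-generated events to the
# world state, then generate server-triggered events from the new state for
# transmission back to the player devices.

def game_tick(world_state, player_events):
    """One engine step: apply player events, then derive server events."""
    server_events = []
    for event in player_events:
        world_state[event["avatar"]] = event["new_pos"]
    # An example server-triggered event: if two avatars share a position,
    # report a collision between them.
    positions = {}
    for avatar, pos in world_state.items():
        if pos in positions:
            server_events.append(
                {"type": "collision", "avatars": (positions[pos], avatar)}
            )
        positions[pos] = avatar
    return server_events

world = {"p1": (0, 0), "p2": (3, 3)}
events = [{"avatar": "p1", "new_pos": (3, 3)}]
out = game_tick(world, events)  # reports a collision between p1 and p2
```

The time budget mentioned in the text would bound how long one such tick, plus transmission of `server_events`, is allowed to take.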

[0027] A game world as described herein such as 114 or 108 of FIG. 1A includes avatars representing the game players and attendant game characters/objects (e.g., rooms, castles, rocks, creatures, etc.) in a (virtual, multi-dimensional) game space. The game space may be composed of all spatial locations or regions of the game world. The avatars and attendant game characters/objects may be stationary or moving in the game space at a given time in the game session as determined by their respective individual states at the given time. These individual states of the avatars and attendant game characters/objects at the given time collectively or aggregately give rise to, or define, the overall state of the game world at the given time. A visual depiction of the game world may be captured at the given time by a virtual or real camera positioned in relation to (e.g., at a spatial location/point hovering above, at a spatial location/point within, etc.) the game world. The visual depiction of the game world as captured by the camera provides a visual representation of the overall state of the game world or a portion thereof at the given time.
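Paragraph [0027]'s relationship between individual states, the overall world state, and a camera's visual depiction can be sketched as follows. The flat dictionary of states and the circular view radius are simplifying assumptions for illustration:

```python
# Hypothetical sketch: the overall game-world state aggregates individual
# avatar/object states, and a camera positioned in relation to the game space
# captures a depiction of only the portion within its view.

import math

world_state = {
    "p1": {"pos": (0.0, 0.0), "moving": False},
    "p2": {"pos": (5.0, 5.0), "moving": True},
    "rock": {"pos": (20.0, 1.0), "moving": False},
}

def capture(world, camera_pos, view_radius):
    """Return the visual depiction: the states visible from the camera."""
    visible = {}
    for name, state in world.items():
        dx = state["pos"][0] - camera_pos[0]
        dy = state["pos"][1] - camera_pos[1]
        if math.hypot(dx, dy) <= view_radius:
            visible[name] = state
    return visible

frame = capture(world_state, camera_pos=(0.0, 0.0), view_radius=10.0)
# p1 and p2 fall within the view; the distant rock does not.
```

A camera "hovering above" the world, as the text puts it, would simply be one whose position and view cover the whole game space.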

[0028] The computer game played by the game players may comprise sequential or concurrent game plays unfolding in the play server game world (114) over the game session. A (computer) game play is a time-varying evolution of - or a sequence of (consecutive and/or concurrent) changes in states of - the play server game world (114) or a portion thereof, over a time interval in the game session, that is caused by a set of player-generated and/or play-server-generated events.

[0029] A visual depiction of a game play - including, but not necessarily limited to only, from a perspective of an avatar representing a game player operating a game player device - may be captured (as camera-generated images) over the time duration of the game play (possibly with leading or trailing time margins of a few seconds, a few minutes, etc.) by a virtual or real camera positioned in relation to a game world (e.g., 114, 108, etc.). The visual depiction may be streamed or transmitted to image rendering device(s) such as game player device(s) and/or game spectator device(s) and rendered by these devices on wearable or nonwearable image displays as video images. The visual depiction of the game play may be live or recorded (or replayed/highlighted). As used herein, “a live game play” or “a live visual depiction of the game play” may mean streaming video images of the game play from a game streaming server (e.g., a game director server, etc.) in real time, near real time, or within a relatively strict time limit such as one (1) millisecond, five (5) milliseconds, ten (10) milliseconds, etc., while the game play is being made in the play server game world (114). In contrast, “a recorded game play,” “a recorded visual depiction of the game play,” “a highlighted game play,” “a highlighted visual depiction of the game play,” “a replayed game play,” “a replayed visual depiction of the game play,” etc., may mean streaming video images of the game play on a time-delay basis (e.g., five (5) seconds later, minutes later, etc.) after some or all of the game play has occurred or been mirrored in a game world (114 or 108). Example video game streaming can be found in U.S. Patent Application No. 63/305,636 (Attorney Docket No. 60175-0497; D21070USP1), titled “ENHANCING AND TRACKING VIDEO GAME STREAMING,” by Ajit Ninan, filed on even date herewith, the contents of which are incorporated herein by reference in their entirety.
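The live versus time-delayed distinction in paragraph [0029] reduces to when frames are emitted relative to when they are produced. A minimal sketch, with purely illustrative frame tuples of (production time, frame):

```python
# Hypothetical sketch: a live stream emits each frame essentially at its
# production time, while a recorded/replayed stream emits frames on a fixed
# time delay after production.

def live_stream(frames):
    for t, frame in frames:
        yield (t, frame)  # emitted at (approximately) production time t

def delayed_stream(frames, delay):
    for t, frame in frames:
        yield (t + delay, frame)  # emitted `delay` time units after production

frames = [(0, "f0"), (1, "f1"), (2, "f2")]
assert next(live_stream(frames)) == (0, "f0")
assert list(delayed_stream(frames, 5)) == [(5, "f0"), (6, "f1"), (7, "f2")]
```

The "relatively strict time limit" for live streaming in the text corresponds to bounding the gap between production and emission, rather than making it exactly zero.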

[0030] Each game player device of the game player devices (102-1, 102-2, ... 102-N) can be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game client application for the computer game. When executed, a game client application running on a game player device as described herein enables or causes the game player device to connect with the game play server (110) to play the computer game in the game session, to capture game-player-generated events from interactions with a game player operating the game player device, to communicate the game-player-generated events to the game play server (110), to receive states of some or all of the avatars including an avatar representing the game player and states of some or all of the attendant game characters/objects, to receive visual depictions and/or non-visual representations of the play server game world (114) or game plays occurred therein, to generate and/or render visual depictions of some or all of the play server game world (114) or game plays occurred therein on an image display operating in conjunction with the game player device, etc. In some operational scenarios, two-dimensional (2D) or three-dimensional (3D) visual game maps (e.g., rendered in an image display or a portion thereof, etc.) may be used to present at least a part of these visual depictions of some or all of the play server game world (114) or game plays occurred therein by the game player device to the game player.

[0031] The game play server (110) that hosts the computer game played by the game players may communicate with the game player devices (102-1, 102-2, ... 102-N) over wired and/or wireless data communication networks or links. The game play server (110) may be implemented with a single computer device or multiple computer devices local or remote to some or all of the game player devices (102-1, 102-2, ...
102-N) in a peer-to-peer model, in a master-slave model, in a client-server model, in a distributed computing model, etc. In some operational scenarios, some or all of the game play server (110) may be cloud-based (e.g., a cloud-based server, a cluster of cloud-based virtual computers, etc.), premise-based, a combination of cloud-based and premise-based, etc.

[0032] The game play server (110) represents a (final) judge, arbitrator or authority for all the states of (hence all the game plays involving) the avatars and the attendant game characters/objects in the computer game, as played by the game players via the game player devices (102-1, 102-2, . . . 102-N). Visual and non-visual representations of the states of any of the avatars and the attendant game characters/objects in the computer game as received, generated and/or rendered by any of the game player devices (102-1, 102-2, . . . 102-N) are ensured - by game server functionality implemented by the game play server (110) and the game client applications running on the game player devices (102-1, 102-2, . . . 102-N) - to be consistent with the states of the avatars and the attendant game characters/objects in the computer game as maintained in or by the game play server (110).

3. GAME MIRRORING

[0033] As illustrated in FIG. 1A, the system (100) also includes a game director server 104 that supports directed (e.g., live, real time, near real time, time-delayed, etc.) video streaming of the computer game to one or more game spectator devices 116-1, 116-2, . . . 116-M, where M is an integer no less than one (1), operated respectively by one or more (e.g., human, etc.) game spectators. The directed video streaming of the computer game may last for the entire game session or some time segments/intervals therein.

[0034] While the computer game is being played by the one or more game players by way of the game play server (110) and the game player devices (102-1, 102-2, . . . 102-N), the game director server (104), or a game-player-non-interacting game engine (106) therein, (e.g., separately, autonomously, independently, without depending on the game play server, etc.) generates, defines, dynamically represents and/or maintains a game world 108. The one or more game players may be respectively and identically (virtually) represented by one or more avatars located in the game world (108). For reference purposes, the game world (108) may be referred to as the director server game world (108).

[0035] The game-player-non-interacting game engine (106) does not interact with the one or more game player devices (102-1, 102-2, . . . 102-N) operated by the game players to play the computer game in the game session, in that the game-player-non-interacting game engine (106) may only communicate during the game session with the one or more game player devices (102-1, 102-2, . . . 102-N) over one or more unidirectional data flows 120-1, 120-2, . . . 120-N, respectively.

[0036] For example, the game-player-non-interacting game engine (106) receives (e.g., live, real time, near real time, etc.) game-player-generated data the same as received by the game-player-interacting game engine (112) from the game player devices (102-1, 102-2, . . . 102-N) over the unidirectional data flows (120-1, 120-2, . . . 120-N). In response to receiving the game-player-generated data, the game-player-non-interacting game engine (106) determines, identifies and/or generates corresponding game player generated/triggered events represented or encapsulated in the game-player-generated data. The game-player-non-interacting game engine (106) uses the game-player-generated data as well as game-server-maintained data such as states of the director server game world (108) and histories of states of the director server game world (108) to drive (e.g., live, real time, near real time, etc.) changes in the states of the director server game world (108) over a single or multiple concurrent and/or consecutive game plays during the game session.
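The mirroring behavior described above - a game engine that consumes the same player-generated event stream as the play server but never sends anything back - can be sketched minimally as follows. The event format, the "move" action, and all class and field names here are hypothetical illustrations, not part of the application:

```python
from dataclasses import dataclass, field

@dataclass
class PlayerEvent:
    """A game-player-generated event (hypothetical format)."""
    player_id: str
    action: str     # e.g., "move"
    payload: tuple  # e.g., a (dx, dy) movement delta

@dataclass
class MirrorEngine:
    """Game-player-non-interacting engine sketch: drives a mirror game world
    purely from player-generated events; emits nothing to player devices."""
    positions: dict = field(default_factory=dict)  # avatar states in the mirror world

    def apply(self, event: PlayerEvent) -> None:
        # Update mirror-world state from the player-generated event alone.
        if event.action == "move":
            x, y = self.positions.get(event.player_id, (0, 0))
            dx, dy = event.payload
            self.positions[event.player_id] = (x + dx, y + dy)

# The identical event stream delivered over the unidirectional data flows:
engine = MirrorEngine()
for ev in [PlayerEvent("p1", "move", (1, 0)), PlayerEvent("p1", "move", (0, 2))]:
    engine.apply(ev)
print(engine.positions["p1"])  # (1, 2)
```

Because the engine's state is a deterministic function of the received events, it stays consistent with the play server's world without ever receiving play-server-generated data.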

[0037] Additionally, optionally or alternatively, the game-player-non-interacting game engine (106) may (e.g., concurrently, contemporaneously, subsequently, within a relatively strict time budget measured in a few milliseconds or seconds, etc.) generate or trigger (director server generated/triggered) events in response to one or more of: present or past received game player generated/triggered events, concurrent and/or past states of the director server game world (108), changes in the states of the director server game world (108), and so on. Unlike the game play server (110) or the game-player-interacting game engine (112) therein, the game director server (104) or the game-player-non-interacting game engine (106) therein does not represent or encapsulate the game director server generated/triggered events into any server-generated data to be sent/transmitted to the game player devices (102-1, 102-2, . . . 102-N). Hence, the game player devices (102-1, 102-2, . . . 102-N) do not receive or use any server-generated data from the game director server (104) to render or update their respective visual depictions (including any accompanying audio) of the computer game with image displays and/or audio speakers.

[0038] Like the game-play-server-generated game world (114), the game-director-server-generated game world (108) includes the same avatars representing the same game players and the same attendant game characters/objects (e.g., rooms, castles, rocks, creatures, etc.) in an identical or mirrored (virtual, multi-dimensional) game space. The avatars and attendant game characters/objects may be stationary or moving in the identical or mirrored game space of the game-director-server-generated game world (108) for or at a given time in the game session as determined by their respective individual states at the given time, mirroring the corresponding avatars and attendant game characters/objects represented in the game-play-server-generated game world (114). A visual depiction of the game-director-server-generated game world (108) may be captured at the given time by a virtual or real camera positioned in relation to (e.g., at a spatial location/point hovering above, at a spatial location/point within, etc.) the game-director-server-generated game world (108). The visual depiction of the game-director-server-generated game world (108) as captured by the camera provides a visual representation of the overall state of the game-director-server-generated game world (108) or a portion thereof at the given time.

[0039] As in the computer game hosted in the game play server (110), sequential or concurrent game plays may be mirrored as unfolding in the game-director-server-generated game world (108) over the game session. A visual depiction of a game play - e.g., from a perspective of an avatar representing a game player operating a game player device - as mirrored in the game-director-server-generated game world (108) may be captured (as camera-generated images) by the game director server (104) over the time duration of the game play (possibly with leading or trailing time margins) by a virtual or real camera positioned in relation to the game-director-server-generated game world (108). The visual depiction may be streamed or transmitted to image rendering device(s) such as game player device(s) and/or game spectator device(s) and rendered by these devices on wearable or nonwearable image displays as video images. The visual depiction of the game play may be live, recorded or highlighted.

[0040] The game director server (104) or the player-non-interacting game engine (106) therein may communicate with the game player devices (102-1, 102-2, . . . 102-N) over the data connections (118 and 120) in a variety of ways.

[0041] In a first example, the game client application for the computer game as configured or installed with each game player device of some or all of the game player devices (102-1, 102-2, . . . 102-N) may be provisioned or equipped with a (e.g., dedicated, separate, etc.) data connection to each of the game director server (104) and the game play server (110), and can send an identical copy of some or all player-generated data of the game play device to each of the game director server (104) and the game play server (110) on the data connection (e.g., a unidirectional data connection, a bidirectional data connection, etc.).

[0042] In a second example, in a system 100-1 as illustrated in FIG. 1B, a separate message aggregation and redirection server 126 may be provisioned or deployed between some or all of the game player devices (102-1, 102-2, . . . 102-N) on one side and the game director server (104) and the game play server (110) on the other side. Some or all of the data connections (118 and 120) may be implemented or supported by the message aggregation and redirection server (126). For instance, instead of directly sending player-generated data (or corresponding messages encapsulating the player-generated data) from a game player device to one or both of the game director server (104) and the game play server (110), the game player device can send the player-generated data to the message aggregation and redirection server (126). The player-generated data can then be forwarded or routed to one or both of the game director server (104) and the game play server (110). Conversely, server-generated data (e.g., only the game-play-server generated data, without game-director-server generated data, etc.) may be sent from the game play server (110) to a game player device by way of the message aggregation and redirection server (126).
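The fan-out performed by a message aggregation and redirection server of this kind can be sketched as follows. The class and method names are hypothetical, and simple recorders stand in for the two servers:

```python
class MessageAggregationRedirectionServer:
    """Sketch (hypothetical API): receives each player-generated message once
    and delivers an identical copy to both the game play server and the game
    director server; only play-server data is routed back to player devices."""

    def __init__(self, play_server, director_server):
        self.play_server = play_server
        self.director_server = director_server

    def on_player_message(self, player_id, message):
        # Same copy goes to both destinations; the director-bound flow is
        # effectively unidirectional (nothing is returned from the director).
        self.play_server.receive(player_id, message)
        self.director_server.receive(player_id, message)

    def on_server_message(self, player_device, message):
        # Only game-play-server-generated data flows back to player devices.
        player_device.receive(message)

class Recorder:
    """Stand-in endpoint that records what it receives."""
    def __init__(self):
        self.messages = []
    def receive(self, *args):
        self.messages.append(args)

play, director = Recorder(), Recorder()
mux = MessageAggregationRedirectionServer(play, director)
mux.on_player_message("p1", {"action": "jump"})
print(play.messages == director.messages)  # True: identical copies delivered
```

The same shape applies to the third example, where the redirection logic lives inside the game play server rather than in a separate server.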

[0043] In a third example, in a system 100-2 as illustrated in FIG. 1C, the game play server (110) may include a message redirection server 128. Some or all of the unidirectional data connections (120) may be implemented or supported at least in part by the message redirection server (128) in the game play server (110). For instance, instead of directly sending player-generated data (or corresponding messages encapsulating the player-generated data) from a game player device to each of the game director server (104) and the game play server (110), the game player device can send the player-generated data to the game play server (110). The message redirection server (128) in the game play server (110) can then forward (e.g., an identical copy of, etc.) the player-generated data to the game director server (104).

[0044] The game director server (104) may be implemented with a single computer device or multiple computer devices local or remote to some or all of the game player devices (102-1, 102-2, . . . 102-N) and/or the game spectator devices (116-1, 116-2, . . . 116-M) in a peer-to-peer model, in a master-slave model, in a client-server model, in a distributed computing model, etc. In some operational scenarios, some or all of the game director server (104) may be cloud-based (e.g., a cloud-based server, a cluster of cloud-based virtual computers, etc.), premise-based, a combination of cloud-based and premise-based, etc.

[0045] The game director server (104) does not represent a (final) judge, arbitrator or authority for all the states of (hence all the game plays involving) the avatars and the attendant game characters/objects in the computer game as played by the game players via the game player devices (102-1, 102-2, . . . 102-N). The game director server (104) does not affect the computer game as played by the game players via the game player devices (102-1, 102-2, . . . 102-N).

[0046] The game director server (104) does not host but mirrors the computer game played by the game players. The game director server (104) may communicate with the game spectator devices (116-1, 116-2, . . . 116-M) over wired and/or wireless data communication networks or links such as the data connections (122-1, 122-2, . . . 122-M) in a variety of ways.

[0047] In a first example, the game spectator devices (116-1, 116-2, . . . 116-M) may be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game spectator application for receiving video (or image) streams of the computer game directly from the game director server (104). When executed, a game spectator application running on a game spectator device as described herein enables or causes the game spectator device to connect with the game director server (104) to receive a live or recorded video stream of the computer game as captured by one or more cameras from the director-server-generated game world (108). Some or all of the video stream received by the game spectator device may be displayed or rendered on an image display operating in conjunction with the game spectator device.

[0048] In a second example, in a video game streaming configuration 200 as illustrated in FIG. 2, the game spectator devices (116-1, 116-2, . . . 116-M) may be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game spectator application for receiving video (or image) streams of the computer game indirectly from the game director server (104) by way of a game video streaming server 202 operating in conjunction with the game director server (104). The game video streaming server (202) receives one or more source video (or image) streams generated by the game director server (104) from one or more cameras capturing visual depiction(s) of the director-server-generated game world (108). When executed, a game spectator application running on a game spectator device as described herein enables or causes the game spectator device to connect with the game video streaming server (202) to receive a live or recorded video stream of the computer game as captured by the cameras from the director-server-generated game world (108). Some or all of the video stream received by the game spectator device may be displayed or rendered on an image display operating in conjunction with the game spectator device.

4. DIRECTED GAME STREAMING

[0049] Directed game streaming can be supported with a (e.g., cloud-based, non-cloud-based, etc.) mirror copy of a computer game running (e.g., in a game director server, autonomously, independently, without depending on game server generated data or events, etc.) in parallel to a source version of the computer game running in a game play server. As used herein, directed game streaming refers to using dynamic or real time game contents and contexts of a computer game to select or direct specific game views of the computer game in video game streaming of the computer game to game spectator(s).

[0050] Example dynamic or real time game contents and contexts in the computer game may include, but are not necessarily limited to only, any, some or all of: a presence or absence of a specific game player; anticipated, impending, or actual play actions of one or more game players or their avatars; anticipated, impending, or actual encountering of two specific game players or their avatars; anticipated, impending, or actual game server generated events such as scoring, death, etc., of an avatar representing a game player; anticipated, impending, or actual game player generated events in connection with an avatar representing a game player; change(s) of state(s) of a game world or a part therein; etc.

[0051] Example game views that can be dynamically selected - based on dynamic or real time game contents or contexts - under directed game streaming as described herein may include, but are not necessarily limited to only, any, some or all of: birds-eye game views; first-person game views; game views other than birds-eye and first-person views; etc. In some operational scenarios, game views in video game streaming can be used or directed to focus on favorite game player(s) and/or popular game play(s) and/or favorite location(s) of a depicted game world in the computer game.

[0052] Directed game streaming can be enabled, implemented or performed in different ways. In a first example, game view selection data may be generated internally by a game director server such as 104 of FIG. 1A. In a second example, game view selection data may be generated externally, and provided in game director data over a game directing data connection 124, to a game director server such as 104 of FIG. 1A.

[0053] In some operational scenarios, a human director (e.g., a director designated by a game streaming operator, a self-designated director, a game spectator performing self-directed game streaming while watching the self-directed game streaming, etc.) can monitor the computer game, dynamically select game views from a plurality of game views based on game contexts/contents of the computer game at various time points in a game session of the computer game, and cause one or more video game streams to be created based on the game views selected by the human director at various time points in the game session of the computer game. The human director can be a professional game streaming director, a professional game player who is not playing the computer game in the game session, a game spectator, etc.

[0054] In some operational scenarios, a non-human director (e.g., an artificial intelligence or AI based director, a machine learning or ML based director, a streaming server module as a part of or operating in conjunction with a game director server and/or a game play server, etc.) can be implemented or used to monitor the computer game, dynamically select game views from a plurality of game views based on game contexts/contents of the computer game at various time points in a game session of the computer game, and cause one or more video game streams to be created based on the game views selected by the non-human director at various time points in the game session of the computer game. The non-human director can be implemented in a game director server, a separate server/device/system external to but operating in conjunction with the game director server, etc.

[0055] FIG. 3A illustrates example automated directed video game streaming with a game director server (e.g., 104, etc.). Some or all devices or modules depicted in FIG. 3A can be implemented as a part of, or separately but operating in conjunction with, the game director server (104).

[0056] By way of example but not limitation, the game director server (104) includes or operates with an automated game director 302, a camera controller 304, a game video stream generator 306, one or more real or virtual cameras 308-1, 308-2, 308-3, etc., in addition to a game-player-non-interacting game engine (e.g., 106, etc.) and a game-director-server game world (e.g., 108, etc.) generated and/or driven by the game-player-non-interacting game engine (106).

[0057] The automated game director (302) can be implemented with software, hardware or a combination of software and hardware to monitor (e.g., live, real time, near real time, etc.) what is occurring in the game-director-server game world (108), and dynamically or adaptively generate automated game directing data that identifies and/or selects specific game scenes depicting specific avatars (e.g., representing specific game players, etc.), specific game moments/events, specific game plays, etc., in the game-director-server game world (108) at specific time points or time intervals in real time or near real time. These game scenes are deemed to be popular (e.g., to a population of game spectators, a specific group of game spectators, a specific game spectator, etc.) for video game streaming.

[0058] In some operational scenarios, the automated game director (302) can be implemented with an AI or ML based game scene prediction model for identifying and/or selecting specific scenes related to the computer game. The game scene prediction model can be implemented using artificial neural networks or ANNs (e.g., convolutional neural networks or CNNs, etc.) or non-neural-network AI or ML techniques.

[0059] The game scene prediction model may be trained offline or online in a training phase using past game sessions of the computer game. Viewing statistics collected from the past game sessions can serve as ground truths for identifying whether a specific type of game scene is popular among general or specific spectators. Game scenes along with their respective ground truths can be used to train the game scene prediction model to derive optimized operational parameters for the game scene prediction model that reduce or minimize prediction errors.
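The training loop described in paragraph [0059] might look like the following sketch, which substitutes a simple logistic regression for the game scene prediction model and uses invented toy scene features; the feature choices, names, and hyperparameters are all assumptions for illustration:

```python
import math

def train_scene_popularity_model(scenes, labels, epochs=500, lr=0.1):
    """Illustrative stand-in for the game scene prediction model: logistic
    regression trained by stochastic gradient descent. `scenes` are feature
    vectors describing past game scenes; `labels` are 1/0 popularity ground
    truths derived from viewing statistics of past game sessions."""
    w = [0.0] * len(scenes[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(scenes, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted popularity
            err = p - y                      # prediction error to reduce
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b  # optimized operational parameters

def predict(w, b, x):
    """Predicted popularity of a candidate scene, in [0, 1]."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy features per scene: (number of avatars present, scoring event occurred).
scenes = [(1, 0), (4, 1), (2, 0), (5, 1)]
labels = [0, 1, 0, 1]  # ground truths from past viewing statistics
w, b = train_scene_popularity_model(scenes, labels)
```

In deployment, `predict` would score candidate scenes of a live session, with high-scoring scenes identified or selected by the automated game directing data.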

[0060] In an application or deployment phase, the game scene prediction model can be included as a part of the game director server (104) or the automated game director (302) to generate the automated game directing data that predict, identify and/or select specific game scenes in the game session of the computer game.

[0061] Additionally, optionally or alternatively, in some operational scenarios, the automated game director (302) can be implemented with non-AI, non-ML techniques for identifying and/or selecting specific scenes related to the computer game. For example, rules or programming logic may be implemented to rank degrees of spectator interest or assign different weighting factors to different types of game plays, different game players, etc., and to generate automated game directing data that identify or select specific scenes based on these ranked interests or assigned weighting factors from among a plurality of candidate scenes related to the computer game.
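The rule-based alternative in paragraph [0061] can be illustrated with a small sketch; the attribute names and weighting factors below are invented purely for illustration:

```python
def rank_candidate_scenes(scenes, weights):
    """Hypothetical non-AI, non-ML director logic: score each candidate scene
    with fixed weighting factors and return the scenes ordered by estimated
    spectator interest (highest first)."""
    def score(scene):
        # Sum the weight of each attribute the scene exhibits; attributes
        # without an assigned weight contribute nothing.
        return sum(weights.get(attr, 0.0) for attr in scene["attributes"])
    return sorted(scenes, key=score, reverse=True)

# Example weighting factors for game play types / game players (illustrative).
weights = {"boss_fight": 5.0, "favorite_player_present": 3.0, "idle": -1.0}
candidates = [
    {"id": "scene-a", "attributes": ["idle"]},
    {"id": "scene-b", "attributes": ["boss_fight", "favorite_player_present"]},
    {"id": "scene-c", "attributes": ["favorite_player_present"]},
]
ranked = rank_candidate_scenes(candidates, weights)
print([s["id"] for s in ranked])  # ['scene-b', 'scene-c', 'scene-a']
```

The top-ranked scenes would then be carried in the automated game directing data.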

[0062] In some operational scenarios, the automated game director (302) is internal to the game director server (104). In some operational scenarios, the automated game director (302) may be external to, but operate in conjunction with, the game director server (104). The automated game director (302) may receive game director server maintained data and provide the automated game directing data over a game director data connection (e.g., 124 of FIG. 1A through FIG. 1C, etc.).

[0063] The camera controller (304) can (e.g., continuously, from time to time, live, in real time, in near real time, etc.) receive the automated game directing data, use the received data to identify or select specific camera(s) at specific location(s) in relation to the game-director-server game world (108) at specific time point(s) or in specific time interval(s), and direct the selected cameras to capture images of the selected game scenes of the game-director-server game world (108).

[0064] A real camera as described herein may capture images via light sensors or light sensitive materials operating with an optical stack. A virtual camera as described herein may be implemented to capture images through computer-implemented image rendering techniques using a 3D model defining or specifying visual or non-visual characteristics such as depths, locations, positions, contours, surfaces, volumes, textures, patterns, reflection properties, transparency properties, etc., of (e.g., virtual, etc.) avatars, attendant game characters/objects, ambient background, light sources, etc.
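At its simplest, a virtual camera's computer-implemented image rendering reduces to projecting game-world geometry onto an image plane. The pinhole projection below is one common choice, shown here as a sketch; the camera orientation (looking along +z) and the function signature are assumptions, not anything the application specifies:

```python
def project_point(camera_pos, focal_len, point):
    """Minimal pinhole-camera sketch of a virtual camera: project a 3D point
    in the game world onto the camera's 2D image plane. The camera is assumed
    to sit at `camera_pos` looking along the +z axis."""
    # Express the point in camera-relative coordinates.
    x, y, z = (p - c for p, c in zip(point, camera_pos))
    if z <= 0:
        return None  # behind the camera; not visible in the rendered image
    # Perspective divide: farther points land closer to the image center.
    return (focal_len * x / z, focal_len * y / z)

# An avatar at depth 10 in front of a camera at the origin:
print(project_point((0.0, 0.0, 0.0), 2.0, (5.0, 1.0, 10.0)))  # (1.0, 0.2)
```

A full renderer would add the textures, light sources, and reflection/transparency properties named above, but the projection step is where the camera's position in relation to the game world enters.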

[0065] The total number of active cameras (e.g., 308-1, 308-2, 308-3, etc.) at any given time of a game session of a computer game may or may not be fixed in various operational scenarios. A camera as described herein can be fixed or movable to (e.g., different, etc.) location(s) within the game-director-server game world (108) and/or location(s) outside the game-director-server game world (108) such as a birds-eye view location (e.g., where the camera (308-3) is placed outside or above the game world (108), etc.).

[0066] In a first example, at a first specific time point or interval, a first camera (e.g., one of the cameras (e.g., 308-1, 308-2, etc.), etc.) may be selected to be placed at a first spatial location 312-1 within the game world (108) to capture images of a specific game scene in the game world (108).

[0067] In a second example, at a second specific time point or interval (which may or may not be same as the first specific time point or interval), a second camera (which may or may not be same as the first camera) may be selected to be placed at a second different spatial location 312-2 within the game world (108) to capture images of a second specific game scene in the game world (108).

[0068] In a third example, at a third specific time interval (which may or may not include the first or second specific time point or interval), a third camera (which may or may not be same as the first or second camera) may be selected to be placed along a continuous or contiguous spatial path or trajectory 310 within the game world (108) to capture images of a third specific game scene in the game world (108). Positions, velocities, accelerations, etc., associated with the spatial path or trajectory (310) may be specifically identified or selected based on the automated game directing data for the purpose of capturing the third specific game scene.
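One plausible realization of the third example's spatial path or trajectory (310), with positions identified at specific times and velocities implied by the waypoint spacing, is piecewise-linear interpolation between waypoints. The `(time, position)` waypoint format and function name below are assumptions for illustration:

```python
def camera_pose_at(t, waypoints):
    """Sketch of a movable camera following a trajectory given as a sorted
    list of (time, position) waypoints; linear interpolation yields the
    camera position at time t, clamped outside the trajectory."""
    (t0, p0), (t1, p1) = waypoints[0], waypoints[-1]
    for (ta, pa), (tb, pb) in zip(waypoints, waypoints[1:]):
        if ta <= t <= tb:
            u = (t - ta) / (tb - ta)  # fraction of the way through segment
            return tuple(a + u * (b - a) for a, b in zip(pa, pb))
    return p0 if t < t0 else p1  # before the first / after the last waypoint

# A trajectory through the game world: right for 2 s, then up for 2 s.
path = [(0.0, (0.0, 0.0)), (2.0, (4.0, 0.0)), (4.0, (4.0, 4.0))]
print(camera_pose_at(1.0, path))  # (2.0, 0.0)
print(camera_pose_at(3.0, path))  # (4.0, 2.0)
```

Accelerations could be modeled by spacing waypoints non-uniformly or by substituting a spline for the linear segments.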

[0069] Additionally, optionally or alternatively, visual information including but not limited to video images from other cameras different from the cameras deployed at spatial locations in relation to the game-director-server game world (108) may be collected and streamed as a part of video game streaming of the game session of the computer game to game spectators (e.g., live, in real time, in near real time, etc.) by the game director server (104) or a game streaming server operating with the game director server (104).

[0070] The game video stream generator (306) can be implemented to use some or all of these cameras (e.g., 308-1, 308-2, 308-3, etc.) to generate or capture consecutive and/or concurrent sequences or streams of images at any given time points or intervals during the game session of the computer game. One or more specific streams of images from one or more specific cameras and/or specific attendant visual game information may be selected or switched to by the game video stream generator (306) - e.g., as directed by the automated game director (302) - to transmit, stream and/or otherwise provide to one or more game spectator devices, to a video game streaming server, to a video game broadcaster, to a video game recorder, etc.
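The select-and-switch behavior of a game video stream generator of this kind can be sketched as follows, with hypothetical camera identifiers and string "frames" standing in for per-camera image streams:

```python
class GameVideoStreamGenerator:
    """Illustrative sketch: maintain one image stream per camera and switch
    the single output stream among them as directed."""

    def __init__(self, camera_streams):
        self.camera_streams = camera_streams  # camera id -> iterator of frames
        self.active = None

    def switch_to(self, camera_id):
        # Directing data (e.g., from an automated game director) picks the
        # camera whose stream feeds spectators next.
        self.active = camera_id

    def next_frame(self):
        # Pull the next frame from whichever camera is currently selected.
        return next(self.camera_streams[self.active])

streams = {
    "cam-1": iter(["c1-f0", "c1-f1", "c1-f2"]),
    "cam-3": iter(["c3-f0", "c3-f1"]),
}
gen = GameVideoStreamGenerator(streams)
gen.switch_to("cam-1")
out = [gen.next_frame(), gen.next_frame()]
gen.switch_to("cam-3")  # director cuts to another camera mid-session
out.append(gen.next_frame())
print(out)  # ['c1-f0', 'c1-f1', 'c3-f0']
```

The resulting output stream is what would be handed to a spectator device, streaming server, broadcaster, or recorder.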

[0071] FIG. 3B illustrates example directed video game streaming with a game director server (e.g., 104, etc.) under control of a game director device/client 316. Some or all devices or modules depicted in FIG. 3B can be implemented as a part of, or separately but operating in conjunction with, the game director server (104).

[0072] By way of example but not limitation, the game director server (104) includes or operates with a camera controller 304, a game video stream generator 306, one or more real or virtual cameras 308-1, 308-2, 308-3, etc., in addition to a game-player-non-interacting game engine (e.g., 106, etc.) and a game-director-server game world (e.g., 108, etc.) generated and/or driven by the game-player-non-interacting game engine (106).

[0073] The game director device/client (316) - implemented with software, hardware or a combination of software and hardware and external to the game director server (104) - may be operated by a human game director (e.g., a professional or designated video game director affiliated with a video game streaming service, etc.) who reviews candidate video game streams, imagery or textual game information, etc., and provides director input to the game director server (104) to identify, select or switch to specific video game streams from among a plurality of candidate video game streams and/or specific attendant visual game information for video game streaming.

[0074] The human game director can operate the game director device/client (316) through user interfaces implemented by the game director device/client (316) to monitor (e.g., live, real time, near real time, etc.) what is occurring in the game-director-server game world (108), and dynamically or adaptively generate game directing data that identifies and/or selects specific game scenes depicting specific avatars (e.g., representing specific game players, etc.), specific game moments/events, specific game plays, etc., in the game-director-server game world (108) at specific time points or time intervals in real time or near real time. These game scenes are deemed to be popular (e.g., to a population of game spectators, a specific group of game spectators, a specific game spectator, etc.) for video game streaming.

[0075] The human game director can operate the game director device/client (316) through user interfaces implemented by the game director device/client (316) to generate the game directing data that predict, identify and/or select specific game scenes in the game session of the computer game.

[0076] The camera controller (304) can (e.g., continuously, from time to time, live, in real time, in near real time, etc.) receive the game directing data, use the received data to identify or select specific camera(s) at specific location(s) in relation to the game-director-server game world (108) at specific time point(s) or in specific time interval(s), and direct the selected cameras to capture images of the selected game scenes of the game-director-server game world (108).

[0077] The total number of active cameras (e.g., 308-1, 308-2, 308-3, etc.) at any given time of a game session of a computer game may or may not be fixed in various operational scenarios. A camera as described herein can be fixed or movable to (e.g., different, etc.) location(s) within the game-director-server game world (108) and/or location(s) outside the game-director-server game world (108) such as a birds-eye view location (e.g., where the camera (308-3) is placed outside or above the game world (108), etc.).

[0078] Additionally, optionally or alternatively, visual information including but not limited to video images from other cameras different from the cameras deployed at spatial locations in relation to the game-director-server game world (108) may be collected and streamed as a part of video game streaming of the game session of the computer game to game spectators (e.g., live, in real time, in near real time, etc.) by the game director server (104) or a game streaming server operating with the game director server (104).

[0079] The game video stream generator (306) can be implemented to use some or all of these cameras (e.g., 308-1, 308-2, 308-3, etc.) to generate or capture consecutive and/or concurrent sequences or streams of images at any given time points or intervals during the game session of the computer game. One or more specific streams of images from one or more specific cameras may be selected by the game video stream generator (306) to transmit, stream and/or otherwise provide to one or more game spectator devices, to a video game streaming server, to a video game broadcaster, to a video game recorder, etc.

[0080] FIG. 3C illustrates example directed video game streaming with a game director server (e.g., 104, etc.) under control of a game spectator device 116-i. Some or all devices or modules depicted in FIG. 3C can be implemented as a part of, or separately but operating in conjunction with, the game director server (104).

[0081] By way of example but not limitation, the game director server (104) includes or operates with a camera controller 304, a game video stream generator 306, one or more real or virtual cameras 308-1, 308-2, 308-3, etc., in addition to a game-player-non-interacting game engine (e.g., 106, etc.) and a game-director-server game world (e.g., 108, etc.) generated and/or driven by the game-player-non-interacting game engine (106).

[0082] The game spectator device (116-i) - which may be implemented by one or more computing devices and one or more image displays - may be operated by a game spectator who performs self-video-game-directing by reviewing some or all candidate video game streams, imagery or textual game information, etc., on the same or different image display(s) and provides director input to the game director server (104) to identify, select or switch to specific video game streams from among a plurality of candidate video game streams and/or specific attendant visual game information for video game streaming to be rendered on the same or different image display(s) operating with the game spectator device (116-i) or to be recorded on a non-transitory computer-readable data store or medium.

[0083] The game spectator can operate the game spectator device (116-i) through user interfaces implemented by the game spectator device (116-i) to monitor (e.g., live, real time, near real time, etc.) what is occurring in the game-director-server game world (108), and dynamically or adaptively generate game directing data that identifies and/or selects specific game scenes depicting specific avatars (e.g., representing specific game players, etc.), specific game moments/events, specific game plays, etc., in the game-director-server game world (108) at specific time points or time intervals in real time or near real time. These game scenes are deemed to be popular (e.g., to a population of game spectators, a specific group of game spectators, a specific game spectator, etc.) for video game streaming.

[0084] The game spectator can operate the game spectator device (116-i) through user interfaces implemented by the game spectator device (116-i) to generate the game directing data that predicts, identifies and/or selects specific game scenes in the game session of the computer game.
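The game directing data of paragraphs [0083]-[0084] might take a shape like the sketch below: a per-time-point scene directive plus a helper that ranks candidate scenes by a popularity signal. Both the record schema and the ranking criterion are assumptions for illustration; the actual directing data format is not specified at this level of detail.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GameDirectingData:
    """One directive: which game scene to stream at which time point
    (hypothetical schema)."""
    time_point: float
    scene_id: str

def pick_popular_scenes(candidate_scenes, popularity, top_n=1):
    """Rank candidate scene ids by a popularity signal (e.g. spectator
    interest in specific avatars, game moments/events, or game plays)
    and keep the top_n 'deemed popular' scenes."""
    ranked = sorted(candidate_scenes,
                    key=lambda s: popularity.get(s, 0.0),
                    reverse=True)
    return ranked[:top_n]
```

A spectator device (or an automated director) could emit a `GameDirectingData` record whenever the ranking changes.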

[0085] The camera controller (304) can (e.g., continuously, from time to time, live, in real time, in near real time, etc.) receive the game directing data, use the received data to identify or select specific camera(s) at specific location(s) in relation to the game-director-server game world (108) at specific time point(s) or in specific time interval(s), and direct the selected cameras to capture images of the selected game scenes of the game-director-server game world (108).
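One way the camera controller (304) could turn directing data into camera placements is sketched below. The assignment policy (first idle camera wins) and the data structures are assumptions made for this sketch, not details taken from the disclosure.

```python
class CameraController:
    """Sketch of the camera controller (304): consumes directing data
    (a list of selected scene ids) and places idle cameras at the
    corresponding locations in the game world (108)."""

    def __init__(self, cameras, scene_locations):
        self.cameras = cameras                   # camera_id -> position or None (idle)
        self.scene_locations = scene_locations   # scene_id -> (x, y, z)

    def apply(self, selected_scene_ids):
        """Place one idle camera at each selected scene; return the
        scene_id -> camera_id assignments actually made."""
        assignments = {}
        idle = [cid for cid, pos in self.cameras.items() if pos is None]
        for scene_id in selected_scene_ids:
            if not idle:
                break  # no free camera left for this directing cycle
            cam = idle.pop(0)
            self.cameras[cam] = self.scene_locations[scene_id]
            assignments[scene_id] = cam
        return assignments
```

Invoked continuously or from time to time, `apply` keeps camera placements tracking the directing data as game scenes are selected or switched.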

[0086] The total number of active cameras (e.g., 308-1, 308-2, 308-3, etc.) at any given time of a game session of a computer game may or may not be fixed in various operational scenarios. A camera as described herein can be fixed or movable to (e.g., different, etc.) location(s) within the game-director-server game world (108) and/or location(s) outside the game-director-server game world (108) such as a birds-eye view location (e.g., where the camera (308-3) is placed outside or above the game world (108), etc.).
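The distinction in paragraph [0086] between camera locations within the game world and locations outside it (such as a birds-eye view) can be expressed with a simple bounds test. The axis-aligned bounding-box model of the game world is an assumption for illustration only.

```python
def classify_camera_position(pos, world_bounds):
    """Return 'in-world' if the camera position lies inside the game
    world's axis-aligned bounds ((min_x, min_y, min_z), (max_x, max_y, max_z)),
    else 'birds-eye' (e.g., a camera placed outside or above the world)."""
    lo, hi = world_bounds
    inside = all(l <= p <= h for p, l, h in zip(pos, lo, hi))
    return "in-world" if inside else "birds-eye"
```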

[0087] Additionally, optionally or alternatively, visual information including but not limited to video images from other cameras different from the cameras deployed at spatial locations in relation to the game-director-server game world (108) may be collected and streamed as a part of video game streaming of the game session of the computer game to the game spectator and/or other game spectators (e.g., live, in real time, in near real time, etc.) by the game director server (104) or a game streaming server operating with the game director server (104).

[0088] The game video stream generator (306) can be implemented to use some or all of these cameras (e.g., 308-1, 308-2, 308-3, etc.) to generate or capture consecutive and/or concurrent sequences or streams of images at any given time points or intervals during the game session of the computer game. One or more specific streams of images from one or more specific cameras may be selected by the game video stream generator (306) to transmit, stream and/or otherwise provide to one or more game spectator devices including the game spectator device (116-i), to a video game streaming server, to a video game broadcaster, to a video game recorder, etc.

5. EXAMPLE PROCESS FLOWS

[0089] FIG. 4 illustrates an example process flow according to an example embodiment of the present invention. In some example embodiments, one or more computing devices or components may perform this process flow. In block 402, a game director server receives a mirror copy of game-player-generated data of a computer game from one or more game player devices operated by one or more game players respectively. The game director server is separate from a game play server that generates and maintains a first game world for the one or more game player devices to play the computer game.

[0090] In block 404, the game director server uses the game-player-generated data to generate and maintain a second game world separate from the first game world. The second game world is not for the one or more game player devices to play the computer game.

[0091] In block 406, the game director server monitors game scenes in the second game world to identify a subset of one or more specific game scenes from among a plurality of candidate game scenes.

[0092] In block 408, the game director server generates one or more specific video game streams that capture images of the one or more specific game scenes for streaming to one or more game spectator devices.
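The four blocks of FIG. 4 (402-408) can be summarized as a single pipeline. The sketch below models each stage as a pluggable callable; every name here (`direct_game_streams`, `build_world`, `identify_scenes`, `render`) is an assumption standing in for the corresponding server component, not an interface from the disclosure.

```python
def direct_game_streams(player_inputs, build_world, identify_scenes, render):
    """Sketch of FIG. 4's process flow:
    block 402 - player_inputs is the received mirror copy of the
                game-player-generated data;
    block 404 - build_world derives the second game world from it;
    block 406 - identify_scenes picks specific scenes from the candidates;
    block 408 - render produces a video game stream per selected scene."""
    second_world = build_world(player_inputs)      # block 404
    candidates = second_world["scenes"]
    specific = identify_scenes(candidates)         # block 406
    return [render(scene) for scene in specific]   # block 408
```

With toy stand-ins for the three stages, the pipeline turns raw player inputs into a short list of spectator-facing streams.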

[0093] In an embodiment, the second game world represents a mirror copy of the first game world.

[0094] In an embodiment, the game director server autonomously and dynamically updates the second game world without receiving any game play data generated by the game play server after the computer game starts unfolding.
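The property in paragraph [0094] - updating the second game world from player-generated data alone, with no game play data from the game play server - relies on the game simulation being deterministic, so that replaying the same inputs reproduces the same world state. A toy demonstration, with a deliberately simple movement model chosen for this sketch:

```python
class MirrorWorld:
    """Toy deterministic game world: state depends only on the sequence
    of player-generated inputs, so a mirror world fed the same inputs
    stays in lockstep with the play server's world without receiving
    any game play data from it."""

    def __init__(self):
        self.positions = {}  # player id -> (x, y)

    def apply_input(self, player, move):
        dx, dy = move
        x, y = self.positions.get(player, (0, 0))
        self.positions[player] = (x + dx, y + dy)
```

Feeding two independent `MirrorWorld` instances the same input stream yields identical states, which is the invariant the game director server depends on.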

[0095] In an embodiment, the game director server receives the game-player-generated data from an aggregation and redirection server that receives the game-player-generated data from the one or more game player devices.

[0096] In an embodiment, the game director server receives the game-player-generated data forwarded by the game play server.

[0097] In an embodiment, the game play server generates game play data based at least in part on the game-player-generated data; the game play server sends at least a part of the game play data to the one or more game player devices to update one or more game views rendered on one or more image displays operating in conjunction with the one or more game player devices.

[0098] In an embodiment, the one or more specific video game streams are streamed to one or more game spectator devices by way of a video game streaming server.

[0099] In an embodiment, the game director server operates with an automated director to generate automated game directing data that selects one or more specific game scenes to be depicted by the one or more specific game streams at one or more specific time points.

[0100] In an embodiment, the game director server communicates with a designated director to generate second game directing data that selects one or more second specific game scenes to be depicted by the one or more specific game streams at one or more second specific time points.

[0101] In an embodiment, the game director server operates with a specific game spectator device of the one or more game spectator devices to generate third game directing data that selects one or more third specific game scenes to be depicted by a specific game stream of the one or more specific game streams at one or more third specific time points; the specific game stream is to be streamed to the specific game spectator device.
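Paragraphs [0099]-[0101] describe three sources of game directing data: an automated director, a designated (human) director, and an individual game spectator device. One plausible way to combine them is a precedence rule, sketched below; the precedence order (spectator over designated director over automated director) is an assumption for illustration, not stated in the disclosure.

```python
def resolve_directing(automated, designated=None, per_spectator=None):
    """Combine directing data from the three sources of [0099]-[0101].
    Assumed precedence: a spectator's own choice applies to that
    spectator's stream; otherwise a designated director's choice;
    otherwise the automated director's choice."""
    return per_spectator or designated or automated
```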

[0102] In an embodiment, the one or more specific game streams depict one or more specific game scenes captured by one or more specific cameras in a specific time interval in the computer game; the game director server selects one or more specific camera positions in relation to the second game world for placing the one or more specific cameras in the specific time interval.

[0103] In an embodiment, the one or more specific camera positions include one or more of: discrete positions within the second game world, positions on one or more spatial trajectories within the second game world, positions outside of the second game world, etc.

[0104] In various example embodiments, an apparatus, a system, or one or more other computing devices performs any or a part of the foregoing methods as described. In an embodiment, a non-transitory computer readable storage medium stores software instructions, which when executed by one or more processors cause performance of a method as described herein.
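The "positions on one or more spatial trajectories" of paragraph [0103] can be made concrete by sampling discrete camera positions along a path. The straight-line trajectory below is the simplest case and is chosen only for this sketch; trajectories within or outside the second game world would work the same way.

```python
def trajectory_positions(start, end, steps):
    """Sample `steps` evenly spaced camera positions along a straight
    spatial trajectory from `start` to `end` (3-D points)."""
    return [tuple(s + (e - s) * i / (steps - 1) for s, e in zip(start, end))
            for i in range(steps)]
```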

[0105] Note that, although separate embodiments are discussed herein, any combination of embodiments and/or partial embodiments discussed herein may be combined to form further embodiments.

6. IMPLEMENTATION MECHANISMS - HARDWARE OVERVIEW

[0106] According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.

[0107] For example, FIG. 5 is a block diagram that illustrates a computer system 500 upon which an example embodiment of the invention may be implemented. Computer system 500 includes a bus 502 or other communication mechanism for communicating information, and a hardware processor 504 coupled with bus 502 for processing information. Hardware processor 504 may be, for example, a general purpose microprocessor.

[0108] Computer system 500 also includes a main memory 506, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 502 for storing information and instructions to be executed by processor 504. Main memory 506 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 504. Such instructions, when stored in non- transitory storage media accessible to processor 504, render computer system 500 into a special-purpose machine that is customized to perform the operations specified in the instructions.

[0109] Computer system 500 further includes a read only memory (ROM) 508 or other static storage device coupled to bus 502 for storing static information and instructions for processor 504.

[0110] A storage device 510, such as a magnetic disk, optical disk, or solid state RAM, is provided and coupled to bus 502 for storing information and instructions.

[0111] Computer system 500 may be coupled via bus 502 to a display 512, such as a liquid crystal display, for displaying information to a computer user. An input device 514, including alphanumeric and other keys, is coupled to bus 502 for communicating information and command selections to processor 504. Another type of user input device is cursor control 516, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 504 and for controlling cursor movement on display 512. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.

[0112] Computer system 500 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 500 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 500 in response to processor 504 executing one or more sequences of one or more instructions contained in main memory 506. Such instructions may be read into main memory 506 from another storage medium, such as storage device 510. Execution of the sequences of instructions contained in main memory 506 causes processor 504 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.

[0113] The term “storage media” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 510. Volatile media includes dynamic memory, such as main memory 506. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge.

[0114] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 502. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[0115] Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 504 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 500 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 502. Bus 502 carries the data to main memory 506, from which processor 504 retrieves and executes the instructions. The instructions received by main memory 506 may optionally be stored on storage device 510 either before or after execution by processor 504.

[0116] Computer system 500 also includes a communication interface 518 coupled to bus 502. Communication interface 518 provides a two-way data communication coupling to a network link 520 that is connected to a local network 522. For example, communication interface 518 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 518 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 518 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.

[0117] Network link 520 typically provides data communication through one or more networks to other data devices. For example, network link 520 may provide a connection through local network 522 to a host computer 524 or to data equipment operated by an Internet Service Provider (ISP) 526. ISP 526 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 528. Local network 522 and Internet 528 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 520 and through communication interface 518, which carry the digital data to and from computer system 500, are example forms of transmission media.

[0118] Computer system 500 can send messages and receive data, including program code, through the network(s), network link 520 and communication interface 518. In the Internet example, a server 530 might transmit a requested code for an application program through Internet 528, ISP 526, local network 522 and communication interface 518.

[0119] The received code may be executed by processor 504 as it is received, and/or stored in storage device 510, or other non-volatile storage for later execution.

7. EQUIVALENTS, EXTENSIONS, ALTERNATIVES AND MISCELLANEOUS

[0120] In the foregoing specification, example embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is the invention, and is intended by the applicants to be the invention, is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in such claims shall govern the meaning of such terms as used in the claims. Hence, no limitation, element, property, feature, advantage or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

[0121] Various aspects of the present invention may be appreciated from the following Enumerated Example Embodiments (EEEs):

[00026] EEE1: a method comprising: receiving, by a game director server, a mirror copy of game-player-generated data of a computer game from one or more game player devices operated by one or more game players respectively, wherein the game director server is separate from a game play server that generates and maintains a first game world for the one or more game player devices to play the computer game; using the game-player-generated data to generate and maintain, by the game director server, a second game world separate from the first game world, wherein the second game world is not for the one or more game player devices to play the computer game; monitoring game scenes in the second game world to identify a subset of one or more specific game scenes from among a plurality of candidate game scenes; generating one or more specific video game streams that capture images of the one or more specific game scenes for streaming to one or more game spectator devices.

[00027] EEE2: the method according to EEE1, wherein the second game world represents a mirror copy of the first game world.

[00028] EEE3: the method of EEE1 or EEE2, wherein the game director server autonomously and dynamically updates the second game world without receiving any game play data generated by the game play server after the computer game starts unfolding.

[00029] EEE4: the method of any of EEE1 to EEE3, wherein the game director server receives the game-player-generated data from an aggregation and redirection server that receives the game-player-generated data from the one or more game player devices.

[00030] EEE5: the method of any of EEE1 to EEE4, wherein the game director server receives the game-player-generated data forwarded by the game play server.

[00031] EEE6: the method of any of EEE1 to EEE5, wherein the game play server generates game play data based at least in part on the game-player-generated data; wherein the game play server sends at least a part of the game play data to the one or more game player devices to update one or more game views rendered on one or more image displays operating in conjunction with the one or more game player devices.

[00032] EEE7: the method of any of EEE1 to EEE6, the one or more specific video game streams are streamed to one or more game spectator devices by way of a video game streaming server.

[00033] EEE8: the method of any of EEE1 to EEE7, the game director server operates with an automated director to generate automated game directing data that selects one or more specific game scenes to be depicted by the one or more specific game streams at one or more specific time points.

[00034] EEE9: the method of any of EEE1 to EEE8, the game director server communicates with a designated director to generate second game directing data that selects one or more second specific game scenes to be depicted by the one or more specific game streams at one or more second specific time points.

[00035] EEE10: the method of any of EEE1 to EEE9, the game director server operates with a specific game spectator device of the one or more game spectator devices to generate third game directing data that selects one or more third specific game scenes to be depicted by a specific game stream of the one or more specific game streams at one or more third specific time points; wherein the specific game stream is to be streamed to the specific game spectator device.

[00036] EEE11: the method of any of EEE1 to EEE10, the one or more specific game streams depict one or more specific game scenes captured by one or more specific cameras in a specific time interval in the computer game; wherein the game director server selects one or more specific camera positions in relation to the second game world for placing the one or more specific cameras in the specific time interval.

[00037] EEE12: the method of any of EEE1 to EEE11, the one or more specific camera positions include one or more of: discrete positions within the second game world, positions on one or more spatial trajectories within the second game world, positions outside of the second game world.

[00038] EEE13: an apparatus performing the method of any of EEE1 to EEE12.

[00039] EEE14: a non-transitory computer readable storage medium, storing software instructions, which when executed by one or more processors cause performance of the method recited in any of EEE1 to EEE12.

[0122] EEE15: a computing device comprising one or more processors and one or more storage media, storing a set of instructions, which when executed by one or more processors cause performance of the method recited in any of EEE1 to EEE12.