Title:
ENHANCING AND TRACKING VIDEO GAME STREAMING
Document Type and Number:
WIPO Patent Application WO/2023/150159
Kind Code:
A1
Abstract:
A spatial portion of a game world of a computer game is designated for incorporating non-game visual content. A game view image that includes a visual depiction of at least a part of the game world at a first time is received. It is determined whether the game view image includes the spatial portion of the game world. In response to determining that the game view image includes the spatial portion of the game world, a non-game visual content portion is inserted into the game view image. The game view image, as inserted with the non-game visual content portion, is encoded into a video game stream to cause a recipient device of the video game stream to generate a display image from the game view image and render the display image on an image display operating in conjunction with the recipient device.

Inventors:
NINAN AJIT (US)
Application Number:
PCT/US2023/012106
Publication Date:
August 10, 2023
Filing Date:
February 01, 2023
Assignee:
DOLBY LABORATORIES LICENSING CORP (US)
International Classes:
A63F13/61; A63F13/355; A63F13/79
Foreign References:
US20200346114A12020-11-05
US20090109213A12009-04-30
USPP63305635P
Attorney, Agent or Firm:
ZHANG, Yiming et al. (US)
Claims:
CLAIMS

1. A method comprising: designating a spatial portion of a game world of a computer game for incorporating non-game visual content; receiving a game view image that includes a visual depiction of at least a part of the game world at a first time; determining whether the game view image includes the spatial portion of the game world; in response to determining that the game view image includes the spatial portion of the game world, inserting a non-game visual content portion into the game view image at a spatial location corresponding to the spatial portion; encoding the game view image, as inserted with the non-game visual content portion, into a video game stream to cause a recipient device of the video game stream to generate a display image from the game view image and render the display image on an image display operating in conjunction with the recipient device, wherein a game spectator operates the recipient device to receive and view video game streaming of the computer game.

2. The method of Claim 1, wherein the spatial portion is visually perceptibly demarcated in the game view image.

3. The method of Claim 1, wherein the spatial portion is not visually perceptibly demarcated in the game view image and is indicated in image metadata generated for the game view image.

4. The method of any of Claims 1-3, wherein the game view image depicts a specific visual scene, represented in the game world, selected based on spectator preferences for game content.

5. The method of any of Claims 1-4, wherein the non-game content item is selected based on spectator preferences for non-game content.

6. The method of any of Claims 1-5, wherein different video game streams generated by multiple cameras from the game world are streamed to different recipient devices; wherein game viewing statistics are collected, tracked and analyzed to cause changes to future video game streaming of the computer game.

7. The method of any of Claims 1-6, wherein different game spectators operate the different recipient devices, respectively; wherein a specific game spectator, among the different game spectators, is provided with a specific non-game content item incorporated into game view images rendered to the specific game spectator; wherein the specific non-game content item is selected based at least in part on individual game viewing statistics collected for the specific game spectator.

8. The method of any of Claims 1-7, wherein a specific game player of the computer game is provided with additional capabilities based at least in part on individual game viewing statistics collected for the specific game player.

9. The method of any of Claims 1-8, wherein a camera is placed at a specific position and a specific direction relative to a coordinate system of the game world to capture one or more game view images from the game world; wherein the specific position and the specific direction are selected based at least in part on individual game viewing statistics collected for one of a game spectator, a game player, a game play, or a source of non-game content.

10. The method of any of Claims 1-9, wherein the game world represents a mirror copy of a player-interacting game world.

11. The method of any of Claims 1-9, wherein the video game stream is encoded with a sequence of consecutive game view images captured from the game world under direction of one of an automated director, a designated director or a game spectator.

12. An apparatus performing any of the methods as recited in Claims 1-11.

13. A non-transitory computer readable storage medium, storing software instructions, which when executed by one or more processors cause performance of the method recited in any of Claims 1-11.

14. A computing device comprising one or more processors and one or more storage media, storing a set of instructions, which when executed by one or more processors cause performance of the method recited in any of Claims 1-11.
Description:
ENHANCING AND TRACKING VIDEO GAME STREAMING

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority of the following priority applications: US provisional application 63/305,636, filed 01 February 2022, and European Patent Application No. 22156125.1, filed 10 February 2022, the contents of each of which are hereby incorporated by reference in its entirety.

TECHNOLOGY

[0002] The present invention relates generally to computer games, and in particular, to enhancing and tracking video game streaming.

BACKGROUND

[0003] Computer games have become spectator sports. Numerous game enthusiasts or spectators follow popular computer game events and watch live (and even recorded) video streams of tournaments of computer games. These game video streams may provide birds-eye views or first-person views of a played computer game.

[0004] Under some approaches, a game player may wear a camera to capture a game video stream of screen images rendered on an image display in front of the game player. The captured game video stream can be uploaded to a cloud-based game video streaming server, and broadcast or streamed to game spectators by way of the game video streaming server.

[0005] While the game video stream as captured by the game player might provide a more engaging or immersive view than a birds-eye view, such a game video stream would probably miss at least half of the actions in the computer game. Moreover, as the game player could make head or body movements from time to time, images in the captured game video stream would become jerky and hard to follow visually. As a result, only a relatively limited amount of game information with relatively poor video quality could be streamed to game spectators.

The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, issues identified with respect to one or more approaches should not be assumed to have been recognized in any prior art on the basis of this section, unless otherwise indicated.

BRIEF DESCRIPTION OF DRAWINGS

[0006] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:

[0007] FIG. 1A through FIG. 1C illustrate example systems for enhancing and tracking computer game streaming;

[0008] FIG. 2 illustrates an example game world for streaming with designated spatial portions for incorporating non-game content;

[0009] FIG. 3A and FIG. 3B illustrate example game view images which, after being captured from a game world, incorporate non-game content before being streamed to game spectators;

[0010] FIG. 4 illustrates an example process flow; and

[0011] FIG. 5 illustrates an example hardware platform on which a computer or a computing device as described herein may be implemented.

DESCRIPTION OF EXAMPLE EMBODIMENTS

[0012] Example embodiments, which relate to enhancing and tracking video game streaming, are described herein. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are not described in exhaustive detail, in order to avoid unnecessarily occluding, obscuring, or obfuscating the present invention.

[0013] Example embodiments are described herein according to the following outline:

1. GENERAL OVERVIEW

2. ENHANCING GAME STREAMING

3. COMBINING NON-GAME CONTENT WITH GAME CONTENT

4. TRACKING GAME STREAMING

5. EXAMPLE PROCESS FLOWS

6. IMPLEMENTATION MECHANISMS - HARDWARE OVERVIEW

7. EQUIVALENTS, EXTENSIONS, ALTERNATIVES AND MISCELLANEOUS

1. GENERAL OVERVIEW

[0014] This overview presents a basic description of some aspects of an example embodiment of the present invention. It should be noted that this overview is not an extensive or exhaustive summary of aspects of the example embodiment. Moreover, it should be noted that this overview is not intended to be understood as identifying any particularly significant aspects or elements of the example embodiment, nor as delineating any scope of the example embodiment in particular, nor the invention in general. This overview merely presents some concepts that relate to the example embodiment in a condensed and simplified format, and should be understood as merely a conceptual prelude to a more detailed description of example embodiments that follows below. Note that, although separate embodiments are discussed herein, any combination of embodiments and/or partial embodiments discussed herein may be combined to form further embodiments.

[0015] Under techniques as described herein, non-game content - which is not captured from a computer game but rather from internal or external data sources - can be combined with game content that is captured from the computer game into video game streams to be delivered to game spectators. A game world visually and conceptually represents the computer game. The game world can include designated spatial locations for incorporating the non-game content in game view images captured from the game world. These designated spatial locations can be marked, determined or identified with green screens, fiducial markers, and/or non-visual image metadata that specifies pre-configured or dynamically determined spatial locations, spatial trajectories, perspectives, etc., for example as a part of a game world map or model describing the geography of the game world.
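
As an illustrative sketch only (the class, field and function names below are assumptions for explanation, not terminology from this application), such designated spatial portions and the non-visual game world map metadata described above could be represented in Python along the following lines:

from dataclasses import dataclass, field
from enum import Enum, auto

class MarkerType(Enum):
    GREEN_SCREEN = auto()   # visually perceptible chroma region
    FIDUCIAL = auto()       # visually perceptible marker pattern
    METADATA_ONLY = auto()  # signaled only through non-visual image metadata

@dataclass
class DesignatedPortion:
    """A spatial portion of the game world reserved for non-game content."""
    portion_id: str
    marker: MarkerType
    bounds_min: tuple  # (x, y, z) minimum corner in game-world coordinates
    bounds_max: tuple  # (x, y, z) maximum corner in game-world coordinates

@dataclass
class GameWorldMap:
    """Non-visual metadata describing where non-game content may be placed."""
    portions: list = field(default_factory=list)

    def portions_in_view(self, view_min, view_max):
        """Return designated portions overlapping an axis-aligned view volume."""
        def overlaps(p):
            return all(p.bounds_min[i] <= view_max[i] and p.bounds_max[i] >= view_min[i]
                       for i in range(3))
        return [p for p in self.portions if overlaps(p)]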

[0016] Some or all of the non-game content can be specifically selected to meet or satisfy specific preferences of specific game spectators. Additionally, optionally or alternatively, some or all of the non-game content can be general or non-specialized, applicable to a large group of game spectators including any specific game spectators.

[0017] The spatial locations for incorporating the non-game content can be specifically chosen to increase spectator views or lengthen viewing time. Some or all of the non-game content can be incorporated into the game view images by replacing existing image portions in the game view images. Additionally, optionally or alternatively, some or all of the non-game content can be incorporated into the game view images by superimposing it (using a transparent background) on existing image portions in the game view images.
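
A minimal sketch of the two insertion modes described above - outright replacement versus superimposition with a transparent background - might look as follows; the pixel representation and function name are illustrative assumptions:

def insert_non_game_content(game_pixels, content_pixels, x0, y0, mode="replace"):
    """Insert a non-game content patch into a game view image.

    game_pixels/content_pixels: 2D lists of (r, g, b, a) tuples, a in [0, 255].
    mode="replace"     -> the existing image portion is overwritten outright.
    mode="superimpose" -> the patch is alpha-blended, so transparent areas keep
                          the existing game imagery underneath.
    """
    for dy, row in enumerate(content_pixels):
        for dx, (r, g, b, a) in enumerate(row):
            gy, gx = y0 + dy, x0 + dx
            if mode == "replace":
                game_pixels[gy][gx] = (r, g, b, 255)
            else:  # superimpose
                gr, gg, gb, _ = game_pixels[gy][gx]
                w = a / 255.0
                game_pixels[gy][gx] = (
                    round(r * w + gr * (1 - w)),
                    round(g * w + gg * (1 - w)),
                    round(b * w + gb * (1 - w)),
                    255,
                )
    return game_pixels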

[0018] As used herein, the term “spectator views” refers to the number of views of game or non-game visual content made by game spectators as measured, for example, by the number of downloaded video game streams. The term “game views” refers to views of the game world as would be visually perceived by observers such as cameras physically or virtually located within (e.g., in situ, etc.) the game world or forming spatial relationships (e.g., birds-eye views, first person views, game player views, in-game avatar views, etc.) to the game world.

[0019] Separate game views may be provided to different game spectators using different video game streams including game view images captured concurrently or simultaneously by multiple cameras placed within or in relation to the game world.

[0020] Game viewing statistics by specific game spectators and/or a population of game spectators can be collected, tracked and/or analyzed by a game viewing statistics tracking server. An example game viewing statistics tracking server may be, but is not necessarily limited to only, one or more of: media streaming servers, game servers, game director servers, or other servers operating with some or all of the foregoing. For example, individual game viewing statistics with respect to an individual game spectator, a group of game spectators, an individual game player, a group of game players (e.g., two specific game player opponents or teammates, a team, two opposing teams, etc.), an individual game play, a group of game plays (e.g., consecutive, causally related, non-causally related, etc.), a source or provider of non-game content, a type of non-game content, etc., may be individually or aggregately collected, tracked and/or analyzed.

[0021] Based at least in part on the game viewing statistics, specific relationships or correlations between or among game spectators, game players, sources of non-game content, etc., can be ascertained, tested, measured, verified, and/or applied in future incorporation of non-game content with game content in present or future playing of the computer game or other computer games. For example, a game spectator may be provided with specific non-game content determined to be interesting to the game spectator. A game player may be selected for incorporating specific non-game content in present or future game playing. A specific source of non-game content may be selected or recommended for specific game spectators and/or specific game players. Additionally, optionally or alternatively, a game player may be provided with game playing capabilities based at least in part on game viewing statistics relating to the game player or peers.
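
The kind of per-spectator, per-player, per-game-play and per-content-source aggregation described above could be sketched as follows; the class and method names are hypothetical and the scheme is deliberately simplified:

from collections import defaultdict

class ViewingStatsTracker:
    """Illustrative aggregation of game viewing statistics (hypothetical API)."""
    def __init__(self):
        # Viewing time keyed by (dimension, key), e.g. ("spectator", "spec-1").
        self.view_seconds = defaultdict(float)

    def record_view(self, seconds, spectator=None, player=None,
                    game_play=None, content_source=None):
        for dimension, key in (("spectator", spectator), ("player", player),
                               ("game_play", game_play),
                               ("content_source", content_source)):
            if key is not None:
                self.view_seconds[(dimension, key)] += seconds

    def total_for(self, dimension, key):
        return self.view_seconds[(dimension, key)]

# Example: attribute one viewing interval to a spectator and a content source,
# so later correlations between the two can inform future content selection.
tracker = ViewingStatsTracker()
tracker.record_view(42.0, spectator="spec-1", content_source="source-A")
print(tracker.total_for("spectator", "spec-1"))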

[0022] Example embodiments described herein relate to video game streaming. A spatial portion of a game world of a computer game is designated for incorporating non-game visual content. A game view image that includes a visual depiction of at least a part of the game world at a first time is received. It is determined whether the game view image includes the spatial portion of the game world. In response to determining that the game view image includes the spatial portion of the game world, a non-game visual content portion is inserted into the game view image at a spatial location corresponding to the spatial portion. The game view image, as inserted with the non-game visual content portion, is encoded into a video game stream to cause a recipient device of the video game stream to generate a display image from the game view image and render the display image on an image display operating in conjunction with the recipient device.
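
The flow of paragraph [0022] can be condensed into a short sketch; every helper used below (locate, insert, the encoder, the content selector) is a hypothetical stand-in rather than an API defined by this application:

def stream_game_view(game_view_image, designated_portion, select_non_game_content,
                     encoder, first_time):
    """Condensed sketch of the paragraph [0022] flow; all helpers are hypothetical.

    1. A spatial portion of the game world has been designated (designated_portion).
    2. A game view image depicting part of the game world at first_time is received.
    3. If the image includes the designated portion, a non-game visual content
       portion is inserted at the corresponding spatial location.
    4. The (possibly modified) image is encoded into the video game stream, for
       the recipient device to decode, generate a display image, and render it.
    """
    location = game_view_image.locate(designated_portion)  # None if not depicted
    if location is not None:
        content = select_non_game_content(first_time)
        game_view_image.insert(content, at=location)
    encoder.encode(game_view_image)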

[0023] In some example embodiments, mechanisms as described herein form a part of a media processing system, including but not limited to any of: cloud-based server, mobile device, virtual reality system, augmented reality system, head up display device, helmet mounted display device, CAVE-type system, wall-sized display, video game device, display device, media player, media server, media production system, camera systems, home-based systems, communication devices, video processing system, video codec system, studio system, streaming server, cloud-based content service system, a handheld device, game machine, television, cinema display, laptop computer, netbook computer, tablet computer, cellular radiotelephone, electronic book reader, point of sale terminal, desktop computer, computer workstation, computer server, computer kiosk, or various other kinds of terminals and media processing units.

[0024] Various modifications to the preferred embodiments and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the disclosure is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.

2. ENHANCING GAME STREAMING

[0025] FIG. 1A illustrates an example system 100 for enhancing video game streaming. Some or all devices and components/units thereof in the system (100) may be implemented in software, hardware, or a combination of software and hardware, with one or more of: computing processors such as central processing units or CPUs, audio codecs, video codecs, digital signal processors, graphic processing units or GPUs, etc.

[0026] As illustrated in FIG. 1A, the system (100) includes a game server 104 that hosts a (e.g., live, video, online, real time, near real time, competitive, tournament, single player, multi-player, team-based, etc.) computer game played by one or more (e.g., human, etc.) game players respectively operating one or more game player devices 102-1, 102-2, ... 102-N, where N is an integer no less than one (1). The computer game may last for a game session or a time interval such as 30 minutes, one hour, one and half hours, etc. The computer game may be, but is not necessarily limited to only, one of: League of Legends, Counter-Strike, Call of Duty, Dota2, Fortnite, Overwatch, PlayerUnknown’s Battlegrounds, StarCraft, etc.

[0027] The one or more game players may be respectively (virtually) represented by one or more avatars located in a game world 108. The game world (108) can be (e.g., autonomously, independently, without depending on a game director/streaming server, etc.) generated, defined, dynamically represented and/or maintained in the game server (104) or a game engine 106 therein.

[0028] The game engine (106) interacts with the one or more game player devices (102-1, 102-2, ... 102-N) in the game session for the game players to play the computer game. The game engine (106) can communicate during the game session with the one or more game player devices (102-1, 102-2, ... 102-N) over one or more data network connections 118-1, 118-2, ... 118-N, respectively.

[0029] For example, the game engine (106) receives (e.g., live, real time, near real time, etc.) game-player-generated data from the game player devices (102-1, 102-2, ... 102-N) over the data network connections (118-1, 118-2, ... 118-N). The game-player-generated data may be generated by the game player devices (102-1, 102-2, ... 102-N) from user interactions with the game players through one or more user interfaces. In response to receiving the game-player-generated data, the game engine (106) determines, identifies and/or generates corresponding game player generated/triggered events represented or encapsulated in the game-player-generated data. The game engine (106) uses the game-player-generated data as well as game-server-maintained data, such as states of the game world (108) and histories of states of the game world (108), to drive (e.g., live, real time, near real time, etc.) changes in the states of the game world (108) over a single or multiple concurrent and/or consecutive game plays during the game session.
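
One possible, highly simplified shape for the game engine tick described above is sketched below; all callables passed in are hypothetical placeholders for the event ingestion, state update, server-event derivation, and broadcast steps:

import time

def game_engine_loop(receive_player_events, world_state, apply_event,
                     derive_server_events, broadcast, tick_seconds=0.05):
    """Illustrative game engine tick (all callables are hypothetical helpers).

    Each tick: ingest player-generated events, update the game world state,
    derive any server-generated events from the new state and its history, and
    push the resulting state changes back to the connected game player devices.
    """
    history = []
    while True:
        for event in receive_player_events():           # from player devices
            world_state = apply_event(world_state, event)
        for event in derive_server_events(world_state, history):
            world_state = apply_event(world_state, event)
        history.append(world_state)
        broadcast(world_state)                           # to player devices
        time.sleep(tick_seconds)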

[0030] Additionally, optionally or alternatively, the game engine (106) may (e.g., concurrently, contemporaneously, subsequently, within a relatively strict time budget measured in a few milliseconds or seconds, etc.) generate or trigger (game server generated/triggered) events in response to one or more of: present or past received game player generated/triggered events, concurrent and/or past states of the game world (108), changes in the states of the game world (108), and so on. The game server (104), or the game engine (106) therein, may represent or encapsulate the game-server-generated/triggered events into game-server-generated data, and sends/transmits (e.g., within a strict time budget, in real time, in near real time, etc.) some or all of the game-server-generated data to the game player devices (102-1, 102-2, ... 102-N) over the data network connections (118-1, 118-2, ... 118-N). The game-server-generated data may be used by the game player devices (102-1, 102-2, ... 102-N) to render or update their respective visual (including any accompanying audio) presentations of the computer game with image displays and/or audio speakers.

[0031] A game world as described herein includes avatars representing the game players and attendant game characters/objects (e.g., rooms, castles, rocks, creatures, etc.) in a (virtual, multidimensional) game space. The game space may be composed of all spatial locations or regions of the game world. The avatars and attendant game characters/objects may be stationary or moving in the game space at a given time in the game session as determined by their respective individual states at the given time. These individual states of the avatars and attendant game characters/objects at the given time collectively or aggregately give rise to, or define, the overall state of the game world at the given time. A visual depiction of the game world may be captured at the given time by a virtual or real camera positioned in relation to (e.g., at a spatial location/point hovering above, at a spatial location/point within, etc.) the game world. The visual depiction of the game world as captured by the camera provides a visual representation of the overall state of the game world or a portion thereof at the given time.

[0032] The computer game played by the game players may comprise sequential or concurrent game plays unfolding in the game world (108) over the game session. A (computer) game play is a time-varying evolution - or a sequence of (consecutive and/or concurrent) changes in states of - the game world (108) or a portion thereof - over a time interval in the game session - that is caused by a set of game-player-generated and/or game-server-generated events.

[0033] A visual depiction of a game play - e.g., from a perspective of an avatar representing a game player operating a game player device - may be captured (as camera-generated images) over the time duration of the game play (possibly with leading or trailing time margins of a few seconds, a few minutes, etc.) by a virtual or real camera positioned in relation to the game world (108). The visual depiction may be streamed or transmitted to image rendering device(s) such as game player device(s) and/or game spectator device(s) and rendered by these devices on wearable or non-wearable image displays as video images. The visual depiction of the game play may be live or recorded (or replayed/highlighted). As used herein, “a live game play” or “a live visual depiction of the game play” may mean streaming video images of the game play from a game streaming server (e.g., a game server, etc.) in real time, near real time, or within a relatively strict time limit such as one (1) millisecond, five (5) milliseconds, ten (10) milliseconds, etc., while the game play is being made in the game world (108). In contrast, “a recorded game play,” “a recorded visual depiction of the game play,” “a highlighted game play,” “a highlighted visual depiction of the game play,” “a replayed game play,” “a replayed visual depiction of the game play,” etc., may mean streaming video images of the game play on a time-delay basis (e.g., five (5) seconds later, minutes later, etc.) after some or all of the game play has occurred or been mirrored in the game world (108).

[0034] Each game player device of the game player devices (102-1, 102-2, ... 102-N) can be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game client application for the computer game. When executed, a game client application running on a game player device as described herein enables or causes the game player device to connect with the game server (104) to play the computer game in the game session, to capture game-player-generated events from interactions with a game player operating the game player device, to communicate the game-player-generated events to the game server (104), to receive states of some or all of the avatars including an avatar representing the game player and states of some or all of the attendant game characters/objects, to receive visual depictions and/or non-visual representations of the game world (108) or game plays that occurred therein, to generate and/or render visual depictions of some or all of the game world (108) or game plays that occurred therein on an image display operating in conjunction with the game player device, etc. In some operational scenarios, two-dimensional (2D) or three-dimensional (3D) visual game maps (e.g., rendered in an image display or a portion thereof, etc.) may be used to present at least a part of these visual depictions of some or all of the game world (108) or game plays that occurred therein by the game player device to the game player.

[0035] The game server (104) that hosts the computer game played by the game players may communicate with the game player devices (102-1, 102-2, ... 102-N) over wired and/or wireless data communication networks or links. The game server (104) may be implemented with a single computer device or multiple computer devices local or remote to some or all of the game player devices (102-1, 102-2, ... 102-N) in a peer-to-peer model, in a master-slave model, in a client-server model, in a distributed computing model, etc. In some operational scenarios, some or all of the game server (104) may be cloud-based (e.g., a cloud-based server, a cluster of cloud-based virtual computers, etc.), premise-based, a combination of cloud-based and premise-based, etc.

[0036] The game server (104) represents a (final) judge, arbitrator or authority for all the states of (hence all the game plays involving) the avatars and the attendant game characters/objects in the computer game, as played by the game players via the game player devices (102-1, 102-2, ... 102-N). Visual and non-visual representations of the states of any of the avatars and the attendant game characters/objects in the computer game as received, generated and/or rendered by any of the game player devices (102-1, 102-2, ... 102-N) are ensured - by functionality implemented by the game server (104) and the game client applications running on the game player devices (102-1, 102-2, ... 102-N) - to be consistent with the states of the avatars and the attendant game characters/objects in the computer game as maintained in or by the game server (104).

[0037] Video game streaming can be supported using a (e.g., cloud-based, non-cloud-based, etc.) game world representing a running computer game. As used herein, video game streaming refers to (e.g., live, real time, near real time, dynamic, time delayed, etc.) video streaming of visual depictions of the game world or portions thereof, or of visual information relating to the computer game, to game spectator(s).

[0038] In some operational scenarios, for example as illustrated in FIG. 1A, the game server (104) supports (e.g., live, real time, near real time, time-delayed, enhanced, tracked, etc.) video streaming of the computer game to one or more game spectator devices 116-1, 116-2, ... 116-M, where M is an integer no less than one (1), operated respectively by one or more (e.g., human, etc.) game spectators. The video streaming of the computer game may last for the entire game session or some time segments or intervals therein, for example while the computer game is being played by the one or more game players by way of the game server (104) and the game player devices (102-1, 102-2, ... 102-N).

[0039] The game spectator devices (116-1, 116-2, ... 116-M) may be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game spectator application for receiving video (or image) streams of the computer game directly from the game server (104). When executed, a game spectator application running on a game spectator device as described herein enables or causes the game spectator device to connect with the game server (104) to receive a live or recorded video stream of the computer game as captured by one or more cameras from the game world (108). Additionally, optionally or alternatively, the game spectator devices (116-1, 116-2, ... 116-M) may be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game spectator application for receiving video (or image) streams of the computer game indirectly from the game server (104) by way of a game video streaming server (not shown) operating in conjunction with the game server (104). Some or all of the video stream received by the game spectator device may be displayed or rendered on an image display operating in conjunction with the game spectator device.

[0040] The game server (104) may maintain a data store or data cache to store some or all (e.g., up-to-date, etc.) spectator profiles that have been established or determined in past or current streaming operations. For any new spectator for whom no existing spectator profile has been specifically selected or defined, a default spectator profile may be initially selected for the purpose of generating enhanced video streaming for the new spectator. The initially selected spectator profile may be refined, updated and/or specialized with spectator specific information for the new spectator as such information becomes available.

[0041] Hence, one or more specific spectator profiles for some or all of the one or more game spectators may be respectively (virtually) defined, specified, dynamically represented and/or maintained in the game server (104) or the spectator stream manager (114) therein for the purpose of generating enhanced video streaming for some or all of the one or more game spectators.

[0042] Any, some or all of these spectator profiles may be updated, for example by the game server (104) or a spectator management device/module operating in conjunction with the game server (104), in real time, from time to time, etc., using new, changed or deleted information available, collected or finalized for any, some or all spectators described or defined in the spectator profiles.

[0043] In response to receiving a request for video streaming of the computer game from a game spectator device operated by a game spectator, the game server (104) may invoke a spectator stream manager 114 to establish or determine a specific spectator profile for the game spectator based at least in part on information (e.g., spectator name, spectator identifier, spectator geographic location, spectator information, etc.) carried in the request. The specific spectator profile may be an individual spectator profile specifically established or determined for the game spectator. Additionally, optionally or alternatively, the specific spectator profile may be specifically established or determined for a subset of game spectators - including the game spectator - in an overall population of game spectators.
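
A minimal sketch of how a spectator stream manager might resolve a profile for an incoming streaming request, falling back to a default profile for new spectators as described in paragraph [0040]; the class name, dictionary fields and request format are assumptions for illustration only:

DEFAULT_PROFILE = {"game_prefs": [], "non_game_prefs": []}

class SpectatorStreamManager:
    """Illustrative profile lookup for incoming stream requests (hypothetical API)."""
    def __init__(self):
        self.profiles = {}  # spectator_id -> profile dict

    def profile_for_request(self, request):
        spectator_id = request.get("spectator_id")
        # New spectators start from a default profile, later refined as
        # spectator-specific information becomes available.
        return self.profiles.setdefault(spectator_id, dict(DEFAULT_PROFILE))

    def update_profile(self, spectator_id, **updates):
        self.profiles.setdefault(spectator_id, dict(DEFAULT_PROFILE)).update(updates)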

[0044] A spectator profile for a game spectator can be used by the game server (104) or the spectator stream manager (114) therein to determine preferences for game content as well as preferences for non-game content and carry out spectator enhanced game streaming to the game spectator device operated by the game spectator based at least in part on these preferences of the game spectator.

[0045] Game content refers to visual depictions or imagery of the game world captured or generated by real or virtual cameras from the game world (or a corresponding time varying 3D model representing the game world), including but not limited to avatars and attendant game objects/characters in the game world. Example preferences for game content may include, but are not necessarily limited to only, any of: favoring specific game player(s), disfavoring specific game player(s), favoring specific avatar(s), disfavoring specific avatar(s), favoring specific game plays, disfavoring specific game plays, favoring specific game scenes, disfavoring specific game scenes, favoring specific game world locations, disfavoring specific game world locations, etc.

[0046] Non-game content refers to visual information or imagery that is not captured by real or virtual cameras from the game world in which the computer game is unfolding - such as avatars and attendant game objects/characters in the game world as a part of playing the computer game - but rather comes from data source(s) external or extrinsic to the game world. Example preferences for non-game content may include, but are not necessarily limited to only, any of: favoring specific music genres, disfavoring specific music genres, favoring specific movies, disfavoring specific movies, favoring specific actors, disfavoring specific actors, favoring specific artists, disfavoring specific artists, favoring specific sports activities, disfavoring specific sports activities, favoring specific content topics, disfavoring specific content topics, favoring specific real world locations, disfavoring specific real world locations, etc. Example data source(s) to provide non-game content may include, but are not necessarily limited to only, any of: data source(s) local or remote to the game server (104), cloud-based data source(s), etc.

[0047] In response to determining the game spectator’s preferences for game content with the spectator profile for the game spectator, the game server (104) or the spectator stream manager (114) therein can proceed to capture or generate dynamic or real time game contents for the game spectator. In some operational scenarios, the game server (104) or the spectator stream manager (114) therein monitors dynamic or real time contents and/or contexts of the computer game and uses the monitored contents and/or contexts of the computer game in combination with the spectator’s preferences for game content to identify or select spectator specific dynamic or real time game views of the computer game in video game streaming of the computer game to game spectator(s).
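
The preference-driven selection described above could be sketched as follows, assuming simple tag/topic matching and reusing the hypothetical profile fields from the earlier sketch; the scoring scheme and dictionary fields are illustrative only:

def select_game_view(candidate_views, game_context, profile):
    """Score candidate game views against monitored game context and the
    spectator's game-content preferences (scoring scheme is illustrative)."""
    def score(view):
        s = sum(1 for p in profile["game_prefs"] if p in view["tags"])
        s += sum(1 for c in game_context if c in view["tags"])
        return s
    return max(candidate_views, key=score)

def select_non_game_content(catalog, profile):
    """Pick a non-game content item matching the spectator's non-game preferences."""
    matches = [item for item in catalog
               if any(p in item["topics"] for p in profile["non_game_prefs"])]
    return matches[0] if matches else None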

[0048] Example dynamic or real time contents and/or contexts in the computer game may include, but are not necessarily limited to only: any, some or all of: a presence or absence of a specific game player; anticipated, impending, or actual play actions of one or more game players or their avatars; anticipated, impending, or actual encountering of two specific game players or their avatars; anticipated, impending, or actual game server generated events such as scoring, death, etc., of an avatar representing a game player; anticipated, impending, or actual game player generated events in connection with an avatar representing a game player; change(s) of state(s) of a game world or a part therein; etc.

[0049] Example game views that can be dynamically selected - using dynamic or real time game contents and/or contexts in combination with the game spectator’s preferences - under enhanced game streaming as described herein may include, but are not necessarily limited to only, any, some or all of: birds-eye game views; first-person game views; game views other than birds-eye and first-person views; etc. In some operational scenarios, game views in video game streaming can be used or selected to focus on favorite game player(s) and/or popular game play(s) and/or favorite location(s) of a depicted game world in the computer game.

[0050] The game server (104) or the spectator stream manager (114) therein may invoke a camera controller 110 (e.g., continuously, from time to time, live, in real time, in near real time, etc.) to place or direct specific camera(s) at specific location(s) in relation to the game world (108) at specific time point(s) or in specific time interval(s) corresponding to the determined spectator specific game views, and direct the selected cameras to capture or generate spectator specific - spectator group specific in some operational scenarios, where game content is selected specifically for a spectator group or subset in an overall population of game spectators - game content images of selected game scenes of the game world (108) corresponding to the determined spectator specific game views.
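
A simplified sketch of a camera-controller step that assigns available cameras to the scenes selected for the determined game views; the data structures, field names and the fixed dwell time are assumptions:

from dataclasses import dataclass

@dataclass
class CameraDirective:
    camera_id: str
    position: tuple    # game-world coordinates of the vantage point
    direction: tuple   # unit vector toward the scene of interest
    start_time: float
    end_time: float

def place_cameras(scenes_of_interest, free_cameras, now, dwell=10.0):
    """Illustrative camera controller step: pair available cameras with the
    scenes selected for the determined spectator specific game views."""
    directives = []
    for scene, camera_id in zip(scenes_of_interest, free_cameras):
        directives.append(CameraDirective(
            camera_id=camera_id,
            position=scene["vantage_point"],
            direction=scene["look_direction"],
            start_time=now,
            end_time=now + dwell,
        ))
    return directives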

[0051] A real camera as described herein may capture images via light sensors or light sensitive materials operating with an optical stack. A virtual camera as described herein may be implemented to capture images through computer-implemented image rendering techniques using a 3D model defining or specifying visual or non-visual characteristics such as depths, locations, positions, contours, surfaces, volumes, textures, patterns, reflection properties, transparency properties, etc., of avatars, attendant game characters/objects (e.g., virtual, etc.), ambient background, light sources, etc.

[0052] The total number of active cameras at any given time of a game session of a computer game may or may not be fixed in various operational scenarios. A camera as described herein can be fixed or movable to (e.g., different, etc.) location(s) within the game world (108) and/or location(s) outside the game world (108) such as a birds-eye view location (e.g., where the camera is placed outside or above the game world (108), etc.).

[0053] In response to determining the game spectator’s preferences for non-game content with the spectator profile for the game spectator, the game server (104) or the spectator stream manager (114) therein can proceed to retrieve or identify, from one or more non-game content (e.g., internal, external, cloud-based, cooperative, etc.) data sources or data repositories, non-game contents for the game spectator. Additionally, optionally or alternatively, in some operational scenarios, the game server (104) or the spectator stream manager (114) therein uses the monitored contents and/or contexts of the computer game in combination with the spectator’s preferences for non-game content to retrieve or identify the non-game contents for the game spectator from the data sources or repositories.

[0054] Example non-game contents may include, but are not necessarily limited to only: any, some or all of: videos not originated from the game world (108); images not originated from the game world (108); graphics not originated from the game world (108); messages not originated from the game world (108); visual or textual content related or specific to an avatar, a game player, an attendant game character/object but not originated or captured from the game world (108) using real or virtual camera(s); any combination of the foregoing; etc.

[0055] The game server (104) or the spectator stream manager (114) therein may invoke a spectator content combiner 116 to (e.g., continuously, from time to time, live, in real time, in near real time, etc.) combine the spectator specific non-game content into the spectator specific images generated from the spectator specific game views of the game world (108) by replacing or superimposing the spectator specific non-game content at specific image portions of the spectator specific images. These specific image portions of the spectator specific images may correspond to spatial locations - in the game world (108) - set up or designated by the game server (104) as replaceable or superimposable with non-game visual content.

[0056] Texture, color (e.g., a green color image portion, etc.), luminance, image metadata, or any combination of the foregoing, can be used by the spectator content combiner (116) to identify a replaceable or superimposable image portion in a spectator specific game content image. The replaceable image portion may be replaced by or superimposed with one or more non-game content image portions retrieved or generated from the external or internal data store(s).
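
For the color-based case, identifying a replaceable (e.g., green-screen) image portion might be sketched as follows; the chroma thresholds and pixel layout are illustrative assumptions, and texture, luminance or metadata cues could be substituted:

def find_replaceable_region(pixels,
                            is_key_color=lambda r, g, b: g > 200 and r < 80 and b < 80):
    """Return the bounding box (x0, y0, x1, y1) of a chroma-keyed (e.g., green)
    image portion, or None if no such portion is found. pixels is a 2D list of
    (r, g, b) tuples; the thresholds are illustrative defaults."""
    xs, ys = [], []
    for y, row in enumerate(pixels):
        for x, (r, g, b) in enumerate(row):
            if is_key_color(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return min(xs), min(ys), max(xs) + 1, max(ys) + 1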

[0057] Additionally, optionally or alternatively, in some operational scenarios, a camera location of a spectator specific image captured from the game world (108) by a real or virtual camera may be used to determine whether and where to incorporate non-game content (image portions) into the spectator game content image.

[0058] Subsequent to combining the spectator specific non-game content into the spectator specific images of the game world (108) to generate spectator specific combined images, a spectator stream generator 112 may be invoked by the game server (104) or the spectator stream manager (114) to encode the spectator specific combined images (e.g., in one or more time consecutive sequences, etc.) into one or more spectator specific video streams of the computer game. These spectator specific video streams including the spectator specific combined images and related image metadata may be transmitted, streamed or delivered directly or indirectly to the game spectator device. The game spectator device may access, use or select some or all of the received spectator specific video streams of the computer game, decode the spectator specific combined images including the game and non-game visual content from some or all of the received spectator specific video streams of the computer game, generate corresponding display images from the decoded spectator specific combined images, and render some or all of the display images on one or more image displays operating in conjunction with the game spectator device.

3. COMBINING NON-GAME CONTENT WITH GAME CONTENT

[0059] Non-game (visual) content can be incorporated into game view images, for example using green screens or fiducial markers in the game view images or image portions (in the game view images) corresponding to pre-configured or dynamically determined locations/trajectories/perspectives within or in reference to a game world.

[0060] FIG. 2 illustrates an example game world (e.g., 108, etc.) in which specific portions of the game world (108) or spatial locations therein can be demarcated, designated or assigned - e.g., by the game server (104 of FIG. 1A), a (e.g., game streaming, etc.) server other than the game server (104), etc. - for incorporating non-game content into game view images acquired or captured by a real or virtual camera from the game world (108).

[0061] As shown in FIG. 2, in a first example, a specific spatial area 202-1 (e.g., a stadium, an open area, a terrace, etc.) in the game world (108) may be demarcated, designated or assigned as a specific portion of the game world (108) to indicate, identify or signal that non-game content may be incorporated, merged, combined, or superimposed into game view images depicting a portion or entirety of the specific portion of the game world (108).

[0062] In a second example, a specific spatial straight line 202-2 (e.g., a wall, etc.) in the game world (108) may be demarcated, designated or assigned as a specific portion of the game world (108) to indicate, identify or signal that non-game content may be incorporated, merged, combined, or superimposed into game view images depicting a portion or entirety of the specific portion of the game world (108).

[0063] In a third example, a specific point 202-3 (e.g., on top of an ornate bridge, etc.) in the game world (108) may be demarcated, designated or assigned as a specific portion of the game world (108) to indicate, identify or signal that non-game content may be incorporated, merged, combined, or superimposed into game view images depicting a portion or entirety of the specific portion of the game world (108).

[0064] In a fourth example, a specific spatial curve 202-4 (e.g., a trail, a path, etc.) in the game world (108) may be demarcated, designated or assigned as a specific portion of the game world (108) to indicate, identify or signal that non-game content may be incorporated, merged, combined, or superimposed into game view images depicting a portion or entirety of the specific portion of the game world (108).

[0065] In a fifth example, a specific spatial cube 202-5 (e.g., a building, a room, a corridor, etc.) in the game world (108) may be demarcated, designated or assigned as a specific portion of the game world (108) to indicate, identify or signal that non-game content may be incorporated, merged, combined, or superimposed into game view images depicting a portion or entirety of the specific portion of the game world (108).

[0066] In various operational scenarios, these and other (e.g., one-dimensional or 1D, two-dimensional or 2D, three-dimensional or 3D, etc.) geometric constructs may be used to demarcate, define or specify spatial portions of the game world (108) for non-game content placement, replacement and/or superimposition.

[0067] Some or all of the specific portions of the game world (108) for incorporating non-game content may be demarcated, delineated, specified or set up using geofences in a planar game world map representing the game world (108), geocoordinates represented in the game world (108), specific colors (e.g., green, etc.), specific textures (e.g., identifying a displayable wall or a poster board, etc.), fiducial markers, specific shapes (which may be permitted to vary, for example conformally, in game view images according to the camera location and direction used to capture the game view images containing these specific shapes), etc. Indication, identification or signaling information about some or all of the specific portions of the game world (108) can be visually perceptibly represented and embedded (e.g., with fiducial markers, specific colors, specific shapes, specific textures, etc.) within the game view images. Additionally, optionally or alternatively, indication, identification or signaling information about some or all of the specific portions of the game world (108) can be non-visually represented and carried in image metadata relating to the game view images.

[0068] A non-game content combiner (e.g., 116 of FIG. 1A, etc.) can receive, extract or determine the information about the specific portions of the game world (108) from the game view images (e.g., using computer vision techniques, applying object segmentation filtering, etc.) and/or the image metadata, and use the information to combine, incorporate or superimpose non-game content at specific image portions in the game view images.

[0069] FIG. 3A illustrates an example game view image 310 depicting a portion of a game world (e.g., 108 of FIG. 1A or FIG. 2, etc.) in which an avatar 302 representing a game player may be present or depicted along with attendant game objects 304 and 306. The game view image (310) may be captured from the game world (108) by a real or virtual camera (not shown) placed in a specific portion (e.g., any of 202-1 through 202-5 of FIG. 2, etc.) of the game world (108) or at a spatial location therein that has been demarcated, designated or assigned - e.g., by the game server (104 of FIG. 1A), a (e.g., game streaming, etc.) server other than the game server (104), etc. - for incorporating non-game content into game view images acquired or captured by the camera.

[0070] The attendant game object 306 as depicted in the game view image (310) may be visually perceptibly indicated, identified or signaled as a spatial location at which non-game content may be incorporated, combined, inserted or superimposed. For example, a green screen, a fiducial marker or a specific texture may be used in an image portion of the captured game view image (310) to indicate where the non-game content may be incorporated, combined, inserted or superimposed.

[0071] In response to detecting or determining (e.g., through computer vision, object segmentation, color/texture analysis, image metadata, etc.) that the attendant game object (306) represents a spatial location for inserting/overlaying (e.g., replacing, superimposing, etc.) non-game content, a non-game content combiner (e.g., 116 of FIG. 1A, etc.) as described herein may generate a non-game content image portion 308 in which the non-game content is visually included or depicted. The non-game content combiner (116) can proceed to insert/overlay (e.g., replace, superimpose, etc.) some or all of the attendant game object (306) with the non-game content image portion (308).

[0072] In some operational scenarios, before being inserted/overlaid into the game view image (310), the non-game content image portion (308) may have one or more spatial transformations applied to it so that it fits into the designated image portion of the game view image (310) in a perspective corrective manner in reference to the location of the camera and/or camera settings (e.g., as indicated in image metadata, as inferred from visual content of the game view image (310), etc.). Example spatial transformations may include, but are not necessarily limited to only, any of: translations, rotations, scalings, camera-dependent reorientations or resizing, etc.
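
As a simplified sketch of such a spatial transformation (a full implementation would derive a perspective/homography warp from the camera location and settings; this sketch stops at a scale-rotate-translate fit, and all names are assumptions):

import math

def fit_content_transform(content_size, target_box, rotation_degrees=0.0):
    """Compute a simplified transform (scale, rotation, translation) mapping a
    rectangular non-game content patch onto a designated image portion."""
    cw, ch = content_size
    x0, y0, x1, y1 = target_box
    sx, sy = (x1 - x0) / cw, (y1 - y0) / ch
    theta = math.radians(rotation_degrees)

    def transform(px, py):
        # Scale, then rotate about the patch origin, then translate into place.
        qx, qy = px * sx, py * sy
        rx = qx * math.cos(theta) - qy * math.sin(theta)
        ry = qx * math.sin(theta) + qy * math.cos(theta)
        return x0 + rx, y0 + ry

    return transform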

[0073] FIG. 3B illustrates an example game view image 310-1 depicting a portion of a game world (e.g., 108 of FIG. 1A or FIG. 2, etc.) in which the avatar (302) may be present or depicted along with the attendant game objects (304). The game view image (310-1) may be captured from the game world (108) by a real or virtual camera 312 placed in a specific portion (e.g., any of 202-1 through 202-5 of FIG. 2, etc.) of the game world (108) or at a spatial location therein that has been demarcated, designated or assigned - e.g., by the game server (104 of FIG. 1A), a (e.g., game streaming, etc.) server other than the game server (104), etc. - for incorporating non-game content into game view images acquired or captured by the camera.

[0074] An image portion 306-1 of the captured game view image (310-1) at which non-game content may be incorporated, combined, inserted or superimposed may not be visually perceptibly indicated in the game view image (310-1). Instead, image metadata may be generated by the camera (312), or by a (e.g., pre-, post-, etc.) processing device operating in conjunction with the camera (312), that logically but not visually delineates or demarcates the image portion (306-1) of the game view image (310-1).

[0075] The logical delineation or demarcation of the image portion (306-1) may be generated or determined using any combination of: locational information (e.g., geo-coordinates in a game world map or stationary coordinate system representing the game world (108), orientations, etc.) and/or settings (e.g., zoom factor, etc.) of the camera (312); locational information (e.g., geo-coordinates, orientations in relation to the camera (312) or a stationary coordinate system of the game world (108), etc.) of attendant game objects/characters such as 304 of FIG. 3B; locational information (e.g., geo-coordinates, orientations in relation to the camera (312) or a stationary coordinate system of the game world (108), etc.) of avatars representing game players such as 302 of FIG. 3B; locational or geofence information (e.g., geocoordinates, orientations in relation to the camera (312) or a stationary coordinate system of the game world (108), etc.) of spatial portions/locations of the game world (108) that permit non-game content insertion such as any of 202-1 through 202-5 of FIG. 2; etc.

[0076] By way of example but not limitation, automatic depth extraction may be performed to determine a depth (e.g., representing five meters away from the camera (312), etc.) of an avatar depicted in game view images and non-game content can be inserted or superimposed at an image portion at a second depth (e.g., representing ten meters away from the camera (312), representing five meters away from the avatar, etc.). Additionally, optionally or alternatively, an orientation of the avatar in relation to the camera (312) can be determined or extracted and used to determine a second orientation of an image portion with which the non-game content is inserted or superimposed.
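
The depth- and orientation-based placement rule sketched in paragraph [0076] might be expressed as follows; the fixed offset, coordinate conventions, return values and function name are assumptions:

def placement_from_avatar(avatar_position, avatar_depth, camera_position,
                          offset_behind=5.0):
    """Place non-game content at a second depth a fixed distance behind a
    depicted avatar, along the camera-to-avatar line (illustrative sketch)."""
    # Direction from the camera toward the avatar in game-world coordinates.
    direction = tuple(a - c for a, c in zip(avatar_position, camera_position))
    length = sum(d * d for d in direction) ** 0.5 or 1.0
    unit = tuple(d / length for d in direction)
    target_depth = avatar_depth + offset_behind
    placement = tuple(c + u * target_depth for c, u in zip(camera_position, unit))
    return placement, target_depth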

[0077] In response to detecting or determining (e.g., through image metadata, etc.) that the image portion (306-1) represents a spatial location for inserting/overlaying (e.g., replacing, superimposing, etc.) non-game content, a non-game content combiner (e.g., 116 of FIG. 1A, etc.) as described herein may generate a non-game content image portion 308-1 in which the non-game content is visually included or depicted. The non-game content combiner (116) can proceed to insert/overlay (e.g., replace, superimpose, etc.) some or all of the image portion (306-1) with the non-game content image portion (308-1).

[0078] In some operational scenarios, before being inserted/overlaid into the game view image (310-1), the non-game content image portion (308-1) may have one or more spatial transformations applied to it so that it fits into the designated image portion of the game view image (310-1) in a perspective corrective manner in reference to the location of the camera and/or camera settings (e.g., as indicated in image metadata, as inferred from visual content of the game view image (310-1), etc.). Example spatial transformations may include, but are not necessarily limited to only, any of: translations, rotations, scalings, camera-dependent reorientations or resizing, etc.

4. TRACKING GAME STREAMING

[0079] FIG. 1B illustrates an example system 100-1 for enhancing and tracking video game streaming. Some or all devices and components/units thereof in the system (100-1) may be implemented in software, hardware, or a combination of software and hardware, with one or more of: computing processors such as central processing units or CPUs, audio codecs, video codecs, digital signal processors, graphic processing units or GPUs, etc.

[0080] As illustrated in FIG. 1B, the system (100-1) includes a game server 104 that hosts a (e.g., live, video, online, real time, near real time, competitive, tournament, single player, multi-player, team-based, etc.) computer game played by one or more (e.g., human, etc.) game players respectively operating one or more game player devices 102-1, 102-2, ... 102-N, where N is an integer no less than one (1).

[0081] The one or more game players may be respectively (virtually) represented by one or more avatars located in a game world 108. The game world (108) can be (e.g., autonomously, independently, without depending on a game director/streaming server, etc.) generated, defined, dynamically represented and/or maintained in the game server (104) or a game engine 106 therein.

[0082] Each game player device of the game player devices (102-1, 102-2, ... 102-N) can be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game client application for the computer game.

[0083] Spectator-specific and/or non-spectator-specific video game streaming can be supported from the game server (104) to one or more game spectator devices 116-1, 116-2, ... 116-M, where M is an integer no less than one (1), operated respectively by one or more (e.g., human, etc.) game spectators.

[0084] The game spectator devices (116-1, 116-2, ... 116-M) may be configured or installed with a (e.g., same, with relatively minor differences, web-based, non-web-based, mobile, etc.) game spectator application for receiving video (or image) streams of the computer game directly or indirectly from the game server (104).

[0085] The game server (104) may maintain a data store or data cache to store some or all (e.g., up-to-date, etc.) spectator profiles that have been established or determined in past or current streaming operations. For any new spectator for whom no existing spectator profile has been specifically selected or defined, a default spectator profile may be initially selected for the purpose of generating enhanced video streaming for the new spectator. The initially selected spectator profile may be refined, updated and/or specialized with spectator specific information for the new spectator as such information becomes available.

[0086] In response to receiving a request for video streaming of the computer game from a game spectator device operated by a game spectator, the game server (104) may invoke a spectator stream manager 114 to establish or determine a specific spectator profile for the game spectator based at least in part on information (e.g., spectator name, spectator identifier, spectator geographic location, spectator information, etc.) carried in the request.

[0087] A spectator profile for a game spectator can be used by the game server (104) or the spectator stream manager (114) therein to determine preferences for game content as well as preferences for non-game content and carry out spectator enhanced game streaming to the game spectator device operated by the game spectator based at least in part on these preferences of the game spectator.

[0088] In some operational scenarios, the game server (104) may invoke some or all of a spectator stream manager (e.g., 114, etc.), a camera controller (e.g., 110, etc.), a spectator content combiner (e.g., 116, etc.), a spectator stream generator (e.g., 112, etc.) to generate (e.g., non-spectator-specific, spectator-specific, a combination of non-spectator-specific and spectator-specific, etc.) game content and (e.g., non-spectator-specific, spectator-specific, a combination of non-spectator-specific and spectator-specific, etc.) non-game content and encode the game content and non-game content into one or more video game streams available for streaming to one or more game spectator devices. As used herein, “non-spectator-specific” content may refer to content that is not selected based on spectator specific preferences. “Spectator-specific” content may refer to content that is specifically selected based at least in part or in whole on spectator specific preferences.

[0089] In a first example, the game video streams may include at least one game video stream encoded with non-spectator-specific game content and non-spectator-specific non-game content. In a second example, the game video streams may include at least one game video stream encoded with non-spectator-specific game content and spectator-specific non-game content. In a third example, the game video streams may include at least one game video stream encoded with spectator-specific game content and non-spectator-specific non-game content. In a fourth example, the game video streams may include at least one game video stream encoded with spectator-specific game content and spectator-specific non-game content. These and other approaches of generating and encoding game content and/or non-game content may be used in various operational scenarios.
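For illustration only, the following minimal sketch enumerates the four combinations described in the preceding paragraph (spectator-specific versus non-spectator-specific game content and non-game content); the selection logic and names are assumptions, not details of this disclosure.

from itertools import product

def plan_stream_variants(spectator_profile=None):
    """Enumerate the 2x2 space of stream variants; spectator-specific variants
    are marked unavailable until a spectator profile has been established."""
    variants = []
    for game_specific, non_game_specific in product((False, True), repeat=2):
        variants.append({
            "game_content": "spectator-specific" if game_specific else "non-spectator-specific",
            "non_game_content": "spectator-specific" if non_game_specific else "non-spectator-specific",
            # Spectator-specific content presupposes an established profile.
            "available": spectator_profile is not None or not (game_specific or non_game_specific),
        })
    return variants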

[0090] Techniques as described herein may implement or enable operations with various types of game content directing, for example to focus on favorite game players or their corresponding avatars, using one or more of: automated game content directors, designated directors with human input, self-direction by individual game spectators, etc. For example, an automated director can automatically identify or select the most interesting game views and corresponding video game streams from a space or set of all possible game views and video game streams that can possibly be captured at all available angles or positions within or in reference to a game world. One of these most interesting game views or video game streams may be captured with a camera that focuses on or includes an image portion with which non-game content can be inserted or superimposed. The camera may be directed to capture a game play with the inclusion of the image portion for a relatively extended time duration or period to lengthen spectator viewing time of the non-game content inserted or superimposed with the image portion. Additionally, optionally or alternatively, spectator viewing time of the non-game content can be extended using replays, highlights, slow motions, etc., that include the image portion with which the non-game content is inserted or superimposed. A director - including but not limited to a game spectator performing self-direction - can select or move a camera through the game world, instead of viewing from a head-mounted camera of a first-person view or of a game player. Each of some or all game spectators can see a unique game view with a distinct perspective. For example, different game spectators can direct their respective cameras to focus on their favorite game players or corresponding avatars, respectively.
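As a minimal sketch of the automated-director behavior described above, the following Python code scores candidate game views and holds views that include the designated image portion for longer, thereby lengthening spectator exposure to the inserted non-game content; the scoring weights and names are illustrative assumptions, not details of this disclosure.

from dataclasses import dataclass
from typing import List

@dataclass
class CandidateView:
    camera_id: str
    interest_score: float           # e.g., derived from game events near the camera
    shows_designated_portion: bool  # whether the designated spatial portion is in view

def pick_view(candidates: List[CandidateView], dwell_bonus: float = 0.25) -> CandidateView:
    # Favor the most interesting view; add a bonus when the designated spatial
    # portion (and hence the inserted non-game content) is visible.
    def score(v: CandidateView) -> float:
        return v.interest_score + (dwell_bonus if v.shows_designated_portion else 0.0)
    return max(candidates, key=score)

def hold_duration(view: CandidateView, base_seconds: float = 5.0) -> float:
    # Extend the capture duration when the view includes the designated portion.
    return base_seconds * (2.0 if view.shows_designated_portion else 1.0)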

[0091] In an example, the game video streams available for streaming to game spectator devices may include at least one game video stream encoded with game content generated under direction of an (e.g., artificial intelligence or AI based, machine learning or ML based, artificial neural network or ANN based, rule based, etc.) automated director with no or little user input. In another example, the game video streams may include at least one game video stream encoded with game content generated under direction of a designated director. In yet another example, the game video streams may include at least one game video stream encoded with game content generated under direction of a game spectator. Example generation of game content under direction of an automated director, designated director or game spectator can be found in U.S. Patent Application No. 63/305,635 (Attorney Docket No. 60175-0492; D20111USP1), titled “DIRECTING COMPUTER GAME STREAMING,” by Ajit Ninan, filed on 1 February 2022, the contents of which are incorporated herein by reference in their entirety.

[0092] Game streaming - including but not limited to directed game streaming under an automated director or human director/spectator - may be supported with the game world (108) that hosts the computer game or a mirror copy (e.g., 128 of FIG. 1C, etc.) of the game world (108) maintained or mirrored in another server device/system such as a media streaming server (e.g., 126 of FIG. 1C, etc.) operating in conjunction with the game server (104).

[0093] FIG. 1C illustrates an example game streaming server 126 - which may be a game director server that directs game content creation - operating in conjunction with a game server (e.g., 104, etc.). As shown, the game server includes a game engine (e.g., 106, etc.) that interacts with game player devices (e.g., 102-1, 102-2, ... 102-N, etc.) to host the computer game. The game player devices can still send player device generated game state and control information to the game server (104) or the game engine (106) therein and receive game server generated game state and control information from the game server (104), for the purpose of playing the computer game. In some operational scenarios, the game streaming server (126) receives the player device generated game state and control information as well as the game server generated game state and control information, directly or indirectly from the game player devices and/or the game server (104). In some operational scenarios, the game streaming server (126) receives the player device generated game state and control information directly or indirectly from the game player devices but does not receive the game server generated game state and control information, directly or indirectly from the game server (104). Instead, the game streaming server (126) may include or implement a second game engine separate from the game engine (106) of the game server (104) to independently generate or recreate a mirror copy of the game server generated state and control information. In any event, the game streaming server (126) accesses or uses the player device generated state and control information as well as the game server generated state and control information (or the independently generated mirror copy thereof) to maintain a mirror game world 128. The game streaming server (126) can invoke or request some or all of a spectator stream manager (e.g., 114, etc.), a camera controller (e.g., 110, etc.), a spectator content combiner (e.g., 116, etc.), a spectator stream generator (e.g., 112, etc.) to generate game view content - e.g., multiple concurrent or simultaneous game views or visual depictions from different cameras differently positioned and/or differently oriented/angled in relation to the mirror game world (128) - from the mirror game world (128), instead of from the game world (108) maintained by the game server (104) of FIG. 1C.

[0094] Separate game viewing for different game spectators - which may or may not vary based at least in part on respective spectator profiles/preferences of the game spectators - may be supported under techniques as described herein. In some operational scenarios, no game view switching is provided to a specific game spectator. Only a specific game view as selected by a game server or a streaming server is provided to such a game spectator. In some operational scenarios, only game content is streamed to a specific game spectator, without insertion or overlaying of any non-game content into game view images that provide the game content. In some operational scenarios, game content overlaid or superimposed with non-game content - e.g., with transparent backgrounds to show through the original game content portions covered by non-game content portions, etc. - is streamed to a specific game spectator. In some operational scenarios, game content depicting multiple game players or their corresponding avatars - e.g., not only seen from one game player, not only a favorite game player or a corresponding avatar, etc. - and captured by a camera placed/located within or outside the game world is streamed to a specific game spectator.
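As a minimal sketch of the mirror-game-world maintenance described above, the following Python code replays player-device generated state and control updates and advances a second game engine step, under the assumption of a simple dictionary-based state representation; the class and field names are illustrative, not details of this disclosure.

from typing import Any, Dict, Iterable

class MirrorGameWorld:
    def __init__(self):
        self.entities: Dict[str, Dict[str, Any]] = {}  # avatar/object id -> state

    def apply_player_update(self, update: Dict[str, Any]) -> None:
        # Apply a single player-device generated state/control update.
        entity = self.entities.setdefault(update["entity_id"], {})
        entity.update(update.get("state", {}))

    def step(self, dt: float) -> None:
        # A second game engine would advance physics/AI here to recreate the
        # game-server generated state; this placeholder only records the step.
        for entity in self.entities.values():
            entity["last_step_dt"] = dt

def maintain_mirror(world: MirrorGameWorld,
                    player_updates: Iterable[Dict[str, Any]],
                    dt: float = 1 / 60) -> None:
    for update in player_updates:
        world.apply_player_update(update)
    world.step(dt)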

[0095] In some operational scenarios, in response to the request for video streaming of the computer game from the game spectator device, the game server (104) of FIG. 1A or FIG. 1B - or the game streaming server (126) of FIG. 1C - may stream, transmit or deliver one or more specific game video streams to the game spectator device.

[0096] Additionally, optionally or alternatively, in some operational scenarios, in response to the request for video streaming of the computer game from the game spectator device, the game server (104) of FIG. 1A or FIG. 1B - or the game streaming server (126) of FIG. 1C - may first send a video game streaming response (e.g., a web page, an HTML message, etc.) to the game spectator device. The response may include one or more selectable options (e.g., one or more selectable links on a web page, one or more selectable/clickable thumbnail images, etc.) respectively corresponding to the one or more game video streams.

[0097] In response to receiving the game streaming response, the game spectator device can present the selectable options specified in the video streaming response, for example in a display page, to the game spectator. Based on interaction (e.g., user clicking, user selecting, selection of a particular selectable/clickable thumbnail image on the display page, etc.) with the game spectator, the game spectator device determines or identifies a specific selectable option from among the selectable options. The game spectator device proceeds to send to the game server (104) of FIG. 1A or FIG. 1B - or the game streaming server (126) of FIG. 1C - a selection request for a specific video stream corresponding to the specific selectable option.

[0098] In response to receiving the selection request for the specific video stream from the game spectator device, the game server (104) of FIG. 1A or FIG. 1B - or the game streaming server (126) of FIG. 1C - streams, transmits or delivers the specific video stream to the game spectator device. In a game session of the computer game, the game spectator may operate the game spectator device to switch between different available game video streams.
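As a minimal sketch of the request/selection flow described above, the following Python code answers a streaming request with selectable options and delivers a specific stream once the spectator device sends back a selection; the endpoint-like method names, thumbnail URL pattern and in-memory stream store are illustrative assumptions, not details of this disclosure.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class StreamOption:
    stream_id: str
    title: str
    thumbnail_url: str

class GameStreamingFrontend:
    def __init__(self, streams: Dict[str, bytes]):
        self._streams = streams  # stream_id -> encoded video game stream

    def handle_streaming_request(self, spectator_id: str) -> List[StreamOption]:
        # Return selectable options (e.g., rendered as clickable thumbnails).
        return [StreamOption(sid, f"Game view {sid}", f"/thumbs/{sid}.jpg")
                for sid in self._streams]

    def handle_selection_request(self, spectator_id: str, stream_id: str) -> bytes:
        # Stream the selected game video stream to the spectator device.
        return self._streams[stream_id]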

[0099] In some operational scenarios, (e.g., spectator specific, non-spectator specific, etc.) non-game content can be provided, selected, directed or curated to specific game spectators to lengthen viewing time of the non-game content (e.g., conveying a specific commercial or noncommercial message, etc.). Additionally, optionally or alternatively, in some operational scenarios, (e.g., spectator specific, non-spectator specific, etc.) game content can be provided, selected, directed or curated to specific game spectators to lengthen viewing time of the game content (e.g., depicting an avatar representing a specific game player, depicting a specific game play, etc.).

[0100] In some operational scenarios, a game streaming tracker 124 may be implemented as a part of, or a computing device operating in conjunction with, the game server (104) of FIG. 1A or FIG. 1B - or the game streaming server (126) of FIG. 1C - to keep track of game viewing statistics of (e.g., a population of, etc.) game spectators and respective viewing times. The game viewing statistics may include individual or overall viewing statistics for individual portions of game content or individual portions of non-game content. Some or all of the game viewing statistics can be used on an ongoing basis for directing game spectators to specific game players, specific game plays, specific locations in the game world (108 of FIG. 1A or FIG. 1B) or the mirror copy (128 of FIG. 1C), etc., or to specific non-game content or specific portions thereof.

[0101] In a first example, the game streaming tracker (124) may generate specific game viewing statistics for non-game content viewing with respect to an individual game spectator, an individual game spectator group (comprising multiple game spectators), and up to an entire game spectator population. The specific game viewing statistics may include, but are not necessarily limited to only, any, some or all of: types of non-game content items/portions, sources of non-game content items/portions, numbers of spectator views of non-game content items/portions, time durations of spectator views of non-game content items/portions, etc.

[0102] Additionally, optionally or alternatively, the game streaming tracker (124) may generate specific game viewing statistics for game content viewing with respect to an individual game spectator, an individual game spectator group (comprising multiple game spectators), and up to an entire game spectator population. The specific game viewing statistics may include, but are not necessarily limited to only, any, some or all of: types of game players, game players, game play locations, avatars, attendant game objects/characters, spectator views and time durations of views of any or any combination of the foregoing, etc.

[0103] Additionally, optionally or alternatively, the game streaming tracker (124) may generate specific finer game viewing statistics from collected game viewing statistics with respect to an individual game spectator, an individual game spectator group (comprising multiple game spectators), and up to an entire game spectator population. These finer game viewing statistics may include any inferred viewing relationships between viewed game content items/portions and viewed non-game content items/portions with respect to an individual game spectator, an individual game spectator group (comprising multiple game spectators), and up to an entire game spectator population.

[0104] In a second example, the game streaming tracker (124) may generate specific game viewing statistics for non-game content viewing with respect to an individual game player, an individual game player team, etc. The specific game viewing statistics may include, but are not necessarily limited to only, any, some or all of: types of non-game content items/portions viewed by spectator group (e.g., demographic, geographic, age, other spectator characteristics, etc.) up to entire spectator population, sources of non-game content items/portions viewed by spectator group up to entire spectator population, numbers of spectator views of non-game content items/portions viewed by spectator group up to entire spectator population, time durations of spectator views of non-game content items/portions viewed by spectator group up to entire spectator population, etc.

[0105] Additionally, optionally or alternatively, the game streaming tracker (124) may generate specific game viewing statistics for game content viewing with respect to an individual game player, an individual game player team, etc. The specific game viewing statistics may include, but are not necessarily limited to only, any, some or all of: types of game players, game players, game play locations, avatars, attendant game objects/characters, spectator views and time durations of spectator views of any or any combination of the foregoing, etc.

[0106] Additionally, optionally or alternatively, the game streaming tracker (124) may generate specific finer game viewing statistics from collected game viewing statistics with respect to an individual game player, an individual game player team, etc. These finer game viewing statistics may include any inferred viewing relationships between viewed game content items/portions and viewed non-game content items/portions with respect to an individual game player, an individual game player team, etc.

[0107] In a third example, the game streaming tracker (124) may generate specific game viewing statistics for non-game content viewing with respect to an individual non-game content item/portion, an individual source of non-game content items/portions, etc. The specific game viewing statistics may include, but are not necessarily limited to only, any, some or all of: types of non-game content items/portions viewed by spectator group (e.g., demographic, geographic, age, other spectator characteristics, etc.) up to entire spectator population, sources of non-game content items/portions viewed by spectator group up to entire spectator population, numbers of spectator views of non-game content items/portions viewed by spectator group up to entire spectator population, time durations of spectator views of non-game content items/portions viewed by spectator group up to entire spectator population, etc.

[0108] Additionally, optionally or alternatively, the game streaming tracker (124) may generate specific game viewing statistics for game content viewing with respect to an individual non-game content item/portion, an individual source of non-game content items/portions, etc. The specific game viewing statistics may include, but are not necessarily limited to only, any, some or all of: types of game players, game players, game play locations, avatars, attendant game objects/characters, spectator views and time durations of spectator views of any or any combination of the foregoing, etc.

[0109] Additionally, optionally or alternatively, the game streaming tracker (124) may generate specific finer game viewing statistics from collected game viewing statistics with respect to an individual non-game content item/portion, an individual source of non-game content items/portions, etc. These finer game viewing statistics may include any inferred viewing relationships between viewed game content items/portions and viewed non-game content items/portions with respect to an individual non-game content item/portion, an individual source of non-game content items/portions, etc.

[0110] Based at least in part on these specific game viewing statistics and/or finer game viewing statistics and/or inferred viewing relationships between the game content items/portions and the non-game content items/portions, the game server (104) of FIG. 1A or FIG. 1B or the game streaming server (126) of FIG. 1C may control cameras to focus on specific game players or avatars, specific game plays, etc., and control non-game content insertion or superimposition from specific non-game content sources, specific non-game content items/portions, etc.

[0111] In some operational scenarios, a game player may be assigned by a game server with additional weapons, lives, energies, features, credits, skins, materials, capabilities, magic, etc., in present or future game playing or in present or future computer games in response to determining that the game player or avatar(s) representing the game player generates viewership from more spectators, more viewing numbers, more viewing time durations, etc., than other game players.

[0112] In some operational scenarios, a game play, a game player, an avatar, a non-game content item/portion, a source of non-game content, etc., may be assigned by a game server or a game streaming server with lengthened time or higher priority to be included in present or future video game streaming in response to determining that the game play, game player, avatar, non-game content item/portion, source of non-game content, etc., generates viewership from more spectators, more viewing numbers, more viewing time durations, etc., as compared with other game plays, game players, avatars, non-game content items/portions, sources of non-game content, etc.
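As a minimal sketch of the game streaming tracker described above, the following Python code accumulates viewing events per game player and per non-game content item and exposes the aggregates that could then drive camera focus, insertion priority or additional player capabilities; the names and the simple counting scheme are illustrative assumptions, not details of this disclosure.

from collections import defaultdict
from typing import Dict, List, Tuple

class GameStreamingTracker:
    def __init__(self):
        # Per-player and per-content aggregates: (view count, total viewing seconds).
        self.player_stats: Dict[str, Tuple[int, float]] = defaultdict(lambda: (0, 0.0))
        self.content_stats: Dict[str, Tuple[int, float]] = defaultdict(lambda: (0, 0.0))

    def record_view(self, player_id: str, content_id: str, seconds: float) -> None:
        views, total = self.player_stats[player_id]
        self.player_stats[player_id] = (views + 1, total + seconds)
        views, total = self.content_stats[content_id]
        self.content_stats[content_id] = (views + 1, total + seconds)

    def top_players(self, n: int = 3) -> List[Tuple[str, Tuple[int, float]]]:
        # Players generating the most viewing time, e.g., candidates for
        # additional in-game capabilities or extended camera coverage.
        return sorted(self.player_stats.items(),
                      key=lambda kv: kv[1][1], reverse=True)[:n]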

5. EXAMPLE PROCESS FLOWS

[0113] FIG. 4 illustrates an example process flow according to an example embodiment of the present invention. In some example embodiments, one or more computing devices or components may perform this process flow. In block 402, a system for incorporating non-game content into video game streaming designates a spatial portion of a game world of a computer game for incorporating non-game visual content.

[0114] In block 404, the system receives a game view image that includes a visual depiction of at least a part of the game world at a first time.

[0115] In block 406, the system determines whether the game view image includes the spatial portion of the game world.

[0116] In block 408, in response to determining that the game view image includes the spatial portion of the game world, the system inserts a non-game visual content portion into the game view image at a spatial location corresponding to the spatial portion.

[0117] In block 410, the system encodes the game view image, as inserted with the non-game visual content portion, into a video game stream to cause a recipient device of the video game stream to generate a display image from the game view image and render the display image on an image display operating in conjunction with the recipient device.
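For illustration only, the following minimal sketch walks through the process flow of FIG. 4 (blocks 402-410) with the geometry, image handling and encoding reduced to trivial stand-ins; all names are illustrative assumptions, not details of this disclosure.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class SpatialPortion:
    # Block 402: a designated rectangular region of the game world (world units).
    x: float
    y: float
    w: float
    h: float

@dataclass
class GameViewImage:
    # Block 404: a visual depiction of part of the game world at a first time,
    # here reduced to the world-space rectangle the camera sees plus pixels.
    view_rect: Tuple[float, float, float, float]
    pixels: List[List[int]]
    overlay: Optional[str] = None

def includes_portion(image: GameViewImage, portion: SpatialPortion) -> bool:
    # Block 406: check whether the designated portion falls inside the view.
    vx, vy, vw, vh = image.view_rect
    return (vx <= portion.x and portion.x + portion.w <= vx + vw and
            vy <= portion.y and portion.y + portion.h <= vy + vh)

def process_game_view(image: GameViewImage, portion: SpatialPortion,
                      non_game_content: str) -> GameViewImage:
    if includes_portion(image, portion):
        # Block 408: insert the non-game visual content at the matching location.
        image.overlay = non_game_content
    # Block 410: encoding into a video game stream would follow here.
    return image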

[0118] In an embodiment, the spatial portion is visually perceptibly demarcated in the game view image.

[0119] In an embodiment, the spatial portion is not visually perceptibly demarcated in the game view image and is indicated in image metadata generated for the game view image.

[0120] In an embodiment, a game spectator operates the recipient device to receive and view video game streaming of the computer game.

[0121] In an embodiment, the game view image depicts a specific visual scene, represented in the game world, selected based on spectator preferences for game content.

[0122] In an embodiment, the non-game content item is selected based on spectator preferences for non-game content.

[0123] In an embodiment, different video game streams generated by multiple cameras from the game world are streamed to different recipient devices; game viewing statistics are collected, tracked and analyzed to cause changes to future video game streaming of the computer game.

[0124] In an embodiment, different game spectators operate the different recipient devices, respectively; a specific game spectator, among the different game spectators, is provided with a specific non-game content item incorporated into game view images rendered to the specific game spectator; the specific non-game content item is selected based at least in part on individual game viewing statistics collected for the specific game spectator.

[0125] In an embodiment, a specific game player of the computer game is provided with additional capabilities based at least in part on individual game viewing statistics collected for the specific game player.

[0126] In an embodiment, a camera is placed at a specific position and a specific direction relative to a coordinate system of the game world to capture one or more game view images from the game world; the specific position and the specific direction are selected based at least in part on individual game viewing statistics collected for one of a game spectator, a game player, a game play, a source of non-game content, etc.

[0127] In an embodiment, the game world represents a mirror copy of a player-interacting game world.

[0128] In an embodiment, the video game stream is encoded with a sequence of consecutive game view images captured from the game world under direction of one of an automated director, a designated director, a game spectator, etc.

[0129] In various example embodiments, an apparatus, a system, or one or more other computing devices performs any or a part of the foregoing methods as described. In an embodiment, a non-transitory computer readable storage medium stores software instructions, which when executed by one or more processors cause performance of a method as described herein.

[0130] Note that, although separate embodiments are discussed herein, any combination of embodiments and/or partial embodiments discussed herein may be combined to form further embodiments.

6. IMPLEMENTATION MECHANISMS - HARDWARE OVERVIEW

[0131] According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.

[0132] For example, FIG. 5 is a block diagram that illustrates a computer system 500 upon which an example embodiment of the invention may be implemented. Computer system 500 includes a bus 502 or other communication mechanism for communicating information, and a hardware processor 504 coupled with bus 502 for processing information. Hardware processor 504 may be, for example, a general purpose microprocessor.

[0133] Computer system 500 also includes a main memory 506, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 502 for storing information and instructions to be executed by processor 504. Main memory 506 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 504. Such instructions, when stored in non-transitory storage media accessible to processor 504, render computer system 500 into a special-purpose machine that is customized to perform the operations specified in the instructions.

[0134] Computer system 500 further includes a read only memory (ROM) 508 or other static storage device coupled to bus 502 for storing static information and instructions for processor 504.

[0135] A storage device 510, such as a magnetic disk, optical disk, or solid state RAM, is provided and coupled to bus 502 for storing information and instructions.

[0136] Computer system 500 may be coupled via bus 502 to a display 512, such as a liquid crystal display, for displaying information to a computer user. An input device 514, including alphanumeric and other keys, is coupled to bus 502 for communicating information and command selections to processor 504. Another type of user input device is cursor control 516, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 504 and for controlling cursor movement on display 512. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.

[0137] Computer system 500 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 500 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 500 in response to processor 504 executing one or more sequences of one or more instructions contained in main memory 506. Such instructions may be read into main memory 506 from another storage medium, such as storage device 510. Execution of the sequences of instructions contained in main memory 506 causes processor 504 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.

[0138] The term “storage media” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 510. Volatile media includes dynamic memory, such as main memory 506. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge.

[0139] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 502. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[0140] Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 504 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 500 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 502. Bus 502 carries the data to main memory 506, from which processor 504 retrieves and executes the instructions. The instructions received by main memory 506 may optionally be stored on storage device 510 either before or after execution by processor 504.

[0141] Computer system 500 also includes a communication interface 518 coupled to bus 502. Communication interface 518 provides a two-way data communication coupling to a network link 520 that is connected to a local network 522. For example, communication interface 518 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 518 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 518 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.

[0142] Network link 520 typically provides data communication through one or more networks to other data devices. For example, network link 520 may provide a connection through local network 522 to a host computer 524 or to data equipment operated by an Internet Service Provider (ISP) 526. ISP 526 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 528. Local network 522 and Internet 528 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 520 and through communication interface 518, which carry the digital data to and from computer system 500, are example forms of transmission media.

[0143] Computer system 500 can send messages and receive data, including program code, through the network(s), network link 520 and communication interface 518. In the Internet example, a server 530 might transmit a requested code for an application program through Internet 528, ISP 526, local network 522 and communication interface 518.

[0144] The received code may be executed by processor 504 as it is received, and/or stored in storage device 510, or other non-volatile storage for later execution.

7. EQUIVALENTS, EXTENSIONS, ALTERNATIVES AND MISCELLANEOUS

[0145] In the foregoing specification, example embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is the invention, and is intended by the applicants to be the invention, is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in such claims shall govern the meaning of such terms as used in the claims. Hence, no limitation, element, property, feature, advantage or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

[0146] Various aspects of the present invention may be appreciated from the following enumerated example embodiments (EEEs):

EEE 1. A method comprising: designating a spatial portion of a game world of a computer game for incorporating nongame visual content; receiving a game view image that includes a visual depiction of at least a part of the game world at a first time; determining whether the game view image includes the spatial portion of the game world; in response to determining that the game view image includes the spatial portion of the game world, inserting a non-game visual content portion into the game view image at a spatial location corresponding to the spatial portion; encoding the game view image, as inserted with the non-game visual content portion, into a video game stream to cause a recipient device of the video game stream to generate a display image from the game view image and render the display image on an image display operating in conjunction with the recipient device.

EEE 2. The method of EEE 1, wherein the spatial portion is visually perceptibly demarcated in the game view image.

EEE 3. The method of EEE 1, wherein the spatial portion is not visually perceptibly demarcated in the game view image and is indicated in image metadata generated for the game view image.

EEE 4. The method of any of EEEs 1-3, wherein a game spectator operates the recipient device to receive and view video game streaming of the computer game.

EEE 5. The method of any of EEEs 1-4, wherein the game view image depicts a specific visual scene, represented in the game world, selected based on spectator preferences for game content.

EEE 6. The method of any of EEEs 1-5, wherein the non-game content item is selected based on spectator preferences for non-game content.

EEE 7. The method of any of EEEs 1-6, wherein different video game streams generated by multiple cameras from the game world are streamed to different recipient devices; wherein game viewing statistics are collected, tracked and analyzed to cause changes to future video game streaming of the computer game.

EEE 8. The method of any of EEEs 1-7, wherein different game spectators operate the different recipient devices, respectively; wherein a specific game spectator, among the different game spectators, is provided with a specific non-game content item incorporated into game view images rendered to the specific game spectator; wherein the specific non-game content item is selected based at least in part on individual game viewing statistics collected for the specific game spectator.

EEE 9. The method of any of EEEs 1-8, wherein a specific game player of the computer game is provided with additional capabilities based at least in part on individual game viewing statistics collected for the specific game player.

EEE 10. The method of any of EEEs 1-9, wherein a camera is placed at a specific position and a specific direction relative to a coordinate system of the game world to capture one or more game view images from the game world; wherein the specific position and the specific direction are selected based at least in part on individual game viewing statistics collected for one of a game spectator, a game player, a game play, or a source of non-game content.

EEE 11. The method of any of EEEs 1-10, wherein the game world represents a mirror copy of a player-interacting game world.

EEE 12. The method of any of EEEs 1-10, wherein the video game stream is encoded with a sequence of consecutive game view images captured from the game world under direction of one of an automated director, a designated director or a game spectator.

EEE 13. An apparatus performing any of the methods as recited in EEEs 1-12.

EEE 14. A non-transitory computer readable storage medium, storing software instructions, which when executed by one or more processors cause performance of the method recited in any of EEEs 1-12.

EEE 15. A computing device comprising one or more processors and one or more storage media, storing a set of instructions, which when executed by one or more processors cause performance of the method recited in any of EEEs 1-12.