


Title:
STATE SHARE AND USE BETWEEN GAMEPLAY VIDEO AND GAME
Document Type and Number:
WIPO Patent Application WO/2021/066848
Kind Code:
A1
Abstract:
The disclosed subject matter receives a first game state of one or more game states that are associated with a game. Each game state includes an attribute that defines a condition of a game and an identification value that defines when the state occurred during gameplay. The first game state is assigned to a first portion of a video that is based on captured gameplay of the game. A request is received to launch the game. A selected game state is determined from the one or more game states. The game is launched based on the selected game state.

Inventors:
DANIELS MIKE (US)
PETERSON STACEY (US)
CHEUNG SAM (US)
HSIAO CATHERINE (US)
Application Number:
PCT/US2019/054891
Publication Date:
April 08, 2021
Filing Date:
October 04, 2019
Assignee:
GOOGLE LLC (US)
International Classes:
A63F13/497; A63F13/48; A63F13/533; A63F13/86
Foreign References:
US20180001216A12018-01-04
GB2557976A2018-07-04
EP2014342A12009-01-14
EP2656889A12013-10-30
Attorney, Agent or Firm:
DAVIDSON, Ryan S. et al. (US)
Claims:
CLAIMS

1. A computer-implemented method comprising: receiving a first game state of a plurality of game states, each game state of the plurality of game states comprising: an attribute that defines a condition of a game; and an identification value that defines when the state occurred during gameplay; assigning the first game state to a first portion of a video, the video based on captured gameplay of the game; receiving a request to launch the game; determining a selected game state from the plurality of game states; and launching the game based on the selected game state.

2. The method of claim 1, wherein the game is launched using the selected game state to incorporate the condition defined by the attribute in the game.

3. The method of claim 1, wherein the request to launch the game is received from a viewer of the video.

4. The method of claim 1, wherein: the request to launch the game is based on a selected frame or a selected timestamp of the video; and the determining the selected game state from the plurality of game states is based on comparing the selected frame or the selected timestamp with the identification values of one or more of the plurality of game states.

5. The method of claim 1, wherein the attribute is a player position, a game level, a game map, a player inventory, a player status, a location of an adversary, a location of an object, or a progress point.

6. The method of claim 1, wherein only a single game state of the plurality of game states is assigned to the video at a time.

7. The method of claim 1, further comprising: assigning a second game state of the plurality of game states to a second portion of the video.

8. The method of claim 1, further comprising: providing a video player user interface to play the video, wherein the determining the selected game state from the plurality of game states is based on the position of a playhead of the video player user interface.

9. The method of claim 1, wherein the plurality of game states are generated during gameplay of the game.

10. The method of claim 1, further comprising: discarding an old game state of the plurality of game states that reaches a threshold age when the old game state has not been assigned to the video.

11. The method of claim 1, wherein the video is livestreamed during gameplay of the game.

12. The method of claim 1, wherein the video was previously captured during gameplay of the game.

13. The method of claim 4, further comprising: receiving an additional request to launch the game based on an additional selected timestamp or an additional selected frame of the video when the game has already launched.

14. A non-transitory computer-readable medium comprising instructions operable, when executed by one or more computing systems, to: receive a first game state of a plurality of game states, each game state of the plurality of game states comprising: an attribute that defines a condition of a game; and an identification value that defines when the state occurred during gameplay; assign the first game state to a first portion of a video, the video based on captured gameplay of the game; receive a request to launch the game; determine a selected game state from the plurality of game states; and launch the game based on the selected game state.

15. The medium of claim 14, wherein the game is launched using the selected game state to incorporate the condition defined by the attribute in the game.

16. The medium of claim 14, wherein the request to launch the game is received from a viewer of the video.

17. The medium of claim 14, wherein: the request to launch the game is based on a selected frame or a selected timestamp of the video; and the determination of the selected game state from the plurality of game states is based on comparing the selected frame or the selected timestamp with the identification values of one or more of the plurality of game states.

18. The medium of claim 14, wherein the attribute is a player position, a game level, a game map, a player inventory, a player status, a location of an adversary, a location of an object, or a progress point.

19. The medium of claim 14, wherein only a single game state of the plurality of game states is assigned to the video at a time.

20. The medium of claim 14, further comprising instructions to: assign a second game state of the plurality of game states to a second portion of the video.

21. The medium of claim 14, further comprising instructions to: provide a video player user interface to play the video, wherein the determination of the selected game state from the plurality of game states is based on the position of a playhead of the video player user interface.

22. The medium of claim 14, wherein the plurality of game states are generated during gameplay of the game.

23. The medium of claim 14, further comprising instructions to: discard an old game state of the plurality of game states that reaches a threshold age when the old game state has not been assigned to the video.

24. The medium of claim 14, wherein the video is livestreamed during gameplay of the game.

25. The medium of claim 14, wherein the video was previously captured during gameplay of the game.

26. The medium of claim 17, further comprising instructions to: receive an additional request to launch the game based on an additional selected timestamp or an additional selected frame of the video when the game has already launched.

Description:
STATE SHARE AND USE BETWEEN GAMEPLAY VIDEO AND GAME

BACKGROUND

[0001] Watching others play video games has become an increasingly popular activity for gamers and non-gamers alike. Gameplay videos have dominated the ranks of popular video sharing websites and spawned Internet celebrities known for their playing skills and personalities. The reasons viewers enjoy watching video games may parallel the reasons that viewers enjoy watching conventional spectator sports. For example, viewers may watch gameplay videos based on an interest in learning more about a game, to observe and adopt strategies from skillful players, and for entertainment purposes. The gaming experience that a player presents to an observer in a video may be based on a combination of several factors, such as the player's level or progression within the game, the player profile, the player position, the player equipment, the positions of game entities and/or adversaries, and the player status, for example.

BRIEF SUMMARY

[0002] According to an embodiment of the disclosed subject matter, a computer-implemented method may include receiving a first game state of a plurality of game states. Each game state of the plurality of game states may include an attribute that defines a condition of a game and an identification value that defines when the state occurred during gameplay. The method may further include assigning the first game state to a first portion of a video. The video may be based on captured gameplay of the game. The method may further include receiving a request to launch the game based on a selected timestamp of the video. The request may be received from a viewer of the video. The method may further include determining a selected game state from the plurality of game states based on comparing the selected timestamp with the identification values of one or more of the plurality of game states. The method may further include launching the game based on the selected game state. The game may be launched using the selected game state to incorporate the condition defined by the attribute in the game. The attribute may be a player position, a game level, a game map, a player inventory, a player status, a location of an adversary, a location of an object, or a progress point. It may be that only a single game state of the plurality of game states is assigned to the video at any timestamp of the video. The method may further include assigning a second game state of the plurality of game states to a second portion of the video. The method may further include providing a video player user interface to play the video. The method may further include selecting the selected timestamp based on the position of a playhead of the video player user interface. The plurality of game states may be generated during gameplay of the game.
The method may further include discarding an old game state of the plurality of game states that reaches a threshold age when the old game state has not been assigned to the video. The video may be livestreamed during gameplay of the game. The video may be previously captured during gameplay of the game. The method may further include receiving an additional request to launch the game based on an additional selected timestamp of the video when the game has already launched.

[0003] According to an embodiment of the disclosed subject matter, a non-transitory computer-readable medium may include instructions operable, when executed by one or more computing systems, to receive a first game state of a plurality of game states. Each game state of the plurality of game states may include an attribute that defines a condition of a game and an identification value that defines when the state occurred during gameplay. The medium may further include instructions to assign the first game state to a first portion of a video. The video may be based on captured gameplay of the game. The medium may further include instructions to receive a request to launch the game based on a selected timestamp of the video. The request may be received from a viewer of the video. The medium may further include instructions to determine a selected game state from the plurality of game states based on comparing the selected timestamp with the identification values of one or more of the plurality of game states. The medium may further include instructions to launch the game based on the selected game state. The game may be launched using the selected game state to incorporate the condition defined by the attribute in the game. The attribute may be a player position, a game level, a game map, a player inventory, a player status, a location of an adversary, a location of an object, or a progress point. It may be that only a single game state of the plurality of game states is assigned to the video at any timestamp of the video. The medium may further include instructions to assign a second game state of the plurality of game states to a second portion of the video. The medium may further include instructions to provide a video player user interface to play the video. The medium may further include instructions to select the selected timestamp based on the position of a playhead of the video player user interface.
The plurality of game states may be generated during gameplay of the game. The medium may further include instructions to discard an old game state of the plurality of game states that reaches a threshold age when the old game state has not been assigned to the video. The video may be livestreamed during gameplay of the game. The video may be previously captured during gameplay of the game. The medium may further include instructions to receive an additional request to launch the game based on an additional selected timestamp of the video when the game has already launched.

[0004] Additional features, advantages, and embodiments of the disclosed subject matter may be set forth or apparent from consideration of the following detailed description, drawings, and claims. Moreover, it is to be understood that both the foregoing summary and the following detailed description are illustrative and are intended to provide further explanation without limiting the scope of the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0005] The accompanying drawings, which are included to provide a further understanding of the disclosed subject matter, are incorporated in and constitute a part of this specification. The drawings also illustrate embodiments of the disclosed subject matter and together with the detailed description serve to explain the principles of embodiments of the disclosed subject matter. No attempt is made to show structural details in more detail than may be necessary for a fundamental understanding of the disclosed subject matter and various ways in which it may be practiced.

[0006] FIG. 1 shows a block diagram of a computer system according to an embodiment of the disclosed subject matter.

[0007] FIG. 2A shows a flow diagram illustrating how game states may be determined according to an embodiment of the disclosed subject matter.

[0008] FIG. 2B shows an organization of data according to an embodiment of the disclosed subject matter.

[0009] FIG. 3 shows a flow diagram illustrating communications and interactions in the context of capturing a video or screenshot according to an embodiment of the disclosed subject matter.

[0010] FIG. 4 shows a flow diagram illustrating communications and interactions in the context of livestreaming a video according to an embodiment of the disclosed subject matter.

[0011] FIG. 5 shows a flow diagram illustrating communications and interactions in the context of publishing a captured video to a video-sharing website according to an embodiment of the disclosed subject matter.

[0012] FIG. 6A shows a flow diagram in the context of launching a game from the video sharing website according to an embodiment of the disclosed subject matter.

[0013] FIG. 6B shows a flow diagram in the context of launching a game from the video sharing website according to an embodiment of the disclosed subject matter.

[0014] FIG. 7 shows a computing device according to an embodiment of the disclosed subject matter.

[0015] FIG. 8 shows a network configuration according to an embodiment of the disclosed subject matter.

[0016] FIG. 9 shows an example network and system configuration according to an embodiment of the disclosed subject matter.

DETAILED DESCRIPTION

[0017] Gameplay videos have become increasingly popular on video-sharing platforms. Viewers may watch the videos for a variety of reasons, such as to learn more about a game, to observe successful strategies and techniques, or for purely entertainment purposes. A viewer may not be able to easily recreate the same gaming experience seen in a video because the viewer may not possess a game shown, may not have achieved the same level of progression, and/or may not possess the same player profile characteristics, items, or status as the player shown in the video. Where a player elects to live-stream a gaming session video, it may not be possible or easy for a viewer to join the same gaming session due to difficulty in finding the session, not owning or otherwise possessing the rights to launch the game, and the like.

[0018] The present subject matter discloses systems and techniques for allowing a viewer of a gaming video to launch into the game and experience the scenario seen in the video or screenshot firsthand. By using the techniques disclosed herein, a viewer may launch into a game at the same or nearly the same point in the game as shown in a video. Alternatively, or in addition, a viewer may launch into a game based on a screenshot that exhibits a game scenario.

It should be appreciated that while the following description may use examples referring to captured video, livestreamed video, and/or a captured screenshot, each may be freely substituted for the other without departing from the scope of the present subject matter.

[0019] FIG. 1 shows an example block diagram illustrating various components of a system 100. System 100 includes a video-sharing website 105. The video-sharing website 105 may allow a user to upload and publish a gameplay video having associated shared game states. Video-sharing website 105 may also allow a user to “livestream,” or otherwise stream their gameplay in real-time while emitting shared game states. Video-sharing website 105 may provide a user interface element, such as a “launch” button, to allow a viewing user to launch into a game using the emitted shared game state information to recreate the same or a similar game situation as seen in the video. Player page 160 may be a website that presents a game being played or to be played. Portal front-end 115 may be a web application that executes within a web browser and may provide a user interface to a gaming platform 175. API front-end 125 may be an interface between servers 130, 135, 140 that manage the game states and video associations and the executing game instances. Coordinator 155 may coordinate the communication of information between the game code 110, API 120, and recorder 130. API 120 may provide an interface between the game code 110 and the gaming platform 175. Game code 110 may be the code that executes to create a gaming experience for a user. Recorder 130 may record video and screenshot captures and may notify API 120 when a capture occurs. Recorder 130 may also send livestream data to the video-sharing website 105. State share server 135 may manage access to the state share game state table 250 stored in database 15. Capture server 140 may manage access to the capture media game states table 260 stored in database 15. External video server 145 may manage access to the external video game states table 270 stored in database 15. Database 15 may provide data storage and/or organization functions.

[0020] It should be appreciated that the components shown in system 100 of FIG. 1 may be implemented on a same computing device, separate computing devices, or may be virtualized while sharing or not sharing processing spaces, data storage, and network infrastructure, as well as any combination of these. Depending on the implementation of system 100, including whether two communicating components are implemented on a same or separate machine and whether they communicate within a same memory space or across a network, the components may communicate with one another via one or more of inter-process communications (IPC) and remote procedure calls (RPC). As used herein, a term “call” or “message” may refer to either an IPC or an RPC communication or other equivalent communication, depending on the implementation and location of the communicating components and/or processes.

[0021] The set of conditions that define a gaming experience or scenario may be known as a game state. Conditions may include, for example, a selected game, the level or progression within the selected game, the player’s profile or character, the player’s position or location, the player’s inventory of items or equipment, the locations of adversaries, objects, and other game entities, the player’s score, and other game elements that may be used to recreate conditions within a game.

[0022] FIG. 2A shows an example flow 200 illustrating how game states may be determined as disclosed herein. A game may emit game states periodically at regular intervals, at variable intervals, at the time of in-game events or other progress points, or at any other time that a game developer either configures a game state 210, 220, 230 to be emitted or provides an option to a player to emit a game state 210, 220, 230. During execution of a game, an active state 205 may be designated using an API call 215. A maximum number of active states may be set for any game session or captured video, but in some configurations only a single active state may be allowed at any time. In an example, an API call 215 may set State A 235 as the active state 205, followed by a call to Set State B 240 at a subsequent time, followed by a call to Clear State 245, which clears the active state 205, and followed by a call to Set State C 250. A game state 210, 220, 230 may be implemented using a data structure that contains one or more of a start timestamp, an end timestamp, an associated state “blob” 253 of game-specific data, and metadata 254 that describes when the state blob 253 was emitted and the title and/or version of the game that emitted the state blob 253.
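The Set State / Clear State flow described above might be sketched as follows. This is a minimal illustration, not the application's implementation; names such as `GameState` and `StateManager` are hypothetical, and the start/end timestamps follow the data structure described in the paragraph above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GameState:
    """A game state as described above: a game-specific state blob plus
    metadata, bounded by start and end timestamps while it was active."""
    blob: bytes                            # game-specific state data
    metadata: dict                         # e.g. game title/version, emit time
    start_timestamp: float = 0.0           # when the state became active
    end_timestamp: Optional[float] = None  # when cleared or replaced

class StateManager:
    """Tracks a single active state; setting a new state ends the old one."""
    def __init__(self):
        self.active: Optional[GameState] = None
        self.history: list = []

    def set_state(self, blob: bytes, metadata: dict, now: float) -> GameState:
        if self.active is not None:
            self.active.end_timestamp = now  # replacing a state ends it
        state = GameState(blob, metadata, start_timestamp=now)
        self.active = state
        self.history.append(state)
        return state

    def clear_state(self, now: float) -> None:
        if self.active is not None:
            self.active.end_timestamp = now
            self.active = None
```

A usage sequence mirroring the example above would call `set_state` for State A, `set_state` for State B (ending State A), `clear_state` (ending State B), and then `set_state` for State C.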

[0023] A game developer may provide a set of one or more localized strings describing the data to be encoded into a state blob 253. For example, the developer may provide strings that provide game-related information such as, “your character’s location in the game,” “level that [user ID] designed,” “your racetrack ghost,” and “the randomly-generated dungeon you previously visited.” The localized strings may be stored in the state share server 135 and may be presented to the user via the user interface through the API front-end 125 while presenting a prompt to confirm consent to share game states emitted by a particular game. The state blob 253 may be a byte array of any size, such as four kilobytes. The state blob 253 may encode the game state that a game may use to initialize and restore a game state when launched by a player using the game state containing the state blob. The state blob 253 may include data attributes that define one or more conditions within the game, such as the position of game entities, status, inventory, and progression. The state blob 253 may contain data that is emitted by the game code 110 and that may be read back by the game code 110. The start timestamp of a game state 210, 220, 230 may be set at the time that the Set State 235, 240, 250 API call is made with the associated state’s state blob 253. Correspondingly, a state’s end timestamp may be set at the time that the active state 205 is either cleared using a Clear State 245 API call or set to a new active state 205 using the Set State 235, 240, 250 API call. As shown in FIG. 2A, a game may emit game states 210, 220, 230 using API calls 215 at various intervals as often as desired, and the emissions may not necessarily be continuous.
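One way a state blob with a fixed size limit could work is sketched below. The JSON encoding and the helper names are assumptions for illustration; the application only specifies that the blob is a byte array of some size (four kilobytes in the example) that the game writes and later reads back.

```python
import json

MAX_BLOB_SIZE = 4096  # e.g. the four-kilobyte limit mentioned above

def encode_state_blob(attributes: dict) -> bytes:
    """Encode game-defined attributes (position, inventory, progression, ...)
    into a compact byte array that the game can later read back."""
    blob = json.dumps(attributes, separators=(",", ":")).encode("utf-8")
    if len(blob) > MAX_BLOB_SIZE:
        raise ValueError(f"state blob exceeds {MAX_BLOB_SIZE} bytes")
    return blob

def decode_state_blob(blob: bytes) -> dict:
    """Restore the attributes when a game is launched from a shared state."""
    return json.loads(blob.decode("utf-8"))
```

In practice a game would likely use its own binary format rather than JSON; the point is only that the blob round-trips opaque, game-specific data within a bounded size.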

[0024] In some embodiments, game states 210, 220, 230 including a state blob 253 may be emitted at a maximum rate, for example, every five seconds on average. The API 120 may limit the rate at which games may emit game states. Where a game attempts to exceed the limit, the game may be notified that it has exceeded the rate limit to allow the game to take an appropriate action, such as backing off with a timeout, for example. Game states may be buffered using API 120, as subsequently described. The API 120 buffer may discard past game states older than a threshold age when a user is neither livestreaming video nor saving a video capture. In an example, the threshold age may be thirty seconds. API 120 buffer may not discard an active game state 205, regardless of when it was created, because the active game state 205 may be the only game state eligible to be associated if the user begins a video capture or livestream.
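The rate limiting and age-based discarding described in the paragraph above could be sketched roughly as follows. The class name and the interval/age values are illustrative; the description gives five seconds and thirty seconds only as examples.

```python
from collections import deque

class StateBuffer:
    """Buffers emitted game states, rejecting emissions above a maximum
    rate and discarding unassigned states older than a threshold age."""
    def __init__(self, min_interval: float = 5.0, max_age: float = 30.0):
        self.min_interval = min_interval   # e.g. one state per five seconds
        self.max_age = max_age             # e.g. a thirty-second threshold
        self.states: deque = deque()       # (timestamp, blob) pairs
        self.last_emit: float = float("-inf")

    def emit(self, blob: bytes, now: float) -> bool:
        """Return False so the game can back off when the rate is exceeded."""
        if now - self.last_emit < self.min_interval:
            return False
        self.last_emit = now
        self.states.append((now, blob))
        return True

    def prune(self, now: float) -> None:
        """Drop states past the age threshold, but always keep the most
        recent one: it is the active state and stays eligible for
        association if a capture or livestream begins."""
        while len(self.states) > 1 and now - self.states[0][0] > self.max_age:
            self.states.popleft()
```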

[0025] FIG. 2B illustrates an example of tables 250, 260, 270, 280 that may be stored in database 15 and referred to in the succeeding description of the present subject matter. A first state share game state table 250 may store information about each game state, including the associated state blobs 253. State share game state table 250 may include one or more fields including a user ID 251 of the user who made the capture or is making the livestream video, a system-wide unique shared game state ID 252, and the state blob 253 as previously described. The state share game state table 250 may also include a metadata field 254 that may include the ID of the application that created the shared game state, the version of the application that created the shared game state, a timestamp of the earliest moment that the game state was active, and a timestamp of the moment that the game state was no longer active. A capture media game states table 260 may store associations between a capture and the game states that were active at specific timestamp ranges during the capture. Capture media game states table 260 may include the user ID 251, a media ID 262 of the capture or livestream video, the shared game state ID 252, and association data 264 that may describe the association between the capture media and the shared game state. An external video game states table 270 may store associations between an external video and the game states that were active at specific timestamps during the video. The external video game states table 270 may include the user ID 251, an external video ID 272 that may correspond to the identifier of the video as referenced by the video-sharing website 105, a video type 273 field that may indicate whether the video is a captured and uploaded video or a livestream, the shared game state ID 252, and association data 275 that may describe the association between the external video and the shared game state.
A pending shared game states table 280 may include the user ID 251, the shared game state ID 252, and a creation timestamp that may indicate when this row was created.
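One possible rendering of the four tables described above is sketched below using an in-memory SQLite database. The table and column names are illustrative translations of the fields in the description (the application does not specify a schema or a storage engine).

```python
import sqlite3

# In-memory sketch of the tables described for database 15.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE state_share_game_states (      -- table 250
    user_id TEXT,
    shared_game_state_id TEXT PRIMARY KEY,  -- system-wide unique ID 252
    state_blob BLOB,                        -- state blob 253
    metadata TEXT);                         -- metadata field 254
CREATE TABLE capture_media_game_states (    -- table 260
    user_id TEXT, media_id TEXT,
    shared_game_state_id TEXT, association_data TEXT);
CREATE TABLE external_video_game_states (   -- table 270
    user_id TEXT, external_video_id TEXT, video_type TEXT,
    shared_game_state_id TEXT, association_data TEXT);
CREATE TABLE pending_shared_game_states (   -- table 280
    user_id TEXT, shared_game_state_id TEXT, creation_timestamp REAL);
""")
```

A lookup for launch would then join a video's rows in one of the association tables against `state_share_game_states` by `shared_game_state_id` to recover the blob for the selected timestamp.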

[0026] A video and/or screenshot may contain metadata that allows a viewer to launch into a game at a similar point as shown in the video or screenshot. A state-sharing application programming interface (API) may be provided to game developers to emit the metadata from their games during gameplay, as well as to allow a game to be launched responsively to the metadata. The metadata emitted from the game may then be associated with videos of gameplay or screenshots that are captured and posted to various video-sharing platforms and websites. Where a video includes the associated metadata, the video-sharing platform or website may provide additional functionality, including a selectable input that allows a viewer to launch into a game at a similar point or under similar conditions as shown in the video.

[0027] The metadata that may be emitted during gameplay of a game may define the attributes of a game state. In an example, the metadata emitted by an instance of a game recorded to video or captured in a screenshot by a first player may be subsequently utilized by a second instance of the same game when launched by a second player in order to recreate the game at a similar or same point. The degree of similarity between the game conditions experienced by the first player and the game conditions of the second player may be configurable by the game developer, the player who records the video, and/or the player who launches into the game based on viewing the video. For example, a player may configure that a recorded gameplay video may include metadata to share the current level and inventory, but not the player’s character or avatar. Metadata defining a game state may be automatically generated during gameplay for games that support the state-sharing feature and may automatically be associated with any video captured or live-streamed by a player. Metadata generation and association with video may not be required in order to share video or live-stream gameplay video, and the player may disable the same if desired.

[0028] Game developers may use the state-sharing API in a variety of ways. In an embodiment, a developer may use the API to configure a set of gameplay conditions to create a scenario that may or may not occur during normal gameplay. For example, in the context of a baseball game, a developer could configure a scenario where the player may launch into the bottom of the ninth inning with bases loaded and two outs. In an embodiment, a game developer may use the API to establish a player location within a world of a sandbox-style game such that the viewer may launch directly to the geographical location seen in the video. In an embodiment, a game developer may make equipment or other items available through viewing a video that may or may not be obtainable during normal gameplay. For example, as part of a promotion, a game developer may make a soft drink available as a power-up in a game when launched during an advertisement for the soft drink. Similarly, a game developer may make levels or maps available through viewing a video that may or may not be playable during normal gameplay. It should be appreciated that example scenarios, items, equipment, levels, maps, racing track “ghosts,” and adversaries may be provided from a game developer’s database of user-generated content.

[0029] FIG. 3 shows an example flow 300 illustrating communications and interactions between various components of the present subject matter in the context of capturing a video or screenshot while executing game code 110. As shown in FIG. 3, executing game code 110 may make an API call 215 to Set State, as in S305, to designate an active state 205 and to clear the active state 205 using a Clear State, as in S310, as previously described with respect to FIG. 2A. The active state 205, when set in S305, may be associated with a current timestamp. The API 120 may maintain a buffer or other memory storage of shared game states that have been recently emitted by game code 110. The buffered game states may be associated automatically with a subsequent video capture or screenshot when the player creates one. The buffer associated with and controlled by API 120 may only buffer states that may possibly be included in a video capture or livestream. Any game states older than a threshold time may be eliminated from the API 120 buffer of game states. The threshold may be calculated by summing a maximum duration of video capture and a maximum time a video capture may take to create and upload.

On the other hand, a game state that exceeds the threshold age but has not been replaced by a newer state may be retained because it may be the active state 205. A user may initiate a video capture or screenshot during gameplay, which may occur during execution of game code 110. During the timespan of the video capture, the API 120 may store and associate any and all game states that are active with the video capture. The API 120 may be unaware that a player has initiated a video capture. Therefore, recorder 130 may transmit an IPC message S320 to the API 120 to provide a notification that a video capture has occurred and/or completed. The IPC message S320 may include a capture ID 262, a start timestamp, and a duration assigned for the video capture. Alternatively, or in addition, as used herein, a start timestamp and a duration may be equivalently replaced by a start timestamp and end timestamp; either may be used to determine the overall timespan of the associated game state. Alternatively, or in addition, other techniques for identifying portions of a video capture with which one or more game states may be associated may be employed. For example, a video capture may include metadata that defines the game states associated with frames or spans of frames of the video capture without reference to timestamps, timespans, or time in general. In response to the IPC message sent in S320, API 120 may search its buffer to find the game states that were active during the timespan of the video capture, where a timespan may be determined by adding a duration of the state to a timestamp that indicates when the state became the active state 205. Game states that were active during the timespan of the video capture may be associated with the video ID 272 of the video capture. 
API 120 may also determine the previously active game state just prior to the video capture, in case the user initiating the video capture was slow or delayed in initiating the capture, or where the game allows a player to produce a video capture of a gameplay event that began in the past. For example, in a racing game, a player may complete a race and subsequently determine, based on his or her performance, that he or she would like to capture and save the newly-completed race and share the game state from the time the race began. The API 120 may then compose an RPC request to API front-end 125 to save and associate the relevant shared game state data with the recently captured video or screenshot in S325. Relevant shared game state data may include, for example, the capture ID 262 received in S320 from recorder 130, the start timestamp, the duration of the video capture, and the state blob 253 that may be used by the game code 110 to recreate the shared game state.
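The selection step described above — finding the states active during the capture's timespan, plus the previously active state in case the capture was initiated late — might look roughly like this sketch. The `(start, end, blob)` tuple layout is an assumption, with an end of `None` standing for a still-active state.

```python
def states_for_capture(states, capture_start, capture_end):
    """Select the buffered states that were active during a capture's
    timespan, plus the state active just before the capture began."""
    selected = []
    previous = None
    for start, end, blob in states:          # sorted by start timestamp
        effective_end = end if end is not None else float("inf")
        if start <= capture_end and effective_end >= capture_start:
            selected.append(blob)            # overlaps the capture timespan
        elif effective_end < capture_start:
            previous = blob                  # latest state before the capture
    if previous is not None:
        selected.insert(0, previous)         # the capture may have begun late
    return selected
```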

[0030] In an example, Gary Gamer may play a game by executing game code 110. During execution of game code 110 and as defined by the game developer, game states 210, 220, 230 may be emitted during S305 at varying intervals. Game states may be stored in the state share game state table 250, which may include the state blob 253 that may define game conditions of a game state, as well as a shared game state ID 252 and metadata 254. Gary Gamer may initiate a capture of a screenshot or video during S320. In response, the game state(s) 210, 220, 230 determined to be active during the video capture may be exported in S325 and stored in S350 with an association established in S345 with the captured video.

[0031] In response to receiving the RPC request, the API front-end 125 may decompose the RPC request into two backend calls. First, API front-end 125 may call the state share server 135 to save the shared game states and their associated state blobs 253 in S330. State share server 135 may control reads and writes to the state share game states table 250 and state blobs 253 located in database 15. The state share server 135 may store the shared game state blobs 253 in database 15 in a subordinate or child table of a user table that stores a list of users in S335. Each video and/or screenshot capture may therefore be associated with a user. Database 15 may assign the user ID of the user that created the video capture to the shared game states stored within the subordinate table. State share server 135 may transfer the state IDs 252 to API front-end 125 in S340. State IDs 252 may be generated for each of the state blobs 253. Second, API front-end 125 may then make a call to capture server 140 to request that the saved game states be associated with the video or screenshot capture in S345, passing one or more of the capture ID 262, the state IDs 252 that were active during the video capture or screenshot, the starting timestamp, and the duration of each state ID 252 while it was active. In S350, the capture server 140 may write the associations between the shared game states and video capture in a capture media game states table 260. The game code 110 may or may not receive a notification that its previously-emitted game states were saved in one or more of the state share server 135, capture server 140, and database 15.
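The two-call decomposition in [0031] can be sketched with in-memory stand-ins for state share server 135 and capture server 140; the class and method names are illustrative assumptions, not the disclosed interfaces.

```python
class FakeStateShareServer:
    """In-memory stand-in for state share server 135."""
    def __init__(self):
        self.blobs = {}          # state ID 252 -> state blob 253
        self._next_id = 1

    def save_states(self, blobs):
        # Persist each blob and generate a state ID for it (S330/S340).
        ids = []
        for blob in blobs:
            sid = self._next_id
            self._next_id += 1
            self.blobs[sid] = blob
            ids.append(sid)
        return ids

class FakeCaptureServer:
    """In-memory stand-in for capture server 140."""
    def __init__(self):
        self.table = []          # capture media game states table 260

    def associate(self, capture_id, associations):
        self.table.extend(associations)   # S350

def save_capture_states(state_share, capture, capture_id, states):
    """Decompose one save request into the two backend calls: (1) persist
    the state blobs, (2) associate the returned state IDs with the capture."""
    state_ids = state_share.save_states([s["blob"] for s in states])
    capture.associate(capture_id, [
        {"capture_id": capture_id, "state_id": sid,
         "start": s["start"], "duration": s["duration"]}
        for sid, s in zip(state_ids, states)])
    return state_ids
```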

[0032] FIG. 4 shows an example flow 400 illustrating communications and interactions between various components of the present subject matter in the context of livestreaming a video while executing game code 110. A livestream may be viewed, for example, on a video sharing website 105 either during the real-time streaming of gameplay or as a published video capture when the livestream ends. A user may begin a livestream while playing a game that corresponds to the execution of game code 110. Beginning the livestream may cause an RPC to be transmitted from API front-end 125 to the API 120 indicating that the livestream has started in S420. Until the livestream ends, the API 120 may send all emitted game states in one or more batches to the API front-end 125 via RPCs, as in S425. The emitted game states may include one or more of a stream ID, starting timestamp, duration, and the associated state blobs 253. The API front-end 125 may save the shared game states and their associated state blobs in the state share server 135 in S430. The state share server 135 may store the shared game state blobs 253 in database 15 in a subordinate or child table of a table that stores a list of users in S435. State share server 135 may transfer the state IDs 252 to API front-end 125 in S440. State IDs 252 may be generated for each of the state blobs 253. The API front-end 125 may subsequently make a call to the external video server 145 to associate the newly-written shared game states with the stream ID of the livestream in S445. The stream ID may become the video ID 272 of a video-on-demand that the video sharing website 105 may automatically create when a livestream ends. In S450, the external video server 145 may write the associations between the shared game states and the livestream stream ID into an external video shared game states table 270. Upon termination of the livestream, the API 120 may receive an RPC notification in S415 indicating the same from API front-end 125. 
In response, API 120 may transmit to API front-end 125, in an RPC, any remaining game states that have not already been exported with a previous batch of emitted game states in S425.
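The batching behavior of flow 400, including the final flush of states not yet exported when the livestream ends, might be sketched as follows; the class name, batch size, and callback are assumptions.

```python
class LivestreamStateBatcher:
    """Batches game states emitted during a livestream, with a final
    flush when the stream ends so no state is left unexported."""

    def __init__(self, send_batch, batch_size=3):
        self.send_batch = send_batch   # stand-in for the RPC in S425
        self.batch_size = batch_size
        self.pending = []

    def emit(self, state):
        self.pending.append(state)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        # Also called on livestream termination (S415), so any states not
        # already exported with a previous batch are still transmitted.
        if self.pending:
            self.send_batch(list(self.pending))
            self.pending.clear()
```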

[0033] In an example, Gary Gamer may continue to play the game embodied by game code 110 when he decides to begin livestreaming his gameplay in S420. In response, the active game states 210, 220, 230 during the livestream may be exported in one or more batches during S425. In this way, viewers of Gary’s livestream may be able to launch into Gary’s game at the instant gameplay moment or at any past gameplay moment for which one or more game states 210, 220, 230 have been exported and stored in database 15.

[0034] FIG. 5 shows an example flow 500 illustrating interactions between various components of the present subject matter in the context of publishing a captured video having shared game states to a video sharing website 105. Flow 500 may begin when a user elects to publish a captured video to a video sharing website 105 from a portal front-end 115. Portal front-end 115 may be a web application having a user interface that executes within a web browser. In response to the user’s request to publish the captured video, portal front-end 115 may transmit a request to API front-end 125 in S505, which may forward the request to capture server 140 in S510. Capture server 140 may upload the captured video to the video sharing website 105 in S515 and may set a flag that indicates the video may have shared game states associated with it. In response, the video sharing website 105 may return a video ID 272 to the capture server 140 in S520 that may be used to subsequently locate the video on the video sharing website 105. Capture server 140 may forward the video ID 272 to the API front-end 125 in S525. By referencing the capture ID 262 of the video capture, API front-end 125 may make a request to capture server 140 for all of the shared game state association data 264 related to the video capture being uploaded to the video-sharing website 105 in S530. The shared game state association data 264 may include the identifiers of the shared game states, the timestamps at which they become active, the duration, the state blob 253, and the metadata 254, as previously described. Capture server 140 may provide this data to the API front-end 125 in S535. The API front-end 125 may then call the external video server 145 in S540 both to notify it of the video ID 272 of the newly uploaded video capture to the video-sharing website 105 and to provide the shared game state association data 264 received from capture server 140.
The shared game state association data 264 may be copied from the capture media game states table 260 stored in database 15, as previously described. External video server 145 may write the video ID 272 from the video-sharing website 105 as well as the shared game states to tables stored in database 15 during S545 and S555.

[0035] In an example, Gary Gamer may publish a captured video to a video-sharing site 105 during S505, which may include uploading a previously-saved video in S515. Where game state information exists for the captured video, API front-end 125 may retrieve the game state associations from capture server 140 in S530. With the shared game state association data received in S535, the published video may be associated with the shared game states and stored in database 15 such that Vickie Viewer, who views Gary Gamer’s published video on the video sharing website 105 may have the option to launch into the game at any of the associated shared game states.

[0036] As previously described, gameplay videos uploaded to the video-sharing website 105 may have one or more associated shared game states. These videos may be flagged with an indicator in a database of the video-sharing website 105 that may cause the video-sharing website 105 to display a user interface element that allows a user to launch into the game at one of the associated shared game states. The interface element may be, for example, an image, such as a button, icon, photo, emoticon, or animation, a textual string, a uniform resource identifier (URI), a hyperlink, or a combination of image and text. For example, the interface element may be a button that displays the text, “Launch.” The interface element may be dynamically enabled and disabled depending on whether there is a shared game state associated with a currently-playing video. Alternatively, or in addition, the interface element may be dynamically enabled and disabled based on whether there is a shared game state associated with the timestamp of the current position of the “playhead,” or stated another way, the current marker position on the “seek bar” of the video player interface, which indicates a current location of a gameplay video being played, such as in a web browser or other interface capable of displaying a gameplay video. Alternatively, or in addition, the interface element may be always enabled and configured to direct the user to a URI when clicked or otherwise activated by the user based on the associated video ID 272 and timestamp. Once clicked, system 100 may check the video ID and timestamp for a corresponding game state and, if one is available, allow the user to launch into the game state. If no game state is associated with the video ID at the current timestamp, the user may be presented with an error message or presented with additional options, such as to play the game from another state, to purchase the game, or to play a different game, for example.
Alternatively, or in addition, if no game state is associated with the video ID at the current timestamp, the game may be launched into a next-nearest, next-subsequent-nearest, or next-previous-nearest game state to the current timestamp.
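The lookup described in [0036], including the fallback when no state exactly matches the playhead timestamp, might be sketched like this. The `(start, end, state_id)` tuple layout is an assumption, and only the simple nearest-state variant of the fallback is shown.

```python
def state_for_timestamp(states, timestamp):
    """Return the game state whose timespan contains `timestamp`; if none
    matches, fall back to the nearest state (or None if there are none)."""
    nearest = None
    nearest_distance = None
    for start, end, state_id in states:
        if start <= timestamp <= end:
            return state_id                # timestamp falls within the span
        distance = min(abs(timestamp - start), abs(timestamp - end))
        if nearest_distance is None or distance < nearest_distance:
            nearest, nearest_distance = state_id, distance
    return nearest                         # nearest-state fallback
```

A caller that receives `None` back could then present the error message or the additional options described above.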

[0037] FIG. 6A shows an example flow 600 illustrating interactions between various components of the present subject matter in the context of launching a game from the video sharing website 105. In an embodiment, a user may be watching a gameplay video of a game on the video-sharing website 105 and decide to launch the game into the same game state at the same or at a relatively similar point as displayed in the gameplay video. Clicking or otherwise activating the user interface element to launch the game may redirect the user to a player web page or equivalent interface 160 that presents the game to be played in S605. The player page 160 may be assigned to a URL that may include parameters including the video ID of the video that was being viewed and the timestamp at the current position of the playhead or seek bar marker when the user interface element was clicked to launch the game. The player page 160 may transmit a start game RPC to the portal front-end 115 in S610, passing the video-sharing website 105 video ID and timestamp passed in S605. In S615, the portal front-end 115 may transmit an “initialize game” call passing the video ID and timestamp to the API front-end 125 to prepare to initialize and execute the game on a virtual machine. The API front-end 125 may encode the game state into a URI that may be passed via an RPC to a coordinator 155 in S620. Coordinator 155 may provide initial game state keys into the API 120 during game startup. In S625, the coordinator 155 may pass the URI to the API 120 via an “initialize connection” RPC. When the user connects to the game, the API 120 may make an RPC to the API front-end 125 in S630 to retrieve the game state ID 252 associated with the URI received in S625. The API front-end 125 may look up the game state ID 252 associated with the URI and retrieve the state blob 253 associated with the game state ID 252 in S640. In S645, the API front-end 125 may then transmit the retrieved state blob 253 to the API 120.
State blob 253 may then be available to the game code 110 in S655 in response to a call to retrieve the startup game state, as shown in S650.
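The lookup chain of S630 through S655 might be sketched as two dictionary lookups standing in for the API front-end’s stores; the data structures are illustrative assumptions, not the disclosed interfaces.

```python
def resolve_startup_state(uri_to_state_id, state_blobs, uri):
    """Resolve a launch URI to the state blob handed to the game code at
    startup: URI -> game state ID 252 -> state blob 253."""
    state_id = uri_to_state_id.get(uri)   # look up the game state ID (S630/S640)
    if state_id is None:
        return None                       # no shared state for this URI
    return state_blobs.get(state_id)      # retrieve the associated blob (S640)
```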

[0038] FIG. 6B shows an alternate example flow 650 illustrating interactions between various components of the present subject matter in the context of launching a game from the video-sharing website 105. Example flow 650 includes many of the same interactions between components as shown in example flow 600. To aid in highlighting the differences between flow 600 and 650, some components and interactions that may be otherwise present in flow 650 are omitted from FIG. 6B. In an embodiment, following S615 (not shown) of flow 650, API front-end 125 may encode the video ID and timestamp into a URI that may be passed via an RPC to a coordinator 155 in S621. Coordinator 155 may provide initial game state keys into the API 120 during game startup. In S626, the coordinator 155 may pass the URI to the API 120 using an “initialize connection” RPC. When the user connects to the game, the API 120 may make an RPC to the API front-end 125 passing the URI in S631. The API front-end 125 may decode the URI to reveal the video-sharing website 105 video ID and timestamp in S641. API front-end 125 may then make an RPC to the external video server 145 in S642 to retrieve the shared game state ID 252 associated with the decoded video ID and timestamp. In S643, external video server 145 may compare the timestamp of the video capture assigned to the video ID with the metadata 254 assigned to the shared game state ID 252 in state share game state table 250 to determine if the timestamp of the video falls within the timespan defined by the start timestamp and end timestamp within the metadata 254 of state share game state table 250. If a game state exists for the video ID for which the timestamp falls within the timespan assigned to the game state, the external video server 145 may identify the associated game state ID 252 and make an RPC to the API front-end 125 to retrieve the game state blob for the identified game state ID in S644.
In S645, as previously-discussed with reference to FIG. 6A, the API front-end 125 may then transmit the retrieved state blob to the API 120.
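The containment check of S643 might be sketched as follows, with a list of dicts standing in for state share game state table 250; the field names are assumptions.

```python
def find_state_for_video(table, video_id, timestamp):
    """Find the shared game state assigned to `video_id` whose
    [start, end] timespan contains `timestamp` (None if there is none)."""
    for row in table:
        # Both bounds are inclusive: a timestamp exactly at the start or
        # end of a state's timespan still matches that state.
        if (row["video_id"] == video_id
                and row["start"] <= timestamp <= row["end"]):
            return row["state_id"]
    return None
```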

[0039] In an example, Vickie Viewer may view Gary Gamer’s published gameplay video on video-sharing website 105. Should Vickie decide that she would like to launch into the game at the same scenario and/or under the same game conditions shown in the video, she may select a “launch” button to do so. In response, the video-sharing website 105 may pass the video ID of Gary’s video along with the current timestamp at S605 to the player page 160. Player page 160 may call to start the game in S610 while passing the video ID and timestamp to the portal front-end 115. Portal front-end 115 may call API front-end 125 to initialize the game while continuing to pass the video ID and timestamp in S615. At this point, the process may proceed according to either flow 600 or flow 650 to continue with either S620 or S621, although elements of either flow may be combined together, appended, removed, and/or otherwise modified as desired. In the case of either flow 600 or 650, the process may retrieve the associated active game state at the timestamp at which Vickie pressed the launch button for the game shown and associated with Gary’s video. Once the active game state is determined by its ID in S630/S643, the associated state blob 253 may be retrieved in S640/S644 from state share game state table 250. API 120 may subsequently pass the state blob 253 to game code 110 where it can be utilized to recreate the same or similar game conditions as shown in Gary’s video for Vickie to experience in her game.

[0040] A user may select new states from the video-sharing website 105 without ending his or her game session. In an embodiment, a user may launch a game from a video associated with a first game state. The user may play the game from the first game state for a period of time, then switch back to the video-sharing website 105 to select a second game state to play. API 120 may allow the game code 110 to be re-initialized to a new game state without the inconvenience of ending and restarting the game. Similarly, a user may queue two or more game states using the user interface of the video-sharing website 105 to play consecutively without ending and restarting the game. The API 120 may direct game code 110 to transition from the initially-loaded game state to subsequent game states based on the expiration of a predetermined time, the occurrence of an in-game event, or otherwise based on the user simply requesting to shift to the next queued game state. Each time a game is launched with a shared game state, the API front-end 125 may check the external video game states table 270 stored in database 15 to verify that the shared game state is associated with either a video capture or livestream video type 273 having an associated external video ID 272.
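The queued-state behavior of [0040] might be sketched as follows; the class is an illustrative assumption, with the transition trigger (timer expiry, in-game event, or user request) left to the caller.

```python
class GameStateQueue:
    """Plays two or more queued game states consecutively without
    ending and restarting the game."""

    def __init__(self, states):
        self.queue = list(states)
        self.current = None

    def start(self):
        # Load the initial game state.
        self.current = self.queue.pop(0) if self.queue else None
        return self.current

    def next_state(self):
        # Called on a timer, an in-game event, or a user request; the
        # game is re-initialized to the next state, not restarted.
        if self.queue:
            self.current = self.queue.pop(0)
        return self.current
```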

[0041] A game state not associated with either a previously-shared video capture or livestream may not be used to launch a game unless the user who created the game state consents. A user that captures gameplay video or livestreams may have the option whether to enable sharing game states with the video or livestream, as well as the option to withdraw the sharing of game states from a previously-published video. Where a user withdraws consent, the database 15 may remove the rows of the external video game states table 270 that associate the captured or livestreamed video with the shared game states. An RPC may also be provided from the external video server 145 that may allow a user to remove the rows in external video game states table 270 associated with a video according to its video ID on the video-sharing website 105. Should the user withdraw consent, the video-sharing website 105 may also be notified to clear the flag indicating a published video has shared game states associated with it, thereby disabling the user interface element that would otherwise be used to launch into a game using the shared game state. Without consenting to the sharing of game states associated with the captured or livestreamed video, a user may still publish a video to video-sharing website 105, but another user may be unable to launch into a shared game state from the published video or livestream. Users may also have the option to see the information stored in a shared game state emitted by a game.
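The consent-withdrawal step of [0041] might be sketched as follows; the row list and flag mapping are illustrative stand-ins for external video game states table 270 and the video-sharing website’s shared-states flag.

```python
def withdraw_state_sharing(external_video_table, video_site_flags, video_id):
    """Remove the rows associating a video with its shared game states
    and clear the video-sharing website's flag for that video."""
    # Remove the rows of external video game states table 270 for this video.
    remaining = [row for row in external_video_table
                 if row["video_id"] != video_id]
    # Clearing the flag disables the launch interface element for the video.
    video_site_flags[video_id] = False
    return remaining
```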

[0042] Shared game states may appear in a user-visible “library” of shared game states. The shared game states appearing in the library may be displayed as part of a promotion, based on the user’s indicated interests, past gameplay history, subscriptions to video channels, and/or recommendations and shares from other users. The library may appear, for example, on one or more of the video-sharing website 105, the player page 160, and the portal front-end 115. A user may share a game state directly with other players via a URL. Based on permissions granted by a user, a game of interest to the user may be permitted to share game states into a user’s shared game state library with additional game-provided data such as a name, description, screenshot, and/or video preview.

[0043] In situations in which the systems discussed here collect personal information about users, or may make use of personal information, the users may be provided with an opportunity to control whether programs or features collect user information (e.g., information about a user’s social network, social actions or activities, profession, a user’s preferences, or a user’s current location), or to control whether and/or how to receive content from the content server that may be more relevant to the user. In addition, certain data may be treated in one or more ways before it is stored or used, so that personally identifiable information is removed or limited. For example, a user’s identity may be treated so that no personally identifiable information can be determined for the user, or a user’s geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined. Conversely, where the user is interacting with known friends or acquaintances within a game or party, some or all personal information may be made selectively accessible by the other users. Thus, the user may have control over how information is collected about the user and used by a system as disclosed herein.

[0044] Embodiments of the presently disclosed subject matter may be implemented in and used with a variety of component and network architectures. FIG. 7 is an example computing device 20 suitable for implementing embodiments of the presently disclosed subject matter. The device 20 may be, for example, a desktop or laptop computer, gaming console, gaming server, set-top box, or a mobile computing device such as a smart phone, tablet, or the like. The device 20 may include a bus 21 which interconnects major components of the computing device 20, such as a central processor 24, a memory 27 such as Random Access Memory (RAM), Read Only Memory (ROM), flash RAM, or the like, a user display 22 such as a display screen, a user input interface 26, which may include one or more controllers and associated user input devices such as a keyboard, mouse, touch screen, and the like, a fixed storage 23 such as a hard drive, flash storage, and the like, a removable media component 25 operative to control and receive an optical disk, flash drive, and the like, and a network interface 29 operable to communicate with one or more remote devices via a suitable network connection.

[0045] The bus 21 allows data communication between the central processor 24 and one or more memory components, which may include RAM, ROM, and other memory, as previously noted. Typically, RAM is the main memory into which an operating system and application programs are loaded. A ROM or flash memory component can contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with peripheral components. Applications resident with the computer 20 are generally stored on and accessed via a computer readable medium, such as a hard disk drive (e.g., fixed storage 23), an optical drive, floppy disk, or other storage medium.

[0046] The fixed storage 23 may be integral with the computer 20 or may be separate and accessed through other interfaces. The network interface 29 may provide a direct connection to a remote server via a wired or wireless connection. The network interface 29 may provide such connection using any suitable technique and protocol as will be readily understood by one of skill in the art, including digital cellular telephone, WiFi, Bluetooth(R), near-field, and the like. For example, the network interface 29 may allow the computer to communicate with other computers via one or more local, wide-area, or other communication networks, as described in further detail below.

[0047] Many other devices or components (not shown) may be connected in a similar manner (e.g., document scanners, digital cameras and so on). Conversely, all of the components shown in FIG. 7 need not be present to practice the present disclosure. The components can be interconnected in different ways from that shown. The operation of a computer such as that shown in FIG. 7 is readily known in the art and is not discussed in detail in this application. Code to implement the present disclosure can be stored in computer-readable storage media such as one or more of the memory 27, fixed storage 23, removable media 25, or on a remote storage location.

[0048] FIG. 8 shows an example network arrangement according to an embodiment of the disclosed subject matter. One or more devices 10, 11, such as local computers, smart phones, tablet computing devices, and the like may connect to other devices via one or more networks 7. Each device may be a computing device as previously described. The network may be a local network, wide-area network, the Internet, or any other suitable communication network or networks, and may be implemented on any suitable platform including wired and/or wireless networks. The devices may communicate with one or more remote devices, such as servers 13 and/or databases 15. The remote devices may be directly accessible by the devices 10, 11, or one or more other devices may provide intermediary access such as where a server 13 provides access to resources stored in a database 15. The devices 10, 11 also may access remote platforms 17 or services provided by remote platforms 17 such as cloud computing arrangements and services. The remote platform 17 may include one or more servers 13 and/or databases 15.

[0049] FIG. 9 shows an example arrangement according to an embodiment of the disclosed subject matter. One or more devices or systems 10, 11, such as remote services or service providers 11, user devices 10 such as local computers, smart phones, tablet computing devices, and the like, may connect to other devices via one or more networks 7. The network may be a local network, wide-area network, the Internet, or any other suitable communication network or networks, and may be implemented on any suitable platform including wired and/or wireless networks. The devices 10, 11 may communicate with one or more remote computer systems, such as processing units 14, databases 15, and user interface systems 13. In some cases, the devices 10, 11 may communicate with a user-facing interface system 13, which may provide access to one or more other systems such as a database 15, a processing unit 14, or the like. For example, the user interface 13 may be a user-accessible web page that provides data from one or more other computer systems. The user interface 13 may provide different interfaces to different clients, such as where a human-readable web page is provided to a web browser client on a user device 10, and a computer-readable API or other interface is provided to a remote service client 11.

[0050] The user interface 13, database 15, and/or processing units 14 may be part of an integral system or may include multiple computer systems communicating via a private network, the Internet, or any other suitable network. One or more processing units 14 may be, for example, part of a distributed system such as a cloud-based computing system, search engine, content delivery system, or the like, which may also include or communicate with a database 15 and/or user interface 13. In some arrangements, a machine learning system 5 may provide various prediction models, data analysis, or the like to one or more other systems 13, 14, 15.

[0051] More generally, various embodiments of the presently disclosed subject matter may include or be embodied in the form of computer-implemented processes and apparatuses for practicing those processes. Embodiments also may be embodied in the form of a computer program product having computer program code containing instructions embodied in non-transitory and/or tangible media, such as floppy diskettes, CD-ROMs, hard drives, USB (universal serial bus) drives, or any other machine readable storage medium, such that when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing embodiments of the disclosed subject matter. Embodiments also may be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, such that when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing embodiments of the disclosed subject matter. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.

[0052] In some configurations, a set of computer-readable instructions stored on a computer-readable storage medium may be implemented by a general-purpose processor, which may transform the general-purpose processor or a device containing the general-purpose processor into a special-purpose device configured to implement or carry out the instructions.

Embodiments may be implemented using hardware that may include a processor, such as a general-purpose microprocessor and/or an Application Specific Integrated Circuit (ASIC) that embodies all or part of the techniques according to embodiments of the disclosed subject matter in hardware and/or firmware. The processor may be coupled to memory, such as RAM, ROM, flash memory, a hard disk or any other device capable of storing electronic information. The memory may store instructions adapted to be executed by the processor to perform the techniques according to embodiments of the disclosed subject matter.

[0053] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit embodiments of the disclosed subject matter to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to explain the principles of embodiments of the disclosed subject matter and their practical applications, to thereby enable others skilled in the art to utilize those embodiments as well as various embodiments with various modifications as may be suited to the particular use contemplated.

[0054] Implementations disclosed herein can include systems, devices, arrangements, techniques, and compositions such as the following:

1. A computer-implemented method comprising: receiving a first game state of a plurality of game states, each game state of the plurality of game states comprising: an attribute that defines a condition of a game; and an identification value that defines when the state occurred during gameplay; assigning the first game state to a first portion of a video, the video based on captured gameplay of the game; receiving a request to launch the game; determining a selected game state from the plurality of game states; and launching the game based on the selected game state.

2. The method of implementation 1, wherein the game is launched using the selected game state to incorporate the condition defined by the attribute in the game.

3. The method of implementations 1 or 2, wherein the request to launch the game is received from a viewer of the video.

4. The method of any one of the preceding implementations, wherein: the request to launch the game is based on a selected frame or a selected timestamp of the video; and the determining the selected game state from the plurality of game states is based on comparing the selected frame or the selected timestamp with the identification values of one or more of the plurality of game states.
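One way to realize the comparison in implementation 4 is to pick the state whose identification value most nearly precedes the selected timestamp. This "closest preceding state" policy is an assumption; the text above specifies only that the selected frame or timestamp is compared with identification values.

```python
def select_state(states, selected_timestamp):
    # states: list of (identification_value, attribute) pairs;
    # choose the state recorded at or before the selected frame/timestamp
    candidates = [s for s in states if s[0] <= selected_timestamp]
    return max(candidates, key=lambda s: s[0]) if candidates else None

states = [(5.0, {"level": 1}), (20.0, {"level": 2}), (40.0, {"level": 3})]
print(select_state(states, 25.0))  # state at t=20.0 precedes the request
```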

5. The method of any one of the preceding implementations, wherein the attribute is a player position, a game level, a game map, a player inventory, a player status, a location of an adversary, a location of an object, or a progress point.

6. The method of any one of the preceding implementations, wherein only a single game state of the plurality of game states is assigned to the video at a time.

7. The method of any one of the preceding implementations, further comprising: assigning a second game state of the plurality of game states to a second portion of the video.

8. The method of any one of the preceding implementations, further comprising: providing a video player user interface to play the video, wherein the determining the selected game state from the plurality of game states is based on the position of a playhead of the video player user interface.
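The playhead-driven selection of implementation 8 might look like the sketch below, again assuming a "latest state at or before the playhead" policy. VideoPlayer is a hypothetical stand-in for a real video player user interface.

```python
class VideoPlayer:
    # minimal model of a video player user interface: only a playhead
    def __init__(self):
        self.playhead = 0.0

    def seek(self, seconds: float) -> None:
        self.playhead = seconds

def state_for_playhead(player, states):
    # states: (identification_value, attribute) pairs; pick the latest
    # state at or before the playhead (an assumed selection policy)
    prior = [s for s in states if s[0] <= player.playhead]
    return max(prior, key=lambda s: s[0]) if prior else None

player = VideoPlayer()
player.seek(33.0)
states = [(5.0, {"map": "A"}), (30.0, {"map": "B"})]
print(state_for_playhead(player, states))  # state at t=30.0 is selected
```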

9. The method of any one of the preceding implementations, wherein the plurality of game states are generated during gameplay of the game.

10. The method of any one of the preceding implementations, further comprising: discarding an old game state of the plurality of game states that reaches a threshold age when the old game state has not been assigned to the video.
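The discard rule of implementation 10 (drop states that reach a threshold age unless they have been assigned to the video) can be sketched as follows; the field names, the explicit now parameter, and the id-set representation of assignments are illustrative assumptions.

```python
def prune_states(states, assigned_ids, threshold_age, now):
    # states: dicts with "id" and "created" (seconds); keep a state if it
    # is younger than the threshold or has been assigned to the video
    return [s for s in states
            if now - s["created"] < threshold_age or s["id"] in assigned_ids]

states = [{"id": 1, "created": 0.0},    # old, but assigned -> kept
          {"id": 2, "created": 0.0},    # old and unassigned -> discarded
          {"id": 3, "created": 90.0}]   # young -> kept
kept = prune_states(states, assigned_ids={1}, threshold_age=60.0, now=100.0)
print([s["id"] for s in kept])
```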

11. The method of any one of the preceding implementations, wherein the video is livestreamed during gameplay of the game.

12. The method of any one of the preceding implementations, wherein the video was previously captured during gameplay of the game.

13. The method of any one of the preceding implementations, further comprising: receiving an additional request to launch the game based on an additional selected timestamp or an additional selected frame of the video when the game has already launched.

14. A non-transitory computer-readable medium comprising instructions operable, when executed by one or more computing systems, to: receive a first game state of a plurality of game states, each game state of the plurality of game states comprising: an attribute that defines a condition of a game; and an identification value that defines when the state occurred during gameplay; assign the first game state to a first portion of a video, the video based on captured gameplay of the game; receive a request to launch the game; determine a selected game state from the plurality of game states; and launch the game based on the selected game state.

15. The medium of implementation 14, wherein the game is launched using the selected game state to incorporate the condition defined by the attribute in the game.

16. The medium of implementation 14 or 15, wherein the request to launch the game is received from a viewer of the video.

17. The medium of any one of the preceding implementations, wherein: the request to launch the game is based on a selected frame or a selected timestamp of the video; and the determination of the selected game state from the plurality of game states is based on comparing the selected frame or the selected timestamp with the identification values of one or more of the plurality of game states.

18. The medium of any one of the preceding implementations, wherein the attribute is a player position, a game level, a game map, a player inventory, a player status, a location of an adversary, a location of an object, or a progress point.

19. The medium of any one of the preceding implementations, wherein only a single game state of the plurality of game states is assigned to the video at a time.

20. The medium of any one of the preceding implementations, further comprising instructions to: assign a second game state of the plurality of game states to a second portion of the video.

21. The medium of any one of the preceding implementations, further comprising instructions to: provide a video player user interface to play the video, wherein the determination of the selected game state from the plurality of game states is based on the position of a playhead of the video player user interface.

22. The medium of any one of the preceding implementations, wherein the plurality of game states are generated during gameplay of the game.

23. The medium of any one of the preceding implementations, further comprising instructions to: discard an old game state of the plurality of game states that reaches a threshold age when the old game state has not been assigned to the video.

24. The medium of any one of the preceding implementations, wherein the video is livestreamed during gameplay of the game.

25. The medium of any one of the preceding implementations, wherein the video was previously captured during gameplay of the game.

26. The medium of any one of the preceding implementations, further comprising instructions to: receive an additional request to launch the game based on an additional selected timestamp or an additional selected frame of the video when the game has already launched.




 