Title:
SYSTEMS AND METHODS FOR MULTIMEDIA TACTILE AUGMENTATION
Document Type and Number:
WIPO Patent Application WO/2018/035442
Kind Code:
A1
Abstract:
A hardware instruction set infrastructure and a method of generating hardware instruction sets involving an artificial intelligence engine are provided. The hardware instruction set infrastructure includes an artificial intelligence engine, a plurality of hardware translation layers, a database infrastructure, and a web interface. The artificial intelligence engine is trained to generate a plurality of media context instruction sets from patterns in multimedia events, each media context instruction set including one or more of scene context designators, scene intensity values, and time flags corresponding to content in a supported multimedia event. Each hardware translation layer includes conversion factors corresponding to a specific local hardware device. The database infrastructure stores a plurality of local hardware instruction sets generated by the plurality of hardware translation layers. The web interface is configured to provide responsive local hardware instruction sets to requesting local software.

Inventors:
HARRIS RALPH (US)
ANSELMI THOMAS (US)
Application Number:
PCT/US2017/047573
Publication Date:
February 22, 2018
Filing Date:
August 18, 2017
Assignee:
XSYNC TECH LLC (US)
International Classes:
G08B6/00
Domestic Patent References:
WO2015057979A1 (2015-04-23)
Foreign References:
US20150057493A1 (2015-02-26)
US20130225261A1 (2013-08-29)
US20160063893A1 (2016-03-03)
US20120327006A1 (2012-12-27)
US20110133910A1 (2011-06-09)
US5857986A (1999-01-12)
Attorney, Agent or Firm:
KAWULA, Walter (US)
Claims:
What is claimed is:

1. A hardware instruction set infrastructure for interfacing with instances of local software executing on a plurality of corresponding local computers, comprising: a. an artificial intelligence engine trained to generate a plurality of media context instruction sets from patterns in multimedia events, each media context instruction set including scene intensity values and time flags corresponding to content in a supported multimedia event; b. a plurality of hardware translation layers, each hardware translation layer including conversion factors corresponding to a local hardware device and configured to translate the media context instruction sets generated by the artificial intelligence engine to device-specific local hardware instruction sets; c. a database infrastructure storing a plurality of local hardware instruction sets generated by the plurality of hardware translation layers, wherein a subset of the plurality of local hardware instruction sets correspond to a given supported multimedia event, each of the hardware instruction sets in the subset being optimized for different types of supported local hardware devices; and d. a web interface, the web interface being configured to:

1. receive media file information from at least one instance of local software executing on a local computer having a local hardware device attached thereto;

2. query the database infrastructure for local hardware instruction sets corresponding to the media file information and a particular local hardware device;

3. if a responsive local hardware instruction set is found, provide the responsive local hardware instruction set to the requesting local software.

2. The hardware instruction set infrastructure of claim 1, wherein the artificial intelligence engine is trained to map media context information into an unpopulated media context code to produce a media context instruction set.

3. The hardware instruction set infrastructure of claim 1, wherein each media context instruction set further includes scene context designators.

4. The hardware instruction set infrastructure of claim 3, wherein the scene context designators comprise designations of objects that are a source of or recipient of action in a scene in a supported multimedia event.

5. The hardware instruction set infrastructure of claim 1, wherein the hardware instruction sets comprise haptic hardware instruction sets.

6. The hardware instruction set infrastructure of claim 1, wherein the hardware translation layers comprise look-up tables.

7. The hardware instruction set infrastructure of claim 1, wherein the hardware translation layers comprise algorithmic translation processes.

8. A media tactile augmentation system for interfacing with instances of local software executing on a plurality of corresponding local computers, comprising: a. an artificial intelligence engine trained to generate a plurality of media context instruction sets from patterns in multimedia events, each media context instruction set including scene intensity values and time flags corresponding to content in a supported multimedia event; b. a database infrastructure storing a plurality of media context instruction sets generated by the artificial intelligence engine; d. a web interface, the web interface being configured to:

1. receive media file information from a local application executing on a local computer having a local hardware device attached thereto;

2. query the database infrastructure for media context instruction sets corresponding to the media file information;

3. if a responsive media context instruction set is found, provide the responsive media context instruction set to the requesting local software; and e. at least one local computer, comprising:

1. a processor coupled to a memory storing computer-executable instructions for a local application for augmenting content playback;

2. a first communication interface configured to communicate with the web interface of the media tactile augmentation system;

3. at least one hardware translation layer having conversion factors corresponding to the at least one local hardware device and configured to translate media context instruction sets to local hardware instruction sets;

4. a second communication interface configured to communicate with at least one local hardware device externally located with respect to the media appliance; and

5. a display interface configured to output display information associated with content playback to a display.

9. The media tactile augmentation system of claim 8, wherein the local application, when executed by the processor of the local computer, configures the processor to: a. identify a media event selected for playback; b. identify one or more local hardware devices communicatively coupled with the media appliance; c. send a request to the web interface of the media tactile augmentation system requesting one or more media context instruction sets corresponding to the identified media event; d. receive the one or more media context instruction sets from the web interface; e. translate the one or more media context instruction sets to one or more local hardware instruction sets; f. initiate playback of the media event; and g. synchronize execution of instructions, from the one or more local hardware instruction sets, by the set of hardware devices with the playback of the media event.

10. The media tactile augmentation system of claim 9, wherein each of a plurality of local hardware instruction sets of a subset is optimized for a different local hardware device.

11. The media tactile augmentation system of claim 8, wherein the local computer comprises a mobile device.

12. The media tactile augmentation system of claim 8, wherein the local computer has a separate display.

13. The media tactile augmentation system of claim 8, wherein the hardware translation layers further comprise user feedback input.

14. The media tactile augmentation system of claim 8, wherein the media context instruction sets are hardware agnostic.

15. The media tactile augmentation system of claim 8, wherein the artificial intelligence engine and the web interface are further configured to generate and provide media context instruction sets for streaming video in real time.

16. A process for using a trained artificial intelligence engine to produce media context instruction sets, comprising: a. generating formatted, unpopulated media context code; b. converting a media event into artificial intelligence readable format; c. feeding the formatted, unpopulated code and the converted media event into the trained artificial intelligence engine; and d. the trained artificial intelligence engine then creating populated media context instructions.

17. The process of claim 16, further comprising: a. performing a manual review of the media context instructions generated by the artificial intelligence engine; b. making manual adjustments to the media context instructions; and c. feeding the revised media context instructions back into the artificial intelligence engine.

18. The process of claim 16, further comprising converting the media context instructions into hardware-specific instructions.

19. The process of claim 16, further comprising converting the media context instructions into haptic instructions for delivery to local hardware.

20. The process of claim 16, further comprising providing haptic device feedback as an input to the artificial intelligence engine.

Description:
SYSTEMS AND METHODS FOR MULTIMEDIA TACTILE AUGMENTATION

RELATED APPLICATIONS

[0001] This application is a non-provisional application corresponding to and claiming priority to U.S. Provisional Application No. 62/377,205, filed August 19, 2016, the disclosure of which is incorporated by reference.

BACKGROUND AND SUMMARY

[0002] The present disclosure relates to systems and methods for augmenting multimedia experiences with tactile sensations.

[0003] The computing industry has trended towards the convergence of online content and hardware-connected components since the inception of the World Wide Web. Only recently, however, has this trend begun to include haptic devices. As of the writing of this patent, a number of inventions have been disclosed that address the intersection between multimedia and locally-connected hardware devices. For example, U.S. Patent No. 6,368,268 to Hassex, Inc. ("Hassex") shows an invention that is capable of synchronizing sensations between a user interface and a sexually-oriented hardware-connected device.

[0004] Various User Interfaces exist, including: a sexually-oriented hardware device receiving uni-directional control signals from a hand-operated controller, such as a joystick; two sexually-oriented hardware devices, both of which send simultaneous, bi-directional control signals based on real-time usage; and a server that sends control signals directly to a remote software interface, which sends the control signal to the sexually-oriented hardware device. For example, Hassex specifies that the control signal, which is received via a data packet, is first decoded by the user interface then sent through the I/O port of the computer to the locally-connected, sexually-oriented hardware device.

[0005] Similarly, U.S. Patent No. 8,378,794 to Internet Services, LLC ("Inet Svcs") describes a hardware component and its interactions with multimedia files that are associated with hardware control signals. The hardware device generates sensations in response to the hardware control signals via a belt system. The RealTouch™ device described by Inet Svcs uses a static Web portal to serve users multimedia files containing embedded hardware control signals. A user's computer receiving the files includes a software add-on for Windows Media Player that decodes the control signals. Once the software add-on is installed and the multimedia file is downloaded, Windows Media Player simultaneously plays the multimedia file and decrypts, then plays the associated hardware control signals directly from the user's computer.

[0006] Presently disclosed is a hardware instruction set infrastructure for interfacing with instances of local software executing on a plurality of corresponding local computers. The hardware instruction set infrastructure includes an artificial intelligence engine, a plurality of hardware translation layers, a database infrastructure and a web interface. The artificial intelligence engine is trained to generate a plurality of media context instruction sets from patterns in multimedia events, each media context instruction set including some or all of scene context designators, scene intensity values, and time flags corresponding to content in a supported multimedia event. Each hardware translation layer includes conversion factors corresponding to a local hardware device and is configured to translate the media context instruction sets generated by the artificial intelligence engine to device-specific local hardware instruction sets. The database infrastructure stores a plurality of local hardware instruction sets generated by the plurality of hardware translation layers, where a subset of the plurality of local hardware instruction sets correspond to a given supported multimedia event, each of the hardware instruction sets in the subset being optimized for different types of supported local hardware devices. The web interface is configured to: receive media file information from at least one instance of local software executing on a local computer having a local hardware device attached thereto; query the database infrastructure for local hardware instruction sets corresponding to the media file information and a particular local hardware device; if a responsive local hardware instruction set is found, provide the responsive local hardware instruction set to the requesting local software.
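The relationship between a hardware-agnostic media context instruction set and a per-device hardware translation layer can be sketched in Python. All names, fields, designator strings, and conversion factors below are invented for illustration; the disclosure does not specify a data format:

```python
from dataclasses import dataclass

@dataclass
class MediaContextInstruction:
    """One hardware-agnostic entry: a scene context designator, a scene
    intensity value, and a time flag into the supported multimedia event."""
    scene_context: str   # e.g. "impact" (designator names are invented)
    intensity: int       # scene intensity value, assumed 0-100 here
    time_flag_ms: int    # offset into the multimedia event, in milliseconds

# A look-up-table style hardware translation layer: conversion factors for
# one specific local hardware device. The motor names and PWM scaling are
# invented for illustration.
VIBE_MOTOR_LAYER = {
    "impact":  lambda i: {"motor": "main", "pwm": min(255, i * 2)},
    "ambient": lambda i: {"motor": "aux", "pwm": i},
}

def translate(instruction_set, layer):
    """Translate a media context instruction set into device-specific local
    hardware instructions, preserving each time flag for synchronization."""
    commands = []
    for ins in instruction_set:
        factor = layer.get(ins.scene_context)
        if factor is None:
            continue  # this device has no conversion factor for the designator
        command = factor(ins.intensity)
        command["t_ms"] = ins.time_flag_ms
        commands.append(command)
    return commands
```

An algorithmic translation layer, the other variant named above, would replace the dictionary with a computed mapping, but the interface to the rest of the system would be the same.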

[0007] The artificial intelligence engine may be trained to map media context information into an unpopulated media context code to produce a media context instruction set. The scene context designators may comprise designations of objects that are a source of or recipient of action in a scene in a supported multimedia event. In one example, the hardware translation layers comprise look-up tables. In another example, the hardware translation layers comprise algorithmic translation processes. Both examples of hardware translation layers may be employed concurrently. The hardware instruction sets may comprise haptic hardware instruction sets.

[0008] Also disclosed is a media tactile augmentation system for interfacing with instances of local software executing on a plurality of corresponding local computers. The media tactile augmentation system includes an artificial intelligence engine, a database infrastructure, a web interface, and at least one local computer. The artificial intelligence engine is trained to generate a plurality of media context instruction sets from patterns in multimedia events, each media context instruction set including some or all of scene context designators, scene intensity values, and time flags corresponding to content in a supported multimedia event. The database infrastructure stores a plurality of media context instruction sets generated by the artificial intelligence engine. The web interface is configured to: receive media file information from a local application executing on a local computer having a local hardware device attached thereto; query the database infrastructure for media context instruction sets corresponding to the media file information; and, if a responsive media context instruction set is found, provide the responsive media context instruction set to the requesting local software.

[0009] The local computer includes a processor, a first communication interface, at least one hardware translation layer, a second communication interface, and a display interface. The processor is coupled to a memory storing computer-executable instructions for a local application for augmenting content playback. The first communication interface is configured to communicate with the web interface of the media tactile augmentation system. The at least one hardware translation layer has conversion factors corresponding to the at least one local hardware device and is configured to translate media context instruction sets to local hardware instruction sets. The second communication interface is configured to communicate with at least one local hardware device externally located with respect to the media appliance. The display interface is configured to output display information associated with content playback to a display.

[0010] In one example of the media tactile augmentation system, the local application, when executed by the processor of the local computer, configures the processor to: identify a media event selected for playback; identify one or more local hardware devices communicatively coupled with the media appliance; send a request to the web interface of the media tactile augmentation system requesting one or more media context instruction sets corresponding to the identified media event; receive the one or more media context instruction sets from the web interface; translate the one or more media context instruction sets to one or more local hardware instruction sets; initiate playback of the media event; and synchronize execution of instructions, from the one or more local hardware instruction sets, by the set of hardware devices with the playback of the media event.

[0011] In the above systems, each of a plurality of local hardware instruction sets of a subset may be optimized for a different local hardware device. Also, the local computer may comprise a mobile device. In another example, the local computer may have a separate display. In another example, the hardware translation layers further comprise user feedback input.

[0012] The media context instruction sets may be hardware agnostic. Also, the artificial intelligence engine and the web interface may be further configured to generate and provide media context instruction sets for streaming video in real time.

[0013] A process for using a trained artificial intelligence engine to produce media context instruction sets is also provided herein. The process includes: generating formatted, unpopulated media context code; converting a media event into artificial intelligence readable format; feeding the formatted, unpopulated code and the converted media event into the trained artificial intelligence engine; and the trained artificial intelligence engine then creating populated media context instructions. The process may further comprise: performing a manual review of the media context instructions generated by the artificial intelligence engine; making manual adjustments to the media context instructions; and feeding the revised media context instructions back into the artificial intelligence engine. The process may further comprise converting the media context instructions into hardware-specific instructions. The process may further comprise converting the media context instructions into haptic instructions for delivery to local hardware. The process may further comprise providing haptic device feedback as an input to the artificial intelligence engine.
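The generate-then-populate process described in the preceding paragraph can be illustrated with a minimal sketch. Here `fake_ai_engine` is a stand-in for the trained artificial intelligence engine, and the slot format and feature records are assumptions; the disclosure does not define the media context code format:

```python
def generate_unpopulated_code(duration_ms, step_ms=1000):
    """Formatted, unpopulated media context code: one empty slot per time step."""
    return [{"t_ms": t, "scene_context": None, "intensity": None}
            for t in range(0, duration_ms, step_ms)]

def fake_ai_engine(slots, features):
    """Stand-in for the trained A.I. engine: populates each slot from per-step
    features extracted from the A.I.-readable form of the media event."""
    for slot, feat in zip(slots, features):
        slot["scene_context"] = feat["label"]
        slot["intensity"] = feat["score"]
    return slots

# Worked example: a 3-second media event converted to three feature records
# (labels and scores are invented for illustration).
features = [{"label": "calm", "score": 10},
            {"label": "impact", "score": 90},
            {"label": "calm", "score": 20}]
populated = fake_ai_engine(generate_unpopulated_code(3000), features)
```

The manual-review loop named above would then edit `populated` and feed the revised instructions back as additional training input.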

BRIEF DESCRIPTION OF THE DRAWINGS

[0014] A better understanding of the various disclosed embodiments can be obtained when the following detailed description is considered in conjunction with the attached drawings, in which:

[0015] FIG. 1 shows a system diagram that includes an illustrative Web Infrastructure.

[0016] FIG. 2 shows a block diagram of a Local Hardware Device.

[0017] FIGS. 3A-3C show a software flowchart and functional distribution for a first illustrative embodiment.

[0018] FIGS. 4A-4C show a software flowchart and functional distribution for a second illustrative embodiment.

[0019] FIGS. 5A-5C show a software flowchart and functional distribution for a third illustrative embodiment.

[0020] FIG. 6 illustrates a block diagram of a system that includes an exemplary, non-limiting media appliance.

[0021] FIG. 7 illustrates a block diagram of the media appliance.

[0022] FIGs. 8 and 9 illustrate flow diagrams of an exemplary, non-limiting embodiment for utilizing one or more hardware devices with a media appliance having access to content from one or more content providers.

[0023] FIG. 10 illustrates a process for training an A.I. Engine.

[0024] FIGS. 11A and 11B illustrate processes for using an A.I. Engine to generate media context instruction sets.

[0025] FIG. 12 illustrates converting a media context instruction set into a plurality of device-specific hardware instruction sets.

[0026] It should be understood that the drawings and corresponding detailed description do not limit the disclosure, but on the contrary, they provide the foundation for understanding all modifications, equivalents, and alternatives falling within the scope of the appended claims.

DETAILED DESCRIPTION

[0027] The disclosed embodiments relate to the synchronization of locally-connected hardware components to online, web-based multimedia events such as online movies, interactive video games, entertainment media and/or other non-entertainment media. More specifically, the disclosed embodiments relate to software and systems executing the software that create the user-experienced haptic synchronization effect between various online video sources and locally-connected haptic hardware components. The disclosed embodiments rely on a client-server architecture to maintain synchronization between locally-connected haptic hardware and online multimedia events. These embodiments synchronize two separate data sources: online multimedia being served from a content provider's website, such as YouTube, Netflix, or Amazon Instant Video; and haptic hardware instructions associated with said multimedia and hardware device being served by a web infrastructure configured to support said embodiments. In contrast to prior systems, the separation of the multimedia content and the haptic hardware instructions improves the utility of the presently disclosed systems and methods by enabling rapid deployment of instructions for new media content and new haptic hardware. In this manner, media content may be synchronized with haptic hardware instructions for both preexisting devices as well as new devices without modification or alteration of the media content. This flexibility and scalability was not possible with prior systems, such as those discussed above, in which hardware control signals were embedded directly in multimedia files.

[0028] The paragraphs that follow describe in detail various examples of the above-described embodiments. An example of local software executing on a user's local client computer system is first described, followed by a description of an illustrative Web infrastructure and a description of an illustrative local hardware device. Three illustrative embodiments are then described, wherein the multimedia event's hardware instruction set is interpreted by a different component for each embodiment (i.e., the local client computer, the local hardware device and the Web infrastructure).

Web Infrastructure

[0029] FIG. 1 shows an overview of an illustrative embodiment of a multimedia tactile augmentation system that includes the Web Infrastructure 120, an Internet 110 infrastructure with one or more Web Servers 130, and a Database Infrastructure 140. In the embodiment shown, a Web Server 130 responds to requests from the Local Software 310 executing on the Local Computer 100 or from the Local Hardware Device 200, and further queries the Database Infrastructure 140 as needed. The Web Server 130 may be configured with an A.I. Engine as explained in more detail below with respect to FIGS. 10-12. Local Software 310 may be stored on a non-volatile information storage medium as shown, or may be downloaded onto the Local Computer 100 via, e.g., the Internet 110.

[0030] The core functionalities of the Web Infrastructure 120 include, but are not limited to:

1. Responding to requests from the Local Software 310 or Local Hardware Device 200;

2. Maintaining User account information;

3. Hosting any web sites required to support the disclosed system; and

4. Generating video content instruction sets.
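The first core functionality, responding to instruction-set requests, might look like the following sketch, in which a plain dictionary stands in for the Database Infrastructure 140, and the media identifier, hardware model string, and instruction payload are all invented examples:

```python
# Stand-in for the Database Infrastructure 140: instruction sets keyed by
# (media identifier, hardware device model). Keys and values are invented.
INSTRUCTION_DB = {
    ("video:abc123", "vibe-mk1"): [{"t_ms": 0, "pwm": 0},
                                   {"t_ms": 5000, "pwm": 200}],
}

def handle_request(media_info, hardware_model):
    """Web-interface behaviour: query for a local hardware instruction set
    matching both the media file information and the particular local
    hardware device, and report whether the media is supported."""
    instruction_set = INSTRUCTION_DB.get((media_info, hardware_model))
    if instruction_set is None:
        return {"status": "not supported"}
    return {"status": "supported & ready for play",
            "instructions": instruction_set}
```

In the deployed system the same media identifier would map to several instruction sets, one per supported device type, which is what lets new hardware be supported without touching the media content.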

Local Hardware Device

[0031] FIG. 2 shows a preferred embodiment of the Local Hardware Device 200 that includes a Communication Device 210, a Processing Unit 220, one or more Haptic Feedback Capable Components 240, such as vibration motors, a Power Source 260, such as a battery or a port for electrical connectivity, and a Bus 250 which is used to transport communications between each of the Local Hardware Device's 200 components. The Local Hardware Device 200 may exist in any shape or size, with the distinguishing factor being that it is designed to work with the disclosed system. In at least some illustrative embodiments, multiple Local Hardware Devices 200 are coupled to the local computer 100 and are simultaneously operable. In at least some illustrative embodiments, the Haptic Feedback Capable Components 240 of the local hardware device 200 provide physical and/or sexual stimulation to one or more users of the disclosed system(s).

[0032] The core functionalities of the Local Hardware Device 200 may include, but are not limited to:

1. Establishing connectivity to the personal computing environment via the Communication Device 210; examples of the Communication Device 210 include, but are not limited to:

a. Serial and parallel ports, Universal Serial Bus (USB), Bluetooth, Near Field Communications (NFC) or other wireless technologies;

2. Establishing connectivity to the Local Software 310 or Web Infrastructure 120;

3. Responding to requests for identifying information;

4. Interacting with instructions:

a. The Processing Unit 220 may receive instructions via the Web Infrastructure 120 or the Local Software 310; and

b. The instructions may be processed before the Processing Unit 220 receives them;

i. If the instructions are not processed before the Processing Unit 220 receives them, the Processing Unit 220 may process the instructions, possibly including the temporary storage of processed instructions in the Locally Connected Hardware's 200 Memory 230; and

5. Interacting with the Haptic Feedback Capable Components 240:

a. The processing unit commands each Haptic Feedback Capable Component 240 to respond for specific durations and in specific patterns based on the instructions, with the end result being synchronization between the Local Hardware Device's 200 Haptic Feedback Capable Components 240 and the online multimedia event.

Local Software

[0033] In at least some illustrative embodiments, Local Software 310 executes on a Local Computer 100 in the form of a browser extension or add-on. This enables the Local Software 310 to enhance an existing web browser such as, e.g., Microsoft Internet Explorer, Google Chrome or Mozilla Firefox by adding the functionalities of the Local Software 310 to the existing web browser.

[0034] The core functionalities of the Local Software 310 include, but are not limited to:

1. Establishing and maintaining connectivity to and interactivity with the Web Infrastructure 120:

a. Facilitating User authentication to the Web Infrastructure 120;

b. Facilitating User authentication for multiple accounts simultaneously;

c. Sending identifying information about multimedia to the Web Infrastructure 120 to determine if the detected multimedia has an associated hardware instruction set; if the associated hardware instruction set exists, designating the multimedia as supported;

d. Retrieving associated hardware instruction sets;

2. Interacting with the Local Hardware Device 200:

a. Establish and maintain connectivity to the Local Hardware Device 200;

b. Retrieve identifying information about the Local Hardware Device 200;

c. Determine if the Local Hardware Device 200 is associated with the authenticated User's account; if not, facilitate account interactions with the Web Infrastructure 120 to associate the Local Hardware Device 200 with the User's account, or facilitate authentication to the correct account to permit usage of the Local Hardware Device 200;

d. Sending instructions to the Local Hardware Device 200;

e. Monitoring playback of the Local Hardware Device 200 as needed to maintain synchronization between the Local Hardware Device 200 and the multimedia event;

3. Interacting with web pages:

a. Parsing webpages to detect multimedia;

b. Detecting identifying information about the multimedia;

c. Manipulating the displayed web page to label discovered multimedia as supported or not, based on feedback from the Web Infrastructure 120;

d. Manipulating the multimedia player to facilitate interaction with multimedia;

4. Interacting with multimedia on web pages:

a. Controlling the playback of multimedia to facilitate synchronization between the Local Hardware Device 200 and the multimedia; and

b. Identifying and controlling the current playback time of multimedia to facilitate synchronization between the Local Hardware Device 200 and the multimedia;

c. Analyzing the multimedia content with computer vision to determine a playback time of the multimedia content and synchronizing the instructions for the Local Hardware Device 200 with the multimedia content based on the determined playback time.

[0035] Alternate embodiments of the Local Software 310 may include, but are not limited to:

1. A component of a completely custom application, such as an application designed for a mobile platform, tablet, wearable device, or other, similar personal computing environments;

2. A multi-component system of applications that interact and operate to deliver the same functionality of a single browser add-on or extension; examples include a desktop application that facilitates User authentication and hardware communications and interacts with a web browser add-on or extension that facilitates interactivity with a web page, multimedia, and the Web Infrastructure 120; and

3. A fully-enclosed web application that facilitates all functions across a network via a web server; said Web Server 130 may exist in various forms, such as on a standalone hardware device, on the Local Hardware Device 200, or as a component within a cloud infrastructure environment.

[0036] The disclosed systems and methods are best understood when described in an illustrative usage context. To this end, the paragraphs that follow describe three illustrative embodiments, wherein for each embodiment the multimedia event's hardware instruction set is interpreted by a different component.

First Illustrative Embodiment

[0037] FIG. 3 shows a beginning-to-end illustrative process view of the disclosed systems and methods, wherein the instruction set processing occurs within the Local Software 310.

[0038] A User connects the Local Hardware Device 200 to the Local Computer 100 via the Local Hardware Device's 200 Communication Device 210. Once the Local Hardware Device 200 is connected to the Local Computer 100, the User launches the Local Software 310. The Local Software 310 will prompt the User to log in to their account; if an account has not been established, the User is prompted to create a new account. Credentials are passed from the Local Software 310 to the Web Infrastructure 120, which validates the authentication attempt.

[0039] Once authenticated, the Local Software 310 detects the Local Hardware Device 200, requests the Local Hardware Device's 200 unique hardware ID, and verifies with the Web Infrastructure 120 that the specific Local Hardware Device 200 is associated with a currently- authenticated account. If so, the Local Software 310 initiates the Local Hardware Device 200. When initialized, the Local Hardware Device 200 is ready to receive instructions from the Local Software 310 and the User is notified that the Local Hardware Device 200 is ready for use.

[0040] If the unique hardware ID is not associated with an authenticated account, the Web Infrastructure 120 checks to see if it is associated with another account. If it is, the Local Software 310 prompts the User to log in to the correct account to be able to use the connected device; in at least some illustrative embodiments, the usage of simultaneous multiple account logins may be allowed to permit the usage of multiple devices. If the unique hardware ID is not associated with any accounts, the User is prompted to register the unique hardware ID with their account. Once the Local Hardware Device 200 is registered or the User is authenticated to the correct account, the Local Software 310 verifies again with the Web Infrastructure 120 that the specific Local Hardware Device 200 is associated with a currently-authenticated account. The Local Software 310 then initiates the Local Hardware Device 200 and notifies the User that the Local Hardware Device 200 is ready for use.
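The account checks described in the two preceding paragraphs can be sketched as a small decision function. The registry dictionary and the returned action strings are invented stand-ins for the Web Infrastructure's device records and the Local Software's prompts:

```python
def resolve_device(device_id, authenticated_accounts, registry):
    """Mirror the flow above: decide what the Local Software should do for
    a newly connected Local Hardware Device. `registry` maps unique hardware
    IDs to the account that registered them (an invented stand-in for the
    Web Infrastructure 120's records)."""
    owner = registry.get(device_id)
    if owner is None:
        # Unregistered device: prompt the User to register it.
        return "prompt: register device to current account"
    if owner in authenticated_accounts:
        # Device belongs to a currently-authenticated account: initialize it.
        return "initialize device"
    # Device belongs to another account: prompt a log-in to that account.
    return "prompt: log in to owning account"
```

With simultaneous multiple account logins allowed, `authenticated_accounts` simply contains more than one entry, which is how several devices owned by different accounts can be used at once.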

[0041] The User then uses the Local Software 310 to navigate to a web page. Each time a new web page is loaded, the Local Software 310 parses the page to detect supported media. If supported media is not found, the Local Software 310 does nothing; if supported media is found, the Local Software 310 detects and sends the media file information to the Web Infrastructure 120. The Web Infrastructure 120 queries the Database Infrastructure 140 to determine if a hardware instruction set exists for the detected media. If an instruction set is not found, the Local Software 310 designates the media as "not supported;" if an instruction set is found, the Local Software 310 designates the media as "supported & ready for play," and the instruction set file is downloaded to the Local Software 310 for parsing and playback. The designation of "supported & ready for play" and "not supported" may exist as a visual indicator on or near the media file within the web page.

[0042] The User then plays the media. The Local Software 310 reads the current playback time of the media and sends the hardware instruction set to the Local Hardware 200 synchronous to the current media playback time. The hardware instruction set contains time flags to mark the accurate playback time of the Haptic Feedback Capable Components 240; the Local Software 310 monitors the accurate playback time of the media event and uses both pieces of information to maintain synchronization of the Haptic Feedback Capable Components 240 on the Local Hardware 200 and the media. If the current playback time of the media is adjusted by the User, the Local Software 310 uses the updated current playback time and the time flags in the hardware instruction set to re-establish synchronization based on the adjustment made by the User. The playback time of the media may be accessed from the media player, the operating system, the media file, or any other location from which a playback time may be accessed.
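The re-establishment of synchronization after a User seek can be sketched in a few lines. This is an illustrative assumption about the instruction format (a time-sorted list of (time_flag, command) pairs), not a format defined by the disclosure:

```python
from bisect import bisect_left

def resync(instructions, playback_time):
    """Skip past instructions whose time flags precede the (possibly
    user-adjusted) current playback time, returning the remaining
    (time_flag, command) pairs so that haptic playback resumes in step
    with the media. Assumes instructions are sorted by time flag."""
    flags = [t for t, _ in instructions]
    return instructions[bisect_left(flags, playback_time):]
```

After a seek, software in the role of the Local Software 310 would call `resync` with the updated playback time and continue dispatching from the returned list.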

[0043] In another embodiment, the Local Software 310 includes a computer vision component adapted to analyze the content of the media. The computer vision component may determine a playback time of the media based on the analyzed content. In some embodiments, the computer vision component may analyze one or more frames of the media content, and analyzed frames may be used as an index to the media to determine the playback time. In this manner, the playback time may be either a time index or another index to the media content capable of being associated with the instructions to synchronize the hardware instructions with the playback of the media content. The computer vision component may be a software component stored in memory and executed on a processor of a haptic feedback system.
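One way the frame-as-index idea could be realized is nearest-neighbor matching on per-frame hashes (e.g., perceptual hashes). The following is a sketch under that assumption; the disclosure does not specify a matching technique:

```python
def nearest_frame(frame_hashes, observed_hash):
    """Return the index of the stored frame whose hash is closest (in
    Hamming distance) to the hash of the currently displayed frame; that
    index serves as the playback position for synchronization."""
    def hamming(a, b):
        return bin(a ^ b).count("1")
    return min(range(len(frame_hashes)),
               key=lambda i: hamming(frame_hashes[i], observed_hash))
```

A computer vision component would precompute `frame_hashes` for the media and hash captured frames at playback time.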

[0044] When the Local Hardware 200 receives the instructions from the Local Software 310, the Local Hardware 200 interprets said instructions and generates corresponding electrical signals that are sent to the Haptic Feedback Capable Components 240, resulting in the production of physical haptic feedback output that is experienced by the User.

[0045] When the User stops the playback of the media, the Local Software stops the playback of the instruction set to the Local Hardware 200, which stops sending electrical signals to the Haptic Feedback Capable Components 240. The system is then ready for navigation to the next web page, or, if the User disconnects the Local Hardware 200, all memory is cleared and the Local Software 310 is rendered dormant until the Local Hardware 200 is connected again.

Second Illustrative Embodiment

[0046] FIG. 4 shows a beginning-to-end process view of the disclosed systems and methods, wherein the instruction set processing occurs within the Local Hardware 200.

[0047] A User connects the Local Hardware Device 200 to the Local Computer 100 via the Local Hardware Device's 200 Communication Device 210. Once the Local Hardware Device 200 is connected to the Local Computer 100, the User launches the Local Software 310. The Local Software 310 prompts the User to log in to their account; if an account has not been established, the User is prompted to create a new account. Credentials are passed from the Local Software 310 to the Web Infrastructure 120, which validates the authentication attempt.

[0048] Once authenticated, the Local Software 310 detects the Local Hardware Device 200, requests the Local Hardware Device's 200 unique hardware ID, and verifies with the Web Infrastructure 120 that the specific Local Hardware Device 200 is associated with a currently-authenticated account. If so, the Local Software 310 initiates the Local Hardware Device 200. When initialized, the Local Hardware Device 200 is ready to receive current media playback times from the Local Software 310 and the User is notified that the Local Hardware Device 200 is ready for use.

[0049] If the unique hardware ID is not associated with an authenticated account, the Web Infrastructure 120 checks to see if it is associated with another account. If it is, the Local Software 310 prompts the User to log in to the correct account to be able to use the connected device; in at least some illustrative embodiments, the usage of simultaneous multiple account logins may be allowed to permit the usage of multiple devices. If the unique hardware ID is not associated with any accounts, the User is prompted to register the unique hardware ID with their account. Once the Local Hardware Device 200 is registered or the User is authenticated to the correct account, the Local Software 310 verifies again with the Web Infrastructure 120 that the specific Local Hardware Device 200 is associated with a currently-authenticated account. The Local Software 310 then initiates the Local Hardware Device 200 and notifies the User that the Local Hardware Device 200 is ready for use.

[0050] The User then uses the Local Software 310 to navigate to a web page. Each time a new web page is loaded, the Local Software 310 parses the page to detect supported media. If supported media is not found, the Local Software 310 does nothing; if supported media is found, the Local Software 310 detects and sends the media file information to the Web Infrastructure 120. The Web Infrastructure 120 queries the Database Infrastructure 140 to determine if a hardware instruction set exists for the detected media. If an instruction set is not found, the Local Software 310 designates the media as "not supported;" if an instruction set is found, the Local Software 310 designates the media as "supported & ready for play," and the instruction set file is passed through the Local Software 310 to the Local Hardware 200 for parsing and playback. The designation of "supported & ready for play" and "not supported" may exist as a visual indicator on or near the media file within the web page.

[0051] The User then plays the media. The Local Software 310 reads the current playback time of the media and sends that information to the Local Hardware 200. The hardware instruction set contains time flags to mark the accurate playback time of the Haptic Feedback Capable Components 240; the Local Software 310 monitors the accurate playback time of the media, and the Local Hardware 200 uses the media's current playback time and the time flags within the hardware instruction set to maintain synchronization of the Haptic Feedback Capable Components 240 and the media. If the current playback time of the media is adjusted by the User, the Local Software 310 sends the updated current playback time to the Local Hardware 200, which uses the time flags in the hardware instruction set to re-establish synchronization based on the adjustment made by the User. In other embodiments, the playback time may be determined by analyzing the content using computer vision as discussed above.

[0052] When the Local Hardware 200 receives the current playback times from the Local Software 310, the Local Hardware 200 interprets said instructions and generates corresponding electrical signals that are sent to the Haptic Feedback Capable Components 240, resulting in the production of physical haptic feedback output that is experienced by the User.

[0053] When the User stops the playback of the media, the Local Software stops the playback of the hardware instruction set and sends a stop signal to the Local Hardware 200, which stops sending electrical signals to the Haptic Feedback Capable Components 240. The system is then ready for navigation to the next web page, or, if the User disconnects the Local Hardware 200, all memory is cleared and the Local Software 310 is rendered dormant until the Local Hardware 200 is connected again.

Third Illustrative Embodiment

[0054] FIG. 5 shows a beginning-to-end process view of the disclosed systems and methods, wherein the instruction set processing occurs within the Web Infrastructure 120.

[0055] A User connects the Local Hardware Device 200 to the Web Infrastructure 120 via the Local Hardware Device's 200 Communication Device 210. Once the Local Hardware Device 200 is connected to the Web Infrastructure 120, the User launches the Local Software 310. The Local Software 310 receives confirmation from the Web Infrastructure 120 that the Local Hardware is connected and prompts the User to log in to their account; if an account has not been established, the User is prompted to create a new account. Credentials are passed from the Local Software 310 to the Web Infrastructure 120, which validates the authentication attempt.

[0056] Once authenticated, the Web Infrastructure 120 detects the Local Hardware Device 200, requests the Local Hardware Device's 200 unique hardware ID, and verifies that the specific Local Hardware Device 200 is associated with a currently-authenticated account. If so, the Web Infrastructure 120 initiates the Local Hardware Device 200. When initialized, the Local Hardware Device 200 is ready to receive instructions from the Web Infrastructure 120 and the User 300 is notified that the Local Hardware Device 200 is ready for use.

[0057] If the unique hardware ID is not associated with an authenticated account, the Web Infrastructure 120 checks to see if it is associated with another account. If it is, the Local Software 310 prompts the User to log in to the correct account to be able to use the connected device; in at least some illustrative embodiments, the usage of simultaneous multiple account logins may be allowed to permit the usage of multiple devices. If the unique hardware ID is not associated with any accounts, the User is prompted to register the unique hardware ID with their account. Once the Local Hardware Device 200 is registered or the User is authenticated to the correct account, the Local Software 310 verifies again with the Web Infrastructure 120 that the specific Local Hardware Device 200 is associated with a currently-authenticated account. The Local Software 310 then notifies the Web Infrastructure 120 to initiate the Local Hardware Device 200 and the User is notified that the Local Hardware Device 200 is ready for use.

[0058] The User then uses the Local Software 310 to navigate to a web page. Each time a new web page is loaded, the Local Software 310 parses the page to detect supported media. If supported media is not found, the Local Software 310 does nothing; if supported media is found, the Local Software 310 detects and sends the media file information to the Web Infrastructure 120. The Web Infrastructure 120 queries the Database Infrastructure 140 to determine if a hardware instruction set exists for the detected media. If an instruction set is not found, the Local Software 310 designates the media as "not supported;" if an instruction set is found, the Local Software 310 designates the media as "supported & ready for play," and the Web Infrastructure 120 retrieves the instruction set file for parsing and playback. The designation of "supported & ready for play" and "not supported" may exist as a visual indicator on or near the media file indicator (e.g., an icon) within the web page.

[0059] The User then plays the media. The Local Software 310 reads the current playback time of the media and sends that information to the Web Infrastructure 120. The hardware instruction set contains time flags to mark the accurate playback time of the Haptic Feedback Capable Components 240; the Local Software 310 monitors the accurate playback time of the media, and the Web Infrastructure 120 uses the media's current playback time and the time flags within the hardware instruction set to maintain synchronization of the Haptic Feedback Capable Components 240 and the media. If the current playback time of the media is adjusted by the User, the Local Software 310 sends the updated current playback time to the Web Infrastructure 120, which uses the time flags in the hardware instruction set to re-establish synchronization based on the adjustment made by the User 300. In other embodiments, the playback time may be determined by analyzing the content using computer vision as discussed above.

[0060] When the Local Hardware 200 receives the current instructions from the Web Infrastructure 120, the Local Hardware 200 interprets said instructions and generates corresponding electrical signals that are sent to the Haptic Feedback Capable Components 240, resulting in the production of physical haptic feedback output that is experienced by the User.

[0061] When the User stops the playback of the media event, the Local Software sends a stop signal to the Web Infrastructure 120. Web Infrastructure 120 in turn sends a stop signal to the Local Hardware 200, which stops sending electrical signals to the Haptic Feedback Capable Components 240. The system is then ready for navigation to the next web page, or, if the User disconnects the Local Hardware 200, all memory is cleared and the Local Software 310 is rendered dormant until the Local Hardware 200 is connected again.

[0062] In an alternate embodiment, all of the Local Software 310 functions may exist entirely within the Web Infrastructure 120, and the User may access the disclosed system via a web browser.

Media Appliance

[0063] FIG. 6 illustrates an exemplary, non-limiting system 600 that includes a media appliance 610 in accordance with one or more aspects. Media appliance 610 can be a computing device configured to interact with a portal server 630 and a set of hardware devices 620. In one example, media appliance 610 can be a computing device configured to be coupled to a display device (e.g. a television, projector, computer monitor, etc.). In another example, media appliance 610 can include an integrated display device. For instance, media appliance 610 can be a portable media playback device. Alternatively, media appliance 610 can be a laptop or desktop computer having an application that, when executed by the appliance 610, interacts with the portal server 630 and the set of hardware devices 620 according to the aspects described herein. Further, media appliance 610 can be a mobile device (e.g., a tablet, a smartphone, a portable media device, etc.) executing an application to perform aspects described herein.

[0064] A user of media appliance 610 can browse content from content providers 640 via the portal server 630. As shown in FIG. 6, content providers 640 can include one or more individual providers 642 such as provider #1 through provider #M, where M is an integer greater than or equal to one. Content from content providers 640 can be multimedia content such as movies, television shows, streaming video, video games, interactive media, or other streamable or downloadable multimedia content. Portal server 630 can organize content from various content providers 640 into a plurality of channels selectable by the user for consumption via the media appliance 610. A channel from the plurality of channels can include content grouped according to one or more criteria. For instance, the channel can include content from a particular provider 642. In another example, the channel can include content drawn to a specific genre, featuring a specific artist or performer, targeted to a particular audience, related to a specific interest, receiving a threshold critic rating, or a combination of the foregoing. Accordingly, the channel can incorporate content from one or more individual providers 642.

[0065] The user can subscribe to the channel and receive the associated content on a subscription basis. For instance, the user can consume all content associated with the channel via the media appliance 610 so long as a subscription fee is paid. In another example, individual content items can be acquired on a pay-per-item basis such as a single payment for unlimited viewing, a payment for a single viewing, or a rental charge for unlimited viewings until released. Further, instead of individual content items, a bundle or season of content items can be acquired as described above.

[0066] In an aspect, portal server 630 can deliver web-based information to media appliance 610 for output to the user to enable browsing, purchase, and/or streaming of content from content providers 640. The web-based information can include styling according to a form of media appliance 610. For instance, for a mobile device-based implementation of media appliance 610, the web-based information can include styling to display the information in a suitable format for mobile browsing. For a media appliance 610 coupled to an external display, the styling can transform the information to a format suitable for interaction via a remote control, for example. In yet another aspect, the portal server 630 can deliver information in a general form such that the media appliance 610 generates a native user interface that incorporates the information. Further, media appliance 610 can also interact directly with content providers 640 to receive available content, select content, playback content, etc.

[0067] As shown in FIG. 6, media appliance 610 interacts with hardware devices 620, which can be a set of one or more devices 622 such as device #1 through device #N, where N is an integer greater than or equal to one. Particularly, media appliance 610 interacts with hardware devices 620 during playback of media or content. As described above in other embodiments, hardware devices 620 can include various components or features activated responsive to instructions or signals. For instance, hardware devices 620 can include haptic devices, drones, robotic elements, electrically-driven mechanical components, audio-producing components, visual elements (e.g., displays, lighting features, etc.), or the like which can be selectively activated and controlled via instructions. As described above, an instruction set for a particular device 622 and a particular content item can be separately stored, managed, retrieved, and executed (played back) from the content item. Media appliance 610, during playback of the content, ensures synchronization between the content and the instructions such that results exhibited in the hardware devices 620 are coordinated with content playback.

[0068] In an aspect, media appliance 610 can operate multiple hardware devices 622 in parallel during playback of a specific content item. For instance, a user can utilize a subset of the set of hardware devices 620 supported in connection with playback of a content item. The media appliance 610 can generate, retrieve, or otherwise acquire instruction sets associated with the content item. The instruction sets include a plurality of instruction sets respectively associated with the subset of hardware devices 620 utilized by the user. The media appliance 610 can transmit instructions or signals to the subset of hardware devices 620, based on the respective instruction sets, in synchronization with playback of the content item.
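Operating several hardware devices in parallel amounts to merging their per-device instruction sets into one time-ordered dispatch stream. A minimal sketch, assuming each instruction set is a time-sorted list of (time_flag, command) pairs (an illustrative format, not one fixed by the disclosure):

```python
import heapq

def merge_device_timelines(device_sets):
    """Merge per-device instruction sets into a single dispatch timeline
    of (time_flag, device_id, command) tuples ordered by time flag, so
    all devices stay coordinated with the same content playback."""
    streams = [
        [(t, dev, cmd) for t, cmd in insts]
        for dev, insts in device_sets.items()
    ]
    return list(heapq.merge(*streams))
```

An appliance in the role of media appliance 610 would walk this merged timeline during playback, sending each command to its device when the content reaches the corresponding time flag.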

[0069] Turning to FIG. 7, illustrated is a non-limiting, exemplary embodiment of media appliance 610 according to one or more aspects. Media appliance 610 includes a processor 611 configured to execute computer-executable instructions such as instructions composing application 618. Such computer-executable instructions can be stored on one or more computer-readable media including a non-transitory, computer-readable storage medium such as memory 617 of media appliance 610. The memory 617 can also store content 619 retrieved or downloaded from portal server 630 or content providers 640, and/or instruction sets 621 retrieved from a separate location (separate storage of portal server 630 or other remote storage) or generated by application 618.

[0070] Media appliance 610 includes a first communication interface 612 and a second communication interface 613. As schematically depicted and described herein, a "communication interface" refers to a logical interface through which communication between at least two entities is established and conducted. The communication interface incorporates an address, identifier, frequency, etc. to which transmission can be directed for reception by the entity utilizing the interface. The address, identifier, or frequency may also serve to identify an origin for transmission from the interface. As a logical interface, the communication interface can include one or more protocols enabling the communication. These protocols can be layered (e.g., according to the OSI model) as one of ordinary skill in the art would appreciate. Further, these protocols can vary depending on a medium of transmission. For example, the communication interface can utilize a wired or wireless medium. To this end, as utilized herein, the communication interface also includes physical interfaces and transmit/receive processing chains to implement the communication of the medium. For example, the communication interface can include physical wired or wireless interfaces such as, but not limited to, a USB interface, a serial interface, a WiFi interface, a short-range RF interface (Bluetooth), an infrared interface, a near-field communication (NFC) interface, an Ethernet interface, a fiber optic interface, a cellular radio interface, a satellite interface, etc.

[0071] According to one example, first communication interface 612 is utilized by the media appliance 610 to communicate with portal server 630, content providers 640, and/or another entity storing instruction sets. Accordingly, first communication interface 612 can be a network or Internet interface. Second communication interface 613 can be utilized by the media appliance 610 to communicate with hardware devices 620. Accordingly, the second communication interface 613 can be a short-range RF interface (e.g. Bluetooth), a wired interface (e.g. USB), or a combination thereof (i.e., capable of either, so as to enable use while charging, for example).

[0072] While shown in FIG. 7 as separate communication interfaces, it is to be appreciated that the first communication interface 612 and the second communication interface 613 can be a single communication interface. That is, the first communication interface 612 and the second communication interface 613 can overlap in terms of protocol or physical interface usage. For example, the first communication interface 612 can employ an IP-based communication via WiFi to communicate with portal server 630. The second communication interface 613 can also utilize an IP-based WiFi communication with hardware devices 620. That is, the media appliance 610 can communicate with hardware devices 620 via a local wireless network. A single physical wireless adapter can be employed to conduct both communications.

[0073] Media appliance 610 further includes a user interface 614 configured to obtain input from a user and output display information to the user. User interface 614 can include a display interface 615 and an input interface 616. In one example, the display interface 615 can include a digital visual interface (DVI), a high-definition multimedia interface (HDMI), an optical audio interface, a video graphics array (VGA) interface, or substantially any audio/video interface. Accordingly, the media appliance 610 can be coupled to a display, speakers, an entertainment system, an audio/visual receiver, a television, etc. via the display interface 615. Input interface 616 can include a wired or wireless adapter for an input device such as a keyboard, a pointing device (e.g. mouse, touchpad), or a remote control. Further, it is to be appreciated that the user interface 614 can be provided by a single device. For instance, a touch display or a touchscreen of a mobile device serves to display output and capture input.

[0074] Application 618 is configured to initiate communication with portal server 630 and/or content providers 640 to obtain information related to available content. Application 618 can also authenticate a user to verify access to the portal server 630 is authorized. Application 618 can display the available content as channels or other groups for selection by the user. Application 618 further facilitates selection, purchase, download, or streaming of content from among the available content.

[0075] In a further aspect, application 618 performs playback of content 619. As discussed herein, playback of content 619 can involve interactions with hardware devices 620. Accordingly, the application 618 can detect available hardware devices 620, i.e. devices paired with media appliance 610 or otherwise active and accessible in an environment of media appliance 610. Having identified available hardware devices 620, application 618 retrieves appropriate instruction sets 621 corresponding to content 619 and suitable for the hardware devices 620 detected. Application 618 synchronizes the instructions with playback of content 619 as described herein.

[0076] Turning to FIG. 8, an exemplary, non-limiting method for retrieving content via a service is depicted. The method can be performed by the media appliance 610, for example. At 800, a user is authenticated with a service provided via a media appliance. At 802, available content associated with the service is retrieved. For instance, the media appliance can communicate with a portal server delivering web-based information. Alternatively, the media appliance can communicate with content providers directly and aggregate respective content available from disparate providers. At 804, a selection of content is obtained from the user and a license for playback of the content selected is acquired.

A.I. Based Instruction Set Generation

[0077] FIG. 9 illustrates a method for playback of content via the media appliance in accordance with one or more aspects. At 900, one or more hardware devices communicatively coupled to the media appliance are discovered. At 902, a request to initiate playback of a content item is received. At 904, instruction sets are obtained. The instruction sets correspond to the content item and the hardware devices coupled to the media appliance. At 906, playback of the content item is performed in synchronization with execution, by the hardware devices, of instructions from the instruction sets.

[0078] An advantageous implementation of the present invention involves a method and system for generating the hardware instruction sets. The system may comprise the web infrastructure as set forth above, further comprising an Artificial Intelligence (A.I.) engine, such as TensorFlow or any other suitable engine. The A.I. engine is trained to generate media context instruction sets that correspond to audio and video content of a multimedia event. Such media context instruction sets are generally hardware-agnostic instruction sets that characterize a multimedia file or stream.

[0079] For example, a media context instruction set may comprise a series of scene context designators, scene intensity values, and time stamps to correlate a set of scene context designators and scene intensity values to a particular frame or set of frames of a media event. More than one set of scene context designators and intensity values may be provided for a given time stamp. Additional characterizations may be made. Media metadata may also be included, such as scene name, scene type, website, director, actors, or other contextual information about the media. The scene context designators may comprise designations of objects that are the source of, or recipient of, action in a scene as opposed to objects that are merely present in a scene.

[0080] Scene context designations may also comprise a type of action occurring in a scene. For example, in the context of a video game, a scene context designation may comprise Hand (object), Door (object), Push (action). Scene context designators may also designate a point of interest viewed by an actor or other participant in a video. Scene intensity may be valued on a scale of one to three, one to ten, one to 100, or any other appropriate scale. Scene intensity and scene context designations may be based on video, audio, or both.
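As an illustration only (the disclosure does not fix a file format or schema), a media context instruction set of the shape described in paragraphs [0079] and [0080] could be modeled as:

```python
from dataclasses import dataclass, field

@dataclass
class ContextEntry:
    time_stamp: float   # position in the media event, in seconds
    designators: list   # e.g. [("Hand", "object"), ("Door", "object"), ("Push", "action")]
    intensity: int      # scene intensity, e.g. on a 1-to-10 scale

@dataclass
class MediaContextInstructionSet:
    media_id: str
    metadata: dict = field(default_factory=dict)  # scene name, director, actors, ...
    entries: list = field(default_factory=list)   # more than one entry may share a time stamp
```

All field names here are assumptions; the point is that the set is hardware-agnostic and keyed by time, with the designators and intensity values carrying the scene characterization.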

[0081] Referring to Figure 10, an A.I. Engine may be trained to generate media context instructions as follows. In step 1012, a media event, such as a video/audio event, is reviewed by a person, and training sets are manually prepared. That is, a human creates a series of scene context designations and intensity values by reviewing frames of video and/or listening to audio. The training sets need not be limited to these values.

[0082] In step 1014, the media event is converted into a form that can be fed into the A.I. Engine. In step 1016, the converted media event and the corresponding training media context instructions are fed into the A.I. Engine. In step 1018, the A.I. Engine discerns patterns in the video and/or audio, correlates the patterns with the manually-prepared scene context designations and scene intensity values, and "learns" how to predict scene context designations and scene intensity values. This training process may be repeated with a variety of training sets and corresponding media events to improve the A.I. Engine's predictive abilities. An advantage provided by this training process is that it avoids the need to teach the A.I. Engine to recognize a predetermined set of objects or actions and, indeed, avoids limitations inherent in even having a predetermined set of recognizable objects or actions. In this way, the A.I. Engine of the present invention can make use of the entire data set that is available, and is more flexible than previously known systems.
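The training loop of steps 1012 through 1018 can be caricatured with a toy model. This pure-Python stand-in assumes a single scalar feature per frame (say, audio loudness) and a numeric intensity label; a real engine such as TensorFlow would learn from the full converted audio/video:

```python
def train_intensity_model(samples, epochs=500, lr=0.05):
    """Fit intensity ~= w * feature + b by stochastic gradient descent
    over (feature, manually_labeled_intensity) pairs -- the manually
    prepared training sets of step 1012 driving the pattern learning
    of step 1018."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b
```

The returned parameters let the engine predict an intensity value for unseen frames, which is the essence of the "learning to predict" step; everything beyond that (scene context designators, multi-modal input) is omitted from this sketch.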

[0083] Referring to Figure 11A, a process 1110a for using a trained A.I. Engine to produce media context instruction sets is illustrated. In step 1112, formatted, unpopulated media context code is generated. The code is similar in format to the training code of Figure 10, but has no assigned values. For example, the formatted code may have a structure to include scene context designators and scene intensity values, but the values would be place holders or have null values. In step 1114, the media event is converted into an A.I.-readable format.

[0084] In step 1116, the formatted, unpopulated code and the converted media are fed into the trained A.I. Engine. The trained A.I. Engine then creates media context instructions in step 1118. In optional step 1120, a person may review the A.I. Engine-generated media context instructions, make adjustments, and feed the revised media context instructions back into the A.I. Engine to improve the training of the A.I. Engine. In step 1122, the media context instructions may be converted into hardware-specific instructions. In optional step 1124, user input may be provided before or after conversion of the media context instructions to hardware-specific instructions. The hardware-specific instructions may then be converted to haptic instructions for delivery to local hardware 200, as described above.

[0085] The converted media may comprise media from a plurality of sources, or multiple perspectives of a given media event, such as a multiplayer game. Multiple sets of media context instructions may be produced by the A.I. Engine corresponding to the different perspectives, or one set of media context instructions may be produced comprising a composite set of instructions synthesizing inputs from the different sources and/or perspectives.

[0086] Additional methods for improving the training of the A.I. Engine include incorporating haptic device feedback and neural network self-learning. In the case of haptic device feedback, a local hardware device may be configured for two-way communication. The hardware device may communicate back to the local software, and eventually to the Web Infrastructure, express user feedback (for example, user ratings) or implied user feedback (for example, feedback from sensors detecting how a user is using a local hardware device). The A.I. Engine may incorporate this user-provided feedback to improve generated instruction sets. In the case of neural network self-learning, the A.I. Engine may be configured to self-learn pattern recognition and novelty detection. Novelty detection involves recognizing inputs that differ from those typically seen.

[0087] Figure 11B illustrates a variation of the process to generate media context instructions. In the illustrated process 1110b, partially-populated code may be prepared manually and included in the formatted code in step 1112b to be fed into the A.I. Engine in step 1116b. The A.I. Engine would then add additional information to complete the media context instructions, such as scene intensity values. This variation on process 1110a may be useful when training the A.I. Engine.

[0088] Figure 12 illustrates the hardware translation layer for converting the media context instruction sets to hardware instruction sets. Media context instructions 1212 are prepared by any of the methods described above. One or more hardware translation layers 1214 are also prepared. The hardware translation layer includes conversion factors to translate the media context instruction sets to device-specific hardware instructions. In one example, a hardware translation layer may comprise a look-up table that maps a given scene context designator and intensity value to a predetermined motor or actuator of a plurality of motors or actuators at a predetermined power level.
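For illustration, such a look-up-table translation layer might be sketched as follows; the context names, motor indices, and power levels are hypothetical values, not those of any actual device.

```python
# Hypothetical look-up table: maps (scene context designator,
# scene intensity value) to (motor index, power level) for one device.
TRANSLATION_TABLE = {
    ("impact", 1): (0, 30),    # motor 0 at low power
    ("impact", 4): (0, 127),   # motor 0 at full power
    ("rumble", 2): (1, 60),    # motor 1 at medium power
}

def translate(context, intensity):
    """Return (motor, power) for a media context instruction, or idle."""
    return TRANSLATION_TABLE.get((context, intensity), (0, 0))
```

A different hardware device would be supported simply by supplying a different table, leaving the media context instructions unchanged.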

[0089] Another example of a hardware translation layer moves beyond simple look-up tables to a more algorithmic translation process. For example, sequences of and/or combinations of media context designators and/or power values may be combined using Boolean logic or other means to generate hardware-specific instructions. The logic may operate on a set of co-occurring context and/or intensity values (e.g., Context A and not Context C), or on a sequence of context/intensity values occurring over a period of time (e.g., Context A → Context B → Context C).
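The two kinds of logic described above can be sketched as follows; the rule contents are illustrative assumptions chosen to mirror the examples in the text.

```python
# Sketch of an algorithmic translation layer (illustrative rules only).
def set_rule(active_contexts):
    """Fire on co-occurring contexts: Context A present, Context C absent."""
    return "A" in active_contexts and "C" not in active_contexts

def sequence_rule(history):
    """Fire when the last three contexts were A, B, C in that order."""
    return list(history[-3:]) == ["A", "B", "C"]
```

When a rule fires, the translation layer would emit the corresponding hardware-specific instruction for the target device.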

[0090] For certain hardware types having a limited number of motors or actuators, individual motors may be prioritized depending on scene context designations. More complex devices may have more scene context designators mapped to more motors concurrently. In some hardware types, motors may be controlled by instructing them to operate at a speed value between 0 and 127, or between 0 and 255. The hardware translation layer will map scene intensity values of 1-4, for example, into the addressable range of motor speed values of 0-127, 0-255, etc., or their hexadecimal equivalents.

[0091] The hardware translation step 1216 may smooth out transitions. For example, if scene intensity has a range of values of 0-4, and a particular device has a motor that may be controlled with a value of 0-127, a scene intensity value of 0 may correspond to a motor speed value of 0, a scene intensity value of 1 may correspond to a motor speed value of 30, a scene intensity value of 2 may correspond to a motor speed value of 60, and so on. When going from a scene intensity value of 1 to 2, the translator process may gradually transition motor speed values from 30 to 60. The resulting motor or actuator stimulus will be more like a waveform than a step function.
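The range mapping and smoothing described above can be sketched as follows; the linear mapping, ramp step count, and specific ranges are illustrative assumptions.

```python
# Sketch of the hardware translation step: map scene intensity (0-4)
# into a device's motor speed range (0-127 here), and smooth transitions
# so the stimulus resembles a waveform rather than a step function.
def intensity_to_speed(intensity, max_intensity=4, max_speed=127):
    """Linearly map a scene intensity value into the motor speed range."""
    return round(intensity * max_speed / max_intensity)

def ramp(start_speed, end_speed, steps=5):
    """Intermediate speed values for a gradual transition."""
    delta = (end_speed - start_speed) / steps
    return [round(start_speed + delta * i) for i in range(1, steps + 1)]

# e.g. ramp(30, 60, steps=3) yields [40, 50, 60] rather than
# jumping directly from 30 to 60.
```

A device addressed on a 0-255 scale would simply use `max_speed=255` with the same mapping.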

[0092] As illustrated in Figure 12, each media context instruction set may be used to generate a plurality of distinct hardware instruction sets. One or more of the hardware instruction sets may be generated with user preferences as an input to the hardware translation layer, or haptic instructions generated after the hardware translation layer may be further refined by the user preferences. The hardware instruction sets may be generated by the Web Infrastructure 120 and stored in the Database Infrastructure 140. The hardware instruction sets may then be accessed as described above. Additionally, the media context instructions may be stored in Database Infrastructure 140 and downloaded to local computer 100, and Local Software 310 may then convert the media context instructions to hardware instruction sets corresponding to local hardware devices 200 that are connected to the local computer.

[0093] Additionally, the Web Infrastructure 120 may generate media context instruction sets in real time for streaming media content such as a webcam stream or a video game stream, and download the media context instruction sets in real time. For example, the disclosed systems and methods are configured to provide tactile sensations for video games. When a video game is played, the video game controller provides instructions that the disclosed system may interpret into commands for one or more connected hardware devices. The instruction set generated by the video game controller may be pre-programmed or generated in response to user-triggered events within the video game. While a video game controller may generate instructions for a given hardware device (such as a device sold or licensed by the video game manufacturer), the disclosed system is configured to receive the instruction from the video game controller, and further configured to generate instructions for one or more hardware devices based upon the received instructions. In this manner, the disclosed system enables multiple different hardware devices to be used in combination with a given video game or other media event. For applications where low latency is not as important, a media buffer may be included to allow sufficient time to generate and synchronize media context instructions and/or hardware instructions to the media stream. Additionally, instruction sets generated in a real-time application may be stored in the database infrastructure for later retrieval and playback as described above.

[0094] In another embodiment, the A.I. Engine may be trained to generate hardware instruction sets directly, without generating media context instruction sets as an intermediate set of instructions. This may be useful when the A.I. Engine is primarily paired with a discrete set of hardware devices, such as in a video gaming context or sexually-oriented hardware devices.

[0095] While various embodiments of the disclosed system refer to specific wireless communication techniques, the system may be used with two or more communication techniques in the same appliance. For example, it may be desired to incorporate two or more of RF, WiFi, and Bluetooth into a given appliance in order to provide multiple connection methods. As one illustration, a media appliance in accordance with the present disclosure may connect to a server using a WiFi connection, and also be configured to communicate with hardware devices using either a proprietary RF signal or a Bluetooth signal, depending upon the requirements of the specific hardware device. The disclosed system may be particularly well suited for use with hardware devices, including certain virtual reality devices, such as the Oculus Rift, Sony VR, Steam VR, and HTC Vive.

[0096] Further aspects of the disclosed systems and methods are also disclosed that may be used individually or in combination with the illustrative embodiments previously discussed. In one embodiment, a haptic feedback system is disclosed that includes a processor coupled to a memory and at least one haptic feedback device in communication with the processor. The haptic feedback system may also include a presentation device adapted to present the media content to the user. The processor is configured to execute computer executable instructions, including synchronizing media content loaded on a first computer with instructions received from a second computer. As detailed in the embodiments above, the media content may be supported media content identified on a particular webpage. In one example, the processor is configured to identify the uniform resource locator of a particular webpage, and the URL is used to identify the media content for the purpose of determining whether the particular media content is supported by the system, i.e., whether the media content has a corresponding instruction set. In other examples, media content may be identified by a serial number or other unique indicia accessible to the system that may be correlated to an instruction set for the media content. In yet other embodiments, the system may interface with a media player and receive information identifying the media content from the media player, such as through a programmatic interface.
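The support check described above might be sketched as a simple catalog lookup keyed on the URL or other unique identifier; the catalog contents and function name below are hypothetical.

```python
# Hypothetical instruction-set catalog keyed by media identifier (URL,
# serial number, or other unique indicia). Entries are illustrative.
SUPPORTED_MEDIA = {
    "https://example.com/video/123": "instruction_set_123",
    "SN-0042": "instruction_set_0042",
}

def lookup_instruction_set(media_id):
    """Return the instruction-set id for supported media, else None."""
    return SUPPORTED_MEDIA.get(media_id)
```

Media content is "supported" in the sense of this paragraph when the lookup returns a corresponding instruction set.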

[0097] In another aspect, the haptic feedback system includes at least one haptic feedback device that includes a hardware identification. In embodiments, the system uses the hardware identification in combination with the identification of the media content to determine whether an instruction set exists for a given media content and haptic feedback device. In this manner, the presently disclosed systems and methods may provide instructions for multiple different haptic devices capable of use with a given media content. In addition, instructions for new or additional haptic devices may be added to the system, reducing the costs associated with supporting multiple devices. In yet other embodiments, the system may store multiple instruction sets associated with a given media content or haptic device, which are used in combination to generate the desired operation. Because the instructions are stored on a different computer than the media content, a wider variety of media and haptic devices may be supported than was possible with prior integrated systems that included haptic commands embedded in the media content.

[0098] The identification of the haptic device may occur at various times during operation of the system. In one example, when an internet browser is launched, the haptic device is automatically detected and a determination is made as to whether the device is supported. In other embodiments, a user is permitted to provide the identification of the haptic device. In yet other embodiments, the haptic device may be a plug-and-play device, which is automatically identified when connected to a computer. Other methods of identifying the haptic device are also possible and are contemplated for use with the presently disclosed systems and methods.

[0099] In various embodiments, the haptic feedback system synchronizes the media content with the instructions. The synchronization may be achieved using the current playback time of the media content as previously discussed, which may be accessed from the media player. In some embodiments, the media content may be parsed into scenes or frames of a defined duration, and the instructions synchronized to such scenes or frames as desired. Either prior to or after synchronization with the media content, the instructions are converted into one or more electrical signals used to control the at least one haptic feedback device. One of ordinary skill in the art will appreciate that the transformation of instructions into electrical signals may be performed in a variety of ways using conventional computing equipment, including processors and hardware drivers located in a computer, in a given haptic device, or allocated between multiple components. In any event, the at least one haptic device is configured to receive the one or more electrical signals and to provide physical stimulations to a user accessing the content in synchronization with the playback of the media content. In embodiments, the haptic device may be immediately responsive to the electrical signals, or alternatively, the haptic device may store the electrical signals as instructions to be applied when triggered by the haptic feedback system.
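Synchronization against the current playback time might be sketched as follows: given a time-sorted list of (time flag, instruction) pairs, select the instruction whose time flag most recently passed. The instruction format is an illustrative assumption.

```python
import bisect

def active_instruction(instructions, playback_time):
    """Return the instruction in effect at playback_time.

    `instructions` is a list of (time_flag, payload) tuples sorted by
    time flag; returns None before the first time flag.
    """
    times = [t for t, _ in instructions]
    i = bisect.bisect_right(times, playback_time) - 1
    return instructions[i][1] if i >= 0 else None

# e.g., a track with three time flags read against the media player's
# reported playback position:
track = [(0.0, "idle"), (2.5, "rumble"), (5.0, "impact")]
```

Polling the media player's playback time and re-evaluating this lookup keeps the haptic output aligned with the media, including after a user seeks within the content.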

[00100] The haptic feedback systems and methods presently disclosed may be applied in a wide range of applications. For example, the haptic feedback device may be a game controller adapted for use with an online video game system. The game controller may provide physical stimulation corresponding to actions occurring within the game content to enhance the user experience. In other embodiments, the haptic device may be a sexually oriented hardware device or other entertainment related device. In yet other embodiments, the haptic feedback system may be applicable to training or education applications, such as surgical training simulators that benefit from physical stimulations to simulate real life operating procedures. These and other applications will be apparent to persons of ordinary skill in the art and these examples are provided solely to help illustrate the breadth of applications for the haptic feedback systems and methods.

[00101] Numerous other modifications, equivalents, and alternatives, will become apparent to those skilled in the art once the above disclosure is fully appreciated. For example, while the disclosed embodiments are described within the context of a system used to augment a multimedia event for entertainment purposes, other embodiments may include systems used in the rehabilitation of patients suffering from, for example, reduced physical and/or sexual function such as that caused by the side-effects of diabetes, traumatic brain injuries, spinal cord injuries and/or prostate-related issues. It is intended that the following claims be interpreted to embrace all such modifications, equivalents, and alternatives where applicable.