

Title:
INTERFACE APPARATUS FOR SOFTWARE
Document Type and Number:
WIPO Patent Application WO/2010/068901
Kind Code:
A2
Abstract:
A system having plural input devices for sensing motion of a subject and controlling execution of a character in a video game is disclosed. The input devices can detect three-dimensional movement of portions of the player's body and provide data representative of the movements to a system executing the game. Each plural input device includes motion-sensing apparatus and can be attached to a location on the subject. Data from the plural input devices, e.g., representative of motion of any combination of the subject's hands, feet, knees, hips, elbows, and head can be used by a processing system to control the visual action of the video-game character. The player's movements in three-dimensional (3D) space allow the player to control an in-game character. The present invention provides an intuitive way to play a game by attaching one or more input devices directly to the body.

Inventors:
ERDMANN BRIDGET (US)
Application Number:
PCT/US2009/067726
Publication Date:
June 17, 2010
Filing Date:
December 11, 2009
Assignee:
GIZMO6 LLC (US)
ERDMANN BRIDGET (US)
International Classes:
G06F3/01
Domestic Patent References:
WO2008061023A22008-05-22
Foreign References:
US20050282633A12005-12-22
US20080242415A12008-10-02
US20020077189A12002-06-20
Attorney, Agent or Firm:
LANZA, John, D. (Hall & Stewart LLP, Two International Place, Boston MA, US)
Claims:
CLAIMS

What is claimed is:

1. An input device for providing data representative of motion of the input device comprising: a housing having a top portion and a bottom portion; plural eyelets disposed in the housing; a tether adapted to be threaded through at least two of the plural eyelets; and a motion sensing device disposed within the housing wherein the tether is configurable via the plural eyelets to securely attach the input device to a location on a subject and the motion sensing device produces data representative of three-dimensional motion of the input device.

2. The input device of claim 1 further comprising a pad or cushion disposed on the bottom portion, wherein the pad or cushion contacts the location on the subject.

3. The input device of claim 1, wherein the bottom portion has a concave shape.

4. The input device of claim 1, wherein the input device remains at the location with substantially no slipping when attached to the location by the tether.

5. The input device of claim 1, wherein the tether comprises a cord, elastic cord, or strap.

6. The input device of claim 1 further comprising a cinch or clip for tightening the tether.

7. The input device of claim 1, wherein the motion sensing device comprises an accelerometer or a microelectromechanical system gyroscope or a combination thereof.

8. The input device of claim 7, wherein the accelerometer or gyroscope is a multi-axis motion sensor.

9. The input device of claim 1 further comprising a user-interface button disposed on the top portion.

10. The input device of claim 1 further comprising a wireless communication transmitter and receiver disposed within the housing.

11. The input device of claim 10, wherein the produced data representative of three-dimensional motion is provided via a wireless communication link to an electronic processing system having executable instructions in operation thereon.

12. The input device of claim 11, wherein motion of the input device alters the execution of the executable instructions.

13. The input device of claim 11 adapted to operate in conjunction with plural input devices as claimed in claim 11.

14. The input device of claim 11 adapted to operate in conjunction with a remote controller configured to operate the electronic processing system.

15. A system comprising: plural input devices as claimed in claim 1; an electronic processing system configured to receive data representative of three-dimensional motion from each of the plural input devices and alter the execution of executable instructions in operation on the electronic processing system based on the received data representative of three-dimensional motion; and a remote controller configured to operate the electronic processing system having the executable instructions in operation thereon, wherein the plural input devices and remote controller cooperatively alter the execution of executable instructions in operation on the electronic processing system.

16. The system of claim 15, wherein the plural input devices sense different motions at different locations on the subject.

17. The system of claim 15, wherein data from the plural input devices is processed by the electronic processing system to control a subject represented visually by the electronic processing system.

18. The system of claim 15, wherein the executable instructions comprise a video game in operation on a video gaming console or personal computer.

19. The system of claim 15, wherein the executable instructions comprise an interactive fitness video in operation on a video gaming console or personal computer.

20. The system of claim 15, wherein the executable instructions comprise an interactive instructional video in operation on a video gaming console or personal computer.

21. The system of claim 15, wherein each of the plural input devices further comprises a battery to power the input device, a power switch to turn the input device on or off, and a light to indicate that the input device is in a power-on state.

Description:
INTERFACE APPARATUS FOR SOFTWARE

CROSS-REFERENCE TO RELATED U.S. APPLICATIONS

[0001] The present continuation-in-part application claims priority to U.S. provisional patent application No. 61/121,895 filed on December 11, 2008, which is incorporated herein by reference.

FIELD OF THE INVENTION

[0002] The present disclosure relates to a user interface apparatus. In particular, the present disclosure relates to apparatus for controlling a software application, such as a software entertainment product.

BACKGROUND OF THE INVENTION

[0003] Since their introduction, video games have become increasingly visually sophisticated. In a typical modern video game, players control the movement and behavior of game characters that appear to be three-dimensional. Game players navigate these characters through three-dimensional environments to position a character at a particular location in the environment, solve problems posed by, or discover secrets hidden in, the environment, and engage other characters that may be controlled either by the game engine or by another game player. Despite increasingly realistic worlds and increasingly realistic effects on the environment caused by the character, user input to these games is still limited to input sequences that a game player can generate entirely with fingers and thumbs through manipulation of a gamepad, a joystick, or keys on a computer keyboard.

[0004] Perhaps because of the inherent limitation of these traditional input devices, other input devices have begun to appear. One particular example is the Wii Remote, the primary controller for the Nintendo Wii console, which allows the user to interact with and manipulate items on screen via movement and pointing through the use of accelerometer and optical sensor technology. The Wii Remote, however, is designed for use only with the hands. The Wii Balance Board is similar to a household body scale with a plain white top and light grey bottom and contains multiple pressure sensors that are used to measure the user's center of balance only. It cannot tell when the user's feet move in 3D space.

[0005] Another particular example is a camera manufactured by Sony Corporation for the PlayStation 2 game console and sold under the trade name EyeToy. This peripheral input device has enabled a number of "camera-based" video games, such as the twelve "mini-games" shipped by Sony Corporation for the PlayStation 2 under the trade name EyeToy:Play. In each of the twelve mini-games included on EyeToy:Play, an image of the game player is displayed on screen and the player engages in gameplay by having his image collide with game items on the screen. However, these games suffer from the drawback that, since a video image of the player is inherently "flat," these games are typically restricted to comparatively shallow and simplistic two-dimensional gameplay. Further, since these games directly display the image of the game player on the screen, game play is limited to actions the game player can physically perform.

BRIEF SUMMARY OF THE INVENTION

[0006] The present invention can provide an intuitive way for a user to interact with executable instructions in operation on an electronic processing system, e.g., a video game or other interactive software operating on a processing system. In some embodiments, the present invention provides an intuitive way to play a game by attaching one or more input devices directly to the body. The player's movements in three-dimensional (3D) space allow the player to control an in-game character. The input devices can detect three-dimensional movement of portions of the player's body and provide data representative of the movements to a system executing the game. An input device of the present invention can be adapted to operate in conjunction with plural similar input devices and/or in conjunction with a remote controller configured to operate the electronic processing system.

[0007] In certain embodiments, each input device comprises a housing having a top portion and a bottom portion, plural eyelets disposed in and/or on the housing, a tether adapted to be threaded through at least two of the plural eyelets, and a motion sensing device disposed within the housing. The input device can provide data representative of motion of the input device. In various embodiments, the tether is configurable via the plural eyelets to securely attach the input device to a location on a subject and the motion sensing device produces data representative of three-dimensional motion of the input device.

[0008] Also disclosed is a system comprising plural input devices as described above, and an electronic processing system configured to receive data representative of three-dimensional motion from each of the plural input devices and alter the execution of executable instructions in operation on the electronic processing system based on the received data representative of three-dimensional motion. The system further comprises a remote controller configured to operate the electronic processing system having the executable instructions in operation thereon, wherein the plural input devices and remote controller cooperatively alter the execution of the executable instructions in operation on the electronic processing system. In various embodiments, the plural input devices sense different motions at different locations on the subject.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] The foregoing and other objects, aspects, features, and advantages of the disclosure will become more apparent and better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:

[0010] FIG. 1 is a perspective view of a device for interacting with a software product;

[0011] FIG. 2 is an exploded diagram of the apparatus of FIG. 1;

[0012] FIG. 3 depicts an embodiment of the electronics and logic circuits of the device 100;

[0013] FIGS. 4A and 4B are block diagrams depicting embodiments of computer systems useful in connection with the present invention;

[0014] FIGS. 5A-5D are perspective views depicting ways of affixing the apparatus of FIG. 1 to an extremity of a user; and

[0015] FIGS. 6A-6K are screen shots showing environments in which the apparatus may be used.

DETAILED DESCRIPTION OF THE INVENTION

[0016] Referring now to FIG. 1, an embodiment of a device for interacting with a software product, and in brief overview, a device 100 includes a casing 102, an attachment string 104, and a cinch or clip 106. The device 100 may attach to various parts of a user's body, such as the feet, hands, hips, or head. The device 100 may be attached in pairs or as a single unit. In some embodiments the device 100 includes an on/off switch for turning the device on. In other embodiments, the device uses a sensor to determine when a user desires to use the device and, in response to a signal from the sensor, turns the device 100 on.

[0017] Still referring to FIG. 1, and in greater detail, the device 100 includes a casing 102. In some embodiments, the casing is shaped to fit snugly on a specific body part. For example, the casing 102 may have a slight concavity in order to fit snugly over the top of a foot. In some of these embodiments, the casing 102 includes additional attachment materials, such as soft cushioning material. In other embodiments, the device 100 includes a ring element suitable for a finger, through which the string 104 runs, to help secure the device 100 to a player's hand. The casing 102 can be made of plastic, or it may be manufactured from any type of rigid, lightweight material.

[0018] The attachment tether or string 104 can be made of stretchable material. In some embodiments, the string 104 is made of elastic cord, plastic, fabric or leather. As shown in FIG. 1, the string 104 can be a single piece that loops through eyelets 108 on the casing of the unit. Although the embodiment shown in FIG. 1 provides two eyelets 108, any number of eyelets 108 may be provided so that the string 104 can hold the device 100 snugly against the user's body. In certain embodiments, the number and placement of eyelets 108 may be chosen based on the particular body part on which the device 100 is intended for use. In various embodiments, the tether 104 is reconfigurable on the device 100, e.g., can be fed through different sets of eyelets, so that the input device 100 can be attached securely, e.g., with substantially no slippage, to a selected location on a subject, e.g., hand, foot, knee, ankle, hip, head, etc. In some embodiments, the input device 100 includes one or more cushions or pads (not shown in the drawings) on a surface which contacts a location on a subject. A cushion or pad can improve comfort of the attached input device, and assist in preventing slipping of the device.

[0019] The device 100 also includes an adjustable toggle 106. As shown in FIG. 1, the device may include a single clip 106. Alternatively, multiple clips 106 may be provided. The clip 106 allows the string 104 to be shortened or lengthened to fit. In the embodiment shown in FIG. 1 the clip 106 is provided as plastic with a spring or a sliding adjustable clip.

[0020] FIG. 2 depicts an exploded view of one embodiment of the device 100. As shown in FIG. 2, the device includes a casing 102 comprising a top portion or shell 102A and a bottom portion or shell 102B. In some embodiments, the casing 102 is elongate in shape and small in size, e.g., oval or rectangular in shape with a maximum length less than about six inches, less than about four inches, less than about three inches, less than about two inches, and yet in some embodiments, less than about one inch. The width and height of the casing 102 can be equal to or less than the length of the casing. In various embodiments, the height of the casing is a fraction of the casing's length, e.g., between about 0.5 and about 0.1 of the casing's length. In some embodiments, the bottom shell 102B has a flat surface which contacts a location on a subject when the input device is attached to the subject. In some embodiments, the bottom shell 102B has a concave surface which contacts a location on a subject. In various embodiments, the casing 102 encompasses a computer board 200 comprising an accelerometer 202, a battery 204, and a wireless communication link 206.

[0021] As shown in FIG. 2, the eyelets 108 for the casing 102 may be provided by the top shell 102A. In alternative embodiments, the eyelets 108 may be provided by the bottom shell 102B. In still other embodiments, the eyelets 108 may be provided by the interlocking of the top shell 102A and the bottom shell 102B. FIG. 2 also shows an alternative eyelet 108 configuration from that of the embodiment shown in FIG. 1.

[0022] As shown in FIG. 2, the computer board 200 includes a three-axis accelerometer 202, a battery 204, and a wireless communication chip 206. The three-axis accelerometer may be provided as any one of a number of chips, including the Freescale RD3473MMA7360L, manufactured by Freescale Semiconductor of Austin, Texas. Output from the accelerometer 202 is provided to the wireless communication chip 206 for transmission to a game console. As used in this document, the term "game console" means any machine that executes a software program and accepts input from the device, including a personal computer, the XBOX line of game consoles manufactured by Microsoft Corp. of Redmond, Washington, the PlayStation, PlayStation2 and PlayStation3 consoles manufactured by Sony Corp. of Japan, the GameCube and Wii consoles manufactured by the Nintendo Corp. of Japan, and a variety of portable gaming devices such as the PlayStation Portable, GameBoy, GameBoy Advance, Nintendo DS, and the N-Gage, manufactured by Nokia Corp. of Finland.

[0023] In certain embodiments, the computer board 200 includes one or more microelectro-mechanical system (MEMS) gyroscopes configured to provide information about motion of the device 100. The MEMS gyroscopes can be single or multiple-axis gyroscopes. An example of a single-axis gyroscope that can be used with the device 100 is the ADXSR401 available from Analog Devices of Norwood, Massachusetts. An example of a dual-axis gyroscope that can be used with the device is the IDG-500 available from InvenSense, Inc. of Sunnyvale, California. An example of a three-axis gyroscope that can be used with the device is the LYPR540AH gyroscope available from STMicroelectronics of Geneva, Switzerland. In various embodiments, the one or plural gyroscopes provide data representative of motion of the device 100.

[0024] The battery 204, in one embodiment, is a non-rechargeable lithium-ion coin cell. In other embodiments the battery may be rechargeable. In certain embodiments, the battery 204 features a CR2477 form factor. In some embodiments, the battery 204 comprises one or more rechargeable or non-rechargeable batteries, e.g., one or more AA batteries, one or more AAA batteries. In certain embodiments, the battery 204 is housed in a battery compartment within the device 100.

[0025] Referring now to FIG. 3, an embodiment of electronic components and logic circuits 300 for the inventive input device 100 is shown. In various embodiments, the input device 100 comprises a power supply 204 and power feed to various electronic components on the input device. The power feed can comprise a power controller 310 and one or more power buses 312. The power controller 310 can regulate and/or convert supplied power. For example, the power controller 310 can apply or terminate power after receiving a power-on or power-off signal from a power switch 305. The power controller can, in some embodiments, terminate applied power automatically after receiving a "device-inactive" signal, e.g., a lack of motion sensor data for a selected period of time.
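The automatic power-off behavior described above can be sketched as a simple inactivity timer. The class name, the 60-second timeout, and the explicit-clock interface below are illustrative assumptions for the sketch; the patent specifies only that power is terminated after motion data is absent for a selected period:

```python
import time

class PowerController:
    """Illustrative sketch of the power controller's auto-off logic:
    cut power when no motion-sensor data arrives within an assumed
    60-second inactivity window."""

    def __init__(self, inactivity_timeout_s=60.0):
        self.inactivity_timeout_s = inactivity_timeout_s
        self.powered = False
        self._last_sample_time = None

    def power_on(self, now=None):
        # Corresponds to receiving a power-on signal from the switch.
        self.powered = True
        self._last_sample_time = now if now is not None else time.monotonic()

    def on_motion_sample(self, now=None):
        # Each motion-sensor sample counts as device activity.
        self._last_sample_time = now if now is not None else time.monotonic()

    def check(self, now=None):
        # Terminate applied power if the device has been inactive too long.
        if not self.powered:
            return False
        now = now if now is not None else time.monotonic()
        if now - self._last_sample_time > self.inactivity_timeout_s:
            self.powered = False
        return self.powered
```

Passing `now` explicitly keeps the sketch testable; firmware would instead read a hardware tick counter.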

[0026] The input device can further comprise a status indication light 110, a motion sensor 202, a user-interface button 112, and a communication link 206. The status indication light can indicate the power status and/or operational status of the input device 100. Various types of statuses can be indicated via blinking codes and/or color change of the light. The motion sensor 202 can comprise one or more single or multi-axis accelerometers. In some embodiments, the motion sensor 202 comprises a three-axis digital accelerometer chip ADXL330, or the newer version ADXL345, available from Analog Devices, Inc. of Norwood, Massachusetts. In some embodiments, the motion sensor comprises one or more single or multi-axis MEMS gyroscopes.
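Firmware reading such an accelerometer typically converts raw ADC counts into acceleration in g before transmission. The conversion below is a generic sketch: the zero-g offset and counts-per-g sensitivity are made-up values for a hypothetical 10-bit part, not figures from the ADXL330/ADXL345 datasheets:

```python
def counts_to_g(raw, zero_g_counts=512, counts_per_g=102.3):
    """Convert a raw ADC reading from one accelerometer axis to g units.

    The zero-g offset (512 counts, mid-scale of a 10-bit converter) and
    sensitivity (102.3 counts/g) are illustrative assumptions only.
    """
    return (raw - zero_g_counts) / counts_per_g

def read_sample(raw_xyz):
    """Map an (x, y, z) tuple of raw counts to accelerations in g."""
    return tuple(counts_to_g(axis) for axis in raw_xyz)
```

At rest on a flat surface, such a sensor would report roughly (0, 0, 1) g, with the z axis carrying gravity.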

[0027] The user-interface button 112 can comprise a push button which can be used to manually enter data via the input device. For example, the button can be depressed once to initiate a pause of executable instructions in operation on an electronic processing system. The button could be depressed twice to activate a menu, and depressed and held to select an item from the menu. The user-interface button can be disposed on the top portion of the input device or on a side or edge portion.

[0028] The communication link 206 can comprise a wireless transceiver adapted to transmit and receive data via wireless means. For example, wireless transceiver 206 can comprise a wireless Bluetooth chip, such as the BCM22042 Bluetooth chip available from Broadcom Corporation of Irvine, California. In various embodiments, data representative of three-dimensional motion produced by the input device 100 is provided via the wireless communication link to an electronic processing system having executable instructions in operation thereon. For example, the data is provided wirelessly to a video gaming console or personal computer adapted to receive the wireless data representative of three-dimensional motion.
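Whatever the radio, the motion data must be serialized into frames before transmission. The layout below is purely hypothetical; the patent does not specify an over-the-air encoding, and this is not the Broadcom chip's protocol:

```python
import struct

# Hypothetical frame layout for one motion sample: an 8-bit device ID
# (so plural devices can share one link), a 16-bit sample counter, and
# three signed 16-bit axis readings. Little-endian throughout.
FRAME_FORMAT = "<BHhhh"

def pack_motion_frame(device_id, counter, x, y, z):
    """Serialize one motion sample into a byte frame for the radio."""
    return struct.pack(FRAME_FORMAT, device_id, counter, x, y, z)

def unpack_motion_frame(frame):
    """Recover (device_id, counter, x, y, z) on the console side."""
    return struct.unpack(FRAME_FORMAT, frame)
```

The device ID field illustrates how a console could demultiplex frames arriving from several body-mounted devices at once.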

[0029] Each of the electronic components can receive power directly from the power controller 310, or from a power bus 312. Each electronic component can receive or provide data to any of the other components directly or via a data bus 314.

[0030] FIGS. 4A and 4B depict block diagrams of a typical electronic processing system or computer 400 useful in connection with the present invention. For example, the block diagrams of FIGS. 4A and 4B can be representative of a personal computer or video gaming system, or elements thereof, which can have executable instructions in operation thereon. The input device 100 can be used to alter the execution of the executable instructions, e.g., control a character in a video game, control the execution of software on a computer, etc. As shown in FIGS. 4A and 4B, each computer 400 includes a central processing unit 402, and a main memory unit 404. Each computer 400 may also include other optional elements, such as one or more input/output devices 430a-430n (generally referred to using reference numeral 430), and a cache memory 440 in communication with the central processing unit 402.

[0031] In certain embodiments, the computer 400 can include an analysis engine. The analysis engine can receive data from one or more input devices 100 and process or analyze the received data to determine the motion of the one or more input devices. The determined motion data can be provided by the analysis engine to the computer 400 to alter the execution of executable instructions in operation on the computer, e.g., control a subject represented visually by the electronic processing system. In some embodiments, the executable instructions comprise a video game, and the determined motion data from the analysis engine controls the visually-displayed action of an in-game character.
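The analysis engine's role can be sketched minimally: keep a per-device rest baseline and decide whether fresh samples depart from it. The class shape, the "moved"/"still" vocabulary, and the 0.2 g threshold are illustrative assumptions, not a disclosed algorithm:

```python
class AnalysisEngine:
    """Minimal sketch of the analysis engine described above: it tracks
    a calibration baseline per input device and classifies incoming
    samples as movement or stillness."""

    def __init__(self, threshold_g=0.2):
        self.threshold_g = threshold_g   # assumed sensitivity
        self.baseline = {}               # device_id -> (x, y, z) at rest

    def calibrate(self, device_id, sample):
        # Record the at-rest reading for this device.
        self.baseline[device_id] = sample

    def classify(self, device_id, sample):
        # Compare each axis against the baseline; any large deviation
        # counts as motion the game layer can act on.
        base = self.baseline.get(device_id, (0.0, 0.0, 0.0))
        delta = max(abs(s - b) for s, b in zip(sample, base))
        return "moved" if delta > self.threshold_g else "still"
```

A real engine would classify *which* motion occurred (lean, jump, sweep) rather than a binary moved/still, but the baseline-and-threshold structure is the common starting point.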

[0032] In the present invention, a camera can be one of the input/output devices 430. The camera can capture digital video image data and transfer the captured video image data to the main memory 404 via the system bus 420. Various busses may be used to connect the camera to the processor 402, including a VESA VL bus, an ISA bus, an EISA bus, a MicroChannel Architecture (MCA) bus, a PCI bus, a PCI-X bus, a PCI-Express bus, or a NuBus. In these embodiments, the camera typically communicates with the local system bus 420 via another I/O device 430 which serves as a bridge between the system bus 420 and an external communication bus used by the camera.

[0033] Universal general-purpose desktop computers of the sort depicted in FIGS. 4A and 4B typically operate under the control of operating systems, which control scheduling of tasks and access to system resources. Typical operating systems include: MICROSOFT WINDOWS, manufactured by Microsoft Corp. of Redmond, Washington; MacOS, manufactured by Apple Computer of Cupertino, California; OS/2, manufactured by International Business Machines of Armonk, New York; and Linux, a freely-available operating system distributed by Caldera Corp. of Salt Lake City, Utah, among others.

[0034] FIGS. 5A-5D depict ways in which the device 100 may be affixed to a user's body. FIG. 5A shows a user having a pair of devices affixed to the hips; FIG. 5B shows a device affixed to a user's foot; and FIG. 5C shows a device affixed to a user's hand. FIG. 5D shows a pair of devices attached to a user's head.

[0035] In operation the following steps are taken by the device:

1) The player or user initiates a program, e.g., initiates operation of executable instructions on a programmable logic processor, and turns the device or plural devices 100 on.

2) The program auto-detects the one or plural devices and tells them to start sending acceleration data.

3) The program asks the user to stand still while it calibrates. Once the program detects that no movement has occurred for a second, the program records calibration values for each device.

4) The program is constantly receiving data from each device. When the player moves their feet, the values in the transmitted data change as well.

5) The program applies algorithms to the data received from the devices to classify the motion and decide whether the movement is something it is programmed to recognize.

6) When the program detects a move, it informs the game, which processes it as a user input.

[0036] In certain embodiments, the inventive input devices 100 are used in conjunction with an existing controller, e.g., with a Wiimote controller, to provide additional user-interactivity for a processing system having executable instructions in operation thereon. When used in conjunction, the input devices 100 can provide data representative of motion to the processing system substantially simultaneously with data provided by the existing controller to cooperatively control the execution of the executable instructions. For example, the input devices 100 can be attached to various locations of a user's body and can provide data which can be processed by an analysis engine and the processing system to realistically simulate the motion of an avatar in a video game. The input devices 100 can be provided as add-on equipment to enhance the functionality of certain video gaming systems.
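The stand-still calibration in step 3 of the operational sequence above can be sketched as a window scan over the sample stream. The 100 Hz sample rate and the 0.05 g jitter band are illustrative assumptions; the text specifies only that no movement occurs for a second before calibration values are recorded:

```python
def stillness_calibrate(samples, sample_rate_hz=100, jitter_g=0.05):
    """Scan a stream of (x, y, z) samples in g and return the first
    one-second window in which every axis stays within an assumed
    jitter band, averaged as the calibration value. Returns None if
    the user never held still."""
    window = sample_rate_hz  # one second of samples at the assumed rate
    for start in range(0, len(samples) - window + 1):
        chunk = samples[start:start + window]
        still = all(
            max(c[axis] for c in chunk) - min(c[axis] for c in chunk) <= jitter_g
            for axis in range(3)
        )
        if still:
            n = len(chunk)
            # Average the still window to get per-axis calibration values.
            return tuple(sum(c[axis] for c in chunk) / n for axis in range(3))
    return None
```

Subsequent samples would then be compared against the returned baseline (step 5) to classify player movement.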

[0037] The apparatuses described above can be used in tandem to track multiple parts of the game player's body. For example, the analysis engine may track the game player's head, hands, feet, torso, legs, and arms. Any combination of any number of these parts may be tracked simultaneously, that is, the analysis engine may track: head, hands, feet, torso, legs, arms, head and hands, head and feet, head and torso, head and legs, head and arms, hands and feet, hands and torso, hands and legs, hands and arms, feet and torso, feet and legs, feet and arms, torso and legs, torso and arms, legs and arms, head and hands and feet, head and hands and torso, head and hands and legs, head and hands and arms, head and feet and torso, head and feet and legs, head and feet and arms, head and torso and legs, head and torso and arms, head and legs and arms, hands and feet and torso, hands and feet and legs, hands and feet and arms, hands and torso and legs, hands and torso and arms, hands and legs and arms, feet and torso and legs, feet and torso and arms, feet and legs and arms, torso and legs and arms, head and hands and feet and torso, head and hands and feet and arms, head and hands and feet and legs, head and hands and torso and arms, head and hands and torso and legs, head and hands and arms and legs, head and feet and torso and arms, head and feet and torso and legs, head and torso and arms and legs, hands and feet and torso and arms, hands and feet and torso and legs, feet and torso and arms and legs, head and hands and feet and torso and arms, head and hands and feet and torso and legs, head and feet and torso and arms and legs, head and hands and feet and torso and arms and legs. 
[0038] This concept can be extended to nearly any number of points or parts of the game player's body, such as: hands, eyes, nose, mouth, neck, torso, shoulders, arms, elbows, forearms, upper arm, fingers, chest, stomach, waist, hips, legs, knees, thighs, shins, ankles, feet, and toes. In general, any number of parts of the player's body in any combination may be tracked.

[0039] However the location or motion of the player's body is determined, that information is used to control the behavior or movement of a game character. A large number of game character behaviors may be indicated by the location or movement of a part of the game player's body. For example, the motion of the player's hands may directly control motion of the character's hands. Raising the player's hands can cause the associated character to assume an erect position. Lowering the player's hands can cause the associated character to assume a crouched position. Leaning the player's hands to the left can cause the associated character to lean to the left or, alternatively, to the right. In some embodiments, leaning the player's hands to the left or right also causes the associated character to turn to the left or right. Similarly, motion of the player's hands may directly control motion of the character's hands and motion of the player's feet may directly control motion of the character's feet. That is, motion of hands and feet by the game player may "marionette" the game character, i.e., the hands and feet of the game character do what the hands and feet of the game player do.

[0040] The location or movement of various parts of the game player's body may also control a number of game character motions. In some embodiments, the player's hands cause "drag" to be experienced by the associated game character, slowing the velocity with which the game character navigates through the game environment. In some of these embodiments, the further the player's hands are positioned from the player's body, the more drag is experienced by the player's game character and the faster the velocity of the game character decreases. Extension of the player's hands in a direction may cause the game character to slow its progress through the game environment. In some of these embodiments, extension of the player's hands above the player's head causes deceleration of the game character. In others of these embodiments, extension of the player's hands in front of the player causes deceleration of the game character.
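The drag behavior above, in which velocity decays faster the farther the hands are from the body, can be sketched with a simple linear model. The coefficient, the 60 Hz timestep, and the linearity itself are assumptions for illustration; the patent describes only the qualitative relationship:

```python
def apply_drag(velocity, hand_distance_m, drag_coeff=0.8, dt=1 / 60):
    """One simulation tick of the hand-extension drag model: the
    character's velocity decays in proportion to how far (in meters)
    the player's hands are from the body. All constants are assumed."""
    decay = drag_coeff * hand_distance_m * dt
    # Velocity shrinks multiplicatively and never goes negative.
    return max(0.0, velocity - decay * velocity)
```

With the hands at the body (`hand_distance_m = 0`) the character keeps its speed; fully extended hands produce the fastest slowdown, matching the described behavior.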

[0041] In still other embodiments, the player's head position may control the speed with which a game character moves through the game environment. For example, lowering the player's head (i.e., crouching) may cause the game character to accelerate in a forward direction. Conversely, raising the player's head (i.e., assuming an erect position) may cause the game character to decelerate. The player's vertical posture may control the character's vertical navigation in the game environment (e.g. crouching steers in an upward direction and standing steers in a downward direction, or vice versa). The player's entire body leaning may cause the character's entire body to lean in the same, or the opposite, direction. A rapid vertical displacement of the player's head may trigger a jump on the game character's part.

[0042] In other embodiments, gestures made by the game player can trigger complex motions on the character's part. For example, the game player sweeping both arms clockwise may cause the game character to execute a spin (i.e. rotation about the axis running from the hands to the feet of the game character) in a clockwise direction and sweeping arms counter-clockwise may cause the game character to execute a spin in a counter-clockwise direction, or vice versa. In another embodiment, raising the player's arms causes the game character to execute a forward, or backward, tumble (i.e. rotation about an axis from the left side of the game character's body to the right side of the game character's body). In another embodiment, lowering the player's hands causes the game character to execute a forward, or backward, tumble. In still other embodiments, raising the game player's left arm while lowering the game player's right arm will cause the game character to roll (i.e., rotation about an axis from the front of the game character's body to the rear of the game character's body) in a counter-clockwise direction, or vice versa. In another embodiment, raising the game player's right arm while lowering the game player's left arm will cause the game character to roll clockwise, or vice versa.
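The raise-one-arm, lower-the-other roll gesture described above reduces to comparing the signs of the two arms' vertical velocities. The 0.5 m/s threshold is an illustrative assumption, and the direction assignment is just one of the alternatives the text permits ("or vice versa"):

```python
def classify_roll(left_arm_vz, right_arm_vz, threshold=0.5):
    """Map opposing vertical arm velocities (m/s, upward positive) to a
    roll command for the game character. Threshold and direction
    assignment are assumptions for this sketch."""
    if left_arm_vz > threshold and right_arm_vz < -threshold:
        # Left arm rising, right arm falling.
        return "roll_counter_clockwise"
    if right_arm_vz > threshold and left_arm_vz < -threshold:
        # Right arm rising, left arm falling.
        return "roll_clockwise"
    return None  # no recognized roll gesture
```

Analogous comparisons over horizontal sweep velocities would cover the clockwise/counter-clockwise spin gestures in the same paragraph.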

[0043] In some embodiments, gameplay can be broken down into two distinct modes: navigation and "rail-grinding". In "rail-grinding" mode, the player controls the game character riding the hoverboard on a narrow rail. If the player raises his head, the game character assumes an erect position on the hoverboard. If the player lowers his head, the game character crouches on the hoverboard. A rapid acceleration of the player's head in an upward direction causes the game character to execute a jump maneuver with the hoverboard. If the player leans to the right or left, i.e. displaces his head to the right or left, the game character leans to the right or left on the hoverboard. In this gameplay mode, the game character's hands track the movement of the game player's hands. This allows the player to make the game character reach out to slap targets or to grab game elements positioned near the rail on which the player causes the game character to ride.

[0044] In navigation mode, the player controls the game character to move through the game environment on the hoverboard. If the player raises his head, the game character assumes an erect position on the hoverboard and the game character's acceleration slows. If the player lowers his head, the game character crouches on the hoverboard and the game character's acceleration increases. A rapid acceleration of the player's head in an upward direction causes the game character to execute a jump maneuver with the hoverboard. If the player leans to the right or left, i.e. displaces his head to the right or left, the game character leans to the right or left on the hoverboard. In this gameplay mode, leaning to the right or left also causes the game character to turn to the right or left on the hoverboard. In this mode, movement of the game player's hands away from his body causes the game character to experience "drag," which slows the velocity of the game character on the hoverboard. In some embodiments, the further from the body the player positions his hands, the more drag the game character experiences. In one particular embodiment, holding the left hand away from the body while leaving the right hand near the body causes the game character to execute a "power slide" to the left. Similarly, holding the right hand away from the body while leaving the left hand near the body causes the game character to execute a "power slide" to the right. If the game player holds both hands away from his body, the game character is caused to slow to a stop.
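
The drag and power-slide behavior described for navigation mode can be sketched as follows. The distance threshold, the linear drag model, and all names are illustrative assumptions, not taken from the specification:

```python
def hand_drag(left_dist: float, right_dist: float, threshold: float = 0.4):
    """Classify navigation-mode hand gestures per paragraph [0044]:
    drag grows with hand distance from the body; asymmetric extension
    triggers a power slide; both hands extended brakes to a stop.
    Distances are in meters from the torso (an assumed convention)."""
    left_out = left_dist > threshold
    right_out = right_dist > threshold
    # Assumed linear model: total drag is the sum of the hand distances.
    drag = left_dist + right_dist
    if left_out and right_out:
        return drag, "brake"
    if left_out:
        return drag, "power_slide_left"
    if right_out:
        return drag, "power_slide_right"
    return drag, "none"
```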

[0045] In another example, the described systems and methods are used to provide a music rhythm game in which the controlled character is a musician. In one example, the controlled character is a guitarist and the player attempts to have the guitarist play chords or riffs in synchronicity or near-synchronicity with indications from the game that a chord or riff is to be played. The system tracks the location of the player's arms and hands, and the motion of the character's arms and hands tracks that of the player. Movement of the player's strumming hand causes the guitarist character to strum the virtual guitar and play chords. In some embodiments, the system can track the location of the player's chord hand both to adjust the location of the character's chord hand and to determine if a higher or lower chord should be played. Similarly, the player can cause the guitarist to execute "moves" during game play, such as windmills, etc.

[0046] In another example, the described systems and methods are used to provide a fantasy game. In one embodiment, the game player controls a wizard, whose arm motions follow those of the player. In these embodiments, the particular spell cast by the wizard is controlled by motion of the player's hands. Circular motion of the player's hands causes the wizard to move his hands in a circular motion and cast a spell shielding the wizard from damage. The player clapping his hands together causes the wizard to clap his hands to cast a spell crushing any other game characters in the wizard's line-of-sight. Raising one of the player's hands while lowering the other causes the wizard to do the same and cast a spell that makes all other game characters in the wizard's line-of-sight lose their balance. When the player rapidly moves his hands directly out from his body, the wizard casts a fireball spell in the direction in which the player stretched his hands.
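
One way to sketch the mapping from recognized hand gestures to spells is a simple dispatch. The gesture flags assume an upstream gesture recognizer; the dictionary keys, spell names, and priority order are illustrative assumptions, not part of the described system:

```python
def classify_spell(gesture: dict) -> str:
    """Map recognized hand gestures to the spells of paragraph [0046].
    The boolean flags are assumed outputs of an upstream recognizer."""
    if gesture.get("circular_motion"):        # circular hand motion
        return "shield"
    if gesture.get("hands_clapped"):          # clap hands together
        return "crush"
    if gesture.get("one_hand_up_one_down"):   # raise one hand, lower other
        return "unbalance"
    if gesture.get("rapid_outward_thrust"):   # thrust hands out from body
        return "fireball"
    return "none"
```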

[0047] In another embodiment, the system can be used to control a warrior in the fantasy game. In this embodiment, the player's hands are tracked to determine when and how the warrior swings, or stabs, his sword. The warrior's arm motions track those of the player. In some embodiments, the player may be provided with a prop sword to provide enhanced verisimilitude to the player's actions.

[0048] It will be appreciated in view of the foregoing description and following examples that the inventive devices 100 can be used for instructional or educational purposes. For example, one or more devices 100 can be affixed to a student to teach the student certain moves or motions for dance, martial arts, yoga, and the like.

Application Examples

[0049] FIGS. 6A-6K depict screenshots from various game genres with which the input device 100 may be used.

[0050] Referring now to FIG. 6A, an embodiment of a rhythm-action game, such as Donkey Konga, manufactured by Nintendo Corp. of Japan is shown. As is known with respect to rhythm-action games, a player is required to provide input at a specific time in rhythm with screen data. In the embodiment depicted in FIG. 6A, a player must "drum" on a "conga drum" when targets 602 are lined up with a target sight 604. In these embodiments, a player may affix devices 100 to her hands. Movement of the player's hands will be sent to the game as game input.

[0051] Referring now to FIG. 6B, an embodiment of a rhythm-action game, such as DDR, published by Konami Corp. of Japan is shown. A player must step on a dance mat on the floor and press the correct directional arrows at a specific time in rhythm with the screen data. In these embodiments, a player may affix devices 100 to her feet. Movement of the player's feet will be sent to the game as game input. Enhancements to this traditional game may include adding more devices to the hands and/or hips to create other ways to input data during dance sequences.

[0052] Referring now to FIG. 6C, an embodiment of a fighting game, such as Tekken, published by Namco Bandai of Japan is shown. A player must avoid being hit by the enemy player (another player or the computer) and try to damage the opponent by contact with the opponent's avatar utilizing their own in-game avatar. In these embodiments, a player may affix devices to her hands. Movement of the player's hands will be sent to the game as game input. Enhancements to this traditional game may include adding more devices to the feet, head, and/or hips to create other ways to input data during a fight sequence. For example, the system tracks the location and motion of the player's arms, legs, and head. In this example, the player can cause the game character to jump or crouch by raising or lowering his head. The player causes the game character to punch by rapidly extending his hands. Similarly, the player causes the character to kick by rapidly extending his legs.

[0053] Continuing with this example, the game character can be caused to perform "combination moves." For example, the player can cause the game character to perform a flying kick by raising his head and rapidly extending his leg at the same time. Similarly, the game character can be controlled to perform a flying punch by rapidly raising his head and rapidly extending his arm at the same time. In a similar manner, a sweep kick is performed by the character when the game player rapidly lowers his head and rapidly extends his leg at the same time.
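The combination-move logic of paragraphs [0052]-[0053] can be sketched as follows, assuming upstream tracking supplies the head's vertical velocity and limb-extension speeds. The threshold value and all names are illustrative assumptions:

```python
def fighting_move(head_vy: float, leg_speed: float, arm_speed: float,
                  fast: float = 2.0) -> str:
    """Detect single and combination moves: rapid head rise plus rapid
    leg extension -> flying kick; rapid head rise plus rapid arm
    extension -> flying punch; rapid head drop plus rapid leg
    extension -> sweep kick. 'fast' is an assumed speed threshold."""
    rising = head_vy > fast       # head moving up rapidly
    dropping = head_vy < -fast    # head moving down rapidly
    kicking = leg_speed > fast    # leg extending rapidly
    punching = arm_speed > fast   # arm extending rapidly
    if rising and kicking:
        return "flying_kick"
    if rising and punching:
        return "flying_punch"
    if dropping and kicking:
        return "sweep_kick"
    if kicking:
        return "kick"
    if punching:
        return "punch"
    return "idle"
```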

[0054] In the embodiment depicted by FIG. 6D, the described systems and methods are used to provide a boxing game. The system tracks the game player's head, hands, and torso. The game character punches when the game player punches. The player can cause the game character to duck punches by ducking, or to avoid punches by moving his torso and head rapidly to one side in an evasive manner.

[0055] Referring now to FIG. 6E, an embodiment of a sword fighting game, such as Star Wars: The Force Unleashed, published by LucasArts Entertainment Company Ltd. of California is shown. A player must use a sword to battle an opponent. In these embodiments, a player may affix devices 100 to her hands.

[0056] Referring now to FIG. 6F, an embodiment of a personal enhancement game, such as Yourself Fitness, published by responDESIGN is shown. A player would move their feet, hands, hips and/or head to do yoga or train in a gym. Movement of the player's body will be sent to the game as game input. Enhancements to this traditional game may include brain training games such as Brain Age, manufactured by Nintendo Corp. of Japan.

[0057] Referring now to FIG. 6G, an embodiment of a skateboard park simulation game, such as Tony Hawk, published by Activision Publishing of California is shown. A player moves their feet to control a board-type object similar to a skateboard, surfboard, or snowboard and earns points for speed and for accomplishing trick moves. In these embodiments, a player may affix devices 100 to her feet. Movement of the player's feet will be sent to the game as game input. Enhancements to this traditional game may include adding more devices to the hands and/or hips to create other ways to input data during skate or surf sequences.

[0058] Referring now to FIG. 6H, an embodiment of a sport-action game, such as Mario Tennis, published by Nintendo Corp. of Japan is shown. A player would toss a ball and swing a racket to volley the ball towards an opponent. In these embodiments, a player may affix devices 100 to her hands. Movement of the player's hands will be sent to the game as game input. Enhancements to this traditional game may include adding devices to the feet to input data during tennis game sequences.

[0059] Referring now to FIG. 6I, an embodiment of a licensed team sport game, such as Madden Football, published by Electronic Arts of California is shown. A player would throw passes and/or kick the football. In these embodiments, a player may affix devices 100 to her feet and hands. Movement of the player's feet and hands will be sent to the game as game input. Other titles could include FIFA Soccer, published by Electronic Arts of California, which would utilize the foot devices for kicking the soccer ball into the goal, depicted in FIG. 6J.

[0060] Referring now to FIG. 6K, an embodiment of a first person shooter game, such as Battlefield 1942, published by Electronic Arts of California is shown. A player would be able to maneuver the screen and the gun by moving the different parts of the body. In these embodiments, a player may affix devices 100 to her feet, hands, hips, and head. Movement of the player's body will be sent to the game as game input. In this example, the system tracks the location of the player's arms and the motion of at least one of the player's fingers. Motion of the player's arms causes the character to aim the sniper rifle. Similarly, a rapid jerking motion of the player's finger causes the on-screen sniper to fire the weapon.
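
The fire trigger described above (a rapid jerking motion of the finger) can be sketched as rising-edge detection on the finger's tracked speed, so that a single sustained fast motion fires once rather than continuously. The threshold value and the edge-detection scheme are illustrative assumptions:

```python
def fire_events(finger_speeds, jerk_threshold: float = 5.0):
    """Return the indices of samples at which a shot is fired: one event
    per rising edge where the finger speed crosses the jerk threshold
    (paragraph [0060]). Speeds are assumed sampled at a fixed rate."""
    firing = False
    events = []
    for i, speed in enumerate(finger_speeds):
        if speed > jerk_threshold and not firing:
            events.append(i)  # rising edge -> one shot
        firing = speed > jerk_threshold
    return events
```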