Title:
COMPUTER CONTROL SYSTEM
Document Type and Number:
WIPO Patent Application WO/2022/101604
Kind Code:
A1
Abstract:
A computer control system is disclosed comprising a first computer (301) having connected thereto a display device the first computer being programmed to display on the display device an array of command icons, the first computer having an associated eye-gaze detector and being configured in conjunction with the first computer and the display device to detect when and for what period of time a user's gaze is directed to any given command icon displayed on the display device. The first computer is further programmed to generate a command signal corresponding to a command icon when the period of time exceeds a predetermined value, a wireless communication device associated with the first computer wirelessly transmitting the command signals to an interface device (303) remote from the first computer. The interface device comprises a processor programmed to receive the command signals and to output to a second computer (302) signals emulating the output of a computer manual input device. A user not having use of his or her hands is thereby able to control another conventional computer or games console.

Inventors:
GILBERT DAVID PETER (GB)
WAGNER THOMAS (US)
Application Number:
PCT/GB2021/052734
Publication Date:
May 19, 2022
Filing Date:
October 21, 2021
Assignee:
PRETORIAN TECH LTD (GB)
International Classes:
G06F3/01; G06F3/0481
Foreign References:
US20170153701A1 (2017-06-01)
US20160225012A1 (2016-08-04)
Attorney, Agent or Firm:
LOVEN, Keith James (GB)
Claims:

CLAIMS

1. A computer control system comprising: a first computer having connected thereto a display device, the first computer being programmed to display on the display device an array of command icons; an eye-gaze detector configured in conjunction with the first computer and the display device to detect when and for what period of time a user’s gaze is directed to any given command icon displayed on the display device, the first computer further being programmed to generate a command signal corresponding to a command icon when the period of time exceeds a predetermined value; a wireless communication device associated with the first computer for wirelessly transmitting the command signals to an interface device remote from the first computer, the interface device comprising a processor programmed to receive the command signals and to output to a second computer signals emulating the output of a computer manual input device.

2. A computer control system according to Claim 1, wherein the processor of the interface device is programmed to output to the second computer signals emulating the output of a computer games controller.

3. A computer control system according to Claim 1 or 2, wherein the first computer is programmed to generate a first command signal corresponding to a command icon when the period of time exceeds a first predetermined value but does not exceed a second value and a second command signal if the period of time exceeds the second value.

4. A computer control system according to Claim 1, 2 or 3, wherein the wireless communication device transmits infra-red signals to the second computer.

5. A computer control system according to any preceding claim, wherein the interface device comprises a display and input means associated therewith, and the processor is programmed to receive signals via the input means to enable the emulation of the output to be adjusted to suit a particular user.

6. A computer control system according to Claim 5, wherein the interface device is configured to store a set of adjustments as a profile for the user.

7. A computer control system according to any preceding claim, comprising a plurality of first computers and a respective interface device for each first computer, each first computer being configured to transmit to the interface device an identifier, and the interface device being configured to process only received signals including the identifier for the respective first computer.

8. A computer control system according to any preceding claim, wherein the or each interface device is connected to the second computer through a wired connection.

9. A computer control system according to any of Claims 1 to 7, wherein the interface device is connected to the second computer through a wireless connection.

10. A computer control system according to Claim 9, wherein the interface device is connected to the second computer through an adaptive controller providing separate left hand and right hand control channels and the interface device is configured to provide signals to the adaptive controller emulating left hand and right hand input signals.

11. A computer control system according to Claim 10, wherein the interface device is configured to output the left hand and right hand input signals to the adaptive controller on separate respective wired connections.

Description:
COMPUTER CONTROL SYSTEM

Field of the Invention

[0001] This invention relates to a computer control system to enable one or more disabled persons to play computer games without the use of their hands.

Background to the Invention

[0002] Most computer games, whether they run on a computer or on a gaming console, require as input a device which generally comprises a plurality of joysticks (often referred to in this context as thumbsticks) and a plurality of buttons. Such gaming devices are variously referred to as gamepads, gaming handsets, gaming input devices or game controllers. Most have a layout similar to those shown in Figures 1a and 1b. The provision of multiple human input controls invariably means that both hands are needed to facilitate game play, which excludes many members of society who cannot use one or both hands.

[0003] In hitherto unrelated technological developments, Eye-Gaze systems have gained considerable popularity amongst disabled users in recent years, particularly when incorporated into a communication device. Such communication devices are frequently referred to as AAC (Assistive and Augmentative Communications) systems and are characterised in having a display upon which a variable matrix of cells may be visible, each containing individual words or phrases. Figure 2 shows a typical AAC device with a matrix visible on the display.

[0004] The user is able to navigate to the required word/phrase by moving his or her gaze and then holding that gaze on the required cell to make a selection. The system is thus able to speak the selected word or phrase and may generally be made to store a number of words or phrases to make up a sentence, or even long passages of speech, and then play them back in coherent sentences on demand by the user. Most AAC systems have the ability to change the particular matrix of words and phrases which is shown on the display, often in accordance with the context of the sentence being created from time to time.

[0005] In most cases AAC devices have the alternative facility to scan to the desired cell by using one or more switches which may be adapted to the user’s particular disability in many differing ways. This is sometimes used in situations where Eye-Gaze is either not available or not possible.
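As an illustration of the dwell-selection principle described above, the following minimal Python sketch shows how a stream of gaze samples might be turned into a cell selection once the gaze has rested on a single cell for a predetermined dwell time. All names and values here (GazeSample, DwellSelector, the 1.5-second threshold) are hypothetical illustrations and do not form part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    cell_id: str      # which matrix cell the gaze currently falls on (or None)
    timestamp: float  # seconds

class DwellSelector:
    """Selects a cell once the gaze has rested on it for `dwell_time` seconds."""

    def __init__(self, dwell_time: float = 1.5):
        self.dwell_time = dwell_time
        self._current_cell = None
        self._dwell_start = None

    def update(self, sample: GazeSample):
        """Feed one gaze sample; return the selected cell_id, or None."""
        if sample.cell_id != self._current_cell:
            # Gaze moved to a different cell: restart the dwell timer.
            self._current_cell = sample.cell_id
            self._dwell_start = sample.timestamp
            return None
        if (self._current_cell is not None
                and sample.timestamp - self._dwell_start >= self.dwell_time):
            selected = self._current_cell
            # Reset so the same cell must be dwelt upon again before it repeats.
            self._current_cell = None
            return selected
        return None

# Example: gaze rests on cell "A"; a selection fires once 1.5 s have elapsed.
selector = DwellSelector(dwell_time=1.5)
for i in range(20):
    chosen = selector.update(GazeSample(cell_id="A", timestamp=i * 0.1))
    if chosen:
        print("selected:", chosen)
```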

[0006] Some AAC devices also have facilities other than speech generation built into them, such that Eye-Gaze can be used in a wider context, for example to change the channel on a TV using infra-red communications in the same way as a TV remote control (a process usually referred to as Environmental Control), or to control a radio-controlled appliance such as a wall-socket so that a lamp may be turned on or off. Such facilities require that the AAC device has built into it an infra-red generating device, a radio transmitter or other similar wireless communications means. In order to facilitate Eye-Gaze control of environmental control devices, it is generally the case that the AAC device has stored within it a different matrix of cells which is relevant to the control of a TV or other Environmental Control device, and these matrices are called up to the display on demand by the user, using Eye-Gaze or switch scanning techniques.

[0007] To date, Eye-Gaze has not been applied to the stated problem of disabled players being unable to use gaming controllers.

Summary of the Invention

[0008] The invention is defined in the claims.

[0009] By using the system of the invention, an Eye-Gaze system on one computer system is used to control a game running on another. The AAC device, herein referred to as the first computer, communicates with the gaming computer, herein referred to as the second computer, via a bridging or interface device, hereinafter referred to as the third equipment, which is able to translate infra-red, radio, or other wireless communication means into gaming input to the second computer, according to Figure 3.

[0010] By calling up a relevant matrix of cells for game play, the user may gaze upon (or scan and select using switches) any of the cells in the matrix displayed on the first computer and in so doing send a command to the third equipment via the chosen wireless communications method, whereupon the third equipment translates the command into a Gamepad action to be sent to the second computer, allowing game play to ensue.

[0011] It will be appreciated that the first computer is frequently mounted to the user’s wheelchair, whereas the second computer and third equipment are not, meaning that wireless communication between the first computer and the third equipment is essential to maintain the mobility of the user in their wheelchair. Conversely, communication between the second computer and the third equipment may take place along one or more cables without detriment to the mobility of the user.

Brief Description of the Drawings

[0012] In the drawings, which illustrate exemplary embodiments of the invention:

Figure 1a shows a typical game controller handset front operating area;

Figure 1b shows a typical game controller handset upper operating area;

Figure 2 shows a typical AAC device with a matrix of cells visible on its display;

Figure 3 is a perspective view of a typical configuration of the first and second computers and third equipment;

Figure 4 is a block diagram of the third equipment;

Figure 5 is a typical matrix layout for achieving game play on the first computer;

Figure 6 is a flow diagram for setting a user Profile;

Figure 7 is a perspective view of an alternative embodiment using an Xbox Adaptive Controller; and

Figure 8 depicts an alternative block diagram when using an Xbox Adaptive Controller.

Detailed Description of the Illustrated Embodiment

[0013] Figure 3 shows a first embodiment in which a first computer [301] is mounted to a wheelchair, is operated by the user and sends game play data to the third equipment [303] via infra-red. The second computer [302] receives game play information from the first computer via the third equipment [303] and is running a computer game which responds to the game play information.

[0014] Figure 5 shows a typical cell matrix to be displayed on the first computer for periods during which the user wishes to achieve game play. Although this is an example cell matrix, it is readily understood that the cross-shaped arrangement of cells on the left [501] is intended to emulate the left-hand thumbstick [101] on a gaming controller, and similarly the right-hand cross-shaped arrangement [502] emulates the right-hand thumbstick [102]. The smaller cross shape in the centre [503] emulates the digital joystick [103], often referred to as the D-Pad, whereas the four cells marked ‘A’, ‘B’, ‘X’ and ‘Y’ [504] emulate the buttons of the corresponding names on a game controller [104]. Similarly, the button pairs [505] and [506] emulate the features on a gamepad usually referred to as Bumper and Trigger for both left [105] and right [106].

[0015] When any given cell is gazed upon for a pre-determined period of time, the first computer responds by sending, via infra-red, radio or other wireless communications technique, a code which uniquely identifies both the gaming controller movement that the user wishes to make and for what duration. For example, considering the upward portion of the left-hand thumbstick emulator, there are three cells: the one closest to the centre gives slow movement upwards for a pre-determined period likely to be of the order of a few seconds; the next cell gives somewhat faster movement for a similar period of time; and the outer cell gives what is often described as ‘sticky’ movement, i.e. it yields movement at a pre-determined speed until such time as the user gazes upon the centre cell, which is the command to stop. Similar descriptions apply to the left, right and down branches of the matrix arrangement [501] and to the right-thumbstick emulation cells [502].
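The following sketch illustrates one possible encoding of the matrix cells of Figure 5 into unique command codes carrying direction, speed and behaviour, in the spirit of paragraph [0015]. The cell names, code values and field layout are illustrative assumptions only, not the protocol actually used by the first computer.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Speed(Enum):
    SLOW = 1
    FAST = 2
    STICKY = 3   # continues until the centre 'stop' cell is gazed upon

@dataclass(frozen=True)
class CellAction:
    code: int         # unique byte transmitted to the third equipment
    stick: str        # "left" or "right"
    direction: str    # "up", "down", "left", "right" or "stop"
    speed: Optional[Speed] = None

# Hypothetical cell identifiers for the upward branch of matrix [501].
CELLS = {
    "left_up_slow":   CellAction(0x10, "left", "up", Speed.SLOW),
    "left_up_fast":   CellAction(0x11, "left", "up", Speed.FAST),
    "left_up_sticky": CellAction(0x12, "left", "up", Speed.STICKY),
    "left_stop":      CellAction(0x1F, "left", "stop"),
    # the remaining branches and the right thumbstick [502] follow the same pattern
}

def code_for(cell_id: str) -> int:
    """Return the unique code the first computer would transmit for a cell."""
    return CELLS[cell_id].code
```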

[0016] Although not shown in this exemplar, it would be equally possible to include in the thumbstick emulation cells [501] and [502] the facility to move along diagonals or indeed at any chosen angle, the main limitation being the resultant complexity of the matrix, especially where the user may have cognitive difficulties.

[0017] Indeed, it is possible to create matrices which are adapted to differing levels of user cognition, from the very simple provision of a small number of cells to a complete set for users who are more cognitively able.

[0018] The various controls in Figure 5 may be broadly categorised as analogue or digital, with thumbsticks being generally regarded as analogue, whereas buttons and the D-Pad are usually digital. The left and right Trigger features on a standard Gamepad may be treated as being either analogue or digital; that is to say in the former case the feature is sensitive to the degree of depression, whereas in the latter case it is regarded as either fully pressed or fully released. Depending on which is required for a particular gaming situation, the left and right Trigger cells in the game-play matrix may need to be similar to one branch of the thumbstick cells, encompassing both slow and fast movement or, as shown in Figure 5, may be a simple cell to achieve a digital output.
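A brief sketch of the analogue/digital distinction drawn in paragraph [0018], assuming a hypothetical command representation: in analogue mode a Trigger cell carries a proportional depression value, whereas in digital mode it is simply fully pressed or released. The cell names and values are assumptions.

```python
def trigger_value(cell_id: str, analogue_mode: bool) -> float:
    """Translate a Trigger cell into an output value for the emulated gamepad.

    In analogue mode the chosen cell determines the degree of depression;
    in digital mode any trigger cell yields a fully-pressed (1.0) output.
    (Hypothetical cell names; the real matrix may differ.)
    """
    if not analogue_mode:
        return 1.0  # treated as fully pressed
    depth = {"trigger_light": 0.3, "trigger_half": 0.6, "trigger_full": 1.0}
    return depth.get(cell_id, 0.0)
```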

[0019] Some games operable on the second computer require that one or more of the digital buttons be operable in particular ways, for example ‘double-clicked’ or pressed and held for a particular period, in order to achieve favourable game interaction. It should be understood that the provision of additional cells within a matrix to achieve these extended features is entirely possible, each sending a unique code to the third equipment to enable it to identify precisely the chosen action.
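As a sketch of the extended button actions mentioned in paragraph [0019], the third equipment could expand a single received code into a timed sequence of press and release events on the emulated controller. The action names, timings and the `press`/`release` callables below are illustrative assumptions, not the actual firmware behaviour.

```python
import time

def play_button_action(press, release, action: str, button: str) -> None:
    """Expand an extended action code into press/release events on the emulated pad.

    `press` and `release` are callables supplied by the gamepad-emulation layer
    (placeholders here); timings are illustrative only.
    """
    if action == "click":
        press(button); time.sleep(0.05); release(button)
    elif action == "double_click":
        for _ in range(2):
            press(button); time.sleep(0.05); release(button); time.sleep(0.08)
    elif action == "hold":
        press(button); time.sleep(1.0); release(button)  # hold period is illustrative
```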

[0020] Referring now to Figure 4, the third equipment is able to receive infra-red data from the first computer by means of the infra-red receiver [401], although it should be appreciated that this could equally be replaced by a radio receiver or other wireless communications method, in order that the first computer and third equipment are co-operatively able to communicate with each other.

[0021] During game play, once the first computer has sent the corresponding infra-red data to carry out a particular movement according to the user’s Eye-Gaze instruction, the infra-red code is received by the third equipment, which contains a microprocessor [404] adapted to allow it to interpret the infra-red code and translate it into a gaming command, the latter being communicated to the second computer system, usually via one or more wired connections. A secondary microprocessor [405] may be required to achieve the onward communication to the second computer in some cases. The third equipment thus electrically emulates the function of the game controller, although it has no mechanical human input devices of its own, instead relying on the received infra-red codes to instruct the gaming commands.
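The core translation performed by the third equipment, as described in paragraph [0021], might be sketched as follows. Here `read_ir_code` and `send_gamepad_report` stand in for the infra-red receiver [401] and the onward USB link handled by the secondary microprocessor [405]; both, and the code table, are hypothetical placeholders.

```python
# Hypothetical mapping from received infra-red codes to emulated gamepad actions.
CODE_TO_ACTION = {
    0x10: ("left_stick", "up", 0.3),   # slow upward movement
    0x11: ("left_stick", "up", 0.7),   # fast upward movement
    0x40: ("button", "A", None),
    # further codes would follow the same pattern
}

def interface_loop(read_ir_code, send_gamepad_report) -> None:
    """Main loop of the third equipment: receive codes, emit gamepad reports.

    `read_ir_code` blocks until a code arrives from the first computer;
    `send_gamepad_report` forwards an emulated controller event towards the
    second computer (both are placeholders for the real hardware drivers).
    """
    while True:
        code = read_ir_code()
        action = CODE_TO_ACTION.get(code)
        if action is None:
            continue            # unknown code: ignore it
        send_gamepad_report(action)
```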

[0022] Thus the user can achieve game play in all respects simply by Eye-Gaze on the first computer. The third equipment is incidental to the user and no direct interaction takes place between the user and the third equipment during normal game play. It is there solely to translate instructions from the first computer into control of the second computer.

[0023] A series of buttons [402] and a simple display [403] are present in the third equipment to allow the degree of movement (in the case of analogue gamepad features) and the duration of the event (in all cases) to be set by the user, or by another person in support of the disabled user where the latter would find difficulty in doing so. Figure 6 shows a typical flow diagram for setting the various parameters in turn using the exemplary embodiment. Such a set of settings is hereinafter referred to as a Profile. In some cases the first computer may be capable of calling up a cell matrix of sufficient complexity to allow the settings to be modified on the first computer instead of on the third equipment, in which case buttons [402] and display [403] may be dispensed with, in favour of additional matrices on the first computer. In this case an additional level of protocol will be required in passing data between the first computer and third equipment such that the latter is able to differentiate between an incoming profile and normal gaming input.
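Paragraph [0023] describes a Profile as the set of movement and duration parameters adjustable via the buttons [402] and display [403]. A minimal sketch of such a structure follows; the field names and default values are entirely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """User-adjustable settings held by the third equipment (hypothetical fields)."""
    name: str = "default"
    slow_speed: float = 0.3        # analogue deflection for 'slow' thumbstick cells
    fast_speed: float = 0.7        # analogue deflection for 'fast' thumbstick cells
    move_duration_s: float = 2.0   # how long a non-sticky movement persists
    button_hold_s: float = 1.0     # duration of a press-and-hold action
```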

[0024] Because the profiles referred to above may need to be different from one game to another, or from one user to another if the equipment is to be used by a plurality of users, it is highly advantageous to be able to store and recall multiple profiles quickly and without recourse to time-consuming navigation of multiple menus on the display [403]. To this end, the microprocessor [404] may be advantageously adapted to allocate part of its memory [406] to store a number of profiles, which may be recalled either on the third equipment using buttons [402] and display [403], or from the first computer by provision of further cells on the game-play matrix, such recall instructions being passed from the first computer to the third equipment by further provision within the data communications protocol.
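Storing and recalling several such profiles from the memory [406], whether triggered from the buttons [402] or by a dedicated cell on the game-play matrix, could be sketched as follows, building on the hypothetical Profile class above. The slot count and method names are assumptions.

```python
class ProfileStore:
    """Holds a number of profiles in the third equipment's memory [406]."""

    def __init__(self, slots: int = 8):
        self._profiles = [Profile(name=f"profile {i}") for i in range(slots)]
        self.active = self._profiles[0]

    def save(self, slot: int, profile: Profile) -> None:
        """Store an adjusted profile in the given slot."""
        self._profiles[slot] = profile

    def recall(self, slot: int) -> Profile:
        """Recall a stored profile, e.g. in response to a 'recall' code received
        from the first computer or a button press on [402]."""
        self.active = self._profiles[slot]
        return self.active
```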

[0025] In some cases it may be required that a plurality of disabled gamers be present in a single setting, either to achieve co-operative game play where each user takes control of a sub-set of the gaming controls to play a single instance of a single game, or during multi-player gaming tournaments where multiple games are being played on multiple instances of said equipment. In such eventualities, it is vital that game play data created by one user does not disadvantageously impinge on other users. To avoid such erroneous operation, the first computer and third equipment are co-operatively adapted such that the first computer precedes game play and profile data with address information. Whilst all instances of third equipment in the vicinity will receive data from all similarly proximate first computers, by setting the single address which the first computer sends to coincide with the address to which the corresponding third equipment responds, the data can be directed from a single first computer to a single third equipment and all other instances of third equipment will disregard it. Prior to game play the user, or his/her supporter, sets the player number and in so doing sets the address. The third equipment is similarly set to the corresponding player number and correct operation will ensue at all times.
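The addressing scheme of paragraph [0025], in which each first computer prefixes its transmissions with a player number and each third equipment discards frames addressed to other players, could be sketched as follows. The one-byte address prefix and frame layout are illustrative assumptions.

```python
def accept_frame(frame: bytes, my_player_number: int):
    """Return the payload if the frame is addressed to this third equipment,
    otherwise None. Assumes a hypothetical one-byte address prefix."""
    if not frame:
        return None
    address, payload = frame[0], frame[1:]
    return payload if address == my_player_number else None

# Example: equipment set to player 2 ignores player 1's data.
print(accept_frame(bytes([1, 0x10]), my_player_number=2))  # -> None
print(accept_frame(bytes([2, 0x10]), my_player_number=2))  # -> b'\x10'
```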

[0026] In addition to game play and profile information, it may sometimes be advantageous to allow the first computer to send other forms of data to the second computer using similar techniques. In particular, keyboard data may occasionally be required, for example to allow the user to save their ‘high score’ by name. An extension to the data protocol between the first computer and third equipment, together with modifications to the microprocessors [404] and [405] and in most cases a different matrix of cells on the first computer, will allow for such functionality. This is particularly the case where the passage of data between the third equipment and second computer is by means of USB, since it may be achieved along a single cable.

[0027] Furthermore, the data need not be restricted to keyboard data and may usefully include mouse movement and button data in some circumstances. In certain cases, and for certain users, the keyboard and mouse data may be the only facilities that are in fact required, and in such circumstances the gaming feature may be dispensed with entirely.
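Extending the protocol beyond game-play codes to carry keyboard and mouse data, as contemplated in paragraphs [0026] and [0027], could be expressed by tagging each frame with a message type so the third equipment can route it appropriately. The tag values and dispatch approach below are purely illustrative assumptions.

```python
from enum import Enum

class MessageType(Enum):
    GAMEPAD = 0x01    # game-play codes as before
    PROFILE = 0x02    # profile selection / adjustment
    KEYBOARD = 0x03   # e.g. typing a name to save a 'high score'
    MOUSE = 0x04      # pointer movement and button data

def dispatch(frame: bytes, handlers: dict) -> None:
    """Route a received frame (after address filtering) to the handler
    registered for its message type."""
    msg_type = MessageType(frame[0])
    handlers[msg_type](frame[1:])
```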

[0028] In a second exemplary embodiment as shown in Figure 7, the second computer may be a dedicated gaming console which, by way of example, could be an Xbox gaming console [701]. Such devices are characterised in having USB sockets which are only able to accept proprietary game controllers, making them incompatible with the third equipment. Nevertheless, the third equipment may be made operable with the Xbox gaming console by the further provision of an Xbox Adaptive Controller [702], which communicates with the Xbox gaming console wirelessly and comprises a plurality of sockets to allow switches to be attached to achieve game control by a disabled user, but which also provides two USB sockets [702, 703] for the attachment of certain human input devices, each of which implements a sub-set of the full game controller. In particular, USB socket [702] implements the left thumbstick and some buttons, whereas USB socket [703] implements the right thumbstick and some of the remaining buttons.

[0029] The corollary of this modus operandi is that in order to achieve full game controller functionality via an Xbox Adaptive Controller, the third equipment must be fitted with two USB cables, each being plugged into its respective socket on the Xbox Adaptive Controller according to Figure 7.

[0030] In this eventuality, the arrangement of microprocessors in the third equipment may need to differ according to Figure 8. In particular, secondary microprocessor [405] may in fact need to be sub-divided into two separate microprocessors, [405a] and [405b], each receiving all gaming commands from microprocessor [404] but only responding to the ones which are appropriate to the USB socket to which it is connected.
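Sub-dividing the output between the two USB connections of the Xbox Adaptive Controller, as described in paragraph [0030], amounts to routing each gaming command to whichever sub-controller implements the feature concerned. The feature split, command format and sender callables below are simplified assumptions rather than the actual firmware design.

```python
# Hypothetical split of controller features between the two USB sockets of the
# Adaptive Controller: one socket carries the left thumbstick and some buttons,
# the other the right thumbstick and the remaining buttons.
LEFT_FEATURES = {"left_stick", "dpad", "A", "B"}
RIGHT_FEATURES = {"right_stick", "X", "Y"}

def route_command(command, send_left, send_right) -> None:
    """Forward a command from microprocessor [404] to the appropriate secondary
    microprocessor ([405a] or [405b]); commands it does not handle are ignored.

    `command` is assumed to be a tuple whose first element names the feature;
    `send_left` and `send_right` are placeholders for the two USB outputs.
    """
    feature = command[0]
    if feature in LEFT_FEATURES:
        send_left(command)
    elif feature in RIGHT_FEATURES:
        send_right(command)
```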