

Title:
A METHOD FOR CAPTURING USER INPUT FROM A TOUCH SCREEN
Document Type and Number:
WIPO Patent Application WO/2017/009195
Kind Code:
A1
Abstract:
A touch screen device is configured to provide outputs responsive to inputs provided by a user interacting with the touch screen. The outputs are processed by processing circuitry to determine touch locations of consecutive taps. If the latter tap is within a threshold area of the former tap, the action performed in response to the former tap is repeated. If the latter tap is not within a threshold area of the former tap, the action performed in response to the former tap is different to the action performed in response to the subsequent tap.

Inventors:
CLAROS DE LUNA FERNANDO (ES)
SWARTZ PATRIK (ES)
Application Number:
PCT/EP2016/066204
Publication Date:
January 19, 2017
Filing Date:
July 07, 2016
Assignee:
KING COM LTD (MT)
International Classes:
A63F13/42; A63F13/2145; A63F13/537
Foreign References:
US20140038717A12014-02-06
US20140007019A12014-01-02
EP2383027A22011-11-02
US20140221088A12014-08-07
Other References:
None
Attorney, Agent or Firm:
SYTLE, Kelda Camilla Karen (GB)
Claims:
CLAIMS

1. A device comprising:

a touch screen configured to display an image, said touch screen comprising circuitry configured to provide outputs dependent on inputs provided by a user via said touch screen;

at least one processor configured, in dependence on said outputs to determine locations of said inputs and when said inputs comprise a first tap input followed by a second tap input, determining if said second tap input is at a location which is less than a first distance from a location of said first tap input,

wherein if it is determined that said second tap input is at a location which is less than said first distance, said at least one processor is configured to cause an action responsive to the first tap input to be carried out, and

if it is determined that said second tap input is at a location which is greater than said first distance, said at least one processor is configured to cause a different action to be carried out to that carried out responsive to the first tap input.

2. A device as claimed in claim 1, wherein said at least one processor is configured, when said inputs further comprise a third tap input following said second tap input, determining if said third tap input is at a location which is less than a second distance from a location of said second tap input,

wherein if it is determined that said third tap input is at a location which is less than said second distance, said at least one processor is configured to cause an action responsive to the second tap input to be carried out, and

if it is determined that said third tap input is at a location which is greater than said second distance, said at least one processor is configured to cause a different action to be carried out to that carried out responsive to the second tap input.

3. A device as claimed in claim 2, wherein said first and second distances are one of the same and different.

4. A device as claimed in any preceding claim, wherein said at least one processor is configured, to determine if said second tap input is at a location which is less than the first distance from the location of said first tap input, said first distance being dependent on a direction of the second tap with respect to the first tap.

5. A device as claimed in any preceding claim, wherein the at least one processor is configured, responsive to a tap input to define a threshold area, said device further comprising a memory configured to store information defining said threshold area and responsive to a subsequent tap input, said at least one processor is configured to redefine said threshold area, said memory being configured to store information on said redefined threshold area.

6. A device as claimed in claim 5, wherein each threshold area is substantially centred on said respective tap input.

7. A device as claimed in claim 5, wherein the at least one processor is configured to determine if said second tap input is at a location which is less than the first distance from the location of said first tap input by determining if said second tap input is at a location within the threshold area defined with respect to the first tap input.

8. A device as claimed in any preceding claim, wherein the at least one processor is configured to determine if said second tap input is at a location which is less than the first distance from the location of said first tap input by determining the distance between the location of the first tap input and the location of the second tap input.

9. A device as claimed in claim 8, wherein said at least one processor is configured to compare said determined distance to said first distance to determine if said second tap input is at a location which is less than the first distance from the location of the first tap input.

10. A device as claimed in any preceding claim, wherein the at least one processor is configured to determine time information between said first and second tap inputs, said first distance being dependent on said time information.

11. A device as claimed in any preceding claim, wherein said device is configured to support a computer implemented game, said touch screen being configured to display a plurality of game objects, said at least one processor configured to cause said action to be performed by at least one game object.

12. A device as claimed in claim 11, wherein said action is defined by a first component and a second component, and said different action has at least one different first and second component.

13. A device as claimed in claim 12, wherein said first component of said action and said different action defines an action and said second component comprises a direction, said direction being different in said action and said different action.

14. A device as claimed in any preceding claim, wherein said at least one processor is configured, in dependence on said outputs to determine locations of said inputs by determining for each input a contact point with respect to said touch screen.

15. A computer implemented method comprising:

displaying an image on a touch screen;

receiving outputs from touch screen circuitry responsive to inputs provided by a user via said touch screen;

processing said outputs to determine locations of said inputs and when said inputs comprise a first tap input followed by a second tap input, determining if said second tap input is at a location which is less than a first distance from a location of said first tap input,

wherein if it is determined that said second tap input is at a location which is less than said first distance, causing an action responsive to the first tap input to be carried out, and

if it is determined that said second tap input is at a location which is greater than said first distance, causing a different action to be carried out to that carried out responsive to the first tap input.

16. A computer program product for providing a computer implemented game in a touch screen device, said computer program product comprising computer executable code which when run is configured to:

cause an image to be displayed on a touch screen; process outputs from touch screen circuitry responsive to inputs provided by a user via said touch screen;

determine locations of said inputs from said outputs and when said inputs comprise a first tap input followed by a second tap input, determine if said second tap input is at a location which is less than a first distance from a location of said first tap input,

wherein if it is determined that said second tap input is at a location which is less than said first distance, cause an action responsive to the first tap input to be carried out, and

if it is determined that said second tap input is at a location which is greater than said first distance, cause a different action to be carried out to that carried out responsive to the first tap input.

17. A computer program comprising a plurality of computer executable instructions which when run on at least one processor are configured to cause the method of claim 15 to be performed.

Description:
A METHOD FOR CAPTURING USER INPUT FROM A TOUCH SCREEN

FIELD

Some embodiments relate to the use of a touch screen device to capture user input, for example in the context of a computer implemented game.

Some embodiments relate to controlling an entity displayed on a user interface, particularly but not exclusively to a user interface of a touch screen device, for example in the context of a computer implemented game.

BACKGROUND

There exist many types of computer device where the display is controlled by an input. The input may be a cursor or pointer that is controlled by a human interface device such as a mouse, joystick, keyboard etc. Increasingly, the display may comprise a touchscreen which can be controlled by a user's touch. That is, activation of functions or objects is responsive to user input made by way of the user touching the screen.

Touch screens may be provided on smaller devices such as smart phones. Interaction with objects displayed on such touch screens can be difficult in that the size of a user's finger is relatively large with respect to the area of the screen where for example an object is displayed. This may lead to a lack of precision when for example playing a computer implemented game on a smart phone.

A computer implemented game is available in which an entity displayed on a user interface of a computer device is a character whose movement is controlled by a user input wherein a user engages with the interface (e.g. a finger on a touch screen). Movement by the user's input causes corresponding movement of the entity. For example the entity may be a character which is pursuing and attacking, or being pursued.

In such a game, it may be desirable to provide intuitive feedback to a user such that they may control the character with a minimum of 'turning' or 'false starts' in the game.

This patent specification describes not only various ideas and functions, but also their creative expression. A portion of the disclosure of this patent document therefore contains material to which a claim for copyright is made and notice is hereby given: Copyright King.com Limited 2015 (pursuant to 17 U.S.C. 401). A claim to copyright protection is made to all screen shots, icons, look and feel and all other protectable expression associated with the games illustrated and described in this patent specification.

The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but reserves all other copyright rights whatsoever. No express or implied license under any copyright whatsoever is therefore granted.

SUMMARY

In some embodiments, there is provided a device comprising: a touch screen configured to display an image, said touch screen comprising circuitry configured to provide outputs dependent on inputs provided by a user via said touch screen; at least one processor configured, in dependence on said outputs to determine locations of said inputs and when said inputs comprise a first tap input followed by a second tap input, determining if said second tap input is at a location which is less than a first distance from a location of said first tap input, wherein if it is determined that said second tap input is at a location which is less than said first distance, said at least one processor is configured to cause an action responsive to the first tap input to be carried out, and if it is determined that said second tap input is at a location which is greater than said first distance, said at least one processor is configured to cause a different action to be carried out to that carried out responsive to the first tap input.

The determining if the second tap location is less than the first distance from the first tap location may be carried out in any suitable way. For example, the determination may determine the actual distance between the first and second tap locations. Alternatively, a region may be defined around the first tap location, defining a threshold area and a determination is made as to whether the second tap location is within that threshold area and hence less than the first distance from the first tap location. In other embodiments, any other suitable way of directly or indirectly determining if the second tap location is less than the first distance from the first location may alternatively or additionally be used. This may also be the case with the second and third taps discussed below. In some embodiments, one of the first and different actions may be to provide no action.
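By way of illustration only, the following TypeScript sketch shows the two determination strategies described above: comparing the actual distance between the tap locations against the first distance, and testing membership of a threshold area defined around the first tap. The sketch is not part of the original disclosure; all names and the circular shape of the area are assumptions.

```typescript
// Hypothetical sketch of the two determination strategies described above.
interface Point { x: number; y: number; }

// Strategy 1: compute the actual distance between the two tap locations
// and compare it against the first distance.
function withinByDistance(first: Point, second: Point, firstDistance: number): boolean {
  const dx = second.x - first.x;
  const dy = second.y - first.y;
  return Math.hypot(dx, dy) < firstDistance;
}

// Strategy 2: define a (here circular) threshold area around the first tap
// and test whether the second tap falls inside it.
interface ThresholdArea { centre: Point; radius: number; }

function defineThresholdArea(tap: Point, radius: number): ThresholdArea {
  return { centre: tap, radius };
}

function withinThresholdArea(area: ThresholdArea, tap: Point): boolean {
  return withinByDistance(area.centre, tap, area.radius);
}
```

With a circular threshold area the two strategies are equivalent; they differ once the area is non-circular, as discussed below.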

The at least one processor may be configured, when said inputs further comprise a third tap input following said second tap input, determining if said third tap input is at a location which is less than a second distance from a location of said second tap input, wherein if it is determined that said third tap input is at a location which is less than said second distance, said at least one processor is configured to cause an action responsive to the second tap input to be carried out, and if it is determined that said third tap input is at a location which is greater than said second distance, said at least one processor is configured to cause a different action to be carried out to that carried out responsive to the second tap input.

The first and second distances may be the same or different. For example, the distances may be set to be a constant size. In other embodiments, the distance may be dependent on a direction component and/or the frequency of the taps and/or duration of the taps.

The at least one processor may be configured, to determine if said second tap input is at a location which is less than the first distance from the location of said first tap input, said first distance being dependent on a direction of the second tap with respect to the first tap. In some embodiments, the first distance may vary with direction. This may be if a de facto threshold area around the location is not circular. The threshold area could be regular or irregular in shape. The at least one processor may be configured, responsive to a tap input to define a threshold area, said device further comprising a memory configured to store information defining said threshold area and responsive to a subsequent tap input, said at least one processor is configured to redefine said threshold area, said memory being configured to store information on said redefined threshold area. In this embodiment, the first distance may be defined in terms of the threshold area.

Each threshold area may be substantially centred on said respective tap input.

The at least one processor may be configured to determine if said second tap input is at a location which is less than the first distance from the location of said first tap input by determining if said second tap input is at a location within the threshold area defined with respect to the first tap input.

The at least one processor may be configured to determine if said second tap input is at a location which is less than the first distance from the location of said first tap input by determining the distance between the location of the first tap input and the location of the second tap input. For example, this may be done using coordinates associated with the first tap input and coordinates associated with the second tap input.

At least one processor may be configured to compare said determined distance to said first distance to determine if said second tap input is at a location which is less than the first distance from the location of the first tap input.

The at least one processor may be configured to determine time information between said first and second tap inputs, said first distance being dependent on said time information.

The device may be configured to support a computer implemented game, said touch screen being configured to display a plurality of game objects, said at least one processor configured to cause said action to be performed by at least one game object.

The action may be defined by a first component and a second component, and said different action may have at least one of the first and second components different.

The first component of said action and said different action defines an action and said second component comprises a direction, said direction being different in said action and said different action. The at least one processor may be configured, in dependence on said outputs to determine locations of said inputs by determining for each input a contact point with respect to said touch screen.

According to an aspect, a touch screen device is configured to provide outputs responsive to inputs provided by a user interacting with the touch screen. The outputs may be processed by processing circuitry to determine touch locations of consecutive taps. If the latter tap is within a threshold area of the former tap, the action performed in response to the former tap may be repeated. If the latter tap is not within a threshold area of the former tap, the action performed in response to the former tap may be different to the action performed in response to the subsequent tap.

According to another aspect, there is provided a computer implemented method comprising: displaying an image on a touch screen; receiving outputs from touch screen circuitry responsive to inputs provided by a user via said touch screen; processing said outputs to determine locations of said inputs and when said inputs comprise a first tap input followed by a second tap input, determining if said second tap input is at a location which is less than a first distance from a location of said first tap input, wherein if it is determined that said second tap input is at a location which is less than said first distance, causing an action responsive to the first tap input to be carried out, and if it is determined that said second tap input is at a location which is greater than said first distance, causing a different action to be carried out to that carried out responsive to the first tap input.

The method may comprise, when said inputs further comprise a third tap input following said second tap input, determining if said third tap input is at a location which is less than a second distance from a location of said second tap input, wherein if it is determined that said third tap input is at a location which is less than said second distance, the method comprises causing an action carried out responsive to the second tap input to be carried out, and if it is determined that said third tap input is at a location which is greater than said second distance, the method comprises causing a different action to be carried out to that carried out responsive to the second tap input. The first and second distances may be the same or different.

The method may comprise determining if said second tap input is at a location which is less than the first distance from the location of said first tap input, said first distance being dependent on a direction of the second tap with respect to the first tap.

The method may comprise, responsive to a tap input to define a threshold area, storing information defining said threshold area and responsive to a subsequent tap input, redefining said threshold area and storing information on said redefined threshold area.

Each threshold area may be substantially centred on said respective tap input. The method may comprise determining if said second tap input is at a location which is less than the first distance from the location of said first tap input by determining if said second tap input is at a location within the threshold area defined with respect to the first tap input. The method may comprise determining if said second tap input is at a location which is less than the first distance from the location of said first tap input by determining the distance between the location of the first tap input and the location of the second tap input. The method may comprise comparing said determined distance to said first distance to determine if said second tap input is at a location which is less than the first distance from the location of the first tap input.

The method may comprise determining time information between said first and second tap inputs, said first distance being dependent on said time information.

The method may comprise displaying a plurality of game objects of a computer implemented game and causing said action to be performed by at least one game object. The action may be defined by a first component and a second component, and said different action has at least one different first and second component. The first component of said action and said different action may define an action and said second component may comprise a direction, said direction being different in said action and said different action.

The method may comprise, in dependence on said outputs determining locations of said inputs by determining for each input a contact point with respect to said touch screen.

A computer program comprising program code means adapted to perform the method(s) may also be provided. The computer program may be stored and/or otherwise embodied by means of a carrier medium.

According to one aspect, there is provided a computer program product for providing a computer implemented game in a touch screen device, said computer program product comprising computer executable code which when run is configured to: cause an image to be displayed on a touch screen; process outputs from touch screen circuitry responsive to inputs provided by a user via said touch screen; determine locations of said inputs from said outputs and when said inputs comprise a first tap input followed by a second tap input, determine if said second tap input is at a location which is less than a first distance from a location of said first tap input, wherein if it is determined that said second tap input is at a location which is less than said first distance, cause an action responsive to the first tap input to be carried out, and if it is determined that said second tap input is at a location which is greater than said first distance, cause a different action to be carried out to that carried out responsive to the first tap input.

According to an aspect there is provided a method of controlling an entity displayed on a user interface of a computer device, the method implemented by computer readable code executed by a processor of the computer device, the method comprising: detecting a user input at a first location on the user interface; detecting that the user input continues along a user input trace to a second location; determining a distance along a direct path between the first and second locations; calculating a speed of movement of the entity based on the determined distance; and generating for display a number of graphical indicators to be displayed on the user interface at locations spaced from one another between the first and second locations, wherein the number of displayed indicators represent the speed of movement of the entity.

In one embodiment, the graphical indicators may be displayed along the direct path. The direct path may be a linear trace, wherein the trace itself is not displayed on the user interface, but the graphical indicators are displayed aligned along the direct path.

The graphical indicators may be directional icons which indicate the direction of movement of the entity. These directional icons may be aligned along the direct path.

Alternatively, the graphical indicators may be spaced from one another between the first and second locations in a manner such that their spacing is in a direction following the direction of the direct path, but wherein the graphical indicators need not necessarily be aligned along a linear trace according to the direct path.

The graphical indicators may be placed on either side of a linear trace corresponding to the direct path, or for example weaving generally in the direction of the direct path between the first and second locations.

In one embodiment, each graphical indicator may be an arrow with an arrowhead pointing in the direction of movement.

When the displayed indicators represent the direction of movement of the entity on the user interface, this may allow a user to easily understand the effect of his trace on the user interface.

An initial part of the user input trace may be a straight line from the first to the second location, in which case the direct path would be determined to match that straight line. The number of graphical indicators which are displayed may correspond to the distance determined between the first and second locations along the direct path, and may represent the speed of movement of the entity.

The method may comprise detecting a change of direction of the user input trace and adjusting a direction of the direct path by pivoting about the first location. Thus, where a user's finger traces, for example, an L shape, the path of movement of the entity on the user interface may not follow an L shape, but instead the direction of the direct path between the first and second locations may change so that there is still a straight line.
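A minimal sketch of this pivoting behaviour, assuming hypothetical names: whatever shape the finger traces, the direct path is recomputed as the straight line from the fixed first location to the latest contact point, so a change of direction simply rotates that line about the first location.

```typescript
// Illustrative sketch only: the direct path always runs from the fixed first
// location to the latest contact point, regardless of the trace shape.
interface Loc { x: number; y: number; }

function directPath(firstLocation: Loc, latestContact: Loc) {
  const dx = latestContact.x - firstLocation.x;
  const dy = latestContact.y - firstLocation.y;
  return {
    angleRadians: Math.atan2(dy, dx),  // direction of movement of the entity
    length: Math.hypot(dx, dy),        // used to derive the entity's speed
  };
}
```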

The number of displayed indicators along this line may represent the speed of movement of the entity and the line indicates the direction of movement of the entity.

The method may comprise the step of controlling the entity to move on the user interface at the calculated speed.

The method may comprise the step of controlling the entity to move on the user interface in the direction indicated by the direct path.

The method may comprise the step of controlling the entity to move on the user interface in the direction indicated by the graphical indicators. In some embodiments, the method may comprise causing a commencement marker to be displayed at the first location.

The commencement marker may indicate a direction of the entity. The commencement marker may initially indicate the direction of the user input trace. When the direction of the direct path is adjusted, the commencement marker may be rotated such that it then indicates the direction of the adjusted direct path. The commencement marker may for example be a circle with a small arrowhead set into its circumference.

A first one of the number of graphical indicators may be displayed after a minimum distance from the first location has been attained from the user input.

A last one of the number of graphical indicators may be displayed after a maximum distance from the first location has been attained by the continued user input. This may have the effect that further continued user input does not result in further graphical indicators being displayed after the maximum distance has been attained.

The first location may be the displayed location of the entity.

The first location may be a location separated from the displayed location of the entity.

The method may comprise the step of detecting the continuous user input to increase the distance between the first and second locations and generating an increased number of graphical indicators responsive to the increased distance. The method may comprise detecting the continuous user input to reduce the distance between the first and second locations and removing from display at least one of the number of graphical indicators in response to detecting the reduced distance.

A graphical indicator may be removed from the display after retracting it towards an adjacent one of the graphical indicators. It may be made smaller and smaller in a direction towards the adjacent graphical indicator, and then fade away just before or as it touches the adjacent graphical indicator.

Alternative methods are possible for removing the graphical indicators from the display as a user traces his finger back to reduce the speed (by reducing the distance between the first and second locations). The detecting that the user input continues may comprise detecting that the user input has continued outside an area around the first location, to avoid detecting unintended user input as continued user input.

According to another aspect there is provided a computer device comprising: a user interface; a processor; and a memory, wherein the memory holds computer readable code which when executed by the processor implements a method of controlling an entity displayed on the user interface, the method comprising: detecting a user input at a first location on the user interface; detecting that the user input continues along a user input trace to a second location; determining a distance along a direct path between the first and second locations; calculating a speed of movement of the entity based on the determined distance; and generating for display a number of graphical indicators to be displayed on the user interface at locations spaced from one another between the first and second locations, wherein the number of displayed indicators represent the speed of movement of the entity.

The graphical indicators may be displayed along the direct path.

The graphical indicators may be directional icons which indicate a direction of movement of the entity.

The computer readable code when executed by the processor may implement the step of controlling the entity to move on the user interface at the calculated speed. The computer readable code when executed by the processor may implement the step of controlling the entity to move on the user interface in a direction indicated by the direct path.

The computer readable code when executed by the processor may cause a commencement marker to be displayed at the first location.

The commencement marker may indicate a direction of the entity. The computer readable code when executed by the processor may cause displaying of a first one of the number of graphical indicators after a minimum distance from the first location has been attained from the user input. The computer readable code when executed by the processor may cause displaying a last one of the number of graphical indicators after a maximum distance from the first location has been attained by the continued user input.

The first location may be a location separated from the displayed location of the entity.

The graphical indicators may be directional icons which indicate a direction of movement of the entity.

Each graphical indicator may be an arrow with an arrowhead pointing in the direction of movement.

The computer readable code when executed by the processor may cause detecting a change of direction of the user input trace and adjusting a direction of the direct path by pivoting about the first location.

The computer readable code when executed by the processor may cause detecting the continuous user input to increase the distance between the first and second locations and generating an increased number of graphical indicators responsive to the increased distance.

The computer readable code when executed by the processor may cause detecting the continuous user input to reduce the distance between the first and second locations and removing from display at least one of the number of graphical indicators in response to detecting the reduced distance.

At least one of the graphical indicators may be retracted towards an adjacent one of the graphical indicators, before being removed from the display. The computer readable code when executed by the processor may cause detecting that the user input has continued outside an area around the first location, to avoid detecting unintended user input as continued user input.

According to another aspect there is provided a non-transitory computer readable medium on which is stored a computer program comprising computer readable code, the code when executed by the processor implementing a method of controlling an entity displayed on a user interface of a computer device, the method comprising: detecting a user input at a first location on the user interface; detecting that the user input continues along a user input trace to a second location; determining a distance along a direct path between the first and second locations; calculating a speed of movement of the entity based on the determined distance; and generating for display a number of graphical indicators to be displayed on the user interface at locations spaced from one another between the first and second locations, wherein the number of displayed indicators represent the speed of movement of the entity.

A computer program comprising program code means adapted to perform the method(s) may also be provided. The computer program may be stored and/or otherwise embodied by means of a carrier medium.

In the above, many different embodiments have been described. It should be appreciated that further embodiments may be provided by the combination of any two or more of the embodiments described above. Various other aspects and further embodiments are also described in the following detailed description and in the attached claims.

BRIEF DESCRIPTION OF DRAWINGS

To understand some embodiments, reference will now be made by way of example only to the accompanying drawings, in which:

Figure 1 shows an example device in which some embodiments may be provided;

Figure 2 shows a cross section of part of a touch screen display;

Figures 3a to 3e show schematically a first series of views of a touchscreen;

Figures 4a to 4e show schematically a second series of views of a touchscreen;

Figure 5 shows an example flowchart according to an embodiment;

Figures 6a to 6e show a set of views of a touchscreen display;

Figures 7a to 7c show the expansion and retraction of graphical indicators; and

Figure 8 shows an example flowchart according to an embodiment.

DETAILED DESCRIPTION

A schematic view of a user or computing device 100 according to an embodiment is shown in Figure 1. All of the blocks shown are implemented by suitable circuitry. The blocks may be implemented in hardware and/or software. The user device may have a control part 110. The control part 110 has one or more processors 115 and one or more memories 120. The control part 110 is also shown as having a graphics controller 125 and a sound controller 130. It should be appreciated that one or other or both of the graphics controller 125 and sound controller 130 may be provided by the one or more processors 115.

The graphics controller 125 is configured to provide a video output 135. The sound controller 130 is configured to provide an audio output 140. The controller 110 has an interface 145 allowing the device to be able to communicate with a network such as the Internet or other communication infrastructure.

The video output 135 is provided to a display 155. The audio output 140 is provided to an audio device 160 such as a speaker and/or earphone(s).

The device 100 has an input device 165. The input device 165 is in the form of a touch sensitive device such as a touch screen. It should be appreciated that the display 155 may in some embodiments also provide the input device 165 by way of an integrated touch screen for example.

The blocks of the controller 110 are configured to communicate with each other via an interconnect such as a bus or any other suitable interconnect and/or by point to point communication. It should be appreciated that in some embodiments, the controller 110 may be implemented by one or more integrated circuits, at least in part.

The user device 100 is shown by way of example only. In alternative embodiments, one or more of the parts may be omitted. Alternatively or additionally, some embodiments may comprise one or more other parts. Alternatively or additionally, one or more parts may be combined.

Reference is made to Figure 2 which schematically shows a touch screen. The touch screen may incorporate any suitable touch screen technology. One example of a touch screen technology is the so-called resistive touch screen technology.

The front layer or surface 2 of the touch screen is typically made of a scratch-resistant, flexible plastic or similar. A thin film or coating 4 of conductive material is provided on the underside of the front surface. The film of conductive material may be of any suitable material and may for example be Indium Tin Oxide. A gap 6 is provided. This gap may be created by suitable spacers 12. The gap may be an air gap. A second layer of material is provided. That layer may be of glass or hard plastic. The second layer 10 is also provided with a thin film or coating 8 of conductive material on the side of the second layer facing the spacing. The coating may be of any suitable material and may also be Indium Tin Oxide. Thus, the two layers 2 and 10 are kept apart by the spacers 12 which may be arranged at regular intervals. The thin conductive films or coatings are arranged to provide electrical resistance. The arrangement is such that the electrical charge runs in one direction on the one conductive coating or film and in a perpendicular direction on the other conductive coating or film.

With a resistive touch screen, when the screen is touched, the plastic deforms so that the two conductive films meet. By measuring the resistance of the two conductive films or coatings, the touch position may be accurately determined.

It should be appreciated that this is one example of a touch screen. Another technology often used for touch screens is capacitive technology. The structure of the touchscreen is similar to that described in relation to Figure 2. However, the first layer may typically be glass, and thus not flexible. The conductive coatings may be a uniform layer, a grid or parallel stripes running at right angles to each other on the two layers. A capacitive arrangement is formed by the two conductive coatings separated by the insulating material (air). When the finger comes close to a capacitor, it changes the local electrostatic field. The touchscreen effectively is made up of a large number of tiny capacitors. The system is arranged to monitor each of these tiny capacitors to determine where the finger touches the screen. Capacitive touch screens have the advantage that it is possible to determine several discrete touch points at the same time. It should be appreciated that embodiments may be used with any suitable touch screen technology.

Embodiments may be particularly applicable for games which are to be played on devices which have a relatively small screen, such as smart phones and some smaller tablets. In such scenarios, a technical problem exists in that the user has difficulty in contacting a precise area on the small screen, as the user's finger is large relative to the screen. The contact location may control the selection of one or more actions in a game. Subsequent interactions by the user are likely to be at a different location on the screen, which may lead to the user inadvertently selecting the wrong option. If a user has to make a series of taps, the user contact with the screen may drift over time. With a small screen, this could lead to a different option being selected when the user is attempting to select the same option repeatedly.

Reference is now made to Figures 3a to 3e, which show a first embodiment. Each of the arrangements shown in Figure 3 shows a schematic view of a touchscreen display which is displaying a plurality of game objects. One of the game objects 226 is a user controlled entity. Others of the game objects 220, 222 and 224 represent target objects.

In order to move the user controlled entity 226, the user may touch the touch screen at the location occupied by the user controlled entity and, holding a finger against the touchscreen, drag the user controlled entity to the new desired location.

In some embodiments, the user controlled entity may be moved by contacting the screen at any location, and without lifting the finger from the screen, moving the finger to anywhere on the screen. The user controlled entity may move in the same direction between the point first touched on the screen and the new "drag"-position.

The move velocity of the user controlled entity may be dependent on the distance between these points. There may be a maximum velocity.
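As an illustrative sketch only (the scaling constant and cap are assumptions, not values from the disclosure), the velocity derivation might look like this in TypeScript:

```typescript
// Illustrative sketch: move speed proportional to the drag distance,
// capped at a maximum velocity. All names and constants are assumptions.
const SPEED_PER_PIXEL = 0.5;   // assumed game-specific tuning constant
const MAX_VELOCITY = 200;      // assumed cap, in game units per second

function moveVelocity(firstTouch: { x: number; y: number },
                      dragPosition: { x: number; y: number }): number {
  const distance = Math.hypot(dragPosition.x - firstTouch.x,
                              dragPosition.y - firstTouch.y);
  return Math.min(distance * SPEED_PER_PIXEL, MAX_VELOCITY);
}
```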

The user controlled entity may continue to move until the user releases their finger from the screen. In this embodiment, the user controlled entity 226 is to be controlled so as to aim at one of the target objects. The current aim area is illustrated schematically by the cone 232. The cone may be displayed on the screen or may not be displayed. Thus, the aim area of the user controlled entity is generally in the direction in which the user controlled entity is facing. The aim area may be a function of the direction of movement of that user controlled entity in some embodiments.

In the arrangement shown in Figure 3b, the user has tapped the touchscreen at location 228. This location 228 may be displayed on the screen or may not be displayed. This location may be any location on the screen and in the example shown in Figure 3b is remote from the position of the user controlled entity. However, it should be appreciated that the user may tap the touch screen at any suitable location. A tap is defined as the user making contact with the touch screen for a relatively short amount of time before lifting his finger away from the touch screen. When the user taps the touchscreen, the user controlled entity may be controlled, for example, to fire at one or more of the target objects.

As shown in Figure 3c, as soon as the user has tapped the display, a threshold area 230 is defined around the location where the user tapped the screen. The threshold area is larger than the location of the screen which is tapped by the user.

The threshold area may be displayed on the screen or may not be displayed. Data defining the threshold area may be stored in memory. As is shown in Figure 3d, the user taps the screen again at a location 228. A determination is made as to whether or not this next tap is within the defined threshold area 230. If so, the user controlled entity continues to target the same target. As shown in Figure 3e, the threshold area is redefined so as to be centred on the latest tap location. This process is continued for as long as the user continues to tap within the threshold area.

In some embodiments the detected tap location may be determined/approximated as a contact point on the touchscreen. The contact point may coincide with a specific pixel. However, in other embodiments, there may not be a one to one relationship between a respective contact point and a respective pixel. That contact point may be determined by an algorithm in dependence on the signal output from the touch screen. The location may be defined as coordinates with respect to the touch screen surface and these coordinates may be used to determine whether the second tap is within the threshold area. The second tap location may also be defined as a contact point on the touchscreen. When determining a contact point, the signal output is processed so as to define the contact point as the centre point of the tap location. Defining a centre of the tap location may make the defining of the threshold area simpler to determine. For example a circular threshold area may be easy to calculate, determine and/or to store in memory as it may be defined as a radius from such a determined point for a tap.
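The following hypothetical TypeScript sketch illustrates the idea: the raw tap (a cluster of touched positions) is reduced to a single contact point, and the circular threshold area is then stored and tested as just a centre and a radius. None of the names come from the disclosure.

```typescript
// Hypothetical sketch: reduce a raw tap (a cluster of touched positions) to a
// single contact point, then store the threshold area as centre + radius.
interface TouchSample { x: number; y: number; }

function contactPoint(samples: TouchSample[]): TouchSample {
  // Take the centroid of the touched region as the centre of the tap.
  const n = samples.length;
  const sx = samples.reduce((acc, s) => acc + s.x, 0);
  const sy = samples.reduce((acc, s) => acc + s.y, 0);
  return { x: sx / n, y: sy / n };
}

// A circular threshold area is cheap to store and to test: one point, one radius.
function isWithin(centre: TouchSample, radius: number, tap: TouchSample): boolean {
  return Math.hypot(tap.x - centre.x, tap.y - centre.y) <= radius;
}
```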

The defining of a contact point for a tap may make it easier for the processor to determine if a subsequent tap is within the threshold area.

It should be appreciated that the threshold area may be of any other suitable form or shape in other embodiments.

The threshold area may be defined about a tap area or a determined contact point. It is possible that the threshold area may be smaller than the contact area, larger than the contact area or around the same size.

Reference is made to Figures 4a to 4e, which show a different scenario. Figures 4a to 4c are as described with reference to Figures 3a to 3c. In the scenario shown in Figure 4d, the user taps the screen at a location which is outside the predefined threshold area 230. In response to this, the aim area of the user controlled entity is changed. The aim area may change in dependence on the current location of the tap as compared to the previous location of the tap.

As shown in Figure 4e, the new threshold area 230 is defined around the latest tap location of the user.

Reference is made to Figure 5 which shows a method of an embodiment.

In step S1, the user touches the screen. As mentioned, this touch may take any suitable format and may comprise the user tapping the touchscreen or dragging their finger across the touchscreen in order to move the user controlled entity. It should be appreciated that the touch may be via a user's fingers or by using a stylus or the like. In some embodiments, the user may need to physically touch the touch screen in order to cause the interaction. In other embodiments, the user's interaction may need only to be in close proximity to the touchscreen in order to cause the interaction. It should be appreciated that the term "touch" is intended to cover any or all of these scenarios which result in a user input being detected by the touchscreen.

In step S2, electrical signals may be generated by the touchscreen circuitry. This is as described for example in relation to Figure 2. The electrical signals provide information as to the location where the user has touched the touchscreen, the duration of the touch and any movement.

The electrical signals are provided in step S3 to a processing function. The processing function may comprise one or more processors and optionally may include one or more signal processing entities. The signal processing function may analyze the signals to determine if the input is a tap or if the user is controlling the movement of a user controlled entity. To determine if an input has been a tap, the processor is configured to determine the duration of the generated electronic signal. In some embodiments, the signal duration is compared to upper and lower signal duration thresholds. In some embodiments, either an upper signal duration or a lower signal duration threshold is used. The location of the tap may be taken into account. The one or more thresholds may be stored in memory. The comparison to the one or more thresholds may be performed by the processor running suitable computer executable code or may be performed by comparison circuitry. The at least one processor may determine the location of the user's input. The location of the user's touch on the touch screen is determined from the received electronic signals from the touch screen, as known in the art. It should be appreciated that the processing function may also use stored information as to the entity or entities being displayed on the display at the location associated with the touch input.
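A simplified sketch of this classification step, with assumed threshold values (the disclosure does not specify any):

```typescript
// Illustrative classification of a touch input as a tap or a drag, using
// duration thresholds as described above. All constants are assumptions.
type InputKind = 'tap' | 'drag' | 'ignored';

const MIN_TAP_MS = 20;    // assumed lower duration threshold
const MAX_TAP_MS = 250;   // assumed upper duration threshold
const DRAG_MIN_PX = 10;   // assumed movement needed to count as a drag

function classify(durationMs: number, movedPx: number): InputKind {
  if (movedPx >= DRAG_MIN_PX) return 'drag';
  if (durationMs >= MIN_TAP_MS && durationMs <= MAX_TAP_MS) return 'tap';
  return 'ignored';
}
```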

If an input is received over a period of time, and the location of the touch on the touchscreen is changing, then a determination may be made that the user controlled character is being moved. In some embodiments, this may be in conjunction with determining that the initial touch location is associated with a user controlled entity.

In step S4, if it is determined that the user controlled entity is being moved, then the image of the user controlled entity which is displayed on the display moves along a path defined by the user's input. Where a user controlled entity is determined to be moving, the path of the movement and/or the location of the user controlled entity may be stored in memory. In some embodiments, where a user controlled entity is determined to move, the location data of the user controlled entity may be updated in memory.

It should be appreciated that the method may continuously loop between the steps S1 to S3 and the resulting subsequent steps.

If it is determined in step S3 that the input provided by the user is a tap, then in step S5, it is determined if the tap is the first tap. This may be done by determining if there is currently a threshold area and/or from the tap number which has been recorded. In some embodiments, the processor is configured to update a count every time there is a tap.

If the tap is a first tap, then the next step is step S9 and a threshold area is defined around the tap location. The aim direction may be defined with respect to the movement of the user controlled entity and/or the direction in which the user controlled entity is considered to face. The method may loop back to step S1.

If it is determined that the tap is not the first tap, then the next step is step S6. In this step, it is determined whether the tap location is within the previously defined threshold area associated with the preceding tap. The threshold area may be defined as an area on the screen. The determination may be made as to whether or not the tap is within the defined area. Data on the defined area would be stored in memory. The location of the tap would be determined and the processor may determine if the tap is within the defined area.

In some embodiments, the distance between the location of the current tap and the location of the previous tap is determined. If the distance is less than a threshold distance, then it is determined that the current tap is within the threshold area. In some embodiments, this may be carried out by determining the coordinates of the current tap location and the coordinates of the previous tap location and using this information to determine the distance between the two touch locations.

If the tap is within the threshold area, the next step is step S8 where the aim direction is left unchanged. The threshold area may be redefined around the current tap location. The method may then loop back to step S1.

If it is determined that the tap is not in the threshold area, the next step is step S7. The aim direction may be changed and a new threshold area is defined. It should be appreciated that the aim direction may take into account the difference in position between the previous tap and the current tap.

In one embodiment, the rate at which the user taps the screen may be determined. For example the time between one tap and the next may be determined. The size of the threshold distance or area may be determined in dependence on the determined time between taps. The threshold distance or area may be larger if the time between taps is relatively short. The threshold distance or area may be smaller if the time between taps is relatively long.

In some embodiments, the length of a tap may alternatively or additionally influence the threshold area. A longer tap may result in a smaller threshold area in some embodiments or vice versa. In some embodiments, the time between a series of three or more successive taps may be determined to provide an average tap rate which is used to select or determine the appropriate threshold area or distance.
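For illustration, a threshold radius that adapts to the tap rate as described might be sketched as follows; all constants are assumptions, not values from the disclosure:

```typescript
// Sketch of a threshold radius that adapts to how quickly the user is tapping:
// fast tapping widens the area, slow tapping shrinks it.
const BASE_RADIUS_PX = 40;
const FAST_TAP_MS = 200;   // taps closer together than this widen the area
const SLOW_TAP_MS = 800;   // taps further apart than this shrink it

function adaptiveRadius(msBetweenTaps: number): number {
  if (msBetweenTaps < FAST_TAP_MS) return BASE_RADIUS_PX * 1.5;
  if (msBetweenTaps > SLOW_TAP_MS) return BASE_RADIUS_PX * 0.5;
  return BASE_RADIUS_PX;
}
```

The same function could equally take an average interval over three or more successive taps, as the preceding paragraph suggests.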

In the embodiments which use a threshold area, the threshold area has been defined as being substantially circular. However, in other embodiments, the threshold area may have any other suitable shape.

In embodiments, where the threshold area is not circular, account may need to be taken of distance and direction when determining if a successive tap is within a threshold area. For example, if the threshold area is square, then the possible distance between the tap locations (and still be in the threshold area) may vary depending on the direction of the subsequent tap with respect to the previous tap.

In some embodiments, a threshold area may be defined by a series of coordinates and it is determined if the successive tap location is within the area defined by the coordinates. In some embodiments, the distance and direction between a tap location and a successive tap location may be determined and, based on this information, it is determined if a successive tap is within a threshold area. This may be done in any suitable manner and may for example be done using a look up table or the like.
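Where the threshold area is defined by a series of coordinates, the membership test could be a standard point-in-polygon check, sketched here for illustration (the ray-casting approach is one common choice, not something the disclosure specifies):

```typescript
// Hypothetical sketch for a non-circular threshold area defined by a series of
// coordinates (a polygon): a standard ray-casting point-in-polygon test.
interface Pt { x: number; y: number; }

function insidePolygon(area: Pt[], tap: Pt): boolean {
  let inside = false;
  for (let i = 0, j = area.length - 1; i < area.length; j = i++) {
    const a = area[i], b = area[j];
    // Does a horizontal ray cast from the tap cross the edge a-b?
    const crosses =
      (a.y > tap.y) !== (b.y > tap.y) &&
      tap.x < ((b.x - a.x) * (tap.y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}
```

This naturally captures the direction dependence described above: for a square or star-shaped area, the maximum allowed distance from the previous tap varies with the direction of the subsequent tap.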

In some embodiments, the threshold area may be defined so as to exclude an area occupied by one or more game objects and/or to include an area occupied by one or more game objects. In some embodiments, an aim of a computer implemented game may be to repeatedly tap within the threshold area. For example, the threshold area may be displayed on the display. The shape of the threshold area may be any suitable shape. In some embodiments, the shape of the threshold area may be selected to provide a challenge to the user. For example the threshold area may have a more complex shape such as a star shape or similarly complex shape.

In the above described embodiments, the action resulting from the tapping within the threshold area has been a firing action. It should be appreciated that in other embodiments, the action (resulting from tapping within or outside the threshold area) may be any other suitable action. For example, the action may be to cause selection of an object in a computer implemented game, selection of a character in a computer implemented game, selection of a game play option in a computer implemented game, selection of a characteristic of a game object, or to cause an object to move, or any other suitable action. In some embodiments, the different action may be no action.

In some embodiments, an action may be defined by two components. The first component may be the action itself and the second component may be a control component such as direction, strength of action or the like.

In some embodiments the action component may be the same regardless of whether the subsequent tap location is within or outside the threshold area but with direction or the other control component different if the subsequent tap is outside the threshold area.

In other embodiments, different actions may be performed depending on the location of the subsequent tap.

In the embodiments to be described below, features which control a character on an interface are outlined which improve the experience of the user when controlling the character. A distance from a first touch point to a second contact point is determined when a user swipes a finger across a touch screen. An indication of the first touch point is visually rendered to give the user a reference point. Additionally a direction as given by a straight line (direct path) passing through the first and the second points is determined, sometimes for as long as the finger remains in contact with the touch screen. Dependent on the determined distance a number (e.g. 4) of graphical indicators are rendered and repositioned to indicate how far from the first touch point the finger has been swiped. The graphical indicators are rendered in-between the first touch point and the second contact point, along the determined direction. In some scenarios a first indicator is not rendered until a predetermined distance is attained from the first touch point. Each graphical indicator may be associated with a maximum distance from the first touch point at which they may be rendered. The determined distance and placement and/or the number of indicators are used to convey to the user speed and direction information which corresponds to the way in which the character is controlled to move on the screen.
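The following TypeScript sketch illustrates this indicator logic under assumed constants (minimum and maximum drag distances, four indicators); it is an interpretation of the description, not code from the disclosure:

```typescript
// Illustrative sketch: given the first touch point and the current contact
// point, compute how many graphical indicators to show and where to place
// them along the direct path. All names and constants are assumptions.
interface P { x: number; y: number; }

const MIN_DRAG_PX = 20;    // assumed minimum distance before any indicator
const MAX_DRAG_PX = 160;   // assumed distance at which all indicators show
const MAX_INDICATORS = 4;

function indicatorPositions(first: P, current: P): P[] {
  const dx = current.x - first.x;
  const dy = current.y - first.y;
  const dist = Math.hypot(dx, dy);
  if (dist < MIN_DRAG_PX) return [];       // first indicator not yet reached

  // Fraction of the way from the minimum to the maximum drag distance.
  const fraction = Math.min((dist - MIN_DRAG_PX) / (MAX_DRAG_PX - MIN_DRAG_PX), 1);
  const count = Math.max(1, Math.round(fraction * MAX_INDICATORS));

  // Space the indicators along the direct path between the two points.
  const positions: P[] = [];
  for (let i = 1; i <= count; i++) {
    const t = i / (MAX_INDICATORS + 1);
    positions.push({ x: first.x + dx * t, y: first.y + dy * t });
  }
  return positions;
}
```

The count conveys speed and the placement conveys direction, mirroring how the entity is controlled to move on the screen.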

Reference is now made to Figures 6a to 6e, which show a further embodiment. Each of the arrangements shown in Figures 6a to 6e shows a view of a touchscreen display which is displaying a plurality of game objects. One of the game objects 26 is a user controlled entity.

In order to move the user controlled entity 26, the user may touch the touch screen at the location occupied by the user controlled entity and, holding a finger against the touchscreen, drag the user controlled entity to the new desired location. In some embodiments, the user controlled entity may be moved by contacting the screen at any location (which may be separate from the location of the entity 26), and without lifting the finger from the screen, moving the finger to anywhere on the screen. The user controlled entity may move in the same direction between the point first touched on the screen (first touch point) and the new "drag" position. That is, in a direction that extends from the first touch point to the new "drag" position along a direct path.

The move velocity of the user controlled entity is dependent on the distance between these points up to a maximum velocity.
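One possible realisation is sketched below. The application only requires that the velocity depend on the distance up to a maximum, so the linear mapping and the constants here are assumptions.

```python
def move_velocity(distance, max_distance=160.0, max_velocity=5.0):
    """Velocity grows with the distance between the first touch point and
    the contact point, clamped at max_velocity once max_distance is reached."""
    return max_velocity * min(distance, max_distance) / max_distance
```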

The user controlled entity continues to move until the user releases their finger from the screen. As shown in Figure 6a, the user is able to control a character 26 on the game area shown on the display. The user places their finger anywhere on the screen (e.g. touch point 29 in Figure 6c) to 'activate' the character for motion. Note that each character is associated with a crescent shaped marker 28 underneath the character 26, pointing in the direction the character is currently facing. Note that the user may touch the screen where the character is, or at another location spaced from the character. When the user lifts their finger from the screen, the character stops. Thus the control of the character is directly responsive to the user's engagement with the screen.

The described embodiments have user friendly character control features exemplified by graphical indicators as described in the following. As soon as it is determined that a user wants to move the character 26, the character 26 turns to "face" the direction of intended movement and starts to move at the speed and in the direction determined by the distance between, and the direction indicated by, the initial touch point 29 and the current position of the finger/contact point; the first indicator 32 is shown, as shown for example in Figure 6c. At a predetermined distance the second indicator 34 becomes visible, and at a third and fourth distance from the initial contact point third and fourth indicators 36, 38 are displayed respectively: see Figure 6d for example. Each subsequent indicator is displayed in an animation which shows the indicator developing from the point where the previous indicator was displayed, or conversely retracting towards the initial point of contact, with respective increase and decrease of the distance from the initial touch point to the current point of contact.

To achieve the above features, the user drags their finger along the screen, creating a user input trace in the direction in which they wish the character to move. Note that this trace is detected by the circuitry of the touch screen but is not made visible to a player. If the user only drags their finger a short distance, a small white circle 30 with a point (small arrowhead) in the direction of the drag appears (see Figure 6b). This is a first graphical marker in the form of a directional icon: it indicates a direction of movement intended by the user. Once they have dragged their finger beyond a minimum distance from the initial touch point, the first graphical indicator in the form of an arrow 32 appears. Note that the minimum distance is along a first part of the user input trace, for as long as the user input trace is a straight line. However, if the user has changed direction in the meantime, a linear trace is calculated by the processor (as explained later) to determine a direct path between the first touch point and the new contact point, and the distance along that linear trace is used to determine the minimum distance.

As the user continues to drag their finger, the second, third and fourth arrows consecutively appear, pointing in the direction the user dragged their finger in (or of the calculated linear trace). The number of arrows which appear depends upon the distance of the drag, up to a maximum of four arrows once a maximum distance is reached: the user has only dragged their finger a small distance in Figure 6c, so one arrow 32 has appeared, but the user has dragged up to or beyond the maximum distance in Figure 6d, causing four arrows 32, 34, 36, 38 to appear. The graphical markers described above are represented by arrowheads in Figures 6a to 6e, but are not limited to this shape. For example, the graphical indicators could instead take the shape of footsteps, dots or crosses. Nor are embodiments limited to a maximum of four graphical indicators once the maximum distance of drag has been reached. If the user changes the direction of the drag, the point on the crescent and the point on the circle move to follow, and the arrows move linearly, effectively as if the user's first touch point on the screen is a pivot point for a straight line and the point at which the finger currently rests is the tip of the line.
It is possible to consider the direction and speed as an arrow (radial vector) extending outwards from the centre of a circle (the first touch point) to a point on the circle (the second contact point). At each of certain radii from the centre, the indicators are respectively rendered on the screen in a stepwise manner and, once rendered, they are extended along the radial vector in a linear manner based on the determined distance of the finger from the initial point of contact.
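A sketch of this stepwise rendering at fixed radii follows; the radii are illustrative assumptions, and the fraction returned can drive the linear extension of the newest indicator along the radial vector.

```python
THRESHOLDS = [40.0, 80.0, 120.0, 160.0]  # illustrative radii for indicators 32-38

def visible_indicators(distance):
    """Number of fully rendered indicators, plus the growth fraction
    (0.0 to 1.0) of the next indicator developing out of the last one."""
    count = sum(1 for r in THRESHOLDS if distance >= r)
    if count == 0:
        fraction = distance / THRESHOLDS[0]
    elif count < len(THRESHOLDS):
        span = THRESHOLDS[count] - THRESHOLDS[count - 1]
        fraction = (distance - THRESHOLDS[count - 1]) / span
    else:
        fraction = 0.0  # all indicators shown; nothing left to grow
    return count, fraction
```

For example, a distance of 90 units would yield two fully rendered indicators with the third one a quarter of the way grown.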

The indicators do not have to appear in a straight line arrangement in the direction of the character's travel. For example, they could appear like a pair of footprints, each foot appearing alternately, or in a pattern such as a wave, wherein the indicators are displaced at varying distances on either side of the arrow (radial vector) to give a wave pattern which 'grows/shrinks' in the direction of the arrow (radial vector).
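For instance, the wave variant might displace each indicator perpendicular to the radial vector, alternating sides; the following is a sketch with an assumed amplitude.

```python
import math

def wave_offset(index, direction, amplitude=10.0):
    """Offset for indicator `index`, perpendicular to the radial vector and
    alternating sides so that the row of indicators forms a wave."""
    side = 1.0 if index % 2 == 0 else -1.0
    perp = direction + math.pi / 2
    return side * amplitude * math.cos(perp), side * amplitude * math.sin(perp)
```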

As explained above, the aim of the user input is to control the movement of the character on the display. The direction of the arrows 32, 34, 36, 38 (or as many of these as are present) determines the direction of motion of the character 26, and the number of arrows determines the speed at which the character travels. To increase the speed, the user drags further from the first point of contact, causing more arrows to appear until the maximum number of arrows is reached. See Figure 7a, where intermediate renderings between, for example, indicators 32 and 34 are shown as 32a, 34a. The finger swipe is shown at 40, at successive distances. To reduce the speed, the user drags the finger back closer to the first point of contact, reducing the number of arrows until the minimum distance is reached. See Figure 7b, where intermediate renderings between the fourth and third indicators 38, 36 are labelled 38b, 36b.

When the user drags their finger to reduce the distance between the locations and so reduce the speed, the arrows disappear smoothly. The outermost arrow retracts into (shrinks into) the one behind it until it disappears, continuing until the user stops dragging or until the minimum distance is attained. In some embodiments the outermost arrow fades away before it reaches the adjacent arrow. The arrows retracting into/growing out from the one behind, shown in Figures 7a to 7c, is one example of a multitude of possible animations to show changes in the character's move velocity. The outermost graphical indicator could fade in/out, or slide behind/out of the previous indicator, or both. The animation could also match the type of indicator used. For example, in the case of footprint indicators, the indicators could appear as if footsteps were imprinting onto/growing out of the ground.
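As one of the many possible animations mentioned above, the outermost arrow could be interpolated between the radius of the arrow behind it and its own target radius; the sketch below assumes the illustrative radii and the `visible_indicators` outputs from the earlier sketch.

```python
import math

THRESHOLDS = [40.0, 80.0, 120.0, 160.0]  # as in the earlier sketch

def outermost_position(first_touch, direction, count, fraction):
    """Place the newest arrow between the previous arrow's radius and its
    own, so it appears to grow out of (or retract into) the one behind it."""
    if count >= len(THRESHOLDS):
        return None  # all arrows fully shown; nothing growing or retracting
    r_prev = THRESHOLDS[count - 1] if count > 0 else 0.0
    r = r_prev + fraction * (THRESHOLDS[count] - r_prev)
    return (first_touch[0] + r * math.cos(direction),
            first_touch[1] + r * math.sin(direction))
```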

If there is an obstruction in the character's path, the animation of the arrows is exactly the same, but the character may not be able to move past the obstruction until the direction is changed. See Figure 6e.

The graphical indicator above the character indicates the character's strength or health, which fluctuates depending on the character's performance in the game against other characters and other game elements. When the user removes their finger, the character's movement stops and the indicators 32, 34, 36, 38 are retracted and fade away (similarly to as shown in Figure 7b).

To re-engage the character 26, a new finger touch starts the process over. A new initial point of contact is indicated and speed and direction are determined in relation to the new touch point. When moving back and forth along the indicators, the speed may increase and decrease depending on the direction in relation to the initial point of contact. When the character is moving (i.e. a user's finger is in contact with the screen to control the character 26), the user may control the speed by moving their finger in any direction away from the touch point. So even if the user drags their finger in another direction, the speed of the character is still increased and decreased dependent on the distance to the initial touch point. Thus it is not necessary that the finger only moves back and forth along the indicators. It may move to anywhere on the screen and, as long as the distance between the current and initial touch point increases (increasing speed) or decreases (decreasing speed), the speed may be modified. Figure 7c shows the effects on the graphical indicators when a user changes the direction of their finger swipe, e.g. swipe 40a causes the line of indicators to move from position 30 through position 30' to position 30", showing that the graphical indicators are repositioned along the new direct path.
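A sketch of this pivot behaviour follows: whatever direction the finger moves in, the indicator positions are recomputed along the current direct path, and speed follows the distance alone. The names and radii are assumptions carried over from the earlier sketches.

```python
import math

THRESHOLDS = [40.0, 80.0, 120.0, 160.0]  # as in the earlier sketches

def indicator_positions(first_touch, contact):
    """Indicators always lie on the current direct path from the first
    touch point (the pivot) towards the latest contact point."""
    dx = contact[0] - first_touch[0]
    dy = contact[1] - first_touch[1]
    distance, direction = math.hypot(dx, dy), math.atan2(dy, dx)
    return [(first_touch[0] + r * math.cos(direction),
             first_touch[1] + r * math.sin(direction))
            for r in THRESHOLDS if distance >= r]
```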

In some embodiments a small "dead zone" area (of a few pixels) around the initial touch position may be implemented, to avoid detecting unintended slight initial movements of the player's finger. A decision as to whether or not a user intends to move the character, and has not just inadvertently moved their finger, is determined/approximated by detecting a minimum distance between the initial touch point and the second contact point before a move is considered intentional and subsequently implemented.

The touch point may coincide with a specific pixel. However, in other embodiments, there may not be a one to one relationship between a respective touch point and a respective pixel. The touch point may be determined by an algorithm in dependence on the signal output from the touch screen, for example to define coordinates with respect to the touch screen surface. When the user's finger stops moving, new coordinates may be detected for the new contact point and these coordinates may be used to determine whether the new contact point is outside a minimum distance from the touch point. When determining a contact point, the signal output is processed so as to define the contact point as the centre point of the touch location. Defining a centre of the touch location may make the determination of distance simpler.

Reference is made to Figure 8, which shows a method of an embodiment. In step S1, the user touches the screen. As mentioned, this touch may take any suitable format and may comprise the user dragging their finger across the touchscreen in order to move the user controlled entity. It should be appreciated that the touch may be via a user's fingers or by using a stylus or the like. In some embodiments, the user may need to physically touch the touch screen in order to cause the interaction. In other embodiments, the user's interaction may need only to be in close proximity to the touchscreen in order to cause the interaction. It should be appreciated that the term "touch" is intended to cover any or all of these scenarios which result in a user input being detected by the touchscreen.
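By way of illustration only, the centre-of-touch determination and the dead-zone test described above might be sketched as follows; the pixel values and names are assumptions.

```python
import math

def touch_center(touched_pixels):
    """Define the contact point as the centre of the touched region; a
    single centre point keeps the later distance computation simple."""
    xs = [x for x, _ in touched_pixels]
    ys = [y for _, y in touched_pixels]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def is_intentional_move(first_touch, contact, dead_zone=6.0):
    """Only movements beyond a few pixels of the initial touch position
    are treated as an intentional move of the character."""
    return math.hypot(contact[0] - first_touch[0],
                      contact[1] - first_touch[1]) > dead_zone
```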

In step S2, electrical signals may be generated by the touchscreen circuitry. This is as described for example in relation to Figure 2. The electrical signals provide information as to the location where the user has touched the touchscreen (the user's first touch point) and the latest contact point with the screen.

The electrical signals are provided in step S3 to a processing function. The processing function may be carried out by the code executed by the processors 115. The signal processing function may analyze the signals to detect the user's intention to control the movement of the user controlled entity 26.

The processing function determines the location of the user's input. The location of the user's touch on the touch screen is determined from the received electrical signals from the touch screen, as is known in the art. It should be appreciated that the processing function may also use stored information as to the entity or entities being displayed on the display at the location associated with the touch input.

To move the character, the user input may be received continuously over a period of time. The location of the touch on the touchscreen changes and a determination may be made as to how fast and in what direction the user controlled character is to be moved. In some embodiments, this may be in conjunction with determining that the initial touch location is associated with a user controlled entity. Note that the direction of movement is also determined.

In step S4, if it is determined that the user controlled entity is being moved, then the image of the user controlled entity which is displayed on the display moves along a direct path derived from (but not necessarily matching) the user's input traces, as described above. Where a user controlled entity is determined to be moving, the path of the movement and/or the location of the user controlled entity may be stored in memory. In some embodiments, where a user controlled entity is determined to move, the location data of the user controlled entity may be updated in memory. In step S5, appropriate graphical icons are rendered as described earlier, corresponding to the determined distance between the touch points and the direction of movement of the user's touch.
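Putting steps S1 to S5 together, a per-update handler might look like the following sketch. All names and constants are assumptions, and rendering is reduced to an indicator count.

```python
import math

THRESHOLDS = [40.0, 80.0, 120.0, 160.0]  # illustrative indicator radii
DEAD_ZONE = 6.0                          # illustrative, a few pixels
MAX_VELOCITY = 5.0

def on_touch_update(first_touch, contact):
    """S2/S3: locations derived from the touch screen signals; S4: a move
    command along the direct path; S5: the number of indicators to render."""
    dx = contact[0] - first_touch[0]
    dy = contact[1] - first_touch[1]
    distance = math.hypot(dx, dy)
    if distance <= DEAD_ZONE:
        return None                      # inadvertent movement: no move (S1)
    direction = math.atan2(dy, dx)
    speed = MAX_VELOCITY * min(distance, THRESHOLDS[-1]) / THRESHOLDS[-1]
    indicators = sum(1 for r in THRESHOLDS if distance >= r)
    return {"direction": direction, "speed": speed, "indicators": indicators}
```

For example, with the values above, on_touch_update((100, 100), (190, 100)) would report a heading of 0 radians, two visible indicators and just over half of the maximum speed.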

Various embodiments of methods and devices have been described in the foregoing. It should be appreciated that such embodiments may be implemented in apparatus, where the apparatus is implemented by any suitable circuitry. Some embodiments may be implemented by at least one memory and at least one processor. The memory may be provided by memory circuitry and the processor may be provided by processor circuitry. Some embodiments may be provided by a computer program running on the at least one processor. The computer program may comprise computer implemented instructions which are stored in the at least one memory and which may be run on the at least one processor.

It is also noted herein that there are several variations and modifications which may be made to the disclosed solution without departing from the scope of the present disclosure.