

Title:
CONTROLLING A HAPTIC TOUCH-BASED INPUT DEVICE
Document Type and Number:
WIPO Patent Application WO/2023/217366
Kind Code:
A1
Abstract:
A computing device for controlling a haptic touch-based input device is provided. The computing device comprises processing circuitry causing the computing device to become operative to control the input device to render one or more primary haptic trails on a touch surface of the input device. Each haptic trail extends between an entry area (E1/2) and a trigger area (T1, T2). The trigger area (T1, T2) is associated with an action which is triggered in response to detecting a contact input to the trigger area (T1, T2). The computing device becomes further operative to detect that an object (140) contacts the touch surface at the entry area (E1/2) of at least one of the one or more primary haptic trails and slides along a selected one of the one or more primary haptic trails towards the trigger area (T2) of the selected primary haptic trail. The computing device becomes further operative to trigger the associated action in response to detecting that the object (140) has reached the trigger area (T2) of a haptic trail.

Inventors:
ARNGREN TOMMY (SE)
ÖKVIST PETER (SE)
KRISTENSSON ANDREAS (SE)
Application Number:
PCT/EP2022/062805
Publication Date:
November 16, 2023
Filing Date:
May 11, 2022
Assignee:
ERICSSON TELEFON AB L M (SE)
International Classes:
G06F3/01; G06F3/0482; G06F3/04883
Foreign References:
CN109254658A2019-01-22
US20140098038A12014-04-10
US20020054060A12002-05-09
US20110128239A12011-06-02
Attorney, Agent or Firm:
ERICSSON (SE)
Claims:
CLAIMS

1. A computing device (100) for controlling a haptic touch-based input device (110), the computing device comprising processing circuitry (120) causing the computing device (100) to become operative to: control the input device (110) to render one or more primary haptic trails (310) on a touch surface of the input device (110), each haptic trail (310) extending between an entry area (E1, E2, E3/4, E1/2, E/T) and a trigger area (T1, T2, T3, T4, E/T), the trigger area being associated with an action which is triggered in response to detecting a contact input to the trigger area (T1, T2, T3, T4, E/T), detect that an object (140) contacts the touch surface at the entry area (E1, E2, E1/2, E/T) of at least one of the one or more primary haptic trails and slides along a selected one of the one or more primary haptic trails towards the trigger area (T1, T2, E/T) of the selected primary haptic trail, and in response to detecting that the object (140) has reached the trigger area (T1, T2, T3, T4, E/T) of a haptic trail, trigger the associated action.

2. The computing device (100) according to claim 1, operative to control the input device (110) to render two or more primary haptic trails with their respective entry areas (E1/2) at a common position on the touch surface.

3. The computing device (100) according to claim 2, operative to control the input device (110) to render the two or more primary haptic trails having respective directions of entry into the haptic trails which are non-parallel.

4. The computing device (100) according to claim 1, operative to control the input device (110) to render each of the one or more primary haptic trails with its entry area (E/T) and its trigger area (E/T) at a respective common position on the touch surface.

5. The computing device (100) according to claim 4, wherein the trigger area (E/T) of each one of the one or more primary haptic trails is associated with two actions, the computing device operative to trigger one of the two actions based on a direction of the object (140) sliding along the selected primary haptic trail.

6. The computing device (100) according to any one of claims 1 to 5, operative to control the input device (110) to render the one or more primary haptic trails in response to detecting that the object (140) contacts the touch surface.

7. The computing device (100) according to claim 6, operative to control the input device (110) to render the one or more primary haptic trails with their respective entry areas (E1, E2, E1/2, E/T) at a position where the object (140) contacts the touch surface.

8. The computing device (100) according to any one of claims 1 to 7, operative to control the input device (110) to render one or more secondary haptic trails in response to detecting that the object (140) slides along the selected primary haptic trail.

9. The computing device (100) according to claim 8, operative to control the input device (110) to render the one or more secondary haptic trails with their respective entry areas (E3/4) at a current position where the object (140) contacts the touch surface while sliding along the selected primary haptic trail.

10. The computing device (100) according to claims 8 or 9, operative to control the input device (110) to render the one or more secondary haptic trails based on the selected primary haptic trail.

11. The computing device (100) according to any one of claims 8 to 10, operative to control the input device (110) to associate the respective actions with the trigger areas (T3, T4) of the secondary haptic trails based on the selected primary haptic trail.

12. The computing device (100) according to any one of claims 1 to 11, operative to control the input device (110) to render one or more of the not selected primary haptic trails with their respective entry areas (E2) at a current position where the object (140) contacts the touch surface while sliding along the selected primary haptic trail.

13. The computing device (100) according to any one of claims 1 to 12, operative to control the input device (110) to render one or more of the not selected primary haptic trails with their respective entry areas (E2) at a current position where the object (140) contacts the touch surface in response to receiving an indication that a user operating the input device (110) intends to select a different primary haptic trail.

14. The computing device (100) according to claim 13, operative to receive the indication that a user operating the input device (110) intends to select a different primary haptic trail by any one, or a combination, of: detecting that the object (140) sliding along the selected primary haptic trail has reversed direction, detecting that the object (140) sliding along the selected primary haptic trail has stopped sliding, and detecting a change in pressure on the touch surface of the object (140) sliding along the selected primary haptic trail.

15. The computing device (100) according to any one of claims 1 to 14, operative to control the input device (110) to render the haptic trails with a spatial arrangement based on any one, or a combination, of: a position where the object (140) contacts the touch surface, a type of the object (140), an identity of the object (140), a current location of the input device (110), an environment of the input device (110), and an identity of a user of the input device (110).

16. The computing device (100) according to any one of claims 1 to 15, operative to associate the respective actions with the trigger areas (T1, T2, T3, T4, E/T) of the haptic trails based on any one, or a combination, of: a position where the object (140) contacts the touch surface, a type of the object (140), an identity of the object (140), a current location of the input device (110), an environment of the input device (110), and an identity of a user of the input device (110).

17. The computing device (100) according to any one of claims 1 to 16, operative to control the input device (110) to cease rendering the one or more of the not selected haptic trails in response to detecting that the object (140) has reached the trigger area (T1, T2, T3, T4, E/T) of a haptic trail.

18. The computing device (100) according to any one of claims 1 to 17, operative to control the input device (110) to render the haptic trails haptically distinguishably.

19. The computing device (100) according to any one of claims 1 to 18, wherein the object (140) is any one of: a finger and a stylus pen.

20. The computing device (100) according to any one of claims 1 to 19, wherein the haptic touch-based input device (110) is a haptic touchscreen.

21. A method (500) of controlling a haptic touch-based input device (110), the method being performed by a computing device (100) and comprising: controlling (502) the input device (110) to render one or more primary haptic trails on a touch surface of the input device (110), each haptic trail extending between an entry area (E1, E2, E3/4, E1/2, E/T) and a trigger area (T1, T2, T3, T4, E/T), the trigger area being associated with an action which is triggered in response to detecting a contact input to the trigger area (T1, T2, T3, T4, E/T), detecting (503) that an object (140) contacts the touch surface at the entry area (E1, E2, E1/2, E/T) of at least one of the one or more primary haptic trails and slides along a selected one of the one or more primary haptic trails towards the trigger area (T1, T2, E/T) of the selected primary haptic trail, and in response to detecting (507) that the object has reached the trigger area (T1, T2, T3, T4, E/T) of a haptic trail, triggering (509) the associated action.

22. The method (500) according to claim 21, wherein the two or more primary haptic trails are rendered with their respective entry areas (E1/2) at a common position on the touch surface.

23. The method (500) according to claim 22, wherein the two or more primary haptic trails are rendered having respective directions of entry into the haptic trails which are non-parallel.

24. The method (500) according to claim 21, wherein each of the one or more primary haptic trails is rendered with its entry area (E/T) and its trigger area (E/T) at a respective common position on the touch surface.

25. The method (500) according to claim 24, wherein the trigger area (E/T) of each one of the one or more primary haptic trails is associated with two actions, and one of the two actions is triggered based on a direction of the object (140) sliding along the selected primary haptic trail.

26. The method (500) according to any one of claims 21 to 25, wherein the one or more primary haptic trails are rendered in response to detecting (503) that the object (140) contacts the touch surface.

27. The method (500) according to claim 26, wherein the one or more primary haptic trails are rendered with their respective entry areas (E1, E2, E1/2, E/T) at a position where the object (140) contacts the touch surface.

28. The method (500) according to any one of claims 21 to 27, further comprising controlling (506) the input device (110) to render one or more secondary haptic trails in response to detecting (503) that the object slides along the selected primary haptic trail.

29. The method (500) according to claim 28, wherein the one or more secondary haptic trails are rendered with their respective entry areas (E3/4) at a current position where the object (140) contacts the touch surface while sliding along the selected primary haptic trail.

30. The method (500) according to claims 28 or 29, wherein the one or more secondary haptic trails are rendered based on the selected primary haptic trail.

31. The method (500) according to any one of claims 28 to 30, wherein the respective actions are associated with the trigger areas (T3, T4) of the secondary haptic trails based on the selected primary haptic trail.

32. The method (500) according to any one of claims 21 to 31, further comprising controlling (505) the input device (110) to render one or more of the not selected primary haptic trails with their respective entry areas (E2) at a current position where the object (140) contacts the touch surface while sliding along the selected primary haptic trail.

33. The method (500) according to any one of claims 21 to 32, further comprising controlling (505) the input device (110) to render one or more of the not selected primary haptic trails with their respective entry areas (E2) at a current position where the object (140) contacts the touch surface in response to receiving (504) an indication that a user operating the input device (110) intends to select a different primary haptic trail.

34. The method (500) according to claim 33, wherein the receiving (504) an indication that a user operating the input device (110) intends to select a different primary haptic trail comprises any one, or a combination, of: detecting that the object (140) sliding along the selected primary haptic trail has reversed direction, detecting that the object (140) sliding along the selected primary haptic trail has stopped sliding, and detecting a change in pressure on the touch surface of the object (140) sliding along the selected primary haptic trail.

35. The method (500) according to any one of claims 21 to 34, wherein the haptic trails are rendered with a spatial arrangement based on any one, or a combination, of: a position where the object (140) contacts the touch surface, a type of the object (140), an identity of the object (140), a current location of the input device (110), an environment of the input device (110), and an identity of a user of the input device (110).

36. The method (500) according to any one of claims 21 to 35, wherein the respective actions are associated with the trigger areas (T1, T2, T3, T4, E/T) of the haptic trails based on any one, or a combination, of: a position where the object (140) contacts the touch surface, a type of the object (140), an identity of the object (140), a current location of the input device (110), an environment of the input device (110), and an identity of a user of the input device (110).

37. The method (500) according to any one of claims 21 to 36, further comprising controlling (508) the input device (110) to cease rendering the one or more of the not selected haptic trails in response to detecting (507) that the object (140) has reached the trigger area (T1, T2, T3, T4, E/T) of a haptic trail.

38. The method (500) according to any one of claims 21 to 37, wherein the haptic trails are rendered haptically distinguishably.

39. The method (500) according to any one of claims 21 to 38, wherein the object (140) is any one of: a finger and a stylus pen.

40. The method (500) according to any one of claims 21 to 39, wherein the haptic touch-based input device (110) is a haptic touchscreen.

41. A computer program (213) comprising instructions which, when the computer program (213) is executed by a computing device (100), cause the computing device (100) to carry out the method (500) according to any one of claims 21 to 40.

42. A computer-readable data carrier (212) having stored thereon the computer program (213) according to claim 41.

43. A data carrier signal carrying the computer program (213) according to claim 41.

Description:
CONTROLLING A HAPTIC TOUCH-BASED INPUT DEVICE

Technical field

The invention relates to a computing device for controlling a haptic touch-based input device, a method of controlling a haptic touch-based input device, a corresponding computer program, a corresponding computer-readable data carrier, and a corresponding data carrier signal.

Background

Many computing devices, including smartphones and tablets, but also infotainment systems in modern cars, can be controlled by users via a touch-based user interface which is rendered on a touchscreen, comprising virtual User-Interface (UI) elements such as virtual buttons, keys, sliders, dials, etc.

In order to provide guidance to users operating a touch-based virtual UI, in particular in situations when users cannot gaze at the UI for a sufficient duration of time, it is known to provide haptic feedback so as to mimic the sense of touching and/or actuating UI elements which users are acquainted with from physical, i.e., mechanical, buttons, keys, sliders, dials, etc. Haptic feedback may be provided by haptic technologies which can provide a sensation of touch to the user interacting with a touchscreen or other touch-based input device. Haptic feedback can create an experience of touch by applying forces, vibrations, or motions, to the object contacting the touchscreen, such as the finger of the user or a stylus pen which the user is holding. Thereby, finding the correct button or other UI element on a touchscreen is simplified, in particular if the user cannot view the displayed UI elements, or if the touch-based input device which the user is operating is not a display. However, it is a challenge for users to operate a touch-based UI while not viewing a visual representation of the UI, either because they are not able to view the touchscreen, e.g., because the computing device the user is operating is in a pocket, or because the user is driving a car and cannot direct his/her attention to the touchscreen, or because the UI is only haptically rendered on a touch surface of an input device, but not visually rendered for display.

It is an object of the invention to provide an improved alternative to the above techniques and prior art.

More specifically, it is an object of the invention to provide improved haptic touch-based UIs. In particular, it is an object of the invention to provide improved haptic touch-based UIs which can be more easily operated by users without viewing a visual representation of the UI.

These and other objects of the invention are achieved by means of different aspects of the invention, as defined by the independent claims. Embodiments of the invention are characterized by the dependent claims.

According to a first aspect of the invention, a computing device for controlling a haptic touch-based input device is provided. The computing device comprises processing circuitry which causes the computing device to become operative to control the input device to render one or more primary haptic trails on a touch surface of the input device. Each haptic trail extends between an entry area and a trigger area. The trigger area is associated with an action which is triggered in response to detecting a contact input to the trigger area. The computing device becomes further operative to detect that an object contacts the touch surface at the entry area of at least one of the one or more primary haptic trails and slides along a selected one of the one or more primary haptic trails towards the trigger area of the selected primary haptic trail. The computing device becomes further operative to trigger the associated action in response to detecting that the object has reached the trigger area of a haptic trail.

According to a second aspect of the invention, a method of controlling a haptic touch-based input device is provided. The method is performed by a computing device and comprises controlling the input device to render one or more primary haptic trails on a touch surface of the input device. Each haptic trail extends between an entry area and a trigger area. The trigger area is associated with an action which is triggered in response to detecting a contact input to the trigger area. The method further comprises detecting that an object contacts the touch surface at the entry area of at least one of the one or more primary haptic trails and slides along a selected one of the one or more primary haptic trails towards the trigger area of the selected primary haptic trail. The method further comprises triggering the associated action in response to detecting that the object has reached the trigger area of a haptic trail.

According to a third aspect of the invention, a computer program is provided. The computer program comprises instructions which, when the computer program is executed by a computing device, causes the computing device to carry out the method according to an embodiment of the second aspect of the invention.

According to a fourth aspect of the invention, a computer-readable data carrier is provided. The data carrier has stored thereon the computer program according to an embodiment of the third aspect of the invention.

According to a fifth aspect of the invention, a data carrier signal is provided. The data carrier signal carries the computer program according to an embodiment of the third aspect of the invention.

In the present context, a haptic trail is a haptically rendered UI element which provides haptic feedback to an object contacting the touch surface of a haptic touch-based input device, such as a finger of a user operating the touch-based input device, or a stylus pen held by the user. The haptic feedback is used to prevent the finger or other object contacting the touch surface from leaving the path defined by the haptic trail, unless a threshold force or barrier is overcome.

The invention makes use of an understanding that improved haptic touch-based UIs, which can be more easily operated by users without viewing a visual representation of the UI, may be achieved by means of haptic trails which facilitate input gestures without requiring the user to view the touch surface of the input device.

Even though advantages of the invention have in some cases been described with reference to embodiments of the first aspect of the invention, corresponding reasoning applies to embodiments of other aspects of the invention.

Further objectives of, features of, and advantages with, the invention will become apparent when studying the following detailed disclosure, the drawings and the appended claims. Those skilled in the art realize that different features of the invention can be combined to create embodiments other than those described in the following.

Brief description of the drawings

The above, as well as additional objects, features and advantages of the invention, will be better understood through the following illustrative and non-limiting detailed description of embodiments of the invention, with reference to the appended drawings, in which:

Fig. 1 shows a computing device for controlling a haptic touch-based input device, in accordance with embodiments of the invention.

Fig. 2 shows processing circuitry comprised in the computing device for controlling a haptic touch-based input device, in accordance with embodiments of the invention.

Figs. 3A-3C illustrate rendering of haptic trails, in accordance with embodiments of the invention.

Figs. 4A-4D illustrate arrangements of haptic trails, in accordance with embodiments of the invention.

Fig. 5 shows a method of controlling a haptic touch-based input device, in accordance with embodiments of the invention.

All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.

Detailed description

The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.

Fig. 1 illustrates a computing device 100 for controlling a haptic touch-based input device 110, in accordance with embodiments of the invention. The computing device 100 may, e.g., be a smartphone, a tablet computer (aka tablet), a laptop, a desktop computer, a vending machine, an infotainment system for a car, or the like. The haptic touch-based input device 110 may, e.g., be a haptic touchscreen, comprising a touch surface or touch panel, which is integrated into a front face of the computing device 100. The haptic touch-based input device 110 may alternatively be provided separate from, but operatively connected to, the computing device 100. As an alternative, the haptic touch-based input device 110 may comprise a touch surface, or touch panel, without an integrated display. Such a touch surface without display capabilities is not able to visually render a UI with its UI elements, such as virtual buttons, keys, sliders, dials, etc., but can only haptically render UI elements for sensing by the user who is operating the touch-based input device 110 by contacting its touch surface with his/her finger or other object, e.g., a stylus pen. Such a haptic touch-based input device 110 may, e.g., be provided with its touch surface on a rear face of the computing device 100 (indicated as a dashed rectangle 110 in Fig. 1) for enabling the user to control the computing device 100 using one, or both, of the touch surfaces, i.e., the touchscreen 110 with its touch surface provided on the front face, and/or the input device 110 with its touch surface provided on the rear face of the computing device 100. The computing device 100 may alternatively be embodied as a display-less device, e.g., as a key fob, remote control, or similar type of device, which can be used for unlocking doors by performing a touch gesture on a touch surface of a haptic touch-based input device 110 which is integrated into the key fob, remote control, or similar type of device.

The haptic touch-based input device 110 may be provided with integrated haptic capabilities. For instance, the haptic touch-based input device 110 may be based on Electroactive Polymers (EAPs), which are deposited in a multilayer structure on top of a surface, e.g., a conventional (non-haptic) touchscreen, to provide a touch surface with spatially resolved haptic actuation by application of an electric field (see, e.g., US 2002/0054060 A1, US 2011/0128239 A1). Thereby, changes in texture, vibrations, forces, or motion, can be rendered for sensing by an object interacting with, i.e., contacting or touching, the touch surface, such as a finger of a user operating the touch-based input device 110, or a stylus pen held by the user. The touch surface of the haptic touch-based input device 110 may alternatively be based on other known technologies for haptic actuation, including piezoelectric actuators, shape-shifting materials, linear resonant actuators, UV shape polymers, etc. The haptic touch-based input device 110 may alternatively comprise a non-haptic touch-based input device, e.g., a conventional touchscreen (i.e., a touch-sensitive display without integrated haptic capabilities), in combination with a haptic actuator which may be provided separately from the non-haptic touch-based input device 110. For instance, the haptic actuator may be based on ultrasonic haptic technology which enables creating a haptic sensation mid-air. Ultrasonic haptic technology utilizes ultrasonic focusing technology and modulation to apply desired tactile sensory stimuli to a certain point in mid-air, by controlling the phase and intensity of ultrasound pulses which are emitted by a set of ultrasound transducers. Such an ultrasonic haptic actuator may be provided adjacent to the touch surface of the haptic touch-based input device 110 so as to provide a haptic sensation to the finger of the user when approaching, being close to, or contacting, the touch surface. As an example, Ultraleap offers a mid-air haptics UI for automotive applications (https://www.ultraleap.com/enterprise/automotive/, retrieved on 16 March 2022).

The known haptic technologies described above may be used for rendering one or more haptic trails on the touch surface of the haptic touch-based input device 110, as is schematically illustrated in Figs. 3A-3C, which each show a top view onto the touch surface of the input device 110 to the left, and a cross section of the touch surface of the input device 110 to the right.

In the present context, a haptic trail 310 provides haptic feedback to an object 140 contacting the touch surface of the haptic touch-based input device 110, such as a finger or a stylus pen (in Fig. 3 sketched with a bold line), in the form of force, friction, motion, surface texture, etc., so as to guide the user to move his/her finger 140, or other object 140 he/she is holding to contact the touch surface, along the path defined by the haptic trail 310 along the touch surface, i.e., while the finger 140 or other object 140 is sliding across the touch surface. Haptic feedback is used to prevent the finger 140 or other object 140 contacting the touch surface from leaving the path defined by the haptic trail 310, unless a threshold force or barrier is overcome.

With reference to Fig. 3A, a haptic trail 310 may, e.g., be created by haptically rendering changes in topology of the touch surface of the input device 110 so as to create an elongated groove 312 along the path defined by the haptic trail 310, with a depth of a few tenths of a millimeter below the surrounding touch surface. As an alternative, which is sketched in Fig. 3B, the haptic trail 310 may be delimited by elongated barriers 322 extending along each side of the path defined by the haptic trail 310, the barriers 322 being raised a few tenths of a millimeter above the surrounding touch surface. As a further alternative, shown in Fig. 3C, a haptic trail 310 may be generated by rendering friction which opposes the motion of a tip of the finger 140 or other object 140 sliding along the haptic trail 310 but deviating from its center 331 (illustrated as a dashed line in Fig. 3C), so as to push or force the tip of the finger 140 or other object 140 back towards the center 331 of the haptic trail 310. This may be achieved by rendering a force onto the tip of the finger 140 or object 140 contacting the haptic trail 310, which increases with increasing distance of the tip of the finger 140 or other object 140 from the center 331 of the haptic trail 310. It will also be appreciated that haptic properties other than force may be used to guide the tip of the finger 140 or other object 140 when sliding along the haptic trail 310, including variations in texture or vibrations which assist the user in distinguishing the haptic trail 310 from surrounding areas of the touch surface.
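
By way of illustration only, the friction-based guidance of Fig. 3C can be expressed as a restoring force that grows with the distance of the contact point from the center 331 of the haptic trail 310. The sketch below is a minimal, non-limiting example; the linear force profile and the force_gain and max_force parameters are assumptions made for the example and are not prescribed by this disclosure.

```python
# Minimal sketch of the centering force described for Fig. 3C. The linear
# profile and the force_gain/max_force parameters are illustrative
# assumptions, not values prescribed by this disclosure.

def centering_force(contact, trail_center, force_gain=0.8, max_force=1.0):
    """Return a 2D force vector pushing the fingertip back towards the
    center of the haptic trail, increasing with its distance from it."""
    dx = trail_center[0] - contact[0]
    dy = trail_center[1] - contact[1]
    distance = (dx ** 2 + dy ** 2) ** 0.5
    if distance == 0.0:
        return (0.0, 0.0)
    magnitude = min(force_gain * distance, max_force)
    return (magnitude * dx / distance, magnitude * dy / distance)
```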

Further with reference to Fig. 1, the computing device 100 (herein also referred to as the computing device) for controlling a haptic touch-based input device 110 comprises processing circuitry 120. The processing circuitry 120, which is schematically illustrated in Fig. 2, may comprise one or more processors 211, such as Central Processing Units (CPUs), microprocessors, application processors, application-specific processors, Graphics Processing Units (GPUs), and Digital Signal Processors (DSPs) including image processors, or a combination thereof, and a memory 212 comprising a computer program 213 comprising instructions. When executed by the processor(s) 211, the instructions cause the computing device 100 to become operative in accordance with embodiments of the invention described herein. The memory 212 may, e.g., be a Random-Access Memory (RAM), a Read-Only Memory (ROM), a Flash memory, or the like. The computer program 213 may be downloaded to the memory 212 by means of a network interface circuitry (not shown in Figs. 1 and 2), as a data carrier signal carrying the computer program 213. The network interface circuitry may comprise one or more of a cellular modem (e.g., GSM, UMTS, LTE, 5G, or higher generation), a WLAN/Wi-Fi modem, a Bluetooth modem, an Ethernet interface, an optical interface, or the like, for exchanging data between the computing device 100 and other computing devices, communications devices, a radio-access network, and/or the Internet. The processing circuitry 120 may alternatively or additionally comprise one or more Application-Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), or the like, which are operative to cause the computing device 100 to become operative in accordance with embodiments of the invention described herein.

The processing circuitry 120, or the computing device 100, may further comprise one or more interface circuitries 201 (“I/O” in Fig. 2) for communicating with the haptic touch-based input device 110, and optionally other circuitry or devices. The one or more interface circuitries 201 may be based on Universal Serial Bus (USB), the Vehicle Communication Interface (VCI), or the like.

More specifically, the processing circuitry 120 causes the computing device 100 to become operative to control the haptic touch-based input device 110 to render one or more primary haptic trails (in Figs. 4A-4D illustrated as solid lines connecting an entry area, marked “E”, and a trigger area, marked “T”) on a touch surface of the input device 110. In Figs. 4A-4D, haptic trails are for simplicity illustrated as lines, where each line symbolizes a path defining a haptic trail 310 similar to what has been described hereinbefore and is illustrated in Figs. 3A-3C. Each haptic trail, both primary haptic trails as well as secondary haptic trails which are described below, extends between an entry area (marked “E” in Figs. 4A-4D) and a trigger area (marked “T” in Figs. 4A-4D). The entry area of a haptic trail is the starting point of the haptic trail. The trigger area of each haptic trail is associated with an action which is triggered, e.g., started or initiated, in response to detecting a contact input to the trigger area, i.e., an object 140 (such as a finger) contacting the touch surface of the input device 110 within a defined region of the trigger area. An action which is triggered in response to detecting a contact input to the trigger area may, e.g., be an action which typically is triggered by operating or actuating a UI element, such as starting an app or functionality of the computing device 100 or any other computing device which is operatively connected to, and controlled by, the computing device 100, changing a setting, or the like. Examples of actions associated with trigger areas of haptic trails include, but are not limited to, controlling the operation of computing device 100 or a device or apparatus controlled by the computing device 100, and starting, stopping, or modifying, functionality performed by the computing device 100 or a device or apparatus controlled by the computing device 100, such as increasing or decreasing playout volume of a music player, switching a light on or off, increasing or decreasing a temperature, locking or unlocking a door, etc.
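
The relationship between a haptic trail, its entry area, its trigger area, and the associated action can be summarized by a simple data model, as in the non-limiting sketch below. The class and field names, and the circular shape of the areas, are assumptions made for the example only.

```python
# Illustrative data model; class names, field names and the circular area
# shape are assumptions made for this sketch only.
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]

@dataclass
class Area:
    center: Point
    radius: float  # region within which a contact input is registered

    def contains(self, p: Point) -> bool:
        dx, dy = p[0] - self.center[0], p[1] - self.center[1]
        return (dx ** 2 + dy ** 2) ** 0.5 <= self.radius

@dataclass
class HapticTrail:
    entry_area: Area              # "E" in Figs. 4A-4D
    trigger_area: Area            # "T" in Figs. 4A-4D
    path: List[Point]             # path rendered haptically on the touch surface
    action: Callable[[], None]    # action triggered at the trigger area
```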

The computing device 100 becomes further operative to detect that the object 140 contacts the touch surface at the entry area (e.g., entry area E1/2 in Fig. 4A) of at least one of the one or more primary haptic trails and slides along a selected one of the one or more primary haptic trails towards the trigger area of the selected primary haptic trail (e.g., trigger area T2, as illustrated in Fig. 4A). The computing device 100 becomes further operative to trigger the associated action in response to detecting that the object 140 has reached the trigger area of a haptic trail (e.g., trigger area T2). This may either be achieved by detecting that the finger 140 or other object 140 has reached a certain position within the trigger area T2, e.g., a center point of the trigger area, or that the finger 140 or other object 140 has passed, i.e., has slid past, a boundary of the trigger area T2. The trigger area may either be that of one of the primary haptic trails, such as trigger areas T1 or T2, or that of a secondary haptic trail, such as trigger areas T3 or T4, described in further detail below. Optionally, the computing device 100 may be operative to control the input device 110 to render distinctive haptic feedback in response to detecting that the object 140 has reached the trigger area of a haptic trail. Thereby, the user can be notified that the object 140 has reached the trigger area and the associated action is about to be triggered.
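
A hedged sketch of this trigger logic is given below, reusing the hypothetical Area and HapticTrail classes from the previous sketch; the containment test and the boundary-crossing test mirror the two detection alternatives mentioned above, and are illustrative only.

```python
# Sketch of triggering the associated action; reuses the hypothetical
# Area/HapticTrail classes from the previous sketch.

def has_reached_trigger(trail: HapticTrail, contact: Point,
                        previous_contact: Point) -> bool:
    """True if the object has reached the trigger area, either by entering
    it (e.g., reaching its center region) or by sliding past its boundary."""
    if trail.trigger_area.contains(contact):
        return True
    # Slid past the boundary: inside at the previous sample, beyond it now.
    return (trail.trigger_area.contains(previous_contact)
            and not trail.trigger_area.contains(contact))

def on_contact_moved(selected: HapticTrail, contact: Point,
                     previous_contact: Point) -> None:
    if has_reached_trigger(selected, contact, previous_contact):
        selected.action()  # trigger the associated action
```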

The computing device 100 may be operative to control the input device 110 to render the one or more haptic trails with varying spatial arrangements. For instance, the computing device 100 may be operative to control the input device 110 to render two or more primary haptic trails with their respective entry areas at a common position on the touch surface, in Fig. 4A illustrated as entry area E1/2. The entry area E1/2 is a common entry area for both primary haptic trails illustrated in Fig. 4A, a first primary haptic trail extending between the entry area E1/2 and the trigger area T1, and a second primary haptic trail extending between the entry area E1/2 and the trigger area T2. In practice, the entry area E1/2 is the position where the finger 140 or other object 140 contacts the touch surface of the input device 110 and subsequently slides along one of the primary haptic trails, herein referred to as the selected haptic trail. As an alternative to rendering the entry area E1/2 as a common entry area for both primary haptic trails, embodiments of the invention may render separate entry areas E1 and E2 at substantially the same, i.e., a common, position on the touch surface. As a further alternative, embodiments of the invention may render separate entry areas E1 and E2 at distinct positions on the touch surface in proximity of each other, preferably at least partially overlapping.

The computing device 100 may optionally be operative to control the input device 110 to render the two or more primary haptic trails having respective directions of entry into the haptic trails which are non-parallel. In the present context, the direction of entry is to be understood as the direction of the finger 140 or other object 140 entering a haptic trail at its entry area and sliding towards the selected haptic trail's trigger area. As an example, reference is made to Fig. 4A, which illustrates two primary haptic trails having respective directions of entry into the haptic trails which are non-parallel.

The computing device 100 may alternatively be operative to control the input device 110 to render each of the one or more primary haptic trails with its entry area and its trigger area at a respective common position on the touch surface, in Fig. 4B illustrated as the common entry/trigger area E/T. In practice, each of the rendered haptic trails forms a loop along which the finger 140 or other object 140 can slide either clockwise or counterclockwise. As an alternative to rendering the entry area and the trigger area of each haptic trail at a common position on the touch surface, embodiments of the invention may render the entry area and the trigger area at distinct positions on the touch surface in proximity of each other, preferably at least partially overlapping. Optionally, the trigger area of each one of the one or more primary haptic trails may be associated with two actions, and the computing device 100 may be operative to trigger one of the two actions based on a direction of the object 140 sliding along the selected primary haptic trail. In this case, each of the directions in which the object 140 can slide along the haptic trail from its entry area to its trigger area, i.e., clockwise or counterclockwise, is associated with a corresponding action. For instance, one of the directions may be associated with locking a door, while the other direction may be associated with unlocking the door. As a further example, one of the directions may be associated with increasing a temperature or light intensity, while the other direction may be associated with decreasing the temperature or light intensity, respectively.
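
For such loop-shaped trails, the direction of sliding could, for instance, be estimated from sampled contact positions via the sign of the enclosed (shoelace) area, as in the hedged sketch below; the direction test and the lock/unlock mapping are illustrative assumptions, not the only possible implementation.

```python
# Sketch of selecting one of two actions from the sliding direction along a
# loop-shaped trail (Fig. 4B). The shoelace-area test and the action mapping
# are illustrative assumptions; a y-up coordinate system is assumed (the sign
# flips for y-down screen coordinates).

def sliding_direction(samples):
    """Return 'clockwise' or 'counterclockwise' for a list of sampled
    contact positions along the loop, using the signed (shoelace) area."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(samples, samples[1:] + samples[:1]):
        area += x1 * y2 - x2 * y1
    return "counterclockwise" if area > 0 else "clockwise"

def trigger_loop_action(samples, actions):
    # actions maps a direction to a callable, e.g.
    # {"clockwise": lock_door, "counterclockwise": unlock_door}
    actions[sliding_direction(samples)]()
```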

The computing device 100 may optionally be operative to control the input device 110 to render the one or more primary haptic trails in response to detecting that the object 140 contacts the touch surface. This is advantageous in that the computing device 100 does not need to render the haptic trails until the user starts interacting with the touch surface of the input device 110 using his/her finger 140 or other object 140. Optionally, the computing device 100 may be operative to control the input device 110 to render the one or more primary haptic trails with their respective entry areas at a position where the object 140 contacts the touch surface. This is advantageous in that the user does not need to contact the touch surface at a specific position, where the one or more primary haptic trails are rendered, but can simply touch the touch surface at a random position, in response to which the primary haptic trails are rendered.
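
One way this could be realized is to build the primary haptic trails only when the first contact is detected, placing their common entry area at the contact point, as in the non-limiting sketch below (again using the hypothetical classes introduced above); the trigger-area offsets, radii, and the render callback are assumptions made for the example.

```python
# Sketch of rendering the primary trails on first contact, with the common
# entry area E1/2 at the contact point (Fig. 4A). Offsets, radii and the
# render callback are illustrative assumptions.

def on_touch_down(contact: Point, actions, render) -> List[HapticTrail]:
    """actions: two callables for the trigger areas T1 and T2 (assumed)."""
    trails = []
    offsets = ((40.0, -25.0), (40.0, 25.0))   # towards T1 and T2
    for offset, action in zip(offsets, actions):
        trigger = Area(center=(contact[0] + offset[0],
                               contact[1] + offset[1]), radius=5.0)
        trail = HapticTrail(entry_area=Area(center=contact, radius=5.0),
                            trigger_area=trigger,
                            path=[contact, trigger.center],
                            action=action)
        render(trail)   # control the input device to render the trail
        trails.append(trail)
    return trails
```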

The computing device 100 may be operative to control the input device 110 to render one or more of the not selected primary haptic trails with their respective entry areas at a current position where the object 140 contacts the touch surface while sliding along the selected primary haptic trail. In other words, the entry areas of primary haptic trails other than the selected primary haptic trail follow the finger 140 or other object 140 while sliding along the selected primary haptic trail. This allows the user to change his/her mind by deviating from the selected primary haptic trail and entering one of the other primary haptic trails. As an example, Fig. 4C illustrates that the user has selected the first primary haptic trail, extending between the entry area E1 and the trigger area T1, by sliding with his/her finger 140, or other object 140, along the first primary haptic trail towards the trigger area T1. As can be seen in Fig. 4C, the entry area E2 of the second primary haptic trail, extending between the entry area E2 and the trigger area T2, follows the tip of the finger 140 or other object 140 while sliding along the first primary haptic trail. Advantageously, this provides the user operating the input device 110 with the opportunity to select a different haptic trail before reaching the trigger area (T1 in Fig. 4C) of the initially selected primary haptic trail. For the example illustrated in Fig. 4C, the user may deviate from the path defined by the first primary haptic trail towards the trigger area T1 and instead follow the path defined by the second primary haptic trail along the path towards trigger area T2. Optionally, the computing device 100 may be operative to control the input device 110 to render distinctive haptic feedback to make the user aware of the possibility to deviate from the selected primary haptic trail and instead select one of the other primary haptic trails. This may, e.g., be achieved by controlling the input device 110 to render the selected primary haptic trail, or the entry area E2 which is following the tip of the finger 140 or other object 140 while sliding along the selected primary haptic trail, with haptic properties different from a selected primary haptic trail in scenarios where the user cannot select a different primary haptic trail after the finger 140 or other object 140 has started sliding along the selected primary haptic trail.
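
A minimal sketch of letting the entry areas of the not selected primary haptic trails follow the sliding contact, as in Fig. 4C, is given below; re-routing each not selected trail as a straight path from the new entry area to its unchanged trigger area is an assumption made for the example.

```python
# Sketch of Fig. 4C: entry areas of the not selected primary trails follow
# the contact point while it slides along the selected trail. The straight
# re-routed path is an illustrative assumption; trigger areas stay in place.

def follow_contact(trails, selected, contact, render):
    for trail in trails:
        if trail is selected:
            continue
        trail.entry_area = Area(center=contact, radius=trail.entry_area.radius)
        trail.path = [contact, trail.trigger_area.center]
        render(trail)  # re-render the updated trail on the touch surface
```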

As an alternative, the computing device 100 may be operative to control the input device 110 to render one or more of the not selected primary haptic trails with their respective entry areas at a current position where the object 140 contacts the touch surface in response to receiving an indication that a user operating the input device 110 intends to select a different primary haptic trail. That is, entry areas of the one or more of the not selected primary haptic trails do not follow the tip of the finger 140 or other object 140 while sliding along the initially selected primary haptic trail as described above. Rather, in response to receiving an indication that the user intends to select a different one than the selected primary haptic trail, the one or more not selected primary haptic trails are rendered such that their entry areas are located at the current point of contact of the finger 140 or other object 140 with the touch surface. With reference to Fig. 4C, the entry area E2 of the second primary haptic trail, which is not the haptic trail initially selected by the user, would remain at its original position (illustrated as entry area E1/2 in Fig. 4A), until the computing device 100 receives the indication that the user intends to select a different primary haptic trail. This allows the user to select a different primary haptic trail by deviating from the currently selected primary haptic trail. The computing device 100 may be operative to receive the indication that a user operating the input device 110 intends to select a different primary haptic trail by any one, or a combination, of: detecting that the object 140 sliding along the selected primary haptic trail has reversed direction, detecting that the object 140 sliding along the selected primary haptic trail has stopped sliding, and detecting a change in pressure on the touch surface of the object 140 sliding along the selected primary haptic trail. These are actions which the user operating the input device 110 intuitively would perform when he/she has decided to select a different primary haptic trail than the currently selected primary haptic trail. The input device 110 may be operative to measure a change in pressure on the touch surface exerted by the object 140 by means of a pressure or force sensor which is integrated into the touch surface, e.g., a piezoelectric sensor or a MEMS sensor. Alternatively, the computing device 100, and/or the input device 110, may be operative to detect a change in pressure on the touch surface exerted by the finger 140 by detecting a change in the contact area between the finger 140 and the touch surface, as the contact area between the (soft) tip of the finger 140 and the touch surface changes when the force exerted by the finger 140 changes.
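
The three cues listed above can be combined into a single check, as in the hedged sketch below; the threshold values and units are purely illustrative assumptions.

```python
# Sketch of deriving the "intends to select a different trail" indication
# from the three cues named above. Threshold values are illustrative.

def wants_different_trail(velocity_along_trail: float, speed: float,
                          pressure: float, previous_pressure: float,
                          stop_speed: float = 2.0,          # assumed, e.g. mm/s
                          pressure_delta: float = 0.15) -> bool:  # assumed, normalised
    reversed_direction = velocity_along_trail < 0.0   # moving back towards the entry area
    stopped_sliding = speed < stop_speed
    pressure_changed = abs(pressure - previous_pressure) > pressure_delta
    return reversed_direction or stopped_sliding or pressure_changed
```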

The computing device 100 may alternatively be operative to receive an indication that the user operating the input device 110 intends to select a different primary haptic trail as a spoken instruction, or by detecting that the user is shaking the computing device 100. Optionally, the computing device 100 may be operative to control the input device 110 to render distinctive haptic feedback in response to receiving an indication that the user operating the input device 110 intends to select a different primary haptic trail, similar to what is described hereinbefore. Thereby, the user is made aware of the possibility to deviate from the selected primary haptic trail and instead select one of the other primary haptic trails.

The computing device 100 may further be operative to control the input device 110 to render one or more secondary haptic trails in response to detecting that the object 140 slides along the selected primary haptic trail. In the present context, secondary haptic trails are haptic trails which are rendered after, i.e., in response to, detecting that the object 140 has started sliding along one of the one or more primary haptic trails, i.e., one of the primary haptic trails has been selected by the user. These one or more secondary haptic trails may, e.g., be associated with alternative actions related to the action which is associated with the selected primary haptic trail. For instance, and with reference to Fig. 4D, if the trigger area T1 of the currently selected primary haptic trail is associated with switching on music playout in a car (whereas changing the temperature inside the car may be associated with an alternative primary haptic trail having trigger area T2), two secondary haptic trails may be rendered with trigger areas T3 and T4, which are associated with increasing and decreasing the playout volume, respectively. The computing device 100 may optionally be operative to control the input device 110 to render the one or more secondary haptic trails with their respective entry areas E3/4 at a current position where the object 140 contacts the touch surface while sliding along the selected primary haptic trail. Thereby, the different alternatives in the form of secondary haptic trails are available to the user wherever the finger 140 or other object 140 currently is located while sliding along the selected primary haptic trail, such that the user can deviate from the selected primary haptic trail to select one of the secondary haptic trails and eventually trigger one of the actions which are associated with the trigger areas T3 or T4 of the secondary haptic trails. In other words, the secondary haptic trails, or rather their respective entry area(s) E3/4, follow the finger 140 or other object 140 while sliding along the selected primary haptic trail. Optionally, the computing device 100 may be operative to control the input device 110 to render distinctive haptic feedback to make the user aware of the possibility to select a secondary haptic trail. This may, e.g., be achieved by controlling the input device 110 to render the selected primary haptic trail, or the entry areas E3/4 of the secondary haptic trails, with haptic properties different from a selected primary haptic trail in scenarios where the user cannot select a secondary haptic trail.
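
Continuing the music-playout example, the non-limiting sketch below spawns two secondary trails with a shared entry area E3/4 at the current contact point, associated with volume-up and volume-down actions; the geometry, radii, and callback names are assumptions made for the example and reuse the hypothetical classes from the earlier sketch.

```python
# Sketch of Fig. 4D: while the object slides along the selected primary
# trail, two secondary trails are rendered with their shared entry area
# E3/4 at the current contact point. Geometry and callbacks are assumed.

def render_secondary_trails(contact, volume_up, volume_down, render):
    secondaries = []
    for dy, action in ((-20.0, volume_up), (20.0, volume_down)):  # towards T3, T4
        trigger = Area(center=(contact[0] + 30.0, contact[1] + dy), radius=5.0)
        trail = HapticTrail(entry_area=Area(center=contact, radius=5.0),  # E3/4
                            trigger_area=trigger,
                            path=[contact, trigger.center],
                            action=action)
        render(trail)
        secondaries.append(trail)
    return secondaries
```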

The computing device 100 may be operative to control the input device 110 to render the one or more secondary haptic trails based on the selected primary haptic trail. That is, whether the secondary haptic trails are rendered, or not, how many secondary haptic trails are rendered, and/or their spatial arrangement, may be dependent on which of the primary haptic trails is selected by the user by sliding his/her finger 140 or other object 140 along the one of the primary haptic trails.

The computing device 100 may further be operative to control the input device 110 to associate the respective actions with the trigger areas T3 and T4 of the secondary haptic trails based on the selected primary haptic trail. In other words, which actions are associated with the trigger areas T3 and T4 of the secondary haptic trails depends on which primary haptic trail is selected. For instance, as an alternative to the example of music playout described above, if the primary haptic trail relates to heating, the trigger areas T3 and T4 of the secondary haptic trails may be associated with increasing or decreasing a temperature setting, respectively.

The computing device 100 may further be operative to control the input device 110 to render the haptic trails, primary and/or secondary, with a spatial arrangement based on any one, or a combination, of: a position where the object 140 contacts the touch surface, a type of the object 140 (e.g., a finger or a stylus pen), an identity of the object 140 (e.g., a stylus pen may transmit a unique identifier, such as a MAC address or a Bluetooth device address, using a radio transmitter), a current location of the input device 110, an environment of the input device 110 (e.g., indoors or outdoors, based on a current temperature, etc.), and an identity of a user of the input device (i.e., the user operating the object 140, e.g., the user which is logged in to the computing device 100).
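
As an illustration only, such a context-dependent choice of spatial arrangement could be expressed as a simple lookup; all context keys and layout names in the sketch below are invented for the example.

```python
# Sketch of choosing a spatial arrangement from the listed context signals.
# All context keys and layout names are invented for this example.

def choose_arrangement(context: dict) -> str:
    if context.get("object_type") == "stylus":
        return "narrow_fan"        # finer trails suit a stylus tip
    if context.get("environment") == "in_car":
        return "coarse_radial"     # wide trails and large trigger areas
    if context.get("user_id") is not None:
        return "user_profile_" + str(context["user_id"])
    return "default"
```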

The computing device 100 may further be operative to associate the respective actions with the trigger areas of the haptic trails, primary and/or secondary, based on any one, or a combination, of: a position where the object 140 contacts the touch surface, a type of the object 140 (e.g., a finger or a stylus pen), an identity of the object 140 (e.g., a stylus pen may transmit a unique identifier, such as a MAC address or Bluetooth device address, using a radio transmitter), a current location of the input device 110, an environment of the input device 110 (e.g., indoors or outdoors, based on a current temperature, etc.), and an identity of a user of the input device 110 (i.e., the user operating the object 140, e.g., the user which is logged in to the computing device 100). For instance, the computing device 100 may be operative to identify the user operating the input device by recognizing a fingerprint of the finger 140 contacting the touch surface of the input device 110.

The computing device may further be operative to control the input device 110 to cease rendering the one or more of the not selected haptic trails (except, or including, the trigger area), primary and/or secondary, in response to detecting that the object 140 has reached the trigger area of a haptic trail. This means that the action associated with the trigger area of the selected haptic trail is about to be triggered and cannot be reversed. Optionally, the computing device 100 may be operative to control the input device 110 to cease rendering the one or more of the not selected haptic trails in response to detecting that the object 140 has come closer to the trigger area than a threshold distance from the trigger area. Thereby, the input device 110 ceases rendering the not selected haptic trail before the action associated with the trigger area of the selected haptic trail is triggered.
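
A sketch of the optional threshold-distance variant is given below; the threshold value and the stop_rendering callback are assumptions made for the example.

```python
# Sketch of ceasing to render the not selected trails once the contact is
# closer to the selected trigger area than a threshold distance. The
# threshold and the stop_rendering callback are illustrative assumptions.

def maybe_cease_rendering(trails, selected, contact, stop_rendering,
                          threshold=10.0):
    cx, cy = selected.trigger_area.center
    distance = ((contact[0] - cx) ** 2 + (contact[1] - cy) ** 2) ** 0.5
    if distance < threshold:
        for trail in trails:
            if trail is not selected:
                stop_rendering(trail)
```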

The computing device 100 may further be operative to control the input device 110 to render the haptic trails haptically distinguishably. In the present context, “haptically distinguishable” means that the haptic trails are rendered with a haptic contrast relative to each other, i.e., with respective haptic properties which are sufficiently different such that the user touching the haptic trails, e.g., by sliding his/her finger 140 or other object 140 along the haptic trails, can sense a difference between the different haptic trails. In particular, the primary haptic trails or the secondary haptic trails may be rendered haptically distinguishable to facilitate selecting the desired haptic trail by the user.

In the following, embodiments of a method 500 of controlling a haptic touch-based input device 110 are described with reference to Fig. 5. The method 500 is performed by a computing device 100 and comprises controlling 502 the input device 110 to render one or more primary haptic trails 310 on a touch surface of the input device 110. Each haptic trail extends between an entry area and a trigger area. The trigger area is associated with an action which is triggered in response to detecting a contact input to the trigger area. The method 500 further comprises detecting 503 that an object 140 contacts the touch surface at the entry area of at least one of the one or more primary haptic trails and slides along a selected one of the one or more primary haptic trails towards the trigger area of the selected primary haptic trail. The method further comprises triggering 509 the associated action in response to detecting 507 that the object has reached the trigger area of a haptic trail. The haptic trails may be rendered haptically distinguishably. The object 140 may be any one of: a finger and a stylus pen. The haptic touch-based input device 110 may be a haptic touchscreen. The two or more primary haptic trails may be rendered 502 with their respective entry areas at a common position on the touch surface. Optionally, the two or more primary haptic trails are rendered 502 having respective directions of entry into the haptic trails which are non-parallel.

Alternatively, each of the one or more primary haptic trails is rendered 502 with its entry area and its trigger area at a respective common position on the touch surface. Optionally, the trigger area of each one of the one or more primary haptic trails is associated with two actions, and one of the two actions is triggered 509 based on a direction of the object sliding along the selected primary haptic trail.

The one or more primary haptic trails may be rendered 502 in response to detecting 501 that the object 140 contacts the touch surface. Optionally, the one or more primary haptic trails are rendered 502 with their respective entry areas at a position where the object contacts the touch surface.

The method 500 may further comprise controlling 506 the input device to render one or more secondary haptic trails in response to detecting 503 that the object slides along the selected primary haptic trail. Optionally, the one or more secondary haptic trails are rendered 506 with their respective entry areas at a current position where the object 140 contacts the touch surface while sliding along the selected primary haptic trail.

The one or more secondary haptic trails may be rendered 506 based on the selected primary haptic trail.

The respective actions may be associated with the trigger areas of the secondary haptic trails based on the selected primary haptic trail.

The method 500 may further comprise controlling 505 the input device 110 to render one or more of the not selected primary haptic trails with their respective entry areas at a current position where the object 140 contacts the touch surface while sliding along the selected primary haptic trail. The method 500 may alternatively further comprise controlling 505 the input device 110 to render one or more of the not selected primary haptic trails with their respective entry areas at a current position where the object 140 contacts the touch surface in response to receiving 504 an indication that a user operating the input device 110 intends to select a different primary haptic trail. Optionally, the receiving 504 an indication that a user operating the input device 110 intends to select a different primary haptic trail comprises any one, or a combination, of: detecting that the object 140 sliding along the selected primary haptic trail has reversed direction, detecting that the object 140 sliding along the selected primary haptic trail has stopped sliding, and detecting a change in pressure on the touch surface of the object 140 sliding along the selected primary haptic trail. The input device 110 may be operative to measure a change in pressure on the touch surface exerted by the object 140 by means of a pressure or force sensor which is integrated into the touch surface, e.g., a piezoelectric sensor or a MEMS sensor. Alternatively, the computing device 100, and/or the input device 110, may be operative to detect a change in pressure on the touch surface exerted by the finger 140 by detecting a change in the contact area between the finger 140 and the touch surface, as the contact area between the (soft) tip of the finger 140 and the touch surface changes when the force exerted by the finger 140 changes.

The haptic trails may be rendered with a spatial arrangement based on any one, or a combination, of: a position where the object 140 contacts the touch surface, a type of the object 140, an identity of the object 140, a current location of the input device 110, an environment of the input device 110, and an identity of a user of the input device 110.

The respective actions may be associated with the trigger areas of the haptic trails based on any one, or a combination, of: a position where the object 140 contacts the touch surface, a type of the object 140, an identity of the object 140, a current location of the input device 110, an environment of the input device 110, and an identity of a user of the input device 110.

The method 500 may further comprise controlling 508 the input device 110 to cease rendering the one or more of the not selected haptic trails in response to detecting 507 that the object has reached the trigger area of a haptic trail.

It will be appreciated that the method 500 may comprise additional, alternative, or modified, steps in accordance with what is described throughout this disclosure. An embodiment of the method 500 may be implemented as the computer program 213 comprising instructions which, when the computer program 213 is executed by the computing device 100, cause the computing device 100 to carry out the method 500 and become operative in accordance with embodiments of the invention described herein. The computer program 213 may be stored in a computer-readable data carrier, such as the memory 212. Alternatively, the computer program 213 may be carried by a data carrier signal, e.g., downloaded to the memory 212 via the network interface circuitry (not shown in Figs. 1 and 2).

The person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.