Title:
HAPTIC REALIGNMENT CUES FOR TOUCH-INPUT DISPLAYS
Document Type and Number:
WIPO Patent Application WO/2019/046523
Kind Code:
A1
Abstract:
A touch-sensitive display of an electronic device is operated in conjunction with a notification system configured to provide haptic, acoustic, and/or visual output to cue a user to align and/or maintain the user's finger positioning relative to one or more virtual input regions, such as virtual keys of a virtual keyboard, presented on the touch-sensitive display.

Inventors:
WANG PAUL X (US)
BULLOCK IAN M (US)
Application Number:
PCT/US2018/048733
Publication Date:
March 07, 2019
Filing Date:
August 30, 2018
Assignee:
APPLE INC (US)
International Classes:
G06F3/0488; G06F3/01; G06F3/0481
Foreign References:
US20150293592A12015-10-15
US20100156818A12010-06-24
US20120113008A12012-05-10
US20100231550A12010-09-16
US20170097681A12017-04-06
Other References:
None
Attorney, Agent or Firm:
HEMENWAY, S. Craig et al. (US)
Claims:
CLAIMS

What is claimed is:

1. An electronic device comprising:

a touch-sensitive display defining an input surface configured to receive a touch input to a virtual key presented by the touch-sensitive display;

a haptic element coupled to the input surface; and

a controller in communication with the haptic element and the touch-sensitive display and configured to:

determine a distance between the touch input and a central region of the virtual key;

actuate the haptic element to provide a first haptic output based on the determined distance; and

actuate the haptic element to provide a second haptic output upon determining that the touch input overlaps a boundary of the virtual key.

2. The electronic device of claim 1, wherein the haptic element comprises a piezoelectric material coupled to the touch-sensitive display.

3. The electronic device of claim 2, wherein the piezoelectric material is configured to induce a bending moment into the input surface such that, when actuated, the haptic element generates a localized haptic output through the input surface below the virtual key.

4. The electronic device of claim 1, wherein the haptic element is configured to vibrate the input surface at a frequency between 100 Hz and 200 Hz to produce the first haptic output.

5. The electronic device of claim 1, wherein the haptic element is configured to vibrate the input surface at an ultrasonic frequency to produce the second haptic output.

6. The electronic device of claim 1, wherein the haptic element is configured to laterally shift the input surface to produce the first haptic output.

7. The electronic device of claim 1, wherein a frequency or amplitude of the first haptic output is based, at least in part, on the determined distance.

8. The electronic device of claim 1, wherein the first haptic output is localized to the touch input.

9. A method of operating a touch-sensitive display positioned below an input surface, the method comprising:

displaying a virtual keyboard on the touch-sensitive display;

receiving a touch input at least partially overlapping a boundary of a virtual key of the virtual keyboard;

determining a distance between the touch input and a central region of the virtual key; and

providing a haptic output at least partially localized to the touch input, the haptic output based, at least in part, on the determined distance.

10. The method of claim 9, wherein the haptic output is localized to the boundary of the virtual key.

11. The method of claim 9, wherein the haptic output is localized to the virtual key.

12. The method of claim 9, wherein the haptic output comprises decreasing friction between the input surface and an object in contact with the input surface.

13. The method of claim 9, further comprising providing an acoustic output simultaneously with the haptic output.

14. The method of claim 9, further comprising providing a visual output on the touch-sensitive display simultaneously with the haptic output.

15. A method of operating a touch-sensitive display positioned below an input surface, the method comprising:

receiving a touch input to a virtual key of a virtual keyboard presented by the touch-sensitive display;

providing a first haptic output, localized to the virtual key, upon determining that the touch input is located in a central region of the virtual key; and

providing a second haptic output, localized to a boundary of the virtual key, upon determining that the touch input is not located in the central region of the virtual key.

16. The method of claim 15, wherein the first haptic output is different from the second haptic output.

17. The method of claim 16, wherein:

the first haptic output corresponds to a first vibration of the input surface at a first frequency; and

the second haptic output corresponds to a second vibration of the input surface at a second frequency.

18. The method of claim 17, wherein the first frequency is related to a distance between the touch input and the central region of the virtual key.

19. The method of claim 17, wherein an amplitude of the first vibration is related to a distance between the touch input and the central region of the virtual key.

20. The method of claim 15, further comprising:

providing a first acoustic output with the first haptic output; and

providing a second acoustic output with the second haptic output.

Description:
HAPTIC REALIGNMENT CUES FOR TOUCH-INPUT DISPLAYS

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This Patent Cooperation Treaty patent application claims priority to U.S. Non-provisional Patent Application No. 16/041,659, filed July 20, 2018, and titled "Haptic Realignment Cues for Touch-Input Displays," and U.S. Provisional Patent Application No. 62/553,041, filed August 31, 2017, and titled "Haptic Realignment Cues for Touch-Input Displays," the contents of which are incorporated herein by reference in their entirety.

FIELD

[0002] Embodiments described herein relate to graphical user interfaces for electronic devices, and, in particular, to systems and methods for providing haptic realignment cues to a user interacting with virtual keys of a graphical user interface presented by a touch-input display of an electronic device.

BACKGROUND

[0003] An electronic device can include a planar touch-sensitive display for presenting a graphical user interface to interact with a user. The graphical user interface can render multiple virtual input regions, such as buttons or keys, that may be selected (e.g., touched) by the user to input specific information to the electronic device. However, in some cases, a user's finger may unintentionally drift while providing input to a particular virtual key, resulting in incorrect or undetected input to the electronic device.

SUMMARY

[0004] Embodiments described herein generally reference notification systems configured to generate haptic outputs, sounds, and visual effects that provide cues (e.g., alignment cues, realignment cues, centering cues, and so on) to a user to adjust or maintain that user's finger positioning when providing touch and/or force input to a particular virtual key - or other virtual input region - of a virtual keyboard shown on a graphical user interface presented or rendered by a (typically planar) touch-sensitive display of an electronic device.

[0005] In particular, in many embodiments, the notification system includes a haptic output subsystem configured to generate a global, semi-local, or local haptic output by varying one or more output characteristics of one or more global, semi-local, or local haptic output elements (e.g., vibrating elements, vertical or horizontal displacement elements, acoustic elements, electrostatic elements, and so on). The haptic output generated by the haptic output subsystem can be varied based on substantially real-time touch and/or force input to the touch-sensitive display of the electronic device. For example, the haptic output generated by the haptic output subsystem can be varied based on a location of a touch input, a magnitude of a force input, an acceleration of a gesture input, and so on. In these examples, the haptic output generated by the haptic output subsystem is configured to provide a haptic cue to the user to either maintain the user's finger positioning or, in the alternative, to adjust the user's finger positioning with respect to a particular virtual key of the virtual keyboard. In other cases, a touch input can be provided to the input surface by an object, such as a stylus.

[0006] In further embodiments, more than one haptic output can be provided by the haptic output subsystem to cue the user to adjust the user's finger positioning relative to a particular virtual key. For example, the notification system can instruct the haptic output subsystem to generate haptic outputs with properties (e.g., amplitude, frequency, location, and so on) proportional or otherwise related to a distance between the user's finger and a central region of the key. For example, if the user presses a virtual key in the center of that key, a first haptic output can be provided. As the user's finger drifts toward a boundary of the virtual key, eventually overlapping the boundary, a magnitude of the first haptic output can be changed, cuing the user to re-center the drifting finger. Once the user's finger drifts to overlap the boundary of the virtual key, a second haptic output can be provided, cuing the user to re-align the drifting finger.
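As a rough illustration of this distance-based behavior, the following Python sketch scales a first haptic output with drift distance and switches to a distinct second output once the touch overlaps the key boundary. The rectangular key geometry, the linear amplitude scaling, and every name below are illustrative assumptions rather than details drawn from this application:

```python
from dataclasses import dataclass
import math

@dataclass
class VirtualKey:
    cx: float           # x-coordinate of the key's central region
    cy: float           # y-coordinate of the key's central region
    half_width: float   # half the key's width
    half_height: float  # half the key's height

def select_haptic_output(key: VirtualKey, touch_x: float, touch_y: float):
    """Return a (waveform, amplitude) pair for a touch at (touch_x, touch_y).

    The first output's amplitude varies with distance from the central
    region (shown here growing with drift; it could equally diminish); a
    distinct second output fires once the touch overlaps the key boundary,
    cuing the user to re-center the drifting finger.
    """
    dx, dy = touch_x - key.cx, touch_y - key.cy
    distance = math.hypot(dx, dy)
    overlaps_boundary = abs(dx) >= key.half_width or abs(dy) >= key.half_height
    if overlaps_boundary:
        return ("second_output", 1.0)  # e.g., a sharp impulse at the boundary
    max_distance = math.hypot(key.half_width, key.half_height)
    return ("first_output", distance / max_distance)

# A centered press yields amplitude 0; a drifting press approaches 1.
key = VirtualKey(cx=0.0, cy=0.0, half_width=5.0, half_height=5.0)
print(select_haptic_output(key, 0.0, 0.0))   # ('first_output', 0.0)
print(select_haptic_output(key, 4.0, 0.0))   # ('first_output', ~0.57)
print(select_haptic_output(key, 5.5, 0.0))   # ('second_output', 1.0)
```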

[0007] In still further embodiments, the notification system is configured to generate one or more sounds to provide realignment cues to a user. For example, a first sound can be generated if the user presses a virtual key in the center. As the user's finger drifts toward a boundary of the virtual key, the first sound can be changed, cuing the user to re-center the drifting finger. Once the user's finger drifts to overlap the boundary of the virtual key, a second sound can be provided, cuing the user to re-align the drifting finger.

[0008] In still further embodiments, the notification system can trigger one or more visual effects to provide realignment cues to the user. For example, a first visual effect can be generated if the user presses a virtual key in the center. As the user's finger drifts toward a boundary of the virtual key, the first visual effect can be changed, cuing the user to re-center the drifting finger. Once the user's finger drifts to overlap the boundary of the virtual key, a second visual effect can be provided, cuing the user to re-align the drifting finger.

[0009] In still further embodiments, a notification system can be configured to provide multiple substantially simultaneous haptic outputs, acoustic outputs, and visual effects to provide realignment cues to the user.

BRIEF DESCRIPTION OF THE FIGURES

[0010] Reference will now be made to representative embodiments illustrated in the accompanying figures. It should be understood that the following descriptions are not intended to limit the disclosure to a finite set of preferred embodiments. To the contrary, it is intended that the following description covers alternatives, modifications, and equivalents as may be included within the spirit and scope of the described or depicted embodiments and as defined by the appended claims.

[0011] FIG. 1 depicts an electronic device that can incorporate a notification system operated in conjunction with a touch-sensitive display.

[0012] FIG. 2 depicts the electronic device of FIG. 1, specifically depicting the touch-sensitive display presenting a graphical user interface including a virtual keyboard.

[0013] FIG. 3A depicts a virtual key that may be touched by a user and, additionally, a simplified graph depicting an example relationship between the output of a notification system, such as described herein, and the location of the user's finger relative to a boundary of the virtual key.

[0014] FIG. 3B depicts a virtual key and, additionally, a simplified graph depicting an example relationship between the output of a notification system, such as described herein, and the location of the user's finger relative to a center point and a boundary of the virtual key.

[0015] FIG. 4A depicts a virtual key and, additionally, a simplified graph depicting another example relationship between the output of a notification system, such as described herein, and the location of the user's finger relative to a center point and a boundary of the virtual key.

[0016] FIG. 4B depicts a virtual key and, additionally, a simplified graph depicting another example relationship between the output of a notification system, such as described herein, and the location of the user's finger relative to a center point and a boundary of the virtual key.

[0017] FIG. 5 depicts a simplified system diagram of a notification system such as described herein.

[0018] FIG. 6A depicts a simplified view of a haptic output subsystem including a set of haptic elements that may be positioned below an input surface of a touch-sensitive display.

[0019] FIG. 6B depicts a simplified cross-section, taken through section line A-A of FIG. 6A, of an example haptic element of the haptic output subsystem of FIG. 6A, specifically showing an outward deformation of the input surface in response to an actuation of the haptic element.

[0020] FIG. 6C depicts the haptic element of FIG. 6B, specifically showing an inward deformation of the input surface in response to an actuation of the haptic element.

[0021] FIG. 6D depicts the haptic element of FIG. 6B, specifically showing an arbitrary deformation of the input surface in response to an actuation of the haptic element.

[0022] FIG. 6E depicts a simplified cross-section of another example haptic element that can be associated with the haptic output subsystem of FIG. 6A.

[0023] FIG. 6F depicts a simplified cross-section of yet another example haptic element that can be associated with the haptic output subsystem of FIG. 6A.

[0024] FIG. 6G depicts a simplified cross-section of yet another example haptic element that can be associated with the haptic output subsystem of FIG. 6A.

[0025] FIG. 6H depicts a simplified cross-section of yet another example haptic element that can be associated with the haptic output subsystem of FIG. 6A.

[0026] FIG. 7A depicts a virtual keyboard that may be presented by a graphical user interface, particularly showing an example visual realignment cue.

[0027] FIG. 7B depicts a virtual keyboard showing another example visual realignment cue.

[0028] FIG. 7C depicts a virtual keyboard showing another example visual realignment cue.

[0029] FIG. 7D depicts a virtual keyboard showing another example visual realignment cue.

[0030] FIG. 7E depicts a virtual keyboard showing another example visual realignment cue.

[0031] FIG. 7F depicts a virtual keyboard showing another example visual realignment cue.

[0032] FIG. 8 is a flowchart depicting example operations of a method of providing realignment cues to a user of a touch-sensitive display.

[0033] FIG. 9 is a flowchart depicting example operations of another method of providing realignment cues to a user of a touch-sensitive display.

[0034] The use of the same or similar reference numerals in different figures indicates similar, related, or identical items.

[0035] The use of cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and also to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.

[0036] Additionally, it should be understood that the proportions and dimensions (either relative or absolute) of the various features and elements (and collections and groupings thereof) and the boundaries, separations, and positional relationships presented therebetween, are provided in the accompanying figures merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale, and are not intended to indicate any preference or requirement for an illustrated embodiment to the exclusion of embodiments described with reference thereto.

DETAILED DESCRIPTION

[0037] Embodiments described herein generally reference systems and methods for providing finger realignment cues to a user interacting with an exterior surface of a touch-sensitive display.

[0038] More particularly, embodiments described herein relate to systems and methods for operating a notification system in conjunction with a graphical user interface shown on a touch-sensitive display of an electronic device. The notification system is configured to generate one or more haptic, acoustic, and/or visual outputs that provide cues (e.g., key press/make outputs, haptic realignment cues, and so on) to a user operating the keyboard. In many examples, the notification system is configured to provide the haptic, acoustic, and/or visual cues to the user to simulate or emulate a physical key press and, additionally, to encourage a user to adjust or maintain that user's finger positioning when providing input. Haptic, acoustic, and/or visual outputs can be provided separately or together and may vary from embodiment to embodiment.

[0039] For example, in one implementation, an electronic device includes a touch- sensitive display that presents a graphical user interface including a virtual keyboard with virtual keys. The user types on the virtual keyboard by touching an exterior surface of the touch-sensitive display, generally referred to herein as the "input surface." As the user touches the input surface to repeatedly press specific virtual keys, a processor of the notification system can detect drift of the user's fingers relative to a central region of one or more of the virtual keys. It may be appreciated that, as used herein, the phrase "central region" is not intended to convey precise geometrical constraints or limitations; a central region is understood to be an area or point generally located over, nearby, or adjacent to a geometric center or centroid of a virtual key (or other virtual input region).

[0040] Once a drift is detected with respect to a particular virtual key, the notification system is configured to provide one or more haptic, acoustic, and/or visual outputs to cue the user that the drifting finger should be realigned (e.g., alignment cues, realignment cues, centering cues, and so on). For example, a boundary of the virtual key can be vibrated (e.g., at a selected frequency, magnitude, and so on) by a haptic element in response to a detected drift toward that boundary. In other cases, the haptic element can be actuated after a user's finger overlaps the boundary. In another example, the input surface can be laterally shifted in the same direction as a detected drift. In yet another example, a haptic output and an acoustic output normally provided when a virtual key is pressed in the center can be increased in magnitude and volume, respectively, in response to a detected drift toward a boundary of that virtual key, as in the sketch below.
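The sketch below illustrates one way these cues might be selected together; the cue names and the drift-direction argument are invented for illustration and are not drawn from this application:

```python
def realignment_cues(drift_direction, overlaps_boundary):
    """Choose realignment cues of the kinds described above for a drift.

    Returns (subsystem, action) pairs; the string placeholders stand in
    for real haptic, acoustic, and display drivers.
    """
    if overlaps_boundary:
        # The finger already overlaps the key boundary: vibrate the boundary.
        return [("haptic", "vibrate_key_boundary")]
    # The finger is drifting but still on the key: shift the input surface
    # with the drift and exaggerate the normal key-press click and sound.
    return [
        ("haptic", "lateral_shift_" + drift_direction),
        ("haptic", "increase_click_magnitude"),
        ("acoustic", "increase_click_volume"),
    ]

print(realignment_cues("right", overlaps_boundary=False))
```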

[0041] In further examples, other outputs and/or sets of outputs can be provided by the notification system to cue a user to adjust the user's finger positioning.

[0042] These and other embodiments are discussed below with reference to FIGs. 1 - 9. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.

[0043] FIG. 1 depicts an electronic device 100 including a housing 102 and a touch-sensitive display 104. The touch-sensitive display 104 may be operated in conjunction with a notification system, such as described in detail below.

[0044] The housing 102 of the electronic device 100 can form an outer surface and protective case for the internal components of the electronic device 100, including the notification system. In the illustrated embodiment, the housing 102 is formed in a substantially rectangular shape, although this is not required. The housing 102 can be formed of one or more components operably connected together, such as a front piece and a back piece or a top clamshell and a bottom clamshell. Alternatively, the housing 102 can be formed of a single piece (e.g., uniform body). The housing 102 may be planar, or may be partially or entirely curved.

[0045] The touch-sensitive display 104 may include one or more touch sensors and/or force sensors that are configured to detect various combinations of user touch and force input on an exterior surface (e.g., the "input surface") of the touch-sensitive display 104. More specifically, the touch and/or force sensors may be used separately or in combination to interpret a broad range of user inputs such as, but not limited to: touch-based gestures; force-based gestures; touch patterns; tap patterns; single-finger gestures; multi-finger gestures; multi-force gestures; and so on. The touch sensors and/or force sensors may be configured to interpret user input by comparing real-time touch and/or force input to one or more thresholds that may be static or variable, such as, but not limited to: down-stroke force thresholds; upstroke force thresholds; movement thresholds; force magnitude thresholds; location thresholds; and so on. In addition, the touch and/or force sensors of the touch-sensitive display 104 may be configured to detect rates of change in touch input, force input, gesture input, or any combination thereof, that is provided by a user to the input surface.
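As a hedged example of this kind of threshold comparison, the sketch below registers discrete key presses from a stream of force samples using separate down-stroke and upstroke thresholds (simple hysteresis); the specific threshold values are arbitrary assumptions:

```python
def detect_key_press(force_samples, down_threshold=1.5, up_threshold=0.8):
    """Detect discrete key presses from a stream of force readings (newtons).

    A press registers when force rises through down_threshold; detection is
    not re-armed until force falls back through up_threshold, so a single
    sustained press cannot register twice.
    """
    presses = []
    armed = True
    for i, force in enumerate(force_samples):
        if armed and force >= down_threshold:
            presses.append(i)   # index of the down-stroke sample
            armed = False
        elif not armed and force <= up_threshold:
            armed = True        # upstroke completed; re-arm detection
    return presses

# Two distinct presses in a noisy force trace:
trace = [0.1, 0.9, 1.7, 2.0, 1.2, 0.5, 0.2, 1.6, 1.8, 0.3]
print(detect_key_press(trace))  # [2, 7]
```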

[0046] The touch and/or force sensors associated with the touch-sensitive display 104 may be implemented in any number of suitable ways with any suitable technology or combination of technologies including, but not limited to: self-capacitance touch sensing; mutual capacitance touch sensing; resistive touch sensing; optical touch sensing; acoustic touch sensing; capacitive force sensing; strain-based force sensing; optical force sensing; acoustic force sensing; and so on, or any combination thereof. The touch and/or force sensors may be independently or mutually addressable and may be distributed and/or segmented and disposed relative to an active display region and/or a bezel region of the touch-sensitive display 104.

[0047] It may be appreciated that the touch-sensitive display 104 can be implemented with any suitable technology, including, but not limited to, a multi-touch or multi-force sensing touchscreen that uses liquid crystal display technology, light-emitting diode technology, organic light-emitting display technology, organic electroluminescence technology, or another type of display technology.

[0048] As noted above, the electronic device 100 also includes a notification system (described in greater detail with reference to FIGs. 3A - 9) that is operated in conjunction with a graphical user interface presented by the touch-sensitive display 104. The notification system is configured to provide one or more haptic, acoustic, or visual cues to a user interacting with the touch-sensitive display 104.

[0049] For example, FIG. 2 depicts an electronic device 200 that includes a housing 202 and a touch-sensitive display 204 defining an input surface 204a that can receive a touch input from a user. The housing 202 and the touch-sensitive display 204 can be configured or implemented as described above with respect to FIG. 1 , or in any other suitable manner. As with the electronic device 100 depicted in FIG. 1 , the electronic device 200 includes a notification system, such as described herein.

[0050] In the illustrated example, the touch-sensitive display 204 presents a graphical user interface below the input surface 204a including a virtual keyboard 206. The virtual keyboard 206 includes several rows of virtual keys, each of which may be selected by a user by touching or applying a force to the input surface 204a (e.g., which may be a portion of the housing 202 or the touch-sensitive display 204) above the virtual keyboard 206. In the illustrated embodiment, example touch locations of an example user's fingers on the input surface 204a are depicted by dotted ellipses. Fingers of the user's left hand are labeled as the fingers 208a-208e and fingers of the user's right hand are labeled as the fingers 210a-210e.

[0051] While typing on the virtual keyboard (e.g., contacting the input surface 204a with one or more fingers), the user's fingers may unintentionally drift on the input surface 204a away from center points or central regions of the virtual keys, resulting in incorrect or undetected input to the electronic device 200. For example, the user's left-hand third finger 208c is depicted as drifting to the right of the virtual key 212, the user's right-hand index finger 210a is depicted as drifting to the left of the virtual key 214, and the user's right-hand thumb 210e is depicted as drifting downwardly relative to the virtual key 216.

[0052] In the illustrated embodiment, the notification system (described in detail with respect to FIGs. 3A - 6H) of the electronic device 200 and/or the touch-sensitive display 204 can be configured to monitor the user's finger positioning on the input surface 204a relative to one or more of the virtual keys of the virtual keyboard 206 shown on the touch-sensitive display 204.

[0053] For example, the touch-sensitive display 204 (and/or the notification system) can monitor placement of the user's right-hand index finger 210a on the input surface 204a relative to a central region of the virtual key 214 of the virtual keyboard 206. If the distance between sequential placements of the user's right-hand index finger 210a and the central region of the virtual key 214 increases beyond a threshold or at a particular rate, the touch-sensitive display 204 can determine that the user's right-hand index finger 210a has drifted on the input surface 204a. In one example, the distance between the user's touch input and the central region of the virtual key 214 is based on a distance between the centroid of the touch input and the centroid of the virtual key. In other examples, the distance between the user's touch input and the central region of the virtual key is the minimum distance between the centroid of the virtual key and a perimeter of an area corresponding to the user's touch input. These are merely examples; it may be appreciated that any suitable distance measurement or representation of relative separation between a user's touch input and the central region of the virtual key can be used.
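A minimal sketch of the two distance measures described above, together with a sequential-placement drift test, might look as follows; the threshold and rate values are illustrative assumptions, not values taken from this application:

```python
import math

def centroid_distance(touch_centroid, key_centroid):
    """Distance between the touch centroid and the key's centroid."""
    return math.dist(touch_centroid, key_centroid)

def min_perimeter_distance(touch_perimeter, key_centroid):
    """Minimum distance between the key centroid and any point on the
    perimeter of the touch area (the second measure described above)."""
    return min(math.dist(p, key_centroid) for p in touch_perimeter)

def has_drifted(sequential_distances, threshold=3.0, rate=0.5):
    """Flag drift if the latest distance exceeds a threshold, or if the
    distance grows between consecutive placements faster than `rate`."""
    if sequential_distances[-1] > threshold:
        return True
    deltas = [b - a for a, b in zip(sequential_distances, sequential_distances[1:])]
    return bool(deltas) and sum(deltas) / len(deltas) > rate

# Distances of five successive presses of the same key from its centroid:
print(has_drifted([0.4, 0.9, 1.5, 2.2, 2.9]))  # True: steady outward drift
```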

[0054] In response to the touch-sensitive display 204 (and/or the notification system) detecting drift, the notification system can generate one or more haptic outputs through the input surface 204a to cue the user to adjust the positioning of the user's right-hand index finger 210a back toward the central region of the virtual key 214. For example, the notification system can include a haptic output subsystem configured to locally vibrate the virtual key 214 to cue the user to adjust the positioning of the user's right-hand index finger 210a back toward the central region of the virtual key 214. In some examples, the virtual key 214 can be vibrated at a frequency between 100 Hz and 200 Hz. In other cases, the virtual key 214 can be vibrated at a higher frequency, such as an ultrasonic frequency.

[0055] In other cases, the notification system can generate one or more acoustic outputs via a speaker within the electronic device 200 to cue the user to adjust the position of the user's right-hand index finger 210a back toward the central region of the virtual key 214. For example, the notification system can generate different sounds based on the user's finger position relative to the virtual key 214: a sharp click sound when the user's right-hand index finger 210a presses the virtual key 214 when correctly aligned with the central region, and a dull or hollow sound when the user's right-hand index finger 210a presses the virtual key 214 when incorrectly aligned with the central region of the virtual key 214.

[0056] In yet other examples, the notification system can generate one or more visual effects via the touch-sensitive display 204 to cue the user to adjust the position of the user's right-hand index finger 210a back toward the central region of the virtual key 214. For example, the notification system can generate an after-image corresponding to an incorrect placement of the user's right-hand index finger 210a.

[0057] Further example operations that may be performed by the notification system in response to a detected finger drift include, but may not be limited to: vibrating a boundary of the virtual key 214; vibrating a boundary of the virtual key 214 at a different frequency or amplitude than a central region of the virtual key 214; vibrating a region of the input surface 204a between adjacent keys; increasing or decreasing friction between the user's finger and the input surface 204a; increasing or decreasing friction between the user's finger and a boundary of the virtual key 214; increasing or decreasing friction between the user's finger and a region of the input surface 204a between adjacent keys; increasing or decreasing a magnitude of a haptic and/or acoustic click output provided when the user presses the virtual key 214; changing an appearance of the virtual key 214 (e.g., increasing brightness, decreasing brightness, changing color, changing shading, and so on); flashing or highlighting a portion of the graphical user interface presented on the touch-sensitive display 204; increasing or decreasing a temperature of the input surface 204a; and so on, or any combination thereof.

[0058] Further, in some embodiments, outputs provided by the notification system can, without limitation: change over time (e.g., become more pronounced if ignored by the user); vary from user to user; vary from finger to finger; vary based on a location of a particular virtual key or virtual input region; vary based on a user setting; vary based on a system setting; vary based on an application setting; vary based on an orientation or position of the electronic device 200; vary based on a typing history of the user; vary based on detected grammar or spelling error rates; vary based on ambient light and/or sound detected by a sensor of the electronic device 200; vary based on environmental conditions (e.g., ambient sound levels, ambient light levels, ambient temperature, ambient humidity, and so on); and so on, or any combination thereof.

[0059] Accordingly, it is appreciated that a notification system - such as described herein - can be suitably configured to cue a user to realign that user's finger by providing any combination of haptic, acoustic, and/or visual output relative to any virtual key or virtual input region of a graphical user interface presented by a touch-sensitive display of an electronic device, such as the electronic device 200.

[0060] However, as may be appreciated, certain combinations of outputs provided by a notification system may be more effective to cue a user to realign that user's fingers than others. For example, in many cases, haptic outputs can more effectively solicit a user's attention than acoustic or visual outputs. Further, certain haptic outputs (or combinations of haptic outputs) may be more effective to solicit a particular user's attention than others; haptic effects may vary from device to device, user to user, application to application, implementation to implementation, virtual input region to virtual input region, and so on.

[0061] FIGs. 3A - 4B are provided in reference to certain example haptic outputs that may be provided by a haptic output subsystem of a notification system, such as described herein.

[0062] More specifically, a haptic output subsystem of a notification system such as described herein is typically configured to provide haptic output to a user, localized to a virtual input region (e.g., a virtual key) of a graphical user interface shown on a display. As a result of this construction, as a user provides a touch input to the input surface above the virtual input region, the user perceives a localized haptic output, and associates that haptic output with the virtual input region itself, cuing the user to position the user's finger generally in the center region of the virtual input region.

[0063] Accordingly, the phrase "haptic output," as used herein, broadly encompasses an output provided by one or more components of a haptic output subsystem that stimulates a user's sense of touch and/or a user's perception related to the user's sense of touch including, but not necessarily limited to, a sense of surface temperature, a sense of surface topology, a sense of surface friction, a sense of numbness, a sense of mechanical pressure, a sense of mechanical distortion, a sense of motion, a sense of vibration, a sense of stickiness, a sense of slipperiness, a sense of attraction, and so on or any combination thereof. Similarly, the phrase "haptic output subsystem" as used herein broadly encompasses the components, or groups of components, that may be used by or with a notification system to stimulate a user's sense of touch and/or affect a user's perception related to the user's sense of touch.

[0064] A haptic output subsystem, such as may be used in conjunction with the embodiments described in reference to FIGs. 3A - 6H, includes one or more haptic elements. A haptic element can be any component or group of components configured to generate a haptic output that can be felt by a user. For example, a haptic element can be configured to move or vibrate the input surface, affect temperature of the input surface, affect friction between the user and the input surface, and so on. Haptic elements can include, but are not limited to, acoustic transducers (e.g., voice coil, piezoelectric elements or piezoelectric materials, and so on), thermal elements (e.g., resistive heaters, Peltier elements, and so on), and electrostatic plates. In other cases, other haptic elements or haptic element types may be associated with a haptic output subsystem, such as described herein.

[0065] Example output characteristics of a haptic element that can be controlled by the haptic output subsystem can include, but may not be limited to: a frequency, amplitude, duty cycle, envelope, and/or phase of a haptic element configured to move or vibrate an input surface; an absolute temperature, temperature gradient, and/or relative temperature of a haptic element configured to affect temperature of an input surface; an electrostatic field magnitude, frequency, duty cycle, and so on of a haptic element configured to affect friction between the user and an input surface by electrostatic attraction; a frequency, amplitude, duty cycle, envelope, and/or phase of a haptic element configured to ultrasonically move or vibrate an input surface to affect friction between the user and the input surface; and so on. In still further embodiments, other output characteristics may be controlled or influenced by a haptic output subsystem such as described herein.

[0066] In many embodiments, including those referenced below with respect to FIGs. 3A - 4B, a haptic output subsystem can simultaneously actuate different haptic elements with different output characteristics such that an aggregate output of the haptic output subsystem is a unique haptic effect that is perceivably different from the individual haptic outputs of the actuated haptic elements. In such examples, a single haptic element can be associated with a single virtual input region or, in other embodiments, multiple haptic elements can be associated with (e.g., positioned below or relative to) a single virtual input region.

[0067] For example, multiple vibrating haptic elements (see, e.g., FIGs. 6A - 6H) can be actuated simultaneously with different output characteristics (e.g., frequencies, phases, amplitudes, and so on) to produce vibrations that constructively and/or destructively interfere while propagating through the input surface. Such examples are discussed below with reference to FIGs. 3A - 4B.

[0068] In one implementation, haptic output from one haptic element can be configured to produce vibrations that cancel vibrations produced by another haptic element in order to define a non-vibrating region of the input surface. In this implementation, the haptic effect generated by the haptic output subsystem can be characterized by the location(s) and boundary(s) of vibrating and non-vibrating regions of the input surface that result, in aggregate, from the haptic outputs of the individually-actuated haptic elements. Such examples are discussed below with reference to FIGs. 5 - 6H.
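A simplified sketch of this cancellation idea follows, assuming a one-dimensional propagation model, equal amplitudes, and a constant wave speed; all names and parameter values are arbitrary assumptions:

```python
import math

def antiphase_offset(freq_hz, path_a, path_b, wave_speed):
    """Phase offset (radians) for a second vibrating element so that its
    wave arrives at a target point in anti-phase with the first element's,
    cancelling there and defining a non-vibrating region of the surface.
    Path lengths are measured from each element to the target point.
    """
    delay = (path_b - path_a) / wave_speed  # extra travel time of element B
    return math.pi + 2 * math.pi * freq_hz * delay

def displacement_at_target(t, freq_hz, amp, path_a, path_b, wave_speed):
    """Superposed displacement at the target point at time t; it is ~0
    whenever element B is driven with the anti-phase offset."""
    w = 2 * math.pi * freq_hz
    ta, tb = path_a / wave_speed, path_b / wave_speed
    phase_b = antiphase_offset(freq_hz, path_a, path_b, wave_speed)
    return amp * math.sin(w * (t - ta)) + amp * math.sin(w * (t - tb) + phase_b)

# The two waves cancel at the target point for any time t:
print(round(displacement_at_target(t=0.0042, freq_hz=150, amp=1.0,
                                   path_a=0.02, path_b=0.05,
                                   wave_speed=40.0), 9))  # ~0.0
```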

[0069] In another implementation, a set of vibrating haptic elements - positioned below or relative to a single virtual input region or positioned below or relative to multiple virtual input regions - can each be actuated with specific output characteristics (e.g., frequencies, phases, amplitudes, and so on) that produce vibrations corresponding to frequency components of a finite series that represents (or approximates) the shape of an arbitrary function (e.g., impulse function, square wave, triangle wave, saw tooth wave, or any other suitable periodic function). The various vibrations produced by the haptic elements constructively and/or destructively interfere while propagating through the input surface, causing the input surface to locally deform or displace to take the general shape of the periodic function. In this implementation, the haptic effect generated by the haptic output subsystem can be characterized by the location(s), boundary(s), and contour(s) of the deformations or displacements of the input surface that result, in aggregate, from the vibrations produced by the haptic outputs of the actuated haptic elements.

[0070] In one specific example, an impulse function can be approximated by a sum of sinusoidal waveforms (see, e.g., FIG. 3A). In this example, a group of haptic elements of a haptic output subsystem associated with an input surface can be actuated with specific output characteristics (e.g., specific frequencies, phases, and amplitudes), each corresponding to a particular selected sinusoidal waveform. In this manner, the vibrations in the input surface produced by each actuated haptic element correspond to at least one respective frequency component of a finite series that approximates the shape of an impulse function. In other words, the vibrations cause the input surface to deform and/or displace to take the shape of an impulse function. In this example, the contours of the impulse function waveform (e.g., edges, peaks, and so on) may be felt by a user touching the input surface while the haptic elements are actuated by the haptic output subsystem. In this implementation, the haptic effect generated by the haptic output subsystem can be characterized by the location(s), boundary(s), and contour(s) of the sharp and/or defined edges characterized by the impulse waveform-shaped deformation of the input surface.
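To make the finite-series idea concrete, the sketch below derives per-element drive parameters from the truncated Fourier series of a square wave, one possible choice of periodic function; the element count and fundamental frequency are assumptions for illustration:

```python
import math

def square_wave_components(n_elements, fundamental_hz):
    """Drive parameters for a bank of vibrating elements whose summed
    output approximates a unit square wave: odd harmonics k of the
    fundamental with amplitudes 4/(pi*k), i.e., the first n_elements
    terms of sum over odd k of (4/(pi*k)) * sin(2*pi*k*f0*t).
    """
    params = []
    for i in range(n_elements):
        k = 2 * i + 1  # odd harmonic index: 1, 3, 5, ...
        params.append({
            "frequency_hz": k * fundamental_hz,
            "amplitude": 4.0 / (math.pi * k),
            "phase": 0.0,
        })
    return params

def summed_displacement(t, params):
    """Aggregate surface displacement from all actuated elements at time t."""
    return sum(p["amplitude"] *
               math.sin(2 * math.pi * p["frequency_hz"] * t + p["phase"])
               for p in params)

# With more elements, the summed vibration's edges sharpen toward a square
# profile, which a fingertip perceives as a more defined edge:
bank = square_wave_components(n_elements=5, fundamental_hz=120)
print(round(summed_displacement(1 / (4 * 120), bank), 3))
# ~1.06: the plateau of the unit square wave, with Gibbs ripple
```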

[0071] In further implementations, additional haptic elements can be actuated to refine or supplement particular haptic effect(s) generated by the haptic output subsystem. For example, an electrostatic plate can generate an electric field that attracts the user's finger to a particular region of the input surface. An adjacent electrostatic plate may generate an electric field of opposite polarity. In this example, when the user draws a finger from the first electrostatic plate to the second electrostatic plate, a change in friction may be perceived by the user due to the inversion of the electric field magnitude. In this implementation, the haptic effect generated by the haptic output subsystem can be characterized by the location(s), boundary(s), and magnitude(s) of the electrostatic fields generated by the haptic output subsystem of the input surface.

[0072] In yet another example, a haptic output subsystem such as described herein (e.g., in reference to FIGs. 3A - 6H) can supplement a haptic effect generated by vibrating haptic elements with a haptic effect generated by one or more electrostatic plates. For example, a haptic effect of an impulse function, perceived by a user as a bump extending from an input surface, can be supplemented by a haptic effect of a change in friction across the bump extending from the input surface.

[0073] In another example, a thermal element (e.g., a Peltier element, heat pump, resistive heater, inductive heater, and so on) can increase or decrease a temperature of a particular region of the input surface. For example, the thermal element can locally increase a temperature of a region of the input surface. An adjacent thermal element can locally decrease the temperature of a second region of the input surface. In this example, when the user draws a finger from the first region to the second region, a change in temperature may be perceived by the user. In this implementation, the haptic effect generated by the haptic output subsystem can be characterized by the location(s), boundary(s), and magnitude(s) of the various temperature gradients generated by the haptic output subsystem.

[0074] In still further embodiments, a haptic output subsystem can be operated and/or configured to produce a series or set of haptic effects that are collectively configured to simulate the presence of a physical component (e.g., button, switch, key, rocker, slider, dial, and so on) on the input surface. In many cases, the simulated physical component corresponds to a virtual input region below the input surface. In particular, the haptic output subsystem in these examples associates a set of particular haptic effects with a set of particular combinations of user touch and/or force input. In these examples, each haptic effect generated by the haptic output subsystem simulates a particular response or characteristic exhibited by the simulated physical component when the user interacts with that component in a specific way. In this manner, by simulating multiple responses or characteristics of the physical component in response to particular user input, the haptic output subsystem can cause the user to perceive that the physical component is present on the input surface.
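One way to picture this association is as a lookup from classes of touch and force input to sets of haptic effects, as in the hypothetical sketch below; every interaction class, force band, and effect name here is invented for illustration:

```python
# Hypothetical mapping from (interaction, force band) to haptic effect sets
# that together simulate a physical key on the input surface.
simulated_key_effects = {
    ("touch", "light"):  ["edge_sharpness", "surface_texture"],
    ("press", "medium"): ["travel_discontinuity", "actuation_click"],
    ("press", "hard"):   ["bottom_out_thud"],
    ("slide", "any"):    ["friction_gradient", "edge_sharpness"],
}

def effects_for(interaction, force_band):
    """Return the haptic effects simulating the physical key's response
    to this combination of touch and force input."""
    return (simulated_key_effects.get((interaction, force_band))
            or simulated_key_effects.get((interaction, "any"), []))

print(effects_for("press", "medium"))  # ['travel_discontinuity', 'actuation_click']
```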

[0075] Accordingly, generally and broadly, it is understood that a haptic output subsystem such as described herein is configured to produce haptic effects that simulate and/or approximate the various static or dynamic mechanical, physical, or textural attributes or properties that may be exhibited by a physical component when a user interacts with that component by touching, feeling, rotating, tapping, pushing, pulling, pressing, releasing, or otherwise physically interacting with the component in a specific or particular manner. These haptic effects cue a user to align the user's finger with a central region of an associated virtual input region, such as a virtual key of a virtual keyboard.

[0076] For example, when a user places a finger on a physical key of a keyboard, the user can interact with the key in a number of ways (e.g., by touching, pressing, feeling, tapping, and so on), each of which may be associated with a different set of haptic effects or outputs that inform the user's perception of properties of the key including size, location, shape, height, texture, material, rigidity, flexibility, and so on. Such interactions might include drawing a finger across a surface or edge of the key by touching or feeling, pressing the key with different magnitudes of force, holding the key in a specific position, sliding a finger across a surface of the key, wiggling the key, and so on. Each of these different interactions can be associated with different haptic effects or outputs that inform the user's perception of that particular key in a unique way. These haptic outputs can include, but may not be limited to, a sharpness or roundness of an edge of the key, a texture of a surface of the key, a concavity or convexity of the key, a rattle or wobble of the key, a presence or absence of surface features, a stiffness or elasticity of the key, a smoothness or discontinuousness of travel when the key is pressed, a magnitude of force required to actuate the key, and so on.

[0077] For simplicity of description and illustration, FIGs. 3A - 4B are provided herein to illustrate specific simplified example outputs of a haptic output subsystem such as described herein. However, it is appreciated that these examples - in addition to the examples referenced above - are not exhaustive; additional haptic effects that can be provided in response to additional user input may be provided in further embodiments to simulate different physical input components, non-input objects or features (e.g., surface characteristics, textures, and so on), or for any other suitable purpose.

[0078] In particular, the embodiments shown in FIGs. 3A - 4B each depict a virtual key that may be touched by a user. The virtual key, as with other embodiments described herein, can be presented by a graphical user interface shown on a touch-sensitive display that is operated in conjunction with a notification system, such as described herein. The notification system includes a haptic output subsystem configured to generate localized haptic effects - such as those described above - through an input surface above the touch-sensitive display that may be touched by a user.

[0079] In particular, FIG. 3A depicts a virtual key 300. The haptic output subsystem in this example (not shown) is configured to produce a haptic output, such as a vibration or an impulse, in response to detecting that a user's finger has drifted away from a central point x₀ of the virtual key 300. In particular, a graph 302 is provided showing that when the user's finger drifts along the input surface 304 above the virtual key 300 (e.g., from the central point x₀, across the boundary 306 that defines a perimeter of the virtual key 300) into an adjacent region 308 (e.g., point x₁), a haptic effect 310 is generated by the haptic output subsystem to cue the user to readjust the user's finger positioning. The haptic effect 310 is illustrated as an impulse (e.g., a sharp, or high-frequency vibration) localized to the adjacent region 308, but this may not be required.

[0080] In another non-limiting phrasing, the haptic output subsystem is configured to generate a localized haptic effect, adjacent to the virtual key, when the user's finger drifts in that direction. In this manner, the haptic output subsystem provides a cue to the user to readjust the user's finger positioning back toward the central point x₀.

[0081] It may be appreciated that the haptic effect 310 can be any suitable haptic output or combination of haptic outputs including, but not limited to: a high-frequency vibration; a low-frequency vibration; a high-friction effect; a low-friction effect; a high-temperature effect; a low-temperature effect; a click effect; a lateral shift or mechanical displacement of the input surface (in any planar direction generally parallel to the input surface); a vertical mechanical displacement of the input surface (in any direction generally perpendicular to the input surface); and so on. As with other embodiments described herein, the magnitude (and/or other characteristics) of the haptic effect 310 can be based on the real-time touch or force input, user history, device settings, and so on.

[0082] In other embodiments and implementations, a haptic output subsystem can be configured to provide different haptic outputs. For example, FIG. 3B depicts a virtual key 300. The haptic output subsystem in this example (also not shown) is configured to produce two different haptic outputs. One of the haptic outputs is localized to the surface of the virtual key 300, and a second haptic output is localized to a region adjacent to the virtual key 300.

[0083] More particularly, as with FIG. 3A, the graph 302 shown in FIG. 3B shows that when the user's finger drifts along the input surface 304 above the virtual key 300 from the central point x₀ toward the boundary 306, a haptic effect 312 is provided. The haptic effect 312 in this example has a magnitude that changes continuously as a function of the distance between the user's finger and the central point x₀. More particularly, the closer the user's finger is to the boundary 306 of the virtual key 300, the lower the magnitude of the haptic effect 312. Once the user crosses the boundary, the haptic effect 310 can be provided as described in reference to FIG. 3A.

[0084] In another non-limiting phrasing, the haptic output subsystem in this example is configured to generate a localized haptic effect when the user presses the virtual key correctly, diminishing that effect as the user's finger drifts toward an edge of the virtual key. Once the user's finger has crossed the perimeter or boundary of the virtual key so as to at least partially overlap the perimeter or boundary, a different haptic output is provided.

[0085] The haptic effect 312 is depicted as a continuous function of distance from the central point x₀, but this is not required. For example, FIG. 4A depicts a virtual key 400. The haptic output subsystem in this example (also not shown) is configured to produce two different haptic outputs. One of the haptic outputs is localized to the surface of the virtual key 400, and a second haptic output is localized to a region adjacent to the virtual key 400.

[0086] More particularly, as with FIGs. 3A - 3B, the graph 402 shown in FIG. 4A shows that when the user's finger drifts along the input surface 404 above the virtual key 400, from the central point x₀ toward the boundary 406 (and, eventually, into a region adjacent to the virtual key 400, identified as the adjacent region 408), a haptic effect 412 is provided. The haptic effect 412 in this example has a magnitude that changes discretely as a function of the distance between the user's finger and the central point x₀. In another phrasing, the magnitude (or another output characteristic, such as frequency) of the haptic effect 412 differs based on a subarea of the input surface 404 selected by the user. Once the user crosses the boundary, the haptic effect 410 can be provided as described in reference to FIGs. 3A - 3B.

[0087] In still other examples, a haptic effect provided by a haptic output subsystem need not decrease as a function of a user's touch location away from a central point or central region of a virtual key. For example, FIG. 4B shows an alternate implementation of the embodiment depicted and described in reference to FIG. 4A. In this example, a haptic effect 414 increases in magnitude (discretely, although continuous embodiments are possible as well) as the user's finger drifts away from the central region of the virtual key 400.
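The sketch below contrasts the continuous profile of FIG. 3B with the discrete, banded profiles of FIGs. 4A and 4B; the peak value, band count, and step levels are illustrative assumptions:

```python
def continuous_magnitude(distance, half_width, peak=1.0):
    """FIG. 3B-style profile: strongest at the central point x0, falling
    off continuously as the touch approaches the key boundary."""
    d = min(abs(distance), half_width)
    return peak * (1.0 - d / half_width)

def discrete_magnitude(distance, half_width, steps=(1.0, 0.6, 0.3)):
    """FIG. 4A-style profile: the magnitude steps down across discrete
    subareas between the central point and the boundary. Reversing
    `steps` gives the FIG. 4B variant, which grows as the finger drifts."""
    d = min(abs(distance), half_width)
    band = min(int(d / half_width * len(steps)), len(steps) - 1)
    return steps[band]

for d in (0.0, 2.0, 4.5):
    print(round(continuous_magnitude(d, half_width=5.0), 2),
          discrete_magnitude(d, half_width=5.0))
# 1.0 1.0 / 0.6 0.6 / 0.1 0.3
```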

[0088] The foregoing description of the embodiments depicted in FIGs. 3A - 4B, and various alternatives thereof and variations thereto are presented, generally, for purposes of explanation, and to facilitate a thorough understanding of the detailed embodiments presented herein. However, it will be apparent to one skilled in the art that some of the specific details presented herein may not be required in order to practice a particular described embodiment, or an equivalent thereof.

[0089] Thus, it is understood that the foregoing and following descriptions of specific embodiments of a haptic output subsystem of a notification system are presented for the limited purposes of illustration and description. These descriptions are not intended to be exhaustive or to limit the disclosure to the precise forms recited herein. To the contrary, it will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.

[0090] For example, it may be appreciated that a haptic output subsystem, such as described above, may be implemented in a number of suitable ways. In addition, a haptic element of a haptic output subsystem can be implemented in a number of suitable ways. FIGs. 5 - 6H are provided to illustrate example configurations of a haptic output subsystem and a haptic element that may be used with a haptic output subsystem.

[0091] FIG. 5 depicts a system diagram of an example haptic output subsystem 500. The haptic output subsystem 500 includes a controller 502 (such as a processor) that is coupled to a haptic actuator 504 (or haptic element), such as an electrostatic plate, a vibrating haptic element, or a thermal element. The controller 502 is also coupled to an input sensor 506 configured to detect a magnitude, location, and/or direction of force input and touch input to an input surface (e.g., detected by one or more of a force sensor or a touch sensor disposed relative to the input surface). In some embodiments, the controller 502 is also in communication with other systems or subsystems of a notification system, such as an acoustic controller or a graphic controller 508.

[0092] The controller 502 can be implemented with any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the controller 502 can be a microprocessor, a central processing unit, an application-specific integrated circuit, a field-programmable gate array, a digital signal processor, an analog circuit, a digital circuit, or combination of such devices. The processor may be a single-thread or multi-thread processor. The processor may be a single-core or multi-core processor. Accordingly, as described herein, the phrase "processing unit" or, more generally, "processor" refers to a hardware-implemented data processing device or circuit physically structured to execute specific transformations of data including data operations represented as code and/or instructions included in a program that can be stored within and accessed from a memory. The term is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, analog or digital circuits, or other suitably configured computing elements or combination of elements.

[0093] The controller 502, in many embodiments, can include or can be communicably coupled to circuitry and/or logic components, such as a dedicated processor and a memory. The circuitry of the controller 502 can perform, coordinate, and/or monitor one or more of the functions or operations associated with the haptic output subsystem 500 including, but not limited to: increasing the temperature of an area of an input surface; decreasing the temperature of an area of an input surface; decreasing the temperature surrounding an area of an input surface; increasing the temperature surrounding an area of an input surface; detecting, approximating, and/or measuring the temperature of an area of an input surface; increasing the friction exhibited by an area of an input surface; decreasing the friction exhibited by an area of the input surface; increasing the friction exhibited surrounding an area of an input surface; decreasing the friction exhibited surrounding an area of an input surface; increasing a vibration emanating from a local area of an input surface; decreasing a vibration output from one or more haptic actuators; generating a vibration that constructively interferes with a vibration propagating through an area of an input surface; generating a vibration that destructively interferes with a vibration propagating through an area of an input surface; measuring, estimating and/or determining a frequency, amplitude and/or phase of a vibration propagating through an area of an input surface; and so on or any combination thereof. In some examples, the controller 502 may use time multiplexing techniques to obtain measurements from and to apply signals to each independent element of each portion of a haptic output subsystem 500.
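As an illustration of such time multiplexing, the sketch below cycles a single measurement-and-drive path across several haptic elements, one per time slot; the element stubs and slot structure are assumptions, not details from this application:

```python
import itertools

def run_multiplexed_cycle(elements, n_slots):
    """Illustrative time-multiplexed service loop: in each slot the
    controller reads one element's sensor and applies that element's
    pending drive signal, cycling through all elements in turn so one
    measurement/drive path can serve many independent haptic elements."""
    schedule = itertools.cycle(elements)
    log = []
    for _ in range(n_slots):
        element = next(schedule)
        measurement = element["read"]()      # sample this element
        element["drive"](element["signal"])  # then apply its drive signal
        log.append((element["name"], measurement))
    return log

# Two stub elements standing in for real sensor/actuator drivers:
elements = [
    {"name": "E1", "signal": 0.5, "read": lambda: 0.01, "drive": lambda s: None},
    {"name": "E2", "signal": 0.8, "read": lambda: 0.02, "drive": lambda s: None},
]
print(run_multiplexed_cycle(elements, n_slots=4))
# [('E1', 0.01), ('E2', 0.02), ('E1', 0.01), ('E2', 0.02)]
```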

[0094] In further embodiments, the haptic output subsystem 500 can be operated and/or configured to produce a series or set of haptic effects (together with, or independent of acoustic or visual effects generated or controlled by the acoustic controller or graphic controller 508) that are collectively configured to simulate the presence of a physical component, a physical boundary, a physical texture, and so on at one or more locations or regions of the input surface.

[0095] In particular, as noted above, the haptic output subsystem can simulate an arbitrary physical component by associating a set of particular haptic effects with a set of particular combinations of user touch and/or force input. In these examples, each haptic effect generated by the haptic output subsystem simulates a particular response or characteristic exhibited by the physical component when the user interacts with that component in a specific way. In this manner, by simulating multiple responses or characteristics of the physical component in response to particular user input, the haptic output subsystem can provide finger realignment cues to a user.
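As one illustration of associating effects with input combinations, the sketch below maps hypothetical gesture classifications to hypothetical effect types. The gesture names, the effect cases, and the specific pairings are assumptions made for illustration, not drawn from the source.

```swift
/// Hypothetical classification of a user interaction with the input surface.
enum InputGesture: Hashable {
    case lightTouch   // touch below a force threshold
    case press        // touch above a force threshold
    case slide        // touch moving across the surface
}

/// Hypothetical haptic effects a subsystem such as 500 might produce.
enum HapticEffect {
    case click(intensity: Double)
    case texture(spatialPeriodMM: Double)  // simulated ridges
    case edgeStop                          // simulated hard boundary
}

/// One way to associate particular effects with particular input
/// combinations, so that each response simulates how a physical key
/// would react to that interaction.
let keyEdgeSimulation: [InputGesture: HapticEffect] = [
    .lightTouch: .texture(spatialPeriodMM: 0.5), // faint texture at rest
    .press:      .click(intensity: 0.8),         // key-press confirmation
    .slide:      .edgeStop,                      // resistance at the border
]

func effect(for gesture: InputGesture) -> HapticEffect? {
    keyEdgeSimulation[gesture]
}
```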

[0096] For example, FIG. 6A depicts a simplified view of a haptic output subsystem 600, including a number of individually-addressable haptic elements, that can be associated with an input surface 602. As noted above, the haptic output subsystem can be appropriately configured to simulate a physical component, boundary, texture, or object by generating haptic effects with one or more haptic elements coupled to the input surface 602. In the illustrated embodiment, a set of haptic elements are illustrated in phantom to indicate that the haptic elements are positioned below the input surface. One of the haptic elements is identified as the individually-addressable haptic element 602a.

[0097] The individually-addressable haptic element 602a can be implemented in a number of ways. For example, the individually-addressable haptic element 602a can include one or more of the following, without limitation: lateral displacement elements; vertical displacement elements; low-frequency vibration elements; high-frequency vibration elements; ultrasonic vibration elements; electrostatic elements; thermal elements; and so on. For simplicity of description and illustration, the embodiments that follow reference implementations of the individually-addressable haptic element 602a that are configured to displace or vibrate the input surface 602. However, it may be appreciated that other example embodiments can include other haptic elements, such as those referenced above.

[0098] More specifically, FIGs. 6B-6H depict various example embodiments of the individually-addressable haptic element 602a, viewed through the section line A-A of FIG. 6A.

[0099] In some embodiments, such as shown in FIGs. 6B - 6E, the individually-addressable haptic element 602a includes a haptic element 604 that is coupled directly or indirectly to an interior surface of the input surface 602. The haptic element 604 is configured to compress (FIG. 6B) and/or expand (FIG. 6C) in response to an actuation. For the embodiments depicted in FIGs. 6B - 6D, the haptic element 604 compresses or expands along an axis generally parallel to the input surface 602. In this manner, the haptic element 604 induces a localized bending moment into the input surface 602 when actuated. The bending moment causes the input surface 602 to deform vertically, in a positive direction (see, e.g., FIG. 6B) or a negative direction (see, e.g., FIG. 6C) relative to the input surface 602. In other cases, the bending moment causes the input surface 602 to vibrate, such as shown in FIG. 6D. In other cases, such as shown in FIG. 6E, the haptic element 604 compresses or expands along an axis generally perpendicular to the input surface 602.

[0100] The haptic element 604 can be a piezoelectric element, an acoustic transducer, an electrically deformable material (e.g., nitinol), an electromagnet and an attractor plate (see, e.g., FIG. 6G), or any other suitable element. In other cases, the haptic element 604 may be an eccentrically weighted motor, a linear actuator, or any other suitable mechanical element.

[0101] In some embodiments, the input surface 602 can be supported by one or more stiffeners or backing plates. The stiffeners/backing plates can locally increase a rigidity of the input surface 602 such that an output generated by the haptic element 604 is localized to a greater extent. In FIGs. 6B - 6H, a stiffener 606 (depicted in cross section, with left-hand and right-hand portions of the stiffener 606 identified as the stiffener sections 606a, 606b respectively) is coupled to the input surface 602. The stiffener 606 in one example can be a ring that circumscribes the haptic element 604. In another example, the stiffener 606 includes two portions, a first portion 606a and a second portion 606b. The first and second portions of the stiffener can be coupled together or may be discrete, independent parts.

[0102] As with other embodiments described herein, the haptic output subsystem 600 (as depicted in FIG. 6B) also includes a controller 608. The controller 608 can be a processor and/or electrical circuit appropriately configured to apply a signal to the haptic element 604 to cause the haptic element 604 to generate haptic output. The controller 608 can be configured to vary one or more characteristics of the signal (e.g., voltage, current, frequency, amplitude, phase, envelope, duty cycle, and so on) in order to vary the haptic output characteristics of the haptic element 604.

[0103] In many cases, the controller 608 is in communication with one or more components of the input surface, such as a force/touch input sensor 610. The controller 608 can receive data or information from these components and may alter characteristics of signals provided by the controller 608 to the haptic element 604 based on the received data or information. For example, the controller 608 can vary characteristics of signals provided by the controller 608 to the haptic element 604 based on a magnitude of force detected by the force/touch input sensor 610.
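To make the signal variation described in the two paragraphs above concrete, here is a minimal sketch in which a parameterized drive signal's amplitude is scaled by the force reported by the force/touch input sensor 610. The DriveSignal type, the linear force-to-amplitude policy, and the numeric ranges are illustrative assumptions; the source does not prescribe any particular mapping.

```swift
/// Signal characteristics a controller such as 608 can vary to change
/// the output of a haptic element such as 604 (names are illustrative).
struct DriveSignal {
    var amplitude: Double   // 0.0 ... 1.0, normalized drive level
    var frequency: Double   // Hz
    var phase: Double       // radians
    var dutyCycle: Double   // 0.0 ... 1.0
}

/// One plausible policy: scale the output amplitude with the measured
/// force, clamped to the normalized drive range.
func adjustedSignal(base: DriveSignal,
                    measuredForce: Double,
                    maximumForce: Double) -> DriveSignal {
    var signal = base
    let normalized = min(max(measuredForce / maximumForce, 0), 1)
    signal.amplitude = base.amplitude * normalized
    return signal
}

// A firmer press yields a proportionally stronger haptic output.
let base = DriveSignal(amplitude: 1.0, frequency: 150, phase: 0, dutyCycle: 0.5)
let light = adjustedSignal(base: base, measuredForce: 0.5, maximumForce: 4.0)
let firm  = adjustedSignal(base: base, measuredForce: 3.5, maximumForce: 4.0)
```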

[0104] Other embodiments are configured in other ways. For example, as shown in FIG. 6F, more than one haptic element can be stacked or layered together. In particular, the haptic element 604 is backed by a differently-sized haptic element, labeled as the haptic element 612. In turn, the haptic element 612 is backed by a differently-sized haptic element, labeled as the haptic element 614. In this embodiment, the haptic output subsystem 600 can be configured to generate haptic output from one or all of the haptic elements 604, 612, and 614.

[0105] In still further examples, a haptic element can include multiple parts that attract or repel one another to generate a bending moment in the input surface 602. More particularly, FIG. 6G includes a haptic element implemented with an electromagnet 616 and an attractor plate 618, each coupled to the input surface and separated from one another by a distance. The attractor plate 618 can be a ferromagnetic material or, in other embodiments, can be a permanent magnet that can be attracted and/or repelled by the electromagnet 616. In these embodiments, actuation of the electromagnet 616 causes a magnetic field to attract or repel the attractor plate 618, thereby inducing a bending moment in the input surface 602. In this example, the electromagnet 616 and the attractor plate 618 attract or repel along an axis generally parallel to the input surface 602, but this may not be required. For example, as shown in FIG. 6H, the attractor plate 618 and the electromagnet 616 may be configured to attract or repel along an axis generally perpendicular to the input surface 602.

[0106] The foregoing description of the embodiments depicted in FIGs. 5 - 6H, and various alternatives thereof and variations thereto, is presented, generally, for purposes of explanation, and to facilitate a thorough understanding of the detailed embodiments presented herein. However, it will be apparent to one skilled in the art that some of the specific details presented herein may not be required in order to practice a particular described embodiment, or an equivalent thereof.

[0107] Thus, it is understood that the foregoing and following descriptions of specific embodiments of a haptic element of a haptic output subsystem of a notification system, such as described herein, are presented for the limited purposes of illustration and description. These descriptions are not intended to be exhaustive or to limit the disclosure to the precise forms recited herein. To the contrary, it will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.

[0108] For example, as noted above, a notification system such as described herein can also provide one or more visual realignment cues to a user to encourage realignment of one or more of the user's fingers relative to a virtual input region. FIGs. 7A - 7F depict example visual realignment cues that can be provided to a user in place of, or in conjunction with, one or more of the haptic realignment cues described with reference to other embodiments herein.

[0109] For example, FIG. 7A depicts a virtual keyboard 700a (that may be presented by a graphical user interface) including multiple rows of virtual keys. In this example, a visual realignment cue 702 is provided by modifying a position of a virtual key that can be selected by a user. FIG. 7B depicts a virtual keyboard 700b which provides a visual realignment cue 704 in which a virtual key's size and positioning are modified. FIG. 7C depicts a virtual keyboard 700c which provides a visual realignment cue 706 in which a virtual key's shape and positioning are modified. FIG. 7D depicts a virtual keyboard 700d which provides a visual realignment animation 708 in which a virtual key's position is highlighted by a perimeter animation. FIG. 7E depicts a virtual keyboard 700e which provides a visual realignment cue 710 in which a virtual key's coloring or patterning is modified. FIG. 7F depicts a virtual keyboard 700f which provides visual realignment cues 712, 714, and 716 in which a user's past finger positions are highlighted with fading after-images. The embodiments depicted in FIGs. 7A - 7F are merely examples; it is appreciated that any number of visual effects can be provided in conjunction with, or in place of, the audio cues and/or haptic cues described herein. Further, it is appreciated that any number of suitable visual cues apart from the specific examples provided above are possible including, but not limited to: virtual key animations; virtual key shape, size, color, design, font, position, or rotation changes; after-image animations; and so on.
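For illustration only, the cue variants of FIGs. 7A - 7F could be modeled as a small catalog from which a graphic controller such as 508 selects. The enum below and its associated values are hypothetical and not part of the source.

```swift
/// Hypothetical catalog of the visual realignment cues of FIGs. 7A - 7F.
enum VisualRealignmentCue {
    case repositionKey(dx: Double, dy: Double)   // FIG. 7A
    case resizeAndReposition(scale: Double)      // FIG. 7B
    case reshapeAndReposition                    // FIG. 7C
    case perimeterAnimation(duration: Double)    // FIG. 7D
    case recolorOrRepattern                      // FIG. 7E
    case fadingAfterImages(count: Int)           // FIG. 7F
}
```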

[0110] FIGs. 8 - 9 are simplified flow chart diagrams corresponding to example operations of methods of providing realignment cues with a notification system, such as described herein.

[0111] FIG. 8 depicts operations of a method of providing haptic feedback to cue a user to readjust the user's finger positioning relative to a virtual input region of a graphical user interface. In particular, the method 800 includes operation 802 in which a touch location is provided by a user. At operation 804, one or more haptic output characteristics associated with the touch location can be obtained (e.g., determined, accessed from a memory, and so on). As noted with respect to other embodiments described herein, the haptic output characteristics can be based on a distance between the touch location and a central location of one or more virtual input regions of the graphical user interface. Once the haptic output characteristics are determined at operation 804, one or more haptic outputs can be provided at operation 806. These haptic outputs can be global haptic outputs, local haptic outputs, semi-local haptic outputs, or any other suitable haptic output.
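A minimal sketch of operations 802 - 806 follows, assuming a Euclidean distance to the key's central region and a linear distance-to-amplitude mapping; the type and function names, and the particular mapping, are illustrative assumptions rather than anything the source specifies.

```swift
struct Point { var x: Double; var y: Double }

/// Distance from a touch location (operation 802) to the central region
/// of a virtual key.
func distance(from touch: Point, toKeyCenter center: Point) -> Double {
    ((touch.x - center.x) * (touch.x - center.x)
     + (touch.y - center.y) * (touch.y - center.y)).squareRoot()
}

/// Operation 804, sketched: derive an output characteristic from the
/// distance. Here the amplitude grows linearly with distance up to the
/// key's half-width, one of many mappings the description permits.
func hapticAmplitude(forDistance d: Double, keyHalfWidth: Double) -> Double {
    min(d / keyHalfWidth, 1.0)
}

// Operations 802 - 806: a touch 3 mm from the center of an 8 mm-wide key
// yields a mid-strength realignment output (amplitude 0.75).
let touch = Point(x: 13, y: 5)
let center = Point(x: 10, y: 5)
let amplitude = hapticAmplitude(
    forDistance: distance(from: touch, toKeyCenter: center),
    keyHalfWidth: 4.0)
```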

[0112] FIG. 9 depicts operations of another method of providing haptic feedback to cue a user to readjust the user's finger positioning relative to a virtual input region of a graphical user interface. In particular, the method 900 includes operation 902 in which a distance between a touch location provided by a user and a central region of a virtual input region is determined. At operation 904, one or more output characteristics associated with the touch location can be obtained (e.g., determined, accessed from a memory, and so on). In this embodiment, the output characteristics correspond to a haptic output or haptic effect that provides a centering cue to the user to re-center the user's finger relative to the central region of the virtual input region. Once the haptic output characteristics are determined at operation 904, one or more haptic, acoustic, and/or visual outputs can be provided at operation 906.
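Because method 900 is directed at re-centering the user's finger, a cue may need to encode direction as well as magnitude. The sketch below, under the same hypothetical naming as the previous examples, derives a unit vector from the touch toward the central region and a strength that grows with the offset; this encoding is an assumption, as the source does not specify one.

```swift
/// A centering cue: which way to move and how strongly to signal it.
struct CenteringCue {
    var direction: (dx: Double, dy: Double)  // unit vector toward center
    var strength: Double                     // 0 ... 1, grows with offset
}

/// Sketch of operations 902 - 906: determine the offset from the touch
/// to the central region and derive cue parameters from it.
func centeringCue(touch: (x: Double, y: Double),
                  center: (x: Double, y: Double),
                  maxOffset: Double) -> CenteringCue? {
    let dx = center.x - touch.x
    let dy = center.y - touch.y
    let offset = (dx * dx + dy * dy).squareRoot()
    guard offset > 0 else { return nil }  // already centered: no cue needed
    return CenteringCue(direction: (dx: dx / offset, dy: dy / offset),
                        strength: min(offset / maxOffset, 1.0))
}
```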

[0113] The present disclosure recognizes that personal information data, including private inter-person communications, in the present technology, can be used to the benefit of users. For example, haptic simulation on a surface of an electronic device can be used to provide a more immersive computing experience.

[0114] The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information or communication data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure, including the use of data encryption and security methods that meet or exceed industry or government standards. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.

[0115] Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data, including private inter-person communications. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data.

[0116] In addition, one may appreciate that although many embodiments are disclosed above, the operations and steps presented with respect to methods and techniques described herein are meant as exemplary and accordingly are not exhaustive. One may further appreciate that an alternate step order, or fewer or additional steps, may be required or desired for particular embodiments.

[0117] Although the disclosure above is described in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects, and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the invention, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments but is instead defined by the claims herein presented.