


Title:
PRESSURE SENSING IN THREE-DIMENSIONAL (3D) SPACE
Document Type and Number:
WIPO Patent Application WO/2023/211506
Kind Code:
A1
Abstract:
An apparatus is provided for use in a wearable sensor device for pressure sensing in three-dimensional (3D) space. The apparatus includes a memory storing instructions and at least one processor in communication with the memory. The at least one processor is configured, upon execution of the instructions, to perform the steps including obtaining a pressure measurement of user-induced pressure. The pressure measurement is obtained relative to a cursor position of a cursor displayed on a graphical user interface (GUI). The steps further include selecting a first functional layer icon of a first functional layer of a plurality of functional layers when the pressure measurement exceeds a first pressure threshold. The first functional layer is associated with the first functional layer icon displayed by the GUI. The steps further include performing a function corresponding to the first functional layer.

Inventors:
LIN ZONGFANG (US)
CHOW FREDERICK CHI TAK (US)
ZHANG YAN (US)
Application Number:
PCT/US2022/071923
Publication Date:
November 02, 2023
Filing Date:
April 26, 2022
Assignee:
FUTUREWEI TECHNOLOGIES INC (US)
International Classes:
G06F3/04817; G06F3/0346; G06F3/0488
Domestic Patent References:
WO2019229698A12019-12-05
Foreign References:
US11287886B12022-03-29
Attorney, Agent or Firm:
SCHEER, Bradley W. et al. (US)
Claims:
CLAIMS

What is claimed is:

1. A computer-implemented method for pressure sensing in three-dimensional (3D) space, the method comprising:
obtaining a pressure measurement of user-induced pressure, the pressure measurement being related to a cursor position of a cursor displayed on a graphical user interface (GUI);
selecting a first functional layer icon of a first functional layer of a plurality of functional layers when the pressure measurement exceeds a first pressure threshold, the first functional layer being associated with the first functional layer icon displayed by the GUI; and
performing a function corresponding to the first functional layer.

2. The computer-implemented method of claim 1, the user-induced pressure being generated between a second user finger and a wearable sensor device worn on a first user finger, the wearable sensor device comprising a pressure sensor.

3. The computer-implemented method of claim 1, the user-induced pressure being generated between a surface and a wearable sensor device worn on a first user finger, the wearable sensor device comprising a pressure sensor.

4. The computer-implemented method of any of claims 1-3, the obtaining the pressure measurement comprising receiving the pressure measurement.

5. The computer-implemented method of claim 1, the obtaining the pressure measurement comprising: receiving a pressure signal from a wearable sensor device worn on a first user finger, the wearable sensor device comprising a pressure sensor; and generating the pressure measurement from the pressure signal.

6. The computer-implemented method of any of claims 1-5, further comprising: selecting the first functional layer icon when the pressure measurement exceeds the first pressure threshold for a predefined selection time period.

7. The computer-implemented method of any of claims 1-5, further comprising: selecting the first functional layer icon based on the pressure measurement exceeding the first pressure threshold and the cursor position being within a predefined distance from the first functional layer icon.

8. The computer-implemented method of any of claims 1-5, further comprising: selecting the first functional layer icon based on the pressure measurement exceeding the first pressure threshold for a predefined time period and the cursor position being within a predefined distance from the first functional layer icon.

9. The computer-implemented method of claim 1, further comprising a preliminary step of: causing a display of the first functional layer icon on the GUI and within a predefined distance from the cursor position.

10. The computer-implemented method of claim 9, further comprising: detecting a motion of a wearable sensor device worn on a first user finger, the motion causing a movement of the cursor on the GUI.

11. The computer-implemented method of claim 10, wherein the pressure measurement remains greater than the first pressure threshold during the motion.

12. The computer-implemented method of claim 1, further comprising:
causing a display of a second functional layer icon of a second functional layer of the plurality of functional layers in response to the selecting the first functional layer icon, the second functional layer being a sub-layer of the first functional layer;
detecting a motion of a wearable sensor device to a second position, the motion causing a movement of the cursor on the GUI, with the pressure measurement remaining greater than the first pressure threshold during the motion, the second position being within a predefined distance from the second functional layer icon;
obtaining a second pressure measurement, the second pressure measurement obtained at the second position; and
selecting the second functional layer icon when the second pressure measurement exceeds a second pressure threshold.

13. The computer-implemented method of claim 12, wherein the wearable sensor device is worn on at least one finger of a user’s hand.

14. The computer-implemented method of claim 1, wherein the cursor comprises a first virtual object representation in a virtual environment, the method further comprising: detecting a virtual contact between the first virtual object representation and a second virtual object representation in the virtual environment.

15. The computer-implemented method of claim 14, further comprising: determining a virtual pressure measurement of virtual pressure between the first virtual object representation and the second virtual object representation in the virtual environment; and generating a feedback signal using a wearable sensor device, the feedback signal being proportional to the virtual pressure measurement.

16. The computer-implemented method of claim 15, wherein generating the feedback signal further comprises at least one of: generating pressure feedback at a wearable sensor device; generating a visual signal at the wearable sensor device; or generating an audio signal at the wearable sensor device.

17. The computer-implemented method of claim 1, the obtaining the pressure measurement further comprising: obtaining a pressure sequence of user-induced pressures, the pressure sequence being related to a second cursor position of the cursor displayed on the GUI; selecting a second functional layer icon associated with a second functional layer of the plurality of functional layers when the pressure sequence matches a pre-configured pressure sequence; and performing a function corresponding to the second functional layer.

18. An apparatus for use with a wearable sensor device for pressure sensing in three-dimensional (3D) space, the apparatus comprising:
a memory storing instructions; and
at least one processor in communication with the memory, the at least one processor configured, upon execution of the instructions, to perform steps comprising:
obtaining a pressure measurement of user-induced pressure, the pressure measurement being related to a cursor position of a cursor displayed on a graphical user interface (GUI);
selecting a first functional layer icon of a first functional layer of a plurality of functional layers when the pressure measurement exceeds a first pressure threshold, the first functional layer being associated with the first functional layer icon displayed by the GUI; and
performing a function corresponding to the first functional layer.

19. The apparatus of claim 18, further comprising a pressure sensor, the wearable sensor device worn on a first user finger, and the user-induced pressure being generated between a second user finger and the wearable sensor device.

20. The apparatus of claim 18, further comprising a pressure sensor, the wearable sensor device worn on a first user finger, and the user-induced pressure being generated between a surface and the wearable sensor device.

21. The apparatus of any of claims 18-20, the obtaining the pressure measurement comprising receiving the pressure measurement.

22. The apparatus of claim 18, further comprising a pressure sensor, the wearable sensor device worn on a first user finger, and wherein the steps for obtaining the pressure measurement further comprise: receiving a pressure signal from the wearable sensor device; and generating the pressure measurement from the pressure signal.

23. The apparatus of any of claims 18-22, the steps further comprising: selecting the first functional layer icon when the pressure measurement exceeds the first pressure threshold for a predefined selection time period.

24. The apparatus of any of claims 18-20, the steps further comprising: selecting the first functional layer icon based on the pressure measurement exceeding the first pressure threshold and the cursor position being within a predefined distance from the first functional layer icon.

25. The apparatus of any of claims 18-24, the steps further comprising: selecting the first functional layer icon based on the pressure measurement exceeding the first pressure threshold for a predefined time period and the cursor position being within a predefined distance from the first functional layer icon.

26. The apparatus of claim 18, the steps further comprising a preliminary step of: causing a display of the first functional layer icon on the GUI and within a predefined distance from the cursor position.

27. The apparatus of claim 26, wherein the wearable sensor device is worn on a first user finger, and the steps further comprising: detecting a motion of the wearable sensor device, the motion causing a movement of the cursor on the GUI.

28. The apparatus of claim 27, wherein the pressure measurement remains greater than the first pressure threshold during the motion.

29. The apparatus of claim 18, the steps further comprising:
causing a display of a second functional layer icon of a second functional layer of the plurality of functional layers in response to the selecting the first functional layer icon, the second functional layer being a sub-layer of the first functional layer;
detecting a motion of the wearable sensor device to a second position, the motion causing a movement of the cursor on the GUI, with the pressure measurement remaining greater than the first pressure threshold during the motion, the second position being within a predefined distance from the second functional layer icon;
obtaining a second pressure measurement, the second pressure measurement obtained at the second position; and
selecting the second functional layer icon when the second pressure measurement exceeds a second pressure threshold.

30. The apparatus of claim 29, wherein the wearable sensor device is worn on at least one finger of a user’s hand.

31. The apparatus of claim 18, wherein the cursor comprises a first virtual object representation in a virtual environment, and the steps further comprising: detecting a virtual contact between the first virtual object representation and a second virtual object representation in the virtual environment.

32. The apparatus of claim 31, the steps further comprising: determining a virtual pressure measurement of virtual pressure between the first virtual object representation and the second virtual object representation in the virtual environment; and generating a feedback signal using the wearable sensor device, the feedback signal being proportional to the virtual pressure measurement.

33. The apparatus of claim 32, wherein generating the feedback signal further comprises at least one of: generating pressure feedback at a wearable sensor device; generating a visual signal at the wearable sensor device; or generating an audio signal at the wearable sensor device.

34. The apparatus of claim 18, the obtaining the pressure measurement further comprising: obtaining a pressure sequence of user-induced pressures, the pressure sequence being related to a second cursor position of the cursor displayed on the GUI; selecting a second functional layer icon associated with a second functional layer of the plurality of functional layers when the pressure sequence matches a pre-configured pressure sequence; and performing a function corresponding to the second functional layer.

35. A non-transitory computer-readable medium storing computer instructions for performing pressure sensing in three-dimensional (3D) space, that configure at least one processor, upon execution of the instructions, to perform steps comprising:
obtaining a pressure measurement of user-induced pressure, the pressure measurement being related to a cursor position of a cursor displayed on a graphical user interface (GUI);
selecting a first functional layer icon of a first functional layer of a plurality of functional layers when the pressure measurement exceeds a first pressure threshold, the first functional layer being associated with the first functional layer icon displayed by the GUI; and
performing a function corresponding to the first functional layer.

36. The non-transitory computer-readable medium of claim 35, the user-induced pressure being generated between a second user finger and a wearable sensor device worn on a first user finger, the wearable sensor device comprising a pressure sensor.

37. The non-transitory computer-readable medium of claim 35, the user-induced pressure being generated between a surface and a wearable sensor device worn on a first user finger, the wearable sensor device comprising a pressure sensor.

38. The non-transitory computer-readable medium of any of claims 35-37, the obtaining the pressure measurement comprising receiving the pressure measurement.

39. The non-transitory computer-readable medium of claim 35, the obtaining the pressure measurement comprising: receiving a pressure signal from a wearable sensor device worn on a first user finger, the wearable sensor device comprising a pressure sensor; and generating the pressure measurement from the pressure signal.

40. The non-transitory computer-readable medium of any of claims 35-39, the steps further comprising: selecting the first functional layer icon when the pressure measurement exceeds the first pressure threshold for a predefined selection time period.

41. The non-transitory computer-readable medium of claim 35, the steps further comprising: selecting the first functional layer icon based on the pressure measurement exceeding the first pressure threshold and the cursor position being within a predefined distance from the first functional layer icon.

42. The non-transitory computer-readable medium of any of claims 35-41, the steps further comprising: selecting the first functional layer icon based on the pressure measurement exceeding the first pressure threshold for a predefined time period and the cursor position being within a predefined distance from the first functional layer icon.

43. The non-transitory computer-readable medium of claim 35, the steps further comprising a preliminary step of: causing a display of the first functional layer icon on the GUI and within a predefined distance from the cursor position.

44. The non-transitory computer-readable medium of claim 43, the steps further comprising: detecting a motion of a wearable sensor device worn on a first user finger, the motion causing a movement of the cursor on the GUI.

45. The non-transitory computer-readable medium of claim 44, wherein the pressure measurement remains greater than the first pressure threshold during the motion.

46. The non-transitory computer-readable medium of claim 35, the steps further comprising:
causing a display of a second functional layer icon of a second functional layer of the plurality of functional layers in response to the selecting the first functional layer icon, the second functional layer being a sub-layer of the first functional layer;
detecting a motion of a wearable sensor device to a second position, the motion causing a movement of the cursor on the GUI, with the pressure measurement remaining greater than the first pressure threshold during the motion, the second position being within a predefined distance from the second functional layer icon;
obtaining a second pressure measurement, the second pressure measurement obtained at the second position; and
selecting the second functional layer icon when the second pressure measurement exceeds a second pressure threshold.

47. The non-transitory computer-readable medium of claim 46, wherein the wearable sensor device is worn on at least one finger of a user’s hand.

48. The non-transitory computer-readable medium of claim 35, wherein the cursor comprises a first virtual object representation in a virtual environment, the steps further comprising: detecting a virtual contact between the first virtual object representation and a second virtual object representation in the virtual environment.

49. The non-transitory computer-readable medium of claim 48, the steps further comprising: determining a virtual pressure measurement of virtual pressure between the first virtual object representation and the second virtual object representation in the virtual environment; and generating a feedback signal using a wearable sensor device, the feedback signal being proportional to the virtual pressure measurement.

50. The non-transitory computer-readable medium of claim 49, wherein generating the feedback signal further comprises at least one of: generating pressure feedback at a wearable sensor device; generating a visual signal at the wearable sensor device; or generating an audio signal at the wearable sensor device.

51. The non-transitory computer-readable medium of claim 35, the obtaining the pressure measurement further comprising: obtaining a pressure sequence of user-induced pressures, the pressure sequence being related to a second cursor position of the cursor displayed on the GUI; selecting a second functional layer icon associated with a second functional layer of the plurality of functional layers when the pressure sequence matches a pre-configured pressure sequence; and performing a function corresponding to the second functional layer.

52. An apparatus for pressure sensing in three-dimensional (3D) space, the apparatus comprising:
means for obtaining a pressure measurement of user-induced pressure, the pressure measurement being related to a cursor position of a cursor displayed on a graphical user interface (GUI);
means for selecting a first functional layer icon of a first functional layer of a plurality of functional layers when the pressure measurement exceeds a first pressure threshold, the first functional layer being associated with the first functional layer icon displayed by the GUI; and
means for performing a function corresponding to the first functional layer.

Description:
PRESSURE SENSING IN THREE-DIMENSIONAL (3D) SPACE

TECHNICAL FIELD

[0001] The present disclosure is generally related to a wearable device for in-air virtual interactions, and more specifically, to an in-air virtual press wearable device.

BACKGROUND

[0002] Augmented reality (AR), virtual reality (VR), and mixed reality (MR) environments have been gaining popularity and wide use in many areas of daily life including gaming, teaching, shopping, content browsing, in-vehicle interactions, entertainment, manufacturing production, etc. However, movement accuracy associated with in-air virtual interactions in AR, VR, and MR environments can be challenging to achieve.

SUMMARY

[0003] Various examples are now described to introduce a selection of concepts in a simplified form, which are further described below in the detailed description. The Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

[0004] According to a first aspect of the present disclosure, there is provided a computer-implemented method for pressure sensing in three-dimensional (3D) space. The method includes obtaining a pressure measurement of user-induced pressure. The pressure measurement is obtained relative to a cursor position of a cursor displayed on a graphical user interface (GUI). A first functional layer icon of a first functional layer of a plurality of functional layers is selected when the pressure measurement exceeds a first pressure threshold. The first functional layer is associated with the first functional layer icon displayed by the GUI. A function corresponding to the first functional layer is performed.

[0005] In a first implementation form of the method according to the first aspect as such, the user-induced pressure is generated between a second user finger and a wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor.

[0006] In a second implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the user-induced pressure is generated between a surface and a wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor.

[0007] In a third implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the obtaining the pressure measurement includes receiving the pressure measurement.

[0008] In a fourth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the obtaining the pressure measurement includes receiving a pressure signal from a wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor. The pressure measurement is generated from the pressure signal.

[0009] In a fifth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the first functional layer icon is selected when the pressure measurement exceeds the first pressure threshold for a predefined selection period.

[0010] In a sixth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the first functional layer icon is selected based on the pressure measurement exceeding the first pressure threshold and the cursor position being within a predefined distance from the first functional layer icon.

[0011] In a seventh implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the first functional layer icon is selected based on the pressure measurement exceeding the first pressure threshold for a predefined period, and the cursor position is within a predefined distance from the first functional layer icon.

[0012] In an eighth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, a preliminary step includes causing a display of the first functional layer icon on the GUI and within a predefined distance from the cursor position.

[0013] In a ninth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, a motion of a wearable sensor device worn on a first user finger is detected. The motion causes a movement of the cursor on the GUI.

[0014] In a tenth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the pressure measurement remains greater than the first pressure threshold during the motion.

[0015] In an eleventh implementation form of the method according to the first aspect as such or any implementation form of the first aspect, a second functional layer icon of a second functional layer of the plurality of functional layers is displayed in response to the selecting the first functional layer icon. The second functional layer is a sub-layer of the first functional layer. A motion of a wearable sensor device to a second position is detected. The motion causes a movement of the cursor on the GUI, with the pressure measurement remaining greater than the first pressure threshold during the motion. The second position is within a predefined distance from the second functional layer icon. A second pressure measurement is obtained. The second pressure measurement is obtained at the second position. The second functional layer icon is selected when the second pressure measurement exceeds a second pressure threshold.

[0016] In a twelfth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the wearable sensor device is worn on at least one finger of a user’s hand.

[0017] In a thirteenth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the cursor includes a first virtual object representation in a virtual environment. The method further includes detecting a virtual contact between the first virtual object representation and a second virtual object representation in the virtual environment.

[0018] In a fourteenth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, a virtual pressure measurement of virtual pressure between the first virtual object representation and the second virtual object representation in the virtual environment is determined. A feedback signal is generated using a wearable sensor device. The feedback signal is proportional to the virtual pressure measurement.

[0019] In a fifteenth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, generating the feedback signal further includes at least one of generating pressure feedback at a wearable sensor device, generating a visual signal at the wearable sensor device, or generating an audio signal at the wearable sensor device.

[0020] In a sixteenth implementation form of the method according to the first aspect as such or any implementation form of the first aspect, the obtaining the pressure measurement further includes obtaining a pressure sequence of user-induced pressures, the pressure sequence being related to a second cursor position of the cursor displayed on the GUI. A second functional layer icon associated with a second functional layer of the plurality of functional layers is selected when the pressure sequence matches a pre-configured pressure sequence. A function corresponding to the second functional layer is performed.

[0021] According to a second aspect of the present disclosure, there is provided an apparatus for use with a wearable sensor device for pressure sensing in three-dimensional (3D) space. The apparatus includes a memory storing instructions and at least one processor in communication with the memory. The at least one processor is configured, upon execution of the instructions, to perform the following steps (or operations). The operations include obtaining a pressure measurement of user-induced pressure. The pressure measurement is obtained relative to a cursor position of a cursor displayed on a graphical user interface (GUI). A first functional layer icon of a first functional layer of a plurality of functional layers is selected when the pressure measurement exceeds a first pressure threshold. The first functional layer is associated with the first functional layer icon displayed by the GUI. A function corresponding to the first functional layer is performed.

[0022] In a first implementation form of the apparatus according to the second aspect as such, the user-induced pressure is generated between a second user finger and the wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor.

[0023] In a second implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the user-induced pressure is generated between a surface and the wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor.

[0024] In a third implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the obtaining the pressure measurement includes receiving the pressure measurement.

[0025] In a fourth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the obtaining the pressure measurement includes receiving a pressure signal from the wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor. The pressure measurement is generated from the pressure signal.

[0026] In a fifth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the first functional layer icon is selected when the pressure measurement exceeds the first pressure threshold for a predefined selection period.

[0027] In a sixth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the first functional layer icon is selected based on the pressure measurement exceeding the first pressure threshold and the cursor position being within a predefined distance from the first functional layer icon.

[0028] In a seventh implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the first functional layer icon is selected based on the pressure measurement exceeding the first pressure threshold for a predefined period, and the cursor position is within a predefined distance from the first functional layer icon.

[0029] In an eighth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, a preliminary step includes causing a display of the first functional layer icon on the GUI and within a predefined distance from the cursor position.

[0030] In a ninth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, a motion of the wearable sensor device worn on a first user finger is detected. The motion causes a movement of the cursor on the GUI.

[0031] In a tenth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the pressure measurement remains greater than the first pressure threshold during the motion.

[0032] In an eleventh implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, a second functional layer icon of a second functional layer of the plurality of functional layers is displayed in response to the selecting the first functional layer icon. The second functional layer is a sub-layer of the first functional layer. A motion of the wearable sensor device to a second position is detected. The motion causes a movement of the cursor on the GUI, with the pressure measurement remaining greater than the first pressure threshold during the motion. The second position is within a predefined distance from the second functional layer icon. A second pressure measurement is obtained. The second pressure measurement is obtained at the second position. The second functional layer icon is selected when the second pressure measurement exceeds a second pressure threshold.

[0033] In a twelfth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the wearable sensor device is worn on at least one finger of a user’s hand.

[0034] In a thirteenth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the cursor includes a first virtual object representation in a virtual environment. The operations further include detecting a virtual contact between the first virtual object representation and a second virtual object representation in the virtual environment.

[0035] In a fourteenth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, a virtual pressure measurement of virtual pressure between the first virtual object representation and the second virtual object representation in the virtual environment is determined. A feedback signal is generated using the wearable sensor device. The feedback signal is proportional to the virtual pressure measurement.

[0036] In a fifteenth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, generating the feedback signal further includes at least one of generating pressure feedback at a wearable sensor device, generating a visual signal at the wearable sensor device, or generating an audio signal at the wearable sensor device.

[0037] In a sixteenth implementation form of the apparatus according to the second aspect as such or any implementation form of the second aspect, the obtaining the pressure measurement further includes obtaining a pressure sequence of user-induced pressures, the pressure sequence being related to a second cursor position of the cursor displayed on the GUI. A second functional layer icon associated with a second functional layer of the plurality of functional layers is selected when the pressure sequence matches a pre-configured pressure sequence. A function corresponding to the second functional layer is performed.

[0038] According to a third aspect of the present disclosure, there is provided a non-transitory computer-readable medium storing computer instructions for performing pressure sensing in three-dimensional (3D) space. The instructions configure at least one processor to perform the following steps (or operations). A pressure measurement of user-induced pressure is obtained. The pressure measurement is obtained relative to a cursor position of a cursor displayed on a graphical user interface (GUI). A first functional layer icon of a first functional layer of a plurality of functional layers is selected when the pressure measurement exceeds a first pressure threshold. The first functional layer is associated with the first functional layer icon displayed by the GUI. A function corresponding to the first functional layer is performed.

[0039] In a first implementation form of the non-transitory computer-readable medium according to the third aspect as such, the user-induced pressure is generated between a second user finger and a wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor.

[0040] In a second implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the user-induced pressure is generated between a surface and a wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor.

[0041] In a third implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the obtaining the pressure measurement includes receiving the pressure measurement.

[0042] In a fourth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the obtaining the pressure measurement includes receiving a pressure signal from a wearable sensor device worn on a first user finger. The wearable sensor device includes a pressure sensor. The pressure measurement is generated from the pressure signal.

[0043] In a fifth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the first functional layer icon is selected when the pressure measurement exceeds the first pressure threshold for a predefined selection period.

[0044] In a sixth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the first functional layer icon is selected based on the pressure measurement exceeding the first pressure threshold and the cursor position being within a predefined distance from the first functional layer icon.

[0045] In a seventh implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the first functional layer icon is selected based on the pressure measurement exceeding the first pressure threshold for a predefined period, and the cursor position is within a predefined distance from the first functional layer icon.

[0046] In an eighth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, a preliminary step includes causing a display of the first functional layer icon on the GUI and within a predefined distance from the cursor position.

[0047] In a ninth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, a motion of a wearable sensor device worn on a first user finger is detected. The motion causes a movement of the cursor on the GUI.

[0048] In a tenth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the pressure measurement remains greater than the first pressure threshold during the motion.

[0049] In an eleventh implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, a second functional layer icon of a second functional layer of the plurality of functional layers is displayed in response to the selecting the first functional layer icon. The second functional layer is a sub-layer of the first functional layer. A motion of a wearable sensor device to a second position is detected. The motion causes a movement of the cursor on the GUI, with the pressure measurement remaining greater than the first pressure threshold during the motion. The second position is within a predefined distance from the second functional layer icon. A second pressure measurement is obtained. The second pressure measurement is obtained at the second position. The second functional layer icon is selected when the second pressure measurement exceeds a second pressure threshold.

[0050] In a twelfth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the wearable sensor device is worn on at least one finger of a user’s hand.

[0051] In a thirteenth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the cursor includes a first virtual object representation in a virtual environment. The operations further include detecting a virtual contact between the first virtual object representation and a second virtual object representation in the virtual environment.

[0052] In a fourteenth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, a virtual pressure measurement of virtual pressure between the first virtual object representation and the second virtual object representation in the virtual environment is determined. A feedback signal is generated using a wearable sensor device. The feedback signal is proportional to the virtual pressure measurement.

[0053] In a fifteenth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, generating the feedback signal further includes at least one of generating pressure feedback at a wearable sensor device, generating a visual signal at the wearable sensor device, or generating an audio signal at the wearable sensor device.

[0054] In a sixteenth implementation form of the non-transitory computer-readable medium according to the third aspect as such or any implementation form of the third aspect, the obtaining the pressure measurement further includes obtaining a pressure sequence of user-induced pressures, the pressure sequence being related to a second cursor position of the cursor displayed on the GUI. A second functional layer icon associated with a second functional layer of the plurality of functional layers is selected when the pressure sequence matches a pre-configured pressure sequence. A function corresponding to the second functional layer is performed.

[0055] According to a fourth aspect of the present disclosure, there is provided an apparatus for pressure sensing in three-dimensional (3D) space. The apparatus includes means for obtaining a pressure measurement of user-induced pressure. The pressure measurement is obtained relative to a cursor position of a cursor displayed on a graphical user interface (GUI). The apparatus further includes means for selecting a first functional layer icon of a first functional layer of a plurality of functional layers when the pressure measurement exceeds a first pressure threshold. The first functional layer is associated with the first functional layer icon displayed by the GUI. The apparatus further includes means for performing a function corresponding to the first functional layer.

[0056] Any one of the foregoing examples may be combined with any one or more of the other foregoing examples to create a new embodiment within the scope of the present disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

[0057] In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.

[0058] FIG. 1 is a block diagram illustrating a computing device in communication with a wearable sensor device (WSD), according to example embodiments.

[0059] FIG. 2 is a diagram illustrating the WSD of FIG. 1 worn on a user’s hand, according to example embodiments.

[0060] FIG. 3 is a diagram illustrating the WSD of FIG. 1 worn on a user’s hand and used in connection with an in-air virtual press, according to example embodiments.

[0061] FIG. 4 is a diagram of a graphical user interface (GUI) illustrating a first functional layer icon associated with a first functional layer of a plurality of functional layers and a cursor corresponding to a WSD worn on a user’s hand, according to example embodiments.

[0062] FIG. 5 is a diagram of a GUI illustrating multiple second functional layer icons associated with a second functional layer of the plurality of functional layers and a cursor corresponding to the WSD of FIG. 4, according to example embodiments.

[0063] FIG. 6 is a diagram of the GUI of FIG. 5 illustrating the movement of the cursor corresponding to the movement of the WSD of FIG. 4 for selecting one of the multiple second functional layer icons, according to example embodiments.

[0064] FIG. 7 is a diagram of a GUI illustrating the movement of the cursor corresponding to the movement of the WSD of FIG. 4 for selecting one of multiple third functional layer icons associated with a third functional layer of the plurality of functional layers, according to example embodiments.

[0065] FIG. 8 is a diagram of the GUI of FIG. 7 illustrating the selection of one of the multiple third functional layer icons, according to example embodiments.

[0066] FIG. 9 is a flowchart of a method for pressure sensing in three-dimensional (3D) space, according to example embodiments.

[0067] FIG. 10 is a block diagram illustrating a representative software architecture, which may be used in conjunction with various device hardware described herein, according to example embodiments.

[0068] FIG. 11 is a block diagram illustrating circuitry for a device that implements algorithms and performs methods, according to example embodiments.

DETAILED DESCRIPTION

[0069] It should be understood at the outset that although an illustrative implementation of one or more embodiments is provided below, the disclosed systems and methods described with respect to FIGS. 1-11 may be implemented using any number of techniques, whether currently known or not yet in existence. The disclosure should in no way be limited to the illustrative implementations, drawings, and techniques discussed below, including the exemplary designs and implementations illustrated and described herein, but may be modified within the scope of the appended claims along with their full scope of equivalents.

[0070] In the following description, reference is made to the accompanying drawings that form a part hereof, and in which are shown, by way of illustration, specific embodiments which may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the inventive subject matter, and it is to be understood that other embodiments may be utilized, and that structural, logical, and electrical changes may be made without departing from the scope of the present disclosure. The following description of example embodiments is, therefore, not to be taken in a limiting sense, and the scope of the present disclosure is defined by the appended claims.

[0071] As discussed herein, the term “module” includes one or both of hardware or software that has been designed to perform a function or functions discussed herein.

[0072] As discussed herein, the term “virtual environment” indicates a computer-simulated environment, including a virtual reality environment, an augmented reality environment, or a mixed reality environment. As discussed herein, the term “virtual reality” (or VR) includes a computer-simulated environment configured as a fully artificial digital environment. As discussed herein, the term “augmented reality” (or AR) indicates a computer-simulated environment that overlays virtual objects in a real-world environment. As discussed herein, the term “mixed reality” (or MR) indicates a computer-simulated environment that anchors virtual objects to a real-world environment.

[0073] As used herein, the term “virtual press” indicates a pressing action that is applied to a virtual object (or objects) in a virtual environment, wherein the virtual press can manipulate or select the displayed virtual object(s). Alternatively, the term "virtual press" applies to a virtual pressure sensing associated with a virtual pressure between a first virtual object and a second virtual object in a virtual environment. As used herein, the term “functional layer” indicates a specific functionality that can be activated by selecting an icon corresponding to the functional layer (also referred to as a functional layer icon). In some aspects, a functional layer can be part of a “plurality of functional layers,” which can indicate a hierarchical arrangement of related functionalities.

[0074] Some drawbacks associated with virtual environments that conventionally lack a virtual press experience include the following. During a virtual environment experience, a user may have difficulty feeling virtual pressure from/to virtual objects. Pressure recipients (e.g., virtual objects being pressed in a virtual environment) cannot provide direct or accurate feedback about the virtual pressure associated with virtual object-to-object presses, because such presses happen virtually. Some virtual environment experiences may use sensors to detect the movements and depth of human hands, or game controllers, to roughly estimate the pressure. However, using such techniques is not optimal for many scenarios (e.g., graphical design and human-computer interactions). Additionally, existing virtual environment architectures do not provide the user with a virtual press experience where the user receives pressure feedback associated with the virtual touch and can control how much pressure the user can apply to objects in the virtual environment in three-dimensional (3D) space.

[0075] Disclosed techniques use a wearable sensor device (WSD) which includes, for example, a sensor array, a feedback array, a communications module, and a battery. In some aspects, the WSD can be configured as a pad, a ring, an attachable tip, a wrap, or another type of device that can be worn by a user (e.g., worn on a user’s finger, as illustrated in FIGS. 1-8). The WSD can be in communication with a computing device (e.g., a mobile device paired with the WSD). The computing device can include a virtual press module (VPM) comprising software. The VPM can be used to receive a sensor signal (or sensor signals) from the WSD and configure and manage virtual press-related functionalities performed by the WSD.
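By way of example, and not by way of limitation, the split between the WSD and the host-side VPM described above can be modeled with the following minimal Python sketch. The names (SensorSample, VirtualPressModule, on_sample) are hypothetical and are introduced only for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensorSample:
    """One reading reported by the wearable sensor device (WSD)."""
    timestamp_ms: int
    pressure_raw: int                          # raw pressure-sensor output
    motion_delta: Tuple[float, float, float]   # (dx, dy, dz) from the motion sensor

class VirtualPressModule:
    """Host-side software that receives WSD sensor signals and manages
    virtual press-related functionality (a stand-in for the VPM)."""

    def __init__(self) -> None:
        self.latest: Optional[SensorSample] = None

    def on_sample(self, sample: SensorSample) -> None:
        # Keep the most recent sample; cursor movement and layer selection
        # logic (sketched later) would read from here.
        self.latest = sample

vpm = VirtualPressModule()
vpm.on_sample(SensorSample(timestamp_ms=0, pressure_raw=512,
                           motion_delta=(0.0, 0.0, 0.0)))
print(vpm.latest)
```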

[0076] In some aspects, the wearable device can also be configured to provide different levels of virtual presses (or press sequences), which can be used in connection with object selection, object movement, and multi-layer selection (e.g., selecting one or more functional layers of a plurality of available functional layers associated with an on-screen representation in a graphical user interface of the virtual environment). Additional description of the WSD and the computing device using a VPM is provided in connection with FIG. 1. FIG. 2 and FIG. 3 illustrate using the WSD while worn on a user’s finger. FIG. 4 - FIG. 9 illustrate various virtual press-related functionalities associated with the disclosed WSD. FIG. 10 illustrates a representative software architecture. FIG. 11 illustrates a representative device that can be used in connection with the disclosed virtual press-related functionalities.

[0077] FIG. 1 is a block diagram 100 illustrating a computing device 104 in communication with a device 102 via communication link 106, according to example embodiments. Referring to FIG. 1, device 102 includes a battery module 108, a communications module 110, memory 112, sensor array 114, and feedback array 122. In some embodiments and as described herein below, device 102 is a WSD and is referred to as WSD 102. However, in other embodiments, device 102 can be another type of device that is connected (e.g., paired) to computing device 104.

[0078] The battery module 108 is configured to power various components of WSD 102, including the communications module 110, the sensor array 114, and the feedback array 122. In some aspects, the battery module 108 includes rechargeable batteries, solar-powered batteries, or other types of batteries.

[0079] The communications module 110 is used to connect to (e.g., be paired with) the computing device 104 via the communication link 106. In some embodiments, communication link 106 can include a wired communication link, a wireless communication link, or a combination thereof. Additionally, the communication link 106 can use one or more wireless protocols, such as a Bluetooth protocol, for communicating data between WSD 102 and the computing device 104. In some aspects, WSD 102 can use the communications module 110 to receive configuration information (e.g., configurations for the sensor array 114 or the feedback array 122) from the computing device 104 via the communication link 106. WSD 102 can also use the communications module 110 to communicate sensor data from the sensor array 114 back to the computing device 104 for processing. The configuration information and the sensor data can be stored in memory 112 in device 102.
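By way of example, and not by way of limitation, the exchange over communication link 106 can be sketched as follows in Python. The in-process lists stand in for a real wired or Bluetooth transport, and all field names and values are illustrative assumptions rather than a defined interface.

```python
class SimulatedLink:
    """Stand-in for communication link 106. A real device would use a wired
    or wireless (e.g., Bluetooth) transport rather than in-process lists."""
    def __init__(self):
        self.to_wsd = []    # configuration pushed from the computing device
        self.to_host = []   # sensor data streamed back from the WSD

class WearableSensorDevice:
    def __init__(self, link):
        self.link = link
        self.memory = {}    # stores configuration and buffered sensor data

    def poll(self):
        # Apply any configuration received from the computing device.
        while self.link.to_wsd:
            self.memory.update(self.link.to_wsd.pop(0))
        # Report a (simulated) pressure reading back to the host.
        self.link.to_host.append({"pressure_raw": 480,
                                  "rate_hz": self.memory.get("rate_hz", 50)})

link = SimulatedLink()
wsd = WearableSensorDevice(link)
link.to_wsd.append({"rate_hz": 100})  # host sends sensor configuration
wsd.poll()
print(link.to_host)                   # host reads the streamed sensor data
```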

[0080] The sensor array 114 can include different types of sensors, such as a pressure sensor 116, a motion sensor 118, and other sensors 120. The pressure sensor 116 can be configured to detect contact pressure applied to the sensor while a user is wearing the WSD 102, and generate a pressure signal corresponding to the detected contact pressure. In some embodiments, the pressure sensor 116 can generate a pressure measurement from the pressure signal or communicate the pressure signal to the computing device 104 for processing (e.g., VPM 142 in computing device 104 can generate the pressure measurement based on the pressure signal received from the pressure sensor 116).
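By way of example, and not by way of limitation, the conversion from a raw pressure signal to a pressure measurement might resemble the sketch below. The linear calibration and its constants are assumptions for illustration only; the same conversion could run on WSD 102 or in VPM 142, matching the two options described above.

```python
def pressure_measurement(raw_counts: int,
                         zero_offset: int = 40,
                         counts_per_unit: float = 85.0) -> float:
    """Convert a raw pressure-sensor reading into a pressure measurement.

    The linear calibration is purely illustrative; an actual sensor would
    supply its own transfer function and units.
    """
    return max(0.0, (raw_counts - zero_offset) / counts_per_unit)

print(pressure_measurement(480))   # ~5.18 in arbitrary calibrated units
```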

[0081] The motion sensor 118 can be configured to detect motion, including detecting motion in 3D space during a virtual environment experience. A resulting motion signal or measurement can be provided to a user via the computing device 104. In some aspects, sensor array 114 can also include other sensors 120.

[0082] The feedback array 122 can be configured to provide different types of feedback to the user in connection with providing or experiencing a virtual touch during a virtual environment experience. Example feedback that can be provided in connection with a virtual touch includes audio signal feedback generated by speaker 124, a visible light signal generated by a light source 126, sensory feedback such as heat feedback generated by heat source 128, or other types of feedback (e.g., vibration feedback, low-voltage signal feedback, etc.) generated by other feedback sources 130.
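By way of example, and not by way of limitation, a simple dispatcher for the feedback array 122 could look like the following sketch, where the feedback types and handler names are illustrative assumptions.

```python
def send_feedback(feedback_array, kind: str, intensity: float) -> None:
    """Route a feedback request to one source in the feedback array.

    `feedback_array` is assumed to be a dict of callables keyed by feedback
    type (speaker, light source, heat source, ...); the keys are
    illustrative, not an interface defined by the disclosure.
    """
    handler = feedback_array.get(kind)
    if handler is None:
        raise ValueError(f"unsupported feedback type: {kind}")
    handler(intensity)

feedback_array = {
    "audio": lambda level: print(f"speaker beep at level {level:.2f}"),
    "light": lambda level: print(f"light source brightness {level:.2f}"),
    "heat":  lambda level: print(f"heat source output {level:.2f}"),
}
send_feedback(feedback_array, "light", 0.6)
```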

[0083] Computing device 104 includes a processor 132, memory 134, communications module 136, power supply module 138, a display 140, and a VPM 142. The communications module 136 can be configured to pair (or otherwise associate) WSD 102 with computing device 104 as well as facilitate communication of data between WSD 102 and computing device 104 in connection with pressure-sensing and/or virtual press-related functionalities.

[0084] In some embodiments, computing device 104 can be used in connection with a virtual environment experience (e.g., a virtual environment game, graphic design, human-computer interactions, or other types of virtual environment experiences) which may include in-air virtual press experiences. Example in-air virtual press experiences include applying pressure to objects in the virtual environment in 3D space, as well as controlling how much pressure is applied, to select a functional layer of a plurality of functional layers and perform a function corresponding to the selected functional layer. Other in-air virtual press experiences include enabling a user to experience virtual touch such as experiencing pressure feedback associated with pressure received from other virtual objects.
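By way of example, and not by way of limitation, mapping a pressure measurement onto a discrete press level using multiple thresholds can be sketched as follows; the threshold values are illustrative and would in practice be tuned per user or per sensor.

```python
def press_level(pressure: float, thresholds=(1.0, 3.0, 6.0)) -> int:
    """Map a pressure measurement onto a discrete press level.

    Level 0 means no press; crossing each successive threshold yields a
    deeper virtual press. Threshold values are made-up examples.
    """
    level = 0
    for t in thresholds:
        if pressure > t:
            level += 1
    return level

print(press_level(0.4))   # 0: below the first threshold
print(press_level(2.1))   # 1: a light virtual press
print(press_level(7.5))   # 3: the deepest configured press
```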

[0085] Additional description of virtual press-related functionalities which can be performed using WSD 102 is provided in connection with FIG. 2 - FIG. 11.

[0086] FIG. 2 is a diagram 200 illustrating the WSD 102 of FIG. 1 worn on a user’s hand, according to example embodiments. More specifically, diagram 200 illustrates WSD 102 configured as a pad worn on a finger of the user’s hand 202.

[0087] FIG. 3 is a diagram 300 illustrating the WSD of FIG. 1 worn on a user’s hand and used in connection with an in-air virtual press, according to example embodiments. Referring to FIG. 3, WSD 102 is configured as a pad worn on finger 302 of the user’s hand 202. In this regard, pressure sensor 116 of the sensor array 114 in WSD 102 can generate a pressure signal indicative of pressure detected by the pressure sensor 116 (e.g., when finger 304 presses against WSD 102 worn on finger 302). In one embodiment, WSD 102 can determine a pressure measurement from the pressure signal sensed by the pressure sensor 116 and can communicate the pressure measurement to computing device 104 for further processing. In another embodiment, WSD 102 can communicate the pressure signal to computing device 104, and the processor 132 of computing device 104 (e.g., executing the VPM 142) can determine the pressure measurement from the received pressure signal. The use of the pressure measurement for functional layer selection is discussed herein below in connection with FIG. 4 - FIG. 8.

[0088] In some aspects, a first functional layer (displayed on the display 140 of the computing device 104) can be represented by a first functional layer icon and can be associated with a first functionality. When the first functional layer icon is selected, at least a second functional layer icon can be activated (e.g., displayed), where the second functional layer icon is associated with at least a second functionality. Additional (e.g., subsequent) functional layers (represented by corresponding functional layer icons) can be activated by selecting a functional layer icon of a currently active functional layer.
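By way of example and not limitation, the layered arrangement described above, in which selecting a functional layer icon activates the icons of the next functional layer, can be modeled as a tree of menu nodes. The Python sketch below uses assumed class and field names that do not appear in the disclosure.

    # Illustrative sketch: each functional layer icon either opens a sub-layer
    # (the next layer of icons) or performs a leaf function. Names are assumed.
    from dataclasses import dataclass, field
    from typing import Callable, List, Optional

    @dataclass
    class FunctionalLayerIcon:
        label: str
        action: Optional[Callable[[], None]] = None            # leaf functionality, if any
        sub_layer: List["FunctionalLayerIcon"] = field(default_factory=list)

        def select(self) -> List["FunctionalLayerIcon"]:
            """Selecting an icon activates its sub-layer or performs its function."""
            if self.sub_layer:
                return self.sub_layer                           # next layer of icons to display
            if self.action is not None:
                self.action()                                   # e.g., a data processing function
            return []

    # Example hierarchy: a contacts menu whose sub-menu holds a single action.
    contacts_icon = FunctionalLayerIcon(
        label="Contacts",
        sub_layer=[FunctionalLayerIcon("Send email", action=lambda: print("email sent"))],
    )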

[0089] In an example embodiment, the plurality of functional layers can correspond to a plurality of functionality menus configured in a hierarchical arrangement. For example, a first functional layer can be associated with a first functionality menu. When a first functional layer icon corresponding to the first functionality menu is displayed on a screen and selected, a second functional layer icon corresponding to a second functionality menu (which is a sub-menu of the first functionality menu) is activated and displayed. Additional functionality menus can be activated and displayed similarly.

[0090] FIG. 4 is a diagram 400 of a graphical user interface (GUI) illustrating a first functional layer icon associated with a first functional layer of a plurality of functional layers and a cursor corresponding to the WSD 102 worn on a user’s hand, according to example embodiments. Referring to FIG. 4, GUI 404 can be displayed on display 140 of computing device 104 in connection with a virtual environment experience using WSD 102 worn on finger 302 of the user's hand 202. GUI 404 illustrates a first functional layer icon 402, which can be associated with contacts-related functionalities. FIG. 4 further illustrates a cursor 408, which corresponds to a position 410 in the 3D space of WSD 102 as worn on user hand 202. When user hand 202 moves, motion sensor 118 in the WSD 102 detects movement and communicates movement data to the processor 132 and the VPM 142 of the computing device 104, which causes the corresponding movement of cursor 408 in GUI 404.

[0091] As illustrated in FIG. 4, user finger 304 is not currently touching finger 302 and the WSD 102 and, therefore, pressure sensor 116 is not detecting any pressure. In an example embodiment, the first functional layer icon 402 is displayed automatically in GUI 404, without any movement or pressure input needed from WSD 102.

[0092] In another embodiment, as an initial step, the first functional layer icon 402 is displayed after contact pressure is detected by the pressure sensor 116. The contact pressure causing the display of the first functional layer icon 402 can be associated with a minimum pressure threshold, or simply a detected contact between WSD 102 and another user finger or surface. In some aspects, the first functional layer icon 402 can be displayed within a preconfigured distance from a cursor position of cursor 408 in GUI 404.

[0093] In some embodiments, subsequent functional layers (e.g., a second functional layer) of the plurality of functional layers can be activated (e.g., displayed on GUI 404) by selecting the first functional layer icon 402 using a pressure measurement from WSD 102. For example, the first functional layer icon 402 of the first functional layer of the plurality of functional layers can be selected when the pressure measurement detected by the pressure sensor 116 of WSD 102 exceeds a first pressure threshold (e.g., the pressure measurement exceeds a first pressure threshold but is below a second pressure threshold). When the first functional layer icon 402 of the first functional layer of the plurality of functional layers is selected, one or more second functional layer icons associated with a second functional layer are displayed in GUI 404 (e.g., as illustrated and discussed in connection with FIG. 5).

[0094] In some embodiments, a selection of the first functional layer icon 402 is performed when a cursor position of cursor 408 is over at least a portion of the first functional layer icon 402. Alternatively, the selection of the first functional layer icon 402 is performed when the cursor 408 is within a preconfigured distance of the first functional layer icon 402. For example, a selection of the first functional layer icon 402 is performed when the cursor position of cursor 408 is within boundary 406 which defines a preconfigured maximum distance from the first functional layer icon 402.
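By way of example and not limitation, the selection conditions described in the two preceding paragraphs (a pressure measurement above the first threshold but below the second, with the cursor over the icon or within its boundary) could be expressed as in the following sketch; the numeric thresholds and boundary distance are assumed values used only for illustration.

    # Minimal sketch, assuming Euclidean screen coordinates and illustrative
    # threshold values: the icon is selected when the pressure lies in the first
    # threshold band and the cursor is within the icon's boundary distance.
    import math

    FIRST_PRESSURE_THRESHOLD = 1.0    # assumed, in the sensor's pressure units
    SECOND_PRESSURE_THRESHOLD = 2.0   # assumed upper bound of the band
    BOUNDARY_DISTANCE_PX = 40.0       # assumed maximum cursor-to-icon distance

    def icon_selected(pressure: float, cursor_xy, icon_xy) -> bool:
        in_band = FIRST_PRESSURE_THRESHOLD < pressure < SECOND_PRESSURE_THRESHOLD
        return in_band and math.dist(cursor_xy, icon_xy) <= BOUNDARY_DISTANCE_PX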

[0095] FIG. 5 is diagram 500 of a GUI 501 illustrating multiple second functional layer icons associated with a second functional layer of the plurality of functional layers and a cursor 408 corresponding to the WSD 102, according to example embodiments. Referring to FIG. 5, the cursor position of cursor 408 corresponds to position 503 of WSD 102 (as worn on user hand 202) in 3D space. Multiple second functional layer icons 502, 504, 506, and 508 are displayed in GUI 501 when the cursor position of cursor 408 is within the boundary 406 of the first functional layer icon 402 and when the pressure detected by WSD 102 exceeds the first pressure threshold but is below a second pressure threshold. In this regard, performing a function corresponding to the first functional layer includes displaying the multiple second functional layer icons 502 - 508 when the pressure 505 is above the first pressure threshold and the first functional layer icon is selected. In an example embodiment, if the first functional layer does not include any additional functional layers, no additional functional layer icons are displayed, and performing the function corresponding to the first functional layer can include performing a function that is unrelated to additional functional layers (e.g., performing a data processing function or another functionality).

[0096] In some aspects, the pressure measurement detected by WSD 102 corresponds to pressure 505 applied between WSD 102 (as worn on finger 302) and finger 304 of user hand 202.

[0097] FIG. 6 is a diagram 600 of the GUI 501 of FIG. 5 illustrating the movement of the cursor 408 corresponding to the movement of the WSD 102 for selecting one of the multiple second functional layer icons, according to example embodiments. Referring to FIG. 6, after the multiple second functional layer icons 502 - 508 are displayed in GUI 501, a user may wish to select the second functional layer icon 508 and perform a function corresponding to the second functional layer (e.g., viewing any additional functional layers associated with that icon). In this regard, the initial cursor position 602 of cursor 408 corresponds to the initial hand position 606 of WSD 102 in 3D space. As the user hand 202 moves to final hand position 608, cursor 408 moves to a corresponding final cursor position 604 which overlaps at least partially with the second functional layer icon 508.

[0098] In some embodiments, the multiple second functional layer icons 502 - 508 remain displayed in GUI 501 as long as pressure 505 (or pressure above the first pressure threshold and below the second pressure threshold) remains applied between WSD 102 and finger 304 of the user hand 202 during the movement from initial cursor position 602 to final cursor position 604.

[0099] After cursor 408 is at final cursor position 604, the user can perform a function corresponding to a second functional layer of second functional layer icon 508 by increasing the pressure 505 applied between WSD 102 and finger 304 to a pressure above a third pressure threshold (e.g., pressure 718 as discussed in connection with FIG. 7). For example, performing the function corresponding to the second functional layer can include displaying multiple third functional layer icons as illustrated in FIG. 7.

[00100] FIG. 7 is a diagram 700 of a GUI 701 illustrating the movement of the cursor 408 corresponding to the movement of the WSD 102 for selecting one of multiple third functional layer icons associated with a third functional layer of the plurality of functional layers, according to example embodiments. Referring to FIG. 7, after the second functional layer icon 508 is selected (e.g., by using pressure 505 as illustrated in FIG. 6), multiple third functional layer icons 702 - 708 are displayed in GUI 701. The user wearing WSD 102 on user hand 202 may wish to select third functional layer icon 706 to view any additional functional layers associated with that icon. In this regard, the initial cursor position 710 of cursor 408 corresponds to the initial hand position 714 of WSD 102 in 3D space. As the user hand 202 moves to final hand position 716, cursor 408 moves to a corresponding final cursor position 712, which overlaps at least partially with third functional layer icon 706.

[00101] In some embodiments, the multiple third functional layer icons 702 - 708 remain displayed in GUI 701 as long as pressure 718 (or pressure above the third pressure threshold and below a fourth pressure threshold) remains applied between WSD 102 and finger 304 of the user hand 202 during the movement from an initial cursor position 710 to a final cursor position 712.

[00102] FIG. 8 is a diagram 800 of the GUI 701 illustrating selection of one of the multiple third functional layer icons, according to example embodiments. Referring to FIG. 8, after cursor 408 is at cursor position 712 (corresponding to the user’s hand 202 with WSD 102 being at position 806), the user can perform a function (e.g., send an email) corresponding to the third functional layer of the third functional layer icon 706 by increasing the pressure applied between WSD 102 and finger 304 to a pressure above the fourth pressure threshold (e.g., pressure 808 as illustrated in FIG. 8).
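By way of example and not limitation, the escalating thresholds used in FIG. 5 through FIG. 8 (a higher threshold for each successively deeper layer) can be summarized by the simplified Python sketch below; the threshold values and the one-threshold-per-layer mapping are assumptions made only for illustration.

    # Simplified sketch of the escalating-threshold interaction: counting how
    # many of the assumed thresholds the current pressure has crossed indicates
    # the deepest layer that can currently be acted on.
    PRESSURE_THRESHOLDS = [1.0, 2.0, 3.0, 4.0]   # assumed first..fourth thresholds

    def layers_crossed(pressure: float) -> int:
        """Return the number of thresholds exceeded by the current pressure."""
        return sum(1 for threshold in PRESSURE_THRESHOLDS if pressure > threshold)

    # e.g., a pressure of 2.5 exceeds the first and second thresholds only.
    assert layers_crossed(2.5) == 2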

[0100] In some embodiments, a selection of the third functional layer icon 706 is performed only when the cursor position 712 of cursor 408 is within a preconfigured distance 802 of the third functional layer icon 706. For example, a selection of the third functional layer icon 706 is performed when the cursor position 712 of cursor 408 is within boundary 802, which defines a preconfigured maximum distance from the third functional layer icon 706.

[0101] In some embodiments, the selection of a functional layer icon can be based on (a) contact pressure sensed by the pressure sensor of WSD 102; (b) contact pressure and duration of contact pressure for a predetermined period; and (c) either of (a) or (b), as well as based on the position of the user's hand (relative to a defined point in 3D space, such as being within a boundary of the icon being selected), which relates to the position of a cursor or another pointer on the screen.

[0102] Even though the disclosed techniques for sensing pressure using WSD 102 discuss a continuous (uninterrupted) press with increasing pressure to select different functional layers, the disclosure is not limited in this regard. In some embodiments, a pressure sequence or pattern (e.g., a sequence of two or more taps between fingers 302 and 304 or between finger 302 with WSD 102 and another surface) can also be used for triggering virtual touch events such as selecting functional layer icons and performing functionalities associated with the corresponding functional layers. The pressure sequence or pattern can include a predetermined sequence of pressure values or ranges exerted by the user. For example, the predetermined sequence of pressure values or ranges can require the user to enter a sequence of low-high-low pressures to generate an input, including an input command or action. The pressure sequence or pattern can include a predetermined sequence of pressure values or ranges exerted by the user for predetermined time periods. For example, the predetermined sequence of pressure values or ranges can require the user to enter a sequence of a short-duration low pressure, a long-duration high pressure, followed by a short-duration low pressure, to generate the input.
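By way of example and not limitation, the low-high-low pressure sequence with per-segment durations could be recognized as in the following sketch; the pressure bands and minimum durations are assumed values.

    # Hedged sketch: match observed (mean_pressure, duration_seconds) segments
    # against an expected short-low, long-high, short-low pattern. Bands and
    # minimum durations are assumptions used only for illustration.
    LOW_BAND = (0.2, 1.0)     # assumed "low pressure" range
    HIGH_BAND = (2.0, 5.0)    # assumed "high pressure" range
    EXPECTED = [(LOW_BAND, 0.1), (HIGH_BAND, 0.5), (LOW_BAND, 0.1)]

    def matches_pattern(segments) -> bool:
        """segments: ordered list of (mean_pressure, duration_seconds) pairs."""
        if len(segments) != len(EXPECTED):
            return False
        for (pressure, duration), ((low, high), min_duration) in zip(segments, EXPECTED):
            if not (low <= pressure <= high and duration >= min_duration):
                return False
        return True

    # A qualifying low-high-low sequence generates the input command or action.
    assert matches_pattern([(0.5, 0.2), (3.0, 0.8), (0.4, 0.2)])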

[0103] Even though the disclosed techniques are described in connection with using a virtual press to interact with a 3D GUI, including icons with functional layers (e.g., as illustrated in FIGS. 2-8), such disclosed techniques are not limited in this regard. In some embodiments, the disclosed techniques can also be used in connection with other AR-, VR-, or MR-related applications and use cases. For example, the disclosed techniques can be used to configure a virtual press with regard to a switch or button (such as an On-Off switch) of a virtual representation of a machine to operate such a machine.

[0104] FIG. 9 is a flowchart of a method for pressure sensing in three- dimensional (3D) space, according to example embodiments. Method 900 includes operations 902, 904, and 906. By way of example and not limitation, method 900 may be performed by one or more processors within a computing device, such as device 104 in FIG. 1 or device 1100 of FIG. 11, for example. In some aspects, method 900 may be performed by a VPM used by (or executing on) device 104 (e.g., VPM 142) or device 1100 (e.g., VPM 1160).

[0105] At operation 902, a pressure measurement of user-induced pressure is obtained. For example, pressure sensor 116 of WSD 102 obtains a pressure measurement relative to a cursor position of cursor 408 displayed on GUI 404 (e.g., as discussed in connection with FIG. 4).

[0106] At operation 904, a first functional layer icon of a first functional layer of a plurality of functional layers is selected when the pressure measurement exceeds a first pressure threshold. For example, the first functional layer is associated with the first functional layer icon 402 displayed by the GUI 404. The first functional layer icon 402 is selected when the pressure measurement obtained by the pressure sensor 116 of sensor array 114 exceeds a first pressure threshold. In other aspects, the first functional layer can be considered a functional layer illustrated in any of FIG. 4 - FIG. 8.

[0107] At operation 906, a function corresponding to the first functional layer is performed. The function can include, for example, displaying a next functional layer or initiating one or more operational actions. For example and in reference to FIG. 5, multiple second functional layer icons 502, 504, 506, and 508 are displayed in GUI 501 when the cursor position of cursor 408 is within the boundary 406 of the first functional layer icon 402 and when the pressure measurement detected by WSD 102 exceeds the first pressure threshold (but remains below a second pressure threshold). In this regard, performing a function corresponding to the first functional layer in some embodiments includes displaying the multiple second functional layer icons 502 - 508 when the first functional layer icon is selected.
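By way of example and not limitation, operations 902 through 906 can be sketched in Python as follows. The callables standing in for the sensor read, the cursor hit test, and the GUI update, as well as the threshold value, are assumptions rather than part of the disclosed method.

    # Non-authoritative sketch of method 900. read_pressure, cursor_near_icon,
    # and display_next_layer stand in for sensor, hit-testing, and GUI code.
    FIRST_PRESSURE_THRESHOLD = 1.0   # assumed value

    def method_900(read_pressure, cursor_near_icon, display_next_layer) -> None:
        # Operation 902: obtain a pressure measurement of the user-induced pressure.
        pressure = read_pressure()

        # Operation 904: select the first functional layer icon when the measurement
        # exceeds the first pressure threshold and the cursor is at or near the icon.
        if pressure > FIRST_PRESSURE_THRESHOLD and cursor_near_icon():
            # Operation 906: perform the corresponding function, e.g., display the
            # second functional layer icons.
            display_next_layer()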

[0108] In some embodiments, the contact pressure is generated between a second user finger and a wearable sensor device worn on a first user finger. For example, the contact pressure can be detected by pressure sensor 116 while WSD 102 is worn on finger 302 of the user’s hand 202.

[0109] In some aspects, the contact pressure is generated between a surface and a wearable sensor device worn on a first user finger. For example, the contact pressure can be generated by pressing WSD 102 (worn on finger 302) to a surface (e.g., a hard surface).

[0110] In some aspects, obtaining the pressure measurement comprises receiving the pressure measurement. For example, the pressure measurement can be generated by the pressure sensor 116 of sensor array 114 of WSD 102, and the pressure measurement can be communicated to VPM 142 via communication link 106.

[0111] In some embodiments, obtaining the pressure measurement includes receiving a pressure signal from a wearable sensor device worn on a first user finger, the wearable sensor device comprising a pressure sensor, and generating the pressure measurement from the pressure signal. For example, pressure sensor 116 can generate a pressure signal which can be communicated by WSD 102 to VPM 142 at computing device 104. VPM 142 can generate the pressure measurement using the received pressure signal.

[0112] In some aspects, the first functional layer icon is selected when the pressure measurement exceeds the first pressure threshold for a predefined selection time period. For example, the first functional layer icon 402 in FIG. 4 can be selected when the pressure measurement (e.g., measured by the pressure sensor 116 while the user’s hand with WSD 102 is at position 410 in 3D space) exceeds the first pressure threshold for the predefined selection time period.

[0113] In some aspects, selecting the first functional layer icon is based on the pressure measurement exceeding the first pressure threshold and the cursor position being within a predefined distance from the first functional layer icon. For example, selecting the first functional layer icon 402 can be based on the pressure measurement performed by WSD 102 (or by VPM 142) exceeding the first pressure threshold and the cursor position of cursor 408 being within a predefined distance from the first functional layer icon 402 (e.g., within boundary 406).

[0114] In some embodiments, selecting the first functional layer icon is based on the pressure measurement exceeding the first pressure threshold for a predefined period and the cursor position being within a predefined distance from the first functional layer icon. For example, selecting the first functional layer icon 402 can be based on the pressure measurement performed by WSD 102 (or by VPM 142) exceeding the first pressure threshold for a predefined time and the cursor position of cursor 408 being within a predefined distance from the first functional layer icon 402 (e.g., within boundary 406).
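By way of example and not limitation, the dwell-based variant described in the two preceding paragraphs (pressure above the first threshold for a predefined period while the cursor stays within the predefined distance) could be tracked as in this sketch; the threshold value, the dwell time, and the class name are assumptions.

    # Sketch under stated assumptions: the icon is reported as selected only once
    # the pressure has stayed above the first threshold, with the cursor inside
    # the icon boundary, for the predefined selection time period.
    import time

    FIRST_PRESSURE_THRESHOLD = 1.0   # assumed value
    SELECTION_TIME_S = 0.5           # assumed predefined selection time period

    class DwellSelector:
        def __init__(self):
            self._dwell_start = None

        def update(self, pressure: float, cursor_in_boundary: bool) -> bool:
            """Call periodically; returns True once the dwell condition is satisfied."""
            now = time.monotonic()
            if pressure > FIRST_PRESSURE_THRESHOLD and cursor_in_boundary:
                if self._dwell_start is None:
                    self._dwell_start = now
                return (now - self._dwell_start) >= SELECTION_TIME_S
            self._dwell_start = None   # condition broken; restart the dwell timer
            return False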

[0115] In some embodiments, a preliminary step can include causing a display of the first functional layer icon on the GUI and within a predefined distance from the cursor position. For example, the preliminary step can include displaying the first functional layer icon 402 on GUI 404 and within a predefined distance from the cursor position of cursor 408.

[0116] In some embodiments, a motion of a wearable sensor device worn on a first user finger is detected, the motion causing a movement of the cursor as displayed on the GUI. For example and as illustrated in FIG. 6, motion sensor 118 can detect the motion of the user’s hand 202 from position 606 to position 608 while WSD 102 is worn on the user’s finger 302. The detected motion can cause a corresponding movement of cursor 408 from position 602 to position 604.
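By way of example and not limitation, the mapping from detected hand motion to cursor movement could be as simple as the scaling and clamping shown below; the sensitivity gain, the assumption of relative X-Y displacements, and the screen resolution are illustrative only.

    # Minimal sketch, assuming the motion sensor reports relative hand
    # displacement in metres in the X-Y plane: scale it into a cursor delta and
    # clamp the result to the display bounds.
    CURSOR_GAIN_PX_PER_M = 4000.0     # assumed sensitivity
    SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution

    def move_cursor(cursor_xy, hand_delta_m):
        x = cursor_xy[0] + hand_delta_m[0] * CURSOR_GAIN_PX_PER_M
        y = cursor_xy[1] + hand_delta_m[1] * CURSOR_GAIN_PX_PER_M
        return (min(max(x, 0.0), SCREEN_W - 1.0), min(max(y, 0.0), SCREEN_H - 1.0))

    # e.g., a 1 cm hand movement to the right shifts the cursor 40 pixels.
    assert move_cursor((100.0, 100.0), (0.01, 0.0)) == (140.0, 100.0)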

[0117] In some aspects, the pressure measurement remains greater than the first pressure threshold during the motion. For example, the pressure measurement remains greater than the first pressure threshold as pressure 505 is maintained throughout the motion from position 606 to position 608.

[0118] In some embodiments, a display of a second functional layer icon of a second functional layer of the plurality of functional layers is performed in response to selecting the first functional layer icon, the second functional layer being a sub-layer of the first functional layer. For example, second functional layer icons 502 - 508 are associated with the second functional layer. The second functional layer icons 502 - 508 are displayed in response to exerting and maintaining pressure 505 while cursor 408 is at or within a predefined boundary of the first functional layer icon 402.

[0119] A motion of a wearable sensor device to a second position is detected, the motion causing a movement of the cursor on the GUI, with the pressure measurement remaining greater than the first pressure threshold during the motion, and the second position being within a predefined distance from the second functional layer icon. For example, a motion of WSD 102 from position 606 to position 608 is detected while pressure 505 is maintained throughout the motion. The motion causes a movement of cursor 408 from position 602 to position 604, with position 604 being within a predefined distance of the second functional layer icon 508.

[0120] A second pressure measurement is obtained at the second position, and the second functional layer icon is selected when the second pressure measurement exceeds a second pressure threshold. For example, the second functional layer icon 508 is selected and a function associated with this icon is performed (e.g., displaying third functional layer icons 702 - 708 in GUI 701) when pressure 505 is increased to pressure 718.

[0121] In some embodiments, the cursor (e.g., cursor 408) comprises a first virtual object representation in a virtual environment. A virtual contact can be detected between the first virtual object representation and a second virtual object representation in the virtual environment. For example, as the first virtual object representation of cursor 408 is moving in a virtual environment, the first virtual object representation may come in contact with a second virtual object representation within the virtual environment. A virtual pressure measurement of a virtual pressure between the first virtual object representation and the second virtual object representation in the virtual environment can be determined (e.g., using motion sensor 118 and pressure sensor 116 of WSD 102). A feedback signal can be generated using the feedback array 122 of WSD 102, where the feedback signal is proportional to the virtual pressure measurement.

[0122] In some aspects, generating the feedback signal further includes at least one of generating pressure feedback (e.g., by the other feedback sources 130 of feedback array 122, such as generating pressure on the user's finger 302), generating a visual signal (e.g., by a light source 126 visible to the user), or generating an audio signal to be heard by the user (e.g., by speaker 124 of feedback array 122) at WSD 102.
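By way of example and not limitation, a feedback signal proportional to the virtual pressure measurement could be produced by scaling the measurement into per-channel intensities for the feedback array, as in the sketch below; the full-scale pressure value and the 0-to-1 intensity scale are assumptions.

    # Illustrative sketch: map a virtual pressure measurement to proportional
    # intensities for the feedback array channels (haptic, light, audio).
    MAX_VIRTUAL_PRESSURE = 5.0   # assumed full-scale value

    def feedback_intensities(virtual_pressure: float) -> dict:
        level = min(max(virtual_pressure / MAX_VIRTUAL_PRESSURE, 0.0), 1.0)
        return {
            "pressure_feedback": level,   # e.g., haptic/vibration drive level
            "light": level,               # e.g., light source brightness
            "audio": level,               # e.g., speaker volume
        }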

[0123] FIG. 10 is a block diagram illustrating a representative software architecture 1002, which may be used in conjunction with various device hardware described herein, according to example embodiments. FIG. 10 is merely a non-limiting example of software architecture 1002, and it will be appreciated that many other architectures may be implemented to facilitate the functionality described herein. The software architecture 1002 executes on hardware, such as computing device 104 of FIG. 1, which can be the same as device 1100 of FIG. 11 and includes, among other things, processor 1105, memory 1110, storage 1115 and/or 1120, and I/O interfaces 1125 and 1130.

[0124] A representative hardware layer 1004 is illustrated and can represent, for example, the device 1100 of FIG. 11. The representative hardware layer 1004 comprises one or more processing units 1006 having associated executable instructions 1008. Executable instructions 1008 represent the executable instructions of the software architecture 1002, including implementation of the methods, modules, and so forth of any of FIGS. 1-9. Executable instructions 1008 in some embodiments comprise the VPM 142 of FIG. 1 and can be configured to perform the functionalities discussed in connection with FIG. 1 - FIG. 9. Hardware layer 1004 also includes memory or storage modules 1010, which may store the executable instructions 1008. Hardware layer 1004 may also comprise other hardware 1012, which represents any other hardware of the hardware layer 1004, such as the other hardware illustrated as part of device 1100.

[0125] In the example architecture of FIG. 10, the software architecture 1002 may be conceptualized as a stack of layers where each layer provides particular functionality. For example, the software architecture 1002 may include layers such as an operating system 1014, libraries 1016, frameworks/middleware 1018, applications 1020, and presentation layer 1044. Operationally, the applications 1020 or other components within the layers may invoke application programming interface (API) calls 1024 through the software stack and receive a response, returned values, and so forth, illustrated as messages 1026 in response to the API calls 1024. The layers illustrated in FIG. 10 are representative in nature and not all software architectures 1002 have all layers. For example, some mobile or special purpose operating systems may not provide frameworks/middleware 1018, while others may provide such a layer. Other software architectures may include additional or different layers.

[0126] The operating system 1014 may manage hardware resources and provide common services. The operating system 1014 may include, for example, a kernel 1028, services 1030, and drivers 1032. The kernel 1028 may act as an abstraction layer between the hardware and the other software layers. For example, kernel 1028 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on. The services 1030 may provide other common services for the other software layers. Drivers 1032 may be responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 1032 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth, depending on the hardware configuration.

[0127] In some aspects, software architecture 1002 includes a VPM 1060 which can be similar to VPM 142 and can be configured to perform the functionalities discussed in connection with FIG. 1 - FIG. 9. In an example embodiment, VPM 1060 may be part of the operating system 1014 or the applications 1020 of software architecture 1002. In yet another embodiment, the VPM 1060 can be implemented in a distributed fashion where different components of the VPM 1060 are implemented in hardware (e.g., as part of the hardware layer 1004) or software (e.g., as part of the operating system 1014 or the applications 1020).

[0128] Libraries 1016 may provide a common infrastructure that may be utilized by the applications 1020 or other components or layers. Libraries 1016 typically provide functionality that allows other software modules to perform tasks more easily than interfacing directly with the underlying operating system 1014 functionality (e.g., kernel 1028, services 1030, or drivers 1032). Libraries 1016 may include system libraries 1034 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, libraries 1016 may include API libraries 1036 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functionality), and the like. Libraries 1016 may also include a wide variety of other libraries 1038 to provide many other APIs to the applications 1020 and other software components/modules.

[0129] The frameworks/middleware 1018 (also sometimes referred to as middleware) may provide a higher-level common infrastructure that may be utilized by the applications 1020 or other software components/modules. For example, the frameworks/middleware 1018 may provide various graphical user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks/middleware 1018 may provide a broad spectrum of other APIs that may be utilized by the applications 1020 or other software components/modules, some of which may be specific to a particular operating system 1014 or platform.

[0130] Applications 1020 include built-in applications 1040 and third-party applications 1042. Examples of representative built-in applications 1040 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, or a game application. Third-party applications 1042 may include any of the built-in applications 1040 as well as a broad assortment of other applications. In a specific example, the third-party application 1042 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or other mobile operating systems. In this example, the third-party application 1042 may invoke the API calls 1024 provided by the mobile operating system such as operating system 1014 to facilitate the functionality described herein.

[0131] The applications 1020 may utilize built-in operating system functions (e.g., kernel 1028, services 1030, and drivers 1032), libraries (e.g., system libraries 1034, API libraries 1036, and other libraries 1038), and frameworks/middleware 1018 to create user interfaces to interact with users of the system. Alternatively, or additionally, in some systems, interactions with a user may occur through a presentation layer, such as presentation layer 1044. In these systems, the application/module "logic" can be separated from the aspects of the application/module that interact with a user.

[0132] Some software architectures utilize virtual machines. In the example of FIG. 10, this is illustrated by virtual machine 1048. A virtual machine creates a software environment where applications/modules can execute as if they were executing on a hardware machine (such as the device 1100 of FIG. 11, for example). The virtual machine 1048 is hosted by a host operating system (e.g., operating system 1014) and typically, although not always, has a virtual machine monitor 1046, which manages the operation of the virtual machine 1048 as well as the interface with the host operating system (i.e., operating system 1014). A software architecture executes within the virtual machine 1048 and can include an operating system (OS) 1050, libraries 1052, frameworks/middleware 1054, applications 1056, or a presentation layer 1058. These layers of software architecture executing within the virtual machine 1048 can be the same as corresponding layers previously described or may be different.

[0133] FIG. 11 is a block diagram illustrating circuitry for a device that implements algorithms and performs methods, according to example embodiments. All components need not be used in various embodiments. For example, clients, servers, and cloud-based network devices may each use a different set of components, or in the case of servers, larger storage devices.

[0134] One example computing device in the form of a computer 1100 (also referred to as computing device 1100, computing node 1100, or computer system 1100) may include a processor 1105, memory 1110, removable storage 1115, non-removable storage 1120, an input interface 1125, an output interface 1130, and a communication interface 1135, all connected by a bus 1140. Although the example computing device is illustrated and described as the computer 1100, the computing device may be in different forms in different embodiments.

[0135] The memory 1110 may include volatile memory 1145 and nonvolatile memory 1150 and may store a program 1155. The computing device 1100 may include, or have access to a computing environment that includes, a variety of computer-readable media, such as the volatile memory 1145, the nonvolatile memory 1150, the removable storage 1115, and the non-removable storage 1120. Computer storage includes random-access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium capable of storing computer-readable instructions.

[0136] Computer-readable instructions stored on a computer-readable medium (e.g., the program 1155 stored in memory 1110) are executable by the processor 1105 of the computing device 1100. A hard drive, CD-ROM, and RAM are some examples of articles including a non-transitory computer-readable medium such as a storage device. The terms “computer-readable medium” and “storage device” do not include carrier waves to the extent that carrier waves are deemed too transitory. “Computer-readable non-transitory media” includes all types of computer-readable media, including magnetic storage media, optical storage media, flash media, and solid-state storage media. It should be understood that software can be installed on and sold with a computer. Alternatively, the software can be obtained and loaded into the computer, including obtaining the software through a physical medium or distribution system, including, for example, from a server owned by the software creator or from a server not owned but used by the software creator. The software can be stored on a server for distribution over the Internet, for example. As used herein, the terms “computer-readable medium” and “machine-readable medium” are interchangeable.

[0137] Program 1155 may utilize VPM 1160. In some aspects, VPM 1160 comprises suitable circuitry, logic, interfaces, or code and can be configured to perform the functionalities of VPM 142 discussed herein in connection with any of FIGS. 1-9.

[0138] Any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or any suitable combination thereof). Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.

[0139] Although a few embodiments have been described in detail above, other modifications are possible. For example, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. Other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Other embodiments may be within the scope of the following claims.

[0140] It should be further understood that software including one or more computer-executable instructions that facilitate processing and operations as described above regarding any one or all of the steps (or operations) of the disclosure can be installed in and sold with one or more computing devices consistent with the disclosure. Alternatively, the software can be obtained and loaded into one or more computing devices, including obtaining software through a physical medium or distribution system, including, for example, from a server owned by the software creator or from a server not owned but used by the software creator. The software can be stored on a server for distribution over the Internet, for example.

[0141] Also, it will be understood by one skilled in the art that this disclosure is not limited in its application to the details of construction and the arrangement of components outlined in the description or illustrated in the drawings. The embodiments herein are capable of other embodiments and capable of being practiced or carried out in various ways. Also, it will be understood that the phraseology and terminology used herein are for description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, the terms “connected,” “coupled,” and “mounted,” and variations thereof herein are used broadly and encompass direct and indirect connections, couplings, and mountings. In addition, the terms “connected” and “coupled,” and variations thereof, are not restricted to physical or mechanical connections or couplings. Further, terms such as up, down, bottom, and top are relative, and are employed to aid illustration, but are not limiting.

[0142] The components of the illustrative devices, systems, and methods employed by the illustrated embodiments can be implemented, at least in part, in digital electronic circuitry, analog electronic circuitry, computer hardware, firmware, software, or combinations of them. These components can be implemented, for example, as a computer program product such as a computer program, program code, or computer instructions tangibly embodied in an information carrier, or a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus such as a programmable processor, a computer, or multiple computers.

[0143] A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other units suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or multiple computers at one site or distributed across multiple sites and interconnected by a communication network. Also, functional programs, codes, and code segments for accomplishing the techniques described herein can be easily construed as within the scope of the claims by programmers skilled in the art to which the techniques described herein pertain. Method steps associated with the illustrative embodiments can be performed by one or more programmable processors executing a computer program, code, or instructions to perform functions (e.g., by operating on input data or generating an output). Method steps can also be performed, and apparatus for performing the methods can be implemented as, special purpose logic circuitry, e.g., an FPGA (field-programmable gate array) or an ASIC (application-specific integrated circuit), for example.

[0144] The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an ASIC, an FPGA, or other programmable logic devices, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.

[0145] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random-access memory or both. The required elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example, semiconductor memory devices, e.g., electrically programmable read-only memory or ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory devices, or data storage disks (e.g., magnetic disks, internal hard disks, or removable disks, magneto-optical disks, or CD-ROM and DVD-ROM disks). The processor and the memory can be supplemented by or incorporated into special purpose logic circuitry.

[0146] Those of skill in the art understand that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.

[0147] As used herein, “machine-readable medium” (or “computer-readable medium”) comprises a device able to store instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM)), or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store processor instructions. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions for execution by one or more processors, such that the instructions, when executed by one or more processors, cause the one or more processors to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” as used herein excludes signals per se.

[0148] In an example embodiment, the computing device 1100 includes an edge detection module for detecting edge information for a previous frame, the detecting based on rendered graphics data associated with the previous frame; a hint determination module for determining hint information from the previous frame, the determining based on the edge information, and the hint information indicating an amount of high-frequency information along a horizontal (X) axis and a vertical (Y) axis of an X-Y coordinate system in the rendered graphics data; a parameter value determination module for determining, using the hint information, desired parameter values for a set of parameters associated with a plurality of sampler data structures; a selection module for selecting a sampler data structure from the plurality of sampler data structures based on the desired parameter values; and a rendering module for rendering a current frame using the selected sampler data structure, the current frame being sequential to the previous frame. In an example embodiment, the above-listed modules are implemented as part of a TBSS rendering module for processing 3D graphics data.

[0149] In addition, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of the present disclosure. Other items shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and could be made without departing from the scope disclosed herein.

[0150] Although the present disclosure has been described concerning specific features and embodiments thereof, it is evident that various modifications and combinations can be made thereto without departing from the scope of the disclosure. For example, other components may be added to, or removed from, the described systems. The specification and drawings are, accordingly, to be regarded simply as an illustration of the disclosure as defined by the appended claims, and are contemplated to cover any modifications, variations, combinations, or equivalents that fall within the scope of the present disclosure. Other aspects may be within the scope of the following claims. Finally, as used herein, the conjunction “or” refers to a non-exclusive “or,” unless specifically stated otherwise.