Title:
DEVICES, METHODS, AND GRAPHICAL USER INTERFACES FOR INTERACTING WITH VIRTUAL OBJECTS USING HAND GESTURES
Document Type and Number:
WIPO Patent Application WO/2023/154219
Kind Code:
A4
Abstract:
The present disclosure generally relates to interacting with virtual objects using hand gestures. In some embodiments, methods and user interfaces for navigating content using hand gestures are described. In some embodiments, methods and user interfaces for using hand gestures to perform various operations are described. In some embodiments, methods and user interfaces for activating virtual objects are described. In some embodiments, methods and user interfaces for displaying information are described. In some embodiments, methods and user interfaces for manipulating the display of virtual objects are described.

Inventors:
DREYER MYLENE E (US)
LU MARISA R (US)
MISSIG JULIAN K (US)
Application Number:
PCT/US2023/012260
Publication Date:
November 16, 2023
Filing Date:
February 03, 2023
Assignee:
APPLE INC (US)
International Classes:
G06F3/01
Attorney, Agent or Firm:
EIDE, Christopher B. (US)
Claims:
AMENDED CLAIMS received by the International Bureau on 28 September 2023 (28.09.2023)

1. A method, comprising: at a computer system that is in communication with a display generation component: while a hand-worn device is being worn by a user, displaying, via the display generation component, a respective user interface that includes a first portion of content and a second portion of the content that is different from the first portion of the content; while displaying the respective user interface that includes the first portion of the content and the second portion of the content, receiving an indication that the hand-worn device detected a hand input including a rotation of a hand; and in response to receiving the indication that the hand-worn device detected the hand input including the rotation of the hand: in accordance with a determination that the hand-worn device detected that the hand is clenched while the hand input was performed, navigating between the first portion of the content and the second portion of the content; and in accordance with a determination that the hand-worn device did not detect that the hand is clenched while the hand input was performed, forgoing navigating between the first portion of the content and the second portion of the content.
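
Illustrative sketch (not part of the claims): claim 1 describes a simple gating rule, and the following Swift sketch models it under invented names (HandInput, ContentNavigator); a rotation reported by the hand-worn device navigates between content portions only when the device also detected a clenched hand.

```swift
// Minimal sketch of the claim-1 gating logic; all names are illustrative,
// not from the patent or any Apple API.

struct HandInput {
    let rotationDegrees: Double  // signed rotation of the hand reported by the device
    let isHandClenched: Bool     // whether the hand was clenched while the input was performed
}

final class ContentNavigator {
    private(set) var focusedPortion = 0  // index of the currently focused content portion
    private let portionCount: Int

    init(portionCount: Int) {
        self.portionCount = portionCount
    }

    /// Handles an indication that the hand-worn device detected a hand input
    /// including a rotation of the hand.
    func handle(_ input: HandInput) {
        // Forgo navigating when the hand was not clenched during the input.
        guard input.isHandClenched else { return }
        // Navigate between portions; the direction follows the rotation's sign.
        let step = input.rotationDegrees >= 0 ? 1 : -1
        focusedPortion = min(max(focusedPortion + step, 0), portionCount - 1)
    }
}
```

Feeding this a clenched 30-degree rotation advances the focused portion; the same rotation without a clench leaves it unchanged, matching the "forgoing navigating" branch.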

2. The method of claim 1, wherein the computer system is in communication with one or more sensors that have a detectability range, and wherein the hand is not within the detectability range of the one or more sensors when the hand-worn device detects the hand input.

3. The method of any one of claims 1-2, wherein navigating between the first portion of the content and the second portion of the content includes visually emphasizing the second portion of the content relative to the first portion of the content, the method further comprising: while visually emphasizing the second portion of the content, receiving an indication that the hand-worn device detected that the hand is unclenched; and in response to receiving the indication that the hand-worn device detected that the hand is unclenched, performing an operation that corresponds to the second portion of the content.

4. The method of any one of claims 1-3, wherein navigating between the first portion of the content and the second portion of the content is based on a duration for which the hand-worn device detects a portion of the hand input.

5. The method of any one of claims 1-4, wherein navigating between the first portion of the content and the second portion of the content is based on a degree of rotation of the hand input.

6. The method of any one of claims 1-5, wherein navigating between the first portion of the content and the second portion of the content is based on a direction of rotation of the hand input.
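
Illustrative sketch (not part of the claims): claims 4-6 let navigation depend on the duration, degree, and direction of the rotation. One plausible mapping, assuming invented tuning constants, derives a signed scroll offset from all three.

```swift
/// Returns a signed scroll offset, in points, for a detected rotation input.
/// - Parameters:
///   - rotationDegrees: signed rotation; its sign supplies the direction (claim 6)
///     and its magnitude the degree of rotation (claim 5).
///   - heldSeconds: how long the device detected the input (claim 4).
func scrollOffset(rotationDegrees: Double, heldSeconds: Double) -> Double {
    let pointsPerDegree = 2.0                        // assumed tuning constant
    let durationBoost = 1.0 + min(heldSeconds, 3.0)  // longer holds scroll further, capped
    return rotationDegrees * pointsPerDegree * durationBoost
}
```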

7. The method of any one of claims 1-6, further comprising: in response to receiving the indication that the hand-worn device detected the hand input including the rotation of the hand and in accordance with a determination that the hand-worn device did not detect that the hand is clenched while the hand input was performed, displaying, via the display generation component, a first hand input virtual object that is representative of a clenched hand.

8. The method of claim 7, wherein displaying the first hand input virtual object includes displaying an animation of the first hand input virtual object changing from displaying a virtual object that represents a clenched hand position to displaying a virtual object that represents an unclenched hand position.

9. The method of any one of claims 1-8, further comprising: before receiving the indication that the hand-worn device detected the hand input including the rotation of the hand, receiving an indication that the hand-worn device detected the hand is clenched; and in response to receiving the indication that the hand-worn device detected the hand is clenched, displaying, via the display generation component, a second hand input virtual object representative of a clenched hand that is rotated.

10. The method of claim 9, wherein displaying the second hand input virtual object includes displaying an animation of the second hand input virtual object as changing between a first amount of rotation and a second amount of rotation that is different from the first amount of rotation.

11. The method of any one of claims 1-10, further comprising: before receiving the indication that the hand-worn device detected the hand input including the rotation of the hand, displaying a selection indicator virtual object at a first location on the respective user interface; and in response to receiving the indication that the hand-worn device detected the hand input including the rotation of the hand and in accordance with a determination that the hand-worn device detected that the hand was clenched while the hand input was performed, moving display of the selection indicator virtual object from the first location to a second location on the respective user interface, wherein display of the selection indicator virtual object is moved based on a degree of rotation of the hand.

12. The method of any one of claims 1-11, wherein navigating between the first portion of the content and the second portion of the content includes scrolling between the first portion of the content and the second portion of the content.

13. The method of any one of claims 1-12, wherein the first portion of the content and the second portion of the content are selectable.

14. The method of any one of claims 1-13, further comprising: while navigating between the first portion of the content and the second portion of the content, receiving an indication that the hand is unclenched; and in response to receiving the indication that the hand is unclenched, ceasing to navigate between the first portion of the content and the second portion of the content.

15. The method of any one of claims 1-14, further comprising: while navigating between the first portion of the content and the second portion of the content, displaying, via the display generation component, a third hand input virtual object representative of an unclenched hand.

16. The method of claim 15, wherein displaying the third hand input virtual object includes displaying an animation of the third hand input virtual object changing from a virtual object that represents a clenched hand position to a virtual object that represents an unclenched hand position.

17. The method of any one of claims 1-16, wherein the computer system is in communication with an external device, and wherein the external device causes display of a virtual object with a first visual appearance, the method further comprising: after receiving the indication that the hand-worn device detected the hand input including the rotation of the hand, receiving a second indication that the hand-worn device detected a fourth hand input including a second rotation of the hand while the hand is clenched, wherein the fourth hand input includes a first amount of rotation; in response to receiving the second indication that the hand-worn device detected the fourth hand input including the second rotation of the hand, transmitting instructions to the external device to cause the virtual object to be displayed with a second visual appearance different from the first visual appearance; while displaying the virtual object with the second visual appearance, receiving a third indication that the hand-worn device detected that the fourth hand input includes a second amount of rotation; and in response to receiving the third indication that the hand-worn device detected that the fourth hand input includes the second amount of rotation, transmitting instructions to the external device to cause the virtual object to be displayed with a third visual appearance different than the second visual appearance.
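
Illustrative sketch (not part of the claims): claim 17 steps an external device's virtual object through distinct visual appearances as the clenched-hand rotation accumulates. The enum and thresholds below are invented for illustration.

```swift
enum Appearance { case first, second, third }

/// Maps the accumulated rotation of the fourth hand input to the appearance
/// the external device is instructed to display (cutoffs are assumptions).
func appearance(forRotationDegrees degrees: Double) -> Appearance {
    switch degrees {
    case ..<30.0:     return .first   // below the first amount of rotation
    case 30.0..<60.0: return .second  // first amount of rotation reached
    default:          return .third   // second, larger amount of rotation reached
    }
}
```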

18. The method of any one of claims 1-17, wherein the respective user interface is a media user interface.

19. The method of any one of claims 1-18, wherein the respective user interface is an augmented reality user interface.

20. The method of any one of claims 1-19, further comprising: after receiving the indication that the hand-worn device detected the hand input including the rotation of the hand, receiving an indication that the hand-worn device detected a respective hand input; and in response to receiving the indication that the hand-worn device detected the respective hand input, performing a second operation.

21. The method of claim 20, wherein displaying the respective user interface includes displaying a playback of a video, wherein the respective hand input includes a clench and roll input, and wherein performing the second operation includes pausing the playback of the video.

22. The method of claim 20, wherein the respective hand input includes a clench and roll input, and wherein performing the second operation includes scrolling between the first portion of the content and the second portion of the content.

23. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for performing the method of any one of claims 1-22.

24. A computer system that is configured to communicate with a display generation component, the computer system comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any one of claims 1-22.

25. A computer system that is configured to communicate with a display generation component, comprising: means for performing the method of any one of claims 1-22.

26. A computer program product, comprising one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for performing the method of any one of claims 1-22.

27. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for: while a hand-worn device is being worn by a user, displaying, via the display generation component, a respective user interface that includes a first portion of content and a second portion of the content that is different from the first portion of the content; while displaying the respective user interface that includes the first portion of the content and the second portion of the content, receiving an indication that the hand-worn device detected a hand input including a rotation of a hand; and in response to receiving the indication that the hand-worn device detected the hand input including the rotation of the hand: in accordance with a determination that the hand-worn device detected that the hand is clenched while the hand input was performed, navigating between the first portion of the content and the second portion of the content; and in accordance with a determination that the hand-worn device did not detect that the hand is clenched while the hand input was performed, forgoing navigating between the first portion of the content and the second portion of the content.

28. A computer system configured to communicate with a display generation component, comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: while a hand-worn device is being worn by a user, displaying, via the display generation component, a respective user interface that includes a first portion of content and a second portion of the content that is different from the first portion of the content; while displaying the respective user interface that includes the first portion of the content and the second portion of the content, receiving an indication that the hand-worn device detected a hand input including a rotation of a hand; and in response to receiving the indication that the hand-worn device detected the hand input including the rotation of the hand: in accordance with a determination that the hand-worn device detected that the hand is clenched while the hand input was performed, navigating between the first portion of the content and the second portion of the content; and in accordance with a determination that the hand-worn device did not detect that the hand is clenched while the hand input was performed, forgoing navigating between the first portion of the content and the second portion of the content.

29. A computer system configured to communicate with a display generation component, comprising: means, while a hand-worn device is being worn by a user, for displaying, via the display generation component, a respective user interface that includes a first portion of content and a second portion of the content that is different from the first portion of the content; means, while displaying the respective user interface that includes the first portion of the content and the second portion of the content, for receiving an indication that the hand-worn device detected a hand input including a rotation of a hand; and means, responsive to receiving the indication that the hand-worn device detected the hand input including the rotation of the hand, for: in accordance with a determination that the hand-worn device detected that the hand is clenched while the hand input was performed, navigating between the first portion of the content and the second portion of the content; and in accordance with a determination that the hand-worn device did not detect that the hand is clenched while the hand input was performed, forgoing navigating between the first portion of the content and the second portion of the content.

30. A method, comprising: at a computer system that is in communication with a display generation component: displaying, via the display generation component, a respective user interface, wherein displaying the respective user interface includes concurrently displaying: a first control virtual object that, when activated with a first type of input, causes the computer system to perform a first operation; a second control virtual object that, when activated with the first type of input, causes the computer system to perform a second operation different from the first operation; and a first virtual object indicating that the first control virtual object can be activated in response to a second type of input being performed, wherein the second type of input is not directed to a location in the respective user interface; while displaying the first control virtual object, receiving an indication that a respective input has been performed; and in response to receiving the indication that the respective input has been performed: in accordance with a determination that the respective input is the first type of input directed to the location that corresponds to the first control virtual object, initiating a process for performing the first operation; in accordance with a determination that the respective input is the first type of input directed to a location that corresponds to the second control virtual object, initiating a process for performing the second operation; and in accordance with a determination that the respective input is the second type of input, initiating the process for performing the first operation.
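
Illustrative sketch (not part of the claims): claim 30's dispatch can be modeled, with invented names, as follows. A location-directed "first type" input activates whichever control it lands on, while the location-independent "second type" input (the one the first virtual object advertises) always initiates the first operation.

```swift
enum ControlID { case firstControl, secondControl }

enum RespectiveInput {
    case firstType(location: ControlID)  // e.g. a tap directed at a control's location
    case secondType                      // e.g. a gesture not directed at any location
}

func handle(_ input: RespectiveInput,
            firstOperation: () -> Void,
            secondOperation: () -> Void) {
    switch input {
    case .firstType(let location):
        // The first type of input activates the control at its location.
        if location == .firstControl { firstOperation() } else { secondOperation() }
    case .secondType:
        // The second type of input always initiates the first operation.
        firstOperation()
    }
}
```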

31. The method of claim 30, further comprising: while displaying the respective user interface, displaying an animation that alternates between displaying the first virtual object and displaying a second virtual object, wherein the second virtual object indicates that the second control virtual object can be activated in response to a third type of input, and wherein the second type of input is different from the third type of input.

32. The method of claim 31, wherein alternating between the display of the first virtual object and the display of the second virtual object is performed on the basis of time.

33. The method of claim 31, wherein alternating between the display of the first virtual object and the display of the second virtual object is performed based on the detection of a first input.

34. The method of any one of claims 30-33, wherein displaying the first virtual object includes displaying a second animation that alternates between a first location that corresponds to the first control virtual object and a second location that corresponds to the second control virtual object.

35. The method of any one of claims 30-34, further comprising: while displaying the respective user interface, displaying, via the display generation component, a third virtual object that indicates that the second operation can be activated in response to a fourth type of input being performed, wherein the fourth type of input is not directed to a location in the respective user interface.

36. The method of any one of claims 30-35, wherein the first control virtual object corresponds to a first default control option.

37. The method of any one of claims 30-36, wherein displaying the first virtual object includes visually emphasizing the first control virtual object.

38. The method of any one of claims 30-37, wherein the respective user interface corresponds to a first application, and wherein the first control virtual object is a second default virtual object and is displayed with a first color scheme, the method further comprising: after receiving the indication that the respective input has been performed, receiving a request to display a second respective user interface; and in response to receiving the request to display the second respective user interface, displaying, via the display generation component, the second respective user interface, wherein the second respective user interface corresponds to a second application that is different from the first application, and wherein displaying the second respective user interface includes displaying a third control virtual object, wherein the third control virtual object is the second default virtual object and is displayed with a second color scheme that is different than the first color scheme.

39. The method of any one of claims 30-38, wherein displaying the first virtual object includes displaying a graphical representation of the second type of input.

40. The method of any one of claims 30-39, wherein displaying the first virtual object includes displaying an animation representative of the second type of input.

41. The method of any one of claims 30-40, wherein the first virtual object is displayed adjacent to the first control virtual object.

42. The method of any one of claims 30-41, wherein the first type of input is a tap.

43. The method of any one of claims 30-42, wherein displaying the respective user interface includes displaying a first complication with a first set of information, the method further comprising: after receiving the indication that the respective input has been performed, receiving an indication that a second respective input has been performed; and in response to receiving the indication that the second respective input has been performed, updating the display of the first complication to include a second set of information that is different from the first set of information.

44. The method of any one of claims 30-43, wherein: in accordance with a determination that the first control virtual object is in focus, displaying the first virtual object includes displaying the first virtual object adjacent to the first control virtual object; and in accordance with a determination that the second control virtual object is in focus, displaying the first virtual object includes displaying the first virtual object adjacent to the second control virtual object.

45. The method of any one of claims 30-44, wherein the respective user interface is displayed in response to receiving a request to connect to an external device, wherein: in accordance with a determination that the respective input is the second type of input and in accordance with a determination that the respective input is being performed, initiating the process for performing the first operation includes connecting to the external device.

46. The method of any one of claims 30-45, wherein the first control virtual object is displayed at a third location on the respective user interface and the second control virtual object is displayed at a fourth location on the respective user interface, the method further comprising: while displaying the first control virtual object at the third location on the respective user interface and the second control virtual object at the fourth location on the respective user interface, receiving an indication that a clench gesture has been performed; in response to receiving the indication that the clench gesture has been performed, displaying the first control virtual object at a fifth location on the respective user interface and displaying the second control virtual object at a sixth location on the respective user interface; and while displaying the first control virtual object at the fifth location on the respective user interface and displaying the second control virtual object at the sixth location on the respective user interface, receiving an indication that the computer system has been tilted: in accordance with a determination that the computer system has been tilted in a first direction, displaying the first control virtual object at the third location on the respective user interface; and in accordance with a determination that the computer system has been tilted in a second direction, different from the first direction, displaying the second control virtual object at the fourth location on the respective user interface.

47. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for performing the method of any one of claims 30-46.

48. A computer system that is configured to communicate with a display generation component, the computer system comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any one of claims 30-46.

49. A computer system that is configured to communicate with a display generation component, comprising: means for performing the method of any one of claims 30-46.

50. A computer program product, comprising one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for performing the method of any one of claims 30-46.

51. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for: displaying, via the display generation component, a respective user interface, wherein displaying the respective user interface includes concurrently displaying: a first control virtual object that, when activated with a first type of input, causes the computer system to perform a first operation; a second control virtual object that, when activated with the first type of input, causes the computer system to perform a second operation different from the first operation; and a first virtual object indicating that the first control virtual object can be activated in response to a second type of input being performed, wherein the second type of input is not directed to a location in the respective user interface; while displaying the first control virtual object, receiving an indication that a respective input has been performed; and in response to receiving the indication that the respective input has been performed: in accordance with a determination that the respective input is the first type of input directed to the location that corresponds to the first control virtual object, initiating a process for performing the first operation; in accordance with a determination that the respective input is the first type of input directed to a location that corresponds to the second control virtual object, initiating a process for performing the second operation; and in accordance with a determination that the respective input is the second type of input, initiating the process for performing the first operation.

52. A computer system configured to communicate with a display generation component, comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: displaying, via the display generation component, a respective user interface, wherein displaying the respective user interface includes concurrently displaying: a first control virtual object that, when activated with a first type of input, causes the computer system to perform a first operation; a second control virtual object that, when activated with the first type of input, causes the computer system to perform a second operation different from the first operation; and a first virtual object indicating that the first control virtual object can be activated in response to a second type of input being performed, wherein the second type of input is not directed to a location in the respective user interface; while displaying the first control virtual object, receiving an indication that a respective input has been performed; and in response to receiving the indication that the respective input has been performed: in accordance with a determination that the respective input is the first type of input directed to the location that corresponds to the first control virtual object, initiating a process for performing the first operation; in accordance with a determination that the respective input is the first type of input directed to a location that corresponds to the second control virtual object, initiating a process for performing the second operation; and in accordance with a determination that the respective input is the second type of input, initiating the process for performing the first operation.

53. A computer system configured to communicate with a display generation component, comprising: means for displaying, via the display generation component, a respective user interface, wherein displaying the respective user interface includes concurrently displaying: a first control virtual object that, when activated with a first type of input, causes the computer system to perform a first operation; a second control virtual object that, when activated with the first type of input, causes the computer system to perform a second operation different from the first operation; and a first virtual object indicating that the first control virtual object can be activated in response to a second type of input being performed, wherein the second type of input is not directed to a location in the respective user interface; means, while displaying the first control virtual object, for receiving an indication that a respective input has been performed; and means, responsive to receiving the indication that the respective input has been performed, for: in accordance with a determination that the respective input is the first type of input directed to the location that corresponds to the first control virtual object, initiating a process for performing the first operation; in accordance with a determination that the respective input is the first type of input directed to a location that corresponds to the second control virtual object, initiating a process for performing the second operation; and in accordance with a determination that the respective input is the second type of input, initiating the process for performing the first operation.

54. A computer program product, comprising one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for: displaying, via the display generation component, a respective user interface, wherein displaying the respective user interface includes concurrently displaying: a first control virtual object that, when activated with a first type of input, causes the computer system to perform a first operation; a second control virtual object that, when activated with the first type of input, causes the computer system to perform a second operation different from the first operation; and a first virtual object indicating that the first control virtual object can be activated in response to a second type of input being performed, wherein the second type of input is not directed to a location in the respective user interface; while displaying the first control virtual object, receiving an indication that a respective input has been performed; and in response to receiving the indication that the respective input has been performed: in accordance with a determination that the respective input is the first type of input directed to the location that corresponds to the first control virtual object, initiating a process for performing the first operation; in accordance with a determination that the respective input is the first type of input directed to a location that corresponds to the second control virtual object, initiating a process for performing the second operation; and in accordance with a determination that the respective input is the second type of input, initiating the process for performing the first operation.

55. A method, comprising: at a computer system that is in communication with a display generation component: while displaying, via the display generation component, an extended reality environment that includes a virtual object that obscures at least a first portion of a physical environment that includes a wearable device, receiving an indication that a first hand input was performed by a hand on which the wearable device is being worn, wherein the first hand input includes movement of one or more digits of the hand relative to a portion of the hand; and in response to receiving the indication that the first hand input has been performed by the hand on which the wearable device is being worn, displaying, via the display generation component, information about the wearable device.
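
Illustrative sketch (not part of the claims): claim 55 can be modeled, under invented names, as state kept by an extended-reality scene. A digit-movement gesture on the hand wearing the device surfaces information about the wearable, optionally by revealing the obscured region (claims 56-57).

```swift
struct ExtendedRealityScene {
    var wearableInfoShown = false       // information about the wearable device
    var obscuredRegionRevealed = false  // the first portion of the physical environment

    /// Called on an indication that the wearing hand moved one or more digits
    /// relative to a portion of the hand (e.g. a double clench gesture, claim 66).
    mutating func handleFirstHandInput() {
        wearableInfoShown = true        // display information about the wearable
        obscuredRegionRevealed = true   // cease displaying the occluding portion
                                        // of the virtual object (claim 56)
    }
}
```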

56. The method of claim 55, wherein displaying the information about the wearable device includes ceasing to display at least a portion of the virtual object.

57. The method of claim 56, wherein ceasing to display the portion of the virtual object causes the first portion of the physical environment to be visible.

58. The method of any one of claims 56-57, wherein, before ceasing to display the portion of the virtual object, a second portion of the physical environment, that is different than the first portion of the physical environment, is not visible, and wherein after ceasing to display the portion of the virtual object, the second portion of the physical environment is not visible.

59. The method of any one of claims 55-58, further comprising: while displaying the information about the wearable device, receiving an indication of a location of the hand on which the wearable device is being worn; and in response to receiving the indication of the location of the hand on which the wearable device is being worn: in accordance with a determination that the location of the hand on which the wearable device is being worn is at a first location, displaying the information about the wearable device includes displaying the information at a second location in the extended reality environment that corresponds to the first location; and in accordance with a determination that the location of the hand on which the wearable device is being worn is at a third location, displaying the information about the wearable device includes displaying the information at a fourth location in the extended reality environment that corresponds to the third location.

60. The method of claim 59, wherein: in accordance with a determination that the location of the hand on which the wearable device is being worn is at the first location, displaying the information includes displaying the information as changing from a first size to a second size over a first amount of time at the second location in the extended reality environment; and in accordance with a determination that the location of the hand on which the wearable device is being worn is at the third location, displaying the information includes displaying the information as changing from the first size to the second size over the first amount of time at the fourth location in the extended reality environment.

61. The method of any one of claims 59-60, wherein the information about the wearable device is a virtual representation of the wearable device.

62. The method of any one of claims 55-61, wherein the information includes one or more notifications that correspond to the wearable device.

63. The method of claim 62, wherein the information is displayed on a representation of a wrist of the hand on which the wearable device is being worn.

64. The method of any one of claims 62-63, further comprising: in response to receiving the indication that the hand input was performed by the hand on which the wearable device is worn, displaying a control user interface, wherein the information is displayed within the control user interface.

65. The method of any one of claims 62-64, wherein the extended reality environment includes a virtual representation of the hand on which the wearable device is being worn, and wherein displaying the information includes displaying the information on the virtual representation of the hand, the method further comprising: while displaying the information on the virtual representation of the hand, receiving an indication that a second hand input has been performed; and in response to receiving the indication that the second hand input has been performed: ceasing to display the information on the virtual representation of the hand on which the wearable device is being worn; and maintaining display of the virtual representation of the hand on which the wearable device is being worn.

66. The method of any one of claims 55-65, wherein the first hand input includes a double clench gesture.

67. The method of any one of claims 55-66, further comprising: while displaying the information about the wearable device, receiving an indication that a third hand input has been detected by the wearable device; and in response to receiving the indication that the third hand input was detected by the wearable device, ceasing to display the information about the wearable device.

68. The method of claim 67, wherein the first hand input is a first type of gesture, and wherein the third hand input is the first type of gesture.

69. The method of claim 67, wherein the first hand input is a second type of gesture, and wherein the third hand input is a third type of gesture that is different from the second type of gesture.

70. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for performing the method of any one of claims 55-69.

71. A computer system that is configured to communicate with a display generation component, the computer system comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any one of claims 55-69.

72. A computer system that is configured to communicate with a display generation component, comprising: means for performing the method of any one of claims 55-69.

73. A computer program product, comprising one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for performing the method of any one of claims 55-69.

74. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for: while displaying, via the display generation component, an extended reality environment that includes a virtual object that obscures at least a first portion of a physical environment that includes a wearable device, receiving an indication that a first hand input was performed by a hand on which the wearable device is being worn, wherein the first hand input includes movement of one or more digits of the hand relative to a portion of the hand; and in response to receiving the indication that the first hand input has been performed by the hand on which the wearable device is being worn, displaying, via the display generation component, information about the wearable device.

75. A computer system configured to communicate with a display generation component, comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: while displaying, via the display generation component, an extended reality environment that includes a virtual object that obscures at least a first portion of a physical environment that includes a wearable device, receiving an indication that a first hand input was performed by a hand on which the wearable device is being worn, wherein the first hand input includes movement of one or more digits of the hand relative to a portion of the hand; and in response to receiving the indication that the first hand input has been performed by the hand on which the wearable device is being worn, displaying, via the display generation component, information about the wearable device.

76. A computer system configured to communicate with a display generation component, comprising: means, while displaying, via the display generation component, an extended reality environment that includes a virtual object that obscures at least a first portion of a physical environment that includes a wearable device, for receiving an indication that a first hand input was performed by a hand on which the wearable device is being worn, wherein the first hand input includes movement of one or more digits of the hand relative to a portion of the hand; and means, responsive to receiving the indication that the first hand input has been performed by the hand on which the wearable device is being worn, for displaying, via the display generation component, information about the wearable device.

77. A computer program product, comprising one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component, the one or more programs including instructions for: while displaying, via the display generation component, an extended reality environment that includes a virtual object that obscures at least a first portion of a physical environment that includes a wearable device, receiving an indication that a first hand input was performed by a hand on which the wearable device is being worn, wherein the first hand input includes movement of one or more digits of the hand relative to a portion of the hand; and in response to receiving the indication that the first hand input has been performed by the hand on which the wearable device is being worn, displaying, via the display generation component, information about the wearable device.

78. A method, comprising: at a computer system that is in communication with a display generation component and a wearable device: while displaying, via the display generation component, an augmented reality environment user interface, receiving an indication that a first hand input was performed by a first hand of the user; and in response to receiving the indication that the first hand input was performed by the first hand of the user: in accordance with a determination that the first hand input was performed while a second hand input was being performed by a second hand of the user, wherein the second hand of the user is different from the first hand, performing a first operation; and in accordance with a determination that the first hand input was performed while the second hand input was not being performed by the second hand of the user, forgoing performing the first operation.
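
Illustrative sketch (not part of the claims): claim 78 describes a two-handed chord, which the following sketch models with invented names. An input from the first hand performs the first operation only while the second hand is holding its own input (per claim 92, one hand may be camera-tracked and the other tracked by the wearable's sensors).

```swift
final class TwoHandGestureGate {
    var secondHandInputActive = false  // updated from the wearable device's sensors

    /// Handles an indication that the first hand performed the first hand input.
    func firstHandInputPerformed(firstOperation: () -> Void,
                                 secondOperation: () -> Void) {
        if secondHandInputActive {
            firstOperation()   // both hands active: perform the first operation
        } else {
            secondOperation()  // otherwise forgo it (claims 79-80: e.g. a selection)
        }
    }
}
```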

79. The method of claim 78, further comprising: in response to receiving the indication that the first hand input was performed by the first hand of the user and in accordance with a determination that the first hand input was performed while the second hand input was not being performed by the second hand of the user, performing a second operation.

80. The method of claim 79, wherein the second operation is a selection operation.

81. The method of any one of claims 78-80, wherein performing the first operation includes displaying, via the display generation component, a context menu, and wherein the first hand input includes an air tap gesture or a pinch gesture.

82. The method of any one of claims 78-81, wherein the first hand input includes moving the first hand of the user in a first direction in the physical environment, and wherein performing the first operation includes dragging a virtual object that is displayed in the augmented reality environment user interface from a first location to a second location.

83. The method of any one of claims 78-82, wherein displaying the augmented reality environment user interface includes displaying a first virtual object and a second virtual object, and wherein: in accordance with a determination that a user’s attention is directed towards the first virtual object, performing the first operation includes performing the first operation on the first virtual object; and in accordance with a determination that the user’s attention is directed towards the second virtual object, performing the first operation includes performing the first operation on the second virtual object.

84. The method of any one of claims 78-83, wherein the first hand input includes a clench gesture, and wherein performing the first operation includes selecting a third virtual object that is displayed within the augmented reality environment user interface, the method further comprising: while the third virtual object is selected and while the first hand of the user is clenched, receiving an indication that the first hand of the user has performed a third hand input; and in response to receiving the indication that the first hand of the user has performed the third hand input, deselecting the third virtual object.

85. The method of any one of claims 78-84, wherein displaying the augmented reality environment user interface includes displaying a hand input virtual object, wherein: while displaying the augmented reality environment user interface and in accordance with a determination that the second hand of the user has performed a clench gesture, displaying the hand input virtual object includes displaying the hand input virtual object with a first visual appearance; and while displaying the augmented reality environment user interface and in accordance with a determination that the second hand input was not performed by the second hand of the user, displaying the hand input virtual object includes displaying the hand input virtual object with a second visual appearance that is different from the first visual appearance.

86. The method of any one of claims 78-85, further comprising: after receiving the indication that the first hand input was performed by the first hand of the user, receiving an indication that a fourth hand input was performed by the second hand of the user; and in response to receiving the indication that the fourth hand input was performed by the second hand of the user and in accordance with a determination that the fourth hand input was performed while the first hand was not performing a hand input, performing a third operation.

87. The method of claim 86, wherein performing the third operation includes displaying a multitasking user interface.

88. The method of claim 86, wherein performing the third operation includes displaying a media player user interface.

89. The method of claim 86, wherein performing the third operation includes displaying a plurality of selectable tool option virtual objects, wherein the plurality of selectable tool option virtual objects can be selected using the first hand of the user.

90. The method of claim 89, further comprising: while displaying the plurality of selectable tool option virtual objects, receiving an indication that a fifth hand input was performed by the first hand of the user; and in response to receiving the indication that the fifth hand input was performed by the first hand of the user, selecting one or more tool options of the plurality of selectable tool option virtual objects.

91. The method of claim 89, further comprising: while displaying the plurality of selectable tool option virtual objects and while the fourth hand input is performed by the second hand of the user, receiving an indication that a sixth hand input was performed by the second hand of the user; and in response to receiving the indication that the sixth hand input was performed by the second hand of the user, ceasing display of the plurality of selectable tool option virtual objects.

92. The method of any one of claims 78-91, wherein the first hand of the user is tracked with one or more cameras, and wherein the second hand of the user is tracked with one or more sensors that are integrated into the wearable device.

93. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and a wearable device, the one or more programs including instructions for performing the method of any one of claims 78-92.

94. A computer system that is configured to communicate with a display generation component and a wearable device, the computer system comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any one of claims 78-92.

95. A computer system that is configured to communicate with a display generation component and a wearable device, comprising: means for performing the method of any one of claims 78-92.

96. A computer program product, comprising one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and a wearable device, the one or more programs including instructions for performing the method of any one of claims 78-92.

97. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and a wearable device, the one or more programs including instructions for: while displaying, via the display generation component, an augmented reality environment user interface, receiving an indication that a first hand input was performed by a first hand of the user; and in response to receiving the indication that the first hand input was performed by the first hand of the user: in accordance with a determination that the first hand input was performed while a second hand input was being performed by a second hand of the user, wherein the second hand of the user is different from the first hand, performing a first operation; and in accordance with a determination that the first hand input was performed while the second hand input was not being performed by the second hand of the user, forgoing performing the first operation.

98. A computer system configured to communicate with a display generation component and a wearable device, comprising: one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: while displaying, via the display generation component, an augmented reality environment user interface, receiving an indication that a first hand input was performed by a first hand of the user; and in response to receiving the indication that the first hand input was performed by the first hand of the user: in accordance with a determination that the first hand input was performed while a second hand input was being performed by a second hand of the user, wherein the second hand of the user is different from the first hand, performing a first operation; and in accordance with a determination that the first hand input was performed while the second hand input was not being performed by the second hand of the user, forgoing performing the first operation.

99. A computer system configured to communicate with a display generation component and a wearable device, comprising: means, while displaying, via the display generation component, an augmented reality environment user interface, for receiving an indication that a first hand input was performed by a first hand of the user; and means, responsive to receiving the indication that the first hand input was performed by the first hand of the user, for: in accordance with a determination that the first hand input was performed while a second hand input was being performed by a second hand of the user, wherein the second hand of the user is different from the first hand, performing a first operation; and in accordance with a determination that the first hand input was performed while the second hand input was not being performed by the second hand of the user, forgoing performing the first operation.

100. A computer program product, comprising one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and a wearable device, the one or more programs including instructions for: while displaying, via the display generation component, an augmented reality environment user interface, receiving an indication that a first hand input was performed by a first hand of the user; and in response to receiving the indication that the first hand input was performed by the first hand of the user: in accordance with a determination that the first hand input was performed while a second hand input was being performed by a second hand of the user, wherein the second hand of the user is different from the first hand, performing a first operation; and in accordance with a determination that the first hand input was performed while the second hand input was not being performed by the second hand of the user, forgoing performing the first operation.
