

Title:
WEARABLE DEVICE DON/DOFF DETERMINATION
Document Type and Number:
WIPO Patent Application WO/2024/059680
Kind Code:
A1
Abstract:
A device may determine, using first sensor data, that a head mounted wearable device is in a non-resting state. A device may increase an activation state of a second sensor to an increased activation level in response to determining that the head mounted wearable device is in the non-resting state. A device may receive second sensor data from the second sensor at the increased activation level. A device may determine, using the second sensor data, that the head mounted wearable device is in a head-mounted state. A device may, in response to determining that the head mounted wearable device is in the head-mounted state, increase an operational mode of the head mounted wearable device to an increased operational mode, wherein determining that the head mounted wearable device is in a first of the non-resting state or the head-mounted state includes executing a model.

Inventors:
SHIN DONGEEK (US)
COLACO ANDREA (US)
KOWDLE ADARSH PRAKASH MURTHY (US)
ZYSKOWSKI JAMIE ALEXANDER (US)
HU JINGYING (US)
Application Number:
PCT/US2023/074140
Publication Date:
March 21, 2024
Filing Date:
September 14, 2023
Assignee:
GOOGLE LLC (US)
International Classes:
G06F1/16; G06F1/32
Domestic Patent References:
WO2016020768A12016-02-11
Foreign References:
US20180254019A12018-09-06
US20170280394A12017-09-28
US20040104864A12004-06-03
US9294739B12016-03-22
US20170344123A12017-11-30
US20160025971A12016-01-28
US20140375545A12014-12-25
Attorney, Agent or Firm:
SOMAUROO, Tawnya Ferbiak et al. (US)
Claims:
WHAT IS CLAIMED IS:

1. A head mounted wearable device, comprising: a head mounted wearable device body; a first sensor positioned in the head mounted wearable device body; a second sensor positioned in the head mounted wearable device body; and a processor and a memory configured with instructions to: receive first sensor data from the first sensor; determine, using the first sensor data, that the head mounted wearable device is in a non-resting state; increase an activation state of the second sensor to an increased activation level in response to determining that the head mounted wearable device is in the non-resting state; receive second sensor data from the second sensor at the increased activation level; determine, using the second sensor data, that the head mounted wearable device is in a head-mounted state; and in response to determining that the head mounted wearable device is in the head-mounted state, increase an operational mode of the head mounted wearable device to an increased operational mode, wherein determining that the head mounted wearable device is in a first of the non-resting state or the head-mounted state includes executing a model.

2. The head mounted wearable device of claim 1, wherein the model is a first model and determining that the head mounted wearable device is in a second of the non-resting state or the head-mounted state includes executing a second model.

3. The head mounted wearable device of claims 1 or 2, wherein the model comprises a multi-channel support vector machine.

4. The head mounted wearable device of any of claims 1 to 3, wherein increasing the operational mode of the head mounted wearable device to an increased operational mode further comprises powering on an additional component of the head mounted wearable device.

5. The head mounted wearable device of any of claims 1 to 4, wherein determining that the head mounted wearable device is in the first of the non-resting state or the head-mounted state further comprises: generating an activity rating of the head mounted wearable device based on a sensor data; and determining that the activity rating is over a threshold.

6. The head mounted wearable device of any of claims 1 to 5, wherein determining that the head mounted wearable device is in the non-resting state further comprises using the second sensor at a decreased activation level as input to the model.

7. The head mounted wearable device of any of claims 1 to 6, wherein: the first sensor comprises a capacitive sensor; and the second sensor comprises at least one of an inertial measurement unit, an eye tracking camera, or an ultrasonic sensor.

8. The head mounted wearable device of any of claims 1 to 7, wherein the second sensor consumes a higher level of power than the first sensor.

9. The head mounted wearable device of any of claims 1 to 8, wherein the processor is a first processor and the head mounted wearable device further comprises: a second processor, wherein the first processor is configured to determine that the head mounted wearable device is in the non-resting state and the second processor is configured to determine that the head mounted wearable device is in the head-mounted state.

10. A computer readable medium having stored therein instructions that, when executed by a processor, cause the processor to perform functions comprising: receiving first sensor data from a first sensor; determining, using the first sensor data, that a head mounted wearable device is in a non-resting state; increasing an activation state of a second sensor to an increased activation level in response to determining that the head mounted wearable device is in the non-resting state; receiving second sensor data from the second sensor at the increased activation level; determining, using the second sensor data, that the head mounted wearable device is in a head-mounted state; and in response to determining that the head mounted wearable device is in the head-mounted state, increasing an operational mode of the head mounted wearable device to an increased operational mode, wherein determining that the head mounted wearable device is in a first of the non-resting state or the head-mounted state includes executing a model.

11. The computer readable medium of claim 10, wherein the model is a first model and determining that the head mounted wearable device is in a second of the non-resting state or the head-mounted state includes executing a second model.

12. The computer readable medium of claims 10 or 11, wherein the model comprises a multi-channel support vector machine.

13. The computer readable medium of any of claims 10 to 12, wherein increasing the operational mode of the head mounted wearable device to an increased operational mode further comprises powering on an additional component of the head mounted wearable device.

14. The computer readable medium of any of claims 10 to 13, wherein determining that the head mounted wearable device is in the first of the non-resting state or the head-mounted state further comprises: generating an activity rating of the head mounted wearable device based on a sensor data; and determining that the activity rating is over a threshold.

15. The computer readable medium of any of claims 10 to 14, wherein determining that the head mounted wearable device is in the non-resting state further comprises using the second sensor at a decreased activation level as input to the model.

16. The computer readable medium of any of claims 10 to 15, wherein: the first sensor comprises a capacitive sensor; and the second sensor comprises at least one of an inertial measurement unit, an eye tracking camera, or an ultrasonic sensor.

17. A method comprising: receiving first sensor data from a first sensor; determining, using the first sensor data, that a head mounted wearable device is in a non-resting state; increasing an activation state of a second sensor to an increased activation level in response to determining that the head mounted wearable device is in the non-resting state; receiving second sensor data from the second sensor at the increased activation level; determining, using the second sensor data, that the head mounted wearable device is in a head-mounted state; and in response to determining that the head mounted wearable device is in the head-mounted state, increasing an operational mode of the head mounted wearable device to an increased operational mode, wherein determining that the head mounted wearable device is in a first of the non-resting state or the head-mounted state includes executing a model.

18. The method of claim 17, wherein the model is a first model and determining that the head mounted wearable device is in a second of the non-resting state or the head-mounted state includes executing a second model.

19. The method of claims 17 or 18, wherein the model comprises a multi-channel support vector machine.

20. The method of any of claims 17 to 19, wherein increasing the operational mode of the head mounted wearable device to an increased operational mode further comprises powering on an additional component of the head mounted wearable device.

21. The method of any of claims 17 to 20, wherein determining that the head mounted wearable device is in the first of the non-resting state or the head-mounted state further comprises: generating an activity rating of the head mounted wearable device based on a sensor data; and determining that the activity rating is over a threshold.

22. The method of any of claims 17 to 21, wherein determining that the head mounted wearable device is in the non-resting state further comprises using the second sensor at a decreased activation level as input to the model.

23. The method of any of claims 17 to 22, wherein: the first sensor comprises a capacitive sensor; and the second sensor comprises at least one of an inertial measurement unit, an eye tracking camera, or an ultrasonic sensor.

Description:
WEARABLE DEVICE DON/DOFF DETERMINATION

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application also claims priority to U.S. Provisional Patent Application No. 63/375,814, filed on September 15, 2022, entitled “HEAD MOUNTED WEARABLE DEVICE DON/DOFF DETERMINATION”, the disclosure of which is incorporated by reference herein in its entirety.

TECHNICAL FIELD

[0002] This description relates to a system and techniques for a body-mounted wearable device DON (device ON) determination.

BACKGROUND

[0003] For any body-wearable device, it is desirable to know when the wearable device is being worn versus when the wearable device is not being worn. Determining the device ON (DON) / device OFF (DOFF) state may help determine when to power or activate parts of the device, which can affect power consumption, battery life, and user experience.

SUMMARY

[0004] Systems and techniques are described for transitioning a wearable device from a DOFF state (for example, resting on a table) to a DON state (for example, head mounted) in response to data received from a first sensor and a second sensor. The first sensor initially detects that the wearable device is in a non-resting state and increases the activation level of the second sensor. Then, the second sensor is used to determine that the wearable device is in a body-mounted state. In examples, a first model may determine that a wearable device is in a non-resting state based on first sensor data and/or a second model may determine that a wearable device is in a head-mounted state based on second sensor data. The first sensor may be a lower power sensor than the second sensor. In examples, the wearable device can be smart glasses, earbuds, over-ear headphones, goggles (e.g., virtual reality (VR) goggles or augmented reality (AR) goggles), a smart watch, or any other body-wearable electronic device.

[0005] In some aspects, the techniques described herein relate to a method including: receiving first sensor data from a first sensor; determining, using the first sensor data, that a head mounted wearable device is in a non-resting state; increasing an activation state of a second sensor to an increased activation level in response to determining that the head mounted wearable device is in the non-resting state; receiving second sensor data from the second sensor at the increased activation level; determining, using the second sensor data, that the head mounted wearable device is in a head-mounted state; and in response to determining that the head mounted wearable device is in the head-mounted state, increasing an operational mode of the head mounted wearable device to an increased operational mode, wherein determining that the head mounted wearable device is in a first of the non-resting state or the head-mounted state includes executing a model.

[0006] In some aspects, the techniques described herein relate to a head mounted wearable device, including: a head mounted wearable device body; a first sensor positioned in the head mounted wearable device body; a second sensor positioned in the head mounted wearable device body; and a processor and a memory configured with instructions to: receive first sensor data from a first sensor; determine, using the first sensor data, that a head mounted wearable device is in a non-resting state; increase an activation state of a second sensor to an increased activation level in response to determining that the head mounted wearable device is in the non-resting state; receive second sensor data from the second sensor at the increased activation level; determine, using the second sensor data, that the head mounted wearable device is in a head-mounted state; and in response to determining that the head mounted wearable device is in the head-mounted state, increase an operational mode of the head mounted wearable device to an increased operational mode, wherein determining that the head mounted wearable device is in a first of the non-resting state or the head-mounted state includes executing a model.

[0007] The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] FIG. 1A depicts a head mounted wearable device worn by a user, according to examples.

[0009] FIG. 1B depicts a front view, and FIG. 1C depicts a rear view of the head mounted wearable device shown in FIG. 1A, according to examples.

[0010] FIG. 1D is a block diagram of a head mounted wearable device, according to examples.

[0011] FIG. 2A depicts a flow diagram, according to examples.

[0012] FIG. 2B depicts a state diagram for DON/DOFF determination, according to examples.

[0013] FIG. 3 depicts a flowchart of a method 300, according to examples.

DETAILED DESCRIPTION

[0014] This disclosure relates to systems and techniques for determining when a body-mounted wearable device is being worn on the body of a user. Whether the body-mounted wearable device is being worn by the user may be referred to as its DON/DOFF state. For example, when the wearable device is in a DON state, the wearable device is being worn by a user. In a DOFF state, the wearable device is not being worn by a user. While a head mounted wearable device is discussed as an example in the disclosure, the methods described herein may be applied to any body-wearable device to determine a DON/DOFF state of the wearable device.

[0015] Technical problems may arise based on prior techniques used to determine DON/DOFF state because those existing techniques do not make accurate DON/DOFF state determinations. Other technical problems, such as a degradation in DON/DOFF determination accuracy, arise based on those existing techniques in combination with adverse and/or variable environmental and user factors. An inaccurate DON/DOFF state determination can affect multiple other wearable device characteristics and aspects including power consumption, battery life, and user experience. In some implementations, technical problems arise based on other existing techniques due to the amount of power consumed to accurately determine the DON/DOFF state of the body mounted wearable device.

[0016] The technical solutions described herein provide a more accurate determination of the DON/DOFF state while consuming a lower amount of power. The technical solutions implement DON/DOFF logic for DON/DOFF state determination without consuming a large amount of wearable device power or imposing large compute requirements on the body-mounted wearable device as compared to existing solutions. The technical solutions include using various combinations of sensors to confirm the switch from one of the multiple status states to another of the multiple status states that are part of the DON/DOFF state determination. More specifically, the technical solutions include an intermediate state transition that provides more accurate and sensitive state transitions as compared to existing solutions. The use of an intermediate state eliminates false positives present with existing solutions. For example, a first sensor on the wearable device may be used to determine that the wearable device has transitioned from a resting state to a non-resting state. A second sensor may be used to confirm that the wearable device has transitioned to the non-resting state or to further transition the wearable device from the non-resting state to a body-mounted state. The first sensor and the second sensor may be sensors already present on many head mounted wearable devices.
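
The two-stage gating described above can be sketched as follows. The sensor-reading callables, the function name, and the return labels are illustrative assumptions for this sketch, not part of the disclosure:

```python
def determine_don_state(read_first_sensor, activate_second_sensor, read_second_sensor):
    """Two-stage DON/DOFF determination: a low-power first sensor gates
    activation of a higher-power second sensor, which then confirms the
    head-mounted state."""
    # Stage 1: the low-power first sensor checks for a non-resting state.
    if not read_first_sensor():        # e.g., a capacitive or proximity reading
        return "DOFF"                  # still resting; second sensor stays idle
    # Stage 2: raise the second sensor's activation level, then confirm.
    activate_second_sensor()           # e.g., power up the IMU or eye camera
    if read_second_sensor():           # e.g., motion pattern matches a worn device
        return "DON"                   # head-mounted: raise the operational mode
    return "DOFF"                      # the first sensor produced a false positive
```

Because the second sensor is only activated after the first sensor fires, the device avoids paying the higher power cost while it sits on a table.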

[0017] Further, as part of the technical solutions, one or more models such as, for example, one or more machine learning models, may be used by the head mounted device in cooperation with the first sensor and the second sensor to transition the head mounted wearable device from one state to another state. In some examples, one or more multi-modal machine learning models may be used. For example, multiple different types of sensors may be used to transition states, and data from the multiple sensors may be input into a single, multi-modal machine learning model to process and output whether or not the head mounted device has transitioned from the resting state to the non-resting state or the head-mounted state. In this manner, improved accuracy may be realized in the models that cause the state changes.

[0018] In some examples, the head mounted wearable device includes smart glasses. In some examples, the head mounted wearable device includes earbuds. In some examples, the head mounted wearable device includes over-ear headphones. In some examples, the head mounted wearable device includes goggles (e.g., virtual reality (VR) goggles or augmented reality (AR) goggles). While this document describes and illustrates a smart glasses style head mounted wearable device, it is understood that the wearable device may include other body worn devices such as, for example, earbuds, over-ear headphones, goggles, a smart headband, a smart earring, a smart watch, or any other device that is worn and active on a user’s body.

[0019] FIG. 1A illustrates a user wearing an example head mounted wearable device 100 in the form of smart glasses, or augmented reality glasses, including display capability, eye/gaze tracking capability, and computing/processing capability. FIG. 1B is a front view, and FIG. 1C is a rear view, of the example head mounted wearable device 100 shown in FIG. 1A. The example head mounted wearable device 100 includes a frame 110. The frame 110 includes a front frame portion 120, and a pair of arm portions 130 rotatably coupled to the front frame portion 120 by respective hinge portions 132. The front frame portion 120 includes rim portions 123 surrounding respective optical portions in the form of lenses 127, with a bridge portion 129 connecting the rim portions 123. The arm portions 130 are coupled, for example, pivotably or rotatably coupled, to the front frame portion 120 at peripheral portions of the respective rim portions 123. In some examples, the lenses 127 are corrective/prescription lenses. In some examples, the lenses 127 are an optical material including glass and/or plastic portions that do not necessarily incorporate corrective/prescription parameters.

[0020] In some examples, the head mounted wearable device 100 includes a device display 104 that can output visual content, for example, at an output coupler 105, so that the visual content is visible to the user. In the example shown in FIGs. 1B and 1C, the device display 104 is provided in one of the two arm portions 130, simply for purposes of discussion and illustration. Device displays 104 may be provided in each of the two arm portions 130 for binocular output of content. In some examples, the device display 104 may be a see-through near eye display. In some examples, the device display 104 may be configured to project light from a display source onto a portion of teleprompter glass functioning as a beamsplitter seated at an angle (e.g., 30-45 degrees). The beamsplitter may allow for reflection and transmission values that allow the light from the display source to be partially reflected while the remaining light is transmitted through. Such an optic design may allow a user to see both physical items in the world, for example, through the lenses 127, next to content (for example, digital images, user interface elements, virtual content, and the like) output by the device display 104. In some implementations, waveguide optics may be used to depict content on the device display 104.

[0021] In some examples, the head mounted wearable device 100 includes one or more of an audio output device 106 (such as, for example, one or more speakers), an illumination device 108, a sensing system 111, a control system 112, at least one processor 114, and an outward facing image sensor 116, or camera 116. In some examples, the sensing system 111 may include various sensing devices and the control system 112 may include various control system devices including, for example, at least one processor 114 operably coupled to the components of the control system 112. In some examples, the control system 112 may include a communication module providing for communication and exchange of information between the head mounted wearable device 100 and other external devices.

[0022] The sensing system 111 may include one or more capacitive touch sensors. The capacitive touch sensors may be located on one or both of the temple arm portions 130. In some examples, two capacitive touch sensors may be disposed in one of the temple arm portions 130. In some examples, one or more capacitive touch sensors may be located on other portions of the frame 110.

[0023] The sensing system 111 also may include a motion sensor, which may be implemented as an accelerometer, a gyroscope, and/or magnetometer, some of which may be combined to form an inertial measurement unit (IMU). In some examples, the motion sensor may be implemented as a three-axis motion sensor such as, for example, a three-axis accelerometer or a three-axis gyroscope, where the motion signals captured by the motion sensor describe three translation movements (i.e., x-direction, y-direction, and z-direction) along axes of a world coordinate system. In some examples, the motion sensor may be implemented as a six-axis motion sensor such as, for example, an IMU that has six (6) degrees of freedom (6-DOF), which can describe three translation movements (i.e., x-direction, y-direction, and z-direction) along axes of a world coordinate system and three rotation movements (i.e., pitch, yaw, roll) about the axes of the world coordinate system.

[0024] The sensing system 111 may include an optical proximity sensor. In some examples, the optical proximity sensor may be located on one or both of the temple arm portions 130. In some examples, the optical proximity sensor may be located on other portions of the frame 110.

[0025] In some examples, the head mounted wearable device 100 includes a gaze tracking device 115 to detect and track eye gaze direction and movement. Data captured by the gaze tracking device 115 may be processed to detect and track gaze direction and movement as a user input. In the example shown in FIGs. 1B and 1C, the gaze tracking device 115 is provided in one of the two arm portions 130, simply for purposes of discussion and illustration. In the example arrangement shown in FIGs. 1B and 1C, the gaze tracking device 115 is provided in the same arm portion 130 as the device display 104, so that user eye gaze can be tracked not only with respect to objects in the physical environment, but also with respect to the content output for display by the device display 104. In some examples, gaze tracking devices 115 may be provided in each of the two arm portions 130 to provide for gaze tracking of each of the two eyes of the user. In some examples, device displays 104 may be provided in each of the two arm portions 130 to provide for binocular display of visual content.

[0026] The control system 112 may include one or more models such as, for example, one or more machine learning models that operate in cooperation with the processor 114. In some examples, the models may be trained to classify one or more sensor inputs to determine an activity state of a head mounted wearable device. In some examples, the models may include a multi-channel support vector machine used to classify one or more sensor inputs to determine whether the head mounted wearable device is being worn on the head of the user.
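
As an illustration of the classification step, a linear decision function over concatenated sensor channels (one simple form a multi-channel support vector machine can take at inference time) might look like the following sketch. The channel names, weights, and bias are hypothetical, and a trained SVM would supply them:

```python
def svm_decision(channels, weights, bias):
    """Evaluate a pre-trained linear decision function w.x + b over
    concatenated per-channel features; a positive score classifies the
    input as "worn"."""
    score = bias
    for name, features in channels.items():
        score += sum(w * x for w, x in zip(weights[name], features))
    return score > 0.0

# Hypothetical example with two channels: capacitance and IMU energy.
channels = {"cap": [0.8], "imu": [0.6, 0.2]}
weights = {"cap": [1.5], "imu": [0.5, 0.5]}
worn = svm_decision(channels, weights, bias=-1.0)  # score 0.6 > 0, so True
```

In practice the weights and bias would come from offline training on labeled worn/not-worn sensor recordings; only the inexpensive dot product runs on the device.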

[0027] FIG. 1D depicts a block diagram of head mounted wearable device 100, in accordance with an example. Head mounted wearable device 100 may include any combination of elements described in FIGs. 1A-1D.

[0028] Head mounted wearable device 100 includes a processor 140, a memory 142, and a communication interface 144. In examples, head mounted wearable device 100 may further include any combination of: a capacitive sensor 146, a proximity sensor 148, an inertial measuring unit (IMU) 150, an eye tracking device 152, an ultrasonic sensor 154, first sensor data receiving module 156, second sensor data receiving module 157, a resting state determination module 158, an activation status change module 160, a head-mounted state determination module 162, and an operational mode change module 164.

[0029] In examples, processor 140 may include multiple processors, and memory 142 may include multiple memories. Processor 140 may be in communication with any cameras, sensors, and other modules and electronics of head mounted wearable device 100. Processor 140 is configured by instructions (e.g., software, application, modules, etc.) to display content or execute any modules included on head mounted wearable device 100. The instructions may include non-transitory computer readable instructions stored in, and recalled from, memory 142. In examples, the instructions may be communicated to processor 140 from a computing device or from a network (not pictured) via a communication interface 144.

[0030] Processor 140 of head mounted wearable device 100 is in communication with device display 104. Processor 140 may be configured by instructions to transmit text, graphics, video, images, etc. to device display 104.

[0031] Communication interface 144 of head mounted wearable device 100 may be operable to facilitate communication between head mounted wearable device 100 and other computing devices, such as desktop computers, laptop computers, tablet computers, smart phones, wearable computers, servers, or any other type of computing device. In examples, communication interface 144 may utilize Bluetooth, Wi-Fi, Zigbee, or any other wireless or wired communication methods.

[0032] Head mounted wearable device 100 may include capacitive sensor 146. Capacitive sensor 146 may determine when a surface with dielectric properties, such as a finger, comes in close proximity via a change in capacitance from two conductive surfaces. In examples, capacitive sensor 146 may be positioned on one or both of temple arm portions 130, or on other portions of the frame 110.

[0033] Head mounted wearable device 100 may include a proximity sensor 148. Proximity sensor 148 may determine the presence or absence of a surface, for example the temple of a head, by generating an electromagnetic field (for example infrared) and detecting electromagnetic radiation reflected off that surface. In examples, proximity sensor 148 may be positioned on one or both of temple arm portions 130, or on other portions of the headset frame 110.

[0034] Head mounted wearable device 100 may include an IMU 150. IMU 150 may comprise a motion sensor operating in one to three dimensions which may include any combination of: accelerometer, gyroscope, and/or magnetometer. In examples, IMU 150 may be implemented as a three-axis motion sensor such as, for example, a three-axis accelerometer or a three-axis gyroscope, where the motion signals captured by the motion sensor describe three translation movements (i.e., x-direction, y-direction, and z-direction) along axes of a world coordinate system. In examples, IMU 150 may be implemented as a six-axis motion sensor which can describe three translation movements (i.e., x-direction, y-direction, and z-direction) along axes of a world coordinate system and three rotation movements (i.e., pitch, yaw, roll) about the axes of the world coordinate system. In examples, head mounted wearable device 100 may include more than one IMU 150.
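
A minimal sketch of collapsing a six-axis IMU sample into scalar motion features, of the kind a resting/non-resting classifier might consume. The tuple layout and function name are assumptions for illustration, not taken from the disclosure:

```python
import math

def motion_magnitudes(sample):
    """Collapse a six-axis IMU sample into scalar translational and
    rotational magnitudes, simple features for activity classification.

    sample: (ax, ay, az, pitch_rate, yaw_rate, roll_rate)
    """
    ax, ay, az, p, y, r = sample
    accel = math.sqrt(ax * ax + ay * ay + az * az)  # translational energy
    gyro = math.sqrt(p * p + y * y + r * r)         # rotational energy
    return accel, gyro
```

A device resting on a table would show near-zero magnitudes (aside from gravity along one axis), while pickup and donning produce bursts on both channels.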

[0035] Head mounted wearable device 100 may include a gaze tracking device 115. Gaze tracking device 115 may detect and track eye gaze direction and movement by imaging a user’s eye. Gaze tracking device 115 may include an electromagnetic radiation source to illuminate the eye and a detector to image it.

[0036] Head mounted wearable device 100 may include ultrasonic sensor 154. In examples, ultrasonic sensor 154 may be a microphone that can sense ultrasonic output from audio output device 106. Ultrasonic sensor 154 and audio output device 106 may be used, for example, to generate a time-of-flight measurement to an object, such as a human head.
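
The time-of-flight measurement mentioned above reduces to a simple calculation: the ultrasonic pulse travels to the surface and back, so the one-way distance is half the round-trip path. A sketch, assuming sound in air; the constant and function name are illustrative:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C

def tof_distance_m(round_trip_s):
    """One-way distance to a reflecting surface (e.g., the head) from an
    ultrasonic round-trip time: the pulse travels out and back, so the
    distance is half the total path length."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0
```

For example, a 2 ms round trip corresponds to roughly 0.34 m, comfortably distinguishing a head near the temple arm from an empty room.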

[0037] Head mounted wearable device 100 may include first sensor data receiving module 156. First sensor data receiving module 156 may receive first sensor data from a first sensor. In examples, first sensor data receiving module 156 may receive data from any combination of capacitive sensor 146, proximity sensor 148, IMU 150, gaze tracking device 115, and ultrasonic sensor 154. As discussed above, one example of the first sensor includes a capacitive touch sensor that outputs capacitive readings when in near contact with the human temple (head) or skin. Another example of the first sensor includes an optical proximity sensor that uses reflective optical energy (or time of flight) to measure proximity. Typically, capacitive touch sensors and proximity sensors have low power consumption, which may help save power on head mounted wearable device 100 when the device is in DOFF mode but monitoring for DON mode.

[0038] Head mounted wearable device 100 may further include second sensor data receiving module 157. Second sensor data receiving module 157 may receive second sensor data from a second sensor. In examples, second sensor data receiving module 157 may receive data from any combination of capacitive sensor 146, proximity sensor 148, IMU 150, gaze tracking device 115, and ultrasonic sensor 154. Typically, IMUs, gaze tracking devices, and ultrasonic sensors may use more power than proximity and capacitive touch sensors.

[0039] Head mounted wearable device 100 may include resting state determination module 158. In examples, resting state determination module 158 may use the first sensor data to determine that a head mounted wearable device is in a non-resting state. In examples, the first sensor may be capacitive sensor 146 or proximity sensor 148. In examples, the first sensor may comprise other sensors.

[0040] In examples, the first sensor data may be used to determine that the head mounted wearable device 100 is in an activity state by generating an activity rating of the head mounted wearable device. The activity rating is an indication that the first sensor data corresponds to one or more states of use of head mounted wearable device 100. In examples, the activity rating may be a probability that a sensor is in one or more states. In examples, the activity rating may include a most likely activity state and/or a confidence score.
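An activity rating of this kind can be modeled as a small record pairing the most likely state with a confidence score. A sketch under the assumption that the rating is derived from per-state probabilities (the type and function names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class ActivityRating:
    """Pairs the most likely activity state with a confidence
    score in [0, 1], as described in paragraph [0040]."""
    most_likely_state: str  # e.g. "resting", "non-resting", "head-mounted"
    confidence: float

def rate_from_probabilities(probs: dict) -> ActivityRating:
    # Take the state with the highest probability as the rating.
    state = max(probs, key=probs.get)
    return ActivityRating(state, probs[state])
```

A rating built from `{"resting": 0.2, "non-resting": 0.8}` would then report `non-resting` with confidence 0.8.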

[0041] For example, FIG. 2A depicts a flow diagram 200, in accordance with examples. Flow diagram 200 depicts how data from a first sensor and a second sensor may be used to determine one or more activity states of head mounted wearable device 100 and make a DON/DOFF determination. Flow diagram 200 begins when first sensor data 202 is received by resting state determination module 158. In the example of flow diagram 200, resting state determination module 158 uses first sensor data 202 to determine whether wearable device 100 is in resting state 208 or non-resting state 210.

[0042] FIG. 2B depicts activity state diagram 250, in accordance with an example. Activity state diagram 250 may be used for DON/DOFF determination. The transition between a DOFF state 252 and a DON state 254 may include determining that the head mounted device is in one or more intermediate activity states based on one or more sensors. In the example, activity state diagram 250 includes a resting state 208, a non-resting state 210, and a head-mounted state 218. In further examples, however, activity state diagram 250 could include additional and/or different intermediate states.
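The transitions of activity state diagram 250 can be sketched as a small state machine. The mapping of non-resting to DOFF below is an assumption for the sketch: as the text notes, a non-resting device may be either DON or DOFF, so DOFF is used as the conservative default until the head-mounted state is confirmed. State names and the transition table are illustrative:

```python
# Allowed transitions between the intermediate activity states
# of activity state diagram 250 (illustrative).
TRANSITIONS = {
    "resting": {"non-resting"},
    "non-resting": {"resting", "head-mounted"},
    "head-mounted": {"non-resting"},
}

def step(current: str, target: str) -> str:
    # Move to the target state only if the diagram allows it;
    # otherwise stay put.
    return target if target in TRANSITIONS[current] else current

def don_doff(state: str) -> str:
    # Only the head-mounted state maps to DON 254; the others map
    # to DOFF 252 (conservative default for non-resting).
    return "DON" if state == "head-mounted" else "DOFF"
```

Note that the table forbids jumping directly from resting to head-mounted: the device must pass through the non-resting state, mirroring the flow of FIG. 2A.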

[0043] In resting state 208, the head mounted wearable device 100 is not being worn on the body of a user and may be positioned on a substantially nonmoving surface, such as a table. In examples, resting state 208 may mean that head mounted wearable device 100 is not being touched or held by a user either. In resting state 208, the head mounted wearable device 100 may be powered off, powered on, or charging. In resting state 208, head mounted wearable device 100 is in DOFF state 252.

[0044] In non-resting state 210, the head mounted wearable device 100 is not in resting state 208. Non-resting state 210 may include the circumstances where head mounted wearable device 100 is being moved by a user, held by a user, or worn in an operational position. For example, the wearable device may be carried in a backpack, in a moving car, or in the hand of a user who has picked up the device from a resting position but has not placed it on their body yet. Non-resting state 210 can be in either DOFF state 252 or DON state 254.

[0045] During head-mounted state 218, the head mounted wearable device 100 is being worn on the body of a user as it is designed to be operated. In head-mounted state 218, head mounted wearable device 100 is in DON state 254.

[0046] In examples, resting state determination module 158 may use a model to determine an activity state based on first sensor data 202. The model may comprise a machine learning model, e.g., using supervised or semi-supervised training. In examples, the model may comprise a multi-channel support vector machine.
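At inference time, a trained linear support vector machine reduces to evaluating a decision function w · x + b over the feature vector; a multi-channel variant simply accumulates per-channel contributions. A sketch with hand-picked illustrative weights (a real model would learn them from the training data described below; nothing here is a trained model from the disclosure):

```python
def svm_decision(channels: dict, weights: dict, bias: float) -> bool:
    """Linear multi-channel SVM decision: classify as non-resting
    when the signed score w . x + b is positive."""
    score = bias
    for name, features in channels.items():
        # Accumulate this channel's weighted contribution.
        score += sum(w * x for w, x in zip(weights[name], features))
    return score > 0.0
```

For example, with one feature each from a capacitive and a proximity channel, the bias sets how strong the combined evidence must be before the device is declared non-resting.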

[0047] In examples where resting state determination module 158 is a machine learning model, a training set of data may be generated including negative data indicating that head mounted wearable device 100 is in a resting state and positive data indicating that head mounted wearable device 100 is in a non-resting state. The training set of data may be generated as follows: sensor data associated with positive data is collected from devices being carried or worn by users while performing a variety of daily tasks. Sensor data associated with negative data is collected from devices that are substantially at rest on a stable surface.
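One way to turn such recordings into training examples is to reduce each trace to a feature and attach the positive/negative label. The variance feature and the sample values below are illustrative assumptions, not from the disclosure; they reflect the intuition that a device at rest on a table produces a nearly constant accelerometer signal, while a carried or worn device does not:

```python
import statistics

def label_trace(accel_trace: list, carried_or_worn: bool):
    """One labeled training example: a variance feature plus a
    positive (non-resting) or negative (resting) label."""
    feature = statistics.pvariance(accel_trace)
    label = 1 if carried_or_worn else 0
    return feature, label

# Resting traces hover near gravity with near-zero variance;
# carried traces vary widely (values illustrative, in m/s^2).
resting_trace = [9.80, 9.81, 9.80, 9.81]
carried_trace = [9.2, 10.6, 8.7, 11.0]
training_set = [label_trace(resting_trace, False),
                label_trace(carried_trace, True)]
```

The resulting (feature, label) pairs are what a classifier such as the support vector machine above would be fit on.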

[0048] Resting state determination module 158 may generate an activity rating using first sensor data as input and determine that head mounted wearable device 100 is in an activity state based on the activity rating. In this example, the activity states may include resting state 208 and non-resting state 210.

[0049] In examples, resting state determination module 158 may determine if head mounted wearable device 100 is in an activity state based on determining if the activity rating is over a threshold. In examples, the threshold may be optimized to balance the need for head mounted wearable device 100 to respond quickly to a user that is preparing to use it and the need to conserve battery power by relying on a lower powered sensing mode when head mounted wearable device 100 is in DOFF state 252.
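The threshold check itself is a one-liner; a common refinement, added here as an assumption rather than something the text mandates, is hysteresis: a higher bar to enter the non-resting state than to leave it, so a rating hovering near the boundary does not rapidly toggle the second sensor on and off and waste the power the threshold is meant to save. The default values are illustrative:

```python
def pick_state(rating: float, currently_active: bool,
               enter_t: float = 0.7, exit_t: float = 0.4) -> bool:
    """Threshold check with hysteresis: once active, a lower
    threshold keeps the state from flapping near the boundary."""
    threshold = exit_t if currently_active else enter_t
    return rating > threshold
```

A rating of 0.5 would thus not wake an idle device, but would keep an already-alerted device in the non-resting state.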

[0050] In examples, resting state determination module 158 generating the activity rating of the head mounted wearable device may use second sensor data at a decreased activation level as input to determine an activity state. For example, as may be seen in flow diagram 200, resting state determination module 158 may receive reduced activation status second sensor data 204. In examples where resting state determination module 158 uses a machine learning model, the second sensor data may be used as input to the model. In examples, the second sensor may comprise IMU 150, gaze tracking device 115, or ultrasonic sensor 154. The second sensor may have increased and decreased activation levels that use different amounts of power and/or generate different amounts of data based on the activation level.

[0051] Head mounted wearable device 100 may include activation status change module 160. Activation status change module 160 may increase or decrease an activation state of a second sensor in response to detecting an activity state. For example, returning to diagram 200, it may be seen that activation status change module 160 is executed following a determination at resting state determination module 158 that non-resting state 210 is the activity state. In examples, activation status change module 160 may increase an activation level of the second sensor from off to on, or from low power to high power and/or low data rate to high data rate. This may allow head mounted wearable device 100 to remain in a lower power state, operating only a first sensor or a combination of first sensor and second sensor in a reduced activation status when head mounted wearable device 100 is in resting state 208. Therefore, head mounted wearable device 100 may avoid wasting power in DOFF state 252 while remaining alert enough to detect when a user may be transitioning head mounted wearable device 100 to DON state 254.
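The activation levels can be sketched as a small enum with a power draw attached to each level; the milliwatt figures are illustrative assumptions, chosen only to show that the increased level costs roughly an order of magnitude more than the reduced one:

```python
from enum import Enum

class Activation(Enum):
    OFF = 0
    REDUCED = 1    # low power / low data rate
    INCREASED = 2  # high power / high data rate

class SecondSensor:
    """Sketch of a second sensor whose power draw follows its
    activation level (milliwatt figures illustrative)."""
    POWER_MW = {Activation.OFF: 0.0,
                Activation.REDUCED: 0.3,
                Activation.INCREASED: 3.0}

    def __init__(self):
        self.level = Activation.OFF

    def set_level(self, level: Activation) -> None:
        # Activation status change module 160 would call this on
        # an activity-state transition.
        self.level = level

    def power_mw(self) -> float:
        return self.POWER_MW[self.level]
```

Keeping the sensor at `OFF` or `REDUCED` while the device is in resting state 208 is what realizes the power savings described above.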

[0052] For example, IMU 150 may use a 100 ms accelerometer reading (the gyroscope may not be needed) to check 3 DoF plus subtle head movement during the increased activation level. At a decreased activation level, IMU 150 may continue to take a recurring snapshot periodically (e.g., every 1 s).
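The resulting sample budgets can be compared directly. The sketch below assumes a 100 Hz accelerometer output data rate and one 100 ms burst per second at the increased level; both figures are illustrative, since the text specifies only the 100 ms reading and the 1 s snapshot period:

```python
def imu_samples_per_minute(level: str) -> int:
    """Accelerometer sample budget under the duty cycle above
    (output data rate and burst cadence are assumptions)."""
    ODR_HZ = 100  # assumed accelerometer output data rate
    if level == "increased":
        # One 100 ms burst each second -> 10 samples per burst.
        return 60 * round(ODR_HZ * 0.1)
    # Decreased level: one single-sample snapshot every 1 s.
    return 60
```

Under these assumptions the increased level reads ten times as many samples per minute as the decreased level, which is the kind of gap the activation-level scheme exploits.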

[0053] Head mounted wearable device 100 may include a head-mounted state determination module 162. Head-mounted state determination module 162 may determine, using second sensor data as input, that the head mounted wearable device is in a head-mounted state. For example, as seen in FIG. 2A, head-mounted state determination module 162 may follow activation status change module 160 and determine if the activity state is non-resting state 210 or head-mounted state 218. As may be seen, head-mounted state determination module 162 may receive increased activation status second sensor data 216 as input.

[0054] In examples, head-mounted state determination module 162 may execute a model, such as a machine learning model. A training set of data may be generated including negative data indicating that head mounted wearable device 100 is not in a head-mounted state and positive data indicating that head mounted wearable device 100 is in a head-mounted state. The training set of data may be generated as follows: sensor data associated with positive data is collected from devices being worn by users while performing a variety of daily tasks. Sensor data associated with negative data is collected from devices in a variety of other situations.

[0055] In examples, head-mounted state determination module 162 may generate an activity rating for head mounted wearable device 100 based on second sensor data. In examples, the activity rating may include any of the features described above. In examples, it may be determined that the head mounted wearable device is in a non-resting state or a head-mounted state based on determining that the activity rating is over a threshold.

[0056] In examples, the first sensor may comprise a capacitive sensor and the second sensor may comprise at least one of an inertial measurement unit, an eye tracking camera, or an ultrasonic sensor.

[0057] In examples, the second sensor may consume a higher level of power than the first sensor.

[0058] Head mounted wearable device 100 may further include operational mode change module 164. In response to determining that head mounted wearable device 100 is in the head-mounted state, operational mode change module 164 may increase an operational mode of head mounted wearable device 100 to an increased operational mode.

[0059] An operational mode of head mounted wearable device 100 may comprise one or more modes of software and/or hardware operation. For example, a first operational mode may be a deep standby mode applied during a resting state of head mounted wearable device 100. During deep standby mode, components of head mounted wearable device 100 may be minimally powered to conserve power. In examples, the first sensor may be powered, but not the second sensor or device display 104, for example.

[0060] A further operational mode may include an off head standby mode. Off head standby mode may be applied in the non-resting state to provide power to everything powered in deep standby mode, with the addition of the second sensor.

[0061] A further operational mode may include an active mode. Active mode may be applied in a head-mounted state of head mounted wearable device 100. In active mode, power may be provided to the first sensor, the second sensor, and additional components needed for normal operation of head mounted wearable device 100, such as gaze tracking device 115 and IMU 150.

[0062] In examples, there may be other operational modes as well.

[0063] In examples, increasing an operational mode of head mounted wearable device 100 to an increased operational mode may include operational mode change module 164 selecting an operational mode in which more components are powered, or an operational mode in which a greater number of processor cycles are executed. For example, increasing an operational mode of head mounted wearable device 100 to an increased operational mode may include increasing from an off head standby mode to active mode. Therefore, in examples, increasing the operational mode of head mounted wearable device 100 to an increased operational mode may comprise powering on an additional component of the head mounted wearable device.

[0064] In examples, processor 140 may include two processors: a first processor and a second processor. The first processor may be lower power/slower, and the second processor may be higher power/faster. In examples, the first processor may be configured to determine that the head mounted wearable device is in the non-resting state, and the second processor may be configured to determine that the head mounted wearable device is in the head-mounted state. By dividing the processing between the first and second processors, it may be possible to better balance battery performance, accuracy, and speed of response to a user preparing to use head mounted wearable device 100.
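The three modes described in paragraphs [0059]–[0061] form an ordered escalation, each powering a superset of the previous mode's components. A sketch (the component sets follow the description above; the names themselves are illustrative):

```python
# Components powered in each operational mode (illustrative names;
# the supersets follow paragraphs [0059]-[0061]).
MODES = {
    "deep_standby":     {"first_sensor"},
    "off_head_standby": {"first_sensor", "second_sensor"},
    "active":           {"first_sensor", "second_sensor",
                         "display", "gaze_tracker", "imu"},
}
ORDER = ["deep_standby", "off_head_standby", "active"]

def increase_mode(current: str) -> str:
    # Step one mode up; active mode is already the top.
    i = ORDER.index(current)
    return ORDER[min(i + 1, len(ORDER) - 1)]

def newly_powered(old: str, new: str) -> set:
    # Components that the escalation powers on.
    return MODES[new] - MODES[old]
```

For example, escalating from deep standby to off head standby powers on exactly the second sensor, which matches the description of paragraph [0060].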

[0065] FIG. 3 depicts a flowchart of method 300, according to examples. Method 300 may be used to determine that head mounted wearable device 100 is in a head-mounted state. Method 300 may include any combination of steps 302-312. Method 300 begins with step 302. In step 302, first sensor data from a first sensor may be received. For example, head mounted wearable device 100 may execute first sensor data receiving module 156, as described above.

[0066] Method 300 may continue with step 304. In step 304, it may be determined, using the first sensor data, that a head mounted wearable device is in a non-resting state. For example, head mounted wearable device 100 may execute resting state determination module 158, as described above.

[0067] Method 300 may continue with step 306. In step 306, an activation state of a second sensor may be increased to an increased activation level in response to determining that the head mounted wearable device is in the non-resting state. For example, head mounted wearable device 100 may execute activation status change module 160, as described above. The wording “increasing an activation state of the second sensor to an increased activation level” can be used herein to refer to activating the sensor. Therefore, increasing the activation state can simply mean that the sensor is switched on for sensing, from low power to high power, and/or from low data rate to high data rate when the head mounted wearable device is determined to be in the non-resting state.

[0068] Method 300 may continue with step 308. In step 308, second sensor data may be received from the second sensor at the increased activation level. For example, head mounted wearable device 100 may execute second sensor data receiving module 157, as described above.

[0069] Method 300 may continue with step 310. In step 310, it may be determined, using the second sensor data, that the head mounted wearable device is in a head-mounted state. For example, head mounted wearable device 100 may execute head-mounted state determination module 162, as described above.

[0070] Method 300 may continue with step 312. In step 312, an operational mode of the head mounted wearable device may be increased to an increased operational mode. For example, head mounted wearable device 100 may execute operational mode change module 164, as described above.
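Steps 302-312 can be tied together in a single pass. In the sketch below, the callables stand in for the sensor reads and the two model-backed determinations, and the returned string names the operational mode the pass would leave the device in (the function signature and mode names are illustrative):

```python
def method_300(first_sensor_read, second_sensor_read,
               is_non_resting, is_head_mounted) -> str:
    """One pass of method 300; callables stand in for the sensor
    reads and the two model-backed determinations."""
    first_data = first_sensor_read()               # step 302
    if not is_non_resting(first_data):             # step 304
        return "deep_standby"
    # Step 306: the second sensor's activation level would be
    # increased here before it is read.
    second_data = second_sensor_read()             # step 308
    if not is_head_mounted(second_data):           # step 310
        return "off_head_standby"
    return "active"                                # step 312
```

Each early return corresponds to the device stopping at a lower operational mode: a resting device never wakes the second sensor, and a non-resting but unworn device never enters active mode.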

[0071] In some aspects, the techniques described herein relate to a method, wherein the model is a first model and determining that the head mounted wearable device is in a second of the non-resting state or the head-mounted state includes executing a second model.

[0072] In some aspects, the techniques described herein relate to a method, wherein the model includes a multi-channel support vector machine.

[0073] In some aspects, the techniques described herein relate to a method, wherein increasing the operational mode of the head mounted wearable device to an increased operational mode further includes powering on an additional component of the head mounted wearable device.

[0074] In some aspects, the techniques described herein relate to a method, wherein determining that the head mounted wearable device is in the first of the non-resting state or the head-mounted state further includes: generating an activity rating of the head mounted wearable device based on sensor data; and determining that the activity rating is over a threshold.

[0075] In some aspects, the techniques described herein relate to a method, wherein determining that the head mounted wearable device is in the non-resting state further includes using the second sensor at a decreased activation level as input to the model.

[0076] In some aspects, the techniques described herein relate to a method, wherein: the first sensor includes a capacitive sensor; and the second sensor includes at least one of an inertial measurement unit, an eye tracking camera, or an ultrasonic sensor.

[0077] In some aspects, the techniques described herein relate to a method, wherein the second sensor consumes a higher level of power than the first sensor.

[0078] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein the model is a first model and determining that the head mounted wearable device is in a second of the non-resting state or the head-mounted state includes executing a second model.

[0079] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein the model includes a multi-channel support vector machine.

[0080] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein increasing the operational mode of the head mounted wearable device to an increased operational mode further includes powering on an additional component of the head mounted wearable device.

[0081] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein determining that the head mounted wearable device is in the first of the non-resting state or the head-mounted state further includes: generating an activity rating of the head mounted wearable device based on sensor data; and determining that the activity rating is over a threshold.

[0082] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein determining that the head mounted wearable device is in the non-resting state further includes using the second sensor at a decreased activation level as input to the model.

[0083] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein: the first sensor includes a capacitive sensor; and the second sensor includes at least one of an inertial measurement unit, an eye tracking camera, or an ultrasonic sensor.

[0084] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein the second sensor consumes a higher level of power than the first sensor.

[0085] In some aspects, the techniques described herein relate to a head mounted wearable device, wherein the processor is a first processor and the head mounted wearable device further includes: a second processor, wherein the first processor is configured to determine that the head mounted wearable device is in the non-resting state and the second processor is configured to determine that the head mounted wearable device is in the head-mounted state.

[0086] Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device. Various implementations of the systems and techniques described here can be realized as and/or generally be referred to herein as a circuit, a module, a block, or a system that can combine software and hardware aspects. For example, a module may include the functions/acts/computer program instructions executing on a processor or some other programmable data processing apparatus.

[0087] Some of the above example implementations are described as processes or methods depicted as flowcharts. Although the flowcharts describe the operations as sequential processes, many of the operations may be performed in parallel, concurrently, or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, subprograms, etc.

[0088] Methods discussed above, some of which are illustrated by the flow charts, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. A processor(s) may perform the necessary tasks.

[0089] Specific structural and functional details disclosed herein are merely representative for purposes of describing example implementations. Example implementations, however, have many alternate forms and should not be construed as limited to only the implementations set forth herein.

[0090] It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example implementations. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.

[0091] The terminology used herein is for the purpose of describing particular implementations only and is not intended to be limiting of example implementations. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.

[0092] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.

[0093] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example implementations belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

[0094] Portions of the above example implementations and corresponding detailed description are presented in terms of software, or algorithms and symbolic representations of operation on data bits within a computer memory. These descriptions and representations are the ones by which those of ordinary skill in the art effectively convey the substance of their work to others of ordinary skill in the art. An algorithm, as the term is used here, and as it is used generally, is conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of optical, electrical, or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.

[0095] In the above illustrative implementations, reference to acts and symbolic representations of operations (e.g., in the form of flowcharts) that may be implemented as program modules or functional processes include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types and may be described and/or implemented using existing hardware at existing structural elements. Such existing hardware may include one or more Central Processing Units (CPUs), Graphics Processing Units (GPUs), digital signal processors (DSPs), application-specific integrated circuits, field programmable gate arrays (FPGAs), computers, or the like.

[0096] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, or as is apparent from the discussion, terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical, electronic quantities within the computer system’s registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.

[0097] Note also that the software implemented aspects of the example implementations are typically encoded on some form of non-transitory program storage medium or implemented over some type of transmission medium. The program storage medium may be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or CD ROM), and may be read only or random access. Similarly, the transmission medium may be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. The example implementations are not limited by these aspects of any given implementation.

[0098] Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present disclosure is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or implementations herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.