

Title:
ACTIVITY MONITORING SYSTEMS AND METHODS FOR ACCIDENT DETECTION AND RESPONSE
Document Type and Number:
WIPO Patent Application WO/2017/100025
Kind Code:
A1
Abstract:
Various embodiments are presented for one or more wearable electronic devices which may be mounted on or integrated into, e.g., helmets, clothing, gear, vehicles, body portions, etc. These embodiments may be used in a variety of contexts in which user injury may occur, e.g., in outdoor sports, construction environments, military exercises, etc. In some embodiments, the device comprises motion monitoring sensors connected to a microcontroller (some embodiments may include pressure sensors, water sensors, temperature sensors, chemical sensors, sonic sensors, electromagnetic and radiation sensors, etc.). The microcontroller may perform all or a portion of a real-time analysis of motion and/or other sensor data provided by the monitoring sensors.

Inventors:
CIARAMELLETTI CARLO (US)
CAVALLI MARCO (US)
Application Number:
PCT/US2016/063821
Publication Date:
June 15, 2017
Filing Date:
November 28, 2016
Assignee:
SAPHIBEAT TECH INC (US)
International Classes:
G08B21/02; G08B21/04; G08B23/00
Foreign References:
US20150269825A12015-09-24
US20060270949A12006-11-30
US20110172562A12011-07-14
US20130053990A12013-02-28
US20150186612A12015-07-02
US6453166B12002-09-17
US20040132501A12004-07-08
US20160171864A12016-06-16
Other References:
See also references of EP 3387630A4
Attorney, Agent or Firm:
SLOAT, Ashley (US)
Claims:
CLAIMS

WHAT IS CLAIMED IS:

1. An emergency response system comprising:

a plurality of sensors;

a processor;

a communication module; and

a memory comprising instructions configured to cause the processor to: identify an activity of a user based on data from the plurality of sensors; generate a time-dependent, multi-dimensional biomechanical vector that describes one or more of a dynamic, a motion, a position, and an orientation of the user at any given time interval based on the identified activity and the data from the plurality of sensors; compare the vector against allowed values within a pattern corresponding to the identified activity; and transmit a message to a remote system if the comparison indicates an anomaly.

2. The system of Claim 1, wherein the memory comprising instructions is further configured to cause the processor to transmit the message only after verifying the anomaly, wherein the verification includes determining, from the data from the plurality of sensors, if the activity has resumed by comparing the vector after the anomaly to the pattern.

3. The system of Claim 1, wherein the memory comprising instructions is further configured to cause the processor to update the pattern for the identified activity based on the vector when an anomaly is not detected.

4. The system of Claim 1, wherein the identifying the activity of the user comprises determining a geographical position of the user and comparing the position to a map to determine terrain and associated activity for that terrain.

5. The system of Claim 1, wherein the identifying the activity of the user comprises comparing motion sensor values over time with patterns of activities for activities to identify a match.

6. The system of Claim 1, wherein the comparing the vector comprises verifying if instantaneous values of the vector belong to an allowed set of values.

7. The system of Claim 1, wherein the comparing the vector comprises, based on recent vector history and the identified activity, calculating expected vector values and verifying if measured values fall within the expected vector values.

8. The system of Claim 1, wherein the comparing the vector comprises comparing consistency of the vector with an associated pattern of the identified activity.

9. The system of Claim 1, wherein the memory comprising instructions is further configured to cause the processor to measure a biometric reading of the user and wherein the transmitting further requires a biometric anomaly for transmission.

10. The system of Claim 1, wherein the memory comprising instructions is further configured to cause the processor to transmit the message via one or more communication channels depending on channel availability.

11. The system of Claim 1, further comprising: two power sources; and a power management module coupled to the two power sources, the power management module configured to cause the system to operate primarily on one power source and maintain the second power source in reserve for transmission of the message indicating detection of an anomaly.

12. A computer-readable medium having stored thereon instructions to cause a computer to execute a method, the method comprising: identifying an activity of a user based on data from a plurality of sensors; generating a time-dependent, multi-dimensional biomechanical vector that describes one or more of a dynamic, a motion, a position, and an orientation of the user at any given time interval based on the identified activity and the sensor data; comparing the vector against allowed values within a pattern corresponding to the identified activity; and transmitting a message to a remote system if the comparison indicates an anomaly.

13. An emergency response method, comprising: identifying an activity of a user based on data from a plurality of sensors; generating a time-dependent, multi-dimensional biomechanical vector that describes one or more of a dynamic, a motion, a position, and an orientation of the user at any given time interval based on the identified activity and the sensor data; comparing the vector against allowed values within a pattern corresponding to the identified activity; and transmitting a message to a remote system if the comparison indicates an anomaly.

14. The method of Claim 13, further comprising transmitting the message only after verifying the anomaly, wherein the verifying includes determining, from sensor data, if the activity has resumed by comparing the vector after the anomaly to the corresponding pattern.

15. The method of Claim 13, further comprising updating the pattern for the identified activity based on the vector when an anomaly is not detected.

16. The method of Claim 13, wherein the identifying the activity of the user comprises determining a geographical position of the user and comparing the position to a map to determine terrain and associated activity for that terrain.

17. The method of Claim 13, wherein the identifying the activity of the user comprises comparing motion sensor values over time with patterns of activities for activities to identify a match.

18. The method of Claim 13, wherein the comparing the vector comprises verifying if instantaneous values of the vector belong to an allowed set of values.

19. The method of Claim 13, wherein the comparing the vector comprises, based on recent vector history and the identified activity, calculating expected vector values and verifying if measured values fall within the expected vector values.

20. The method of Claim 13, wherein the comparing the vector comprises comparing a consistency of the vector with an associated pattern of the identified activity.

Description:
ACTIVITY MONITORING SYSTEMS AND METHODS FOR ACCIDENT

DETECTION AND RESPONSE

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to U.S. Nonprovisional Patent Application Serial No. 14/961,820, entitled "Activity Monitoring Systems and Methods for Accident Detection and Response," filed December 7, 2015, which claims priority to U.S. Provisional Patent Application Serial No. 62/088,466, entitled "Activity Monitoring Systems and Methods for Accident Detection and Response," filed December 5, 2014; the disclosures of both are incorporated by reference in their entireties.

TECHNICAL FIELD

[0002] Various of the present embodiments relate to systems and methods for detecting adverse situations and taking appropriate action.

BACKGROUND

[0003] Sport, construction, military, and law enforcement activities and other potentially hazardous situations often bear a certain level of risk. Helmets are typically used to mitigate the risk of head injury. However, in case of an accident, it is nonetheless extremely important to be able to communicate and ask for help, particularly when the user is alone, in a remote location, and/or potentially unconscious. Indeed, a clear assessment of the situation may be required for efficient assistance and proper prioritization. As another example, athletes may be in a team, but not able to communicate with teammates, e.g., because the teammate is out of sight as a result of being left behind or gone ahead of the group. In case of group activities, circumstances may arise that cause one or more members to become disconnected from the rest of the group and in need of assistance or simply unable to reconnect.

SUMMARY

[0004] One aspect of the present disclosure is directed to an emergency response system. In some embodiments, the system includes: a plurality of sensors; a processor; a communication module; and a memory including instructions configured to cause the processor to: identify an activity of a user based on data from the plurality of sensors; generate a time-dependent, multi-dimensional biomechanical vector that describes one or more of a dynamic, a motion, a position, and an orientation of the user at any given time interval based on the identified activity and the data from the plurality of sensors; compare the vector against allowed values within a pattern corresponding to the identified activity; and transmit a message to a remote system if the comparison indicates an anomaly.

[0005] In some embodiments, the memory includes instructions configured to cause the processor to transmit the message only after verifying the anomaly. In some embodiments, the verification includes determining, from the data from the plurality of sensors, if the activity has resumed by comparing the vector after the anomaly to the pattern.

[0006] In some embodiments, the memory includes instructions configured to cause the processor to update the pattern for the identified activity based on the vector when an anomaly is not detected.

[0007] In some embodiments, identifying the activity of the user includes determining a geographical position of the user and comparing the position to a map to determine terrain and associated activity for that terrain.

[0008] In some embodiments, identifying the activity of the user includes comparing motion sensor values over time with patterns of activities for activities to identify a match.

[0009] In some embodiments, comparing the vector includes verifying if instantaneous values of the vector belong to an allowed set of values.

[0010] In some embodiments, comparing the vector includes, based on recent vector history and the identified activity, calculating expected vector values and verifying if measured values fall within the expected vector values.

[0011] In some embodiments, comparing the vector includes comparing consistency of the vector with an associated pattern of the identified activity.

[0012] In some embodiments, the memory includes instructions configured to cause the processor to measure a biometric reading of the user.

[0013] In some embodiments, transmitting further requires a biometric anomaly for transmission.

[0014] In some embodiments, the memory includes instructions configured to cause the processor to transmit the message via one or more communication channels depending on channel availability.

[0015] In some embodiments, the system further includes two power sources; and a power management module coupled to the two power sources, the power management module configured to cause the system to operate primarily on one power source and maintain the second power source in reserve for transmission of the message indicating detection of an anomaly.

[0016] Another aspect of the present disclosure is directed to a computer-readable medium having stored thereon instructions to cause a computer to execute a method. In some embodiments, the method includes: identifying an activity of a user based on data from a plurality of sensors; generating a time-dependent, multi-dimensional biomechanical vector that describes one or more of a dynamic, a motion, a position, and an orientation of the user at any given time interval based on the identified activity and the sensor data; comparing the vector against allowed values within a pattern corresponding to the identified activity; and transmitting a message to a remote system if the comparison indicates an anomaly.

[0017] Another aspect of the present disclosure is directed to an emergency response method. In some embodiments, the method includes: identifying an activity of a user based on data from a plurality of sensors; generating a time-dependent, multi-dimensional biomechanical vector that describes one or more of a dynamic, a motion, a position, and an orientation of the user at any given time interval based on the identified activity and the sensor data; comparing the vector against allowed values within a pattern corresponding to the identified activity; and transmitting a message to a remote system if the comparison indicates an anomaly.

[0018] In some embodiments, the method further includes transmitting the message only after verifying the anomaly. In some embodiments, verifying includes determining, from sensor data, if the activity has resumed by comparing the vector after the anomaly to the corresponding pattern.

[0019] In some embodiments, the method further includes updating the pattern for the identified activity based on the vector when an anomaly is not detected.

[0020] In some embodiments, identifying the activity of the user includes determining a geographical position of the user and comparing the position to a map to determine terrain and associated activity for that terrain.

[0021] In some embodiments, identifying the activity of the user includes comparing motion sensor values over time with patterns of activities for activities to identify a match.

[0022] In some embodiments, comparing the vector includes verifying if instantaneous values of the vector belong to an allowed set of values.

[0023] In some embodiments, comparing the vector includes, based on recent vector history and the identified activity, calculating expected vector values and verifying if measured values fall within the expected vector values.

[0024] In some embodiments, comparing the vector includes comparing a consistency of the vector with an associated pattern of the identified activity.

BRIEF DESCRIPTION OF THE DRAWINGS

[0025] The foregoing is a summary, and thus, necessarily limited in detail. The above-mentioned aspects, as well as other aspects, features, and advantages of the present technology are described below in connection with various embodiments, with reference made to the accompanying drawings.

[0026] FIG. 1 is a perspective view of a helmet with an attachment as may be implemented in some embodiments;

[0027] FIG. 2 illustrates one embodiment of an attachment affixed to a helmet and separate as may be implemented in some embodiments;

[0028] FIG. 3 is a functional block diagram of a system as may be implemented in some embodiments;

[0029] FIG. 4 is a functional block diagram of a power management system as may be implemented in some embodiments;

[0030] FIG. 5 is a block diagram illustrating operation of the accidental event detection system at a high level as may be implemented in some embodiments;

[0031] FIG. 6 is a block diagram illustrating operation of the detection system using alternative coverage at a high level as may be implemented in some embodiments;

[0032] FIG. 7 is a block diagram illustrating various components in a detection and response system as may be implemented in some embodiments;

[0033] FIG. 8 is a Venn diagram illustrating various target functionalities met by certain embodiments;

[0034] FIG. 9 is a flow diagram illustrating a process for detecting and responding to an event as may be implemented in some embodiments;

[0035] FIG. 10 is a block diagram of a computer system as may be used to implement features of some of the embodiments;

[0036] FIG. 11 is a block diagram of a system as may be implemented in some embodiments;

[0037] FIG. 12 is a block diagram of a power management module of the system of FIG. 11;

[0038] FIG. 13 is a flow diagram illustrating a process for recognizing an activity;

[0039] FIG. 14 is a flow diagram illustrating a process for identifying an accident;

[0040] FIG. 15 shows an example biomechanical model for use in the process of FIG. 14;

[0041] FIG. 16 is a bi-dimensional representation of system learning;

[0042] FIG. 17 is a flow diagram illustrating a process for accident verification;

[0043] FIG. 18 is a flow diagram illustrating a process of communication strategy; and

[0044] FIG. 19 is a perspective view of a system as may be implemented in some embodiments.

[0045] The illustrated embodiments are merely examples and are not intended to limit the disclosure. The schematics are drawn to illustrate features and concepts and are not necessarily drawn to scale.

[0046] While the flow and sequence diagrams presented herein show an organization designed to make them more comprehensible by a human reader, those skilled in the art will appreciate that actual data structures used to store this information may differ from what is shown, in that they, for example, may be organized in a different manner, may contain more or less information than shown, may be compressed and/or encrypted, etc.

[0047] The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the embodiments. Further, the drawings have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be expanded or reduced to help improve the understanding of the embodiments. Similarly, some components and/or operations may be separated into different blocks or combined into a single block for the purposes of discussion of some of the embodiments. Moreover, while the various embodiments are amenable to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the particular embodiments described. On the contrary, the embodiments are intended to cover all modifications, equivalents, and alternatives falling within the scope of the disclosed embodiments.

DETAILED DESCRIPTION

[0048] The foregoing is a summary, and thus, necessarily limited in detail. The above-mentioned aspects, as well as other aspects, features, and advantages of the present technology will now be described in connection with various embodiments. The inclusion of the following embodiments is not intended to limit the disclosure to these embodiments, but rather to enable any person skilled in the art to make and use the contemplated invention(s). Other embodiments may be utilized and modifications may be made without departing from the spirit or scope of the subject matter presented herein. Aspects of the disclosure, as described and illustrated herein, can be arranged, combined, modified, and designed in a variety of different formulations, all of which are explicitly contemplated and form part of this disclosure.

[0049] Various examples of the disclosed techniques will now be described in further detail. The following description provides specific details for a thorough understanding and enabling description of these examples. One skilled in the relevant art will understand, however, that the techniques discussed herein may be practiced without many of these details. Likewise, one skilled in the relevant art will also understand that the techniques can include many other obvious features not described in detail herein. Additionally, some well- known structures or functions may not be shown or described in detail below, so as to avoid unnecessarily obscuring the relevant description.

[0050] The terminology used below is to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of certain specific examples of the embodiments. Indeed, certain terms may even be emphasized below; however, any terminology intended to be interpreted in any restricted manner will be overtly and specifically defined as such in this section.

OVERVIEW

[0051] Various embodiments are presented for one or more wearable electronic activity monitoring devices which may be mounted on or integrated into, e.g., helmets, clothing, gear, vehicles, body portions, etc. These embodiments may be used in a variety of safety contexts in which user injury may occur, e.g., in outdoor sports, construction environments, military exercises, etc. In some embodiments, the activity monitoring device comprises motion and position sensors connected to a microcontroller. Some embodiments may include pressure sensors, water sensors, temperature sensors, chemical sensors, sonic sensors, electromagnetic and radiation sensors, etc. The microcontroller may perform all or a portion of a real-time analysis of motion and/or other sensor data provided by the monitoring sensors. In some embodiments a remote system or a mobile phone may perform at least a portion of the analysis and response. In some embodiments the microcontroller may operate in conjunction with a remote device (e.g., a personal phone, personal digital assistant, smart watch, or smart wrist/chest bands). In some embodiments, the microcontroller may relay the sensor data to the remote device, where the operations are performed. Based upon this analysis, the system, using machine learning methods, a predictive model analysis, and a decision tree, may detect an accident (physical trauma from an external source [e.g., hit by a rock, falling from a cliff, etc.]), collapse due to a medical condition (e.g., exhaustion, heart attack, etc.), or undesirable conditions for a given context (e.g., submersion while running, freefall while playing golf, etc.) during or following its occurrence. In particular, the system utilizes one or more models that represent the typical motion of individuals when performing certain activities (e.g., skiing, biking, running). The system continuously monitors the motion of a user using a complex motion vector to represent the user's state. When the monitored motion vector goes outside of the modeled "norm" for the activity in which the user is participating, the system triggers a secondary assessment to measure other characteristics that might reflect an accident or other medical emergency. A microcontroller that implements such a multi-stage analysis is discussed in further detail in conjunction with FIG. 11 and FIG. 12 below. As will be described in greater detail herein, one benefit of such a system is that it provides a more accurate assessment of potential emergency situations as compared to, for example, a system that relies only on the measurement of impact of a user's helmet.
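
For illustration only, the following Python sketch shows one way the first-stage comparison of a motion vector against an activity "norm" could be organized; the vector fields, activity envelopes, and thresholds are invented for this example and are not taken from the application.

    from dataclasses import dataclass

    @dataclass
    class BioVector:
        speed: float          # m/s, magnitude of the velocity estimate
        accel: float          # m/s^2, magnitude of linear acceleration
        pitch: float          # degrees, head/torso orientation
        altitude_rate: float  # m/s, vertical rate derived from the pressure sensor

    # Hypothetical "allowed" envelopes (min, max) per activity and per field.
    PATTERNS = {
        "skiing": {"speed": (0, 40), "accel": (0, 30), "pitch": (-60, 60), "altitude_rate": (-15, 5)},
        "biking": {"speed": (0, 20), "accel": (0, 20), "pitch": (-45, 45), "altitude_rate": (-5, 5)},
    }

    def is_anomalous(vector: BioVector, activity: str) -> bool:
        """True if any component leaves the allowed envelope for the activity."""
        for name, (low, high) in PATTERNS[activity].items():
            if not low <= getattr(vector, name) <= high:
                return True
        return False

    def assess(vector: BioVector, activity: str) -> str:
        """First-stage check; an anomaly only triggers a secondary assessment."""
        if is_anomalous(vector, activity):
            return "secondary_assessment"  # e.g., check biometrics, wait for motion to resume
        return "normal"

    sample = BioVector(speed=12.0, accel=55.0, pitch=80.0, altitude_rate=-20.0)
    print(assess(sample, "skiing"))  # -> "secondary_assessment"

In practice, the envelopes would come from the activity-specific biomechanical models and the per-user learning described elsewhere herein.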

[0052] In some embodiments, the system may communicate the location of the user to relevant external individuals (e.g., friends or an emergency service). In some embodiments, to guarantee reliable functionality in emergency situations, the device may implement a priority-driven power management technique based on two batteries and energy harvesting devices.

[0053] To maximize the chances of successful message delivery under all conditions, in some embodiments the device implements a multi-channel communication management algorithm based on cellphone network, satellite transmission, and sub-GHz radio transmission.

[0054] The device may also be capable of collecting biometric and environmental data from other external devices or other connected sensors, combining this external data with internally collected data, and generating a detailed picture of the user's condition to improve efficiency of rescue teams. Typical data collected and transmitted may include, but not be limited to, heart rate, sweating condition, physiological stress, blood pressure, body temperature, ambient temperature, and likely body position. For example, with this information the rescue team may anticipate the type of accident, prioritize the intervention, and prepare appropriate corresponding supplies. During the "golden hour" (i.e., the initial period following an accident when first responders may be most effective in saving a victim's life or mitigating long-term health effects), such information can be especially useful and may save one or more lives.

EXAMPLE SYSTEM IMPLEMENTATION OVERVIEW

[0055] Various embodiments are directed to improving safety in potentially hazardous environments (e.g., outdoor sports), for example using a wearable/portable electronic device mounted on or built into a helmet.

[0056] In some embodiments, the sensor device may be placed on or integrated into helmets used in sporting activities, including, but not limited to, cycling, mountain biking, skiing, snowboarding, motorcycling, and horseback riding.

[0057] FIG. 1 is a perspective view of one embodiment 100 of a helmet with an attachment as may be implemented in some embodiments. FIG. 1 shows a generic helmet 105 and the activity monitor device 110 mounted on it. For example, the device 110 may be mounted on a helmet 105 using bi-adhesive tape, cable plastic strips, compatible custom or industry standard attachments, a purposely pre-designed attachment on the helmet 105, or can be embedded in the helmet 105 itself, such that the device 110 and/or helmet 105 include a specific housing for the described hardware. The device 110 may also be similarly mounted on hats, head bands, backpack shoulder straps, be functionally stored in backpacks, or other gear. The device case is shown in the shape of a fin, as an example, but there is no limitation or restriction on the case shape and location on the helmet. For example, the device case may be substantially cylindrical, cuboidal, triangular, hexagonal, or any other shape that provides stability, aerodynamic qualities, telecommunication/data transmission qualities, and/or other useful or aesthetic qualities. In one non-limiting example, the fin may provide aerodynamic qualities facilitating its attachment to the helmet surface without interfering with the user's maneuverability.

[0058] FIG. 2 illustrates one embodiment 200 of an attachment 502 affixed to a helmet 504 and separate as may be implemented in some embodiments, though the sensor system may be integrated into the helmet or other clothing in some embodiments.

EXAMPLE SENSOR SYSTEM

[0059] FIG. 3 is a functional block diagram of a system 300 as may be implemented in some embodiments. Dynamic data generated by 9-axis motion sensors (3-axis accelerometer 33, 3-axis magnetometer 34, 3-axis gyroscope 35) and by an atmospheric pressure sensor 36 are supplied continuously to a microcontroller 37.

[0060] The microcontroller 37 may store the data coming from the sensors as well as processed data onto a solid-state non-volatile memory 311. The microcontroller 37 runs software that performs the following activities: a) continuous monitoring of the motion data from the sensors; b) data analysis and comparison to the specific activity biomechanical model, looking for anomalies; c) in case of an identified anomaly, initiation of a decision tree process to identify a potential accidental event; and d) in the case of an identified accident, selection of the optimal strategy to communicate the user location. These activities will be discussed in further detail below in conjunction with FIGS. 13-17. Communication methods may include, but not be limited to, Bluetooth Low Energy 38 to a smartphone or a cellular network communication module 310; satellite transmitter 39; or sub-GHz transmitter 312. Communication strategies will be discussed in further detail below in conjunction with FIG. 18. Power management, which prioritizes use of battery power to privilege safety and prolong battery life in emergency conditions, will be discussed in further detail below in conjunction with FIG. 12.

[0061] The accident detection algorithm may use motion data coming from sensors 33, 34, 35 and from the pressure sensor 36 to identify when an accident is occurring. Some embodiments will detect any vital sign that indicates an impairment that may inhibit a conscious request for help. For example, such impairments may result from crashes, collapses, heart attacks, excessive radiation, submersion, unexpected pressure drops, etc. In some embodiments, the accident detection algorithm may use data coming from integrated sensors 33, 34, 35, 36 or data from other devices communicatively coupled to the system, for example smartwatches, wristbands, chest heart rate monitors, smart rings, or other devices.

[0062] Because typical dynamics vary greatly depending on the activity and on the skill level of the user, the machine learning algorithm in some embodiments is capable of adjusting its parameters based on activity level and user skill level to provide accurate accident recognition while minimizing errors. Non-limiting examples of factors relied upon may include: demonstrated speed; tracked figures; successful measured performances like leaning and jumping; and curve trajectories. Such factors are used to change the parameters used in the numerical dynamic model to enhance the identification of the activity anomalies. The algorithm may also use machine-learning techniques to gradually adapt over time to the dynamics typical of each user individually as the user's skills improve or degrade with practice or inactivity, respectively.
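
As a hedged illustration of the parameter adaptation just described, the sketch below nudges an anomaly threshold toward a user's demonstrated peak values using an exponential moving average; the update rule, learning rate, and safety margin are assumptions, not the disclosed algorithm.

    def adapt_threshold(current: float, observed_peak: float,
                        alpha: float = 0.05, margin: float = 1.3) -> float:
        """Nudge an anomaly threshold toward (observed peak * safety margin)."""
        return (1 - alpha) * current + alpha * (observed_peak * margin)

    # Example: a skier who routinely reaches higher speeds slowly raises the
    # speed threshold used by the anomaly check, reducing false alarms.
    threshold = 25.0  # m/s, hypothetical initial model value
    for session_peak in [22.0, 27.0, 29.0, 31.0]:
        threshold = adapt_threshold(threshold, session_peak)
    print(round(threshold, 2))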

[0063] The solid-state non-volatile memory 311 may record the motion data for a configurable length of time. In case of an accident or other event, the memorized data can be made available to qualified rescue personnel to provide detailed information on how the events preceding and following the accident evolved.

[0064] Communication can be a key component of some embodiments. The device presented in some embodiments may be equipped with a sub-GHz transmitter 312 that has a communication range of approximately four to eight miles. In case of an accident, the device in some embodiments can automatically detect the emergency and send a distress signal, e.g., using the sub-GHz transmitter. Other similar devices in the area covered by the distress signal can automatically recognize the help request and alert a second user of an emergency nearby. Through the use of a second computing device communicatively coupled to the system, for example a smartphone, a second user who has received the distress message can locate the source of the call and provide help faster than any other entity who may be far away and/or may have difficult access to the area.

[0065] In some embodiments, the device may work in conjunction with a smartphone application. The application may have an open interface (e.g., an open software interface), be flexible, and may be able to access other external devices that the first user may be carrying and that may collect biometric data (e.g., heart rate, temperature, oxygen levels, motion, etc.). The application may combine data from the device described elsewhere herein as well as data from any other supported device, to generate a detailed portrait of the user condition in case of an accident in some embodiments. This information may be made available to qualified rescue personnel to improve rescue efficiency and immediate medical care.

EXAMPLE POWER MANAGEMENT SYSTEM

[0066] FIG. 4 is a functional block diagram of a power management system 415 as may be implemented in some embodiments. In some embodiments, the device can source power from two batteries 413, 414. The microcontroller 37 may enable one battery at a time based on a priority system. Battery 413 may be used for the normal motion monitoring operations and potentially to control other devices, such as cameras and other functionalities not critical to the safety features of the device. Battery 414 is used as an emergency battery and comes into play only when battery 413 is depleted and in case of an emergency; in that case, any functionality not strictly needed for the management of the emergency is deactivated. An algorithm running on a microcontroller 37 may monitor the status of each battery and may decide when to switch to battery 414 and turn off redundant activities. To guarantee functionality in case of emergency, the microcontroller 37 may also enable and control additional power sources in some embodiments, e.g., a kinetic energy harvester 416 and a photovoltaic system 417.

[0067] In some embodiments, the device is to be placed on or integrated into helmets used in various sports or other activities. A power management solution for emergency/rescue wearable devices may be provided in some embodiments.

[0068] Wearable devices that aim at improving safety for users may need to guarantee that they will have enough power to operate when they are needed and for the entire duration of the emergency. Thus, power management can be a critical factor affecting reliability of such devices.

[0069] Some embodiments implement a system based upon a multiple battery mechanism and an algorithm that assigns battery resources according to specific priorities. This may ensure that at least one battery will be available in case of an emergency. Some embodiments also adopt energy harvesting techniques (e.g., kinetic and photovoltaic) to exploit motion and light to prolong battery life.

[0070] Some embodiments include a minimum of two batteries. The first battery may be used for the normal motion monitoring operations, potentially to control other devices, such as cameras and communication modules, and/or as a backup battery. The first battery may be used during an emergency in some embodiments when a second battery is depleted. The second battery may be used only in case of an emergency. An algorithm running on a microcontroller may continuously monitor the energy level of each battery and, based on the amount of remaining charge, decide to disable certain non-essential functionalities to maximize battery life. Certain sensors, for example, may be temporarily disabled if not essential to a specific activity, sampling frequency may be reduced, and communication minimized.

[0071] In order to extend battery life, some embodiments employ supplemental energy harvesting techniques, e.g., solar and kinetic. The algorithm may optimally manage energy harvesting devices to maximize energy storage efficiency.
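
The following sketch is an illustrative reading of the priority scheme described above (a reserve battery held for emergencies, with non-essential loads shed when charge runs low); the charge threshold and the list of shed functions are assumptions made for this example.

    LOW_CHARGE = 0.15  # assumed fraction of capacity below which a battery is "depleted"

    def select_power_source(primary_charge: float, reserve_charge: float,
                            emergency: bool) -> tuple:
        """Pick the active battery and the functions to disable."""
        if emergency or primary_charge <= LOW_CHARGE:
            # Fall back to the reserve battery and shed everything not needed
            # to keep the distress communication alive.
            shed = ["camera", "activity_logging", "high_rate_sampling"]
            source = "reserve" if reserve_charge > LOW_CHARGE else "primary"
            return source, shed
        return "primary", []

    print(select_power_source(primary_charge=0.10, reserve_charge=0.90, emergency=False))
    # -> ('reserve', ['camera', 'activity_logging', 'high_rate_sampling'])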

[0072] Accordingly, some embodiments implement an algorithm for multi-battery management. Some embodiments employ a redundant battery architecture. Some embodiments employ multi-source energy feeding (e.g., wireless power, solar, kinetics).

[0073] Some embodiments implement in/out energy management gauging and reporting. Some embodiments implement a self-sustainable emergency communication device with unlimited operation. In some such embodiments, for example, the implemented combination of energy harvesting and advanced load management may allow the device to send unlimited emergency distress messages with virtually no discontinuation of the service within the lifetime and the operating specifications of the components.

EXAMPLE SYSTEM OPERATION

[0074] FIG. 5 is a block diagram illustrating operation of the detection system 500 at a high level as may be implemented in some embodiments. As depicted, sensors 502 in a head-mounted device 504 or third-party device may collect data 506a, 506b, 506c (e.g., gyroscope, compass, acceleration, etc. data) about one or more activities, for example an accident 512, of a user. In some embodiments, in the event of an accident or an activity, the data is transmitted 511, e.g., via a Bluetooth connection to a mobile phone 508, to contact a second user 513, for example a first responder. The mobile phone 508 may supplement the data with GPS coordinates and other context information, for example.

[0075] FIG. 6 is a block diagram illustrating operation of the detection system 600 using alternative coverage at a high level as may be implemented in some embodiments. As depicted, sensors 602 on a head-mounted device 604 may collect data 606a, 606b, 606c (e.g., gyroscope, compass, acceleration, etc. data) about one or more activities, for example an accident 612, of a user. In some embodiments, in the event of an accident or an activity, the data is transmitted, e.g., via satellite 608, to contact a second user 613, for example a first responder. In some embodiments, the head-mounted system may include a GPS (or other position) receiver for location coordinates calculation.

[0076] FIG. 9 is a flow diagram illustrating a process 900 for detecting and responding to an event as may be implemented in some embodiments. At block 905, a user may select the type or character of the activity in which they may engage (e.g., swimming, hiking, biking, etc.). The user may make the selection via, for example, a graphical user interface (GUI) on a mobile phone, a switch on the mounted device, a Bluetooth connection, etc. At block 910, the system may identify the relevant models, algorithms, and sensors for a given selection. For example, a gyroscope may be used when biking, while a temperature sensor and a submersion detection sensor may be used for water rafting.
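
A minimal sketch of the profile lookup implied by blocks 905 and 910 is shown below; the activity names, sensor lists, and model identifiers are hypothetical placeholders rather than values from the application.

    # Hypothetical profile table: activity -> sensors and model to enable.
    ACTIVITY_PROFILES = {
        "biking": {
            "sensors": ["accelerometer", "gyroscope", "magnetometer", "barometer"],
            "model": "biking_biomechanical_model",
        },
        "water_rafting": {
            "sensors": ["accelerometer", "temperature", "submersion"],
            "model": "rafting_biomechanical_model",
        },
    }

    def configure_for_activity(activity: str) -> dict:
        """Return the sensor set and model to load for the selected activity."""
        profile = ACTIVITY_PROFILES.get(activity)
        if profile is None:
            raise ValueError(f"no profile defined for activity: {activity}")
        return profile

    print(configure_for_activity("biking")["sensors"])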

[0077] At block 915, the system may supplement the time series of sensor data with new data. One will recognize that different sensors may update asynchronously. The system may analyze the data at block 920 to determine if an event has occurred. A frequency analysis, a Principal Component Analysis, a threshold model-based analysis, etc. may be used, alone or in combination, to identify an event based on the context. For example, a skier's jumps can easily be misread as an indicator that something anomalous happened to the user, but different analysis methods can separate intentional jumping from a loss-of-control flight during the activity. If an event has been identified, at block 925, the system may prepare a characterization for a second user, for example a first responder. For example, for physical trauma, the system may determine which body parts are most likely injured. For a drowning incident, the system may record GPS coordinates, a depth, the time elapsed since the last submersion, etc. Accident identification will be discussed in further detail below in conjunction with FIGS. 14-16.
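
For illustration, the toy classifier below separates a jump (near-freefall followed by a landing spike and resumed motion) from a possible accident (a spike followed by stillness) using simple thresholds over a window of acceleration magnitudes; all thresholds and the window handling are assumptions and stand in for the analysis methods named above.

    def classify_window(accel_g: list) -> str:
        """Toy classification of a short window of acceleration magnitudes (in g)."""
        freefall = any(a < 0.3 for a in accel_g)   # near-zero g while airborne
        impact = any(a > 4.0 for a in accel_g)     # hard landing or collision
        # "Stillness": the last quarter of the window hovers near a constant value,
        # suggesting the user has stopped moving after the spike.
        tail = accel_g[-max(1, len(accel_g) // 4):]
        still = max(tail) - min(tail) < 0.2
        if freefall and impact and not still:
            return "jump"                # airborne, landed, and kept moving
        if impact and still:
            return "possible_accident"   # spike followed by no further motion
        return "normal"

    print(classify_window([1.0, 0.1, 0.2, 5.0, 1.4, 0.8, 1.2, 0.9]))  # -> "jump"
    print(classify_window([1.0, 6.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]))  # -> "possible_accident"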

[0078] If the user has not indicated that the event was a false alarm at block 925 (e.g., the system may emit sound indicia at block 925), the system may begin a notification process. In some embodiments, the user may indicate a false alarm using a GUI on a mobile phone.

[0079] At block 935, the system may determine if it can access the mobile phone's network and, if so, will communicate the event details (or an update to a previous submission) to a remote system at block 940 to precipitate a first responder response. Otherwise, if satellite access is available at block 945, the system may attempt to transmit the information via satellite. If peers are available at block 955, they may likewise be notified of the event at block 960. For example, if the victim is an incapacitated biker, the system may contact the mobile phones and/or devices of other bikers in the area. Sometimes the system can use other devices as a network extender to bounce the request for help through the cellular phone or its satellite communication. This would be particularly useful in case the device is not capable of determining a position or geographic location to communicate directly with the network of choice.

[0080] At block 965, the system may take actions anticipating responder arrival (e.g., manage battery power, update the event characterization with new sensor data, etc.). A black box update may be prepared so that first responders will have access to the most recent data upon arrival if the device is unable to transmit the data wirelessly. The process may continue until the user or first responder cancels the request.
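
The channel fallback order of blocks 935-960 might be expressed as in the sketch below; the availability flags and return labels are hypothetical stand-ins for the actual checks and transmitters.

    def send_distress(message: dict, phone_available: bool,
                      satellite_available: bool, peers_in_range: list) -> str:
        """Try each channel in priority order; return the channel used."""
        if phone_available:
            return "cellular"                  # relay through the paired smartphone
        if satellite_available:
            return "satellite"                 # direct satellite transmitter
        if peers_in_range:
            return "sub_ghz_peer:" + peers_in_range[0]   # bounce via a nearby device
        return "store_for_black_box"           # keep locally for responders on arrival

    print(send_distress({"lat": 46.1, "lon": 7.2}, False, False, ["device_42"]))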

EXAMPLE FEATURE

[0081] Sport-specific activity monitor, accident detection, and safety algorithm using dynamic data collected by motion sensor(s) integrated in or mounted on a helmet.

[0082] In some embodiments, the device may be attached to or integrated into helmets used in sport activities. A sport-specific accident detection and safety algorithm using dynamic data collected by one or more motion sensors integrated in or mounted on a helmet may be used.

[0083] Sport activities often bear a certain level of risk and accidents can happen to anyone. It is of extreme importance to be able to identify when an accident has happened, particularly if the user is practicing the sport alone or out of sight and if the user is unconscious as a result of a fall.

[0084] Some embodiments contemplate a device that includes motion sensors to monitor the user activity and motion dynamics and a microcontroller that runs an algorithm capable of analyzing the data provided by the sensors and detecting an accident when one happens.

[0085] In some implementations, the device is placed on or integrated into helmets used in sport activities, including, but not limited to, cycling, mountain biking, skiing, snowboarding, motorcycling, and horseback riding.

[0086] The device may include a microcontroller, non-volatile storage memory and the following sensors: 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, and pressure sensor. The microcontroller may run firmware that continuously monitors the values provided by the sensors. The firmware may also include an algorithm that interprets the data provided by the sensors and is capable of recognizing when an accident is occurring.

[0087] The dynamics may greatly vary depending on the type of activity. Accordingly, the user may specify what type of activity he/she is engaging in and the algorithm will adjust its parameters according to one or more pre-defined models. Over time, the algorithm may tune its internal parameters based on the typical motion patterns of each individual user in order to optimize accident detection capabilities and minimize the chance of false alarms.

[0088] Thus, some embodiments apply specific biomechanical models to the activity dynamics. A machine learning algorithm may be applied for personalized tuning of parameters. In some embodiments, predictive models for event anticipation and consistency check may be applied. In some embodiments, data analytics for safety and emergency help requests may be applied. Selectable profiles for specific activities and dynamics may be provided in some embodiments. Real-time biometric monitoring for rescue intervention, prioritization, and efficiency may also be provided in some embodiments.

EXAMPLE OPERATIONAL SYSTEM COMPONENTS

[0089] FIG. 7 is a block diagram illustrating various components in an accident detection and response system 700 as may be implemented in some embodiments. For example, there may be seamless integration between one or more sensors 702 coupled to a wearable garment (e.g., helmet); one or more communication systems (e.g., low energy Bluetooth, satellite, sub-GHz transmitter); and embedded software for analyzing data from the one or more sensors, detecting one or more activities of a user, and/or transmitting the data to a second user. Further for example, one or more computing devices 704 of the system may display or present a GUI with various functionality. For example, the GUI may include one or more status icons; a navigation icon; a concierge icon; a checking-in icon; one or more locking switches; an emergency contacts icon; a history icon for viewing an activity history; one or more social networking icons for sharing activity information with friends, family, or acquaintances; etc. Still further for example, the system may include one or more additional computing devices 706 configured to present or display additional information which, in some embodiments, may be stored in a personal profile. The personal profile may include emergency data storage, activity data analytics (such as, but not limited to, speed, acceleration, direction, route, lean angle, jumping angle, time, altitude, force in action, 3D model avatar), adventure planning capabilities, and general information about the user (e.g., likes, dislikes, age, educational background, residence location, relationship status, etc.).

EXAMPLE TARGET FUNCTIONALITIES

[0090] FIG. 8 is a Venn diagram 800 illustrating various target functionalities met by certain embodiments. The embodiments 808 described here lie at the interface between sport consumer electronics 802, safety demand 804, and wearable Internet of Things (IoT) technology 806.

EXAMPLE FEATURE 1

[0091] Multiple sensors architecture and consistency check algorithm of biometric data for unconscious 2-way communication and efficient rescue/emergency intervention.

[0092] In some embodiments, the device is to be placed on or integrated into helmets used in sport activities. In some embodiments, a multi-sensor architecture and consistency check algorithm of biometric data for unconscious 2-way communication and efficient rescue/emergency intervention are implemented.

[0093] Rescue teams may have a preference for 2-way communication with accident victims. The rescue team may need to assess the situation and the conditions of the user, prioritize their intervention, and make sure they are equipped for best efficiency.

[0094] Sensor feedback and monitoring of the user's conditions during the period immediately following an accident can be vital for the successful outcome of the rescue operation. Some embodiments seek to consolidate and provide this information in the time following an accident and preceding the arrival of help. The rescue personnel can act on the information before leaving, en route, on site, and even following retrieval.

[0095] Some people engaging in sporting activities often carry multiple devices that help them monitor their performance and track their biometric data. These devices can potentially offer a wealth of information about the user health condition, but are typically used in isolation. The respective data on each device is not made available to an external or consolidated monitoring system.

[0096] Some embodiments anticipate a user carrying multiple wearable devices capable of collecting different types of data. The data from the tracking devices may be assembled into a sophisticated model that creates a holistic picture of the user's condition. The model may be open and may become more accurate and richer as more information is made available through third-party devices. Such information from third-party devices may be transmitted to and/or processed by one or more computing devices, as described elsewhere herein.

[0097] Accordingly, some embodiments contemplate using multiple sensor data collection and analysis. Some embodiments implement consistency check and reporting and provide a unique intelligent analysis and highlight system. Some embodiments implement encrypted communication and may provide mobile or web access to information through a unique key. Some embodiments provide "black box recording," for example pre- and post-accident event comparison.

EXAMPLE FEATURE 2

[0098] Multi-channel communication strategy for rescue alert.

[0099] In some embodiments, a multi-channel communication strategy for rescue alert may be used. For example, a rescue team may request communication with the user following an accident. This may be difficult in remote locations and challenging geographic environments. However, timely distress communication can be critical to maximizing the chances of survival and minimizing damage to the victim.

[00100] Some embodiments combine multiple strategies to optimize the distress communication and to maximize the rescue success rate. On a first level, when cellular coverage is available, a smartphone accompanying the victim may be used to deliver the message and/or data to first responders (communicating, e.g., via Bluetooth). If a smartphone is not available, the system may revert to satellite communication. In case particular topographical conditions prevent satellite accessibility, some embodiments use sub-GHz antennas to establish communication with similar devices that are located within a 50 mile radius of the user. Finally, if these techniques are not available, some embodiments implement a check-in mechanism in which a server will send out an emergency alert automatically if a check-in signal from the user is not received within a predefined amount of time.
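
The last-resort check-in mechanism could be as simple as the server-side timeout sketched below; the 30-minute window is an assumed example of the "predefined amount of time."

    import time

    CHECKIN_WINDOW_S = 30 * 60   # assumed 30-minute check-in window

    def needs_alert(last_checkin_epoch, now=None):
        """True if the user has been silent for longer than the allowed window."""
        now = time.time() if now is None else now
        return (now - last_checkin_epoch) > CHECKIN_WINDOW_S

    # Example: last check-in 45 minutes ago -> the server would raise an alert.
    print(needs_alert(time.time() - 45 * 60))   # -> True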

[00101] Thus, some embodiments provide a communication strategy optimized to maximize success rate based on local conditions. Some embodiments provide a power efficient strategy to maximize battery life. Some embodiments bounce a signal off of neighboring devices to bridge to the closest receiver.

EXAMPLE FEATURE 3

[00102] Sub-GHz communication techniques for distress message delivery in case of an accident.

[00103] Timely communication in case of an accident can be critical to maximize survival rate and minimize damage. Often people around the victim of an accident can provide invaluable help until professional rescue teams arrive (e.g., during the "golden hour"). These individuals represent a virtual community immediately available for first assistance. For example, other skiers can be the very first responders to an injured skier, other bikers for a knocked-out biker on the curbside, companions and teammates of a climbing adventurer, etc.

[00104] Some embodiments equip devices with sub-GHz capabilities that have an approximately 50-mile range of communication. In case of an accident, multiple users wearing the devices can automatically detect the emergency and send a distress signal, using a sub-GHz transmitter, between one another. Other devices in the area covered by the distress signal can automatically recognize the help request and alert the user of an emergency nearby. Through the use of a smartphone (if available), the user who has received the distress message can locate the source of the emergency message and provide help faster than any other entity that may be farther away and may have difficult access to the area.

[00105] Accordingly, some embodiments provide a peer-to-peer direct communication protocol. Some embodiments provide an in-the-range detection algorithm. Some embodiments provide multiple-variable nodes architecture and network management (e.g., as part of an ad hoc network).

EXAMPLE FEATURE 4

[00106] Recording and analysis of motion data for fast and accurate assessment of trauma and optimal rescue intervention.

[00107] Often, in case of a traumatic event, rescue teams can only guess what happened and what kind of dynamics and forces the user was subject to. Hence treatment is based on best practices and a conservative approach.

[00108] By monitoring and recording the 9-axis motion of the user, some embodiments can quickly provide the rescuers with a detailed picture of how the accidental event evolved, how the user moved, what forces were applied to the user's head or body, and what happened in the minutes following the trauma. This information can provide rescuers with an amount of objective information that can prove vital in the quick detection of critical traumas.

[00109] In some embodiments, a model can be installed on a portable device in the hands of the rescue team that can access the data from the wearable device on, for example, the helmet. The rescuer's device may graphically reproduce the traumatic event for a quick assessment of the damage (e.g., indicating which portion of the victim's anatomy was affected).

[00110] Some embodiments implement a dynamic numerical model for the specific discipline practiced (e.g., skiing, swimming, etc.), as described in further detail elsewhere herein. System models may be able to precisely quantify the entity and quality of traumas. Entity is defined by the direction of the forces applied during the accident or other information available to the system. For example, the heart rate monitor can clearly indicate that the primary reason for the accident is a heart attack; the absence of such an alteration may imply a mechanical cause of trauma. Quality is defined by the magnitude of the forces registered during the event or other information available. For example, a recorded force of 60 g may identify a very serious impact, a low body temperature may identify a dramatic developing hypothermia scenario, or a high or irregular heart rate may quantify the level of physical distress of the user.
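
As a hedged illustration of how recorded forces and biometrics might be mapped to the kind of assessment described above, the sketch below applies coarse thresholds; every threshold and label is an assumption, with only the 60 g example taken from the text.

    def assess_trauma(peak_force_g: float, heart_rate_bpm: float, body_temp_c: float) -> list:
        """Map recorded peaks and biometrics to coarse, illustrative findings."""
        findings = []
        if heart_rate_bpm == 0 or heart_rate_bpm > 180:
            findings.append("possible cardiac event")
        if peak_force_g >= 60:
            findings.append("very serious impact")
        elif peak_force_g >= 20:
            findings.append("significant impact")
        if body_temp_c < 35.0:
            findings.append("developing hypothermia")
        return findings or ["no acute finding from recorded data"]

    print(assess_trauma(peak_force_g=62, heart_rate_bpm=150, body_temp_c=34.2))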

[00111] Some embodiments implement a black-box approach for sport accidents and related injuries.

COMPUTER SYSTEM

[00112] FIG. 10 is a block diagram of a computer system 1000 as may be used to implement features, e.g., navigation, object recognition, preprogrammed behavior, of some of the embodiments. The computing system 1000 may include one or more central processing units ("processors") 1005, memory 1010, input/output devices 1025 (e.g., keyboard and pointing devices, display devices), storage devices 1020 (e.g., disk drives), and network adapters 1030 (e.g., network interfaces) that are connected to an interconnect 1015. The interconnect 1015 is illustrated as an abstraction that represents any one or more separate physical buses, point-to-point connections, or both connected by appropriate bridges, adapters, or controllers. The interconnect 1015, therefore, may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called "Firewire".

[00113] The memory 1010 and storage devices 1020 are computer-readable storage media that may store instructions that implement at least portions of the various embodiments. In addition, the data structures and message structures may be stored or transmitted via a data transmission medium, e.g., a signal on a communications link. Various communications links may be used, e.g., the Internet, a local area network, a wide area network, or a point-to-point dial-up connection. Thus, computer-readable media can include computer-readable storage media (e.g., "non-transitory" media) and computer-readable transmission media.

[00114] The instructions stored in memory 1010 can be implemented as software and/or firmware to program the processor(s) 1005 to carry out actions described above. In some embodiments, such software or firmware may be initially provided to the processing system 1000 by downloading it from a remote system through the computing system 1000 (e.g., via network adapter 1030).

[00115] The various embodiments introduced herein can be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, or entirely in special-purpose hardwired (non-programmable) circuitry, or in a combination of such forms. Special-purpose hardwired circuitry may be in the form of, for example, one or more ASICs, PLDs, FPGAs, etc.

ALTERNATIVE EMBODIMENTS

[00116] FIG. 11 is a block diagram of a system as may be implemented in some embodiments. Motion sensors, such as, but not limited to, a 3-axis accelerometer 1, a 3-axis gyroscope 2 and a 3-axis magnetometer 3 transmit dynamic data to a processing unit or microcontroller 8. Additional sensors may also be present in the system, including environmental sensors (e.g., a barometer 4, humidity sensor, or temperature sensor 5) and biometric sensors (e.g., a body temperature sensor 5, heart rate sensor 6, or other sensor(s) 7, etc.). These additional sensors, when present, also transmit the measured data to the processing unit or microcontroller 8 for further analysis and action.

[00117] The processing unit 8 runs firmware that performs the following tasks:

[00118] Reads, at a predetermined and/or configurable sample rate, sensor data. The data may be received or read from the sensors at a specified sample rate, and the sensor values can either be analog or digital values. If necessary, sensor data may be filtered to remove the effect of noise in the sensor signals;

[00119] Stores the filtered sensor data into a non-volatile memory 9 such as, but not limited to, a flash memory or a micro SD card;

[00120] Runs an accident detection algorithm (FIG. 14) that, based on the filtered sensor data, recognizes when a potentially dangerous accident occurs to the user;

[00121] Decides the optimal communication strategy (FIG. 18) to adopt when an accident has been identified, to inform a selected list of contacts or rescue personnel of the user's location and physical condition as inferred from sensor measurements;

[00122] Adopts the most appropriate power management strategy (FIG. 12) to ensure that enough power is available to sustain the system activity in case of an emergency;

[00123] Transmits data, including raw sensor data and processed data, to an external device, such as a smartphone or tablet, utilizing a Bluetooth low energy (BLE) radio module 10 or any other suitable communication module, such as a sub-GHz radio module 11 and/or satellite transmitter 12 or other device; and

[00124] Manages additional input/output components, such as a camera 13, a microphone 14, and LEDs 16.

[00125] Motion, environmental, and biometric sensors 1-7 are connected to the processing unit 8, as depicted in FIG. 11. Data coming from sensors is typically affected by noise due to intrinsic sensor drift, calibration errors, and interference effects. As a result, the processing unit may adopt filtering procedures to remove undesirable noise from the sensor readings. For example, the system may apply Adaptive, Multirate, Infinite Impulse Response (IIR), and Finite Impulse Response (FIR) filters to the received sensor data. In the event that the sensors' output is analog, the processing unit performs an analog-to-digital conversion operation. To perform such conversion, the processing unit 8 may include or access an analog-to-digital converter (ADC).
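
As a hedged illustration of the filtering step described above (and not an implementation disclosed in this application), the sketch below applies a simple first-order IIR low-pass filter to a stream of accelerometer samples. The coefficient alpha, the sample values, and the function name are assumptions chosen only for illustration.

```python
def iir_low_pass(samples, alpha=0.2):
    """First-order IIR low-pass: y[n] = alpha * x[n] + (1 - alpha) * y[n-1].

    `alpha` (0 < alpha <= 1) trades noise rejection against responsiveness;
    the value 0.2 here is purely illustrative.
    """
    filtered = []
    prev = samples[0] if samples else 0.0
    for x in samples:
        prev = alpha * x + (1.0 - alpha) * prev
        filtered.append(prev)
    return filtered

# Example: smooth a noisy vertical-acceleration trace (values in m/s^2).
noisy = [9.7, 9.9, 9.6, 15.2, 9.8, 9.7, 9.9]
print(iir_low_pass(noisy))
```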

[00126] The data stored in the non-volatile memory 9 provides a log of selected metrics concerning the activity of the user. The identity and collection frequency of the metrics tracked by the system are configurable, and can include linear acceleration and angular velocity of the user at different time intervals. The non-volatile memory 9 may be managed as a circular buffer in which, when the total capacity is reached, the system overwrites the oldest data with new incoming data. The memory is sized, however, in a fashion to ensure that a sufficient time span of data for accident assessment is stored at all times. For example, the system may always store 30 minutes of data representing the user's most recent motion. It will be appreciated that the system may store data representing a greater or lesser amount of time, depending on memory size and the frequency and amount of data being stored.
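
The circular-buffer management of the non-volatile memory 9 can be sketched as follows. This is a minimal illustration under assumed figures (a 50 Hz sample rate and a 30-minute window); the class and method names are hypothetical, and an actual firmware would write to flash or a micro SD card rather than to RAM.

```python
from collections import deque

SAMPLE_RATE_HZ = 50          # illustrative sensor sampling rate
WINDOW_SECONDS = 30 * 60     # e.g., keep the most recent 30 minutes of motion data

class MotionLog:
    """Fixed-capacity log: once full, the oldest sample is overwritten."""
    def __init__(self, capacity=SAMPLE_RATE_HZ * WINDOW_SECONDS):
        self._buffer = deque(maxlen=capacity)

    def append(self, sample):
        self._buffer.append(sample)   # deque drops the oldest entry when full

    def recent(self, seconds, rate_hz=SAMPLE_RATE_HZ):
        """Return the last `seconds` worth of samples for accident assessment."""
        n = int(seconds * rate_hz)
        return list(self._buffer)[-n:]

log = MotionLog()
log.append({"t": 0.0, "accel": (0.1, 0.0, 9.8), "gyro": (0.0, 0.0, 0.0)})
print(len(log.recent(seconds=1)))
```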

[00127] Data stored in the non-volatile memory can be transmitted to an external device (e.g., a smartphone or tablet) via the BLE module 10 or via any other available wireless or wired channel. The monitored data is used to create a model of the user activity, including information such as lean angles, trajectory, velocity, etc. of the user. The stored data can also be used, in the case of an emergency, by rescue teams, providing them with a deeper understanding of the dynamics leading to the accident and guiding emergency responders towards the adoption of the most appropriate intervention measures.

[00128] The processing unit 8 and the firmware running on it identify a potentially dangerous accident involving the user when it occurs. Accident detection is achieved by continuously monitoring data from the sensors and by using the data to create a biomechanical model of the user engaged in his/her activity. The biomechanical model represents the motion of the user while engaged in the activity. The system uses the biomechanical model to determine when the behavior of the user diverges from the expected behaviors typical of the associated activity. That is, the accident detection algorithm executed by the processing unit 8 is tailored to different activities (e.g., skiing, snowboarding, road cycling, mountain biking, etc.) because each activity is characterized by peculiar dynamics and behaviors that do not necessarily apply to others. The identification of the activity being performed can either be provided explicitly by the user or automatically detected by the system itself, through its activity recognition capabilities. Once the specific activity has been identified (either manually or automatically), the algorithm compares parameters from the biomechanical model with expected parameter values for the activity. A divergence of the model parameter values from the expected parameter values for the activity can suggest a potential accident. When the real-time model parameters indicate that an accident may be occurring, a subsequent analysis stage is activated to closely monitor all sensor data to determine if an accident has in fact happened.

[00129] By monitoring user motion and implementing a multi-stage analysis, the system can detect not only an impact indicated by a sudden stop but other emergencies not including an impact. For example, the system would also detect if a user had a heart attack while motorcycling and slowly coasted to a stop and then passed out at the side of the road.

[00130] If the subsequent stage analysis indicates the likelihood of an accident, the system alerts the user through a visual and audible signal as well as via a message alert on an external device (e.g., a smartphone). The user is given a pre-determined amount of time to confirm that no injury has been sustained and cancel the help request. If no action is taken by the user within the pre-determined time interval, the system determines the optimal strategy to communicate a help request to a list of selected users or emergency responders. The communication strategy is based on the prioritization of the available resources. For example, if the system contains a sub-GHz radio module 11, the radio module is triggered first because it could potentially reach other users that are closest to the victim and hence in a position to provide immediate assistance. If the radio module is not present or if the communication fails using the module, the system attempts to connect to an available smartphone to determine whether there is service coverage. If service coverage exists, the smartphone is used by the system to send the help request to pre-selected contacts and/or emergency responders. If no service coverage is present or no smartphone is available (e.g., the smartphone's battery is drained, the smartphone is damaged), the system attempts to use a satellite transmitter module 12 to send the help request. If the satellite transmitter 12 is available, then the system sends the help request through satellite communication. If neither smartphone nor satellite transmitter is available, the system will continue to repeat a polling cycle of the communication channels at variable or fixed time intervals, to assess whether any of the channels become available. Once a channel becomes available, the help request is sent.
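
A minimal sketch of this prioritized communication strategy is shown below. The channel objects and their available()/send() methods are hypothetical stand-ins for the sub-GHz radio module 11, a paired smartphone, and the satellite transmitter 12; the polling interval is an illustrative assumption.

```python
import time

class StubChannel:
    """Hypothetical communication channel used only for this illustration."""
    def __init__(self, name, up):
        self.name, self.up = name, up
    def available(self):
        return self.up
    def send(self, message):
        print(f"{self.name}: {message}")
        return True

def send_help_request(message, channels, poll_interval_s=60):
    """Try the channels in priority order; if none succeeds, keep polling
    at fixed intervals until one becomes available (as described above)."""
    while True:
        for channel in channels:
            if channel.available() and channel.send(message):
                return channel                      # help request delivered
        time.sleep(poll_interval_s)                 # re-poll until a channel is up

# Priority order: sub-GHz radio, then smartphone, then satellite transmitter.
channels = [StubChannel("sub-GHz radio", up=False),
            StubChannel("smartphone", up=True),
            StubChannel("satellite", up=True)]
print(send_help_request("HELP: user down, coordinates attached", channels).name)
```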

[00131] The help request sent by the system may contain a variety of information that is helpful to first responders. Importantly, the help request typically includes the GPS coordinates of the user to aid first responders in locating the user. Additionally, if biometric or environmental sensor data is available, certain sensor data can also be sent, as the data can help emergency responders form a clearer picture of the accident scene and the victim's physical condition. If a camera and/or a microphone are available, data coming from these components can be saved in the storage module and conveyed in the help request to provide additional detail that complements the overall description of the accident and the victim to rescue personnel. The help request may be a text message, an email message, a proprietary-format message, or other communication that conveys the necessary information to those who can assist the user. Alternatively or in addition to transmitting the help request, the communication module and/or I/O module can emit signals (e.g., light via LEDs 16; radio signals via BLE module 10 and/or radio module 11) to act as a beacon for emergency responders.
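
As one hypothetical example of the kind of content such a help request might carry (the field names, the example values, and the JSON encoding are assumptions introduced for illustration, not a format disclosed here):

```python
import json

# Illustrative help-request payload mirroring the information listed above.
help_request = {
    "type": "help_request",
    "gps": {"lat": 46.0207, "lon": 7.7491, "altitude_m": 3100},
    "biometrics": {"heart_rate_bpm": 52, "body_temp_c": 35.1},
    "environment": {"air_temp_c": -6.0, "pressure_hpa": 700.2},
    "attachments": ["camera_clip_001.mp4", "mic_clip_001.wav"],
}
print(json.dumps(help_request, indent=2))
```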

[00132] In order to ensure proper operation of the system in case of emergency, the system implements a rigorous power management policy. As shown in FIG. 12, the system uses a dual battery architecture in which one of the batteries 19 is the main power source and is used for all the regular operations (e.g., activity monitoring, data transmission, communication, powering of a camera if available). A second battery 20 is a backup power source and is reserved for emergency situations. The system regularly monitors the charge level of the main battery using monitoring circuits 17, 18 and, as the charge decreases, implements power saving strategies to maximize battery life. Some non-vital functionalities may be disabled or reduced when the main battery charge level decreases beyond a pre-set threshold. If an accident has been recognized, the system may disable all non-critical functions to guarantee that enough power is available to sustain monitoring and communication capabilities until assistance is provided to the user. In such embodiments, motion sensing features may remain operational, although at a reduced frequency of sensing, to detect activity of the victim. Communication, according to the strategy described in additional detail in FIG. 18, is activated at regular time intervals to maximize the likelihood of the help request reaching its destination. To extend battery life, other power sources may be included, such as photovoltaic cells 22 and kinetic energy harvesters 23. Energy derived from light or motion may be used by a harvester 25 to recharge batteries 19 and 20. In addition, a micro USB port is provided to allow a user to directly charge batteries 19 and 20 when an external power source is available.
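
A hedged sketch of such a threshold-based power policy is shown below; the thresholds, feature names, and return convention are illustrative assumptions rather than values taken from the embodiment.

```python
def power_policy(main_charge, accident_detected,
                 low_threshold=0.30, critical_threshold=0.10):
    """Return the set of features to keep enabled for a given main-battery
    charge level (0.0 to 1.0). Thresholds and feature names are illustrative."""
    if accident_detected:
        # Emergency mode: keep only what is needed to monitor the victim and
        # communicate; motion sensing continues at a reduced rate.
        return {"motion_sensing_low_rate", "communication"}
    enabled = {"motion_sensing", "logging", "communication", "camera", "leds"}
    if main_charge < low_threshold:
        enabled -= {"camera", "leds"}      # shed non-vital features first
    if main_charge < critical_threshold:
        enabled -= {"logging"}             # keep only sensing and communication
    return enabled

print(power_policy(main_charge=0.25, accident_detected=False))
```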

[00133] During normal operation the system does not communicate with external devices unless an accident is detected. At any time, however, the system can be reached and controlled by authorized external devices, such as smartphones, smartwatches or tablets, using a Bluetooth or other near-field communication (NFC) connection. The external devices can instruct the system to transmit raw data from the sensors as well as other processed information, such as quaternions or Euler angles, that can be used by the external smart devices for advanced modeling and activity analysis. For example, the data may be accessed for subsequent performance analysis of the activity or to generate 3D visualizations of the activity that can be shared with friends and family. Through a similar mechanism the content of the non-volatile memory 9 can be transferred to other devices. For example, rescue teams can access the system and use information stored in the memory 9 as a black-box recording, providing detailed information of the dynamics before, during and after the accident.

[00134] The system, and more specifically, the processing unit 8, executes the processes described in FIG. 13 - FIG. 18 to identify an accident and to verify if, as a consequence of the accident, the user/wearer of the system is unable to ask for help. An automatic request for help is triggered by the system if an accident is detected and the user does not disable or otherwise prevent transmission of the help request within a certain period of time. One advantage of the disclosed accident detection process is that recognizing an accident does not necessarily require a physical impact to the helmet or an extreme angular velocity measured outside of specific thresholds. The disclosed system is thereby able to identify accidents such as, for example, falls resulting from a heart attack, heat stroke, or another event not involving a head impact.

[00135] The accident detection algorithm implemented by the system encompasses three specific stages: (1) activity identification, (2) activity monitoring including anomalous/potential accident event identification, and (3) assessment of user condition and accident confirmation. Each stage will be described in additional detail herein.

[00136] FIG. 13 is a flow diagram illustrating a process implemented by the system for recognizing an activity, the first stage of the accident detection algorithm. Before the beginning of an activity, at decision block 49 the user has the option to specify the type of activity he/she is going to engage in (e.g., motorcycling, road biking, skiing, etc.). A user can specify the activity through a software application running on an authorized device that communicates with the system, such as a smartphone or tablet. For example, the user may be presented with a drop down menu or radio buttons associated with potential activities, from which an activity is selected. Alternatively, the activity monitor device 105 may have physical buttons or offer other input functionality to allow the user to select the activity. The activity selection is particularly recommended for new users, until the system has the opportunity to monitor the user and derive an understanding of the typical activity profile of the user.

[00137] If the user chooses not to explicitly input the activity type at decision block 49, then the system attempts to automatically identify the activity type by analyzing a set of parameters in blocks 50-53. The parameters analyzed by the system may include, but are not limited to, location (e.g., altitude, latitude, longitude), physical constraints (e.g., on the road, off road), terrain (e.g., ski resort, park, beach, desert), speed (e.g., ski typical values, biking typical values, motorcycling typical values), acceleration patterns (e.g., gravity acceleration on a slope, motor accelerations, human-propelled accelerations), trajectory patterns (e.g., curve radius, lean angle, point of sensing height), relative spatial position and movements, and biomechanical patterns (compared with historical values registered by the user). The analysis of one or more of these parameters enables the system to determine the nature of the activity that the user is performing.

[00138] The adoption of the correct activity biomechanical model is important for accurate accident identification. The activity biomechanical model is a simplified representation of the human body while performing the corresponding activity. An example of an activity biomechanical model used by the system is an Inverted Pendulum model, complemented with additional information regarding, for example, the position of the sensor (e.g., on the back of a skier or on the helmet of a biker) and user biometrics (height, weight, age). Once the activity has been recognized by the system, some key parameters and assumptions drive the creation of the biomechanical model. Some examples include the degrees of freedom of the Inverted Pendulum, the distribution of masses, typical or expected position, center of mass position, points of contact with the ground, friction/viscosity of the ground, etc.

[00139] At a block 50, the system reads motion sensor data (e.g., accelerometer, gyroscope, magnetometer) and calculates certain derived values (e.g., user orientation, position, linear velocity) from the motion sensor data. At a block 51, the system obtains GPS coordinates of the user either from an on-board GPS module or from a GPS module included in a connected smart device (smartphone, smartwatch, tablet, etc.). Barometer values are also read by the system. The combination of GPS and barometer readings is used by the system to obtain the accurate geographic location of the user. By locating the user on a digital topographic map it is possible to obtain information about the type of terrain surrounding the user. The surrounding terrain gives clues to the system about likely activities in which the user is engaged.

[00140] At a block 52, the system analyzes the values of the motion sensors for a predefined time window to extract a pattern. The pattern is compared by the system with a library of pre-determined patterns that uniquely identify each activity (e.g., skiing, motorcycling, road biking, etc.). If a pattern match is found consistent with the other analyzed data (e.g., the user location), then the activity is identified. Otherwise the system continues monitoring the user until an activity match is found. In some embodiments, the activity analysis routine runs in the background until the probability of activity identification is at least 90%, 95%, or 98%. The activity recognition process enables the system to select the right activity biomechanical model for purposes of the accident detection stage.
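
A toy sketch of this pattern-matching step is given below. The feature set, the similarity metric, and the template values are assumptions introduced only to illustrate comparing a windowed motion signature against a library and reporting a match once a confidence threshold (e.g., 95%) is exceeded.

```python
def classify_activity(window, library, threshold=0.95):
    """Compare a window of motion features against per-activity templates.

    `window` and each template are equal-length lists of feature values
    (e.g., speed, vertical-acceleration variance); the similarity measure
    and the feature set are illustrative assumptions.
    """
    best_activity, best_score = None, 0.0
    for activity, template in library.items():
        # Similarity: 1 / (1 + mean absolute difference), a toy metric.
        diff = sum(abs(a - b) for a, b in zip(window, template)) / len(template)
        score = 1.0 / (1.0 + diff)
        if score > best_score:
            best_activity, best_score = activity, score
    # Report a match only once confidence exceeds the configured threshold.
    return (best_activity, best_score) if best_score >= threshold else (None, best_score)

library = {"skiing": [12.0, 3.5, 0.8], "road_biking": [8.0, 1.2, 0.2]}
print(classify_activity([12.0, 3.45, 0.8], library))
```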

[00141] Although not depicted in FIG. 13, the season can also be used by the system in order to assess the activity of the user. For example, the system may maintain a calendar date or may receive a calendar date from a linked mobile phone, smartwatch, or other authorized device. The season can be used in conjunction with the GPS data to select certain activities over others (e.g., hiking in the mountains in the summer months vs. skiing in the mountains in the winter months).

[00142] Once the system determines the activity of the user, in the second stage of analysis the system continuously implements an accident detection process. FIG. 14 is a flow diagram illustrating a process implemented by the system for identifying an accident. Inputs to the accident detection algorithm are: the activity type as provided by the user or assessed by the system; sensor readings as provided by on-board sensors; and, optionally, biometrics data (e.g., user height, weight, age, skill level) provided by the user. The user may provide the biometrics data through a graphical user interface on an authorized device (e.g., a smartphone) coupled to the system.

[00143] The system reads the sensor values and calculates certain derived values from the sensor data. The combination of both read and derived values is used by the system at block 54 to build a time-dependent, multi-dimensional biomechanical vector that describes the motion, the position, and the orientation of the user at any given time interval. A representative biomechanical model used by the system is shown by a time-dependent matrix in FIG. 15.
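
A minimal sketch of one sample of such a vector is shown below; the field names mirror the variables listed for FIG. 15, but the data structure itself, the helper function, and the choice of which derived quantities to carry are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class BioVector:
    """One time sample of the time-dependent, multi-dimensional vector."""
    t: float            # timestamp (s)
    accel: Vec3         # A: acceleration from the accelerometer (m/s^2)
    omega: Vec3         # angular velocity from the gyroscope (rad/s)
    mag: Vec3           # M: magnetic orientation from the magnetometer
    lat: float          # Lt: latitude from GPS
    lon: float          # Lg: longitude from GPS
    altitude: float     # H: altitude from the barometer (m)
    position: Vec3 = (0.0, 0.0, 0.0)   # derived (X, Y, Z)
    velocity: Vec3 = (0.0, 0.0, 0.0)   # derived V

def update_velocity(previous: BioVector, accel: Vec3, dt: float):
    """Example of a derived value: integrate acceleration over one interval."""
    return tuple(v + a * dt for v, a in zip(previous.velocity, accel))

sample = BioVector(t=0.0, accel=(0.0, 0.0, 9.8), omega=(0.0, 0.0, 0.0),
                   mag=(0.2, 0.0, 0.4), lat=46.02, lon=7.75, altitude=3100.0)
print(update_velocity(sample, accel=(0.5, 0.0, 0.0), dt=0.02))
```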

[00144] In the matrix shown in FIG. 15, A is the acceleration vector from the accelerometer, ω is the angular velocity from the gyroscope, M is the magnetic orientation vector from the magnetometer, Lt is the geographic latitude and Lg is the geographic longitude from the GPS, H is the altitude from the barometer, (X, Y, Z) is the relative position, V is the velocity vector, F is the force vector, α is the angular acceleration, and L is the angular momentum vector. Some of these values are directly measured by the sensors of the system, while others are derived by the system. Other values that the system may also take into consideration are the height, weight, center of mass, degrees of freedom, and proficiency of the athlete.

[00145] Representative formulas to calculate values for the matrix include the following:

[00146] Position X(t): the computed position may be periodically compared to GPS geolocalization in order to improve overall accuracy.

[00147] Position (remaining coordinates): as above, the computed position may be periodically compared to GPS geolocalization in order to improve overall accuracy.

[00148] - [00151] Additional formulas derive the remaining matrix entries, such as the velocity, force, angular acceleration, and angular momentum, from the measured and previously derived values.
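
The specific expressions do not survive in this text, so the following display is only a hedged sketch of standard dead-reckoning relations consistent with the variables defined for FIG. 15; the sample interval Δt, the mass m, and the moment of inertia I are assumptions introduced for illustration and are not taken from the disclosure.

```latex
\begin{aligned}
X(t) &= X(t-\Delta t) + V_x(t-\Delta t)\,\Delta t + \tfrac{1}{2}\,A_x(t-\Delta t)\,\Delta t^{2} \\
V(t) &= V(t-\Delta t) + A(t-\Delta t)\,\Delta t \\
F(t) &= m\,A(t), \qquad
\alpha(t) = \frac{\omega(t)-\omega(t-\Delta t)}{\Delta t}, \qquad
L(t) = I\,\omega(t)
\end{aligned}
```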

[00152] The matrix provides instantaneous information on the user's activity and is used by the system at block 55 to build vector patterns - defined as a sequence of vectors over a period of time - that characterize the activity under analysis. The obtained sequence of vectors is stored by the system in non-volatile memory 9. Because the sensor values as well as the derived values for each time interval are stored in the non-volatile memory 9, it is possible for the system to retrieve the stored values to calculate the trajectory and the evolution over a given time interval of the user activity. Such information supports the accident detection analysis performed by the system.

[00153] The system maintains a library of values and vector patterns that characterize different activities. A pattern is a sequence of vectors over time. The library of vector patterns reflects typical or "normal" motion associated with various activities (e.g., skiing, biking, running). The library of vector patterns may also be associated with a level of mastery or activity approach (e.g., expert, intermediate, or beginner skier, or aggressive, average, or relaxed skier). For example, the more proficient the athlete is with acrobatic skiing, the more patterns associated with jumps and acrobatic figures will populate the library. The library is automatically updated over time, both from external input and from the patterns automatically generated by the system as it monitors the activity. By modifying the stored patterns that characterize each sport activity, the system "learns" from the user's habits and becomes better adapted to the user's activities.

[00154] At a block 56, the system compares the monitored motion of the user with stored vector patterns that characterize "normal" parameters associated with the corresponding activity. In order to make such a comparison, the system performs one or more of the following three validity tests:

[00155] (1) The system verifies that the instantaneous values of the current user activity matrix belong to an allowed set of values represented by a corresponding stored "normal" vector pattern associated with the user activity.

[00156] (2) The system utilizes the recent vector history (vector values at times t-1, t-2, ..., t-n) of the user and the specific activity under analysis to calculate the expected vector values for times t+1, t+2, ..., t+n. The system then verifies that the actual measured vectors for times t+1, t+2, ..., t+n fall within the predicted value ranges.

[00157] (3) The system compares the activity pattern of the user, as represented by multiple stored matrices of the user's activity, with a corresponding activity pattern stored in the library. The system determines whether the pattern of activity of the user falls within allowable boundaries of the stored activity pattern, or falls outside of the boundaries of the stored activity pattern.
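
Validity tests (1) and (2) can be sketched as follows; the linear extrapolation, the tolerance value, and the example ranges are assumptions made for illustration and are not the specific prediction method used by the embodiment.

```python
def within_allowed(values, allowed_ranges):
    """Validity test (1): every instantaneous value must fall inside the
    allowed range stored for the identified activity."""
    return all(lo <= v <= hi for v, (lo, hi) in zip(values, allowed_ranges))

def within_prediction(history, measured, tolerance=0.25):
    """Validity test (2): extrapolate the recent vector history linearly and
    check that the new measurement falls within +/- tolerance of the
    prediction. The extrapolation and tolerance are illustrative."""
    predicted = [h1 + (h1 - h0) for h0, h1 in zip(history[-2], history[-1])]
    return all(abs(m - p) <= tolerance * max(abs(p), 1.0)
               for m, p in zip(measured, predicted))

# Example features: speed (m/s) and vertical acceleration (m/s^2) for a skier.
allowed = [(0.0, 35.0), (-25.0, 25.0)]
print(within_allowed([14.2, 3.1], allowed))                        # True
print(within_prediction([[13.0, 3.0], [14.0, 3.1]], [60.0, 3.0]))  # False: speed jump
```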

[00158] If one or more of the validity tests fail repeatedly for a certain period of time (from 10 to 60 seconds), the system treats such failure as a potential accident. To confirm that an accident has indeed occurred, however, the system progresses to a third stage of accident analysis.

[00159] In addition to comparing vectors and activity patterns of the user with stored vectors and patterns, it will be appreciated that the patterns (both newly generated and preexisting) are associated with properties that represent the probability that the pattern will occur and the probability that the pattern represents a potential accident. The more the system is used, the more patterns are generated based on the user's actual behavior. Over time this makes the system more tailored to the user's skills, conditions, and general behavior associated with the activity under analysis. Patterns with low recurrence lose significance over time because they prove not to be representative of the user's typical behavior and may, at some point, be removed from the system. By removing or promoting patterns, the system adapts to the athlete's proficiency, conditions, and shape. For example, at the beginning of the season a user may not be in good shape or in great form, so certain patterns may not be utilized for purposes of the comparison, but as training progresses more advanced patterns may be used by the system.
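
A toy model of this adaptive behavior is sketched below; the recurrence counts, the decay step, and the pruning threshold are illustrative assumptions rather than disclosed mechanics.

```python
class PatternLibrary:
    """Toy pattern library: each pattern carries a recurrence count; rarely
    seen patterns eventually fade out."""

    def __init__(self, prune_below=1):
        self.patterns = {}          # name -> {"recurrence": int, "accident_prob": float}
        self.prune_below = prune_below

    def observe(self, name, accident_prob=0.0):
        entry = self.patterns.setdefault(name, {"recurrence": 0,
                                                "accident_prob": accident_prob})
        entry["recurrence"] += 1    # frequently seen patterns gain significance

    def decay(self):
        """Age all patterns; drop those that no longer recur."""
        for name in list(self.patterns):
            self.patterns[name]["recurrence"] -= 1
            if self.patterns[name]["recurrence"] < self.prune_below:
                del self.patterns[name]

lib = PatternLibrary()
for _ in range(5):
    lib.observe("carved_turn")
lib.observe("acrobatic_jump")       # seen once: recurrence value of 1
lib.decay()
print(sorted(lib.patterns))         # ['carved_turn']: the rare pattern faded away
```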

[00160] A bi-dimensional representation of how the system evolves and learns over time is illustrated in FIG. 16. Biomechanical vectors are represented by points and a solid line separates values that are "allowed" (i.e., safe, and not indicative of a potential accident) from values that are "not allowed" (e.g., may indicate a hazard condition). In FIG. 16, allowed vectors are depicted with solid points and not allowed vectors are depicted with hollow points. As the system learns more from the user's performance, the border between the regions changes, learning and adapting to the user behavior. Since the matrix is multidimensional, the same concept must be translated into a multidimensional space.

[00161] FIG. 17 is a flow diagram illustrating a process that is implemented by the system to verify that an accident has occurred. As previously discussed, the system not only identifies the occurrence of an accident, but also assesses whether the user is incapacitated (as a result of an accident or other reasons) and as such unable to autonomously call for help.

[00162] If the accident detection stage (FIG. 14) indicates that a potential accident has occurred, the system only knows that an anomalous event that does not match with allowed patterns has been identified. The anomalous event may or may not reflect an actual accident. A user-defined time window is therefore used to monitor activity following the detection of a potential accident, and to establish with sufficient confidence that a severe accident has occurred.

[00163] In order to confirm whether an accident requiring assistance has indeed occurred, the system implements the decision tree analysis depicted in FIG. 17 as part of a third stage of accident analysis. The sensor data provides sufficient information for the system to answer questions that reflect the state of the user, such as: Has the original activity resumed? Is the user moving or not moving? Is the movement linear, random, or shaking? Is the helmet worn or not? Is the system still attached to the helmet or not? What is the user position? Orientation? Was any sharp change in the altitude measurement detected? Any abrupt change in velocity or acceleration? Was any peak force applied? Did we measure any upside-down evolution? Any flying dynamics?

[00164] Moreover, if biometric sensors (e.g., heart rate monitor, body temperatures, etc.) are available to the system, then readings from the biometric sensors may be included as part of the decision tree analysis. Anomalies in biometric readings can provide useful data on the event and on the user condition. For example, if the system has biometric data, the system can assess questions like: What's the heart rate of the user? What's the body temperature of the user?

[00165] More specifically, the system implements the following decision tree analysis. At a block 57, the system calculates the past trajectory of the user based on stored sensor data. Because all sensor values as well as the derived values for each time interval are stored in the non-volatile memory, the system can retrieve the stored values and calculate the trajectory of the user and the evolution over a given time interval of the user dynamics.

[00166] At blocks 58-59, various parameters associated with the current state of the user are assessed. At a block 58, the system continues to monitor current values from the sensors. Sensor values are continuously read and derived values are calculated to assess the current activity of the user. At a block 59, the user position (e.g., standing or lying) and orientation are calculated based on sensor readings. Both the activity and position of the user may be calculated by the system at certain time intervals.

[00167] At block 60, the system assesses the evolution of certain parameter values over time windows that extend in the past (from recorded values) and continue in the present to determine whether sudden changes in values (e.g., position, acceleration, orientation) suggest that a potentially traumatic event may have occurred.

[00168] At block 61, if biometric sensors (e.g., heart rate monitor, body temperatures, etc.) are available, then readings from the biometric sensors may be gathered by the system and included as part of the decision tree analysis. Anomalies in biometric readings can provide useful data on the event and on the patient condition.

[00169] At block 62, the motion sensor readings are continuously monitored by the system to detect whether user activity resumes (after a fall, for instance) and the quality of the activity (is it regular motion, e.g., walking, or erratic movement?). At block 63, if normal motion is resumed by the user, then the event is considered non-critical (or at least the user is conscious and capable of calling for help independently) and the process returns to block 57 to continue to monitor user activity.

[00170] At block 64, if motion has not resumed or the quality of the motion is such as to suggest that the user may be severely injured, then the system continues to monitor until a pre-defined monitoring time window has elapsed. The monitored time window may range, for example, from 30 to 300 seconds or more. The monitored time window may vary depending on the activity of the user and the likelihood of a serious accident occurring as a result of that activity.

[00171] At a block 65, if user motion has not resumed and the pre-defined "safe" time window has elapsed, then a final analysis of the recorded parameters is performed to confirm that an accident condition likely exists. If, for example, a sudden change in acceleration is detected, together with an equally sudden change in user position and an absence of movement afterwards, then, once the monitoring phase at the previous steps has completed, these are considered a clear indication of a severe accident.

[00172] If the position and parameters are compatible with an accident at block 65, then processing continues to block 66. At block 66, the system activates an alert countdown, with an audio and visual alert, to give the user a final chance to stop a help request from being sent. When the countdown reaches zero, the help request is sent following the communication strategy described in FIG. 18. The user is provided the ability to stop the help request within a certain period of time (for example, by default 30 seconds). In some embodiments, the period of time is user-customizable, manufacturer-specified, or predefined.
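
The resumed-motion monitoring and alert countdown described above can be sketched roughly as follows; the callables, the polling step, and the default durations are assumptions (the description gives a 30 to 300 second monitoring window and a default 30-second countdown), and an actual implementation would also drive the audio/visual alert and the FIG. 18 communication strategy.

```python
import time

def confirm_and_alert(motion_resumed, cancel_requested, send_help,
                      monitor_window_s=120, countdown_s=30, poll_s=1):
    """Illustrative third-stage flow: wait up to `monitor_window_s` for normal
    motion to resume; if it does not, start a countdown the user can cancel."""
    waited = 0
    while waited < monitor_window_s:
        if motion_resumed():
            return "non_critical"           # user is moving normally again
        time.sleep(poll_s)
        waited += poll_s
    remaining = countdown_s
    while remaining > 0:                    # audio/visual alert would run here
        if cancel_requested():
            return "cancelled_by_user"
        time.sleep(poll_s)
        remaining -= poll_s
    send_help()                             # follow the FIG. 18 strategy
    return "help_request_sent"

# Example: motion resumes immediately, so no help request is sent.
print(confirm_and_alert(lambda: True, lambda: False, lambda: None))
```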

[00173] If, at block 66, the system instead determines that no accident occurred, the registered pattern associated with the decision tree analysis is added to the patterns library with its own recurrence value equal to 1 to indicate a low probability event. In this manner, if the user is particularly keen on acrobatic jumps and figures, motion patterns associated with those actions will be included in the library and not be considered events to be identified as potential accidents. Likewise, if the user stops enjoying acrobatic jumps and figures, the system associates these activities with a lower probability over time until the pattern of such events fades away.

[00174] FIG. 18 is a flow diagram illustrating a communication strategy implemented by the system. After the countdown timer has elapsed at block 67, if the alert is not cancelled at block 71 (in which case the system returns to normal operation at block 72), then a help request is transmitted via sub-GHz radio (blocks 68 and 72) if available. Further, the request is also transmitted via smartphone if the smartphone has coverage (blocks 69 and 74). If not, then the request is transmitted via satellite radio (blocks 70 and 75). The process loops repeatedly to send help requests as described above.

[00175] FIG. 19 is a perspective view of a system as may be implemented in some embodiments. The system of FIG. 19 is integrated with LEDs 16, camera 13, microphone 14, environmental sensors 21 (e.g., temperature, pressure), photovoltaic panels 22 to generate electricity, biosensors 27 (e.g., heart rate, etc.), and/or a wireless power coil 29 for wireless charging. Within the system, there may also be a kinetic harvester. The system is a single piece with the sensors, modules, etc. integrated therein. Although depicted as a standalone device, the system may be integrated into a helmet or other piece of gear to prevent separation of the sensors/modules in the event of an accident and to help ensure that the system continues to function. For example, the system can be integrated into clothing (vest, jacket), boots, a wristband, gloves, etc.

[00176] The systems and methods of the preferred embodiment and variations thereof can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions are preferably executed by computer-executable components preferably integrated with the system and one or more portions of a processor in a computing device. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (e.g., CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component is preferably a general or application-specific processor, but any suitable dedicated hardware or hardware/firmware combination can alternatively or additionally execute the instructions.

[00177] As used in the description and claims, the singular form "a", "an" and "the" include both singular and plural references unless the context clearly dictates otherwise. For example, the term "sensor" may include, and is contemplated to include, a plurality of sensors. At times, the claims and disclosure may include terms such as "a plurality," "one or more," or "at least one;" however, the absence of such terms is not intended to mean, and should not be interpreted to mean, that a plurality is not conceived.

[00178] The term "about" or "approximately," when used before a numerical designation or range (e.g., to define a length or pressure), indicates approximations which may vary by (+) or (-) 5%, 1%, or 0.1%. All numerical ranges provided herein are inclusive of the stated start and end numbers. The term "substantially" indicates mostly (i.e., greater than 50%) or essentially all of a device, substance, or composition.

[00179] As used herein, the term "comprising" or "comprises" is intended to mean that the devices, systems, and methods include the recited elements, and may additionally include any other elements. "Consisting essentially of" shall mean that the devices, systems, and methods include the recited elements and exclude other elements of essential significance to the combination for the stated purpose. Thus, a system or method consisting essentially of the elements as defined herein would not exclude other materials, features, or steps that do not materially affect the basic and novel characteristic(s) of the claimed disclosure.

"Consisting of shall mean that the devices, systems, and methods include the recited elements and exclude anything more than a trivial or inconsequential element or step.

Embodiments defined by each of these transitional terms are within the scope of this disclosure.

[00180] The examples and illustrations included herein show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.