Title:
MEMS SENSING SYSTEM
Document Type and Number:
WIPO Patent Application WO/2021/216631
Kind Code:
A2
Abstract:
A wearable sensor system capable of receiving mechanical waves comprises at least one MEMS sensor secured to a user. The MEMS sensor is adapted to receive mechanical wave signals generated by a body part of the user when the user effects an event. The wearable sensor system further comprises a processor operatively connected to the at least one MEMS sensor. The processor is adapted to determine the spectral distribution of the received mechanical wave signal and to extract from the spectral distribution at least one descriptor related to the movement or pose of the body part.

Inventors:
CONNOLLY MATTHEW (US)
WILKINSON DAVID CLARK (US)
HOLMAN DAVID (CA)
Application Number:
PCT/US2021/028285
Publication Date:
October 28, 2021
Filing Date:
April 21, 2021
Assignee:
TACTUAL LABS CO (US)
International Classes:
G01H11/06
Attorney, Agent or Firm:
VENTO, Jose et al. (US)
Claims:
CLAIMS

1. A wearable sensing system, comprising: at least one MEMS sensor secured to a user, the MEMS sensor adapted to receive mechanical wave signals generated by a body part of the user when the user effects an event; and, a processor operatively connected to the at least one MEMS sensor, wherein the processor is adapted to determine the spectral distribution of the received mechanical wave signal and to extract from the spectral distribution at least one descriptor related to the movement or pose of the body part.

2. The wearable sensing system of claim 1, wherein the event is at least one of a pinch, a tap, a snap, a wiggle, and a fist.

3. The wearable sensing system of claim 1, wherein the at least one MEMS sensor is at least one of a microphone, an accelerometer, a gyroscope, and a pressure sensor.

4. The wearable sensing system of claim 1, wherein the body part of the user is at least one of a wrist, an arm, a leg, a torso, and a head.

5. The wearable sensing system of claim 1, wherein the at least one descriptor is associated with at least one of the length of the event, the type of event, a force associated with the event, a body part associated with the event, and an object associated with the event.

6. The wearable sensing system of claim 1, wherein the at least one descriptor is used for an interaction with a processor enabled application.

7. The wearable sensing system of claim 6, wherein the interaction is word entry into the application.

8. The wearable sensing system of claim 6, wherein the interaction is a mobile device implementing the application.

9. The wearable sensing system of claim 6, wherein the interaction is a vehicle implementing the application.

10. A wearable sensing system, comprising: a substrate adjacent to a first body part of a user; at least one MEMS sensor mechanically connected to the substrate and adapted to receive mechanical wave signals transmitted through the substrate; and, a processor operably connected to the at least one MEMS sensor, wherein the processor is adapted to determine a measurement of each of the received mechanical wave signals by the MEMS sensor and to determine at least one descriptor associated with an event effected by the first body part.

11. The wearable sensing system of claim 10, further comprising a mechanical wave transmitter adjacent to a second body part of the user, wherein the transmitter is adapted to transmit at least one mechanical wave signal.

12. The wearable sensing system of claim 11, wherein the mechanical wave transmitter is adapted to transmit a mechanical wave signal into a dermal layer of the user.

13. The wearable sensing system of claim 11, wherein the first and second body parts are the same body part.

14. The wearable sensing system of claim 11, wherein at least one of the first and second body parts is at least one of a wrist, an arm, a leg, a torso, and a head.

15. The wearable sensing system of claim 10, wherein the event is at least one of a pinch, a tap, a snap, a wiggle, and a fist.

16. The wearable sensing system of claim 10, wherein the at least one MEMS sensor is at least one of a microphone, an accelerometer, a gyroscope, and a pressure sensor.

17. The wearable sensing system of claim 10, wherein the at least one descriptor is associated with at least one of the length of the event, the type of event, a force associated with the event, a body part associated with the event, and an object associated with the event.

18. The wearable sensing system of claim 10, wherein the at least one descriptor is used for an interaction with a processor enabled application.

19. The wearable sensing system of claim 18, wherein the interaction is input into a mobile device implementing the application.

20. The wearable sensing system of claim 18, wherein the interaction is input into a vehicle implementing the application.

Description:
MEMS SENSING SYSTEM

[0001] This application is a continuation in part of U.S. Patent Application Serial No. 16/909,608 filed June 23, 2020; which claims the benefit of U.S. Provisional Application No. 62/866,206, filed June 25, 2019. This application is a continuation in part of U.S. Patent Application Serial No. 16/910,982 filed June 24, 2020; which claims the benefit of U.S. Provisional Application No. 62/866,324, filed June 25, 2019. This application is a continuation in part of U.S. Patent Application Serial No. 17/062,608 filed October 4, 2020; which claims the benefit of U.S. Provisional Application No. 62/910,528, filed October 4, 2019. This application is a continuation in part of U.S. Patent Application Serial No. 16/913,966 filed June 26, 2020; which claims the benefit of U.S. Provisional Application No. 62/866,970, filed June 26, 2019. This application is a continuation in part of U.S. Patent Application Serial No. 16/914,258 filed June 26, 2020; which claims the benefit of U.S. Provisional Application No. 62/867,006, filed June 26, 2019. This application claims the benefit of U.S. Provisional Application No. 63/013,507 filed April 21, 2020. The contents of all of the aforementioned applications are incorporated herein by reference. This application includes material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent disclosure, as it appears in the Patent and Trademark Office files or records, but otherwise reserves all copyright rights whatsoever.

FIELD

[0002] The disclosed apparatus and method relate to the field of sensors; in particular, the disclosed apparatus and method relate to gesture and human interaction sensors operating through the mechanical wave sensing of motion and position.

BRIEF DESCRIPTION OF THE DRAWINGS

[0003] The foregoing and other objects, features, and advantages of the disclosure will be apparent from the following more particular description of embodiments as illustrated in the accompanying drawings in which reference characters refer to the same parts throughout the various views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating principles of the disclosed embodiments.

[0004] FIG. 1 shows a diagram of a Micro-Electro-Mechanical Systems (MEMS) sensor.

[0005] FIG. 2 shows an embodiment of a sensing system incorporating MEMS sensors.

[0006] FIG. 3 shows an embodiment of a sensing system having an array of MEMS sensors.

[0007] FIG. 4 shows an embodiment of a sensing system having an array of MEMS microphones and mechanical wave transmitters.

[0008] FIG. 5 shows an embodiment of a sensing system having an array of MEMS sensors, transmitting antennas and receiving antennas.

[0009] FIG. 6 shows a flow chart that illustrates an example method of analysing a mechanical wave received by a MEMS sensor.

[0010] FIG. 7 shows a graph illustrating concepts of an example method of analysing a mechanical wave received by a MEMS sensor.

[0011] FIG. 8 shows a flow chart that illustrates an example method of analysing a mechanical wave received by a MEMS sensor.

[0012] FIG. 9 shows a flow chart that illustrates an example method of analysing a mechanical wave received by a MEMS sensor.

[0013] FIG. 10 is a diagram illustrating a signal processing pipeline and the features obtained at different stages.

[0014] FIG. 11 is a chart illustrating the relationship between the time domain and the RMSE for the received signal corresponding to multiple events.

[0015] FIG. 12 is a chart illustrating the relationship between the time domain and the RMSE for the received signal corresponding to a single event.

DETAILED DESCRIPTION

[0016] The presently disclosed systems and methods involve principles related to and for designing, manufacturing and using sensors implementing mechanical wave signals. The mechanical wave signals are used with devices that may be able to transmit and receive the signals by themselves or function in conjunction with other devices that implement and transmit other types of signals or use other types of sensing modalities. By mechanical signals it is meant signals that are generated via the transmission of waves through a medium, such as gasses and solids. As used herein, the term “solids” encompasses the human body and all associated tissues. In an embodiment, mechanical wave signals may additionally be referred to as acoustic waves.

[0017] Throughout this disclosure, the term “event” may be used to describe periods of time in which muscle activity and/or position of the body is detected and determined. In accordance with an embodiment, events may be detected, processed, and/or supplied to downstream computational processes with very low latency, e.g., on the order of ten milliseconds or less, or on the order of less than one millisecond.

[0018] As used herein, and especially within the claims, ordinal terms such as first and second are not intended, in and of themselves, to imply sequence, time or uniqueness, but rather, are used to distinguish one claimed construct from another. In some uses where the context dictates, these terms may imply that the first and second are unique. For example, where an event occurs at a first time, and another event occurs at a second time, there is no intended implication that the first time occurs before the second time, after the second time or simultaneously with the second time. However, where the further limitation that the second time is after the first time is presented in the claim, the context would require reading the first time and the second time to be unique times. Similarly, where the context so dictates or permits, ordinal terms are intended to be broadly construed so that the two identified claim constructs can be of the same characteristic or of different characteristics. Thus, for example, a first and a second frequency, absent further limitation, could be the same frequency, e.g., the first frequency being 10 MHz and the second frequency being 10 MHz; or could be different frequencies, e.g., the first frequency being 10 MHz and the second frequency being 11 MHz. Context may dictate otherwise, for example, where a first and a second frequency are further limited to being frequency-orthogonal to each other, in which case, they could not be the same frequency.

[0019] The system described herein uses mechanical waves in order to determine motion and position. In an embodiment, the mechanical waves are acoustic waves. In an embodiment, the system described herein senses the motion and position of a person’s body part. In an embodiment, the system described herein senses the motion and position of a person’s hand. In an embodiment, the system described herein senses the motion and position of a person’s fingers. In an embodiment, the system described herein senses the motion and position of a person’s legs. In an embodiment, the system described herein senses the motion and position of a person’s arms. In an embodiment, the system described herein senses the motion and position of a person’s head. In an embodiment, the system described herein senses the position of a person with respect to another person. In an embodiment, the system described herein senses the motion of a person with respect to a device or an object.

[0020] Embodiments of the disclosure implement sensing systems using MEMS (micro-electro-mechanical systems) sensors (i.e., MEMS microphones, MEMS accelerometers, MEMS gyroscopes). MEMS microphones are transducers that convert acoustic pressure waves to electrical signals. Similarly, MEMS accelerometers measure acceleration and MEMS gyroscopes measure orientation with respect to a reference point. The MEMS microphones can be arranged in an array or placed in various locations depending on the desired implementation. Additionally, one MEMS microphone can be used instead of multiple MEMS microphones.

[0021] An example of a MEMS microphone 10 is shown in Fig. 1. The MEMS microphone has a plate 12 that is fixed in place. The plate 12 has holes 14 through which acoustic pressure waves can enter. An electrode 16 permits the MEMS microphone 10 to be operably connected to a system. Located proximate to the fixed plate 12 is a conductive plate 18 that is movable. A chamber 20 is located beneath the conductive plate 18. Compressed air exits through a ventilation hole 22. This MEMS microphone 10 is adapted to measure the vibrations that occur in the environment. However, it should be understood that the MEMS microphone shown in Fig. 1 is by way of example and that other types of MEMS microphones may be used instead. Indeed, it should be understood that any device that is adapted to receive mechanical waves and turn the received mechanical waves into electrical signals can be used so as to take the results of the measured waves and turn them into meaningful information.

[0022] A MEMS sensor (i.e., microphone, accelerometer, gyroscope) is able to take mechanical waves that are transmitted through various mediums and convert them into electrical signals. The electrical signals are able to be processed and used to determine information regarding the significance of the mechanical waves that have been measured. For instance, a MEMS microphone is used in order to determine the presence of acoustic waves. In an embodiment, the MEMS microphone is able to measure the acoustic waves and determine the specific frequency of an acoustic wave so measured. In an embodiment, a plurality of MEMS microphones can use the measured acoustic waves and be able to determine position and motion using the measured and processed acoustic waves. Similarly, MEMS accelerometers and gyroscopes can be used to determine the presence of mechanical waves. They can also determine the specific frequency of the mechanical wave so measured. In an embodiment, a plurality of MEMS sensors can use the measured mechanical waves and are able to determine position and motion using the measured and processed mechanical waves. In an embodiment, a plurality of MEMS accelerometers can use the measured mechanical waves and are able to determine position and motion using the measured and processed mechanical waves.

[0023] Turning to FIG. 2, sensing system 200 is shown. The sensing system 200 is a mechanical wave measuring system. In the sensing system 200 a MEMS sensor 202 is adapted to be placed on an individual's body. The MEMS sensor 202 is operably attached to or connected to a substrate 201. In an embodiment, the MEMS sensor 202 is formed as part of the substrate 201. In an embodiment, at least one of the MEMS sensor 202 and the substrate 201 form part of a wearable mechanical enclosure.

[0024] The substrate 201 may form part of a wearable that is worn by a user. The MEMS sensor 202 is able to measure mechanical waves transmitted through the air. Additionally, the MEMS sensor 202 is able to measure mechanical waves transmitted via the user’s body. For example, the MEMS sensor 202 is able to detect the presence of mechanical waves transmitted via the dermal layer of a user. In an embodiment, the MEMS sensor measures mechanical waves transmitted through the dermal layer of a user. In an embodiment, the MEMS sensor measures mechanical waves transmitted through the user’s body. In an embodiment, the MEMS sensor measures mechanical waves transmitted through the air. In an embodiment, the MEMS sensor measures mechanical waves transmitted through the dermal layer and the interior of a user’s body. In an embodiment, the MEMS sensor measures mechanical waves transmitted through the dermal layer, the interior of a user’s body and the air. In an embodiment, the MEMS sensor measures mechanical waves transmitted through a wearable mechanical enclosure. In an embodiment, the MEMS sensor measures mechanical waves transmitted via a wearable mechanical enclosure. In an embodiment, the mechanical waves are acoustic waves.

[0025] In an embodiment, the mechanical waves generated by movement of a user’s hand are able to be measured by the MEMS sensor. In an embodiment, mechanical waves generated by contact of a user’s fingers with another of the user’s fingers are measured by the MEMS sensor. In an embodiment, mechanical waves generated by contact of the fingers with parts of the hand are measured by the MEMS sensor. In an embodiment, mechanical waves generated by contact of the fingers with objects are measured by the MEMS sensor. In an embodiment, mechanical waves generated by contact of parts of the hand with other parts of the hand are measured by the MEMS sensor. In an embodiment, mechanical waves generated by contact of objects with parts of the hand are measured by the MEMS sensor. In an embodiment, mechanical waves generated by contact of one hand with another hand are measured by the MEMS sensor. In an embodiment, mechanical waves generated by contact of parts of a body with other body parts or other objects are measured by the MEMS sensor.

[0026] The MEMS sensor 202 is operably connected to a processor 203. The processor 203 may be adapted to process and be connected to a plurality of different sensing modalities that are able to measure and determine different facets of motion and position. In an embodiment, the MEMS sensor functions with more than one sensing modality, such as, for example, using transmitting antennas and receiving antennas that transmit and receive a plurality of frequency orthogonal signals and use the received signals to further provide position and movement of a user’s hand.

[0027] Turning to FIG. 3, sensing system 300 is shown. The sensing system 300 has a MEMS sensor 302(a). In an embodiment, the MEMS sensor 302(a) is one MEMS sensor out of a plurality of MEMS sensors 302. In FIG. 3, three MEMS sensors 302(a)- 302(c) are shown. Each one of the MEMS sensors 302(a)-302(c) can be adapted to measure a mechanical wave coming from the activity of a user’s hand or body part. The mechanical waves coming from the various activities of a user can be measured with respect to each of the MEMS sensors 302(a)-302(c). The measurements made by each of the MEMS sensors 302(a)-302(c) can be combined and correlated to provide a more comprehensive picture of the movement and activity of a user’s body part. A processor 303 processes the measurements and uses the measurements in order to provide information related to motion and position of a user’s hand.

[0028] The measurement of the mechanical waves can be used to triangulate position as well as to ascertain various qualities of movement extrapolated from the mechanical waves. Because the mechanical waves of contact between various body parts can be measured, properties of the mechanical waves can be used in order to determine the strength of the activity. For example, the mechanical waves of a clap will have different measured properties than the measured properties of the mechanical waves of a snap.

[0029] Additionally, machine learning can be applied to the data so as to be able to discriminate different activities based on the measured properties of the mechanical waves received. By applying machine learning to the various positions and activities performed by a user, the system's ability to determine position and activity becomes more refined.

[0030] FIG. 3 illustrates a sensing system 300 with three MEMS sensors; however, additional numbers of MEMS sensors, as well as different arrays of many MEMS sensors, can be used. In an embodiment, an array of four MEMS sensors are positioned in a quadrilateral formation. In an embodiment, an array of four MEMS sensors are positioned along the circumference of a circle. In an embodiment, an array of five MEMS sensors are positioned in a pentagonal formation. In an embodiment, an array of five MEMS sensors are positioned along the circumference of a circle. In an embodiment, an array of six MEMS sensors are positioned in a hexagonal formation. In an embodiment, an array of six MEMS sensors are positioned along the circumference of a circle. It should be understood that larger numbers of MEMS sensors can be used and arranged in various configurations and are not limited to the embodiments disclosed herein. Furthermore, in some embodiments the MEMS sensors can be used in predetermined arrangements that may not form a particular pattern but may instead be determined based on the device or the wearable upon which it is being implemented. For example, if implemented in a glove, a MEMS sensor may be placed within the finger area of each finger portion of a glove.

[0031] In an embodiment, the MEMS sensors are positioned along a circumference of a circle formed or placed in or on a wearable that is used by an individual. In an embodiment, the MEMS sensors are placed on or in a wearable placed in the wrist area. In an embodiment, the MEMS sensors are placed on or in a wearable worn in the ankle area. In an embodiment, the MEMS sensors are placed on or in a wearable worn around the neck. In an embodiment, the MEMS sensors are placed on or in a wearable worn around the chest. In an embodiment, the MEMS sensors are placed on or in a wearable worn around the waist. In an embodiment, the MEMS sensors are placed on or in a wearable worn around an arm. In an embodiment, the MEMS sensors are placed on or in a wearable worn around the head.

[0032] Turning to FIG. 4, an embodiment of sensing system 400 is shown. The sensing system 400 has a plurality of MEMS sensors 402(a)-402(d) that are placed on a substrate 401 that is located on a user’s body. In an embodiment, the MEMS sensors 402(a)-402(d) are MEMS microphones. In an embodiment, the MEMS sensors 402(a)- 402(d) are MEMS accelerometers. In an embodiment, the MEMS sensors 402(a)-402(d) are MEMS gyroscopes. In an embodiment, the MEMS sensors 402(a)-402(d) are MEMS pressure sensors. In an embodiment, the MEMS sensors 402(a)-402(d) are MEMS inertial measurement sensors. In an embodiment, the MEMS sensors 402(a)-402(d) are MEMS ultrasound transducers.

[0033] Additionally, one or more mechanical wave transmitters 404(a)-404(b) can be placed on the substrate 401. The mechanical wave transmitters 404(a)-404(b) are able to generate mechanical waves that can have identifiable frequencies. In an embodiment, a mechanical wave can propagate through at least one medium. As used herein, the term medium is used to indicate any matter capable of mediating the propagation of mechanical waves.

[0034] In an embodiment, a mechanical wave transmitter is an acoustic transmitter that is capable of generating acoustic waves that can have identifiable frequencies. As a non-limiting example an acoustic transmitter is a sound speaker. In an embodiment, a mechanical wave transmitter is a haptic generator capable of generating mechanical waves. In an embodiment, the haptic generator generates mechanical waves that can have identifiable frequencies. In an embodiment, a haptic generator is an electric motor with an eccentric mass for creating mechanical waves. In an embodiment, a haptic generator is a device capable of applying at least one of forces, vibrations, and motions to at least one of a material and matter.

[0035] In an embodiment, a mechanical wave transmitter is a body part of a human. In an embodiment, a mechanical wave transmitter is a body part of an animal. As may be understood by those skilled in the art, while FIG. 4 illustrates a non-limiting embodiment showing mechanical wave transmitters 404(a)-404(b) that can be placed on the substrate 401, in an embodiment where the mechanical wave transmitter is a body part, the mechanical wave transmitter is not located on substrate 401. In an embodiment, a body part of a human or animal is a mechanical wave transmitter when there is movement of the body part or any other part of the body. As a non-limiting example, the human arm including the hand is a mechanical wave transmitter when the human effectuates motion of the arm and hand to, for instance, tap upon a surface or pinch between two fingers. As may be noted by those skilled in the art, the movements and motions of any given part of the human body are limitless and therefore any movement or motion can transmit mechanical waves to be detected. As may also be noted by those skilled in the art, the human body resonates at different frequencies for individual parts of the body. In an embodiment, the mechanical wave transmitters are in different body parts than the MEMS sensors. In an embodiment, the mechanical wave transmitters are in the same body part as the MEMS sensors.

[0036] In an embodiment, the mechanical waves are transmitted in a frequency range outside the scope of hearing (generally considered 20 Hz - 20 kHz). In an embodiment, the mechanical waves are transmitted in a frequency range below 20 Hz. In an embodiment, the mechanical waves are transmitted in a frequency range above 20 kHz. In an embodiment, the mechanical waves are transmitted in a frequency range below 20 Hz and above 20 kHz. In an embodiment, the mechanical waves are transmitted in a frequency range that encompasses part of the hearing range. In an embodiment, the mechanical waves are transmitted in a frequency range between 1 Hz and 100 kHz. In an embodiment, the mechanical waves are transmitted through air. In an embodiment, the mechanical waves are transmitted through the skin. In an embodiment, the mechanical waves are transmitted through the interior of the body. In an embodiment, the mechanical wave transmitters transmit mechanical waves through water. As may be noted by those skilled in the art, the mechanical wave transmitters can transmit mechanical waves through at least two mediums simultaneously.

[0037] The medium through which the mechanical waves are transmitted will affect the characteristics of the mechanical waves that are transmitted. When the mechanical waves are received by the MEMS sensor and processed, the processor can be adapted to distinguish the particular mechanical wave transmitted by taking into account the medium and accompanying mechanical interference. In an embodiment, the transmitted mechanical waves can be used to determine movement and position of body parts based on the processed and received mechanical waves.

[0038] As noted above, it should be understood that while the application of acoustic waves (mechanical waves) is discussed via the application of MEMS microphones and acoustic emitters, other devices capable of either emitting or receiving acoustic (mechanical) waves can be employed instead of or in addition to the use of the MEMS devices. In an embodiment, the sensing system uses MEMS accelerometers. In an embodiment, the sensing system uses MEMS accelerometers and other MEMS devices (e.g., a MEMS gyroscope). In an embodiment, the sensing system uses MEMS and non-MEMS piezoelectric devices. In an embodiment, the sensing system uses piezoelectric devices and other MEMS devices. In an embodiment, the sensing system uses piezoelectric devices and accelerometers. In an embodiment, the sensing system uses accelerometers, other MEMS devices and piezoelectric devices. In an embodiment, the sensing system uses accelerometers and gyroscopes.

[0039] Sensing systems that employ one or more different types of mechanical wave emitting or receiving devices can be further integrated with other types of sensing modalities, such as capacitive sensing using orthogonal frequency division multiplexing, discussed below. In an embodiment, the mechanical wave transmitter can transmit at least two orthogonal frequencies. In an embodiment, the MEMS sensors can at least one of detect and receive orthogonal frequencies.

[0040] FIG. 5 shows an embodiment of a sensing system 500 that implements MEMS sensors 502(a)-502(c) and mechanical wave transmitters 504(a)-504(b). In addition to the mechanical wave components of the sensing system 500, the sensing system 500 has an additional modality of sensing position and movement of a hand. In particular, the sensing modality implements a plurality of transmitting antennas 506 and a plurality of receiving antennas 508. The plurality of transmitting antennas 506 are adapted to transmit a plurality of unique frequency orthogonal signals that are generated from a signal generator (not shown). When at least one of the plurality of unique frequency orthogonal signals is received, information regarding the position and movement of body parts that interact with the transmitted signals is determined from the measured signals. The received signals may be processed through the use of a Fast Fourier Transform. Further discussion regarding the implementation of the transmitting antennas (or conductors) and receiving antennas (or conductors) can be found in U.S. Patent Application No. 15/926,478, U.S. Patent Application No. 15/904,953, U.S. Patent Application No. 16/383,090 and U.S. Patent Application No. 16/383,996, the contents of all of the aforementioned applications incorporated herein by reference.

[0041] In the embodiment shown in FIG. 5, the mechanical wave components of the system are able to provide information regarding movement and position of a body part that may not be readily ascertained from other sensing modalities. In particular, the mechanical wave components, such as the MEMS sensors 502(a)-502(c), are able to readily obtain information regarding contact of body parts, such as fingers touching, that may not be easily distinguished from transmitting antennas or receiving antennas.

[0042] As noted above, in an embodiment, the mechanical wave transmitters can each transmit a signal that is frequency orthogonal to each other signal that is transmitted. Certain principles of a fast multi-touch (FMT) sensor have been disclosed in the patent applications disclosed above. With respect to the mechanical wave transmitters, certain principles can be applied to the mechanical wave signals that are transmitted. Orthogonal signals may be transmitted and information may be received by the MEMS sensors. In an embodiment, receivers “sample” the signals received during a sampling period (τ). In an embodiment, signals are then analyzed by a signal processor to identify events (including, position and movement of body parts). In an embodiment, one or more transmitters can transmit a signal and the movement of the respective body parts impacts the signals that are received and processed. In an embodiment where the orthogonal signals are frequency orthogonal, spacing between the orthogonal frequencies, Δf, may be at least the reciprocal of the measurement period τ, the measurement period τ being equal to the period during which the column conductors are sampled. Thus, in an embodiment, the signal received at a column conductor may be measured for one millisecond (τ) using frequency spacing (Δf) of one kilohertz (i.e., Δf = 1/τ).

[0043] In an embodiment, a signal processor of a mixed signal integrated circuit (or a downstream component or software) is adapted to determine at least one value representing each frequency orthogonal signal transmitted. In an embodiment, the signal processor of the mixed signal integrated circuit performs a Fourier transform on the signals received. In an embodiment, the mixed signal integrated circuit is adapted to digitize received signals. In an embodiment, the mixed signal integrated circuit is adapted to digitize the signal and perform a discrete Fourier transform (DFT) on the digitized information. In an embodiment, the mixed signal integrated circuit (or a downstream component or software) is adapted to digitize the signals present on at least one of a MEMS sensor, a receive conductor, and an antenna and perform a Fast Fourier transform (FFT) on the digitized information -- an FFT being one type of discrete Fourier transform.

[0044] It will be apparent to a person of skill in the art in view of this disclosure that a DFT, in essence, treats the sequence of digital samples (e.g., window) taken during a sampling period (e.g., integration period) as though it repeats. As a consequence, signals that are not center frequencies (i.e., not integer multiples of the reciprocal of the integration period (which reciprocal defines the minimum frequency spacing)), may have the relatively nominal, but unintended, consequence of contributing small values into other DFT bins. Thus, it will also be apparent to a person of skill in the art in view of this disclosure that the term orthogonal as used herein is not “violated” by such small contributions. In other words, as the term frequency orthogonal is used herein, two signals are considered frequency orthogonal if substantially all of the contribution of one signal to the DFT bins is made to different DFT bins than substantially all of the contribution of the other signal.

[0045] An example of such a sampled signal is as follows. In an embodiment, received signals are sampled at 4.096 MHz. In an embodiment, received signals are sampled at more than 4 MHz. To achieve kHz sampling, for example, 4096 samples may be taken at 4.096 MHz. In such an embodiment, the integration period is 1 millisecond, which per the constraint that the frequency spacing should be greater than or equal to the reciprocal of the integration period provides a minimum frequency spacing of 1 kHz. (It will be apparent to one of skill in the art in view of this disclosure that taking 4096 samples at e.g., 4 MHz would yield an integration period slightly longer than a millisecond, and not achieving kHz sampling, and a minimum frequency spacing of 976.5625 Hz.) In an embodiment, the frequency spacing is equal to the reciprocal of the integration period. In such an embodiment, the maximum frequency of a frequency-orthogonal signal range should be less than 2 MHz. In such an embodiment, the practical maximum frequency of a frequency-orthogonal signal range should be less than about 40% of the sampling rate, or about 1.6 MHz. In an embodiment, a DFT (which could be an FFT) is used to transform the digitized received signals into bins of information, each reflecting the frequency of a frequency-orthogonal signal which may have been transmitted by the transmitting antenna. In an embodiment 2048 bins correspond to frequencies from 1 kHz to about 2 MHz. It will be apparent to a person of skill in the art in view of this disclosure that these examples are simply that, exemplary. Depending on the needs of a system, and subject to the constraints described above, the sample rate may be increased or decreased, the integration period may be adjusted, the frequency range may be adjusted, etc.
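
By way of a non-limiting illustration that is not part of the application itself, the following Python sketch reproduces the arithmetic of this example; the variable names are placeholders, and the 40% practical-maximum figure follows the text above.

```python
# Illustrative sketch (not from the application): the sampling arithmetic
# described above, assuming a 4.096 MHz sample rate and a 4096-sample window.
sample_rate_hz = 4_096_000                                 # 4.096 MHz
n_samples = 4096

integration_period_s = n_samples / sample_rate_hz          # 0.001 s (1 ms)
min_frequency_spacing_hz = 1 / integration_period_s        # 1000 Hz (1 kHz)
practical_max_frequency_hz = 0.4 * sample_rate_hz          # about 1.6 MHz

# A 4 MHz sample rate with the same 4096-sample window gives the slightly
# longer integration period and 976.5625 Hz spacing noted in the text.
alt_spacing_hz = 1 / (4096 / 4_000_000)                    # 976.5625 Hz

print(integration_period_s, min_frequency_spacing_hz,
      practical_max_frequency_hz, alt_spacing_hz)
```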

[0046] In an embodiment, a DFT (which can be an FFT) output comprises a bin for each frequency-orthogonal signal that is transmitted. In an embodiment, each DFT (which can be an FFT) bin comprises an in-phase (I) and quadrature (Q) component. In an embodiment, the sum of the squares of the I and Q components is used as a measure corresponding to signal strength for that bin. In an embodiment, the square root of the sum of the squares of the I and Q components is used as a measure corresponding to signal strength for that bin. It will be apparent to a person of skill in the art in view of this disclosure that a measure corresponding to the signal strength for a bin could be used as a measure related to muscle activity. In other words, the measure corresponding to signal strength in a given bin would change as a result of some activity originated by muscles of the body.
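
The following Python sketch, offered as an assumption about one possible realization rather than as the application's implementation, shows how a per-bin signal-strength measure can be derived from the in-phase (I) and quadrature (Q) components of a DFT; the function name and the use of NumPy's FFT are illustrative choices.

```python
# Illustrative sketch: per-bin signal strength from I/Q components of a DFT.
import numpy as np

def bin_strengths(samples: np.ndarray) -> np.ndarray:
    """Return one strength value per frequency bin of the sampled signal."""
    spectrum = np.fft.rfft(samples)       # complex bins: I = real part, Q = imaginary part
    i, q = spectrum.real, spectrum.imag
    power = i * i + q * q                 # sum of the squares (one measure named in the text)
    return np.sqrt(power)                 # square root of that sum (the other measure)
```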

[0047] Turning now to FIG. 6, an example method 600 of analysing a mechanical wave received by a MEMS sensor is described. In an embodiment, a mechanical wave is transmitted by a mechanical wave transmitter as described herein. In an embodiment, the mechanical wave transmitter is the user’s body. In an embodiment, the mechanical waves are transmitted in response to an event effected by a body part. In an embodiment, the event is at least one of a pinch, a tap, a snap, a wiggle, and a fist.

[0048] In step 602, the mechanical waves are received by a MEMS sensor. In an embodiment, the mechanical waves are digitized by sampling the mechanical waves using, at least, a MEMS sensor. In step 604, the spectral distribution of the received mechanical wave is determined. As it may be apparent to those skilled in the art, the spectral distribution can be determined by performing a discrete Fourier transform (DFT) - which can be an FFT - on the digitized information. In an embodiment, in step 604, the power spectral density data set (i.e., multiple FFT executions across small time slices) is determined. As will be discussed in further detail below, the power spectral density data is further processed to identify and classify the events (i.e., gestures, motions, pose).

[0049] In step 606, frequency bins outside of a predetermined frequency range are excluded. In an embodiment, the frequency range bounds are associated with the resonance frequency of individual body parts. In an embodiment, frequency range bounds are determined by previously associating at least one frequency range to at least one event. In an embodiment, associating at least one frequency range to at least one event is achieved by determining the frequency bins of the mechanical waves transmitted by body parts performing a known event. In an embodiment, frequency range bounds are arbitrary frequencies.

[0050] In step 608, a number of frequency bins within the frequency range bounds are excluded. In an embodiment, at least one frequency bin within the frequency range bounds is excluded. In an embodiment, no frequency bins within the frequency range bounds are excluded. In an embodiment, the number of excluded frequency bins is determined by the total number of frequency bins within the frequency range bounds. In an embodiment, the number of excluded frequency bins is a percentage of the total number of frequency bins. In an embodiment, the number of excluded frequency bins is a percentage of the total number of frequency bins within the frequency range bounds. In an embodiment, the number of excluded frequency bins is an arbitrary number of frequency bins. In an embodiment, the number of excluded frequency bins is associated with the event. In an embodiment, the number of excluded frequency bins is determined by previously associating at least one frequency bin with at least one event.

[0051] In an embodiment, the number of excluded frequency bins is determined by a threshold. In an embodiment, the threshold is associated with the energy level of at least one frequency bin. In an embodiment, the threshold is associated with the energy level of noise present in the mechanical wave. In an embodiment, the threshold is arbitrarily determined. In an embodiment, the threshold is determined by previously associating at least one frequency bin to at least one event. In an embodiment, determining that at least one frequency bin is above the threshold can indicate that an event has happened. In an embodiment, determining that at least one frequency bin is above the threshold can indicate the type of event that has happened. In an embodiment, determining that at least one frequency bin is above the threshold can indicate the type of event that has happened based on the at least one frequency bin above the threshold.

[0052] In step 610, a gaussian distribution template of frequency bins is determined. In an embodiment, the gaussian distribution template is determined based on the non-excluded frequency bins within the frequency range bounds. In an embodiment, the gaussian distribution template is determined based on the non-excluded frequency bins within the frequency range bounds above a predetermined threshold. In an embodiment, the gaussian distribution template is determined based on all the frequency bins within the frequency range bounds. In an embodiment, the gaussian distribution template is determined based on all frequency bins. In an embodiment, the gaussian distribution template is determined based on an arbitrary number of frequency bins. In an embodiment, the gaussian distribution template is determined based on arbitrary frequency bins. In an embodiment, the gaussian distribution template is determined based on the frequency bin with the highest energy level. In an embodiment, the gaussian distribution template is determined based on frequency bins previously associated with a known event. In an embodiment, the gaussian distribution template is determined based on at least one characteristic of at least one frequency bin.

[0053] In step 612, the non-excluded frequency bins within the frequency range bounds are compared to the gaussian distribution template to determine at least one characteristic of the event. In an embodiment, the method 600 determines whether at least some of the non-excluded frequency bins fall within the gaussian distribution template to determine at least one characteristic of the event. In an embodiment, the method 600 determines whether all of the non-excluded frequency bins fall within the gaussian distribution template to determine at least one characteristic of the event.

[0054] In an embodiment, the at least one characteristic is whether the event has happened. In an embodiment, the at least one characteristic is the type of event. As mentioned above, in an embodiment, the type of event is at least one of a pinch, a tap, a snap, a wiggle, and a fist. In an embodiment, the at least one characteristic is how long the event was. In an embodiment, the at least one characteristic is a force associated with the event. In an embodiment, the at least one characteristic is which body parts were involved in the event. In an embodiment, the at least one characteristic is which fingers were involved in the event. In an embodiment, the at least one characteristic is whether an object was involved in the event. In an embodiment, the at least one characteristic is what type of object was involved in the event.
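
A minimal Python sketch of the flow of method 600 follows. It is illustrative only: the band edges, energy threshold, template width, and the normalized-correlation match criterion are assumptions rather than values or criteria taken from the application.

```python
# Hedged sketch of method 600 (FIG. 6) under assumed parameters.
import numpy as np

def analyze_event(samples, fs, f_low=50.0, f_high=2000.0, energy_threshold=1e-3):
    spectrum = np.abs(np.fft.rfft(samples)) ** 2        # step 604: spectral distribution
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)

    in_band = (freqs >= f_low) & (freqs <= f_high)      # step 606: exclude out-of-range bins
    strong = spectrum >= energy_threshold               # step 608: exclude low-energy bins
    keep = in_band & strong
    if not keep.any():
        return {"event_detected": False, "template_match": 0.0}

    # Step 610: gaussian template centered on the strongest retained bin (assumed width).
    peak = freqs[keep][np.argmax(spectrum[keep])]
    sigma = (f_high - f_low) / 6.0
    template = np.exp(-0.5 * ((freqs[keep] - peak) / sigma) ** 2)

    # Step 612: compare retained bins to the template via normalized correlation.
    observed = spectrum[keep] / spectrum[keep].max()
    score = float(np.dot(observed, template) /
                  (np.linalg.norm(observed) * np.linalg.norm(template)))
    return {"event_detected": True, "template_match": score}
```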

[0055] FIG. 7 shows a graph illustrating concepts of an example method of analysing a mechanical wave received by a MEMS sensor. Frequency bins 702, 703, and 704 located along the x-axis represent the different frequencies present in a received mechanical wave. In an embodiment, the y-axis represents the energy level corresponding to the frequency bins. The frequency range bounds are delineated by low frequency bound 706 and high frequency bound 708. Frequency bins 703 are excluded for being outside the frequency range bounds. Frequency bin 704 is excluded for being below the threshold 709. The gaussian distribution template 710 is determined based on at least one frequency bin 702, 703, and 704. In an embodiment, the gaussian distribution template 710 is determined based on at least one non-excluded frequency bin 702.

[0056] Turning now to FIG. 8, an example method 800 of analysing a mechanical wave received by a MEMS sensor is described. In step 802, the mechanical waves are received by at least one MEMS sensor.

[0057] In step 804, at least one spectral distribution of the received mechanical wave is determined over at least one parameter. In an embodiment, the at least one parameter is time. In an embodiment, when the at least one parameter is time, the spectral distribution of the received mechanical wave is determined in sequential time instances. In an embodiment, the sequential time instances can be continuous. In an embodiment, the sequential time instances are not continuous.

[0058] In an embodiment, the at least one parameter is associated with at least one of the type and the quantity of the MEMS sensor that received the mechanical wave. As an example, the spectral distribution is determined by analyzing the digitized information provided by at least two MEMS sensors about the same mechanical wave at the same point in time, overlapping points in time, or different points in time. In an embodiment, the at least two MEMS sensors can be of different types (e.g. at least one MEMS microphone, at least one MEMS accelerometer, and at least one MEMS gyroscope). In an embodiment, the at least two MEMS sensors can be of the same type (e.g. multiple MEMS microphones, multiple MEMS accelerometers, or multiple MEMS gyroscopes).

[0059] As set forth elsewhere in this disclosure, in an embodiment, at least one frequency bin of at least one spectral distribution is excluded. In an embodiment, no frequency bins are excluded. As set forth elsewhere in this disclosure, in an embodiment, at least one frequency range is determined for at least one spectral distribution.

[0060] In step 806, a multivariate gaussian distribution template of frequency bins over at least one parameter is determined. In an embodiment, the multivariate gaussian distribution template is determined based on at least one of the non-excluded frequency bins. In an embodiment, the multivariate gaussian distribution template is determined based on all the frequency bins within the frequency range bounds. In an embodiment, the multivariate gaussian distribution template is determined based on all frequency bins. In an embodiment, the multivariate gaussian distribution template is determined based on an arbitrary number of frequency bins. In an embodiment, the multivariate gaussian distribution template is determined based on arbitrary frequency bins. In an embodiment, the multivariate gaussian distribution template is determined based on the frequency bin with the highest energy level. In an embodiment, the multivariate gaussian distribution template is determined based on frequency bins previously associated with a known event. In an embodiment, the multivariate gaussian distribution template is determined based on at least one characteristic of at least one frequency bin.

[0061] In step 808, the at least one spectral distribution over the at least one parameter is compared to the multivariate gaussian distribution template to determine at least one characteristic of the event. In an embodiment, the method 800 determines whether at least some of the frequency bins of the at least one spectral distribution over the at least one parameter fall within the multivariate gaussian distribution template to determine at least one characteristic of the event.

[0062] In an embodiment, the at least one characteristic is whether the event has happened. In an embodiment, the at least one characteristic is the type of event. As mentioned above, in an embodiment, the type of event is at least one of a pinch, a tap, a snap, a wiggle, and a fist. In an embodiment, the at least one characteristic is which body parts were involved in the event. In an embodiment, the at least one characteristic is which fingers were involved in the event. In an embodiment, the at least one characteristic is whether an object was involved in the event. In an embodiment, the at least one characteristic is what type of object was involved in the event.
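
The comparison in steps 806 and 808 might be realized as in the following sketch. It is an assumption about one possible realization: the template statistics would in practice come from spectral frames of a known event, and the log-likelihood threshold, the regularization term, and the SciPy dependency are illustrative choices.

```python
# Hedged sketch: scoring a sequence of spectral frames against a
# multivariate gaussian template (steps 806-808 of method 800).
import numpy as np
from scipy.stats import multivariate_normal

def fit_template(training_frames: np.ndarray):
    """training_frames: (n_frames, n_bins) spectral distributions of a known event."""
    mean = training_frames.mean(axis=0)
    # Small diagonal term keeps the covariance invertible (an assumption, not from the text).
    cov = np.cov(training_frames, rowvar=False) + 1e-6 * np.eye(training_frames.shape[1])
    return multivariate_normal(mean=mean, cov=cov)

def frames_match(template, frames: np.ndarray, log_threshold: float = -50.0) -> bool:
    """Step 808: do the observed frames fall within the template?"""
    return bool(np.all(template.logpdf(frames) >= log_threshold))
```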

[0063] Turning now to FIG. 9, an example method 900 of analysing a mechanical wave received by a MEMS sensor (e.g., accelerometer or gyroscope) is described. In step 902, the system acquires, using at least one MEMS sensor, N samples of a mechanical wave. In step 904, the difference between a sample and the next sample is determined. As a non-limiting example, where N=10, the difference between a sample (e.g., N0) and the next sample (e.g., N1) is determined. Then, in step 906, the spectral distribution of the difference between the samples is determined.

[0064] In step 908, at least one frequency bin is excluded. In an embodiment, the number of frequency bins excluded is calculated as a portion of all the frequency bins. In an embodiment, no frequency bins are excluded. In an embodiment, an arbitrary number of frequency bins are excluded.

[0065] In step 910, at least one frequency bin below a predetermined threshold energy level is excluded. In an embodiment, the threshold is an arbitrary energy level. In an embodiment, the threshold is calculated as a portion of the energy level of at least one frequency bin. In an embodiment, the threshold is associated with the energy level of at least one frequency bin. In an embodiment, the threshold is associated with the energy level of noise present in the mechanical wave. In an embodiment, the threshold is arbitrarily determined. In an embodiment, the threshold is determined by previously associating at least one frequency bin to at least one event. In an embodiment, determining that at least one frequency bin is above the threshold can indicate that an event has happened. In an embodiment, determining that at least one frequency bin is above the threshold can indicate the type of event that has happened. In an embodiment, determining that at least one frequency bin is above the threshold can indicate the type of event that has happened based on the at least one frequency bin above the threshold.

[0066] In step 912, the system determines at least one characteristic of the event based on whether at least one frequency bin is above the threshold. In an embodiment, the at least one characteristic is whether the event has happened. In an embodiment, the at least one characteristic is the type of event. As mentioned above, in an embodiment, the type of event is at least one of a pinch, a tap, a snap, a wiggle, and a fist. In an embodiment, the at least one characteristic is which body parts were involved in the event. In an embodiment, the at least one characteristic is which fingers were involved in the event. In an embodiment, the at least one characteristic is whether an object was involved in the event. In an embodiment, the at least one characteristic is what type of object was involved in the event.
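
A compact Python sketch of method 900 follows, under assumed parameters; the fraction of bins excluded in step 908, the choice to drop the lowest-frequency bins, and the energy threshold are placeholders rather than values from the application.

```python
# Hedged sketch of method 900 (FIG. 9): difference the samples, take the
# spectral distribution, exclude bins, and flag an event on any remaining bin.
import numpy as np

def detect_event(samples: np.ndarray, exclude_fraction=0.25, threshold=1e-3) -> bool:
    diffs = np.diff(samples)                         # step 904: sample-to-sample differences
    spectrum = np.abs(np.fft.rfft(diffs)) ** 2       # step 906: spectral distribution

    # Step 908: exclude a portion of all bins (here the lowest-frequency ones, as an assumption).
    n_excluded = int(len(spectrum) * exclude_fraction)
    retained = spectrum[n_excluded:]

    retained = retained[retained >= threshold]       # step 910: exclude bins below the threshold
    return retained.size > 0                         # step 912: any remaining bin implies an event
```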

[0067] An aspect of the present disclosure is a method of analysing at least one of a mechanical wave and an electronic signal. In an embodiment, the method comprises using wavelet processing to detect events from at least one of a mechanical wave and an electronic signal transmitted by at least one body part when the at least one body part interacts with at least one other body part; in an embodiment, the interaction is between two fingers.

[0068] In an embodiment, the method comprises a wavelet transform, or continuous wavelet transform (CWT). A CWT is similar to the FFT, in that it can produce a two dimensional representation of frequency content as a function of time. In an embodiment, a CWT can be tuned to the type of signal being analyzed - similar to a matched filter. In an embodiment, a CWT can be tuned by changing the type of wavelet used. As will be noted by those skilled in the art, wavelets are not a fixed shape and a large number of wavelet types have been developed; therefore, the specific wavelet is non-limiting and a multitude of wavelet types can be used.

[0069] An aspect of the present disclosure is a method of analysing at least one of a mechanical wave and an electronic signal to determine a fingerprinting and event classification. In an embodiment, the user fingerprint denotes a signature ascribed to a specific movement by a specific user; which can be used to differentiate among users of the same system. In an embodiment, the user fingerprint is found in at least one of a mechanical wave and an electronic signal.

[0070] As noted above the systems and methods described herein involve principles related to and for designing, manufacturing and using sensors implementing mechanical wave signals. Mechanical wave signals (e.g., acoustic or vibratory signals) have a myriad of features or descriptors which can be used in order to classify or identify the event associated with or the source generating the signal being analyzed. These features include but are not limited to volume, pitch, Zero Crossing Rate (ZCR), energy, centroids (explained in further detail below), and bandwidth. These features can be broadly classified as global temporal features, instantaneous temporal features, energy features, spectral shape features, harmonic features, and perceptual features, among others. FIG. 10 illustrates the relationship between the different stages of signal processing and the features analyzed at each step.

[0071] In an embodiment, a method of extracting features of a received mechanical wave signal includes a pre-computing stage. During this stage multiple signal representations can be determined to aid in the extraction of descriptors during later stages of processing.

[0072] In an embodiment, a signal representation comprises estimating the energy envelope of the signal. The energy envelope of a signal can be used for the calculation of global descriptors (i.e., Log-time attacks, temporal centroids). In an embodiment, the energy envelope can be computed using a low pass filtering of the analytical signal amplitude. In an embodiment, efficient calculation of the energy envelope relies on the instantaneous RMS values of at least a portion of the signal (e.g., the local signal, windowed signal, or signal frame).
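
As a non-limiting illustration (not the application's code), an energy envelope of this kind can be approximated by one RMS value per signal frame; the frame length and hop size below are assumptions.

```python
# Illustrative sketch: energy envelope estimated from framewise RMS values.
import numpy as np

def energy_envelope(signal: np.ndarray, frame_len: int = 512, hop: int = 128) -> np.ndarray:
    """One RMS value per frame approximates the signal's energy envelope."""
    starts = range(0, len(signal) - frame_len + 1, hop)
    return np.array([np.sqrt(np.mean(signal[s:s + frame_len] ** 2)) for s in starts])
```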

[0073] In an embodiment, a signal representation comprises the Short Time Fourier Transform (STFT) of at least a portion of the signal. In an embodiment, Short Time Fourier Transform is triggered by a slope change or slope attack in the energy curve of at least a portion of the signal.

[0074] In an embodiment, a signal representation comprises a sinusoidal harmonic model of the signal. In an embodiment, the sinusoidal harmonic model relies on estimating the peaks of the STFT of the local signal (i.e., the windowed signal) during each frame. In an embodiment, a windowed signal is 512 samples wide and incremented at 128 samples. In an embodiment, STFT peaks close to the fundamental frequency can be identified in order to estimate sinusoidal harmonic frequency and amplitude.
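
The per-frame peak estimation described above might look as follows in Python; the window function, the fundamental-frequency search range, and the single-peak simplification are assumptions, while the 512-sample window and 128-sample increment follow the text.

```python
# Hedged sketch: framewise STFT magnitude and peak picking near an assumed
# fundamental-frequency range, as one way to estimate harmonic frequency and amplitude.
import numpy as np

def harmonic_peaks(signal, fs, frame_len=512, hop=128, f0_range=(80.0, 400.0)):
    window = np.hanning(frame_len)
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / fs)
    band = (freqs >= f0_range[0]) & (freqs <= f0_range[1])
    peaks = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        mag = np.abs(np.fft.rfft(signal[start:start + frame_len] * window))
        idx = np.argmax(mag * band)            # strongest bin within the assumed range
        peaks.append((freqs[idx], mag[idx]))   # estimated frequency and amplitude per frame
    return peaks
```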

[0075] As shown in FIG. 10, global descriptors or global temporal features are obtained at different stages of the signal processing pipeline. In an embodiment, a global temporal feature or descriptor can be extracted from a signal frame by analyzing the energy envelope (i.e., envelope characterization). In an embodiment, this analysis focuses on a slope attack. In an embodiment, the attack part is described using two parameters: the duration of the attack and the average slope of the energy of the attack (i.e., the increase factor). In an embodiment, the duration of the attack can be estimated by applying a set of thresholds that define the start and end of the attack. In an embodiment, when a low threshold is achieved, the start of the attack is determined. In an embodiment, if a high threshold is achieved, the end of the attack is determined. In an embodiment, the thresholds are predetermined energy levels. In an embodiment, the thresholds are dynamically calculated with respect to all of the energy levels of the signal. In an embodiment, the thresholds are determined using finite impulse response (FIR). In an embodiment, the thresholds are supplemented with finite impulse response (FIR).

[0076] In an embodiment, the energy envelope is analyzed by using a Log-time attack method. Using this method, the attack start and end are determined by using the logarithm of the time duration between the time the signal starts to the time it reaches the stable portion.

[0077] In an embodiment, the energy envelope is analyzed by using a temporal centroid method. Using this method, time can be averaged over the energy envelope. In an embodiment, this method helps distinguish signals generated as a result of a percussive event (e.g., a tap) and signals generated from sustained events (e.g., a pinch or a press) because signals from percussive events tend to have a low temporal centroid.

[0078] In an embodiment, the energy envelope is analyzed by using a temporal increase method. Using this method, the increase time can be defined as the average temporal slope of the energy during the time attack. In an embodiment, the average temporal slope can be derived by computing the local slopes of the energy corresponding to each window being analyzed and then computing the weighted average of the slopes.

[0079] Returning to FIG. 10, in an embodiment, global descriptors can be obtained from instantaneous temporal features. In an embodiment, instantaneous temporal features are determined using an autocorrelation method. This method calculates Root Mean Square Error (RMSE) of the signal in the time domain using the equation below.
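
The referenced equation does not survive in this copy of the text. A standard root-mean-square-error definition, included here as an assumption rather than as the application's original equation, is

\mathrm{RMSE} = \sqrt{ \tfrac{1}{N} \sum_{n=1}^{N} \left( x[n] - \hat{x}[n] \right)^{2} },

where x[n] are the received time-domain samples, x̂[n] is the reference or estimated signal being compared against, and N is the number of samples in the frame.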

[0080] Essentially, the signal spectral distribution is being represented in the time domain. FIG. 11 illustrates the relationship between the time domain and the RMSE for a received signal. In an embodiment, events can be distinguished by comparing the signal to the RMSE. In an embodiment, a tap event is distinguished from swinging hand movements. In an embodiment, the tap event has a low error between the time-domain signal and the RMSE in the time domain. In an embodiment, swinging hand movements have higher errors because the resonance is found not to match. FIG. 12 illustrates an expanded view of a signal in the time domain and the RMSE of the signal in the time domain. As will be noted, in an embodiment, a delay of 50 milliseconds occurs between the onset of the movement and proper classification.

[0081] In an embodiment, instantaneous temporal features are determined using a Zero Crossing Rate (ZCR) method. This method measures the number of times the signal crosses the origin (i.e., the 0 axis). In an embodiment, periodic sounds have a low ZCR. In an embodiment, periodic sounds have a low ZCR because the impulse, relative to other signals, is high energy and carries a majority of the signal. In an embodiment, noisy sounds have a high ZCR.
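As a non-limiting sketch, the zero crossing rate of a frame can be computed as the fraction of adjacent sample pairs whose signs differ; the helper name is illustrative only.

```python
import numpy as np

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs that straddle the 0 axis."""
    signs = np.signbit(frame)
    return np.mean(signs[1:] != signs[:-1])
```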

[0082] Still referring to FIG. 10, in an embodiment, global descriptors can also be obtained from energy features. In an embodiment, energy features include the total energy (i.e., estimating the signal power at a given time or estimating signal power directly from the signal frame around a given time). In an embodiment, energy features include the harmonic part energy (i.e., estimating the power of the harmonic part of the signal at a given time or estimating the power from the estimated harmonic amplitude at a given time). In an embodiment, energy features include the noise part energy (i.e., estimating the power of the noise part of the signal at a given time or estimating the power by subtracting the harmonic part from the signal).
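As a non-limiting sketch assuming the harmonic amplitudes have already been estimated (for example by the peak-picking sketch above), the total, harmonic part, and noise part energies of a frame can be computed as follows; the helper name is illustrative only.

```python
import numpy as np

def energy_features(frame, harmonic_amplitudes):
    """Total energy, harmonic part energy, and noise part energy for one signal frame."""
    total_energy = np.sum(np.asarray(frame, dtype=float) ** 2)
    harmonic_energy = np.sum(np.asarray(harmonic_amplitudes, dtype=float) ** 2)
    noise_energy = max(total_energy - harmonic_energy, 0.0)   # noise = total minus harmonic part
    return total_energy, harmonic_energy, noise_energy
```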

[0083] In an embodiment, global descriptors are obtained from spectral features. In an embodiment, spectral features include but are not limited to the spectral shape description, the temporal variation of the spectrum, and the global spectral shape description. In an embodiment, the spectral shape description is based on the spectral centroid (i.e., the barycenter of the spectrum). In an embodiment, the spectral shape description measures the spectral skewness (i.e., the asymmetry of a distribution around its mean value). The spectral skewness describes the degree of asymmetry of the distribution (i.e., SK=0 - symmetric distribution, SK<0 - more energy to the right (high frequency), SK>0 - more energy to the left (low frequency)). In an embodiment, the spectral shape description is based on the spectral spread (i.e., the spread of the spectrum around its mean value). In an embodiment, the temporal variation of the spectrum represents the amount of variation of the spectrum over time. In an embodiment, the global spectral shape description is based on the Mel Frequency Cepstral Coefficients (MFCC). As will be noted, the MFCC represent the shape of the spectrum with very few coefficients. In this method, the Fourier Transform of the logarithm of the spectrum is found. Then, the mel-cepstrum (i.e., the cepstrum computed on the mel-bands) is obtained.
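As a non-limiting sketch, the spectral centroid, spread, and skewness of a frame can be computed by treating the normalized magnitude spectrum as a distribution over frequency; the helper name is illustrative, and the MFCC computation is left to standard audio feature libraries.

```python
import numpy as np

def spectral_shape(frame, fs):
    """Spectral centroid (barycenter), spread, and skewness of one frame's magnitude spectrum."""
    window = np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(frame * window))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    p = spectrum / np.sum(spectrum)                    # normalized magnitude distribution
    centroid = np.sum(freqs * p)                       # barycenter of the spectrum
    spread = np.sqrt(np.sum(((freqs - centroid) ** 2) * p))
    skewness = np.sum(((freqs - centroid) ** 3) * p) / (spread ** 3)
    return centroid, spread, skewness
```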

[0084] In an embodiment, global descriptors are obtained from harmonic features. In an embodiment, harmonic features include the fundamental frequency (i.e., the frequency whose integer multiples best explain the contents of the signal spectrum), noisiness (i.e., the ratio between the energy of the noise (non-harmonic) part and the total energy), inharmonicity (i.e., the divergence of the signal spectral components from a purely harmonic signal), and harmonic spectral deviation (i.e., the deviation of the amplitudes of the harmonic peaks from a global spectral envelope). In an embodiment, a noisiness level close to 1 indicates a purely noisy signal, and a noisiness level close to 0 indicates a purely harmonic signal. As will be noted, harmonic spectral deviation essentially fits a spline through the harmonic peaks of the signal and then measures the deviation of each harmonic from that envelope. In an embodiment, the harmonic spectral deviation extracts the spectral variance. The spectral variance is a measure of the standard deviation of a signal's magnitude spectrum. In an embodiment, observed tap signals appear to have little harmonic spectral deviation, further implying that they resemble a plosive stop.
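As a non-limiting sketch assuming the harmonic part has already been separated, noisiness and inharmonicity can be estimated as follows; both helper names are illustrative only.

```python
import numpy as np

def noisiness(total_energy, harmonic_energy):
    """Ratio of non-harmonic (noise) energy to total energy: near 1 is noisy, near 0 is harmonic."""
    noise_energy = max(total_energy - harmonic_energy, 0.0)
    return noise_energy / total_energy if total_energy > 0 else 0.0

def inharmonicity(harmonic_freqs, f0):
    """Average relative divergence of measured partials from exact integer multiples of f0."""
    harmonic_freqs = np.asarray(harmonic_freqs, dtype=float)
    k = np.arange(1, len(harmonic_freqs) + 1)
    return float(np.mean(np.abs(harmonic_freqs - k * f0) / (k * f0)))
```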

[0085] In an embodiment, global descriptors are obtained from perceptual features. In an embodiment, perceptual features include specific loudness (i.e., the loudness associated with each bark band), total loudness (i.e., the sum of the individual specific loudnesses), and sharpness. Sharpness is the perceptual equivalent of the spectral centroid but computed using the specific loudness of the bark bands.

[0086] In an embodiment, once global descriptors have been obtained, information regarding the event may be gleaned (i.e., type, force, duration). Several methods can be leveraged to classify the events within these categories. For instance, convolutional neural networks and machine learning can be applied to recognize patterns in the data and classify the events based on these patterns. Similarly, the data can be compared against known descriptors for known characteristics of the different events.
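As a non-limiting sketch of the comparison-against-known-descriptors approach, an event can be assigned the label of the nearest reference descriptor vector; the labels, vectors, and helper name below are purely illustrative and would in practice come from training or calibration data.

```python
import numpy as np

def classify_event(descriptor, reference_descriptors):
    """Assign the label of the closest known descriptor vector (nearest-neighbor comparison)."""
    labels = list(reference_descriptors)
    distances = [np.linalg.norm(descriptor - reference_descriptors[label]) for label in labels]
    return labels[int(np.argmin(distances))]

# Illustrative usage with made-up two-element descriptors (temporal centroid, ZCR):
references = {"tap": np.array([0.02, 0.05]), "pinch": np.array([0.15, 0.02])}
print(classify_event(np.array([0.03, 0.06]), references))   # -> "tap"
```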

[0087] An aspect of the present disclosure is a wearable sensing system comprising at least one MEMS sensor secured to a user, the MEMS sensor adapted to receive mechanical wave signals generated by a body part of the user when the user effects an event; and, a processor operatively connected to the at least one MEMS sensor. The processor is adapted to determine the spectral distribution of the received mechanical wave signal and to extract from the spectral distribution at least one descriptor related to the movement or pose of the body part.

[0088] Another aspect of the present disclosure is a wearable sensing system comprising a substrate adjacent to a first body part of a user. The wearable sensing system further comprises at least one MEMS sensor mechanically connected to the substrate and adapted to receive mechanical wave signals transmitted through the substrate; and, a processor operably connected to the at least one MEMS sensor. The processor is adapted to determine a measurement of each of the received mechanical wave signals by the MEMS sensor and to determine at least one descriptor associated with an event effected by the first body part.

[0089] Yet another aspect of the present disclosure is a mechanical wave sensing system. In an embodiment, the mechanical wave sensing system is an acoustic sensing system. The mechanical wave sensing system comprises a substrate adapted to be located on a user’s body; a plurality of MEMS microphones adapted to receive mechanical waves, wherein at least one of the plurality of MEMS microphones is operably attached to the substrate; and a processor operably connected to the plurality of MEMS microphones, wherein the processor is adapted to process measurements of the mechanical waves received by the plurality of MEMS microphones and to determine information regarding movement of a body part using the measurements.

[0090] Still yet another aspect of the present disclosure is a system. The system comprises a substrate; a plurality of MEMS sensors adapted to receive mechanical waves, wherein at least one of the plurality of MEMS sensors is operably attached to the substrate; a plurality of transmitting antennas, wherein at least one of the plurality of transmitting antennas is operably connected to a signal generator, wherein the signal generator is adapted to generate a plurality of unique frequency orthogonal signals, each of the plurality of unique frequency orthogonal signals being frequency orthogonal to each other; a plurality of receiving antennas, wherein the plurality of receiving antennas are adapted to receive the plurality of unique frequency orthogonal signals; and a processor operably connected to the plurality of MEMS sensors and the plurality of receiving antennas, wherein the processor is adapted to process measurements of the mechanical waves received by the plurality of MEMS sensors and to process measurements of received unique frequency orthogonal signals, wherein processed measurements of the mechanical waves and received unique frequency orthogonal signals are used to determine information regarding movement of a body part.

[0091] While the invention has been particularly shown and described with reference to a preferred embodiment thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.