

Title:
USER COMMAND DETERMINATION BASED ON A VIBRATION PATTERN
Document Type and Number:
WIPO Patent Application WO/2018/044443
Kind Code:
A1
Abstract:
Embodiments of the present disclosure provide techniques and configurations for an apparatus to determine a command to the apparatus, based on vibration patterns. In one instance, the apparatus may include a body with at least one surface to receive one or more user inputs; at least one sensor disposed to be in contact with the body to detect vibration manifested by the surface in response to the user input, and generate a signal indicative of the detected vibration; and a controller coupled with the sensor, to process the vibration-indicative signal, identify a vibration pattern, and determine a command based at least in part on the identified vibration pattern. The command may be provided to interact, operate, or control the apparatus. Other embodiments may be described and/or claimed.

Inventors:
LOPEZ MEYER PAULO (MX)
CORDOURIER MARURI HECTOR ALFONSO (MX)
ZAMORA ESQUIVEL JULIO CESAR (MX)
IBARRA VON BORSTEL ALEJANDRO (MX)
CAMACHO PEREZ JOSE RODRIGO (MX)
ROMERO ARAGON JORGE CARLOS (MX)
Application Number:
PCT/US2017/044243
Publication Date:
March 08, 2018
Filing Date:
July 27, 2017
Assignee:
INTEL CORP (US)
International Classes:
G06F3/01; G06F1/16; G06N3/08; G08B6/00; H01L41/08
Domestic Patent References:
WO2016069052A12016-05-06
Foreign References:
US20150070290A12015-03-12
US20130044042A12013-02-21
US20150035759A12015-02-05
US20080284620A12008-11-20
US20170139480A12017-05-18
Attorney, Agent or Firm:
RASKIN, Vladimir et al. (US)
Claims:
Claims

What is claimed is:

1. An apparatus for determination of user commands based on vibration patterns, comprising:

a body of the apparatus, wherein the body includes at least one surface to receive one or more user inputs to convey a command to the apparatus;

at least one sensor disposed to be in contact with the body of the apparatus to detect vibration manifested by the surface in response to the one or more user inputs, and generate a signal indicative of the detected vibration; and

a controller coupled with the at least one sensor, to process the vibration-indicative signal, to identify a pattern of vibration, and determine the command based at least in part on the vibration pattern, wherein the command is to interact, operate or control the apparatus.

2. The apparatus of claim 1, wherein the controller is further to generate the command to interact, operate or control the apparatus, based at least in part on the vibration pattern.

3. The apparatus of claim 1, wherein the one or more user inputs include at least one of: one or more taps, one or more drags, or a combination of the one or more taps and the one or more drags.

4. The apparatus of claim 1, wherein the at least one surface includes one of: a smooth surface, a rough surface, a surface with a surface pattern, or a combination thereof, wherein the surface pattern includes one or more surface elements disposed in a pattern on the body of the apparatus.

5. The apparatus of claim 1, wherein the at least one sensor comprises a piezoelectric transducer responsive to vibration.

6. The apparatus of claim 1, wherein the controller to identify a vibration pattern includes to:

determine that a portion of the signal provided by the at least one sensor is indicative of the user input;

extract features indicative of the vibration pattern from the portion of the signal; and identify the vibration pattern, based at least in part on the features.

7. The apparatus of claim 6, wherein the features comprise mel-frequency cepstral coefficients (MFCC), wherein to extract features includes to calculate multiple MFCC for the portion of the signal.

8. The apparatus of claim 6, wherein to determine the command includes to input the features into a neural network that is trained to recognize the one or more user inputs to the at least one surface, based on the inputted features, wherein the recognized one or more user inputs to the at least one surface indicates the command to interact, operate, or control the apparatus.

9. The apparatus of claim 1, further comprising an amplifier coupled with the at least one sensor, to amplify and condition the signal indicative of vibration, and to provide the amplified and conditioned signal to the controller for processing.

10. The apparatus of any of claims 1 to 9, wherein the apparatus includes one of: a portable or a wearable device.

11. The apparatus of claim 10, wherein the apparatus comprises eyewear, wherein the at least one surface is disposed on one of: a bridge of a frame of the eyewear, a frontal portion of the eyewear frame, a right side of the eyewear frame, or a left side of the eyewear frame.

12. A method for fabrication of an apparatus for determination of user commands based on vibration patterns, comprising:

providing at least one surface on a body of an apparatus, wherein the surface is to receive one or more user inputs to convey a command to the apparatus;

disposing at least one sensor to be in contact with the body of an apparatus, to detect vibration manifested by the surface in response to the one or more user inputs, and generate a signal indicative of vibration;

disposing a controller on or inside the body of the apparatus; and

electrically coupling the at least one sensor with the controller, to process the signal indicative of vibration, to identify a pattern of vibration, and determine the command that corresponds to the vibration pattern, wherein the command is to interact, operate or control the apparatus.

13. The method of claim 12, wherein disposing at least one sensor to be in contact with the body of an apparatus includes placing the at least one sensor on or inside the body of the apparatus.

14. The method of claim 12, wherein providing at least one surface on a body of an apparatus includes forming the at least one surface to include one of: a smooth surface, a rough surface, a surface with a surface pattern, or a combination thereof, wherein the surface pattern includes one or more surface elements disposed in a pattern on the body of the apparatus.

15. The method of claim 12, further comprising: providing the at least one sensor, wherein the sensor comprises a piezoelectric transducer responsive to vibration.

16. The method of any of claims 12 to 15, wherein electrically coupling the at least one sensor with a controller includes:

electrically coupling the at least one sensor with an amplifier, and electrically coupling the amplifier with the controller, to amplify and condition the signal indicative of vibration provided by the at least one sensor, and to provide the amplified and conditioned signal to the controller for processing.

17. One or more non-transitory controller-readable media having instructions for determination of user commands based on vibration patterns stored thereon that cause a controller of an apparatus, in response to execution by the controller, to:

receive, from at least one sensor disposed to be in contact with a body of the apparatus, a signal indicative of vibration, wherein the signal is generated in response to one or more user inputs to at least one surface of the body of the apparatus; and

process the vibration-indicative signal, wherein to process includes to identify a pattern of vibration, and determine the command based at least in part on the vibration pattern, wherein the command is to interact, operate or control the apparatus.

18. The non-transitory controller-readable media of claim 17, wherein the instructions that cause the controller to process the vibration-indicative signal further cause the controller to: determine that a portion of the signal provided by the at least one sensor is indicative of the user input; extract features indicative of the vibration pattern from the portion of the signal; and identify the vibration pattern, based at least in part on the features.

19. The non-transitory controller-readable media of claim 18, wherein the instructions that cause the controller to identify the command further cause the controller to input the features into a neural network that is trained to recognize the one or more user inputs to the at least one surface, based on the inputted features, wherein the recognized one or more user inputs to the at least one surface indicates the command to interact, operate, or control the apparatus.

20. The non-transitory controller-readable media of any of claims 17 to 19, wherein the instructions further cause the controller to generate the command that corresponds to the vibration pattern.

21. An apparatus for determination of user commands based on vibration patterns, comprising:

means for receiving, from at least one sensor disposed to be in contact with a body of the apparatus, a signal indicative of vibration, wherein the signal is generated in response to one or more user inputs to at least one surface of the body of the apparatus; and

means for processing the vibration-indicative signal, including means for identifying a pattern of vibration, and determining the command based at least in part on the vibration pattern, wherein the command is to interact, operate or control the apparatus.

22. The apparatus of claim 21, wherein the means for processing the vibration-indicative signal include:

means for determining that a portion of the signal provided by the at least one sensor is indicative of the user input;

means for extracting features indicative of the vibration pattern from the portion of the signal; and

means for identifying the vibration pattern, based at least in part on the features.

23. The apparatus of claim 22, wherein the means for identifying the command include means for inputting the features into a neural network that is trained to recognize the one or more user inputs to the at least one surface, based on the inputted features, wherein the recognized one or more user inputs to the at least one surface indicates the command to interact, operate, or control the apparatus.

24. The apparatus of any of claims 21 to 23, wherein the apparatus further comprises means for generating the command that corresponds to the vibration pattern.

Description:
USER COMMAND DETERMINATION BASED ON A VIBRATION PATTERN

Cross-Reference to Related Application

This application claims priority to U.S. Application No. 15/252,175, entitled "USER COMMAND DETERMINATION BASED ON A VIBRATION PATTERN," filed August 30, 2016.

Field

Embodiments of the present disclosure generally relate to the field of computing, and more particularly, to wearable devices having sensor devices configured to facilitate identification of commands to interact, operate, or control the wearable devices, based on vibration patterns resulting from user input.

Background

Portable or wearable devices continue to increase in popularity and feature increasingly sophisticated functionality. These devices need an input interface through which the user can control the device, provided either by adding hardware in the form of electronic buttons, or by an external communication device, e.g., a remote control, a smart phone application, or the like. However, the use of electronic or touch-screen buttons may be frustrating for the user, because it may be difficult to identify and/or use a specific button or other interface component, given the limited size of the device's keyboard or screen.

Brief Description of the Drawings

Embodiments will be readily understood by the following detailed description in conjunction with the accompanying drawings. To facilitate this description, like reference numerals designate like structural elements. Embodiments are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings.

FIG. 1 is a diagram illustrating an example apparatus equipped with the technology for determination of user commands based on vibration patterns, in accordance with some embodiments.

FIG. 2 is an example illustration of some aspects of the apparatus of FIG. 1, in accordance with some embodiments.

FIGS. 3-6 illustrate example graphs showing results of detecting vibration patterns, in accordance with some embodiments.

FIG. 7 illustrates an example implementation of the apparatus of FIG. 1, in accordance with some embodiments.

FIG. 8 is a block diagram illustrating some aspects of sensor signal processing by the apparatus of FIG. 1, in accordance with some embodiments.

FIG. 9 is an example process flow diagram for determining user commands to an apparatus, based on vibration patterns, in accordance with some embodiments.

FIG. 10 is an example process flow diagram for manufacturing an apparatus for determination of user commands based on vibration patterns, in accordance with some embodiments.

Detailed Description

Embodiments of the present disclosure include techniques and configurations for an apparatus and method for determination of a command to the apparatus, based on vibration patterns. In some embodiments, the apparatus may include a body of the apparatus, with at least one surface to receive one or more user inputs to convey a command to the apparatus. The apparatus may further include at least one sensor disposed to be in contact with the body of the apparatus to detect vibration manifested by the surface in response to the user input, and generate a signal indicative of the detected vibration. The apparatus may further include a controller coupled with the sensor, to process the vibration-indicative signal, to identify a pattern of vibration, and determine the command based at least in part on the vibration pattern. The command may be provided to interact, operate, or control the apparatus.

In the following detailed description, reference is made to the accompanying drawings that form a part hereof, wherein like numerals designate like parts throughout, and in which are shown by way of illustration embodiments in which the subject matter of the present disclosure may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure.

Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of embodiments is defined by the appended claims and their equivalents.

For the purposes of the present disclosure, the phrase "A and/or B" means (A), (B), (A) or (B), or (A and B). For the purposes of the present disclosure, the phrase "A, B, and/or C" means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B, and C).

The description may use perspective-based descriptions such as top/bottom, in/out, over/under, and the like. Such descriptions are merely used to facilitate the discussion and are not intended to restrict the application of embodiments described herein to any particular orientation.

The description may use the phrases "in an embodiment" or "in embodiments," which may each refer to one or more of the same or different embodiments. Furthermore, the terms "comprising," "including," "having," and the like, as used with respect to embodiments of the present disclosure, are synonymous.

The term "coupled with," along with its derivatives, may be used herein. "Coupled" may mean one or more of the following. "Coupled" may mean that two or more elements are in direct physical, electrical, or optical contact. However, "coupled" may also mean that two or more elements indirectly contact each other, but yet still cooperate or interact with each other, and may mean that one or more other elements are coupled or connected between the elements that are said to be coupled with each other. The term "directly coupled" may mean that two or more elements are in direct contact.

FIG. 1 is a diagram illustrating an example apparatus equipped with the technology for determination of user commands, based on vibration patterns, in accordance with some embodiments. The apparatus 100 may comprise a wearable device, a portable device, or, more generally, any type of a computing device that may include means for user tactile interaction with the device. While examples of specific implementations (e.g., in headwear) and/or technologies (e.g., piezoelectric sensors, wireless communications, etc.) may be employed herein, these examples are presented merely to provide a readily comprehensible perspective from which the more generalized devices, methods, etc. described herein may be understood.

As noted above, the apparatus 100 may comprise a wearable device. In embodiments, the apparatus 100 may comprise eyeglasses, as described in greater detail with reference to FIG. 7.

The apparatus 100 may include a body 102. The body 102 may include at least one surface 104 configured to receive one or more user inputs to convey a command to the apparatus 100. The user may provide, e.g., a series of tactile inputs to the surface 104. The surface 104 may manifest a vibration pattern in response to a particular series of user inputs. The vibration with a particular pattern may propagate through the body 102 of the apparatus 100. The apparatus 100 may be configured to sense and identify the pattern of vibration. The command corresponding to the identified vibration pattern, when determined, may be generated to interact, operate, or control the apparatus 100.

The surface 104 may include a smooth surface, a rough surface, a surface with a surface pattern, or a combination thereof. The surface pattern may include one or more surface elements (e.g., bumps, lines, geometric figures, or other shapes) that may be disposed in a pattern on the body 102, as described in greater detail in reference to FIG. 2.

At least one sensor 106 may be disposed on or inside the apparatus 100, such as to be in contact with the body 102, in order to detect vibration generated by the surface 104 in response to the user input, and generate a signal indicative of the detected vibration. For example, the sensor 106 may be mounted on the body 102 via mechanical attachment (e.g., a screw, nail, or other fastener), adhesive attachment (e.g., a glue, epoxy, etc.), or may be incorporated within the structure of the body 102. In embodiments, the sensor 106 may comprise vibration sensing circuitry. The sensing circuitry may comprise, for example, piezoelectric components, such as a diaphragm or other piezoelectric transducer, to convert vibration (e.g., mechanical pressure waves) occurring in the body 102 in response to a user input to the surface 104 into signals.

The apparatus 100 may further include a controller device 110, which may also be disposed on the apparatus 100, e.g., in the body 102. The controller device 110 may be electrically and/or communicatively coupled with the sensor 106, to receive and process the signal provided by the sensor 106, as described below. In embodiments, the controller device 110 may be coupled with the sensor 106 via an amplifier 108. The amplifier 108 may be configured to amplify and condition the sensor signal indicative of vibration, and to provide the amplified and conditioned signal to the controller device 110 for processing.

The controller device 110 may be configured to process the sensor signal indicative of vibration, to identify a pattern of vibration and determine a command that corresponds to the pattern of vibration. Based on the determined command, the controller device 110 may generate the command to interact, operate or control the apparatus 100. The commands may include various functions to control the apparatus 100, e.g., powering on or off, initializing different apparatus functions, providing alphanumeric inputs to a password prompt or application prompt, and the like.

To process the sensor signal, the controller device 110 may include components configured to record and process the readings of the signal. The controller device 110 may provide these components through, for example, a plurality of machine-readable instructions (e.g., signal processing block 130) stored in a memory 122 and executable on a processor 120. The controller device 110 may record the sensor signal and store (e.g., buffer) the recorded readings, for example, in the memory 122, for further analysis and processing, e.g., in real time or near-real time.

The processor 120 may include, for example, one or more processors situated in separate components, or alternatively one or more processing cores embodied in a component (e.g., in a System-on-a-Chip (SoC) configuration), and any processor-related support circuitry (e.g., bridging interfaces, etc.). Example processors may include, but are not limited to, various microprocessors including those in the Pentium®, Xeon®, Itanium®, Celeron®, Atom®, Quark®, Core® product families, or the like.

Examples of support circuitry may include host side or input/output (I/O) side chipsets (also known as northbridge and southbridge chipsets/components) to provide an interface through which the processor 120 may interact with other system components that may be operating at different speeds, on different buses, etc., in the apparatus 100. Some or all of the functionality commonly associated with the support circuitry may also be included in the same physical package as the processor.

The memory 122 may comprise random access memory (RAM) or read-only memory (ROM) in a fixed or removable format. RAM may include volatile memory configured to hold information during the operation of apparatus 100 such as, for example, static RAM (SRAM) or dynamic RAM (DRAM). ROM may include non-volatile (NV) memory circuitry configured based on a basic input/output system (BIOS), Unified Extensible Firmware Interface (UEFI), etc. to provide instructions when apparatus 100 is activated, as well as programmable memories such as erasable programmable ROMs (EPROMs), Flash, etc. Other fixed/removable memory may include, but is not limited to, electronic memories such as solid state flash memory, removable memory cards or sticks, etc.

The controller device 110 may further include other components necessary for functioning of the apparatus 100. For example, the controller device 110 may include a communication block 124, which may include one or more radios capable of transmitting and receiving signals (e.g., to an external device, not shown) using various suitable wireless communications techniques. Such techniques may involve communications across one or more wireless networks. Some example wireless networks include (but are not limited to) wireless local area networks (WLANs), wireless personal area networks (WPANs), wireless metropolitan area networks (WMANs), cellular networks, and satellite networks. In communicating across such networks, the communication block 124 may operate in accordance with one or more applicable standards in any version. To this end, the communication block 124 may include, for instance, hardware, circuits, software, or any combination thereof that allows communication with external computer systems.

In some specific non-limiting examples, the communication block 124 may comport with the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard (e.g., Wi-Fi), Bluetooth®, ZigBee®, near-field communication, or any other suitable wireless communication standard. In addition, the communication block 124 may comport with cellular standards such as 3G (e.g., Evolution-Data Optimized (EV-DO), Wideband Code Division Multiple Access (W-CDMA)) and/or 4G wireless standards (e.g., High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long-Term Evolution (LTE)).

The controller device 110 may include other components 126 that may be necessary for functioning of the apparatus 100. Other components 126 may include, for example, a power circuitry block configured to provide power supply to the components of the apparatus 100. The power circuitry block may include internal power sources (e.g., battery, fuel cell, etc.) and/or external power sources and related circuitry configured to supply apparatus 100 with the power needed to operate. Other components 126 may further include various output mechanisms (e.g., speakers, displays, lighted/flashing indicators, electromechanical components for vibration, motion, etc.). The hardware in other components 126 may be incorporated within the controller device 110 and/or may be external to the controller device 110.

FIG. 2 is an example illustration of some aspects of the apparatus of FIG. 1, in accordance with some embodiments. For ease of understanding, like components of FIGS. 1 and 2 are indicated by like numerals. More specifically, FIG. 2 illustrates exemplary surfaces which, when combined with a sensor, may be used to capture user input across the surfaces as vibrations. The captured vibrations may in turn be translated into commands or gestures.

In a setup 200 of FIG. 2, two sensors 106 (e.g., piezoelectric transducers) may be connected in series and coupled to an amplifier, such as a MOTU amplifier to condition the sensor signal. Each sensor may be composed of a metallic disk and a thin layer of piezoelectric material (Murata® 7BB-20-6L0). As shown, the sensors 106 may be affixed to the body 102 of the apparatus 100.

Vibration to be sensed by the sensors 106 may occur in response to the user input to (interaction with) one or more surfaces (e.g., surface 104 of FIG. 1) configured to receive user input and placed on the body 102. The example surfaces shown in setup 200 are described for ease of understanding and do not limit this disclosure; different types of surfaces may be employed to receive user input and generate vibration in response to the input. The example surfaces of FIG. 2 may include a smooth surface 202, a rough surface 204, a surface with a pattern (e.g., a bumpy surface) 206, or a combination thereof. User input may include different user interactions with different types of surfaces. For example, user input may include a one-finger tap over the smooth surface 202. In another example, user input may be a drag of one or more fingers along the rough surface 204. The surface with a pattern 206 may include different surface elements disposed in a particular pattern on the surface of the apparatus body. For example, the surface 206 may include an array of bumps or recesses 210 that may be disposed along the surface 206 with equal or different (e.g., progressively increasing) distances between each other, as shown in FIG. 2. Accordingly, the user input may include a drag of a finger along the surface 206. When the distance between the bumps or recesses differs (e.g., progressively increases), the pattern of a vibration signal generated in response to a finger drag along the surface in one direction (e.g., 212) may differ from the pattern of a vibration signal generated in response to a finger drag in the opposite direction (214).

Accordingly, a combination of a user input and a particular surface may be characterized as a gesture. The gestures may include (but are not limited to), for example, one or more taps over the smooth surface, a drag over a rough surface, a drag over a surface with a pattern in one direction, a drag over the surface with the pattern in the opposite direction, or any combination thereof. Each gesture, identified based on the determined vibration pattern, may correspond to a particular command to control the apparatus 100 of FIG. 1. The command may indicate an operation to be performed by the apparatus, e.g., powering the apparatus on or off, or turning different apparatus functions on and off. The command may also serve to interact with the apparatus, such as to enter information (e.g., passcodes), and the like.
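To make the gesture-to-command association concrete, the following minimal sketch shows one hypothetical way a controller might map recognized gestures to device commands; the gesture and command names are illustrative only and are not taken from this disclosure.

```python
# Hypothetical gesture-to-command mapping; all names are illustrative.
GESTURE_COMMANDS = {
    "tap_smooth": "toggle_power",
    "drag_rough": "cycle_function",
    "drag_pattern_forward": "confirm",
    "drag_pattern_backward": "cancel",
}

def command_for(gesture: str) -> str:
    """Return the command for a recognized gesture, or a no-op when
    the gesture is not in the mapping."""
    return GESTURE_COMMANDS.get(gesture, "no_op")
```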

More generally, different surface textures, or a set of regular or irregular patterns disposed along the surface of a body of an apparatus (e.g., a wearable device), may introduce additional differentiating features into the vibration patterns generated in response to different user interactions with the surface (e.g., finger tapping or dragging along the surface). Different vibration patterns may be used to differentiate among different gestures and may be further associated with different commands to the apparatus. The sensors 106 may generate a signal indicative of a vibration pattern, which may be processed to identify a user gesture (e.g., user input on a particular surface of the apparatus), and generate a corresponding command to control the apparatus. In other words, a vibration pattern, when identified, may indicate a command to control the apparatus.

To evaluate the vibration pattern differences, data from different gestures, e.g., a finger tap over the smooth surface, a drag on the smooth area, a drag on the rough surface, and a drag over the bumpy surface, may be collected. Each gesture may be repeated multiple times, e.g., ten times, for a total of forty user inputs.

FIGS. 3-6 illustrate example graphs showing results of detecting vibration patterns, in accordance with some embodiments. More specifically, graphs 302, 402, 502, and 602 illustrate example representations of the electronic (e.g., piezoelectric) signal captured during testing on the surfaces shown in FIG. 2, in accordance with some embodiments. The signals detected by the sensor are shown as a function of time, for the following gestures: tap on a smooth surface (302), drag on a smooth surface (402), drag on a rough surface (502), and drag on a surface with a pattern (602). The signal may be recorded at a sample frequency of 41 kHz, and downsampled to 16 kHz for post-processing analysis. The timing of the signal (shown on the X-axis) may be divided into 250 ms non-overlapping windows t_i (for i = 1, 2, ..., n, where n is the number of windows) to obtain a spectrogram S_i using the short-time Fourier transform.
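As a rough illustration of this windowing step, the sketch below computes a magnitude spectrogram over 250 ms non-overlapping windows using SciPy, assuming the signal has already been downsampled to 16 kHz; the function name and parameters are assumptions for illustration.

```python
# Minimal sketch of the 250 ms windowing / short-time Fourier transform
# step, assuming a 16 kHz downsampled signal (names are illustrative).
import numpy as np
from scipy.signal import stft

def spectrogram_250ms(signal: np.ndarray, fs: int = 16000):
    """Divide the signal into 250 ms non-overlapping windows t_i and
    return the magnitude spectrogram S_i via the STFT."""
    nperseg = int(0.250 * fs)  # 4000 samples per window at 16 kHz
    f, t, Z = stft(signal, fs=fs, nperseg=nperseg, noverlap=0)
    return f, t, np.abs(Z)
```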

Graphs 304, 404, 504, and 604 illustrate the respective spectrograms of the sensor signals 302, 402, 502, and 602. Graphs 306, 406, 506, and 606 illustrate the patterns that correspond to the signals illustrated in graphs 302, 402, 502, and 602, respectively. The features of the signal may be used to identify or classify a particular vibration pattern. For example, a signal feature may include the mel-frequency cepstral coefficients (MFCC) of the vibration signal generated by the sensor.

The apparatus with determination of user commands based on vibration patterns described herein may be easily adopted in the day-to-day life of a user, for example, in a wearable device, such as eyewear (eyeglasses, sunglasses, safety glasses, or the like) that may be routinely worn by people.

FIG. 7 illustrates an example implementation of the apparatus of FIG. 1, in accordance with some embodiments. In this example, the apparatus 100 may be implemented in eyewear, such as safety eyewear glasses 700, different views of which are shown in FIG. 7.

In embodiments, one or more piezoelectric sensors 106 connected in series (similar to the ones shown in FIG. 2) may be placed on the nasal support (nose pads) 710 of a pair of eyeglasses 700.

The surfaces to receive user input may be provided in different parts of the eyeglasses 700. For example, a smooth surface 702 (similar to 202) may be provided on the left side of the glasses 700, as shown in view 740. A rough surface 704 (similar to 204) may be placed in the frontal part of the glasses 700, as shown in view 742. A surface with a pattern 706 (similar to 206) may be disposed on the right side of the glasses 700, as shown in view 740. As indicated by an arrow 720 in view 746, a user may perform a user input in the form of a drag over the surface 706. In addition to the gestures described above, a tap over a bridge 712 above the nose pads 710 of the eyeglasses 700 may be used, as shown in view 744.

In embodiments, a controller device (e.g., 110 of FIG. 1, not shown in FIG. 7) may also be mounted on the eyeglasses 700, e.g., on the sides or the front part. The controller device 110 may be communicatively coupled with the sensors 106 to enable signal transmission from the sensors 106 to the controller device 110. Understandably, the controller device 110 may be mounted in other suitable areas of a respective wearable device, depending on the wearable device configuration.

In general, the apparatus 100 may comprise any wearable device, such as a headset, a helmet, a diadem, a cap, a hat, an arm band, a knee band, or other types of headwear, body wear, wrist wear, or arm wear. The techniques may also be applied to stationary devices such as desktops, laptops, smart TVs, smart homeware and kitchenware, cars, etc. The sensors may be disposed in different areas of the wearable device so as to provide contact with the body of the device, in order to sense vibration produced by different types of user input on one or more surfaces disposed on the device.

FIG. 8 is a block diagram illustrating some aspects of sensor signal processing by the apparatus of FIG. 1, in accordance with some embodiments. For ease of understanding, like components of FIGS. 1, 7, and 8 are indicated by like numerals.

More specifically, the block diagram 800 illustrates the components and operation of the signal processing block 130 briefly described in reference to FIG. 1. As described in reference to FIG. 1, the apparatus 100 may include one or more sensors 106 disposed on the body of the apparatus 100. The sensor signal generated in response to vibration produced by the body of the apparatus 100 (e.g., eyeglasses 700) may be provided, via the amplifier 108, to the controller device 110.

In embodiments, the signal processing block 130 may be implemented as a recognition engine, which may include an activity (user input) detection block (AD) 802, a feature extraction block 804, and a user input classifier 806.

The signal from the piezoelectric sensors 106 may be amplified by the amplifier 108. Subsequently, the AD 802 may be used to segment the signal into signal portions with possible control-related user input. The AD 802 may be amplitude-based. In other words, block 802 may detect user input based on the amplitude of the vibration signal (e.g., 302, 402, 502, or 602, also indicated by numeral 810 in FIG. 8). For example, if for a given period of time the average signal amplitude is above a threshold, it may be determined that a user input of some type has occurred.

For example, if for a given epoch (a reference time window of sampled data points, e.g., 1024 data points, or 64 ms of signal) the average amplitude is above a threshold Th = 2×10⁻⁴, the AD 802 may begin collecting data until the average of the incoming epochs drops below Th. Accordingly, the segment, e.g., the portion of the signal (frame) within the identified time period during which the user input occurred, may be identified.
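A minimal sketch of such a detector follows, assuming that "average amplitude" means the mean absolute sample value over each 1024-sample epoch (the text does not state the exact averaging); the function and variable names are illustrative.

```python
# Sketch of the amplitude-based activity detector (AD 802), assuming
# "average amplitude" is the mean absolute value over each epoch.
import numpy as np

TH = 2e-4     # amplitude threshold from the text
EPOCH = 1024  # samples per epoch (64 ms at 16 kHz)

def detect_segments(signal: np.ndarray):
    """Return (start, end) sample indices of candidate user-input
    segments whose epoch-average amplitude exceeds TH."""
    segments, seg_start = [], None
    for pos in range(0, len(signal) - EPOCH + 1, EPOCH):
        avg = np.mean(np.abs(signal[pos:pos + EPOCH]))
        if avg > TH and seg_start is None:
            seg_start = pos                    # possible user input begins
        elif avg <= TH and seg_start is not None:
            segments.append((seg_start, pos))  # user input appears to end
            seg_start = None
    if seg_start is not None:                  # input still active at the end
        segments.append((seg_start, len(signal)))
    return segments
```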

The segmented parts 812 of the signal 810 may serve as the inputs to the feature extraction block 804. For each captured candidate segment, the feature extraction block 804 may extract features indicative of the vibration pattern. As discussed above, the features may comprise mel-frequency cepstral coefficients (MFCC), spectral representations of which are shown in graphs 306, 406, 506, and 606. Multiple MFCC may be calculated for the signal segment. For example, 12 MFCC may be calculated over 20 ms window frames, together with their derivatives, e.g., deltas and double deltas, resulting in a feature vector of MFCC-derived dimensionality (12 MFCC plus 12 deltas plus 12 double deltas, i.e., 36 values) for each frame. While MFCC may be chosen as features in some embodiments because they may represent more efficient features for classification of user input, any other features indicative of the signal pattern may also be used.
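For illustration, the feature extraction might be sketched with librosa as below; the non-overlapping 20 ms framing and the delta window width are assumptions, since the text does not fix them.

```python
# Sketch of MFCC-plus-delta feature extraction for a detected segment,
# assuming 16 kHz input and 20 ms non-overlapping frames (illustrative).
import numpy as np
import librosa

def extract_features(segment: np.ndarray, fs: int = 16000) -> np.ndarray:
    """Compute 12 MFCC per 20 ms frame, plus deltas and double deltas,
    yielding a 36-dimensional feature vector per frame."""
    frame = int(0.020 * fs)  # 320 samples per 20 ms frame
    mfcc = librosa.feature.mfcc(y=segment, sr=fs, n_mfcc=12,
                                n_fft=frame, hop_length=frame)
    d1 = librosa.feature.delta(mfcc, width=3)           # deltas
    d2 = librosa.feature.delta(mfcc, width=3, order=2)  # double deltas
    return np.vstack([mfcc, d1, d2])  # shape: (36, n_frames)
```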

The MFCC features may be used as inputs for the user input classifier 806. The MFCC may represent the variable features to be used by a classifier methodology to discriminate among different vibration patterns. Accordingly, the user input classifier 806 may comprise a machine learning technique, such as a feed-forward neural network 816, that may use the MFCC features 814 as inputs and may be trained to recognize gestures based on the inputted vibration pattern features. The gestures may correspond to user commands to control, operate, or interact with the apparatus 100.

In one example, the MFCC may be normalized to 36 window frames in time. The resulting topology of the neural network 816 implemented for prototyping may include 36 x 36 input neurons (1296 in total), 100 hidden neurons, and four output neurons (one for each recognized user input type).
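In PyTorch, the described prototype topology might be sketched as follows; the activation functions are assumptions, as the text does not specify them.

```python
# Sketch of the prototype classifier: 36 x 36 = 1296 inputs, 100 hidden
# neurons, four outputs (one per user input type); activations assumed.
import torch.nn as nn

classifier = nn.Sequential(
    nn.Flatten(),             # (36 features x 36 frames) -> 1296 inputs
    nn.Linear(36 * 36, 100),  # hidden layer of 100 neurons
    nn.Sigmoid(),             # assumed hidden activation
    nn.Linear(100, 4),        # scores for the four gesture classes
)
```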

A training set may be collected for the user input described in reference to FIG. 2, e.g., with 10 repetitions each. The training results may show a high precision performance of the neural network 816, up to 97.5%. Validation of the resulting neural network model may be performed using a real time routine, where each one of the four gestures may be repeated 20 times; the overall average precision performance across all repetitions and all gestures may reach 90%.

In embodiments, the controller device 110 with the user input determination described above may be implemented over a field-programmable gate array (FPGA) emulation platform designed for wearable device applications. In this platform, the neural network 816 may be implemented as a shifted neural network, which may consume substantially less power (e.g., 1 mW) compared to other neural network implementations. The shifted neural network may be trained in this emulation, with a configured topology of 36 x 36 inputs (MFCC features), 16 hidden neurons, and two output neurons (the two outputs may represent two gestures). Precision performance for user input recognition during training and in real time may be observed to be comparable and consistent with that obtained as described above, and performance results in classifying MFCC for keyword recognition may prove the technique's feasibility for use in real-life, real-time applications.
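The text does not detail the shifted neural network; one common reading is a network whose weights are quantized to signed powers of two, so that each multiplication reduces to a bit shift in fixed-point hardware. A sketch under that assumption:

```python
# One possible interpretation of a "shifted" network: power-of-two
# weight quantization so multiplies become shifts; purely an assumption.
import numpy as np

def quantize_pow2(w: np.ndarray) -> np.ndarray:
    """Replace each weight with the nearest signed power of two."""
    sign = np.sign(w)
    exponent = np.round(np.log2(np.abs(w) + 1e-12))  # avoid log2(0)
    return sign * 2.0 ** exponent

def shifted_dense(x: np.ndarray, w: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Dense layer with power-of-two weights and ReLU activation; in
    hardware, each multiply by 2**k can be implemented as a k-bit shift."""
    return np.maximum(x @ quantize_pow2(w) + b, 0.0)
```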

FIG. 9 is an example process flow diagram for determining user commands to an apparatus, based on vibration patterns, in accordance with some embodiments. The process 900 may comport with some of the apparatus embodiments described in reference to FIGS. 1-8. The process 900 may be performed by the controller device 110, and more specifically, by the signal processing block 130 of the apparatus 100 of FIGS. 1 and 8. Accordingly, the process 900 is described with continuous reference to FIGS. 1 and 8. In alternate embodiments, the process 900 may be practiced with more or fewer operations, or a different order of the operations.

The process 900 may begin at block 902 and include capturing sensor signals from sensors 106, amplified and conditioned by the amplifier 108. The signals may be generated by the sensors 106 in response to user input and corresponding vibration propagated through the body 102 of the apparatus 100.

At block 904, the process 900 may include buffering incoming sampled data points corresponding to an epoch of the received signal and computing the average amplitude A_j of the given signal epoch (e.g., a time window containing 1024 sampled data points) from the buffer. For example, for each 1024 sampled data points, the average amplitude may be computed as A_j = (Σ a_i)/1024, where a_i is the amplitude of the signal at sampling time i, i = 1, ..., 1024 indexes the sampled data points within the epoch, and j = 1, ..., n indexes the epochs (e.g., epoch 1 containing the first 1024 sampled data points, epoch 2 containing the subsequent 1024 sampled data points, and so on).

At decision block 906, the process 900 may include comparing the average signal amplitude A_j for the buffered data points with a predetermined threshold Th, to detect whether user input occurred during the time period corresponding to the buffered data points. If the average is below the threshold, the process 900 may return to block 904.

If the average amplitude is above the threshold, it may be inferred that the portion of the signal corresponding to the average signal amplitude above the threshold may be indicative of the user input. In other words, the user input may have occurred during the time period in which the average amplitude of the captured signal remained above the threshold. Accordingly, the process 900 may move to block 908, where a segment (portion of signal) corresponding to the above-threshold average amplitude of sensor signal may be captured and stored.

At decision block 910, the process may include comparing the average amplitude A_j+1 of the signal for the next portion of the buffered data points (e.g., another 1024 points) with the predetermined threshold Th. If the average signal amplitude is above the threshold, the process may move to block 908, where the next portion (segment) of the signal corresponding to the detected user input may be captured. If the average signal amplitude is below the threshold, the process may move to block 912. The actions described in reference to blocks 904-910 may be performed by the AD 802 of the signal processing block 130 of FIG. 8.

At block 912, the features indicative of the vibration pattern corresponding to the portion of the signal captured at block 908 may be extracted from the captured signal portion. For example, MFCC 814 may be computed, based on the captured signal segment. The actions described in reference to block 912 may be performed by the feature extraction block 804 of the signal processing block 130 of FIG. 8.

At block 914, a command corresponding to the vibration pattern may be identified, based at least in part on the features indicative of the vibration pattern. For example, the neural network 816 may perform gesture classification based on the inputted MFCC features, as described in reference to FIG. 8, thereby identifying a command to control, operate, or interact with the apparatus.

At block 916, the results of the gesture classification may be reported (and used to generate the command to interact, operate or control an apparatus), and thereafter, the process 900 may move to block 904. The actions described in reference to blocks 914-916 may be performed by the classifier 806 (e.g., neural network 816) of the signal processing block 130 of FIG. 8.
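Tying the blocks of process 900 together, a high-level loop might look like the sketch below, which reuses the hypothetical helpers sketched earlier (detect_segments, extract_features, classifier); padding or truncating to 36 frames stands in for the normalization step, which the text does not detail.

```python
# End-to-end sketch of process 900 using the illustrative helpers above.
import numpy as np
import torch

def process_sensor_signal(signal: np.ndarray):
    """Segment the signal (blocks 904-910), extract features (block 912),
    classify each segment (block 914), and report results (block 916)."""
    results = []
    for start, end in detect_segments(signal):
        feats = extract_features(signal[start:end])  # (36, n_frames)
        if feats.shape[1] < 36:                      # normalize to 36 frames
            feats = np.pad(feats, ((0, 0), (0, 36 - feats.shape[1])))
        x = torch.as_tensor(feats[:, :36], dtype=torch.float32).unsqueeze(0)
        gesture = int(classifier(x).argmax())        # predicted class index
        results.append((start, end, gesture))
    return results
```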

FIG. 10 is an example process flow diagram for manufacturing an apparatus equipped with the technology for determination of user commands based on vibration patterns, in accordance with some embodiments. The process 1000 may describe manufacturing of the apparatus that comports to the description provided in reference to FIGS. 1, 7, and 8.

Accordingly, the process is described with continuous reference to FIG. 1.

The process 1000 may begin at block 1002 and include providing at least one surface for user input (e.g., 104) on a body (102) of an apparatus (100), to receive a user input to control the apparatus. Providing the surface for user input may include forming the surface to include one of: a smooth surface, a rough surface, a surface with a surface pattern, or a combination thereof.

The surface pattern may include one or more surface elements disposed in a pattern on the body of the apparatus.

At block 1004, the process 1000 may include disposing at least one sensor (e.g., 106) to be in contact with the body of the apparatus, to detect vibration generated by the surface of the body of the apparatus. Disposing the sensor may include placing the sensor on or inside the body of the apparatus. In embodiments, the sensor may comprise a piezoelectric transducer responsive to vibration.

At block 1006, the process 1000 may include disposing a controller (e.g., 110) on or inside the body of the apparatus.

At block 1008, the process 1000 may include electrically coupling the sensor with the controller, to process the signal indicative of vibration, to identify a pattern of vibration, and determine a command that corresponds to the vibration pattern, based at least in part on a result of the processing of the signal indicative of vibration.

Electrically coupling the sensor with the controller may include electrically coupling the sensor with an amplifier, and electrically coupling the amplifier with the controller. The amplifier may be configured to amplify and condition the signal indicative of vibration provided by the sensor, and to provide the amplified and conditioned signal to the controller for processing.

The following paragraphs describe examples of various embodiments.

Example 1 may be an apparatus for determination of user commands based on vibration patterns, comprising: a body of the apparatus, wherein the body includes at least one surface to receive one or more user inputs to convey a command to the apparatus; at least one sensor disposed to be in contact with the body of the apparatus to detect vibration manifested by the surface in response to the one or more user inputs, and generate a signal indicative of the detected vibration; and a controller coupled with the at least one sensor, to process the vibration-indicative signal, to identify a pattern of vibration, and determine the command based at least in part on the vibration pattern, wherein the command is to interact, operate or control the apparatus.

Example 2 may include the subject matter of Example 1, wherein the controller may further generate the command to interact, operate or control the apparatus, based at least in part on the vibration pattern.

Example 3 may include the subject matter of Example 1, wherein the one or more user inputs may include at least one of: one or more taps, one or more drags, or a combination of the one or more taps and the one or more drags.

Example 4 may include the subject matter of Example 1, wherein the at least one surface may include one of: a smooth surface, a rough surface, a surface with a surface pattern, or a combination thereof, wherein the surface pattern may include one or more surface elements disposed in a pattern on the body of the apparatus.

Example 5 may include the subject matter of Example 1, wherein the at least one sensor may comprise a piezoelectric transducer responsive to vibration.

Example 6 may include the subject matter of Example 1, wherein the controller to identify a vibration pattern includes to: determine that a portion of the signal provided by the at least one sensor is indicative of the user input; extract features indicative of the vibration pattern from the portion of the signal; and identify the vibration pattern, based at least in part on the features.

Example 7 may include the subject matter of Example 6, wherein the features comprise mel-frequency cepstral coefficients (MFCC), wherein to extract features includes to calculate multiple MFCC for the portion of the signal.

Example 8 may include the subject matter of Example 6, wherein to determine the command may include to input the features into a neural network that is trained to recognize the one or more user inputs to the at least one surface, based on the inputted features, wherein the recognized one or more user inputs to the at least one surface indicates the command to interact, operate, or control the apparatus.

Example 9 may include the subject matter of Example 1, further comprising an amplifier coupled with the at least one sensor, to amplify and condition the signal indicative of vibration, and to provide the amplified and conditioned signal to the controller for processing.

Example 10 may include the subject matter of any of Examples 1 to 9, wherein the apparatus includes one of: a portable or a wearable device.

Example 11 may include the subject matter of Example 10, wherein the apparatus comprises eyewear, wherein the at least one surface is disposed on one of: a bridge of a frame of the eyewear, a frontal portion of the eyewear frame, a right side of the eyewear frame, or a left side of the eyewear frame.

Example 12 may be a method for fabrication of an apparatus for determination of user commands based on vibration patterns, comprising: providing at least one surface on a body of an apparatus, wherein the surface is to receive one or more user inputs to convey a command to the apparatus; disposing at least one sensor to be in contact with the body of an apparatus, to detect vibration manifested by the surface in response to the one or more user inputs, and generate a signal indicative of vibration; disposing a controller on or inside the body of the apparatus; and electrically coupling the at least one sensor with the controller, to process the signal indicative of vibration, to identify a pattern of vibration, and determine the command that corresponds to the vibration pattern, wherein the command is to interact, operate or control the apparatus.

Example 13 may include the subject matter of Example 12, wherein disposing at least one sensor to be in contact with the body of an apparatus may include placing the at least one sensor on or inside the body of the apparatus.

Example 14 may include the subject matter of Example 12, wherein providing at least one surface on a body of an apparatus may include forming the at least one surface to include one of: a smooth surface, a rough surface, a surface with a surface pattern, or a combination thereof, wherein the surface pattern may include one or more surface elements disposed in a pattern on the body of the apparatus.

Example 15 may include the subject matter of Example 12, further comprising: providing the at least one sensor, wherein the sensor comprises a piezoelectric transducer responsive to vibration.

Example 16 may include the subject matter of any of Examples 12 to 15, wherein electrically coupling the at least one sensor with a controller may include: electrically coupling the at least one sensor with an amplifier, and electrically coupling the amplifier with the controller, to amplify and condition the signal indicative of vibration provided by the at least one sensor, and to provide the amplified and conditioned signal to the controller for processing.

Example 17 may be one or more non-transitory controller-readable media having instructions for determination of user commands based on vibration patterns stored thereon that cause a controller of an apparatus, in response to execution by the controller, to: receive, from at least one sensor disposed to be in contact with a body of the apparatus, a signal indicative of vibration, wherein the signal is generated in response to one or more user inputs to at least one surface of the body of the apparatus; and process the vibration-indicative signal, wherein to process includes to identify a pattern of vibration, and determine the command based at least in part on the vibration pattern, wherein the command is to interact, operate or control the apparatus.

Example 18 may include the subject matter of Example 17, wherein the instructions that cause the controller to process the vibration-indicative signal may further cause the controller to: determine that a portion of the signal provided by the at least one sensor is indicative of the user input; extract features indicative of the vibration pattern from the portion of the signal; and identify the vibration pattern, based at least in part on the features.

Example 19 may include the subject matter of Example 18, wherein the instructions that cause the controller to identify the command may further cause the controller to input the features into a neural network that is trained to recognize the one or more user inputs to the at least one surface, based on the inputted features, wherein the recognized one or more user inputs to the at least one surface indicates the command to interact, operate, or control the apparatus.

Example 20 may include the subject matter of any of Examples 17 to 19, wherein the instructions further cause the controller to generate the command that corresponds to the vibration pattern.

Example 21 may be an apparatus for determination of user commands based on vibration patterns, comprising: means for receiving, from at least one sensor disposed to be in contact with a body of the apparatus, a signal indicative of vibration, wherein the signal is generated in response to one or more user inputs to at least one surface of the body of the apparatus; and means for processing the vibration-indicative signal, including means for identifying a pattern of vibration, and determining the command based at least in part on the vibration pattern, wherein the command is to interact, operate or control the apparatus.

Example 22 may include the subject matter of Example 21, wherein the means for processing the vibration-indicative signal may include: means for determining that a portion of the signal provided by the at least one sensor is indicative of the user input; means for extracting features indicative of the vibration pattern from the portion of the signal; and means for identifying the vibration pattern, based at least in part on the features.

Example 23 may include the subject matter of Example 22, wherein the means for identifying the command may include means for inputting the features into a neural network that is trained to recognize the one or more user inputs to the at least one surface, based on the inputted features, wherein the recognized one or more user inputs to the at least one surface indicates the command to interact, operate, or control the apparatus.

Example 24 may include the subject matter of any of Examples 21 to 23, wherein the apparatus further comprises means for generating the command that corresponds to the vibration pattern.

Various operations are described as multiple discrete operations in turn, in a manner that is most helpful in understanding the claimed subject matter. However, the order of description should not be construed as to imply that these operations are necessarily order dependent.

Embodiments of the present disclosure may be implemented into a system using any suitable hardware and/or software, configured as desired.

Although certain embodiments have been illustrated and described herein for purposes of description, a wide variety of alternate and/or equivalent embodiments or implementations calculated to achieve the same purposes may be substituted for the embodiments shown and described without departing from the scope of the present disclosure. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifestly intended that embodiments described herein be limited only by the claims and the equivalents thereof.