

Title:
METHOD AND APPARATUS FOR RECOGNIZING A GESTURE
Document Type and Number:
WIPO Patent Application WO/2021/219574
Kind Code:
A1
Abstract:
A gesture may be accomplished by a user near a two-port orthogonally polarized antenna. For example, a source signal sweeping across a range of frequencies may feed a first port. A second port may be connected to a receiver, configured to obtain (e.g., measure) variations of a relative characteristic (e.g., any of a relative phase and a relative magnitude) of the captured signal as a function of time and frequency. When a gesture is accomplished by a body part in the near-field zone of the antenna, the part of the near-field power scattered by the body part may be converted to the opposite polarization and captured by the second port. A signature representative of time variations of the characteristic of the received signal at the second port, relative to the source signal over the swept frequencies, during the gesture duration, may be matched to a reference signature for recognizing the gesture.

Inventors:
LOUZIR ALI (FR)
HASKOU ABDULLAH (FR)
PESIN ANTHONY (FR)
Application Number:
PCT/EP2021/060875
Publication Date:
November 04, 2021
Filing Date:
April 26, 2021
Assignee:
INTERDIGITAL CE PATENT HOLDINGS (FR)
International Classes:
H04B5/00; G01S13/00; G01S13/88; G06F3/01
Foreign References:
US20190346928A1 (2019-11-14)
US20180120420A1 (2018-05-03)
US20180196501A1 (2018-07-12)
Attorney, Agent or Firm:
INTERDIGITAL (FR)
Claims:
CLAIMS

1. A method for recognizing a gesture, the method comprising:

- feeding a first port of a radiating element with a first signal sweeping across a range of frequencies;

- receiving a second signal from a second port of the radiating element, in the presence of the gesture in a near field zone of the radiating element;

- obtaining a signature of the gesture, representative of variations of a characteristic of the second signal relative to the first signal as a function of time and frequency; and

- recognizing the gesture based on a matching of the signature to a reference signature.

2. The method according to claim 1, wherein the first port of the radiating element is orthogonally polarized with the second port.

3. The method according to any of claims 1 to 2, wherein the characteristic is a phase or a magnitude.

4. The method according to any of claims 1 to 3, wherein the signature comprises a set of values of the characteristic of the second signal relative to the first signal varying according to time and frequency.

5. The method according to any of claims 1 to 4, further comprising obtaining an initial second signal relative to the first signal from the second port of the radiating element in the absence of any gesture in the near field zone of the radiating element, the first port of the radiating element being fed by the first signal.

6. The method according to claim 5, further comprising obtaining initial variations of the characteristic of the initial second signal relative to the first signal as a function of time and frequency.

7. The method according to claim 6, wherein the signature is based on a difference between the initial variations of the characteristic and the variations of the characteristic.

8. The method according to any of claims 1 to 7, wherein the matching is based on a correlation or a mean square error between the signature and the reference signature.

9. The method according to any of claims 1 to 8, wherein the reference signature is obtained by a preliminary training with a determined gesture.

10. The method according to any of claims 1 to 8, wherein the reference signature is preconfigured.

11. The method according to any of claims 1 to 10, wherein the reference signature belongs to a set of reference signatures corresponding to respective gestures, the gesture being recognized among the respective gestures by matching the signature to the set of reference signatures.

12. The method according to any of claims 1 to 11, wherein the recognized gesture triggers an action in a user interface.

13. The method according to any of claims 1 to 11, wherein the gesture is any of a hand gesture and a finger gesture.

14. An apparatus for recognizing a gesture, the apparatus comprising:

- a radiating element comprising a first port configured to be fed by a first signal sweeping across a range of frequencies and a second port;

- a receiver configured to obtain a second signal from the second port, in the presence of a gesture in a near field zone of the radiating element; and

- a processor configured to obtain a signature of the gesture representative of variations of a characteristic of the second signal relative to the first signal as a function of time and frequency and to recognize the gesture based on a matching of the signature to a reference signature.

15. A computer program product for recognizing a gesture, the computer program product comprising program code instructions executable by a processor for:

- obtaining a signature of the gesture representative of variations of a characteristic of a second signal relative to a first signal as a function of time and frequency, wherein the second signal is obtained in the presence of the gesture in a near field zone of a radiating element, the second signal being obtained from a second port of the radiating element, a first port of the radiating element being fed by the first signal sweeping across a range of frequencies; and

- recognizing the gesture based on a matching of the signature to a reference signature.

Description:
METHOD AND APPARATUS FOR RECOGNIZING A GESTURE

1. TECHNICAL FIELD

The present disclosure relates to the domain of gesture sensing and recognition.

2. BACKGROUND ART

Gesture-based interfaces may allow users to intuitively control devices with, for example, motions of parts of the body. Applications using gesture recognition may be based on computer vision image processing techniques and may rely on cameras for capturing images of a gesture to be recognized. Some applications may benefit from gesture recognition without having access to a camera and/or the processing resources for processing captured images. The present disclosure has been designed with the foregoing in mind.

3. SUMMARY

According to embodiments, a gesture may be accomplished by a user near an antenna. For example, the antenna may be a two-port orthogonally polarized antenna. For example, a source signal sweeping across a range of frequencies may feed a first port. A second port may be connected to a receiver, configured to obtain (e.g., measure) variations of a relative characteristic (e.g., any of a relative phase and a relative magnitude) of the captured signal as a function of time and frequency. When a gesture is accomplished by a body part in the near-field zone of the antenna, the part of the near-field power scattered by the body part may be converted to the opposite polarization and captured by the second port. A signature representative of time variations of the characteristic of the received signal at the second port, relative to the source signal over the swept frequency band, during the gesture duration, may be matched to a reference signature for recognizing the gesture.

4. BRIEF DESCRIPTION OF THE DRAWINGS

- Figure 1 is a diagram illustrating an example of a system for recognizing a gesture based on a radiating element;

- Figure 2A is a diagram illustrating an example of heatmaps obtained in the presence of a gesture near the radiating element;

- Figure 2B is a diagram illustrating an example of initial heatmaps obtained in the absence of any gesture near the radiating element;

- Figure 2C is a diagram illustrating another example of heatmaps obtained in the presence of a gesture near the radiating element;

- Figure 3 is a diagram illustrating an example of a module for recognizing a gesture based on a radiating element;

- Figure 4A is a diagram illustrating an example of a radiating element configured to recognize a gesture;

- Figure 4B is a diagram illustrating an example of a processing device for recognizing a gesture;

- Figure 4C is a diagram representing an exemplary architecture of a processing device of any of Figures 3 and 4B;

- Figure 5 is a diagram representing a set of exemplary gestures;

- Figure 6 is a diagram illustrating an example of phase heatmaps obtained in the presence of a repetition of a same gesture near the radiating element;

- Figure 7 is a diagram illustrating an example of magnitude heatmaps obtained in the presence of a repetition of a same gesture near the radiating element;

- Figure 8 is a diagram illustrating an example of phase heatmaps obtained in the presence of the exemplary gestures of Figure 5 near the radiating element;

- Figure 9 is a diagram illustrating an example of magnitude heatmaps obtained in the presence of the exemplary gestures of Figure 5 near the radiating element;

- Figure 10 is a diagram illustrating an example of two sets of reference signatures of the exemplary gestures of Figure 5;

- Figure 11A is a table illustrating an example of a confusion matrix obtained when recognizing a gesture based on phase heatmaps according to an embodiment;

- Figure 11B is a table illustrating an example of a confusion matrix obtained when recognizing a gesture based on magnitude heatmaps according to an embodiment;

- Figure 12 is a diagram illustrating an example of a method for recognizing a gesture.

It should be understood that the drawing(s) are for purposes of illustrating the concepts of the disclosure and are not necessarily the only possible configuration for illustrating the disclosure.

5. DESCRIPTION OF EMBODIMENTS

It should be understood that the elements shown in the figures may be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces. Herein, the term "interconnected" is defined to mean directly connected to or indirectly connected with through one or more intermediate components. Such intermediate components may include both hardware and software-based components. The term "interconnected" is not limited to a wired interconnection and also includes wireless interconnection.

All examples and conditional language recited herein are intended for educational purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions.

Moreover, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, read only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage.

Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.

In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function. The disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.

It is to be appreciated that the use of any of the following "/", "and/or", and "at least one of", for example, in the cases of "A/B", "A and/or B" and "at least one of A and B", is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of both options (A and B). As a further example, in the cases of "A, B, and/or C" and "at least one of A, B, and C", such phrasing is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of the third listed option (C) only, or the selection of the first and the second listed options (A and B) only, or the selection of the first and third listed options (A and C) only, or the selection of the second and third listed options (B and C) only, or the selection of all three options (A and B and C). This may be extended, as is clear to one of ordinary skill in this and related arts, for as many items as are listed.

Embodiments described herein may be related to a method and any of a module and an apparatus for radio frequency (RF) gesture sensing and recognition. The module may be any piece of hardware that is, for example, separable from and connectable to (e.g., a main printed circuit board of) an apparatus. The module may be, for example, cost-effective and of a small (e.g., reduced) size. The module may be of a size allowing it to be embedded in any consumer electronics (CE) device.

Embodiments described herein may allow, for example, hands-free control of any of mobile and wearable devices. Embodiments described herein may allow enhancing (e.g., improving the intuitiveness of) any user interface of any CE device with gesture sensing and recognition. For example, a user interface may allow triggering any kind of action based on a recognized gesture. More generally, embodiments described herein may be applicable to any of user posture, activity recognition and game control.

Embodiments described herein are related to gesture recognition. A gesture may be seen as a movement of, for example, a part of a body of a human. Embodiments are described herein using examples of hand and finger gestures accomplished in proximity of a radiating element. More generally, embodiments described herein may be applicable to any kind of gestures (e.g., accomplished by any of a head, an arm, a wrist, an elbow, and any object held by a user) in the near field zone of an RF antenna.

Figure 1 is a diagram illustrating an example of a system for recognizing a gesture 13 based on (e.g., in proximity of) a radiating element 10. According to embodiments, a radiating element (e.g., an antenna) may comprise a first port 11 and a second port 12, for example, orthogonally polarized with the first port 11. The radiating element 10 may be a single radiating element with two ports 11, 12 corresponding to two orthogonal polarizations. In another example, the first port 11 may not be orthogonally polarized with the second port 12. Any kind of antenna comprising at least two ports 11, 12, that may be any of orthogonally polarized and non-orthogonally polarized, may be applicable to embodiments described herein. According to embodiments, the radiating element 10 may be fed through the first port 11 by a signal 111, for example, sweeping across a range of frequencies (e.g., around a central frequency). The first signal 111 may be generated, for example, by a frequency-swept signal source 14. According to embodiments, the second port 12 may be connected to a receiver 16 which may be configured to obtain (e.g., measure) the time variations of the relative phase and magnitude of the captured signal 112, as a function of frequency. For example, a second signal 112 may be received from the second port 12, in the presence of a gesture 13 (e.g., accomplished) in the near field zone of the radiating element 10. When a gesture (by any of a finger, a hand, and any other part of the body) is performed in the antenna's near-field zone, a part of the near-field power may be scattered by the body part and may be converted, e.g., to the orthogonal polarization. The scattered power may be captured by the second port 12 of the radiating element 10. Receiving the second (e.g., scattered) signal 112 from the second port 12, e.g., orthogonally polarized with the first port, may improve the isolation, from the first signal 111, of the scattered part, which may be representative of the gesture. According to embodiments, a processor 18 may be configured to obtain time variations of a characteristic (e.g., any of the magnitude and the phase) of the received second signal 112 relative to the first signal 111, over the swept frequency band, during, for example, the gesture duration. The obtained variations of the characteristic of the received second signal 112 relative to the first signal 111 as a function of time and frequency may be representative of that gesture and may be referred to herein as a signature of the gesture. According to embodiments, the gesture may be recognized 19, for example, based on a matching of the signature with a reference signature of the gesture.

According to embodiments, variations of a characteristic (e.g., any of a phase and a magnitude) of the second signal relative to the first signal may be represented as a set of values of the characteristic (e.g., any of a phase and a magnitude) of the second signal relative to the first signal varying according to time and frequency. According to embodiments, a set of values of the phase (respectively magnitude) of the second signal relative to the first signal varying according to both time and frequency may be referred to herein as a phase (respectively magnitude) heatmap. For example, a (e.g., phase, magnitude) heatmap may be represented as a two-dimensional matrix of values, where a value corresponds to the value of the signal characteristic (e.g., phase, magnitude) at a given time and frequency.
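As a minimal sketch of this representation (names hypothetical, assuming the per-sweep values are available as NumPy arrays), a heatmap may be built by stacking one row of characteristic values per frequency sweep:

```python
import numpy as np

def build_heatmap(per_sweep_values):
    """Stack per-sweep samples of a signal characteristic into a heatmap.

    `per_sweep_values` is assumed to be a sequence of 1D arrays, one per
    frequency sweep, each holding the characteristic (phase or magnitude)
    at every frequency point of that sweep.
    """
    # Rows: time (one row per sweep); columns: frequency points.
    return np.stack(per_sweep_values)
```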

Figure 2A is a diagram illustrating an example of a phase heatmap 21A and a magnitude heatmap 22A, obtained in the presence of a gesture near the radiating element. The illustrated heatmaps 21A, 22A correspond to an exemplary finger snapping gesture, accomplished in proximity of the radiating element. The phase heatmap 21A may comprise a set of phase values of the second signal relative to the first signal, arranged horizontally according to a varying frequency 200 of the signal and arranged vertically according to a varying time 210. Similarly, the magnitude heatmap 22A may comprise a set of magnitude values of the second signal relative to the first signal, arranged horizontally according to a varying frequency 200 of the signal and arranged vertically according to a varying time 210.

Figure 2B is a diagram illustrating an example of an initial phase heatmap 21B and an initial magnitude heatmap 22B, obtained in the absence of any gesture near the radiating element. In the absence of any object (e.g., and any gesture) in the antenna's near-field zone, the received signal's magnitude may be very low and may correspond to the cross-polarization level of the transmitting antenna. The initial heatmaps 21B, 22B may comprise a set of values of respectively the phase and the magnitude of the second signal relative to the first signal, arranged in the same way as in Figure 2A. The initial heatmaps 21B, 22B illustrate that there is no time variation of the phase and magnitude in the absence of any gesture in proximity of the radiating element.

Figure 2C is a diagram illustrating another example of heatmaps 21C, 22C obtained in the presence of a gesture near the radiating element. According to embodiments, a (e.g., filtered) heatmap 21C, 22C may be obtained by filtering the heatmap 21A, 22A based on the corresponding initial heatmap 21B, 22B. For example, a (e.g., filtered) phase map 21C may be obtained by subtracting the initial phase map 21B from the phase map 21A obtained in the presence of the gesture. Similarly, a (e.g., filtered) magnitude map 22C may be obtained by subtracting the initial magnitude map 22B from the magnitude map 22A obtained in the presence of the gesture. Filtering the heatmaps may focus them on the dynamic parts (e.g., the most valuable information) of the signal.
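A minimal sketch of this background subtraction, assuming heatmaps as same-shaped NumPy arrays; the phase wrapping step is an extra assumption, not stated above, kept here to hold phase differences on one consistent scale:

```python
import numpy as np

def filter_magnitude_heatmap(heatmap, initial_heatmap):
    # Keep only the dynamic part of the signal attributed to the gesture.
    return heatmap - initial_heatmap

def filter_phase_heatmap(phase_heatmap, initial_phase_heatmap):
    diff = phase_heatmap - initial_phase_heatmap
    # Assumption: wrap phase differences to [-pi, pi).
    return (diff + np.pi) % (2 * np.pi) - np.pi
```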

Figure 3 is a diagram illustrating an example of a module for recognizing a gesture based on a radiating element. According to embodiments, a frequency-swept signal source 34 may be connected to a first port 31 of a radiating element 30. According to embodiments, the signal source 34 may be configured to generate a first signal sweeping across a range of frequencies for feeding the first port 31. For example, the signal source 34 may include a voltage-controlled oscillator (VCO), controlled by a voltage having, for example, a sawtooth shape. For example, and as illustrated by Figure 3, repeatedly varying the control voltage from a Vmin value to a Vmax value may generate a first signal of a frequency repeatedly varying from a first value fmin to a second value fmax. According to embodiments, a frequency scanning period 300 may be determined, for example, to be smaller, by at least a scale factor, than a time scale variation of the gesture. The time scale variation may represent a typical variation of the duration of different occurrences of a same gesture. An exemplary value may be 50 ms. According to embodiments, the frequency range (fmin-fmax) and the scanning period may be determined in order to have a plurality of scanning periods (e.g., more than ten) during the gesture to be recognized.

According to embodiments, a receiver 36 may be connected to respectively the first port 31 and the second port 32 of the antenna 30, which may be, for example, a two-port orthogonally polarized antenna. The receiver may include, for example, an I/Q demodulator. According to embodiments, the receiver 36 may be configured to receive a second signal from the second port, in the presence of a gesture accomplished in the near field zone of the radiating element. According to embodiments, the receiver may be configured to obtain any of a phase and a magnitude of the second signal relative to the first signal as a function of time and frequency. Considering a complex representation of the signals (e.g., having both a real and an imaginary part), the second signal relative to the first signal may correspond to the second signal divided by the first signal. For example, the magnitude of the second signal relative to the first signal may represent the magnitude of the second signal divided by the magnitude of the first signal. In another example, the phase of the second signal relative to the first signal may represent the phase of the second signal minus the phase of the first signal. According to embodiments, the receiver may be a homodyne receiver (e.g., as represented in Figure 3).
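For instance, if the I/Q demodulator is referenced to the first signal, the pair (I, Q) may be read as the complex ratio of the second signal to the first signal. A hedged sketch of the two relative characteristics (names hypothetical):

```python
import numpy as np

def relative_characteristics(i_samples, q_samples):
    """Relative magnitude and phase from I/Q samples referenced to S1.

    Assumption: I + jQ represents S2 / S1 at each time/frequency point.
    """
    z = np.asarray(i_samples) + 1j * np.asarray(q_samples)  # S2 / S1
    rel_magnitude = np.abs(z)   # |S2| / |S1|
    rel_phase = np.angle(z)     # phase(S2) - phase(S1), in radians
    return rel_magnitude, rel_phase
```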

According to embodiments, the radiating element 30 may be fed at a first port 31 with a signal source sweeping from, e.g., 2.1 GHz to 2.5 GHz (e.g., corresponding to the radiating element frequency band), at a rate of, e.g., one sweep every five milliseconds. According to embodiments, any of a phase and a magnitude of the received signal at the second port 32 (e.g., relative to the first signal fed at the first port 31) may be obtained (e.g., measured) by the I/Q demodulator 36 and recorded as a function of frequency and time, over the recording duration (e.g., as a heatmap).

Embodiments described herein may use a frequency band (fmin-fmax) of 2.1–2.5 GHz and a scanning period of five milliseconds, with a record duration of five seconds. The record duration may correspond to the duration over which a heatmap may be obtained (e.g., measured). The record duration may, for example, correspond to the gesture duration plus some margin. Such a configuration may allow obtaining 1000 samples (e.g., of any of phase and magnitude) per frequency point. Any other frequency band, scanning period, and heatmap duration allowing a set of samples of a signal characteristic representative of a gesture to be obtained may be applicable to embodiments described herein.
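A quick arithmetic check of this exemplary configuration (values from the text above):

```python
# One sweep every 5 ms over a 5 s record yields 1000 sweeps, i.e., 1000
# samples of the characteristic per frequency point.
sweep_period_s = 5e-3
record_duration_s = 5.0
samples_per_frequency_point = int(record_duration_s / sweep_period_s)
assert samples_per_frequency_point == 1000
```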

In a variant, the receiver may be a heterodyne receiver (not represented). The received signals may be converted to a lower intermediate frequency (IF), while preserving the relative phase and magnitude of the second signal with regard to the first signal. The receiver may include a local oscillator (LO), which may be configured to operate at a different frequency from the VCO, synchronized to the VCO. The receiver may be configured to digitally process the IF signals based on, for example, an I/Q demodulator.

Figure 4A is a diagram illustrating an example of a radiating element 40 configured to recognize a gesture. A square patch antenna 40 is illustrated in Figure 4A. According to embodiments, the square patch antenna 40 may have total dimensions of 42 × 42 × 1.2 mm³. According to embodiments, the square patch antenna may be printed on a flame-resistant 4 (FR4) substrate and fed at two 50 Ohm matching points 41, 42 placed along axes at 90° angles, corresponding to orthogonal polarizations. According to embodiments, the square patch antenna 40 may be designed to operate at a central frequency of 2.275 GHz. Any smaller-size two-port patch antenna, for example, using a higher substrate permittivity may be applicable to the embodiments described herein. Any other (e.g., non-patch) two-port orthogonally polarized radiating element may be applicable to the embodiments described herein. In a variant, the radiating element may be any two side-by-side single-port orthogonal-polarization antennas. In yet another variant, the radiating element may be any antenna comprising at least two (e.g., non-orthogonally polarized) ports.

Figure 4B is a diagram illustrating an example of a processing device 4B for recognizing a gesture. According to embodiments, the processing device 4B may comprise a radiating element 40 (e.g., as illustrated in Figure 4A). According to embodiments, the radiating element 40 may comprise a first port configured to be fed by a first signal sweeping across a range of frequencies and a second port that may be, for example, orthogonally polarized with the first port. The first signal may be generated by a signal source 44 that may be internal or external to the processing device 4B. According to embodiments, the processing device 4B may comprise a receiver 43 configured to obtain a second signal from the second port, in the presence of a gesture in a near field zone of the radiating element 40. According to embodiments, the receiver may be coupled to a processing module 45, configured to obtain a signature of the gesture representative of variations of a characteristic (e.g., any of a phase and a magnitude) of the second signal relative to the first signal as a function of time and frequency. According to embodiments, the processing module 45 may be further configured to recognize the gesture based on a matching of the signature to a reference signature.

According to embodiments, the processing module 45 may be coupled to an optional interface module 47. According to embodiments, the interface module 47 may be a network interface module. According to embodiments, the network interface may be any of a wired and a wireless network interface, and any of a local and a wide area network interface. According to embodiments, the interface module 47 may be a user interface, running (e.g., and displayed) locally on the processing device 4B. According to embodiments, the user interface may be running on another device, communicating with the processing device 4B via the network interface 47. The user interface may allow the processing device 4B to interact with a user, for example, by associating a specific action with a specific gesture and by triggering the specific action based on the recognized gesture, as sketched below. Without limitation, specific actions may include any of unlocking a door, unlocking a device, lighting a lamp, ... In another example, the user interface may propose a set of options to a user (e.g., by displaying an image or playing audio describing the options), wherein an option may be associated with a specific gesture. Upon recognition of the gesture as one of the specific gestures, the user interface may be configured to validate (e.g., select) the option corresponding to the recognized gesture. According to embodiments, the processing device 4B may be a (e.g., basic) sensing device that may be any of separable from and couplable to another (e.g., more complex) processing device. Coupling/separation between both devices may be accomplished via any of a bus interface and a network interface, e.g., with connectors.
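The sketch below illustrates one simple way such a user interface could associate actions with gestures; the gesture labels and actions are hypothetical examples, not part of the disclosure:

```python
# Hypothetical gesture labels mapped to hypothetical actions.
ACTIONS = {
    "finger_snap": lambda: print("unlocking the door"),
    "swipe_left": lambda: print("selecting the previous option"),
    "swipe_right": lambda: print("selecting the next option"),
}

def on_gesture_recognized(label: str) -> None:
    """Trigger the action associated with a recognized gesture, if any."""
    action = ACTIONS.get(label)
    if action is not None:
        action()
```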

Figure 4C represents an exemplary architecture of any of the processing devices 4B described herein. The processing device 4B may comprise one or more processor(s) 410, which may be, for example, any of a CPU, a GPU, and a DSP (Digital Signal Processor), along with internal memory 420 (e.g., any of RAM, ROM, EPROM). The processing device 4B may comprise any number of Input/Output interface(s) 430 adapted to send output information and/or to allow a user to enter commands and/or data (e.g., any of a keyboard, a mouse, a touchpad, a webcam, a display), and/or to send/receive data over a network interface; and a power source 440 which may be external to the processing device 4B.

According to embodiments, the processing device 4B may further comprise a computer program stored in the memory 420. The computer program may comprise instructions which, when executed by the processing device 4B, in particular by the processor(s) 410, cause the processing device 4B to carry out the processing method described with reference to Figure 12. According to a variant, the computer program may be stored externally to the processing device 4B on a non-transitory digital data support, e.g., on an external storage medium such as any of an SD card, an HDD, a CD-ROM, a DVD, a read-only and/or DVD drive, and a DVD Read/Write drive, all known in the art. The processing device 4B may comprise an interface to read the computer program. Further, the processing device 4B may access any number of Universal Serial Bus (USB)-type storage devices (e.g., "memory sticks") through corresponding USB ports (not shown).

According to embodiments, the processing device 4B may be any of a server, a desktop computer, a laptop computer, a networking device, a TV set, a tablet, a smartphone, a phablet, a set-top box, an internet gateway, a game console, a head mounted device, an internet of things (IoT) device, a wearable device, a doorbell system, a locker, ...

Figure 5 is a diagram representing a set of exemplary gestures that may be accomplished by a user in the near field zone of the radiating element, in order to be recognized.

According to embodiments, a first gesture 501 may correspond to a right-hand finger snapping gesture. A finger snapping may be accomplished by, for example, building tension between the thumb and another (middle, index, or ring) finger and then moving the other finger forcefully downward so that it hits the palm of the same hand, e.g., at high speed.

According to embodiments, a second gesture 502 may correspond to a pull right gesture, where, for example, a virtual (e.g., clock) button may be pulled towards the right side of the radiating element.

According to embodiments, a third gesture 503 may correspond to a turn right gesture, where, for example, a virtual (e.g., clock) button may be turned between the right-hand thumb moving upwards and the index moving downwards.

According to embodiments, a fourth gesture 504 may correspond to a turn right twice gesture, where, for example, a virtual (e.g., clock) button may be turned twice between the right-hand thumb moving upwards and the index moving downwards.

According to embodiments, a fifth gesture 505 may correspond to a button clockwise gesture, where, for example, a straight index may turn clockwise perpendicularly to the radiating element.

According to embodiments, a sixth gesture 506 may correspond to a button counterclockwise gesture, where, for example, a straight index may turn counterclockwise perpendicularly to the radiating element.

According to embodiments, a seventh gesture 507 may correspond to a swipe left gesture, where, for example, a straight finger may swipe (e.g., parallel to a radiating element main plane) towards the left of the radiating element.

According to embodiments, an eighth gesture 508 may correspond to a swipe right gesture, where, for example, a straight finger may swipe (e.g., parallel to a radiating element main plane) towards the right of the radiating element.

According to embodiments, a ninth gesture 509 may correspond to a winding up gesture, where, for example, a virtual button, virtually emerging on the radiating element, may be grabbed between right hand thumb and forefinger, the thumb moving upwards and the forefinger moving downwards.

According to embodiments, a tenth gesture 510 may correspond to a winding down gesture, where, for example, a virtual button, virtually emerging on the radiating element, may be grabbed between right hand thumb and forefinger, the thumb moving downwards and the forefinger moving upwards.

According to embodiments, an eleventh gesture 511 may correspond to a zoom in gesture, where, for example, two fingers (e.g., thumb and forefinger) get closer to each other on a virtual touch pad.

According to embodiments, a twelfth gesture 512 may correspond to a zoom out gesture, where, for example, two fingers (e.g., thumb and forefinger) move away from each other on a virtual touch pad. The gestures listed hereabove do not limit the embodiments described herein. Any gesture that may be accomplished in a near field zone of the radiating element, by any part of a body, may be applicable to the embodiments described herein.

Figure 6 is a diagram illustrating an example of phase heatmaps, obtained in the presence of a repetition of a same gesture near the radiating element. According to embodiments, accomplishing five occurrences of the same pull right gesture 502 near the radiating element may allow obtaining a set of five distinct phase heatmaps 61A, corresponding to the same gesture 502. According to embodiments, five filtered phase heatmaps 61C may be obtained based on the five distinct phase heatmaps 61A and on an initial phase heatmap 21B (e.g., obtained in the absence of any gesture near the radiating element). For example, a (e.g., each) filtered heatmap may be obtained by subtracting the initial phase heatmap 21B from the corresponding phase heatmap of the set of five distinct phase heatmaps 61A. Any kind of filtering of a phase heatmap based on an initial heatmap may be applicable to the embodiments described herein.

Figure 7 is a diagram illustrating an example of magnitude heatmaps, obtained in the presence of a repetition of a same gesture near the radiating element. According to embodiments, accomplishing five occurrences of the same pull right gesture 502 near the radiating element may allow obtaining a set of five distinct magnitude heatmaps 72A, corresponding to the same gesture 502. According to embodiments, five filtered magnitude heatmaps 72C may be obtained based on the five distinct magnitude heatmaps 72A and on an initial magnitude heatmap 22B (e.g., obtained in the absence of any gesture near the radiating element). For example, a (e.g., each) filtered heatmap may be obtained by subtracting the initial magnitude heatmap 22B from the corresponding magnitude heatmap of the set of five distinct magnitude heatmaps 72A. Any kind of filtering of a magnitude heatmap based on an initial heatmap may be applicable to the embodiments described herein.

Figure 8 is a diagram illustrating an example of phase heatmaps 81A, 81C, obtained in the presence of the exemplary gestures of Figure 5 near the radiating element. Figure 8 illustrates a first set of phase heatmaps 81A, where each phase heatmap corresponds to a different gesture as illustrated in Figure 5. Figure 8 illustrates a second set of filtered phase heatmaps 81C (e.g., after initial phase heatmap subtraction), where each filtered phase heatmap corresponds to a different gesture as illustrated in Figure 5.

Figure 9 is a diagram illustrating an example of magnitude heatmaps 92A, 92C, obtained in the presence of the exemplary gestures of Figure 5 near the radiating element. Figure 9 illustrates a first set of magnitude heatmaps 92A, where each magnitude heatmap corresponds to a different gesture as illustrated in Figure 5. Figure 9 illustrates a second set of filtered magnitude heatmaps 92C (e.g., after initial magnitude heatmap subtraction), where each filtered magnitude heatmap corresponds to a different gesture as illustrated in Figure 5.

It may be noted from Figure 6 and Figure 7 that, apart from a minor time shift, heatmap samples corresponding to a same gesture are similar. It may also be noted from Figure 8 and Figure 9 that heatmaps corresponding to different gestures may appear to be sufficiently different to be discriminable.

According to embodiments, a signature of a gesture may be any of a filtered, unfiltered, phase and magnitude heatmap corresponding to a gesture. In a first example, the signature of a gesture may be a phase heatmap obtained in the presence of the gesture in the near field zone of the radiating element. In a second example, the signature of a gesture may be a magnitude heatmap obtained in the presence of the gesture in the near field zone of the radiating element. In a third example, the signature of a gesture may be a filtered phase heatmap obtained in the presence of the gesture in the near field zone of the radiating element. In a fourth example, the signature of a gesture may be a filtered magnitude heatmap obtained in the presence of the gesture in the near field zone of the radiating element. In a first variant, a filtered (e.g., phase, magnitude) heatmap may be obtained based on a difference between an initial (e.g., phase, magnitude) heatmap and the (e.g., phase, magnitude) heatmap. In a second variant, a filtered (e.g., phase, magnitude) heatmap may be obtained by, e.g., any of averaging and clipping a set of (e.g., phase, magnitude) heatmap samples obtained from occurrences of a same gesture near the radiating element. In yet another example, a signature of a gesture may be obtained based on a combination of phase and magnitude heatmaps corresponding to the same gesture.

According to embodiments, a signature of a gesture (e.g., accomplished in the near field zone of the radiating element) may be compared to a reference signature of the gesture, and the gesture may be recognized based on the matching between the signature and the reference signature.

According to embodiments, the reference signature of a gesture may be preconfigured. For example, a processing device configured to recognize a gesture may be preconfigured with the reference signature of that gesture. The reference signature may be preconfigured, e.g., when the processing device is manufactured. In another example, the reference signature of the gesture may be downloaded, for example from a web server.

According to embodiments, a reference signature of a (e.g., pre-defined) gesture may be obtained based on a preliminary training with the corresponding (e.g., pre-defined) gesture. For example, the same (e.g., pre-defined) gesture may be repeated in the near field zone of the radiating element, and a set of signatures of the gesture may be obtained. For example, a (e.g., each) signature of the set of signatures may be obtained in the presence of an occurrence of the gesture repetition in the near field zone of the radiating element. According to embodiments, the gesture may be repeated by a same user and/or different users. According to embodiments, the set of signatures may comprise signatures according to any variant and/or examples described above. According to embodiments, the reference signature may be obtained by processing the set of signatures. For example, the set of signatures (e.g., heatmaps) may be averaged. In another example, the set of signatures (e.g., heatmaps) may be clipped at a given value (e.g., before averaging). Any data processing method for obtaining a reference signature based on a set of signatures representative of a same gesture may be compatible with embodiments described herein.
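A minimal sketch of this training step, assuming the recorded signatures are same-shaped NumPy heatmaps; the clipping level is a hypothetical parameter:

```python
import numpy as np

def reference_signature(signatures, clip_value=None):
    """Average repeated signatures of one gesture into a reference.

    `signatures` is a sequence of same-shaped (time, frequency) heatmaps;
    `clip_value`, if given, saturates values before averaging.
    """
    stack = np.stack(signatures)
    if clip_value is not None:
        stack = np.clip(stack, -clip_value, clip_value)
    return stack.mean(axis=0)
```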

According to embodiments, the reference signature may belong to a set of reference signatures of a set of respective (e.g., pre-defined) gestures. The set of reference signatures may be any of preconfigured and preliminarily obtained via training.

Figure 10 is a diagram illustrating an example of two sets of reference signatures of the exemplary gestures of Figure 5. Figure 10 illustrates a first set 1001C of reference signatures based on filtered phase heatmaps, where each reference signature corresponds to a different gesture as illustrated in Figure 5. Figure 10 illustrates a second set 1002C of reference signatures based on filtered magnitude heatmaps, where each reference signature corresponds to a different gesture as illustrated in Figure 5.

According to embodiments, a signature representative of a gesture (e.g., accomplished in the near field zone of the radiating element) may be matched to a set of (e.g., reference) signatures. For example, the signature may be correlated with (e.g., each) reference signature, and the gesture may be recognized based on the correlations between the signature and the reference signatures. For example, the gesture corresponding to the highest correlated reference signature (e.g., for any of the phase and magnitude heatmap) may be recognized. According to embodiments, the gesture may be recognized based on a matching of the obtained signature to a (e.g., set of) reference signature(s). In a first variant, the matching may be based on a correlation between the signature and the (e.g., set of) reference signature(s). For example, a 2D correlation of an M-by-N matrix, X, and a P-by-Q matrix, H, may be a matrix, C, of size M+P-1 by N+Q-1, whose elements may be given by:

C(k, l) = \sum_{m=0}^{M-1} \sum_{n=0}^{N-1} X(m, n)\, \overline{H}(m - k, n - l), \qquad -(P-1) \le k \le M-1, \quad -(Q-1) \le l \le N-1,

where H is taken to be zero outside its P-by-Q support (for real-valued heatmaps, the conjugation may be ignored).
As may be seen from the equation above, the correlation may be tested for all the possible shifts (k,l), in both dimensions (i.e., frequency and time), between the two matrices (X,H). The maximum correlation value may be independent of the shift between the two matrices, in either dimension; only its location may change. For example, referring to Figure 6, calculating the correlation between each filtered heatmap of the set of filtered heatmaps 61C and the left-hand side filtered heatmap provides the following set of results:

Correlation (1:1): X=0, Y=0, Z=1
Correlation (1:2): X=0, Y=0.3, Z=0.78
Correlation (1:3): X=0, Y=0.4, Z=0.77
Correlation (1:4): X=0, Y=1.15, Z=0.62
Correlation (1:5): X=0, Y=0.5, Z=0.75

From the above results, it may be noted that a maximum auto-correlation value of 1 (equivalent to 100%) may be achieved for zero shift in both dimensions. Since no frequency shift may be observed between the different elements, the maximum correlation may always be achieved for X=0. Since the second element is shifted in time by +0.3 seconds compared to the first one, the maximum correlation value of 0.78 (78%) may be achieved for Y=-0.3 seconds. Since the fourth element is shifted in time by -1.15 seconds compared to the first one, the maximum correlation of 0.62 (62%) may be achieved for Y=1.15 seconds. Finally, since the fourth heatmap in the set of heatmaps 61C is not complete (the event began before the data recording started), this element may present the lowest maximum correlation value of 0.62 (62%).
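A sketch reproducing this kind of (X, Y, Z) result with SciPy's full 2D cross-correlation; normalizing by the global Frobenius norms (one simple choice, assumed here) makes a perfect zero-shift match score 1:

```python
import numpy as np
from scipy.signal import correlate2d

def correlation_match(signature, reference):
    """Return (Z, Y, X): peak normalized correlation and its time/frequency
    shift, for heatmaps with time as rows and frequency as columns."""
    c = correlate2d(signature, reference, mode="full")  # size M+P-1 by N+Q-1
    c = c / (np.linalg.norm(signature) * np.linalg.norm(reference))
    peak = np.unravel_index(np.argmax(c), c.shape)
    time_shift = peak[0] - (reference.shape[0] - 1)  # Y, in sweep periods
    freq_shift = peak[1] - (reference.shape[1] - 1)  # X, in frequency steps
    return float(c.max()), time_shift, freq_shift
```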

According to embodiments, the signature may be matched to the (e.g., set of) reference signature(s) based on a correlation between the signature and the (e.g., set of) reference signature(s). For example, the signature and a reference signature may match on condition that the correlation is above a given value. In another example, the signature and a reference signature may match on condition that the correlation with the reference signature is higher than the other correlations with the other reference signatures of the set of reference signatures.

In a second variant, the matching may be based on a mean square error between the signature and the (e.g., set of) reference signature(s). For example, a mean square error between two 2D (M-by-N) matrices may be obtained by vectorizing each matrix into a one-dimensional vector of M × N coefficients and by calculating the Euclidean distance between both vectors. Similarly, the signature and a reference signature may match on condition that the mean square error is below a given value. In another example, the signature and a reference signature may match on condition that the mean square error obtained with the reference signature is lower than the other mean square errors obtained with the other reference signatures of the set of reference signatures.
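A matching sketch for this second variant; flattening the matrices and averaging the squared differences gives the mean square error (a lower score is a better match), and the reference labels are hypothetical:

```python
import numpy as np

def mean_square_error(signature, reference):
    """MSE between two same-shaped heatmaps, via their vectorized forms."""
    a = np.asarray(signature, dtype=float).ravel()
    b = np.asarray(reference, dtype=float).ravel()
    return float(np.mean((a - b) ** 2))

def best_match(signature, references):
    """Pick the reference label with the lowest MSE against the signature."""
    return min(references,
               key=lambda label: mean_square_error(signature, references[label]))
```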

Any technique for evaluating similarity between signatures may be applicable to embodiments described herein.

Figure 11A and Figure 11B are two tables illustrating two confusion matrices obtained when recognizing a gesture based on respectively phase and magnitude heatmaps according to an embodiment. Both confusion matrices have been obtained by repeating attempts to recognize a gesture among the set of pre-defined gestures described in Figure 5. Both matrices represent the rate at which each gesture has been successfully recognized, using a two-dimensional-correlation-based matching.

Figure 12 is a diagram illustrating an example of a method for recognizing a gesture. According to embodiments, in a step 1210, a first port of a (e.g., at least two-port) radiating element may be fed with a first signal sweeping across a range of frequencies. For example, the first port may be orthogonally polarized with a second port. According to embodiments, in a step 1220, a second signal may be received from the second port of the radiating element, in the presence of (e.g., an instance of) the gesture in a near field zone of the radiating element. The second signal may comprise a part of the near field power scattered by the part of the body having accomplished the (e.g., instance of the) gesture. According to embodiments, in a step 1230, a signature of the (e.g., instance of the) gesture may be obtained. The signature may be representative of variations of a signal characteristic of the second signal relative to the first signal as a function of time and frequency.

According to embodiments, the signal characteristic may be any of a phase and a magnitude.

According to embodiments, the signature may comprise a set of values of the signal characteristic of the second signal relative to the first signal varying according to time and frequency.

According to embodiments, an initial second signal may be obtained relative to the first signal from the second port of the radiating element in the absence of any gesture in the near field zone of the radiating element, the first port of the radiating element being fed by the first signal.

According to embodiments, initial variations of the signal characteristic of the initial second signal may be obtained relative to the first signal as a function of time and frequency.

According to embodiments, the signature may be based on a difference between the initial variations of the signal characteristic and the variations of the signal characteristic.

According to embodiments, in a step 1240, the gesture may be recognized based on a matching of the signature to a reference signature of the gesture.

According to embodiments, the matching may be based on a correlation between the signature and the reference signature.

According to embodiments, the matching may be based on a mean square error between the signature and the reference signature.

According to embodiments, the reference signature may be obtained by a preliminary training with a corresponding (e.g., pre-determined, initial instance of the) gesture.

According to embodiments, the reference signature may be preconfigured.

According to embodiments, the reference signature may belong to a set of reference signatures corresponding to respective gestures, the gesture being recognized among the respective gestures by matching the signature to the set of reference signatures.

According to embodiments, the recognized gesture may trigger an action in a user interface.

According to embodiments, the gesture may be any of a hand gesture and a finger gesture.
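Tying steps 1210 to 1240 together, a rough end-to-end sketch under stated assumptions: `measure_relative_sweep` is a hypothetical callback standing in for feeding the first port and capturing the second, and matching is done here by mean square error on magnitude heatmaps (one of the variants above):

```python
import numpy as np

def recognize_gesture(measure_relative_sweep, references, n_sweeps=1000):
    """End-to-end sketch of steps 1210-1240 (all names hypothetical).

    `measure_relative_sweep()` returns one complex 1D array of
    second-signal-over-first-signal samples, one per frequency point,
    captured while the first port is fed (steps 1210 and 1220).
    `references` maps gesture labels to reference magnitude heatmaps.
    """
    # Step 1230: stack per-sweep samples into a magnitude heatmap signature.
    sweeps = np.stack([measure_relative_sweep() for _ in range(n_sweeps)])
    signature = np.abs(sweeps)
    # Step 1240: match the signature to the reference set (lowest MSE wins).
    errors = {label: float(np.mean((signature - ref) ** 2))
              for label, ref in references.items()}
    return min(errors, key=errors.get)
```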

Embodiments described herein may allow building low-power, low-cost, privacy-preserving gesture recognition systems. Indeed, unlike computer vision-based systems, which may recognize gestures by processing images of the user accomplishing the gesture, RF-based gesture recognition systems as described herein may not rely on any image of the user for recognizing the gesture. While not explicitly described, the present embodiments may be employed in any combination or sub-combination. For example, the present principles are not limited to the described variants, and any arrangement of variants and embodiments may be used. Moreover, embodiments described herein are not limited to the (e.g., finger and hand) gestures and frequency bands described herein, and any other types of gestures (e.g., involving other body parts) and/or frequency bands (and/or antenna shapes) may be compatible with the embodiments described herein.

Besides, any characteristic, variant or embodiment described for a method is compatible with an apparatus comprising means for processing the disclosed method, with a device comprising a processor configured to process the disclosed method, with a computer program product comprising program code instructions and with a non-transitory computer-readable storage medium storing program instructions.

Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer readable medium for execution by a computer or processor. Examples of non-transitory computer-readable storage media include, but are not limited to, a read only memory (ROM), random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs).

Moreover, in the embodiments described above, processing platforms, computing systems, controllers, and other devices containing processors are noted. These devices may contain at least one Central Processing Unit ("CPU") and memory. In accordance with the practices of persons skilled in the art of computer programming, reference to acts and symbolic representations of operations or instructions may be performed by the various CPUs and memories. Such acts and operations or instructions may be referred to as being "executed," "computer executed" or "CPU executed." One of ordinary skill in the art will appreciate that the acts and symbolically represented operations or instructions include the manipulation of electrical signals by the CPU. An electrical system represents data bits that can cause a resulting transformation or reduction of the electrical signals and the maintenance of data bits at memory locations in a memory system to thereby reconfigure or otherwise alter the CPU's operation, as well as other processing of signals. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to or representative of the data bits. It should be understood that the representative embodiments are not limited to the above-mentioned platforms or CPUs and that other platforms and CPUs may support the provided methods.

The data bits may also be maintained on a computer readable medium including magnetic disks, optical disks, and any other volatile (e.g., Random Access Memory ("RAM")) or non-volatile (e.g., Read-Only Memory ("ROM")) mass storage system readable by the CPU. The computer readable medium may include cooperating or interconnected computer readable medium, which exist exclusively on the processing system or are distributed among multiple interconnected processing systems that may be local or remote to the processing system. It is understood that the representative embodiments are not limited to the above-mentioned memories and that other platforms and memories may support the described methods.

In an illustrative embodiment, any of the operations, processes, etc. described herein may be implemented as computer-readable instructions stored on a computer-readable medium. The computer-readable instructions may be executed by a processor of a mobile unit, a network element, and/or any other computing device.

There is little distinction left between hardware and software implementations of aspects of systems. The use of hardware or software is generally (e.g., but not always, in that in certain contexts the choice between hardware and software may become significant) a design choice representing cost vs. efficiency tradeoffs. There may be various vehicles by which processes and/or systems and/or other technologies described herein may be effected (e.g., hardware, software, and/or firmware), and the preferred vehicle may vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle. If flexibility is paramount, the implementer may opt for a mainly software implementation. Alternatively, the implementer may opt for some combination of hardware, software, and/or firmware.

The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples may be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.

Although features and elements are provided above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. The present disclosure is not to be limited in terms of the particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations may be made without departing from its spirit and scope, as will be apparent to those skilled in the art. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly provided as such. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims. The present disclosure is to be limited only by the terms of the appended claims, along with the full scope of equivalents to which such claims are entitled. It is to be understood that this disclosure is not limited to particular methods or systems. In certain representative embodiments, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), and/or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, may be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein may be distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a CD, a DVD, a digital tape, a computer memory, etc., and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).

The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples, and that in fact many other architectures may be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively "associated" such that the desired functionality may be achieved. Hence, any two components herein combined to achieve a particular functionality may be seen as "associated with" each other such that the desired functionality is achieved, irrespective of architectures or intermediate components. Likewise, any two components so associated may also be viewed as being "operably connected", or "operably coupled", to each other to achieve the desired functionality, and any two components capable of being so associated may also be viewed as being "operably couplable" to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.

With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.

It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims), are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, where only one item is intended, the term "single" or similar language may be used. As an aid to understanding, the following appended claims and/or the descriptions herein may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to embodiments containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should be interpreted to mean "at least one" or "one or more"). The same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" will be understood to include the possibilities of "A" or "B" or "A and B."
Further, the term "any of," followed by a listing of a plurality of items and/or a plurality of categories of items, as used herein, is intended to include "any of," "any combination of," "any multiple of," and/or "any combination of multiples of" the items and/or the categories of items, individually or in conjunction with other items and/or other categories of items. Moreover, as used herein, the term "set" or "group" is intended to include any number of items, including zero. Additionally, as used herein, the term "number" is intended to include any number, including zero.

In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.

Moreover, the claims should not be read as limited to the provided order or elements unless stated to that effect. In addition, use of the terms "means for" in any claim is intended to invoke 35 U.S.C. §112, ¶6 or means-plus-function claim format, and any claim without the terms "means for" is not so intended.