

Title:
ELECTRONIC PROCESSING DEVICE FOR BIOMETRIC SIGNALS THAT ALLOWS THE MANIPULATION OF ACTUATORS
Document Type and Number:
WIPO Patent Application WO/2022/229687
Kind Code:
A1
Abstract:
The present invention seeks to allow the selection of input biometric signals and output systems over a wide range; to process and classify these biometric signals universally by integrating different specialized and optimized electronic processing components with different processing methods, allowing direct analog and digital electrical connections that do not require communication with an external unit for processing; and implements, to achieve the above, a plurality of sensor units; a modular processing unit; and a plurality of connection units that allow connection with a plurality of external electronic devices through wired or wireless connections.

Inventors:
PONCE CRUZ PEDRO (MX)
MOLINA GUTIÉRREZ ARTURO (MX)
MATA JUÁREZ OMAR (MX)
Application Number:
PCT/IB2021/055933
Publication Date:
November 03, 2022
Filing Date:
July 01, 2021
Assignee:
INST TECNOLOGICO ESTUDIOS SUPERIORES MONTERREY (MX)
International Classes:
G06F3/041; G06F3/01; G10L15/08
Foreign References:
US20070024579A1 (2007-02-01)
US5360971A (1994-11-01)
US6433679B1 (2002-08-13)
US20160189491A1 (2016-06-30)
US20050248901A1 (2005-11-10)
US20100100646A1 (2010-04-22)
US20070233425A1 (2007-10-04)
US20090043580A1 (2009-02-12)
Attorney, Agent or Firm:
ESPEJO-HINOJOSA, Vicente Octavio et al. (MX)
Claims:

NEW INVENTION

1. An electronic processing device characterized by: a power source to provide power to the processing device; a plurality of sensor units configured to send at least one input signal to a modular processing unit; a selection user interface that is in communication with the modular processing unit and is configured to allow the exchange, as needed by a user, of each input signal; a modular processing unit configured to process each input signal received by the plurality of sensor units and to send at least one control signal to a plurality of connection units; and a plurality of connection units configured to receive each control signal from the modular processing unit, to connect to a plurality of external electronic devices and to transmit each control signal to the plurality of external electronic devices.

2. The device according to claim 1, further characterized because each sensor unit of the plurality of sensor units is selected from the group comprising: a head motion sensor unit, a voice command sensor unit, an eye movement sensor unit, and/or an ultrasonic sensor unit.

3. The device in accordance with claim 1, further characterized because the input signals received by the modular processing unit are selected from: EMG (electromyography) signals, EEG (electroencephalography) signals, inertial signals, signals received from an acceleration sensor, and/or signals received by an optical sensor.

4. The device according to claim 1, further characterized because each connection unit of the plurality of connection units is selected from the group comprising: wired connectivity ports (an analog audio output, an Ethernet connector, a set of general input and output pins, universal serial buses or USB ports), a Bluetooth unit, and/or a Wi-Fi unit.

5. The device in accordance with claim 1, further characterized because each electronic device of the plurality of electronic devices is selected from the group comprising: televisions, lighting systems, computer systems, smartphones, or any other electronic hardware with an analog or digital input connection.

6. The device in accordance with claim 1, further characterized because the power source for the electronic processing device is selected from a battery bank, capacitors, or the power supply network of any house, building, school, hospital, or any other place where the device is used.

7. The device according to claim 1, further characterized by the modular processing unit comprising: a first microcontroller of at least 8 bits of word length configured to receive input signals, process input signals, and to write and read data from a second microcontroller as well as from each digital-to-analog conversion (DAC) circuit; a second microcontroller of at least 32 bits of word length configured to process the written data of the first microcontroller and/or each DAC conversion circuit, and to write and read data from the first microcontroller and/or each DAC conversion circuit; at least one DAC conversion circuit configured to convert the written data of the first and second microcontrollers, and to write and read data from the first and second microcontrollers and/or the plurality of connection units; a first port of the first microcontroller with a universal asynchronous serial communication protocol configured to write and read data at a rate of at least 38400 bits per second; a second port of the second microcontroller with a universal asynchronous serial communication protocol configured to write and read data at a rate of at least 38400 bits per second, the first port and the second port being configured for information exchange between the first microcontroller and the second microcontroller; a third port of the first microcontroller and a fourth port of the second microcontroller with a universal asynchronous serial communication protocol configured to receive and transmit data at a rate of at least 38400 bits per second, intended for the exchange of information with other elements; a processor configured to perform multiprocessing tasks and to process data read from the elements of the modular processing unit; a RAM configured to store data from the different processes performed by the modular processing unit; a fan configured to exchange the heat generated by the elements of the modular processing unit; and a memory card configured to store data, software and/or an operating system of the modular processing unit.

8. The device in accordance with claim 7, further characterized by the modular processing unit being configured with a plurality of processing methods.

9. The device according to claim 8, further characterized because each processing method of the plurality of processing methods is selected from: a method using artificial intelligence algorithms such as fuzzy logic, neural networks, and signal filtering; an intelligent control processing method; and/or a machine learning processing method.

10. The device in accordance with claim 9, further characterized because the modular processing unit is configured to connect to external electronic devices through the plurality of connection units by means of the UART communication protocol, the connections being wired or wireless, depending on the characteristics and technical specifications of the external electronic devices.

11. The device according to claim 10, further characterized because the first microcontroller is configured to use an I2C communication protocol (two-wire interface) that is enabled by two pins of the first microcontroller and operates in a master and slave configuration.

12. The device according to claim 11, further characterized because the first microcontroller has at least 23 general digital input and output pins that can be configured independently.

13. The device according to claim 12, further characterized because the first microcontroller has an additional serial peripheral interface with a master or slave configuration.

14. The device according to claim 13, further characterized by the peripheral characteristics of the first microcontroller including a timer/counter of at least 16 bits, at least one 8-bit timer/counter, at least six pulse width modulation channels, and at least eight analog-to-digital conversion channels to sample low-frequency signals with a maximum frequency of 250 Hz.

15. The device according to claim 14, further characterized by the first microcontroller operating at a minimum clock frequency of 16 MHz.

16. The device according to claim 7, further characterized because in the modular processing unit each DAC circuit has at least a resolution of 12 bits, an operating voltage of 5 V with an I2C serial communication protocol and a voltage output of 0 V to 5 V.

17. The device according to claim 1, further characterized because the selection user interface comprises a touch screen showing a human-machine interface (HMI) of a plurality of HMI interfaces, each HMI interface corresponding to one of the sensor units of the plurality of sensor units, and each HMI interface being configured to allow interaction between the user and the device.

18. The device according to claim 17, further characterized because the selection user interface includes a switch.

19. The device according to claim 18, further characterized because each HMI interface of the plurality of HMI interfaces is selected from: an eye movement sensor interface, an acceleration sensor interface, and/or a voice command sensor interface.

20. The device in accordance with claim 19, further characterized because the plurality of HMI interfaces is configured to allow the selection of a wired or wireless connection with external electronic devices, determining how each control signal is sent to the external electronic device with which the user wishes to interact and/or communicate through each input signal.

21. The device according to claim 20, further characterized because each HMI user interface of the plurality of HMI interfaces is configured to interact with the user input information and is interconnected with the modular processing unit.

22. The device in accordance with claim 2, further characterized in that the plurality of sensor units includes a head movement unit comprising an acceleration sensor mounted inside a wireless headband, the acceleration sensor being configured to send an input signal to a second microcontroller; a second microcontroller of at least 32 bits that is configured to execute an algorithm to calculate tilt angles, filter the input signal of the acceleration sensor, and use a fuzzy logic controller (FLC); and an FLC controller that is configured to send a processed input signal to the first microcontroller.

23. The device according to claim 22, further characterized in that the acceleration sensor is selected from a triple-axis accelerometer to measure a degree of inclination of the user's head in an "x" direction and in a "y" direction, without taking into account the position of the sensor within the headband; the acceleration sensor is in communication with the second microcontroller via an I2C serial communication protocol; and the FLC controller processes the input signal and sends a result to the first microcontroller.

24. The device according to claim 23, further characterized in that the communication between the first microcontroller and the second microcontroller is performed with a universal asynchronous serial communication protocol, with a data write and read speed that reaches at least 38400 bits per second.

25. The device in accordance with claim 24, further characterized because the first microcontroller is configured to also use an I2C serial communication protocol to transmit data between other devices.

26. The device in accordance with claim 22, further characterized by the modular processing unit being configured to use a DAC circuit to convert the information received from the acceleration sensor into at least one control signal that controls external electronic devices.

27. The device in accordance with claim 26, further characterized in that the control signal of the DAC circuit is transmitted to external electronic devices using one of the plurality of connection units.

28. The device in accordance with claim 22, further characterized in that the modular processing unit, relying on the FLC controller, is configured to use an intelligent control processing method to control external electronic devices with a particular behavior based on the inclination detected by the acceleration sensor in the "x" and "y" directions, which corresponds to a head tilt motion based on the user's decisions.

29. The device in accordance with claim 28, further characterized in that the intelligent control processing method comprises three main stages for generating a control signal: a fuzzification stage that is configured to evaluate the degree of membership that an input value has with respect to the different degrees of membership within a language rule domain, and to obtain an input membership value; an inference mechanism stage that is configured to evaluate the if-then linguistic rules of a language rule domain, and to obtain an output membership with respect to the input membership value; and a defuzzification stage that is configured to convert an output membership degree into an output membership value, and to calculate the numerical response of the FLC controller; at each stage the modular processing unit is in communication with the FLC controller, which writes the data for processing.

30. The device in accordance with claim 29, further characterized because, for each range or degree of the calculated inclination of the acceleration sensor, membership functions are chosen for the fuzzification stage of the FLC controller.

31. The device in accordance with claim 29, further characterized in that, for the inference mechanism stage, the language rules chosen are based on human experience.

32. The device in accordance with claim 29, further characterized because the FLC controller is configured to use a linear membership function for the defuzzification stage and to produce a control signal at the end of the defuzzification stage.

33. The device according to claim 29, further characterized because the FLC controller is represented as a surface of its input-output relationship; for this purpose, surfaces are generated for the "x" direction and the "y" direction of the acceleration sensor.

34. The device in accordance with claim 33, further characterized because the surfaces are subjected to regression analysis to obtain a polynomial expression.

35. The device in accordance with claim 34, further characterized because the polynomial expression is embedded in the second microcontroller to modify the hardware of the modular processing unit and allow interaction with external electronic devices.

36. The device in accordance with claim 35, further characterized because the resulting polynomial expressions are as follows:

In the above expressions, M and D represent an external electronic device control signal in the "x" and "y" direction respectively; a, b, c and d are the regression coefficients obtained, which are calculated based on the polynomial regression analysis; and x and y are the independent variables that correspond to the measurement of the tilt angle of the acceleration sensor.

37. The device in accordance with claim 36, further characterized because the polynomial expressions M and D are programmed and embedded within the second microcontroller and consequently generate a control signal to send it to the plurality of connection units and then to external electronic devices.

38. The device according to claim 22, further characterized because the first microcontroller is configured to turn on an LED to indicate user interaction each time the user tries to perform a movement or action.

39. The device in accordance with claim 2, further characterized in that the plurality of sensor units includes an eye movement sensor unit comprising at least one eye movement sensor that is mounted on a headband, each eye movement sensor being positioned so that it is in front of a user's eye, and each eye movement sensor generating an analog signal that varies with the movements of the user's eye; each eye movement sensor is connected to at least one analog-to-digital converter (ADC) circuit of the modular processing unit; each ADC converter circuit is configured to receive the analog signal sent by each eye movement sensor, and to sample the analog signal of each sensor.

40. The device in accordance with claim 39, further characterized by the modular processing unit additionally including a second microcontroller that is configured to receive a sampled signal from each ADC converter circuit.

41. The device according to claim 40, further characterized because the modular processing unit is additionally configured to execute an algorithm capable of receiving the sampled signal from the ADC circuit, applying a series of cascaded high-pass and low-pass filters to condition the sampled signal, taking a set of samples of the sampled signal, and running on them an algorithm with a multilayer neural network to identify eye movement patterns.

42. The device in accordance with claim 41, further characterized because the second microcontroller is in communication with the first microcontroller wirelessly through a universal asynchronous serial communication protocol, with a data write and read speed that reaches at least 38400 bits per second.

43. The device according to claim 42, further characterized because the first microcontroller is configured to also use an I2C serial communication protocol to transmit data between other devices.

44. The device in accordance with claim 43, further characterized because the modular processing unit is configured to use the second microcontroller to process the information of each DAC circuit, and through one of the plurality of connection units send a control signal that controls the external electronic devices.

45. The device in accordance with claim 44, further characterized in that the control signal is transmitted to external electronic devices using one of the plurality of connection units.

46. The device in accordance with claim 39, further characterized by the eye movement sensor also comprising at least two optical sensors that are placed just in front of each eye of the user, on the sides of the iris in the direction of the sclera of the eye.

47. The device according to claim 46, further characterized by the eye movement sensor being able to distinguish at least three eye movements.

48. The device according to claim 47, further characterized because the eye movement sensor is placed on the headband of the head movement unit.

49. The device according to claim 48, further characterized because the headband of the head movement unit is placed at the top of the head, around the user's forehead; and the eye movement sensor unit is installed on the front of the headband, so that the sensor is positioned right in front of the user's eye.

50. The device in accordance with claim 2, further characterized because the plurality of sensor units includes a voice command sensor unit comprising a microphone with a conventional headphone jack as a means for the user to record their voice, the microphone being connected to a speech recognition circuit; and a speech recognition circuit that is connected to the microphone and is configured to process the audio signal received from the microphone, identify at least one voice command in the audio signal, and send a character string to the modular processing unit via a channel with a universal asynchronous serial communication protocol, with a data write and read speed that reaches at least 38400 bits per second.

51. The device according to claim 50, further characterized in that the modular processing unit, using the first microcontroller, is additionally configured to receive the characters sent by the speech recognition module, process the character string, pass a control signal through a DAC circuit, and send it through the plurality of connection units to external electronic devices.

52. The device in accordance with claim 51, further characterized because the voice command sensor unit requires calibration before use, for which the user is required to record voice commands in advance for the training of the speech recognition circuit.

53. The device in accordance with claim 1, further characterized because it comprises, in addition, an obstacle avoidance module that is configured to be implemented when the processing device is arranged in an electric mobility device; the avoidance module uses an ultrasonic sensor unit to help a user prevent collisions with static or moving objects.

54. The device in accordance with claim 53, further characterized by the obstacle avoidance module comprising an ultrasonic sensor unit that is divided into a first and second plurality of ultrasonic sensors; the first and second pluralities of ultrasonic sensors are distributed along an electric mobility device and are configured to measure the presence of nearby objects within a range of 1.2 m or less, to avoid collisions and help the user safely direct the mobility device in any type of environment.

55. The device in accordance with claim 54, further characterized because the obstacle avoidance module is configured to avoid a collision in the presence of moving or fixed objects or persons.

56. The device in accordance with claim 54, further characterized because the first plurality of ultrasonic sensors is distributed at the front of the electric mobility device, as well as at the rear.

57. The device in accordance with claim 56, further characterized by the second plurality of ultrasonic sensors being distributed on the right and left sides of the electric mobility device.

58. The device according to claim 57, further characterized because the first plurality and the second plurality of ultrasonic sensors are configured to send an input signal to the modular processing unit; the input signal is then sampled by the first microcontroller and provides feedback to the user about their environment through the selection user interface, via one of the plurality of HMI interfaces.

59. The device according to claim 58, further characterized because one of the plurality of HMI interfaces is configured to display sensor readings to the user and to provide feedback on the blind spots of the user's view.

60. The device in accordance with claim 58, further characterized by the input signal being sent to the DAC circuit.

61. The device according to claim 60, further characterized in that the DAC circuit comprises, in addition: an I2C connector with two voltage output connectors, which are configured to emit two control signals: a first forward and reverse motion signal, and a second left-turn and right-turn motion signal; a power connector of at least 5 V; and two digital-to-analog converters.

62. The device according to claim 61, further characterized because the DAC circuit is configured to send the first and second motion signals to the plurality of connection units; the plurality of connection units is configured to send the first and second motion signals to the electric mobility device.

63. The device in accordance with claim 53, further characterized by the ultrasonic sensor unit comprising an obstacle avoidance controller based on fuzzy logic.

64. The device in accordance with claim 63, further characterized because the obstacle avoidance controller is configured to perform an obstacle avoidance method comprising: a fuzzification stage that is configured to evaluate a degree of membership of an input value for each of the measured distances of the first and second plurality of ultrasonic sensors deployed around the electric mobility device, with respect to the membership degrees of a language rule domain, the domain including two membership functions labeled "Near" and "Far", and to obtain an input membership value; an inference mechanism stage that is configured to evaluate the if-then linguistic rules of the language rule domain, and to obtain an output membership with respect to an input membership value; and a defuzzification stage that is configured to convert a degree of output membership into a numerical value, and to calculate the numerical response of the obstacle avoidance controller; at each stage the modular processing unit is in communication with the obstacle avoidance controller, which sends the data for processing.

65. The device according to claim 64, further characterized because the membership function "Near" is configured to use a "Z" input membership function to represent the reading of the first and second plurality of ultrasonic sensors within a range of 0 m to 0.9 m.

66. The device in accordance with claim 64, further characterized in that the membership function "Far" is configured to use an input membership function in the form of "S" to represent the reading of the first and second plurality of ultrasonic sensors within a range of 0.1 m to 1.2 m.

67. The device in accordance with claim 64, further characterized in that, for the inference mechanism stage, the language rules chosen are based on human experience and, therefore, the linguistic rules are determined experimentally.

68. The device in accordance with claim 64, further characterized because, for the defuzzification stage, membership functions are chosen that represent the movements of the electric mobility device: neutral, forward, and backward; and thus send a motion signal if the presence of any obstacle in the direction of movement is detected.

69. The device according to claim 2, further characterized because the user selects which sensor unit of the plurality of sensor units to use; however, if no sensor unit is connected to the modular processing unit, an alarm is activated to tell the user that a control signal is not generated and that it is necessary to connect a sensor unit to the device.

70. The device in accordance with claim 1, further characterized in that each one of the processing methods of the modular processing unit and the input signals operate independently of each other.

71. The device in accordance with claim 1, further characterized because in each user interaction with the electronic processing device, the modular processing unit turns on an LED to indicate this interaction.

72. An electronic processing method characterized by comprising the stages of: sending at least one input signal from a plurality of sensor units to a modular processing unit; processing each input signal received by the plurality of sensor units through a modular processing unit; sending each control signal from the modular processing unit to a plurality of connection units; receiving each control signal from the modular processing unit at the plurality of connection units; establishing a connection from the plurality of connection units with a plurality of external electronic devices; and transmitting each control signal from the plurality of connection units to the plurality of external electronic devices.

73. An electronic processing method characterized by comprising the stages of: sending at least one input signal from a plurality of sensor units to a modular processing unit; exchanging each input signal according to the needs of a user with the use of a selection user interface that is in communication with the modular processing unit; processing each input signal received by the plurality of sensor units through a modular processing unit; sending each control signal from the modular processing unit to a plurality of connection units; receiving each control signal from the modular processing unit at the plurality of connection units; establishing a connection from the plurality of connection units with a plurality of external electronic devices; and transmitting each control signal from the plurality of connection units to the plurality of external electronic devices.

Description:
ELECTRONIC PROCESSING DEVICE FOR BIOMETRIC SIGNALS THAT ALLOWS THE MANIPULATION OF ACTUATORS

FIELD OF INVENTION

The present invention is related to biometric signal processing devices, and more particularly is related to an electronic processing device of biometric signals that allows the manipulation of actuators and electrical systems, and related methods.

BACKGROUND OF INVENTION

The last decade has seen an increase in portable devices that are especially focused on implementing constant measurements of biometric signals.

Certain parts of the human body produce weak biometric signals, the most typical being electrical brain signals, electrocardiographic signals, electromyographic signals and electrical signals from the eye.

A wide range of sensors has been implemented to measure these signals, from cameras that capture images to accelerometers, voice recording systems, muscle activity identifiers and even brain sensors; these sensors, however, are limited in their usability and adaptability.

There are different systems used for the detection and measurement of these biometric signals, such as that described in patent document US20090018419A1 , which refers to an eye motion detection device, which allows the monitoring and non-customizable interpretation of a single biometric parameter. However, the device described in this document requires specialized calibration techniques, which prevents the device from using parameters that fit with adaptive and continuous calibration on the fly.

Additionally, patent document JP2017526078A describes a method and system for using eye signals with secure mobile communications, including a portable device where a user is identified based on the recognition of their iris. Patent document US20160342206A1 describes a device and method for tracking a user's eye and head movements, including a plurality of optoelectronic sensor cells. In addition, document US6421064B1 describes a system for controlling automatic navigation of information, including a display, a computer system, and a gimbal sensor system to track the position and movement of a user's head and eye. However, the above-mentioned patent documents are limited because they do not allow the user to select which input biometric signals to prioritize; they do not allow customization and adaptation of parameters for sorting and collecting; and they additionally require connection to a computer to perform the output control tasks.

In addition to the above, patent document KR101728707B1 describes a portable eyeglass device and a method for controlling it, using a camera that follows eye movements. However, the device and method are limited in the number of input signals they can use and in the output systems they can manipulate.

Another document in addition to those described above is patent document US9788714B2, which describes a high-precision image classification and collection system with artificial intelligence that can be implemented by combining a video camera based on eye-facing sensors, a head orientation sensor, a display and an electronic circuit that connects the eye sensors, the head sensor and the display. However, the document has limitations in its operation because it relies on high-performance processors to perform signal processing.

As a result, the aim has been to eliminate the disadvantages presented by the systems and devices currently used for the detection and measurement of biometric signals, by developing a device for the electronic measurement and processing of biometric signals that allows actuators and electrical or electronic systems to be controlled and manipulated; that, in addition to allowing the selection, processing and classification of input biometric signals and output systems over a wide range, processes and classifies biometric signals in a universal way; that integrates different specialized and optimized electronic processing components with different processing techniques; and that also allows direct analog and digital electrical connections that do not require communication with an external unit for processing.

OBJECTS OF INVENTION

Considering the defects of the prior art, it is an object of the present invention to provide an electronic processing device of biometric signals that allows the manipulation of actuators, which selects input biometric signals and output systems over a wide range.

It is another object of the present invention to provide an electronic processing device of biometric signals that allows the manipulation of actuators that processes and classifies biometric signals universally.

It is another object of the present invention to provide an electronic processing device of biometric signals that allows the manipulation of actuators that integrates different specialized and optimized electronic processing components with different processing techniques.

It is yet another object of the present invention to provide an electronic processing device of biometric signals that allows the manipulation of actuators that allows direct analog and digital electrical connections that do not require communication with an external unit for processing.

It is an additional object of the present invention to provide an electronic processing device of biometric signals that allows the manipulation of actuators that allows the development of a method of processing input biometric signals.

These and other objects are achieved by means of an electronic processing device of biometric signals that allows the manipulation of actuators and related methods in accordance with the present invention.

BRIEF DESCRIPTION OF THE INVENTION

To do this, an electronic processing device has been invented comprising: a power source to provide power to the processing device; a plurality of sensor units configured to send at least one input signal to a modular processing unit; a selection user interface that is in communication with the modular processing unit and is configured to allow exchange according to a user's needs of each input signal; a modular processing unit configured to process each input signal received by the plurality of sensor units, to filter and classify input signals, and to send at least one control signal to a plurality of connection units; and a plurality of connection units configured to receive each control signal from the modular processing unit, to connect to a plurality of external electronic devices and to transmit each control signal to the plurality of external electronic devices.

Other aspects of the invention consider an electronic processing method comprising the stages of: sending at least one input signal from a plurality of sensor units to a modular processing unit; processing each input signal received by the plurality of sensor units through a modular processing unit; filtering and classifying the input signals through the modular processing unit; sending each control signal from the modular processing unit to a plurality of connection units; receiving each control signal from the modular processing unit at the plurality of connection units; establishing a connection from the plurality of connection units with a plurality of external electronic devices; and transmitting each control signal from the plurality of connection units to the plurality of external electronic devices.

BRIEF DESCRIPTION OF THE DRAWINGS

The novel aspects that are considered characteristic of the present invention will be established with particularity in the accompanying claims. However, some modalities, characteristics, and some objects and advantages thereof will be better understood from the detailed description when read in relation to the annexed drawings, in which:

Figure 1 illustrates an outline of the electronic processing device according to a first modality of the present invention.

Figure 2 illustrates a provision of a modular processing unit according to a modality of the present invention.

Figure 3 illustrates a layout for a user interface according to a modality of the present invention.

Figure 4 illustrates a schematic for a head motion sensor unit according to a modality of the present invention.

Figure 5 illustrates a diagram of the intelligent control processing method according to a modality of the present invention.

Figure 6 illustrates a schematic for an eye movement sensor unit according to a modality of the present invention.

Figure 7 illustrates a schematic for a voice command sensor unit according to a modality of the present invention.

Figure 8 illustrates an outline of an obstacle avoidance module according to a second modality of the present invention.

Figure 9 illustrates a diagram of the method of avoidance of obstacles according to a modality of the present invention.

Figure 10 illustrates a diagram of the electronic processing method according to a third modality of the present invention.

Figure 11 illustrates a diagram of the electronic processing method according to a modality of the present invention.

DETAILED DESCRIPTION OF THE INVENTION

The present invention seeks to allow the selection of input biometric signals and output systems over a wide range; to process and classify these biometric signals universally by integrating different specialized and optimized electronic processing components with different processing methods, allowing direct analog and digital electrical connections that do not require communication with an external unit for processing; and implements, to achieve the above, a plurality of sensor units; a modular processing unit; and a plurality of connection units that allow connection with a plurality of external electronic devices through wired or wireless connections.

Thus, in one aspect of the invention an electronic processing device is described comprising: a power source to provide power to the processing device; a plurality of sensor units configured to send at least one input signal to a modular processing unit; a selection user interface that is in communication with the modular processing unit and is configured to allow the exchange, as needed by a user, of each input signal; a modular processing unit configured to process each input signal received by the plurality of sensor units and to send at least one control signal to a plurality of connection units; and a plurality of connection units configured to receive each control signal from the modular processing unit, to connect to a plurality of external electronic devices and to transmit each control signal to the plurality of external electronic devices.

In one modality of the present invention, each sensor unit of the plurality of sensor units is selected from the group comprising: a head motion sensor unit, a voice command sensor unit, an eye movement sensor unit, an ultrasonic sensor unit, among others.

In an additional modality of the present invention, the input signals received by the modular processing unit are preferably selected from: EMG (electromyography) signals, EEG (electroencephalography) signals, inertial signals, signals received from an acceleration sensor, signals received by an optical sensor, among others. Input signals enrich the external information that can be processed for decision making and provide auxiliary information to the device user. This is an advantage because the device can be used by people with limited capacities to perform tasks where external information is needed.

In addition, in a mode of the present invention, each connection unit of the plurality of connection units is selected from the group comprising: wired connectivity ports (an analog audio output, an Ethernet connector, a set of general input and output pins, universal serial buses or USB ports), a Bluetooth unit, a Wi-Fi unit, among others.

In an additional modality of the present invention, each electronic device of the plurality of electronic devices is selected from the group comprising: televisions, lighting systems, computer systems, smartphones, or any other electronic hardware with an analog or digital input connection.

The electronic processing device is connected to the power source, for example, via a USB Type-C connector and a micro-USB power supply connector. The total power consumption of the electronic processing device is approximately 18W.

The power source for the electronic processing device can be a battery bank, capacitors or a power supply network of any home, building, school, hospital or anywhere this device is used. If the electronic processing device is to be used with an electric mobility device, the device must be energized, for example, with a power source capable of supplying at least 3.5 A, at a voltage of at least 5.1 V.
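For reference, the minimum supply described for the mobility-device scenario delivers 5.1 V × 3.5 A ≈ 17.9 W, which is consistent with the total consumption of approximately 18 W stated above.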

In another mode of the present invention, the modular processing unit comprises: a first microcontroller of at least 8 bits of word length configured to receive the input signals, process the input signals, and to write and read data from a second microcontroller as well as from each digital-to-analog conversion (DAC) circuit; a second microcontroller of at least 32 bits of word length configured to process the written data of the first microcontroller and/or each DAC conversion circuit, and to write and read data from the first microcontroller and/or each DAC conversion circuit; at least one DAC conversion circuit configured to convert the written data from the first and second microcontrollers, and to write and read data from the first and second microcontrollers and/or the plurality of connection units; a first port of the first microcontroller with a universal asynchronous serial communication protocol configured to write and read data at a rate of at least 38400 bits per second; a second port of the second microcontroller with a universal asynchronous serial communication protocol configured to write and read data at a rate of at least 38400 bits per second, the first port and the second port being configured for information exchange between the first microcontroller and the second microcontroller; a third port of the first microcontroller and a fourth port of the second microcontroller with a universal asynchronous serial communication protocol configured to write and read data at a rate of at least 38400 bits per second, intended for the exchange of information with other elements; a processor configured to perform multiprocessing tasks and to process data read from the elements of the modular processing unit; a RAM configured to store data from the different processes performed by the modular processing unit; a fan configured to exchange the heat generated by the elements of the modular processing unit; and a memory card configured to store data, software and/or an operating system of the modular processing unit.

Also, in one modality of the present invention, the modular processing unit is configured with a plurality of processing methods, where each processing method of the plurality of processing methods is selected from: a method using artificial intelligence algorithms such as fuzzy logic, neural networks, and signal filtering; an intelligent control processing method; a machine learning processing method; among others.

The modular processing unit can be configured to connect to external electronic devices through the plurality of connection units via the UART communication protocol, connections can be wired or wireless, depending on the characteristics and technical specifications of external electronic devices.

Preferably, the first microcontroller is configured to use a two-wire interface (I2C) communication protocol that is enabled by two pins of the first microcontroller and operates in a master and slave configuration. In addition, the first microcontroller can have at least 23 general digital input and output pins that can be configured independently. The first microcontroller can have an additional serial peripheral interface with a master or slave configuration. The peripheral characteristics of the first microcontroller can include a timer/counter of at least 16 bits, at least one 8-bit timer/counter, at least six pulse width modulation channels, and at least eight analog-to-digital conversion channels to sample low-frequency signals with a maximum frequency of 250 Hz. Finally, this first microcontroller can operate at a minimum clock frequency of 16 MHz.
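Purely as an illustration of the kind of serial exchange described above (it is not part of the patent), the following sketch assumes a Python-capable host on the 32-bit side, the pyserial library, and a hypothetical port name and message format; only the 38400 bit/s rate comes from the description:

import serial  # pyserial; the library choice and port name below are assumptions

# The 38400 bit/s rate matches the minimum write/read speed described for the
# UART link between the first and second microcontrollers; "/dev/ttyS0" is hypothetical.
link = serial.Serial("/dev/ttyS0", baudrate=38400, timeout=0.1)

def send_reading(tilt_x: float, tilt_y: float) -> None:
    """Send a simple comma-separated tilt reading to the other microcontroller."""
    link.write(f"{tilt_x:.1f},{tilt_y:.1f}\n".encode("ascii"))

def read_message() -> str:
    """Read one newline-terminated message, returning an empty string on timeout."""
    return link.readline().decode("ascii", errors="ignore").strip()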

In the modular processing unit, the DAC circuit has, for example, at least a 12-bit resolution, a 5 V operating voltage with an I2C serial communication protocol, and a voltage output of 0 V to 5 V.

Additionally, in a mode of the present invention, the selection user interface comprises a touch screen showing a human-machine interface (HMI) of a plurality of HMI interfaces; each HMI interface corresponds to one of the sensor units of the plurality of sensor units, and each HMI interface is configured to allow interaction between the user and the device.
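Referring back to the DAC specification above (12-bit resolution, 0 V to 5 V output), the following is a minimal sketch of how a desired output voltage could be mapped to a converter code; the helper name and the clamping behavior are illustrative assumptions, not part of the patent:

def dac_code(voltage: float, vref: float = 5.0, bits: int = 12) -> int:
    """Map a desired 0-5 V output voltage to a 12-bit DAC code (0-4095)."""
    voltage = max(0.0, min(vref, voltage))          # clamp to the converter's output range
    return round(voltage / vref * ((1 << bits) - 1))

# Example: a mid-scale output of 2.5 V corresponds to a code of about 2048.
print(dac_code(2.5))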

In an additional modality of the present invention, the selection user interface includes a switch.

Each interface of the plurality of HMI interfaces is selected from: an eye motion sensor interface, an acceleration sensor interface, and a voice command sensor interface. In the plurality of HMI interfaces, each of the HMI interfaces has, for example, a start button or condition and/or a stop button that can function as a condition to allow the use of the sensor units.

The plurality of HMI interfaces can be configured to allow the selection of a wired or wireless connection with external electronic devices, determining, for example, how each control signal will be sent to the external electronic device with which the user wishes to interact and/or communicate through each input signal.

Each user interface can be configured to interact with user input information and can be interconnected with the modular processing unit, each user interface is configured, for example, to display and/or select each input signal. Additionally, each user interface can help the user establish a connection with external electronic devices. The Bluetooth connection between external electronic devices and the modular processing unit may require running a specific application on both to allow connection.

In a mode where the plurality of sensor units includes a head motion unit comprising an acceleration sensor mounted inside a wireless headband, the acceleration sensor is configured to send an input signal to a second microcontroller; a second microcontroller of at least 32 bits that is configured to execute an algorithm to calculate tilt angles, filter the input signal of the acceleration sensor, and use a fuzzy logic controller (FLC); and an FLC controller that can be configured to send a processed input signal to the first microcontroller. The acceleration sensor is selected from a triple-axis accelerometer to measure a degree of tilt of the user's head in an "x" direction and in a "y" direction, regardless of the position of the sensor within the headband; the acceleration sensor is in communication, for example, with the second microcontroller via an I2C serial communication protocol, so that the FLC controller can process the input signal and send a result to the first microcontroller. Communication between the first microcontroller and the second microcontroller is done with a universal asynchronous serial communication protocol, which can be wireless using a Bluetooth module; the speed of writing and reading data can reach at least 38400 bits per second. The first microcontroller can also be configured to use an I2C serial communication protocol to transmit data between other devices. The modular processing unit can be configured to use a DAC circuit to convert information received from the acceleration sensor into at least one control signal that controls external electronic devices. Each control signal of the DAC circuit is transmitted to external electronic devices using one of the plurality of connection units.
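The tilt-angle calculation itself is not detailed in the text; the sketch below shows one standard way to derive two tilt angles from a triple-axis accelerometer reading, as an assumption of what the second microcontroller's algorithm might compute:

import math

def tilt_angles(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Return (tilt_x, tilt_y) in degrees from the three raw accelerometer axes."""
    tilt_x = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    tilt_y = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    return tilt_x, tilt_y

# Example: with gravity only on the z axis (a level head), both angles are 0 degrees.
print(tilt_angles(0.0, 0.0, 1.0))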

It is only necessary to place the headband on top of the head, around the user's forehead, because it does not require any calibration or training to start using the head motion sensor unit. The headband is designed to be compact and lightweight, with wireless communication, with a modular design.

The modular processing unit, relying on the FLC controller, can be configured to use an intelligent control processing method to control external electronic devices with a particular behavior based on the inclination detected by the acceleration sensor in the "x" and "y" directions that corresponds to a head tilt motion based on user decisions.

The intelligent control processing method can comprise three main stages for generating a control signal: a fuzzification stage that is configured to evaluate the degree of membership that an input value has with respect to the different degrees of membership within a language rule domain, and to obtain an input membership value; an inference mechanism stage that is configured to evaluate the if-then linguistic rules of a language rule domain, and to obtain an output membership with respect to the input membership value; and a defuzzification stage that is configured to convert the degree of output membership into an output membership value, and to calculate the numerical response of the FLC controller; at each stage the modular processing unit is in communication, for example, with the acceleration sensor that writes the data for processing with the FLC controller. For each range or degree of the calculated tilt of the acceleration sensor, membership functions are chosen, for example, Gaussian membership functions, for the fuzzification stage of the FLC controller. For the inference mechanism stage, the chosen linguistic rules may be based on human experience; therefore, the linguistic rules can be determined experimentally. The FLC controller can be configured to use a linear membership function for the defuzzification stage and to produce, at the end of the defuzzification stage, for example, a control signal.

For the purposes of this application, the references made herein to the term "fuzzification" refer to the conversion of an actual variable into a degree of belonging that quantifies the degree of possession of the actual variable to its corresponding linguistic variable. In addition, the use of the term "defuzzification" refers to the conversion of a set of linguistic variables with a respective degree of belonging, into a real number.
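To make the three stages concrete, the following is a minimal single-input sketch of a fuzzy logic controller; the triangular membership shapes, the rule table, and the weighted-average defuzzification are illustrative assumptions rather than the controller actually embedded in the device:

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def flc_output(tilt: float) -> float:
    """Single-input FLC: head tilt in degrees -> normalized control value in [-1, 1]."""
    # Fuzzification: degree of membership of the input in each linguistic term.
    mu = {
        "left":   tri(tilt, -90.0, -45.0, 0.0),
        "center": tri(tilt, -45.0, 0.0, 45.0),
        "right":  tri(tilt, 0.0, 45.0, 90.0),
    }
    # Inference: if-then rules mapping each input term to an output singleton.
    rule_output = {"left": -1.0, "center": 0.0, "right": 1.0}
    # Defuzzification: weighted average of the rule outputs gives a crisp number.
    total = sum(mu.values())
    if total == 0.0:
        return 0.0
    return sum(mu[term] * rule_output[term] for term in mu) / total

print(flc_output(30.0))   # partially "center", partially "right" -> about 0.67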

It is important to note that the implementation of the FLC controller requires high computational capabilities from the second microcontroller. Therefore, it is necessary to represent the FLC controller in another way that requires fewer computational resources from the second microcontroller. The FLC controller can be represented as a surface of its input-output relationship; for this purpose surfaces are generated, for example, for the "x" direction and the "y" direction of the acceleration sensor. Because these curves or surfaces can represent the dynamics of the controller, it is possible to generate a model that represents its behavior to some extent.

This surface can be subjected to regression analysis to obtain a polynomial expression. This expression can be embedded in the second microcontroller to modify the hardware of the modular processing unit and allow interaction with external electronic devices. The resulting polynomial expressions are as follows:

M = ax³ + bx² + cx + d

D = ay³ + by² + cy + d

For the above expressions, M and D can represent a control signal for the control of external electronic devices in the "x" and "y" direction respectively. On the other hand, a, b, c and d can be the regression coefficients obtained, which are calculated based on the polynomial regression analysis. Finally, x and y can be the independent variables that correspond to the measurement of the tilt angle of the acceleration sensor. The M and D polynomial expressions can be programmed and embedded within the second microcontroller, and consequently these functions can generate a control signal to send to the plurality of connection units and then to external electronic devices. Implementation through regression of the FLC controller surface is less computationally expensive, making it suitable for integration into the second microcontroller.
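As a sketch of the surface-regression step, assuming a cubic polynomial (the degree is an assumption; only the coefficient names a, b, c and d come from the text) and an illustrative stand-in for the tabulated controller surface:

import numpy as np

# Stand-in for the FLC input-output surface sampled offline along the "x" direction;
# in practice the tabulated responses of the controller itself would be used here.
tilts = np.linspace(-90.0, 90.0, 181)
responses = np.clip(tilts / 45.0, -1.0, 1.0)          # illustrative surface only

# Fit M(x) = a*x**3 + b*x**2 + c*x + d by polynomial regression.
a, b, c, d = np.polyfit(tilts, responses, deg=3)

def control_signal_m(x: float) -> float:
    """Cheap polynomial replacement for the FLC surface, suited to an embedded target."""
    return a * x**3 + b * x**2 + c * x + d

print(control_signal_m(30.0))    # approximates the controller response at 30 degrees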

The first microcontroller can be configured to turn on an LED to indicate user interaction each time the user tries to perform a movement or action. There can be a total of five LEDs for that purpose, one for each direction and tilt of the head, thus providing feedback to the user on the behavior of the electronic processing device and allowing the user to start, stop and interact with the device, also using each user interface.

In a mode where the plurality of sensor units includes an eye movement sensor unit, the unit can comprise at least one eye movement sensor that is mounted on a headband; each eye movement sensor is positioned so that it is in front of a user's eye, and each eye movement sensor generates an analog signal that varies with the movements of the user's eye; each eye movement sensor can be connected to at least one analog-to-digital converter (ADC) circuit of the modular processing unit; each ADC converter circuit is configured to receive the analog signal sent by each eye movement sensor, and to sample the analog signal of each eye movement sensor. The modular processing unit can include a second microcontroller that is configured to receive a sampled signal from each ADC converter circuit. The processing unit is additionally configured to execute an algorithm capable of receiving the sampled signal from the ADC circuit, applying a series of cascaded high-pass and low-pass filters to condition the sampled signal, taking a set of samples of the sampled signal, and running on them an algorithm with a multilayer neural network to identify patterns of eye movement. The second microcontroller can be in communication with the first microcontroller, for example, wirelessly via a universal asynchronous serial communication protocol; the speed of writing and reading data can reach at least 38400 bits per second. The first microcontroller can be configured to also use an I2C serial communication protocol to transmit data between other devices. The modular processing unit can be configured to use the second microcontroller to process the information of each DAC circuit, and through one of the plurality of connection units send a control signal that controls external electronic devices. Each control signal is transmitted to external electronic devices using one of the plurality of connection units. Training a multilayer neural network for pattern identification requires high computational capabilities and a previously identified database; therefore, it is necessary to perform the multilayer neural network training previously on a computer to obtain the coefficients and activation functions that result from learning. These coefficients are used to perform basic matrix operations, so that fewer computational resources of the second microcontroller are required. This eye processing unit does not require calibration or training by the user.
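Since only basic matrix operations run on the second microcontroller, a forward pass of the pre-trained multilayer network might look like the sketch below; the layer sizes, the ReLU activation, and the random placeholder weights are assumptions for illustration, with the real coefficients coming from the offline training described above:

import numpy as np

def relu(v: np.ndarray) -> np.ndarray:
    return np.maximum(v, 0.0)

def classify_eye_movement(window: np.ndarray, w1, b1, w2, b2) -> int:
    """Forward pass of a small pre-trained network over a conditioned sample window."""
    hidden = relu(w1 @ window + b1)       # first layer: matrix product plus bias
    scores = w2 @ hidden + b2             # output layer: one score per movement class
    return int(np.argmax(scores))         # index of the detected eye movement pattern

# Illustrative shapes only: 32 conditioned samples in, 8 hidden units, 3 movement classes.
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(8, 32)), np.zeros(8)
w2, b2 = rng.normal(size=(3, 8)), np.zeros(3)
print(classify_eye_movement(rng.normal(size=32), w1, b1, w2, b2))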

The eye movement sensor also comprises at least two optical sensors that are placed right in front of each of the user's eyes, on the sides of the iris in the direction of the sclera of the eye. Each eye movement sensor may be able to distinguish at least three eye movements (look left, right, and down). These movements can be combined in different ways to define commands that the neural network can identify more easily. The eye movement sensor can be placed on the headband of the head movement unit. The headband of the head movement unit is placed at the top of the head, around the user's forehead; and the eye movement sensor unit can be installed on the front of the headband, so that the sensor is positioned right in front of the user's eye.

The plurality of sensor units can include a voice command sensor unit that can be configured to allow the user to control external electronic devices via voice commands. The voice command sensor unit can comprise a microphone with a conventional headphone jack as a means for the user to record their voice, the microphone being connected to a speech recognition circuit; and a speech recognition circuit that is connected to the microphone and is configured to process the audio signal received from the microphone, identify at least one voice command in the audio signal, and send a character string to the modular processing unit, through a channel with a universal asynchronous serial communication protocol; the speed of writing and reading data can reach at least 38400 bits per second. The modular processing unit uses the first microcontroller, which is additionally configured to receive the characters sent by the speech recognition module, process the character string, pass a control signal through a DAC circuit, and send it through the plurality of connection units to external electronic devices. This voice command sensor unit requires calibration before use; for this, the user may be required to record voice commands in advance for the training of the voice recognition circuit.
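A minimal sketch of the character-string handling on the first microcontroller, assuming a hypothetical command table and the 12-bit, 0-5 V DAC output described earlier; the command names and voltage levels are illustrative only:

# Hypothetical table: recognized voice commands -> desired DAC output voltage (0-5 V).
COMMANDS = {"ON": 5.0, "OFF": 0.0, "UP": 3.5, "DOWN": 1.5}

def handle_command(line: bytes):
    """Map a character string from the speech recognition circuit to a 12-bit DAC code."""
    text = line.decode("ascii", errors="ignore").strip().upper()
    if text not in COMMANDS:
        return None                                   # unknown command: no control signal
    return round(COMMANDS[text] / 5.0 * 4095)         # 0-5 V mapped onto codes 0-4095

print(handle_command(b"up\n"))                        # -> code for a 3.5 V control signal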

In an alternative modality of the present invention, the electronic processing device additionally comprises an obstacle avoidance module that is configured to be implemented when the processing device is installed in an electric mobility device; the avoidance module employs an ultrasonic sensor unit to help a user prevent collisions with static or moving objects.

The obstacle avoidance module can comprise a unit of ultrasonic sensors, which is divided, for example, into a first and a second plurality of ultrasonic sensors; the first and second pluralities of ultrasonic sensors can be distributed along an electric mobility device and can be configured to measure the presence of nearby objects within a range of 1.2 m or less, to avoid collisions and help the user direct the mobility device safely in any type of environment. The obstacle avoidance module is configured, for example, to avoid a collision in the presence of moving or fixed objects or people. The first plurality of ultrasonic sensors can be distributed at the front of the electric mobility device as well as at the rear. The second plurality of ultrasonic sensors can be distributed on the right and left sides of the electric mobility device. The first and second pluralities of ultrasonic sensors are preferably configured to send an input signal to the modular processing unit; the input signal is then sampled by the first microcontroller and provides feedback to the user about their environment through, for example, the selection user interface via one of the plurality of HMI interfaces. One of the plurality of HMI interfaces is configured, for example, to display sensor readings to the user and to provide feedback from the blind spots of the user's view. Additionally, the input signal can be sent to the DAC circuit; the DAC circuit can additionally comprise: an I2C connector with two voltage output connectors, which are configured to emit two control signals: a first forward and reverse motion signal, and a second left and right turn motion signal; a power connector of at least 5 V; and two digital-to-analog converters. The DAC circuit is configured, for example, to send the first and second motion signals to the plurality of connection units; the plurality of connection units can be configured to send the first and second motion signals to the electric mobility device.

The ultrasonic sensor unit can also comprise an obstacle avoidance controller based on fuzzy logic. The obstacle avoidance controller is configured, for example, to perform an obstacle avoidance method that is similar to the method performed by the FLC controller, but instead of using constant or linear membership functions for the defuzzification stage and producing crisp output values, the defuzzification functions can be triangular, Gaussian, trapezoidal, etc.

The obstacle avoidance method performed by the obstacle avoidance controller may include: a fuzzification stage that is configured to evaluate the degree of membership of an input value, for each of the distances measured by the first and second plurality of ultrasonic sensors deployed around the electric mobility device, with respect to the degrees of membership defined in a domain of linguistic rules, the domain including two membership functions labeled "Near" and "Far", and to obtain an input membership value; an inference mechanism stage that is configured to evaluate the linguistic rules of the domain in if-then form, and to obtain an output membership with respect to the input membership value; and a defuzzification stage that is configured to convert the output membership degree into a numerical value and to calculate the numerical response of the obstacle avoidance controller; at each stage the modular processing unit can be in communication with the obstacle avoidance controller, which sends the data for processing. The "Near" membership function is configured, for example, to use a "Z"-shaped input membership function to represent readings of the first and second plurality of ultrasonic sensors within a range of 0 m to 0.9 m. The "Far" membership function is configured, for example, to use an "S"-shaped input membership function to represent readings of the first and second plurality of ultrasonic sensors within a range of 0.1 m to 1.2 m. For the inference mechanism stage, the chosen linguistic rules may be based on human experience and can therefore be determined experimentally. For the defuzzification stage, membership functions are chosen that represent the movements of the electric mobility device: neutral (no movement of the electric mobility device), backward, and forward; a motion signal is thus sent if an obstacle is detected in the direction of movement.
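The following Python sketch illustrates, under stated assumptions, one way the described stages could fit together for a single front-facing distance reading: a "Z"-shaped "Near" function over 0 m to 0.9 m, an "S"-shaped "Far" function over 0.1 m to 1.2 m, two illustrative if-then rules, and triangular output membership functions for backward, neutral, and forward motion defuzzified by centroid. The rule set and the normalized output universe are illustrative choices, not part of the description.

import numpy as np

def z_mf(x, a, b):
    """Z-shaped membership function: 1 below a, 0 above b."""
    if x <= a: return 1.0
    if x >= b: return 0.0
    m = (x - a) / (b - a)
    return 1 - 2 * m**2 if m < 0.5 else 2 * (1 - m)**2

def s_mf(x, a, b):
    """S-shaped membership function: 0 below a, 1 above b."""
    return 1.0 - z_mf(x, a, b)

def tri_mf(x, a, b, c):
    """Triangular output membership function."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

def avoid(front_distance_m):
    # Fuzzification: "Near" (Z, 0-0.9 m) and "Far" (S, 0.1-1.2 m), as in the text.
    near = z_mf(front_distance_m, 0.0, 0.9)
    far = s_mf(front_distance_m, 0.1, 1.2)

    # Inference (illustrative rules): if Near then Backward; if Far then Forward;
    # if both partly active then Neutral. Output universe is a normalized command in [-1, 1].
    u = np.linspace(-1, 1, 201)
    backward = np.minimum(near, [tri_mf(v, -1.0, -0.5, 0.0) for v in u])
    neutral = np.minimum(min(near, far), [tri_mf(v, -0.5, 0.0, 0.5) for v in u])
    forward = np.minimum(far, [tri_mf(v, 0.0, 0.5, 1.0) for v in u])

    # Defuzzification: centroid of the aggregated triangular output sets.
    agg = np.maximum.reduce([backward, neutral, forward])
    return float(np.sum(u * agg) / (np.sum(agg) + 1e-9))

print(avoid(0.4))   # obstacle close -> negative (reverse) motion command
print(avoid(1.1))   # path clear     -> positive (forward) motion command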

The user can select which unit to use from the plurality of sensor units; however, if no sensor unit is connected to the modular processing unit, an alarm is preferably activated to tell the user that no control signal is being generated and that it is necessary to connect a sensor unit to the device. This allows the device to ensure, for example, that at least one control signal is sent to the plurality of external electronic devices.

Each of the processing methods and input signals of the modular processing unit can operate independently of the others.

In each user interaction with the electronic processing device, the modular processing unit, and specifically the first microcontroller, can turn on an LED to indicate this interaction. This provides feedback to the user on the behavior of the electronic processing device and preferably allows the user to start, stop, and interact with the device, also using the plurality of HMI interfaces.

The device adapts to the conditions of the user's input signals, which yields a device capable of controlling different output points, such as electric, pneumatic, and magnetic actuators, among others.

An additional modality of the invention describes an electronic processing method comprising the stages of: sending at least one input signal from a plurality of sensor units to a modular processing unit; processing each input signal received from the plurality of sensor units through the modular processing unit; sending each control signal from the modular processing unit to a plurality of connection units; receiving each control signal from the modular processing unit at the plurality of connection units; establishing a connection from the plurality of connection units to a plurality of external electronic devices; and transmitting each control signal from the plurality of connection units to the plurality of external electronic devices.

In an additional modality of the present invention, an electronic processing method is described comprising the stages of: sending at least one input signal from a plurality of sensor units to a modular processing unit; exchanging each input signal according to the needs of a user with the use of a selection user interface that is in communication with the modular processing unit; processing each input signal received from the plurality of sensor units through the modular processing unit; sending each control signal from the modular processing unit to a plurality of connection units; receiving each control signal from the modular processing unit at the plurality of connection units; establishing a connection from the plurality of connection units to a plurality of external electronic devices; and transmitting each control signal from the plurality of connection units to the plurality of external electronic devices.
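Purely as a structural sketch, the stages of the method can be read as the following Python outline; the helper objects and method names (read, exchange, process, receive, connect, transmit) are hypothetical placeholders and are not defined in the present description.

# Minimal sketch of the described processing method, assuming hypothetical
# helper objects for each stage; names and data types are illustrative only.

def electronic_processing_method(sensor_units, processing_unit, connection_units, devices, ui=None):
    for unit in sensor_units:
        signal = unit.read()                       # send an input signal to the processing unit
        if ui is not None:
            signal = ui.exchange(signal)           # optionally exchange the signal per user needs
        control = processing_unit.process(signal)  # process each input signal
        for port in connection_units:
            port.receive(control)                  # connection unit receives the control signal
            port.connect(devices)                  # establish connection with external devices
            port.transmit(control, devices)        # transmit the control signal to the devices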

Referencing Figure 1, illustrating an electronic processing device 1000 comprising: a 1100 power source to power the processing device 1000; a plurality of 1200 sensor units configured to send at least one 1300 input signal to a 1400 modular processing unit; a 1500 selection user interface that is in communication with the 1400 modular processing unit and is configured to allow exchange, according to a user's needs, of each 1300 input signal; a 1400 modular processing unit configured to process each 1300 input signal received by the plurality of 1200 sensor units and to send at least one 1600 control signal to a plurality of 1700 connection units; and a plurality of 1700 connection units configured to receive each 1600 control signal from the 1400 modular processing unit, to connect to a plurality of external electronic devices 1800, and to transmit each 1600 control signal to the plurality of external electronic devices 1800.

Referencing Figure 2, which illustrates the modular processing unit 1400 comprising: a first microcontroller 1410 with a word length of at least 8 bits; a second microcontroller 1430 with a word length of at least 32 bits; at least one digital-to-analog conversion (DAC) circuit 1420; a first port 1411 of the first microcontroller 1410 with a universal asynchronous serial communication protocol 1440 configured to write and read data at a speed of at least 38,400 bits per second; a second port 1431 of the second microcontroller 1430 with a universal asynchronous serial communication protocol 1440 configured to write and read data at a speed of at least 38,400 bits per second, the first port 1411 and the second port 1431 being configured for information exchange between the first microcontroller 1410 and the second microcontroller 1430; a third port 1412 of the first microcontroller 1410 and a fourth port 1432 of the second microcontroller 1430 with the universal asynchronous serial communication protocol 1440 configured to write and read data at a speed of at least 38,400 bits per second, intended for the exchange of information with other elements; a processor (not shown) configured to perform multiprocessing tasks; a RAM memory (not shown); a fan for heat exchange (not shown); and a memory card (not shown) configured to store data, software and/or an operating system.
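For illustration, the following Python sketch (pyserial) opens a universal asynchronous serial link configured at the stated minimum speed of 38,400 bits per second and exchanges a short message; the device path and message format are assumptions, and on the actual microcontrollers this exchange would be implemented in firmware.

import serial  # pyserial; the port name below is an assumption for illustration

# Universal asynchronous serial link between the first and second
# microcontroller, at the minimum speed of 38,400 bit/s stated in the text.
link = serial.Serial("/dev/ttyUSB0", baudrate=38400, bytesize=8,
                     parity=serial.PARITY_NONE, stopbits=1, timeout=0.5)

link.write(b"TILT:12.5\n")          # e.g. processed result from the second microcontroller
reply = link.readline().decode()    # e.g. acknowledgement read by the first microcontroller
print(reply)
link.close()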

Figure 3 illustrates the 1500 user interface comprising a 1510 touch screen that shows a human-machine interface (HMI) of a plurality of HMI 1520 interfaces, one for each of the plurality of sensor units, configured to allow interaction between the user and the device. The user interface also includes a 1530 switch.

Figure 4 illustrates a 1210 head motion unit comprising a 1211 acceleration sensor mounted inside a 1212 wireless headband, the 1211 acceleration sensor being configured to send an input signal 1310 to a second 1430 microcontroller; a second 1430 microcontroller of at least 32 bits that is configured to execute an algorithm to calculate tilt angles, filter the acceleration sensor input signal, and use a fuzzy logic controller (FLC); and an FLC controller (not shown) that can be configured to send a processed 1310 input signal to the first 1410 microcontroller. The 1211 acceleration sensor can be a triple-axis accelerometer to measure the degree of inclination of the user's head in an "x" direction and in a "y" direction, regardless of the position of the 1211 acceleration sensor within the headband; the 1211 acceleration sensor is in communication, for example, with the second 1430 microcontroller via an I2C serial communication protocol, so that the FLC controller (not shown) can process input signal 1310 and send a result to the first 1410 microcontroller. Communication between the first 1410 microcontroller and the second 1430 microcontroller is performed with a universal asynchronous serial communication protocol 1440, which can be wireless using a Bluetooth module (not shown); the data write and read speed can reach a minimum of 38,400 bits per second. The first 1410 microcontroller can also be configured to use an I2C serial communication protocol to transmit data to other elements. The 1400 modular processing unit can be configured to use at least one 1420 DAC circuit to convert the information received from the 1211 acceleration sensor, processed by the second 1430 microcontroller and read through a 1440 communication channel by the first 1410 microcontroller, into at least one 1600 control signal that controls 1800 external electronic devices. Each 1600 control signal of the DAC 1420 circuit is transmitted to 1800 external electronic devices using one of the plurality of 1700 connection units.
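A minimal sketch of the tilt-angle calculation, assuming the triple-axis accelerometer reports accelerations in g and that roll and pitch correspond to the "x" and "y" tilt directions mentioned above, is shown below in Python; the axis convention and the filtering constant are assumptions.

import math

def tilt_angles(ax, ay, az):
    """
    Estimate head tilt from a triple-axis accelerometer reading (in g).
    Roll/pitch are derived from the gravity components; the axis naming is
    an assumption, since the text does not fix a specific convention.
    """
    roll = math.degrees(math.atan2(ay, az))                    # tilt around "x"
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))  # tilt around "y"
    return roll, pitch

def low_pass(previous, current, alpha=0.2):
    """Simple first-order filter applied to each new tilt estimate."""
    return previous + alpha * (current - previous)

# Example: head slightly tilted forward and to the right.
print(tilt_angles(0.17, 0.10, 0.98))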

Referencing Figure 5 now, which illustrates the intelligent control processing method 2100 that can comprise three main stages for generating a 1600 control signal: a 2110 fuzzification stage that is configured to evaluate the degree of membership of an input value, with respect to the different degrees of membership defined within a domain of linguistic rules, and to obtain an input membership value; a 2120 inference mechanism stage that is configured to evaluate the linguistic rules of the domain in if-then form, and to obtain an output membership with respect to the input membership value; and a 2130 defuzzification stage that is configured to convert an output membership degree into a numerical value and to calculate the numerical response of the FLC controller; at each stage the modular processing unit is in communication, for example, with the acceleration sensor, which writes the data for processing with the FLC controller. For each calculated range or degree of tilt of the acceleration sensor, membership functions are chosen, for example, Gaussian membership functions for the 2110 fuzzification stage of the FLC controller. For the inference mechanism stage 2120, the chosen linguistic rules may be based on human experience and can therefore be determined experimentally. The FLC controller can be configured to use a linear membership function for the 2130 defuzzification stage, producing, for example, a 1600 control signal at the end of the defuzzification stage.
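The following Python sketch illustrates the three stages for a single head-tilt input under stated assumptions: Gaussian input membership functions for the fuzzification stage, illustrative if-then rules for the inference mechanism stage, and linear (Sugeno-style) consequents for the defuzzification stage producing a crisp control value. The centers, widths, and consequent coefficients are placeholders, not values taken from the description.

import math

def gauss(x, c, sigma):
    """Gaussian input membership function used in the fuzzification stage."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def flc_control_signal(tilt_deg):
    """
    Sketch of the three FLC stages for a head-tilt input.
    Membership centers, widths, and linear consequents are assumptions.
    """
    # Fuzzification: Gaussian membership functions over the tilt angle.
    mu = {
        "negative": gauss(tilt_deg, -20.0, 8.0),
        "zero":     gauss(tilt_deg,   0.0, 8.0),
        "positive": gauss(tilt_deg,  20.0, 8.0),
    }
    # Inference + linear (Sugeno-style) defuzzification:
    # each if-then rule has a linear consequent y = a*tilt + b.
    consequents = {
        "negative": 0.02 * tilt_deg - 0.5,
        "zero":     0.0,
        "positive": 0.02 * tilt_deg + 0.5,
    }
    num = sum(mu[k] * consequents[k] for k in mu)
    den = sum(mu.values()) + 1e-9
    return num / den     # crisp control value sent toward the DAC circuit

print(flc_control_signal(15.0))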

Referencing Figure 6 now, which illustrates an eye movement sensor unit 1220 that can comprise at least one 1221 eye movement sensor mounted on a 1212 headband. Each 1221 eye movement sensor is positioned so that it is in front of a user's eye, generates an analog 1320 signal that varies with the movements of the user's eye, and can be connected to at least one analog-to-digital converter (ADC) circuit (not shown) of the 1400 modular processing unit; each ADC circuit (not shown) is configured to receive the 1320 analog signal sent by each 1221 eye movement sensor and to sample the analog signal 1320 from each 1221 eye movement sensor. The 1400 modular processing unit can include a second 1430 microcontroller of at least 32 bits of word length that is configured to receive a sampled signal from each ADC converter circuit (not shown). The 1400 modular processing unit is additionally configured to execute an algorithm capable of receiving the sampled signal from the ADC circuit (not shown), applying a series of cascaded high-pass and low-pass filters to condition the sampled signal, taking a set of samples of the conditioned signal, and running on them an algorithm with a multilayer neural network to identify patterns of eye movement. The second 1430 microcontroller can be in communication with the first 1410 microcontroller, for example, wirelessly via a universal asynchronous serial communication protocol 1440; the data write and read speed can reach a minimum of 38,400 bits per second. The first 1410 microcontroller can also be configured to use an I2C serial communication protocol to transmit data to other elements. The 1400 modular processing unit can be configured to use the second microcontroller to send the processed information to each DAC 1420 circuit and, through one of the plurality of 1700 connection units, send a 1600 control signal that controls the external electronic devices 1800. Each 1600 control signal of the DAC 1420 circuit is transmitted to 1800 external electronic devices using one of the plurality of 1700 connection units.

Figure 7 illustrates a 1230 voice command sensor unit that can be configured to allow the user to control external electronic devices 1800 via voice commands. The 1230 voice command sensor unit can comprise a 1231 microphone with a conventional headset connection as a means for the user to record their voice, the 1231 microphone being connected to a 1232 speech recognition circuit and configured to send an audio signal to it; and a 1232 speech recognition circuit that is configured to process the audio signal received from the 1231 microphone, identify at least one voice command in the audio signal, and send a character string to the 1400 modular processing unit through a channel with a universal asynchronous serial communication protocol 1330; the data write and read speed can reach a minimum of 38,400 bits per second. The first 1410 microcontroller can be additionally configured to receive the character string sent by the 1232 speech recognition circuit, process the character string, pass a 1600 control signal through a 1420 DAC circuit, and send it through the plurality of 1700 connection units to 1800 external electronic devices.
This voice command sensor unit requires calibration before use; for this, the user may be required to record voice commands in advance to train the speech recognition circuit.

Referencing Figure 8, which illustrates an obstacle avoidance module 1900 that can comprise a unit of ultrasonic sensors 1240, which is divided, for example, into a first plurality 1241 and a second plurality 1242 of ultrasonic sensors; the first 1241 and second 1242 pluralities of ultrasonic sensors can be distributed along an 1810 electric mobility device and can be configured to measure the presence of nearby objects within a range of 1.2 m or less, to avoid collisions and help the user safely direct the 1810 mobility device in any type of environment. The 1900 obstacle avoidance module is configured, for example, to avoid a collision in the presence of moving or fixed objects or people. The first plurality of 1241 ultrasonic sensors can be distributed on the front of the 1810 electric mobility device as well as on the rear. The second plurality of ultrasonic sensors 1242 can be distributed on the right and left sides of the 1810 electric mobility device. The first and second pluralities of ultrasonic sensors 1241, 1242 are preferably configured to send an input signal 1340 to the modular processing unit 1400; the 1340 input signal is then sampled by the first 1410 microcontroller and provides feedback to the user about their environment through, for example, the 1500 selection user interface via one of the plurality of HMI 1520 interfaces. One of the plurality of HMI 1520 interfaces is configured, for example, to display sensor readings to the user and to provide feedback from the blind spots of the user's view. Additionally, input signal 1340 can be sent to DAC circuit 1420; the DAC 1420 circuit can additionally comprise: an I2C connector with two voltage output connectors, which are configured to emit two control signals: a first forward and reverse motion signal 1610, and a second left and right turn motion signal 1620; a 5 V power connector; and two digital-to-analog converters. The DAC 1420 circuit is configured, for example, to send the first and second motion signals 1610, 1620 to the plurality of connection units 1700; the plurality of connection units 1700 can be configured to send the first and second motion signals 1610, 1620 to the 1810 electric mobility device.
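As a hedged sketch only, the following Python fragment (using the smbus2 library) shows how two control levels could be written through an I2C connection toward a two-channel DAC; the DAC address, channel registers, and level encoding are hypothetical, since the description does not specify a particular DAC component.

from smbus2 import SMBus

# Illustrative only: the I2C address and register layout of the DAC circuit
# are assumptions; the text does not specify a particular DAC component.
DAC_ADDR = 0x60
CH_FORWARD_REVERSE = 0x00   # first motion signal (1610)
CH_LEFT_RIGHT = 0x01        # second motion signal (1620)

def send_motion_signals(bus, forward_reverse, left_right):
    """Write the two 8-bit control levels to the hypothetical two-channel DAC."""
    bus.write_i2c_block_data(DAC_ADDR, CH_FORWARD_REVERSE, [forward_reverse & 0xFF])
    bus.write_i2c_block_data(DAC_ADDR, CH_LEFT_RIGHT, [left_right & 0xFF])

with SMBus(1) as bus:                       # I2C bus 1, e.g. on an embedded board
    send_motion_signals(bus, 200, 128)      # move forward, no turn (illustrative levels)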

The ultrasonic sensor unit can also comprise an obstacle avoidance controller based on fuzzy logic, embedded in the first microcontroller. The obstacle avoidance controller is configured, for example, to follow a method similar to the one performed by the FLC controller illustrated in Figure 5, but instead of using constant or linear membership functions for the defuzzification stage and producing crisp output values, the defuzzification functions can be triangular, Gaussian, trapezoidal, etc.

As shown in Figure 9, illustrating an obstacle avoidance method 2200 performed by the obstacle avoidance controller, comprising: a 2210 fuzzification stage that is configured to evaluate the degree of membership of an input value, for each of the distances measured by the first and second plurality of ultrasonic sensors deployed around the electric mobility device, with respect to the degrees of membership defined in a domain of linguistic rules, the domain including two membership functions labeled "Near" 2211 and "Far" 2212, and to obtain an input membership value; a 2220 inference mechanism stage that is configured to evaluate the linguistic rules of the domain in if-then form, and to obtain an output membership with respect to the input membership value; and a 2230 defuzzification stage that is configured to convert an output membership degree into a numerical value and to calculate the numerical response of the obstacle avoidance controller; at each stage the modular processing unit is in communication with the obstacle avoidance controller, which sends the data for processing. The "Near" 2211 membership function is configured to use a "Z"-shaped input membership function to represent readings of the first and second plurality of ultrasonic sensors within a range of 0 m to 0.9 m. The "Far" 2212 membership function is configured to use an "S"-shaped input membership function to represent readings of the first and second plurality of ultrasonic sensors within a range of 0.1 m to 1.2 m. For the inference mechanism stage 2220, the chosen linguistic rules are based on human experience and are therefore determined experimentally. For the defuzzification stage 2230, three membership functions representing the movements of the electric mobility device were selected: neutral (no movement of the electric mobility device), backward, and forward; a motion signal is thus sent if an obstacle is detected in the direction of movement.

Referencing Figure 10 now, which illustrates an electronic processing method 3000 comprising the stages of: sending at least one input signal from a plurality of sensor units to a modular processing unit; processing 3200 each input signal received from the plurality of sensor units through the modular processing unit; sending 3300 each control signal from the modular processing unit to a plurality of connection units; receiving 3400 each control signal from the modular processing unit at the plurality of connection units; establishing 3500 a connection from the plurality of connection units to a plurality of external electronic devices; and transmitting 3600 each control signal from the plurality of connection units to the plurality of external electronic devices.

Figure 11 illustrates a mode of the electronic processing method 3000 comprising the stages of: sending at least one input signal from a plurality of sensor units to a modular processing unit; exchanging 3700 each input signal according to a user's needs with the use of a selection user interface that is in communication with the modular processing unit; processing 3200 each input signal received from the plurality of sensor units through the modular processing unit; sending 3300 each control signal from the modular processing unit to a plurality of connection units; receiving 3400 each control signal from the modular processing unit at the plurality of connection units; establishing 3500 a connection from the plurality of connection units to a plurality of external electronic devices; and transmitting 3600 each control signal from the plurality of connection units to the plurality of external electronic devices.

In accordance with the above, it can be observed that the electronic biometric signal processing device that allows the manipulation of actuators, and the related methods, have been devised to select from a wide range of input biometric signals and output systems, and to process and classify these biometric signals by integrating different specialized and optimized electronic processing components with different processing methods, thus allowing direct analog and digital electrical connections that do not require communication with an external unit for processing. It will be evident to any subject matter expert that the modalities of the electronic biometric signal processing device that allows the manipulation of actuators, and of the related methods, as described above and illustrated in the accompanying drawings, are merely illustrative and non-limiting of the present invention, since numerous substantial changes in their details are possible without departing from the scope of the invention. For example, it is possible to use an obstacle avoidance module on the electronic processing device when it is implemented in an electric mobility device.

Therefore, the present invention should not be considered restricted except as required by the prior art and by the scope of the accompanying claims.