


Title:
A SYSTEM FOR ANALYSING PHYSICAL ACTIVITIES AND BIO-MARKERS
Document Type and Number:
WIPO Patent Application WO/2021/165861
Kind Code:
A1
Abstract:
The present invention provides a system (100) for monitoring physical activities and biomarkers of a user (200). The system (100) includes one or more cameras (10), one or more wearable devices (20), and a processing unit (30). The system (100) is configured in an electronic device (40). The camera (10) is adapted to record the physical activities and facial expressions of the user (200). The wearable device (20) has one or more bio-sensors (22) configured to sense the bio-parameters of the user (200). The processing unit (30) is connected to the camera (10) and the wearable device (20) to receive input information. The processing unit (30) compares the input information with a predefined set of conditions and generates advice instructions (90). The system (100) helps the user (200) to improve body posture and psychological and physiological behaviour.

Inventors:
MADNANI AKASH NANIKRAM (IN)
ZADGAONKAR AKSHAY UMESH (IN)
NAISE AMAR UMASHANKAR (IN)
BORKAR CHETAN VASANT (IN)
BAGDE JAYESH RAMBHAU (IN)
CHANDRA MANISH (IN)
Application Number:
PCT/IB2021/051360
Publication Date:
August 26, 2021
Filing Date:
February 18, 2021
Assignee:
MADNANI AKASH NANIKRAM (IN)
International Classes:
A61B5/11; A61B5/0205; A61B5/113
Foreign References:
US20160004224A1 (2016-01-07)
US20150359467A1 (2015-12-17)
Attorney, Agent or Firm:
SABALE, Suneet (IN)
Claims:
We Claim:

1. A system (100) for monitoring physical activities and biomarkers, the system (100) characterized in that: one or more cameras (10), the camera (10) is adapted to record physical activities, facial expressions of a user (200) when the camera (10) is operated accordingly; one or more wearable devices (20) arranged on the user's (200) body, the wearable device (20) is having one or more bio-sensors (22) configured to sense bio-parameters such as a breathing pattern, a heartbeat rate, a sweating rate, and the like of the user; a processing unit (30) is connected to the camera (10) and the wearable device (20) to receive input information comprising physical activities, facial expressions, and bio-parameters of the user (200) therefrom, wherein the processing unit (30) compares the input information with a predefined set of conditions of physical activities, facial expressions, and bio-parameters for determining mental and physical states of the user (200) stored in a memory unit (50) of the processing unit (30) and generates advice instructions (90) according to a predefined set of instructions therein.

2. The system (100) as claimed in claim 1, wherein the system (100) is a set of hardware (100a) and a software (100b) configured in an electronic device (40), the electronic device (40) is a mobile or a computer or a tablet or an IoT (internet of things) device; the camera (10) is a camera of the electronic device (40), and the electronic device (40) is configured with a set of software and hardware for enabling the camera (10) to record the physical activities, the facial expressions of the user; the processing unit (30) is a processor of the electronic device (40), and the electronic device (40) is configured with a set of software and hardware for enabling the processor to compare the input and generate the advice instructions through an output unit (60) of the electronic device (40), the output unit (60) is a display, a speaker, a projector and a printer.

3. The system (100) as claimed in claims 1 and 2, wherein the wearable device (20) is connected to the electronic device (40) through a wired or wireless connection.

4. The system (100) as claimed in claim 1, wherein the wearable device (20) is a smartwatch or a torso measurement device or a hearable device or a device which is in surface contact with the body of the user (200) for measuring bio-parameters such as a breathing pattern, a heartbeat rate, a sweating rate, and the like of the user (200).

5. The system (100) as claimed in claim 1, wherein the wearable device (20) is having one or more physiological parameter sensing units (26) for measuring physiological parameters, such as heart rate, blood pressure, body temperature, serum levels of various stress hormones, and immunological functions of the user (200).

6. The system (100) as claimed in claim 1, wherein the wearable device (20) is having one or more electrophysiological parameters sensing units (28), the electrophysiological parameters sensing unit (28) is adapted to measure the voltage changes or electric current or manipulations on a wide variety of scales from single ion channel proteins to whole organs like the heart and the brain, and also to perform an electroencephalogram on the user (200).

7. The system (100) as claimed in claim 1, wherein the wearable device (20) includes a GPS tracking device (24) for tracking a geographical location of the user and sending the geographical location to the processing unit (30).

8. The system (110) as claimed in claims 1 and 2, wherein the processing unit (30) is connected to a cloud storage (70), the cloud storage (70) is having a predefined set of conditions of physical activities, facial expressions, and bio-parameters, and the conditions are periodically updated by one or more electronic devices (40) connected thereto.

9. The system (110) as claimed in claims 1 and 8, wherein the system (110) enables one or more users (200a, 200b, 200c) to get connected mutually through one or more electronic devices (40a, 40b, 40c) having systems (110) configured therein to update the conditions stored in the cloud storage (70) and also use the updated stored information subsequently.

10. The system (110b) as claimed in claims 1, 8, and 9, wherein the one or more electronic devices (40a, 40b, 40c) are connected to a server (34), the server (34) is connected to the cloud storage (70), and a server processor (32) of the server (34) compares the input information from the one or more electronic devices (40a, 40b, 40c) with the predefined set of conditions in the cloud storage (70) for generating one or more advice instructions (90) according to a predefined set of instructions therein to one or more electronic devices (40a, 40b, 40c) connected to the server (34).

11. The system (100) as claimed in claim 1, wherein the advisory instructions (90) include body postures, facial expressions, and physical activities that need to be corrected by the user.

12. The system (100) as claimed in claim 1, wherein the processing unit (30) predicts possible physical activities or movements by comparing the input information with the predefined set of conditions of physical activities, facial expressions, and bio-parameters and generates advice instructions (90) as cautions or prediction instructions.

13. The system (120) as claimed in claims 1 and 2, wherein the system (120) is having a voice receiving unit (80) for receiving a voice input of the user (200), the voice receiving unit (80) is connected to the processor (30) to provide voice input information, the processing unit (30) compares the voice input with predefined conditions of voice according to various emotional states of the user to deduce the emotional state of the user (200) and generates advice instructions (90) according to the predefined set of instructions therein.

14. The system (120) as claimed in claim 13, wherein one of the instructions is to display contents such as music, images according to the emotional state of the user (200), the processing unit (30) actuates the output unit (60) to display the contents such as music, images according to the emotional state of the user (200) through the output unit (60).

Description:
"A System for Analysing Physical Activities and Bio-markers."

Field of the Invention

[0001] The present invention relates to a system for analyzing physical activities and bio-markers. More particularly, the present invention relates to the system for analyzing the physical activities, facial expressions, and bio-markers of human beings.

Background of the Invention

[0002] Generally, an individual must have correct body movements, mental state, and posture to perform daily activities accurately. Some physical activities demand correct body movements and posture to avoid injuries and thereby get the best results. Similarly, facial expressions reflect the individual's engagement in the activity.

[0003] Presently, there are many devices (some of them wearable devices) to measure bio-parameters (biomarkers) such as a breathing pattern, a heartbeat rate, a sweating rate, and the like of a user (human). Patent application WO2020194176A1 describes a mechanism for measuring the torso movement of a user. The mechanism helps in measuring the torso movements of a user of the mechanism. Similarly, patent application WO2018207051A1 describes a system and a method for monitoring human performance. The system helps in monitoring human performance. However, these mechanisms or systems do not provide visual and auditory feedback, based on the physical and mental activities of the user, to help the user correct or modify his/her physical activities or biomarkers.

[0004] Therefore, to help and guide the individual to perform the activity correctly, it is essential to monitor and map both the facial expressions and body movements correctly.

[0005] To date, there is no such system or method which can guide the individual to correct the body posture and facial expressions.

[0006] Therefore, there is a need to provide a system that can overcome the drawbacks of the existing prior art.

Objects of the Invention

[0007] An object of the present invention is to provide a system for analyzing physical activities and bio-markers.

[0008] Another object of the present invention is to provide a system for analyzing physical activities and bio-markers, wherein the system provides an audio-visual alert to a user to correct body movement and facial expressions.

[0009] Yet another object of the present invention is to provide a system for analyzing physical activities and bio-markers, wherein the system provides an audio-visual alert to a user to regulate the breathing rate or heart rate, thereby enhancing individual wellness.

[0010] One more object of the present invention is to provide a system for analyzing physical activities and bio-markers, wherein the system improves the work efficiency of a user.

[0011] Still another object of the present invention is to provide a system for analyzing physical activities and bio-markers, wherein the system is simple and economical.

Summary of the Invention

[0012] According to the present invention, there is provided a system for analyzing physical activities and bio-markers. The system includes one or more cameras, one or more wearable devices, and a processing unit. The system is configured in an electronic device. The electronic device is a mobile or a computer or a tablet or an IoT (internet of things) device.

[0013] The camera is adapted to record the physical activities and facial expressions of a user. The camera will record only when the camera is operated accordingly by the user. The camera is a camera of the electronic device. The electronic device is configured with a set of software and hardware for enabling the camera to record the physical activities and the facial expressions of the user.

[0014] The wearable device is arranged on the user’s body. The wearable device is a smartwatch or a torso measurement device or a hearable device or a device that is in surface contact with the body of the user for measuring bio-parameters such as a breathing pattern, a heartbeat rate, a sweating rate, and the like of the user. The wearable device is connected to the electronic device through a wired or wireless connection. The processing unit is connected to the camera and the wearable device.

[0015] The wearable device has one or more bio-sensors. The bio-sensor is configured to sense bio-parameters such as a breathing pattern, a heartbeat rate, a sweating rate, and the like of the user. Further, the wearable device includes a GPS tracking device. The GPS tracking device tracks the geographical location of the user and sends geographical location details to the processing unit. The wearable device has one or more physiological parameter sensing units for measuring physiological parameters, such as heart rate, blood pressure, body temperature, serum levels of various stress hormones (e.g. cortisol), and immunological functions of the user.

[0016] Further, the wearable device has one or more electrophysiological parameters sensing units. The electrophysiological parameters sensing unit is adapted to measure voltage changes or electric currents or manipulations on a wide variety of scales, from single ion channel proteins to whole organs such as the heart and the brain, and to perform an electroencephalogram on the user.

[0017] The processing unit is a processor of the electronic device. The processing unit receives input information comprising the physical activities, facial expressions, and bio-parameters of the user from the camera and the wearable device.

[0018] Further, the processing unit compares the input information with a predefined set of conditions of physical activities, facial expressions, and bio-parameters for determining the mental and physical states of the user. The predefined set of conditions is stored in a memory unit of the processing unit. The electronic device is configured with a set of software and hardware. The software and the hardware enable the processor to compare the input and generate the advice instructions.

[0019] The processing unit generates advice instructions according to a predefined set of instructions therein. The advisory instructions include body postures, facial expressions, and physical and mental activities that need to be corrected by the user. The advice instructions are generated through an output unit of the electronic device. The output unit is a display, a speaker, a projector, and a printer. In one embodiment, the processing unit predicts possible physical and mental activities or movements by comparing the input information with the predefined set of conditions of physical activities, facial expressions, and bio-parameters and generates advice instructions as cautions or prediction instructions.

[0020] In one embodiment, the processing unit is connected to a cloud storage. The cloud storage has a predefined set of conditions of physical and mental activities, facial expressions, and bio-parameters. The conditions are periodically updated by one or more electronic devices connected thereto. The system enables one or more users to get connected mutually through one or more electronic devices having systems configured therein to update the conditions stored in the cloud storage and also use the updated stored information subsequently.

[0021] In one more embodiment, the one or more electronic devices are connected to a server. The server is connected to the cloud storage. The server processor of the server compares the input information from one or more electronic devices with the predefined set of conditions in the cloud storage for generating one or more advice instructions according to a predefined set of instructions therein to one or more electronic devices connected to the server.

[0022] In an alternative embodiment, the system has a voice receiving unit for receiving a voice input of the user. The voice receiving unit is connected to the processor to provide voice input information. The processing unit compares the voice input with predefined conditions of voice according to various emotional states of the user to deduce the emotional state of the user and generates advice instructions according to the predefined set of instructions therein. One of the instructions is to display contents such as music, images according to the emotional state of the user. The processing unit actuates the output unit to display the contents such as music, images according to the emotional state of the user through the output unit.

Brief Description of the Drawings

[0023] The advantages and features of the present invention will be understood better with reference to the following detailed description of some embodiments of the invention and claims taken in conjunction with the accompanying drawings, wherein like elements are identified with like symbols, and in which:

[0024] Figure 1 shows a block diagram of a system for analyzing physical activities and bio-markers in accordance with the present invention;

[0025] Figure 1a shows a schematic view of an electronic device configured with the system shown in figure 1;

[0026] Figure 1b shows a schematic view of a wearable device of the system shown in figure 1;

[0027] Figure 2 shows a block diagram of an alternative embodiment of a system for analyzing physical activities and bio-markers in accordance with the present invention;

[0028] Figure 3 shows a block diagram of one more alternative embodiment of a system for analyzing physical activities and bio-markers in accordance with the present invention; and

[0029] Figure 4 shows a block diagram of one more alternative embodiment of a system for analyzing physical activities and bio-markers in accordance with the present invention.

Detailed Description of the Invention

[0030] An embodiment of this invention, illustrating its features, will now be described in detail. The words "comprising", "having", "containing", and "including", and other forms thereof, are intended to be equivalent in meaning and be open-ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items or meant to be limited to only the listed item or items.

[0031] The terms “first,” “second,” and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another, and the terms “an” and “a” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item.

[0032] The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms.

[0033] The present invention provides a system for analyzing physical activities and bio-markers. Further, the system provides an audio-visual alert to a user to correct body movement and facial expressions. Furthermore, the system improves the work efficiency of a user.

[0034] Referring now to figure 1, a block diagram of a system for analyzing physical activities and bio-markers in accordance with the present invention is illustrated. Hereinafter, the system for analyzing physical activities and bio-markers is referred to as the system (100). The system (100) includes one or more cameras (10), one or more wearable devices (20), and a processing unit (30). The processing unit (30) is connected to the camera (10) and the wearable device (20). In a preferred embodiment (figure 1a), the system (100) is configured in an electronic device (40). The electronic device (40) is a mobile or a computer or a tablet or an IoT (internet of things) device or any device capable of processing input and providing an output using a set of hardware and software configured therein.

[0035] The camera (10) is adapted to record the physical activities and facial expressions of a user (200). The camera (10) will record only when the camera (10) is operated accordingly by the user (200). Operating the camera (10) to capture/record the physical activities and facial expressions of the user (200) is considered to be within the ordinary skill of the user (200). In an embodiment, when the system (100) is configured in the electronic device (40), the camera (10) is a camera of the electronic device (40). The electronic device (40) is configured with a set of software and hardware for enabling the camera (10) to record the physical activities and the facial expressions of the user (200). The software can be a mobile application (app).

[0036] The wearable device (20) is arranged on the user's (200) body. In an embodiment, the wearable device (20) is a smartwatch. In an alternative embodiment, the wearable device (20) is a torso measurement device described in patent application WO2020194176A1 or a system described in patent application WO2018207051A1. In a preferred embodiment, the wearable device (20) is a hearable device or a device that is in surface contact with the body of the user (200) for measuring bio-parameters such as a breathing pattern, a heartbeat rate, a sweating rate, and the like of the user (200). The wearable device (20) is connected to the electronic device (40) through a wired or a wireless connection. The wireless connection can be through Bluetooth (trade name), Wi-Fi (trade name), the internet, or an intranet.

[0037] The wearable device (20) has one or more bio-sensors (22) (figure 1b). The bio-sensors (22) can be combinations of a heartbeat sensor (22a), a sweat sensor (22b), a breath sensor (22c), and the like. The bio-sensor (22) is configured to sense bio-parameters such as a breathing pattern, a heartbeat rate, a sweating rate, and the like of the user. Further, the wearable device (20) includes a GPS tracking device (24). The GPS tracking device (24) tracks the geographical location of the user (200) and sends the geographical location to the processing unit (30).
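By way of illustration only (not part of the patent disclosure), the sensed bio-parameters and the GPS location described above could be represented in software as a single reading sent from the wearable device (20) to the processing unit (30). The following minimal Python sketch uses hypothetical field names and units that are assumptions, not values taken from the application:

from dataclasses import dataclass

@dataclass
class WearableReading:
    heartbeat_bpm: float        # from the heartbeat sensor (22a)
    sweat_rate: float           # from the sweat sensor (22b), arbitrary normalized units
    breath_rate_per_min: float  # from the breath sensor (22c)
    latitude: float             # from the GPS tracking device (24)
    longitude: float

# Example reading forwarded from the wearable device (20) to the processing unit (30).
reading = WearableReading(heartbeat_bpm=72.0, sweat_rate=0.4,
                          breath_rate_per_min=14.0,
                          latitude=21.1458, longitude=79.0882)
print(reading)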

[0038] Further, the wearable device (20) has one or more physiological parameter sensing units (26) for measuring physiological parameters, such as heart rate, blood pressure, body temperature, serum levels of various stress hormones, and immunological functions of the user (200). The parameter sensing unit (26) can be configured with a body thermometer or a blood pressure sensor or any sensor adapted to sense and measure the physiological parameters with or without contacting the surface of the body of the user (200).

[0039] Furthermore, the wearable device (20) has one or more electrophysiological parameters sensing units (28). The electrophysiological parameters sensing unit (28) is adapted to measure voltage changes or electric currents or manipulations on a wide variety of scales, from single ion channel proteins to whole organs such as the heart and the brain. The electrophysiological parameters sensing unit (28) is adapted to perform an electroencephalogram on the user (200). The electrophysiological parameters sensing unit (28) may include electrodes.

[0040] The processing unit (30) receives input information comprising the physical activities, facial expressions, and bio-parameters of the user from the camera (10) and the wearable device (20). More specifically, the processing unit (30) compares the input information with a predefined set of conditions of physical activities, facial expressions, and bio-parameters for determining the mental and physical states of the user. The predefined set of conditions is stored in a memory unit (50) of the processing unit (30).

[0041] The processing unit (30) is a processor of the electronic device (40). The electronic device (40) is configured with a set of software and hardware for enabling the processor to compare the input and generate the advice instructions. The software calibrates the input received in real time. The software includes a feature extraction module. The feature extraction module extracts features from the input received. The features may include body posture, nose, eyes, and the like. The feature vectors are used to compare and specify a difference between the input and the predefined set of conditions.

[0042] Further, the processing unit (30) generates advice instructions (90) according to a predefined set of instructions therein. The processing unit (30) generates the advice instructions (90) as an audio, a visual, or an audio-visual output. The advisory instructions include body postures, facial expressions, and physical and mental activities that need to be corrected by the user (200). The advice instructions (90) are generated through an output unit (60) of the electronic device (40). The output unit (60) is a display, a speaker, a projector, a printer, or any such output unit (60) through which the user (200) receives an output.
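For illustration only (the application does not disclose a specific algorithm), the comparison step described above can be sketched in Python as a check of extracted feature values against the predefined conditions, with deviations turned into advice instructions. The feature names, normalized values, and tolerance below are assumptions:

from typing import Dict, List

def compare_with_conditions(features: Dict[str, float],
                            conditions: Dict[str, float],
                            tolerance: float = 0.1) -> List[str]:
    """Return one advice instruction for every feature that deviates
    from its predefined condition by more than the tolerance."""
    advice = []
    for name, target in conditions.items():
        observed = features.get(name)
        if observed is None:
            continue  # feature not extracted from this input
        if abs(observed - target) > tolerance:
            advice.append(f"Adjust {name}: observed {observed:.2f}, expected {target:.2f}")
    return advice

# Hypothetical feature vector extracted from the camera (10) and wearable device (20) input.
features = {"posture_tilt": 0.35, "smile_score": 0.20}
conditions = {"posture_tilt": 0.10, "smile_score": 0.80}
print(compare_with_conditions(features, conditions))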

[0043] For example, a user (200) is attending a corporate meeting. First, the user (200) needs to install the system (100) in his/her electronic device (40). The user (200) needs to wear the wearable device (20). The user needs to activate the system (100) by activating the electronic device (40) (more specifically, the camera (10) and the wearable device (20)). The camera (10) records the user's body movement, facial expressions, and other physical activities. Similarly, the wearable device (20) is configured with the heartbeat sensor (22a), the sweat sensor (22b), and the breath sensor (22c). The heartbeat sensor (22a) counts the heartbeats of the user, the sweat sensor (22b) measures the sweat rate of the user's body, and the breath sensor (22c) calculates the breathing rate of the user (200).

[0044] The camera (10) and the wearable device (20) send the captured and measured information (details) to the processing unit (30) as an input. The processing unit (30) is configured with a predefined set of conditions. One such condition is a suitable body movement, facial expression, and sweat rate for the user (200) in a corporate meeting. If the received information (details), such as the body movement, facial expressions, or sweat rate of the user (200), deviates from the preconfigured conditions, the processing unit (30) generates advice instructions (90). For example, a suitable condition of a facial expression is a face with a "smile". If the captured image of the user (200) has a face without a smile, the processing unit (30) generates an advice instruction (90) to the user (200) to change the expression into a smiling one.

[0045] Similarly, various sets of conditions can be configured in the memory unit (50). The memory unit (50) can be defined with conditions according to locations based on country, city, type of location, climate, and related categories. For example, the suitable facial expressions of a user (200) attending a business meeting in Japan or the USA, addressing a business group or a public meeting, addressing a crowd, or attending a casual meeting in a park, and the like. The user can get instructions according to the location (country, city).
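As an illustrative sketch only (the application does not specify how conditions are keyed), the location- and context-dependent conditions described above could be stored as a lookup table in Python; the country codes, context labels, and threshold values below are assumptions:

CONDITIONS_BY_CONTEXT = {
    ("JP", "business_meeting"): {"smile_score": 0.6, "sweat_rate_max": 0.5},
    ("US", "business_meeting"): {"smile_score": 0.8, "sweat_rate_max": 0.5},
    ("IN", "public_address"):   {"smile_score": 0.7, "sweat_rate_max": 0.6},
}

def conditions_for(country: str, context: str) -> dict:
    # Fall back to a generic profile when no location-specific entry is defined.
    return CONDITIONS_BY_CONTEXT.get((country, context),
                                     {"smile_score": 0.7, "sweat_rate_max": 0.5})

print(conditions_for("JP", "business_meeting"))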

[0046] In one more example of the system (100), the memory unit (50) is defined with a set of conditions of the body posture, facial expression, breath rate, and heart rate of a user (200) when the user (200) performs a physical exercise (meditation or pranayama) in a suitable manner. Whenever the user (200) wants to monitor his/her body posture, facial expression, breath rate, and heart rate while performing the physical exercise, the user (200) can actuate the system (100) by operating the camera (10) and the wearable device (20) accordingly.

[0047] The processing unit (30) of the system (100) compares the real-time body posture, facial expression, breath rate, and heart rate with the body posture, facial expression, breath rate, and heart rate of the user (200) when the user (200) performs the physical exercise (meditation or pranayama) in a suitable manner, and provides advice instructions (90) to the user (200) through the output unit (60) if these parameters deviate from the defined ones. Therefore, the system (100) can guide the user (200) (in this example, as a "wellness coach") for performing various physical activities according to the defined conditions.

[0048] Similarly, various conditions can be defined in the memory unit (50) according to various activities (such as physical exercises, meditations, yoga, aerobics) for enabling the user (200) to monitor his/her physical activities, facial expressions, and biomarkers through the system (100).
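A minimal sketch, assuming hypothetical target ranges, of the "wellness coach" check described above: the real-time breath rate and heart rate are compared against the range stored for the selected exercise, and advice is generated when a parameter falls outside it. The ranges are illustrative assumptions, not values from the application:

EXERCISE_CONDITIONS = {
    "pranayama":  {"breath_rate_per_min": (4, 8),  "heartbeat_bpm": (55, 80)},
    "meditation": {"breath_rate_per_min": (6, 10), "heartbeat_bpm": (50, 75)},
}

def check_exercise(exercise: str, measured: dict) -> list:
    # Compare each measured parameter with its stored target range for the exercise.
    advice = []
    for parameter, (low, high) in EXERCISE_CONDITIONS[exercise].items():
        value = measured[parameter]
        if not (low <= value <= high):
            advice.append(f"{parameter} is {value}; keep it between {low} and {high}.")
    return advice

print(check_exercise("pranayama", {"breath_rate_per_min": 12, "heartbeat_bpm": 70}))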

[0049] In one more embodiment, the memory unit (50) is defined with a set of conditions, and the processing unit (30) is defined with a set of instructions (including software) to predict possible physical activities or movements by comparing the input information with the predefined set of conditions of physical activities, facial expressions, and bio-parameters, and generates advice instructions (90) as cautions or prediction instructions. For example, the memory unit (50) is defined with a condition of the correct body posture of a user (200) while lifting a weight with a proper body balance. Further, the memory unit (50) is also defined with conditions of body postures which tend to make the user (200) lose his/her balance while lifting the weight.

[0050] Upon actuating the system (100) to monitor the user (200) while lifting a weight, if the user (200) follows the body postures which tend to make the user (200) lose his/her balance, the system (100) alerts the user (200) with a caution or prediction instruction (90) that his/her moves may cause him/her to lose body balance while lifting the weight. Similarly, the system (100) can predict possible physical activities of the user (200) according to the predefined conditions by comparing real-time conditions (input information) therewith.
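As an illustration of the prediction step described above (the application does not disclose the underlying method), one simple approach is to compare the observed posture against stored "correct" and "risky" postures and issue a caution when a risky posture is the closer match. The posture vectors and the distance metric below are assumptions:

import math

CORRECT_LIFT_POSTURE = [0.0, 1.0, 0.2]                    # hypothetical normalized joint angles
RISKY_LIFT_POSTURES = [[0.6, 0.4, 0.8], [0.7, 0.3, 0.9]]  # postures that tend to cause loss of balance

def distance(a, b):
    # Euclidean distance between two posture vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def predict_caution(observed_posture):
    to_correct = distance(observed_posture, CORRECT_LIFT_POSTURE)
    to_risky = min(distance(observed_posture, p) for p in RISKY_LIFT_POSTURES)
    if to_risky < to_correct:
        return "Caution: this posture may cause you to lose balance while lifting."
    return "Posture is consistent with the defined correct lifting posture."

print(predict_caution([0.55, 0.45, 0.75]))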

[0051] Referring to figure 2, a block diagram of an alternative embodiment of a system for analyzing physical activities and bio-markers in accordance with the present invention is illustrated. Hereinafter, this system for analyzing physical activities and bio-markers is referred to as the system (110). In the system (110), the processing unit (30) is connected to a cloud storage (70). The cloud storage (70) has a predefined set of conditions of physical activities, facial expressions, and bio-parameters. The conditions are periodically updated by one or more electronic devices (40a, 40b, 40c) connected thereto.

[0052] The system (110) enables one or more users (200a, 200b, 200c) to get connected mutually through one or more electronic devices (40a, 40b, 40c) having systems (110) to update the conditions stored in the cloud storage (70) and also use the updated stored information subsequently. "Using of updated stored information" here refers to configuring conditions, generating advice instructions according to the updated stored information, and the like. For enabling cloud-based interaction between the electronic devices (40a, 40b, 40c) having systems (110) therein, respective software (or mobile applications) is configured in the electronic devices (40a, 40b, 40c).

[0053] In one more embodiment (refer to figure 3) of a system (110b) for monitoring physical activities and biomarkers in accordance with the present invention, the one or more electronic devices (40a, 40b, 40c) are connected to a server (34). The server (34) is connected to the cloud storage (70). The server (34) has a server processor (32). The server processor (32) compares the input information from the one or more electronic devices (40a, 40b, 40c) with the predefined set of conditions in the cloud storage (70) for generating one or more advice instructions (90), according to the predefined set of instructions therein, to the one or more electronic devices (40a, 40b, 40c) connected to the server (34). The system (110b) enables multiple users (200a, 200b, 200c) to define conditions and process the conditions with real-time input using a single processor (i.e., the server (34)) and a single memory unit (i.e., the cloud storage (70)).
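For illustration only (the application does not prescribe a protocol or storage format), the server-side flow described above can be sketched as the server processor (32) comparing the input reported by each electronic device against the shared conditions held in the cloud storage (70) and returning advice per device; the field names and thresholds are assumptions:

CLOUD_CONDITIONS = {"smile_score_min": 0.8, "heartbeat_bpm_max": 100}

def server_compare(inputs_by_device: dict) -> dict:
    # inputs_by_device maps a device identifier (e.g. "40a") to its latest input information.
    advice_by_device = {}
    for device_id, data in inputs_by_device.items():
        advice = []
        if data["smile_score"] < CLOUD_CONDITIONS["smile_score_min"]:
            advice.append("Smile more.")
        if data["heartbeat_bpm"] > CLOUD_CONDITIONS["heartbeat_bpm_max"]:
            advice.append("Slow your breathing to lower your heart rate.")
        advice_by_device[device_id] = advice
    return advice_by_device

print(server_compare({"40a": {"smile_score": 0.5, "heartbeat_bpm": 110},
                      "40b": {"smile_score": 0.9, "heartbeat_bpm": 72}}))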

[0054] In one more embodiment (figure 4), the system (120) is having a voice receiving unit (80) for receiving a voice input of the user (200). The voice receiving unit (80) can be a microphone or a sound transducer. The system (120) can be configured in the electronic device (40). The voice receiving unit (80) can be a microphone of the electronic device (40). In an alternative embodiment, the voice receiving unit (80) is embedded in the wearable device (20). The voice receiving unit (80) is connected to the processor (30) to provide voice input information. The processing unit (30) compares the voice input with predefined conditions of voice according to various emotional states of the user to deduce the emotional state of the user (200) and generates advice instructions (90) according to the predefined set of instructions therein.

[0055] Some examples of the emotional states can be "happy", "sad", "fear", "spunky", and the like. One of the instructions can be to display contents such as music or images according to the emotional state of the user (200). The processing unit (30) actuates the output unit (60) to display the contents such as music or images according to the emotional state of the user (200) through the output unit (60). Similarly, various conditions of emotions and instructions can be defined in the memory unit (50) of the system (120).
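A minimal sketch, assuming hypothetical content labels, of the voice-based embodiment described above: the deduced emotional state is mapped to content to be shown through the output unit. The emotion labels follow the examples in the text; the content choices are illustrative assumptions:

CONTENT_BY_EMOTION = {
    "happy": {"music": "upbeat_playlist",   "images": "recent_highlights"},
    "sad":   {"music": "calming_playlist",  "images": "favourite_memories"},
    "fear":  {"music": "soothing_playlist", "images": "breathing_guide"},
}

def content_for_emotion(emotional_state: str) -> dict:
    # Fall back to neutral content when the emotional state is not recognized.
    return CONTENT_BY_EMOTION.get(emotional_state,
                                  {"music": "neutral_playlist", "images": "none"})

print(content_for_emotion("sad"))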

[0056] The present invention has the advantage of providing the system (100, 110, 110b, 120) for monitoring physical activities and biomarkers of the user (200). The system (100, 110, 110b, 120) helps the user (200) to improve the body posture and the psychological and physiological behaviour of the individual (200). Further, the system (100, 110, 110b, 120) is easy to operate and simple in construction. Furthermore, the system (100, 110, 110b, 120) provides an audio-visual alert to the users (200a, 200b, 200c) to correct body movement and facial expressions. Also, the system (100, 110, 110b, 120) provides an audio-visual alert to the users (200a, 200b, 200c) to regulate the breathing rate or heart rate, thereby enhancing individual wellness.

[0057] The foregoing descriptions of specific embodiments of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or limit the present invention to the precise forms disclosed, and obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described to best explain the principles of the present invention and its practical application, and to thereby enable others skilled in the art to best utilize the present invention and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such omissions and substitutions are intended to cover the application or implementation without departing from the scope of the claims of the present invention.