Title:
EMOTION DETECTION SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2020/213007
Kind Code:
A1
Abstract:
The present invention relates to a system and method for emotion detection. The system includes a server configured to receive at least one data input of a person from a computing device, a scanning device configured to scan the data input to determine an emotion quotient of the person, a processor coupled to an artificial intelligence engine configured to detect an emotion on a face of the person, and a controller encoded with instructions enabling the controller to auto-generate at least one data model employed to detect the emotion of the person based on polarity of the at least one data input.

Inventors:
GROVER RAJAT INDERKUMAR (IN)
JAITEG SINGH (IN)
Application Number:
PCT/IN2020/050353
Publication Date:
October 22, 2020
Filing Date:
April 14, 2020
Assignee:
GROVER RAJAT INDERKUMAR (IN)
International Classes:
G06K9/60; G06K9/62; G06Q10/06; G06Q99/00
Foreign References:
KR20180054407A (2018-05-24)
US20130216126A1 (2013-08-22)
KR20140037777A (2014-03-27)
Attorney, Agent or Firm:
BAGGA, Rahul (IN)
Claims:
WE CLAIM:

1. A system (100) for emotion detection of a user, the system comprising:

a server (110) communicably coupled to at least one computing device (120) of the user, said server being configured to receive at least one data input of the user from the at least one computing device (120) of the user;

a scanning device (130) operably coupled to the at least one computing device (120) of the user, said scanning device being configured to determine an emotion quotient of the user;

a processor (140) coupled to an artificial intelligence engine (150) configured to detect an emotion on a face of the user; and

a controller (160) configured to auto-generate at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

2. The system as claimed in claim 1, wherein the system is configured to:

detect an emotion on a face of the user from a multiplicity of images of the user collected by a database (170), wherein the system is further configured to compare the collected images with a standard library stored in the database (170).

3. The system as claimed in claim 1, wherein the system is configured to:

detect an emotion of the user from a data input (in text format) of the user in at least one journal maintained by the user, and

perform sentiment analysis at a sentence level and output a value between [-1, +1], where -1 depicts a sad, 0 depicts a neutral and +1 depicts a happy emotional state.

4. The system as claimed in claim 3, wherein the system is configured to:

calculate a polarity value of {P1, P2, ..., Pn} for 'n' sentences of the data input in the at least one journal maintained by the user,

calculate a Textual polarity of the data input (in text format), Tpol = [(P1 + P2 + ... + Pn)/n]; and

scan and detect an emotion of the user from an uploaded image, assign an emotion percentage and calculate a Photo polarity, Ppol = [Max(Emotion percentage)/100].

5. The system as claimed in claim 1, wherein the system is configured to:

detect an emotion of the user from a frame-by-frame analysis of a video data input, wherein the system compares an uploaded/pre-stored picture of the user with the video data input, and calculates emotive scores for the video containing "N" frames by:

Emotive Score (ith emotion class) = [Σ over the N frames (Pi × Wi)] / N,

where Pi = probability and Wi = weight factor for a frame belonging to the ith emotion class.

6. A method for emotion detection of a user, the method comprising:

receiving, by a server (110) communicably coupled to at least one computing device (120) of the user, at least one data input of the user from the at least one computing device (120) of the user;

determining, by a scanning device (130) operably coupled to the at least one computing device (120) of the user, an emotion quotient of the user;

detecting, by a processor (140) coupled to an artificial intelligence engine (150), an emotion on a face of the user; and

generating, by a controller (160), at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

7. The method as claimed in claim 6, wherein the method comprises:

detecting an emotion on a face of the user from a multiplicity of images of the user collected by a database (170), wherein the method further comprises comparing the collected images with a standard library stored in the database (170).

8. The method as claimed in claim 6, wherein the method comprises:

detecting an emotion of the user from a data input (in text format) of the user in at least one journal maintained by the user, and

performing sentiment analysis at a sentence level to output a value between [-1, +1], where -1 depicts a sad, 0 depicts a neutral and +1 depicts a happy emotional state.

9. The method as claimed in claim 8, wherein the method comprises:

calculating a polarity value of {P1, P2, ..., Pn} for 'n' sentences of the data input in the at least one journal maintained by the user,

calculating a Textual polarity of the data input (in text format), Tpol = [(P1 + P2 + ... + Pn)/n]; and

scanning and detecting an emotion of the user from an uploaded image, assigning an emotion percentage and calculating a Photo polarity, Ppol = [Max(Emotion percentage)/100].

10. The method as claimed in claim 6, wherein the method comprises:

detecting an emotion of the user from a frame-by-frame analysis of a video data input,

comparing an uploaded/pre-stored picture of the user with the video data input, and

calculating emotive scores for the video containing "N" frames by:

Emotive Score (ith emotion class) = [Σ over the N frames (Pi × Wi)] / N,

where Pi = probability and Wi = weight factor for a frame belonging to the ith emotion class.

11. A non-transitory computer readable storage medium storing one or more programs for execution by the one or more processors of a system for emotion detection of a user, the one or more programs including instructions for:

receiving, by a server (110) communicably coupled to at least one computing device (120) of the user, at least one data input of the user from the at least one computing device (120) of the user;

determining, by a scanning device (130) operably coupled to the at least one computing device (120) of the user, an emotion quotient of the user;

detecting, by a processor (140) coupled to an artificial intelligence engine (150), an emotion on a face of the user; and

generating, by a controller (160), at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

12. The non-transitory computer readable storage medium as claimed in claim 11, wherein the one or more programs include instructions for:

detecting an emotion on a face of the user from a multiplicity of images of the user collected by a database (170), wherein the collected images are compared with a standard library stored in the database (170);

detecting an emotion of the user from a data input (in text format) of the user in at least one journal maintained by the user;

performing sentiment analysis at a sentence level to output a value between [-1, +1], where -1 depicts a sad, 0 depicts a neutral and +1 depicts a happy emotional state;

calculating a polarity value of {P1, P2, ..., Pn} for 'n' sentences of the data input in the at least one journal maintained by the user;

calculating a Textual polarity of the data input (in text format), Tpol = [(P1 + P2 + ... + Pn)/n];

scanning and detecting an emotion of the user from an uploaded image, assigning an emotion percentage and calculating a Photo polarity, Ppol = [Max(Emotion percentage)/100];

detecting an emotion of the user from a frame-by-frame analysis of a video data input;

comparing an uploaded/pre-stored picture of the user with the video data input; and

calculating emotive scores for the video containing "N" frames by:

Emotive Score (ith emotion class) = [Σ over the N frames (Pi × Wi)] / N,

where Pi = probability and Wi = weight factor for a frame belonging to the ith emotion class.

13. A computer program product having a non-transitory computer readable storage medium storing one or more programs for execution by the one or more processors of a system for emotion detection of a user, the one or more programs including instructions for:

receiving, by a server (110) communicably coupled to at least one computing device (120) of the user, at least one data input of the user from the at least one computing device (120) of the user;

determining, by a scanning device (130) operably coupled to the at least one computing device (120) of the user, an emotion quotient of the user;

detecting, by a processor (140) coupled to an artificial intelligence engine (150), an emotion on a face of the user; and

generating, by a controller (160), at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

14. The computer program product as claimed in claim 13, wherein the one or more programs include instructions for:

detecting an emotion on a face of the user from a multiplicity of images of the user collected by a database (170), wherein the collected images are compared with a standard library stored in the database (170);

detecting an emotion of the user from a data input (in text format) of the user in at least one journal maintained by the user;

performing sentiment analysis at a sentence level to output a value between [-1, +1], where -1 depicts a sad, 0 depicts a neutral and +1 depicts a happy emotional state;

calculating a polarity value of {P1, P2, ..., Pn} for 'n' sentences of the data input in the at least one journal maintained by the user;

calculating a Textual polarity of the data input (in text format), Tpol = [(P1 + P2 + ... + Pn)/n];

scanning and detecting an emotion of the user from an uploaded image, assigning an emotion percentage and calculating a Photo polarity, Ppol = [Max(Emotion percentage)/100];

detecting an emotion of the user from a frame-by-frame analysis of a video data input;

comparing an uploaded/pre-stored picture of the user with the video data input; and

calculating emotive scores for the video containing "N" frames by:

Emotive Score (ith emotion class) = [Σ over the N frames (Pi × Wi)] / N,

where Pi = probability and Wi = weight factor for a frame belonging to the ith emotion class.

Description:
1. TITLE OF THE INVENTION: EMOTION DETECTION SYSTEM AND METHOD

3. PREAMBLE OF THE DESCRIPTION: The following complete specification particularly describes the invention and the manner in which it is performed.

FIELD OF INVENTION

[0001] The present invention generally relates to human emotions. More particularly, the invention relates to a system and a method for emotion detection.

BACKGROUND OF THE INVENTION

[0002] The ability to identify faces and related emotions quickly and effortlessly exists naturally within humans. It develops over several years from childhood, enabling human beings to identify and recognize the emotions of others.

[0003] Over the years, it has been a great challenge to build gadgets and electronic systems that can replicate such skills, even in changing environments. Various attempts in the past have depended on image processing systems and identification algorithms to achieve the intelligence needed for such identification and detection. However, these systems have not been as successful or as widely applied as would be desired.

[0004] Further, the existing automated detection systems used to determine emotions are extremely inaccurate due to their heavy dependency on signal processing and pattern recognition techniques. One of the major challenges is that expressive human behaviour is highly variable and depends on a number of factors.

[0005] In view of the foregoing, there is a need for an improved method and system that overcome the shortcomings associated with the prior art.

SUMMARY

[0006] The following presents a simplified summary in order to provide a basic understanding of some aspects. This summary is not an extensive overview. It is not intended to identify the key/critical elements or to delineate the scope. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description presented later.

[0007] In accordance with an aspect, a system for emotion detection of a user is provided. The system comprises a server communicably coupled to at least one computing device of the user, said server being configured to receive at least one data input of the user from the at least one computing device of the user; a scanning device operably coupled to the at least one computing device of the user, said scanning device being configured to determine an emotion quotient of the user; a processor coupled to an artificial intelligence engine configured to detect an emotion on a face of the user; and a controller configured to auto-generate at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

[0008] In accordance with another aspect, a method for emotion detection of a user is provided. The method comprises the steps of receiving, by a server communicably coupled to at least one computing device of the user, at least one data input of the user from the at least one computing device of the user; determining, by a scanning device operably coupled to the at least one computing device of the user, an emotion quotient of the user; detecting, by a processor coupled to an artificial intelligence engine, an emotion on a face of the user; and generating, by a controller, at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

[0009] In accordance with another aspect, a non-transitory computer readable storage medium storing one or more programs for execution by the one or more processors of a system for emotion detection of a user is provided. The one or more programs include instructions for receiving, by a server communicably coupled to at least one computing device of the user, at least one data input of the user from the at least one computing device of the user; determining, by a scanning device operably coupled to the at least one computing device of the user, an emotion quotient of the user; detecting, by a processor coupled to an artificial intelligence engine, an emotion on a face of the user; and generating, by a controller, at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

[0010] In accordance with yet another aspect, a computer program product having a non-transitory computer readable storage medium storing one or more programs for execution by the one or more processors of a system for emotion detection of a user is provided. The one or more programs include instructions for: receiving, by a server communicably coupled to at least one computing device of the user, at least one data input of the user from the at least one computing device of the user; determining, by a scanning device operably coupled to the at least one computing device of the user, an emotion quotient of the user; detecting, by a processor coupled to an artificial intelligence engine, an emotion on a face of the user; and generating, by a controller, at least one data model employed to detect the emotion of the user based on polarity of the at least one data input.

[0011] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the concept thereof, and the embodiments herein include all such modifications.

BRIEF DESCRIPTION OF THE DRAWINGS

[0012] The other objects, features and advantages will occur to those skilled in the art from the following description of the preferred embodiment and the accompanying drawings in which:

[0013] Fig. 1 shows an emotion detection system in accordance with an embodiment of the present invention.

[0014] Fig. 2 shows a user interface displaying emotion of a person in accordance with an embodiment of the present invention.

[0015] Fig. 2a shows a user interface displaying plurality of happy emoticons to receive input from a user about his/her emotion in accordance with an embodiment of the present invention.

[0016] Fig. 2b shows a user interface displaying plurality of neutral emoticons to receive input from a user about his/her emotion in accordance with an embodiment of the present invention.

[0017] Fig. 2c shows a user interface displaying plurality of sad emoticons to receive input from a user about his/her emotion in accordance with an embodiment of the present invention.

[0018] Fig. 2d shows a user interface displaying plurality of distinct emoticons related to happiness to receive input from a user about his/her emotion in accordance with an embodiment of the present invention.

[0019] Fig. 3 shows a flowchart depicting a method of emotion detection in accordance with an embodiment of the present invention.

DETAILED DESCRIPTION

[0020] Various embodiments of the present invention provide a system and method for emotion detection. The following description provides specific details of certain embodiments illustrated in the drawings in order to provide a thorough understanding of those embodiments. It should be recognized, however, that the present invention can be reflected in additional embodiments and may be practiced without some of the details included in the following description.

[0021] The various embodiments including the example embodiments will now be described more fully with reference to the accompanying drawings, in which the various embodiments of the invention are shown. The invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. In the drawings, the sizes of components may be exaggerated for clarity.

[0022] It will be understood that when an element or layer is referred to as being "on," "connected to," or "coupled to" another element or layer, it can be directly on, connected to, or coupled to the other element or layer, or intervening elements or layers may be present. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.

[0023] Spatially relative terms, such as "image processor," "camera," and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the structure in use or operation in addition to the orientation depicted in the figures.

[0024] Embodiments described herein will refer to plan views and/or cross-sectional views by way of ideal schematic views. Accordingly, the views may be modified depending on simplistic assembling or manufacturing technologies and/or tolerances. Therefore, example embodiments are not limited to those shown in the views but include modifications in configurations formed on basis of assembling process. Therefore, regions exemplified in the figures have schematic properties and shapes of regions shown in the figures exemplify specific shapes or regions of elements, and do not limit the various embodiments including the example embodiments.

[0025] The subject matter of example embodiments, as disclosed herein, is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different features or combinations of features similar to the ones described in this document, in conjunction with other technologies. Generally, the various embodiments including the example embodiments relate to a system and method for emotion detection.

[0026] In an embodiment, the present invention provides a system for emotion detection. The system is configured to calculate an emotion quotient for happiness, termed the Happiness Quotient (HQ) score, for every person based on a probability algorithm. The data inputs to the system include text data, image data, video data, or a selection of an emotion from emoticons by the person/user.

[0027] Referring to Fig. 1, the system 100 for emotion detection is shown in accordance with an embodiment of the present invention. The system 100 includes a server 110 configured to receive at least one data input of a person from a computing device 120, a scanning device 130 configured to scan the data input to determine an emotion quotient of the user, a processor 140 coupled to an artificial intelligence engine 150 configured to detect an emotion on a face of the person, and a controller 160 encoded with instructions enabling the controller 160 to auto-generate at least one data model employed to detect the emotion of the person based on polarity of the at least one data input.

[0028] In an exemplary embodiment, the processor 140 is configured to process the collected images by a database 170. In an embodiment, the processor 140 is configured to compare the collected images with a standard library stored in the database 170. Further, in an embodiment, the processor 140 is configured to use the artificial intelligence engine 150 to process the images.

[0029] In an embodiment, the processor 140 of the present invention is implemented in the form of one or more processors and may be implemented as appropriate in hardware, computer-executable instructions, firmware, or combinations thereof. Computer-executable instruction or firmware implementations of the processing module may include computer-executable or machine-executable instructions written in any suitable programming language to perform the various functions described.

[0030] In an embodiment, the system of the present invention includes memory devices that may include a permanent memory such as hard disk drive, may be configured to store data, and executable program instructions that are implemented by the processor. The memory devices may be implemented in the form of a primary and a secondary memory. The memory devices may store additional data and program instructions that are loadable and executable on the processor, as well as data generated during the execution of these programs. Further, the memory devices may be a volatile memory, such as a random-access memory and/or a disk drive, or a non-volatile memory. The memory devices may comprise of removable memory such as a Compact Flash card, Memory Stick, Smart Media, Multimedia Card, Secure Digital memory, or any other memory storage that exists currently or may exist in the future.

[0031] In an embodiment, the system of the present invention includes a plurality of journals where a registered user can write and upload data. The invention includes a colour theme to identify each journal, such as a Gratitude Journal, Personal Journal, Learning Journal or Wellness Journal.

[0032] In an embodiment, the system includes a journal-within-journal concept under which a registered user maintains any number of sub-journals within each of the plurality of journals through categories.

[0033] In one embodiment, the system of the invention includes an advanced search functionality through which the registered user can search any data/journal in his/her journals, making data retrieval very easy. Students can also make their notes online and download them for their studies whenever required.

[0034] In another embodiment, the system of the invention enables the user to edit, download or email a journal (along with attachments) as per requirement.

[0035] In yet another embodiment, the system of the invention enables entities to interact with the system for uploading data related to the users in the user journals, and the users can access these journals anytime. The user can upload data in the learning journal, and companies/institutes can also upload data, which the user can access throughout their life by keeping their account active on the website/app.

[0036] In an embodiment, the entities include companies, wellness centres, pathology labs and doctors that can upload data or reports in the user's wellness journal, where they can be easily accessed by the users. Since the users as well as the respective companies upload data into the user account, when the user visits a doctor or concerned person, he/she can show all their wellness data, medical and other reports to the doctor/concerned person online through the website or app of the present system.

[0037] In an embodiment, when a user submits their journal, the probability algorithm performs sentiment analysis at a sentence level and outputs a value between [-1, +1], with -1 depicting a sad, 0 a neutral and +1 a happy emotional state. A plurality of emoticons of varying degree weight, depending upon the textual polarity ratio of the emotion state, is then displayed, and the user needs to select and submit the one that best suits his/her emotions in that journal, as sketched below.
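A minimal sketch of this sentence-level scoring, assuming the open-source TextBlob library as the sentiment back-end (the patent names no particular library; TextBlob's per-sentence polarity already lies in [-1, +1]) and illustrative emoticon thresholds of +/-0.25 that are not taken from the source:

    from textblob import TextBlob

    def textual_polarity(journal_text):
        """Score each sentence in [-1, +1] and average (the Tpol of claim 4)."""
        polarities = [s.sentiment.polarity for s in TextBlob(journal_text).sentences]
        return sum(polarities) / len(polarities)

    def emoticon_bucket(tpol):
        """Map the textual polarity to the emoticon group shown to the user."""
        if tpol > 0.25:
            return "happy emoticons"
        if tpol < -0.25:
            return "sad emoticons"
        return "neutral emoticons"

    entry = "Work went well today. I felt a little tired in the evening."
    print(emoticon_bucket(textual_polarity(entry)))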

[0038] In an embodiment, the system includes an image data input where the user uploads their selfie/photo in their journal, or the user may simply upload an image. Once the user submits their selfie/photo, the system identifies whether or not it contains a human being. The system then matches the selfie with the profile photo to identify the user's face in the photo/group photo. Once the user is identified using his/her profile photo, the image is scanned for emotions, namely Disgust, Surprised, Anger, Happy and Sad, and an Emotion percentage (%) is assigned to each emotion based on its probability. The photo polarity is the maximum percentage of emotion that will be factored. The system includes an image processing apparatus configured to capture an image of the person to identify the face and determine the emotion of the person. In another embodiment, the personality of the user is determined from the photo based on the system/method as disclosed herein. A minimal sketch of the face-matching step follows below.

[0039] In an embodiment, the system includes a video data input where the user can upload a video in their journal or may upload the video directly. Once the user submits their video, the system identifies whether or not it contains a human being. The system then matches the video with the profile photo to identify the user's face in the group. Once the user is identified using his/her profile photo, the video is broken into a number of sub-clips and is scanned for emotions, namely Disgust, Neutral, Surprised, Anger, Happy and Sad, and an Emotion percentage (%) is assigned to each emotion based on its probability. The video polarity is the maximum percentage of emotion that will be factored. In another embodiment, a probable depression symptoms score of the user is determined from the video based on the system/method as disclosed herein.
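The profile-photo matching described in paragraphs [0038]-[0039] could be sketched with the open-source face_recognition library (an assumption; the patent does not name a matcher):

    import face_recognition

    def find_user_face(profile_path, group_path, tolerance=0.6):
        """Locate the journal owner's face in an uploaded photo/group photo
        by comparing encodings against the stored profile photo."""
        profile = face_recognition.load_image_file(profile_path)
        profile_enc = face_recognition.face_encodings(profile)[0]

        group = face_recognition.load_image_file(group_path)
        locations = face_recognition.face_locations(group)
        encodings = face_recognition.face_encodings(group, locations)

        for box, enc in zip(locations, encodings):
            if face_recognition.compare_faces([profile_enc], enc, tolerance)[0]:
                return box  # (top, right, bottom, left) of the matched face
        return None  # no user match found in the upload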

[0040] In an embodiment, the system detects the emotion of the user from a frame-by-frame analysis of a video data input. An Emotion Classifier, EmoCla, takes an image as input (EmoCla(frame)) and returns a probability set over six emotions in the sequence {"Happy", "Sad", "Surprised", "Anger", "Neutral", "Disgust"}. The probability set contains the probability scores returned by EmoCla for that particular image. For example, a probability set P = {0.1, 0.4, 0.1, 0.1, 0.0, 0.3} represents that the probability of this particular frame being categorized as happy is 0.1, sad is 0.4, surprised is 0.1, angry is 0.1, neutral is 0.0 and disgust is 0.3.

[0041] The video data input 'V' is converted to frames based on "Scene Similarity" and is divided into a frame set F = [f1, f2, ..., fi, ..., fn]:

If SceneSimilarity(fi, fi+1) > 0.5: save fi+1 in F; Else: discard fi+1 from F

For each frame f in F:

    Probability set P = EmoCla(f)

    Pi = probability of the frame belonging to the ith emotion class; f(x) = arg max Pi(x)

    Normalize the probability set:

    fi = {P(ha)i, P(sa)i, P(su)i, P(an)i, P(ne)i, P(di)i}

    Pmax = max(fi)

    P2ndmax = max(fi - Pmax)

    Initialize the weight set Wi: {W(ha)i, W(sa)i, W(su)i, W(an)i, W(ne)i, W(di)i} = 0

    [where: ha = happy; sa = sad; su = surprised; an = anger; ne = neutral; di = disgust]

    If Pmax > 0.5:

        If (Pmax - P2ndmax) > 0.2:

            Max_Emo = Find_emotion(Pmax) // Find_emotion returns the emotion corresponding to the Pmax value

            2ndmax_emo = Find_emotion(P2ndmax)

            Set Wmax_emo = 1 and set all remaining weights to 0

        Else

    Else

The final emotive scores for a video containing "N" frames are calculated by:

Emotive Score (ith emotion class) = [Σ over the N frames (Pi × Wi)] / N
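The weighting logic of paragraph [0041] can be sketched in Python as follows. EmoCla is assumed to be any classifier returning the six-emotion probability set; because the source leaves both Else branches empty, this sketch simply leaves the weights at zero in those cases, and the final division by N matches the reconstructed score formula above:

    import numpy as np

    EMOTIONS = ["Happy", "Sad", "Surprised", "Anger", "Neutral", "Disgust"]

    def frame_weights(p):
        """Assign a weight set to one frame from its six-emotion probability
        set, following the thresholds in paragraph [0041]."""
        p = np.asarray(p, dtype=float)
        p = p / p.sum()                      # normalize the probability set
        w = np.zeros_like(p)                 # initialize all weights to 0
        p_max = p.max()
        p_2ndmax = np.sort(p)[-2]
        if p_max > 0.5 and (p_max - p_2ndmax) > 0.2:
            w[p.argmax()] = 1.0              # dominant emotion gets weight 1
        # The source's Else branches are unspecified; weights stay 0 here.
        return w

    def emotive_scores(prob_sets):
        """Final per-emotion scores for a video of N retained frames:
        Score_i = [sum over frames of (P_i * W_i)] / N (reconstructed form)."""
        n = len(prob_sets)
        total = np.zeros(len(EMOTIONS))
        for p in prob_sets:
            p = np.asarray(p, dtype=float)
            total += p * frame_weights(p)
        return dict(zip(EMOTIONS, total / n))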

[0042] In an embodiment, the system includes selection of emotions from emoticons, where the user can select his/her emotions from the emoticons related to happiness.

[0043] In an embodiment, the system includes determination of the Happiness Quotient (HQ) score of the user by calculating the score on the basis of the data inputs. The happiness quotient score is the average overall polarity after scoring all the scores for a particular day for that user, as sketched below.

[0044] In an embodiment, the system may be used by educational institutes to know the HQ by class, batch, location, region, etc. In another embodiment, the system may be used by corporates to determine the HQ by department, reporting authority, location, region, etc. In yet another embodiment, it may be used in the hospitality industry to determine the HQ of consumers and staff. In an embodiment, the HQ is linked with well-being, studies, performance, absenteeism, relationships, and general satisfaction such as job satisfaction. In an embodiment, the system is used in the hotel industry or similar industries, such as restaurants, bed and breakfasts, and short-term accommodation providers, to determine the HQ of customers and other people. In yet another embodiment, it can be used on TikTok-style videos to determine the emotions of people.
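A minimal sketch of the daily HQ aggregation described in paragraph [0043], assuming each day's inputs (Tpol, Ppol, video polarity, emoticon selection) have already been scored on a common polarity scale:

    from statistics import mean

    def happiness_quotient(day_scores):
        """HQ score for one user on one day: the average overall polarity
        of all the scores recorded that day."""
        return mean(day_scores)

    # Example: a journal entry (Tpol), a photo (Ppol) and an emoticon pick.
    print(happiness_quotient([0.4, 0.72, 1.0]))  # -> approximately 0.71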

[0045] In an embodiment, the HQ scores of several organisations and business entities may be published for users and the public at large, enabling organisations/entities to be compared through several rankings based on HQ.

[0046] In an embodiment, the system may be linked with a self-help system to provide self-help/assistance or counselling via online chats, video calls, and voice calls.

[0047] The present invention provides a method of emotion detection where the happiness quotient is calculated. The method includes performing sentiment analysis at a sentence level (text/journal) and outputting a value between [-1, +1], where -1 depicts a sad, 0 a neutral and +1 a happy emotional state. Assuming each journal written by a user is composed of 'n' sentences, each sentence is allotted a polarity value from {P1, P2, ..., Pn}.

[0048] In an embodiment, a plurality of emoticons of varying degree weight, depending upon a textual polarity ratio of the emotion state, is displayed to the person for selecting the one that best suits his/her emotions in that journal.

[0049] In an embodiment, the textual polarity, Tpol = (P1 + P2 + ... + Pn)/n, wherein P is a polarity value and 'n' is the number of sentences in the text data.
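For example, under these definitions, a three-sentence journal entry with sentence polarities P1 = 0.8, P2 = -0.2 and P3 = 0.6 yields Tpol = (0.8 - 0.2 + 0.6)/3 = 0.4, i.e. a mildly happy entry.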

[0050] In another embodiment, a photo polarity of the image is calculated: Photo polarity of a blog, Ppol = [Happy Emotion percentage (%)]/100. The user has the option to upload his/her image while writing the journal. The image will be scanned for emotions, namely Disgust, Surprised, Anger, Happy and Sad, and an Emotion percentage (%) will be assigned to each emotion based on its probability, as shown on the interface view 200 of Fig. 2. A minimal sketch of this step follows below.
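A minimal sketch of the photo-polarity step, assuming an upstream classifier (not specified in the source) has already assigned the per-emotion percentages; claim 4 factors the maximum percentage, of which the happy percentage of this paragraph is the special case:

    def photo_polarity(emotion_percentages):
        """Ppol = Max(Emotion percentage)/100 (claim 4); the emotion
        holding the maximum percentage is the one factored in."""
        top = max(emotion_percentages, key=emotion_percentages.get)
        return top, emotion_percentages[top] / 100.0

    # Example: percentages assigned by the scanner for one uploaded image.
    scores = {"Happy": 72.0, "Sad": 10.0, "Anger": 3.0,
              "Surprised": 9.0, "Disgust": 6.0}
    print(photo_polarity(scores))  # -> ('Happy', 0.72)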

[0051] In an exemplary embodiment, the invention provides a functionality where a user gets 1,000 welcome points in his/her account. These points are added to or subtracted from the user's account depending upon their journal entries: all positive entries carry positive points and all sad entries carry negative points.

[0052] In an embodiment, the invention provides a counselling functionality where the user can take counselling through a Life Coach, Image Consultant or qualified counsellor. The Life Coach/Counsellor accesses the user's HQ data after his/her approval and then conducts the counselling on the basis of the user's HQ data, making it data-driven counselling. Post-counselling comments of the counsellor are stored in the user's account for future reference.

[0053] In an embodiment, an electronic user interface of the computing device is configured to display data in a calendar manner, making it easy to view and review as per requirement. The interface is configured to display content and instruct on the scanning and recognition process. The display device may also be used to receive an input from a user/operator. The display device may be of any display type known in the art, for example, Liquid Crystal Displays (LCD), Light Emitting Diode (LED) Displays, Cathode Ray Tube (CRT) Displays, Orthogonal Liquid Crystal Displays (OLCD) or any other type of display currently existing or which may exist in the future.

[0054] In an embodiment, the HQ score is calculated on a daily basis for the employees of a company and is shown per reporting authority, department, location, state, country, etc. to the Human Resource department/concerned department. Depending upon the score patterns/data analysis, the Human Resource department/concerned department can plan the required interventions to increase the HQ score, or understand the reason for a low HQ score and then take the required remedial steps. This will help in proactive identification of issues (if any) by the Human Resource department/concerned department. The HQ score can also be linked to performance, where the system will be able to show that happy employees are more productive, and the required interventions can be planned for the employees with a low HQ score. As there is a continuous collection of data, the Human Resource department/concerned department will be able to identify which interventions have worked for which set of employees or departments. The system provides life coach/psychologist support as per the requirement. The system helps organisations know the HQ score of their employees and publish the HQ score of the company. This HQ score is a very important factor when attracting candidates for an open position (if the company HQ score is published) and when promoting a person to a higher role. This score can identify whether employees are happy under a particular leader or not. New candidates can view the HQ scores of companies and then decide whether they would like to join.

[0055] In an exemplary embodiment, the HQ score is calculated on a daily basis for the students of educational institutes and, depending upon the score patterns/data analysis, the respective department can plan the required interventions for students with a low HQ score. The system provides life coach/psychologist support as per the requirement. This will help to identify early signs of issues like depression and unhappiness in students, and proactive efforts/interventions at the right time will help curb such issues in the long run. The HQ is calculated lecture-wise, hostel-wise, location-wise, etc., and on the basis of the data analysis the required interventions can be planned. The HQ score can also be linked to the performance of students, where happy students will be seen to be more productive, attentive and proactive, and the required interventions can be planned for the students with a low HQ score. As there is a continuous collection of data, the respective authorities will be able to identify which interventions have worked for which set of students and what other interventions are required for other students.

[0056] In an advantageous aspect, the system and method of the present invention focus on helping educational institutes know the HQ scores of their students, and finally the system will publish the HQ score of the institute. This HQ score is a very important factor during admission (if the institute's HQ score is published), as students and their parents will factor in the HQ score while selecting the institute, since all parents want their children to be happy in whichever institute they study.

[0057] In an embodiment, the system uses artificial intelligence, machine learning and deep learning for emotion detection, based on the processing of data. The emotion on the face is detected using at least one data model of a neural network. The data model is a trained data model configured for identifying emotion based on polarity of the at least one data input. Those skilled in the art will appreciate that the machine learning will improve in efficacy over time as it is trained on more data.
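As one hedged illustration of this data-model step, assuming a trained Keras convolutional network with a six-way softmax output (the patent names no framework, architecture, or file; the model path and input size below are hypothetical):

    import numpy as np
    import tensorflow as tf

    EMOTIONS = ["Happy", "Sad", "Surprised", "Anger", "Neutral", "Disgust"]

    # Hypothetical path to a trained emotion model; not specified in the source.
    model = tf.keras.models.load_model("emotion_cnn.h5")

    def detect_emotion(face_crop):
        """face_crop: an HxWx3 uint8 array holding one detected face.
        Returns the six-emotion probability set as a dict."""
        x = face_crop[np.newaxis, ...].astype("float32")
        x = tf.image.resize(x, (48, 48)) / 255.0   # assumed input resolution
        probs = model.predict(x, verbose=0)[0]
        return dict(zip(EMOTIONS, probs.tolist()))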

[0058] In an embodiment, the system of the present invention includes at least one communication device configured to provide an interface between the face recognition and emotion detection components and remote networks. The communication device may include a modem, a network interface card (such as an Ethernet card), a communication port, and a Personal Computer Memory Card International Association (PCMCIA) slot, among others. The communication device may include devices supporting both wired and wireless protocols. Data in the form of electronic, electromagnetic and optical signals, among others, may be transferred via the communication device to remote networks or servers.

[0059] Further, the network environment includes the face detection and emotion detection system connected to various scanning devices via a network. The network may include, but is not restricted to, a communication network such as the Internet, PSTN, Local Area Network (LAN), Wide Area Network (WAN), Metropolitan Area Network (MAN), and so forth. In an embodiment, the network can be a data network such as the Internet. Further, the messages exchanged between scanning devices can comprise any suitable message format and protocol capable of communicating the information necessary for the face recognition system to utilize artificial intelligence.

[0060] In an exemplary embodiment the scanner or scanning device of the system includes a lens and a camera.

[0061] As would be apparent to a person having ordinary skill in the art, the afore-described methods and components may be provided in many variations, modifications or alternatives to existing methods and systems. The principles and concepts disclosed herein may also be implemented in various manners which may not have been specifically described herein but which are to be understood as encompassed within the scope of the appended claims.