Title:
WORK MANAGEMENT APPARATUS, METHOD, AND PROGRAM
Document Type and Number:
WIPO Patent Application WO/2018/158622
Kind Code:
A1
Abstract:
A line manufacturing control apparatus, method, and computer program are provided for controlling at least a part of a manufacturing line. The apparatus comprises an activity obtaining unit configured to obtain information indicating an activity of a worker during execution of an operation, by the worker, in the manufacturing line; a first estimation unit configured to estimate emotion and cognition of the worker using first learning data; a second estimation unit configured to estimate performance of the worker using second learning data; and a controller configured to control a functioning of the at least a part of the manufacturing line based on an estimation result of performance of the worker in the operation obtained by the second estimation unit. Apparatuses, methods, and programs are also disclosed for providing driving assistance and healthcare support. The state of a worker is constantly determined accurately and objectively, and an accurate estimate of worker performance is thus obtained. Vital sign measurement data and motion measurement data obtained from workers WK1, WK2, and WK3 during operation are used as primary indicators. The primary indicators and separately generated learning data are used to estimate the emotion and the cognition of each worker. The estimated emotion and cognition are used as secondary indicators. The secondary indicators and separately generated relational expressions are used to estimate the performance of each worker. The estimated performance can be used to control a manufacturing line.

Inventors:
KOTAKE YASUYO (JP)
NAKAJIMA HIROSHI (JP)
HATAKENAKA SATOKA (JP)
Application Number:
PCT/IB2017/055274
Publication Date:
September 07, 2018
Filing Date:
September 01, 2017
Assignee:
OMRON TATEISI ELECTRONICS CO (JP)
International Classes:
G06Q10/00; A61B5/16; A61B5/18
Domestic Patent References:
WO2006000166A1 2006-01-05
WO2009052633A1 2009-04-30
Foreign References:
US20140336473A1 2014-11-13
JP2011019921A 2011-02-03
US20090066521A1 2009-03-12
JP5530019B1 2014-06-25
JP2016252368A 2016-12-27
Claims:
CLAIMS

1. A line manufacturing control apparatus for controlling at least a part of a manufacturing line, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of a worker during execution of an operation, by the worker, in the

manufacturing line;

a first estimation unit configured to estimate emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and

a second estimation unit configured to estimate performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between the performance, and the emotion and the cognition of the worker in the operation,

a controller configured to control a functioning of the at least a part of the manufacturing line based on an estimation result of performance of the worker in the operation obtained by the second estimation unit.

2. A manufacturing line operation efficiency estimation apparatus for estimating performance in executing, by a worker, an operation within a

manufacturing line, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of the worker during the operation;

a first estimation unit configured to estimate emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and

a second estimation unit configured to estimate performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the worker in the operation.

3. A drive assisting apparatus for providing vehicle driving assistance, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of a subject during driving a vehicle;

a first estimation unit configured to estimate emotion and cognition of the subject during driving based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and

a second estimation unit configured to estimate performance of the subject in driving based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the driving,

a controller configured to provide driving assistance based on an estimation result of the performance of the subject in the driving obtained by the second estimation unit.

4. The apparatus according to claim 3, wherein the driving assistance is provided to at least one amongst the subject and the vehicle.

5. An apparatus for healthcare support of a subject, the apparatus comprising: an activity obtaining unit configured to obtain information indicating an activity of the subject when executing an operation;

a first estimation unit configured to estimate emotion and cognition of the subject when executing the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and

a second estimation unit configured to estimate performance of the subject in executing the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the operation,

a controller configured to provide the subject with a healthcare support feedback based on an estimation result of the performance of the subject in the operation obtained by the second estimation unit.

6. The apparatus for healthcare support of a subject according to claim 5, wherein executing an operation includes at least one amongst executing an interacting operation with a machine and performing a physical exercise.

7. An apparatus for handling performance in executing a task by a subject, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of the subject when executing the task;

a first estimation unit configured to estimate emotion and cognition of the subject when executing the task based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and a second estimation unit configured to estimate performance of the subject in executing the task based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the task.

8. The apparatus according to claim 7, further comprising

a controller configured to provide the subject with an intervention based on an estimation result of the performance of the subject in the task obtained by the second estimation unit.

9. A method for providing vehicle driving assistance, the method comprising steps of:

obtaining information indicating an activity of a subject during driving a vehicle;

estimating emotion and cognition of the subject during driving based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and estimating performance of the subject in driving based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the driving,

providing driving assistance based on an estimation result of the performance of the subject in the driving obtained by the second estimation unit.

10. The method according to claim 9, wherein the driving assistance is provided to at least one amongst the subject and the vehicle.

11. A method for healthcare support of a subject, the method comprising steps of:

obtaining information indicating an activity of the subject when executing an operation;

estimating emotion and cognition of the subject when executing the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and estimating performance of the subject in executing the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the operation,

providing the subject with a healthcare support feedback based on an estimation result of the performance of the subject in the operation obtained by the second estimation unit.

12. The method for healthcare support of a subject according to claim 11, wherein executing an operation includes at least one amongst executing an interacting operation with a machine and performing a physical exercise.

13. A method for handling performance in executing a task by a subject, the method comprising steps of:

obtaining information indicating an activity of the subject when executing the task;

estimating emotion and cognition of the subject when executing the task based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and estimating performance of the subject in executing the task based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the task.

14. The method according to claim 13, further comprising

providing the subject with an intervention based on an estimation result of the performance of the subject in the task obtained by the second estimation unit.

15. A work management apparatus for managing an operation performed by a worker in a system involving the operation performed by the worker, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of the worker during the operation;

a first estimation unit configured to estimate emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and

a second estimation unit configured to estimate performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the worker in the operation.

16. The work management apparatus according to claim 15, wherein the first learning data includes at least one first regression equation representing a relationship between the emotion and the activity of the worker, and the at least one first regression equation is generated with a correct value being information indicating an emotion self-reported by the worker and a variable being information indicating an activity of the worker obtained during a time period within which the self-reporting is performed.

17. The work management apparatus according to claim 16, wherein the at least one first regression equation representing the relationship between the emotion and the activity of the worker includes a first regression equation generated for emotional arousal, and a first regression equation generated for emotional valence when the information indicating the emotion self-reported by the worker includes the emotional arousal and the emotional valence.

18. The work management apparatus according to claim 15, wherein the first learning data includes a second regression equation representing a relationship between the cognition and the activity of the worker, and the second regression equation is generated with a correct value being information indicating a success or a failure of the operation by the worker and a variable being information indicating an activity of the worker obtained during a time period immediately before the success or the failure of the operation is obtained.

19. The work management apparatus according to claim 18, wherein the second regression equation representing the relationship between the cognition and the activity of the worker is generated with a correct value being information indicating a success or a failure of the operation by the worker and a variable being at least one of a feature quantity indicating hand movement of the worker extracted from the information indicating the activity of the worker and a feature quantity indicating eye movement of the worker extracted from the information indicating the activity of the worker.

20. The work management apparatus according to claim 15, wherein the second learning data includes at least one of a first relational expression for estimating a skill level of the worker based on a difference between a current secondary indicator and a past secondary indicator that are estimates of the emotion and the cognition of the worker, and a second relational expression for estimating a misoperation frequency of the worker based on variations among current secondary indicators and variations among past secondary indicators that are the estimates of the emotion and the cognition of the worker.

21. The work management apparatus according to any one of claims 15 to 20, further comprising

a controller configured to control an operation of the system based on an estimation result of the performance of the worker in the operation obtained by the second estimation unit.

22. The work management apparatus according to claim 21, wherein the controller divides the estimated performance in the operation into a plurality of classes based on at least one predetermined threshold, and controls the operation of the system based on control information preliminarily associated with each of the classes.

23. A work management method that is implemented by a work management apparatus for managing an operation performed by a worker in a system involving the operation performed by the worker, the method comprising:

obtaining information indicating an activity of the worker during the operation; estimating emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and estimating performance of the worker in the operation based on the estimated emotion and cognition used as secondary indicators, and second learning data indicating a relationship between the performance, and the emotion and the cognition of the worker in the operation.

24. A work management program enabling a processor to function as the units included in the work management apparatus according to any one of claims 15 to 22.

25. A computer program comprising instructions which, when executed on a computer, cause the computer to execute steps according to any of claims 9 to 14 and 23.

Description:
WORK MANAGEMENT APPARATUS, METHOD, AND PROGRAM

FIELD

[0001] The present invention relates to a work management apparatus, a method, and a program used in a system involving an operation performed by a worker. Further, the invention relates to a manufacturing line operation efficiency estimation apparatus, a line manufacturing control apparatus, a drive assisting apparatus, a healthcare support apparatus, and corresponding methods and programs.

BACKGROUND

[0002] For example, early detection of equipment malfunctions in various systems, such as production lines, is a key to preventing the operational efficiency from decreasing. A system has thus been developed for detecting a sign of an equipment malfunction by, for example, obtaining measurement data indicating the operating states of equipment from multiple sensors, and comparing the obtained measurement data with pre-generated learning data (refer to, for example, Patent Literature 1).

[0003] In a production line involving an operation performed by a worker, factors known to influence the productivity, or specifically the quality and the amount of production, include 4M (machines, methods, materials, and men) factors. Three of these factors, namely, machines, methods, and materials (3M), have been constantly improved and enhanced to increase the productivity. However, the factor "men" depends on the skill level, the aptitude, and the physical and mental states of a worker. Typically, a manager visually observes the physical and mental states of the worker, and provides an appropriate instruction for the worker to maintain and enhance the productivity.

CITATION LIST

PATENT LITERATURE

SUMMARY

TECHNICAL PROBLEM

[0005] However, this technique of observing the state of a worker relies on the experience or the intuition of the manager for accurate determination of the state of the worker affecting the workability. This technique may not always determine the state of the worker accurately.

[0006] In response to the above issue, one or more aspects of the invention are directed to a work management apparatus, a method, and a program that allow constantly accurate determination of the state of the worker affecting the workability without relying on the experience or the intuition of a manager. Further, prior art techniques have tried to improve safety in driving. However, known techniques do not accurately take into account the state of the driver, nor do they establish how to obtain and use this state in an objective and repeatable way in order to further improve safety. Still further, healthcare devices are known for supporting the healthcare of a person; however, these devices do not take into account the accurate state of the person, nor how this state can be obtained and used in an objective and repeatable way so as to contribute to an improvement of the person's health conditions.

SOLUTION TO PROBLEM

[0007] In response to the above issue as recognized by the inventors, a first aspect of the present invention provides a work management apparatus, a work management method, or a work management program for managing an operation performed by a worker in a system involving the operation performed by the worker. Managing the operation includes estimating performance of the worker in the operation, and controlling an operation of the system based on such estimation. In other words, any of the apparatus, method, or program is for use within a system involving the operation performed by the worker. Thus, an apparatus according to the present aspect includes a manufacturing line operation efficiency estimation apparatus for estimating performance in executing, by a worker, an operation within a manufacturing line, and a line manufacturing control apparatus for controlling at least a part of a manufacturing line.

The apparatus or the method includes an activity obtaining unit or process for obtaining information indicating an activity of the worker during the operation, a first estimation unit or process for estimating emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker, and a second estimation unit or process for estimating performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between the performance, and the emotion and the cognition of the worker in the operation.
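Purely as an illustration, and not the claimed implementation, the two-stage estimation of the first aspect can be sketched in Python as follows, assuming the first and second learning data have already been reduced to simple linear-regression coefficients; all names, array shapes, and the linear form are assumptions made for this sketch only.

import numpy as np

def estimate_emotion_cognition(activity_features, first_learning_data):
    """First estimation: map the primary indicator (activity features) to emotion and cognition."""
    x = np.append(np.asarray(activity_features, dtype=float), 1.0)  # add intercept term
    arousal = float(x @ first_learning_data["arousal"])
    valence = float(x @ first_learning_data["valence"])
    cognition = float(x @ first_learning_data["cognition"])
    return arousal, valence, cognition

def estimate_performance(arousal, valence, cognition, second_learning_data):
    """Second estimation: map the secondary indicators (emotion, cognition) to performance."""
    x = np.array([arousal, valence, cognition, 1.0])
    return float(x @ second_learning_data["performance"])

A controller could then compare the returned performance value with predetermined thresholds in order to decide how to control the system, as discussed for the seventh and eighth aspects below.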

[0008] In the apparatus, method, or program according to a second aspect of the present invention, the first learning data includes at least one first regression equation representing a relationship between the emotion and the activity of the worker. The at least one first regression equation is generated with a correct value being information indicating an emotion self-reported by the worker (or indirectly obtained by highly accurate measurement(s), see also further below) and a variable being information indicating an activity of the worker obtained during a time period within which the self-reporting is performed.

[0009] In the apparatus, method, or program according to a third aspect of the present invention, the at least one first regression equation representing the relationship between the emotion and the activity of the worker includes a first regression equation generated for emotional arousal, and a first regression equation generated for emotional valence when the information indicating the emotion self-reported by the worker includes the emotional arousal and the emotional valence.
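A minimal sketch, assuming ordinary least squares and hypothetical array layouts, of how the two first regression equations of the second and third aspects (one for emotional arousal, one for emotional valence) might be generated from the self-reported emotion used as the correct value and the activity measured during the corresponding time periods:

import numpy as np

def fit_first_regressions(activity_windows, reported_arousal, reported_valence):
    """activity_windows: (n_reports, n_features) activity features per self-report period.
    reported_arousal / reported_valence: correct values self-reported by the worker."""
    X = np.hstack([activity_windows, np.ones((len(activity_windows), 1))])  # intercept column
    w_arousal, *_ = np.linalg.lstsq(X, reported_arousal, rcond=None)  # equation for arousal
    w_valence, *_ = np.linalg.lstsq(X, reported_valence, rcond=None)  # equation for valence
    return {"arousal": w_arousal, "valence": w_valence}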

[0010] In the apparatus, method, or program according to a fourth aspect of the present invention, the first learning data includes a second regression equation representing a relationship between the cognition and the activity of the worker. The second regression equation is generated with a correct value being information indicating a success or a failure of the operation by the worker and a variable being information indicating an activity of the worker obtained during a time period immediately before the success or the failure of the operation is obtained.

[0011] In the apparatus, method, or program according to a fifth aspect of the present invention, the second regression equation representing the relationship between the cognition and the activity of the worker is generated with a correct value being information indicating a success or a failure of the operation by the worker and a variable being at least one of a feature quantity indicating hand movement of the worker extracted from the information indicating the activity of the worker and a feature quantity indicating eye movement of the worker extracted from the information indicating the activity of the worker.
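The second regression equation of the fourth and fifth aspects could, for example, be generated from hand- and eye-movement feature quantities observed immediately before each operation outcome, with success/failure as the correct value; logistic regression is used here only as an assumed, plausible form, and the function names are hypothetical.

import numpy as np

def fit_second_regression(hand_eye_features, success_labels, lr=0.1, epochs=2000):
    """hand_eye_features: (n_ops, n_features) feature quantities extracted just before each
    operation outcome; success_labels: 1.0 = success, 0.0 = failure (the correct values)."""
    X = np.hstack([hand_eye_features, np.ones((len(hand_eye_features), 1))])
    w = np.zeros(X.shape[1])
    for _ in range(epochs):  # simple gradient descent on the logistic loss
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - success_labels) / len(X)
    return w  # cognition estimate = predicted probability of a successful operation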

[0012] In the apparatus, method, or program according to a sixth aspect of the present invention, the second learning data includes at least one of a first relational expression for estimating a skill level of the worker based on a difference between a current secondary indicator and a past secondary indicator that are estimates of the emotion and the cognition of the worker, and a second relational expression for estimating a misoperation frequency of the worker based on variations among current secondary indicators and variations among past secondary indicators that are the estimates of the emotion and the cognition of the worker.
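One possible, assumed form of the second learning data of the sixth aspect is sketched below: a first relational expression driven by the difference between current and past secondary indicators (skill level), and a second driven by the variations among them (misoperation frequency). The coefficient vectors and data layouts are hypothetical.

import numpy as np

def estimate_skill_level(current_indicators, past_indicators, coeffs):
    """First relational expression: skill level from the difference between the current
    and past secondary indicators (the emotion and cognition estimates)."""
    diff = np.asarray(current_indicators) - np.asarray(past_indicators)
    return float(np.append(diff, 1.0) @ coeffs)

def estimate_misoperation_frequency(current_history, past_history, coeffs):
    """Second relational expression: misoperation frequency from the variations among the
    current secondary indicators and among the past secondary indicators."""
    variation = np.array([np.std(current_history, axis=0).mean(),
                          np.std(past_history, axis=0).mean(),
                          1.0])
    return float(variation @ coeffs)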

[0013] The apparatus, method, or program according to a seventh aspect of the present invention further includes a controller for controlling an operation of the system based on an estimation result of the performance of the worker in the operation obtained by the second estimation unit.

[0014] In the apparatus, method, or program according to an eighth aspect of the present invention, the controller divides the estimated performance in the operation into a plurality of classes based on at least one predetermined threshold, and controls the operation of the system based on control information preliminarily associated with each of the classes.

Further aspects are herein described, numbered as A1, A2, etc. for convenience:

According to aspect A1, there is provided a line manufacturing control apparatus for controlling at least a part of a manufacturing line, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of a worker during execution of an operation, by the worker, in the manufacturing line;

a first estimation unit configured to estimate emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and

a second estimation unit configured to estimate performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between the performance, and the emotion and the cognition of the worker in the operation, a controller configured to control a functioning of the at least a part of the manufacturing line based on an estimation result of performance of the worker in the operation obtained by the second estimation unit.

A2. A manufacturing line operation efficiency estimation apparatus for estimating performance in executing, by a worker, an operation within a manufacturing line, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of the worker during the operation;

a first estimation unit configured to estimate emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and

a second estimation unit configured to estimate performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the worker in the operation.

A3. A drive assisting apparatus for providing vehicle driving assistance, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of a subject during driving a vehicle;

a first estimation unit configured to estimate emotion and cognition of the subject during driving based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and

a second estimation unit configured to estimate performance of the subject in driving based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the driving,

a controller configured to provide driving assistance based on an estimation result of the performance of the subject in the driving obtained by the second estimation unit.

A4. The apparatus according to aspect A3, wherein the driving assistance is provided to at least one amongst the subject and the vehicle.

A5. An apparatus for healthcare support of a subject, the apparatus comprising: an activity obtaining unit configured to obtain information indicating an activity of the subject when executing an operation;

a first estimation unit configured to estimate emotion and cognition of the subject when executing the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and a second estimation unit configured to estimate performance of the subject in executing the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the operation, a controller configured to provide the subject with a healthcare support feedback based on an estimation result of the performance of the subject in the operation obtained by the second estimation unit.

A6. The apparatus for healthcare support of a subject according to aspect A5, wherein executing an operation includes at least one amongst executing an interacting operation with a machine and performing a physical exercise.

A7. An apparatus for handling performance in executing a task by a subject, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of the subject when executing the task;

a first estimation unit configured to estimate emotion and cognition of the subject when executing the task based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and

a second estimation unit configured to estimate performance of the subject in executing the task based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the task.

A8. The apparatus according to aspect A7, further comprising

a controller configured to provide the subject with an intervention based on an estimation result of the performance of the subject in the task obtained by the second estimation unit.

A9. A method for providing vehicle driving assistance, the method comprising steps of:

obtaining information indicating an activity of a subject during driving a vehicle;

estimating emotion and cognition of the subject during driving based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and estimating performance of the subject in driving based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the driving,

providing driving assistance based on an estimation result of the performance of the subject in the driving obtained by the second estimation unit.

A10. The method according to aspect A9, wherein the driving assistance is provided to at least one amongst the subject and the vehicle.

A11. A method for healthcare support of a subject, the method comprising steps of:

obtaining information indicating an activity of the subject when executing an operation; estimating emotion and cognition of the subject when executing the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and

estimating performance of the subject in executing the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the operation,

providing the subject with a healthcare support feedback based on an estimation result of the performance of the subject in the operation obtained by the second estimation unit.

A12. The method for healthcare support of a subject according to aspect A11, wherein executing an operation includes at least one amongst executing an interacting operation with a machine and performing a physical exercise.

A13. A method for handling performance in executing a task by a subject, the method comprising steps of:

obtaining information indicating an activity of the subject when executing the task; estimating emotion and cognition of the subject when executing the task based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the subject and a relationship between the activity and the cognition of the subject; and estimating performance of the subject in executing the task based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the subject in the task.

A14. The method according to aspect A13, further comprising

providing the subject with an intervention based on an estimation result of the performance of the subject in the task obtained by the second estimation unit.

A15. A work management apparatus for managing an operation performed by a worker in a system involving the operation performed by the worker, the apparatus comprising:

an activity obtaining unit configured to obtain information indicating an activity of the worker during the operation;

a first estimation unit configured to estimate emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and

a second estimation unit configured to estimate performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between performance, and the emotion and the cognition of the worker in the operation.

A16. The work management apparatus according to aspect A15, wherein

the first learning data includes at least one first regression equation representing a relationship between the emotion and the activity of the worker, and the at least one first regression equation is generated with a correct value being information indicating an emotion self-reported by the worker and a variable being information indicating an activity of the worker obtained during a time period within which the self-reporting is performed.

A17. The work management apparatus according to aspect A16, wherein

the at least one first regression equation representing the relationship between the emotion and the activity of the worker includes a first regression equation generated for emotional arousal, and a first regression equation generated for emotional valence when the information indicating the emotion self-reported by the worker includes the emotional arousal and the emotional valence.

A18. The work management apparatus according to aspect A15, wherein

the first learning data includes a second regression equation representing a relationship between the cognition and the activity of the worker, and the second regression equation is generated with a correct value being information indicating a success or a failure of the operation by the worker and a variable being information indicating an activity of the worker obtained during a time period immediately before the success or the failure of the operation is obtained.

A19. The work management apparatus according to aspect A18, wherein

the second regression equation representing the relationship between the cognition and the activity of the worker is generated with a correct value being information indicating a success or a failure of the operation by the worker and a variable being at least one of a feature quantity indicating hand movement of the worker extracted from the information indicating the activity of the worker and a feature quantity indicating eye movement of the worker extracted from the information indicating the activity of the worker.

A20. The work management apparatus according to aspect A15, wherein

the second learning data includes at least one of a first relational expression for estimating a skill level of the worker based on a difference between a current secondary indicator and a past secondary indicator that are estimates of the emotion and the cognition of the worker, and a second relational expression for estimating a misoperation frequency of the worker based on variations among current secondary indicators and variations among past secondary indicators that are the estimates of the emotion and the cognition of the worker.

A21. The work management apparatus according to any one of aspects A15 to A20, further comprising

a controller configured to control an operation of the system based on an estimation result of the performance of the worker in the operation obtained by the second estimation unit.

A22. The work management apparatus according to aspect A21, wherein

the controller divides the estimated performance in the operation into a plurality of classes based on at least one predetermined threshold, and controls the operation of the system based on control information preliminarily associated with each of the classes.

A23. A work management method that is implemented by a work management apparatus for managing an operation performed by a worker in a system involving the operation performed by the worker, the method comprising:

obtaining information indicating an activity of the worker during the operation; estimating emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and estimating performance of the worker in the operation based on the estimated emotion and cognition used as secondary indicators, and second learning data indicating a relationship between the performance, and the emotion and the cognition of the worker in the operation.

A24. A work management program enabling a processor to function as the units included in the work management apparatus according to any one of aspects A15 to A22.

A25. A computer program comprising instructions which, when executed on a computer, cause the computer to execute steps according to any of aspects A9 to A14 and A23.

ADVANTAGEOUS EFFECTS

[0015] The apparatus, method, or program according to the first aspect of the present invention uses information indicating the activity of the worker during the operation as a primary indicator. The primary indicator and the separately generated first learning data are used to estimate the emotion and the cognition of the worker. The estimated emotion and cognition are then used as secondary indicators. The secondary indicators and the separately generated second learning data are used to estimate the performance of the worker in the operation. This enables the performance of the worker in the operation to be estimated based on the information indicating the activity of the worker during the operation, and enables work management in accordance with human performance corresponding to emotion and cognition, which has not been used before. In other words, factory productivity can be monitored and/or improved by taking into account the accurate state of the worker(s), which is obtained in an objective and repeatable way autonomously by an apparatus, method, or program.

[0016] The apparatus, method, or program according to the second aspect of the present invention involves the first learning data including the first regression equations representing the relationship between the emotion and the activity of the worker, and thus enables the emotion of the worker to be estimated by computation using the first regression equations, for example, without storing a large amount of learning data into a database.

[0017] The apparatus, method, or program according to the third aspect of the present invention estimates the emotion of the worker using one first regression equation generated for emotional arousal and another first regression equation generated for emotional valence. This allows the emotion of the worker to be output as numerical information represented by arousal and valence.

[0018] The apparatus, method, or program according to the fourth aspect of the present invention involves the first learning data including the second regression equation representing the relationship between the cognition and the activity of the worker, and thus enables the cognition to be estimated by computation using the second regression equation without storing a large amount of learning data into a database.

[0019] The apparatus, method, or program according to the fifth aspect of the present invention generates the second regression equation with a correct value being information indicating a result of the operation by the worker, which is a success or a failure, and a variable being the feature quantities indicating hand movement of the worker and/or the feature quantities indicating eye movement of the worker. This allows the cognition associated with the operation by the worker to be estimated accurately based on at least one of the hand movement and the eye movement directly representing the state of the operation.

[0020] The apparatus, method, or program according to the sixth aspect of the present invention prepares the second learning data as at least one of the first relational expression for estimating the skill level of the worker and the second relational expression for estimating the misoperation frequency of the worker. This allows the performance of the worker in the operation to be estimated using a specific indicator of the skill level or the misoperation frequency.

[0021] The apparatus, method, or program according to the seventh aspect of the present invention controls the operation of the system based on the estimation results of the performance of the worker in the operation. When, for example, a decrease is estimated in the performance of the worker in the operation, the system is controlled to decrease its speed. This prevents quality deterioration of the product.

[0022] The apparatus, method, or program according to the eighth aspect of the present invention stratifies (or standardizes) the operation of the system in accordance with the class of the estimated performance of the worker in the operation. This allows the operation of the system to be more easily controlled in accordance with a decrease in the performance of the worker in the operation while maintaining the intended workability.
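As an assumed illustration of this class-based control (the threshold values and control actions below are placeholders, not taken from the disclosure), a controller could map the estimated performance to preliminarily associated control information roughly as follows:

def control_from_performance(performance, thresholds=(0.3, 0.7),
                             control_info=("slow_down_line", "keep_speed", "allow_speed_up")):
    """Divide the estimated performance into classes based on predetermined thresholds and
    return the control information preliminarily associated with the selected class."""
    for i, t in enumerate(thresholds):
        if performance < t:
            return control_info[i]
    return control_info[-1]

# Example: a low estimated performance selects the "slow_down_line" control action.
print(control_from_performance(0.2))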

[0023] The above aspects of the present invention provide a work management system, an apparatus, a method, and a program that enable the state of the worker affecting the productivity to be determined accurately at all times without relying on the experience or the intuition of a manager.

According to further aspects, it is possible to improve safety in driving, since the state of the driver can be objectively obtained by means of an apparatus and monitored to prevent hazardous situations. Also, the accurately obtained state can be used to provide driving assistance, thus also increasing safety. Still further, the accurate state of a person can be objectively obtained by a healthcare support apparatus, so that the health condition of the person can be better monitored and/or improved.

BRIEF DESCRIPTION OF THE DRAWINGS

[0024] Fig. 1 is a schematic diagram of a system according to an embodiment of the present invention.

Fig. 2 is a diagram showing an example emotion input device and an example measurement device included in the system shown in Fig. 1.

Fig. 3 is a diagram showing another measurement device included in the system shown in Fig. 1.

Fig. 4 is a functional block diagram of a production management apparatus installed in the system shown in Fig. 1.

Fig. 5 is a flowchart showing the procedure and the details of emotion learning performed by the production management apparatus shown in Fig. 4.

Fig. 6 is a flowchart showing the procedure and the details of cognition learning performed by the production management apparatus shown in Fig. 4.

Fig. 7 is a flowchart showing the first half part of the procedure and its details for generating and storing emotion learning data in an emotion learning mode shown in Fig. 5.

Fig. 8 is a flowchart showing the second half part of the procedure and its details for generating and storing the emotion learning data in the emotion learning mode shown in Fig. 5.

Fig. 9 is a flowchart showing the first half part of the procedure and its details for generating and storing learning data in the cognition learning shown in Fig. 6.

Fig. 10 is a diagram showing an example working process used for describing cognition estimation.

Fig. 11 is a flowchart showing the procedure and the details of production management performed by the production management apparatus shown in Fig. 4.

Fig. 12 is a flowchart showing emotion estimation and its details in the procedure shown in Fig. 11.

Fig. 13 is a flowchart showing cognition estimation and its details in the procedure shown in Fig. 11.

Fig. 14 is a flowchart showing productivity estimation and line control in the procedure shown in Fig. 11.

Fig. 15 is a diagram describing the definition of emotion information that is input through the emotion input device shown in Fig. 2.

Fig. 16 is a diagram showing example input results of emotion information obtained through the emotion input device in the system shown in Fig. 1.

Fig. 17 is a diagram showing the classification of emotion information that is input through the emotion input device in the system shown in Fig. 1.

Fig. 18 is a diagram showing variations in emotion information that is input through the emotion input device in the system shown in Fig. 1.

Fig. 19 illustrates a block diagram of a mental state model that is well suited for technical applications wherein a person interacts with a device/machine.

Fig. 20 shows how cognitive and emotional states can be measured by way of objective and repeatable measurements.

Fig. 21 shows examples of objective and repeatable measurements.

DETAILED DESCRIPTION

[0025] The present invention is based, amongst others, on the recognition that the human factor influencing, for instance, productivity is based on the mental state of a person, and that it is preferable to use an appropriate model of the person (i.e. of his/her mental state) that takes into account different types of states of a person, wherein the states are directly or indirectly measurable by appropriate sensors. Thus, the mental state can be objectively and systematically observed, as well as estimated in view of the intended technical application, such that productivity can be better monitored and/or increased, and, in general, efficiency in executing a task can be better monitored and/or increased.

In more detail, in order to allow a technical application that objectively and systematically takes into account a mental state, the latter can be modeled by a combination of a cognitive state (also cognition, in the following) and an emotional state (also emotion, in the following) of a person. The cognitive state of the person relates to, for example, a state indicating a level of ability acquired by a person in performing a certain activity, for instance on the basis of experience (e.g. by practice) and knowledge (e.g. by training), as also discussed further below. The cognitive state is directly measurable, since it directly relates to the execution of a task by the person. The emotional state has in the past been considered solely a subjective and psychological state, which could not be established objectively, e.g. by technical means such as sensors. More recent studies, however, have led to a revision of that view and show that emotional states of a person are presumed to be hard-wired and physiologically (i.e. not culturally) distinctive; further, being based also on arousal (i.e. a reaction to stimuli), emotions can be indirectly obtained from measurements of physiological parameters objectively obtained by means of suitable sensors, as also mentioned later with reference to Figure 20.

Figure 19 shows a model of a mental state that can be used, according to the inventors, for technical applications dealing, for instance, with the human ("men") factor influencing productivity (the same model can also be used for other applications, discussed later, such as assisted driving or healthcare support). In particular, the model comprises a cognitive part 510 and an emotional part 520 interacting with each other. The cognitive part and the emotional part represent the set of cognitive states and, respectively, the set of emotional states that a person can have, and/or that can be represented by the model. The cognitive part directly interfaces with the outside world (dashed line 560 represents the separation from the outside world), in what the model represents as input 540 and output 550. The input 540 represents any stimuli that can be provided to the person (via the input "coupling port" 540, according to this schematic illustration), and the output 550 (a schematic illustration of an output "coupling port" for measuring physiological parameters) represents any physiological parameters produced by the person, and as such measurable. The emotional part can be indirectly measured, since the output depends on a specific emotional state at least indirectly via the cognitive state: see e.g. lines 525 (and 515) showing the interaction between emotion and cognition, and 536 providing output, according to the model of Figure 19. In other words, an emotional state will be measurable as an output, even if not directly, due to the interaction with the cognitive part. It is herein not relevant how the cognitive part and the emotional part interact with each other. What matters to the present discussion is that there are inputs to the person (e.g. one or more stimuli), and outputs from the person as a result of a combination of a cognitive state and an emotional state, regardless of how these states/parts interact with each other. In other words, the model can be seen as a black box having objectively measurable inputs and outputs, wherein the inputs and outputs are causally related to the cognitive and emotional states, though the internal mechanisms of such causal relationships are herein not relevant.

Even without knowledge of the internal mechanisms of the model, the inventors have noted that such a model can be useful in practical applications in industry, for instance when handling human ("men") factors influencing productivity, or when controlling certain production system parameters depending on human performance, as will also become apparent in the following.

Figure 20 shows how cognitive and emotional states can be measured by way of objective and repeatable measurements, wherein a circle, a triangle, and a cross indicate that the listed measuring methods are, respectively, well suited, less suited (due for instance to inaccuracies), or (at present) considered not suitable. Other techniques are also available, such as image recognition for recognizing facial expressions or patterns of facial expressions that are associated with a certain emotional state. In general, cognitive and emotional states can be measured by an appropriate method, wherein certain variable(s) deemed suitable for measuring the given state are determined and then measured according to a given method by means of suitable sensor(s). A variety of sensors are suitable for obtaining such measurements, and they are not described herein since any of them is suitable as long as it provides any of the parameters listed in Figure 20, or any other parameters suitable for estimating cognitive and/or emotional states. The sensors can be wearables, e.g. included in a wrist or chest wearable device or in glasses, a helmet-like device for measuring brain activity from the scalp (e.g. EEG/NIRS), or a large machine such as PET/fMRI.

Thus, it is possible to model a person, for instance a factory operator (or a driver of a vehicle, or a person using a healthcare supporting device), by using a model as illustrated in Figure 19, and to collect measurements of physiological parameters of the person as shown in Figures 20 and 21. In this way, as also shown in the following, it is possible, for instance, to improve factory productivity or the monitoring of factory productivity, to improve driving safety, to improve the monitoring of a person's health conditions, and to maintain or improve a person's health conditions.

The above explanation is provided as an illustrative introduction to the invention and the following embodiments/examples, without limiting them.

Turning to the invention, and referring for the sake of illustration to the application of monitoring productivity on a production line: emotional and cognitive states can be estimated on the basis of first learning data and of information indicating an activity of the worker (i.e. information obtained from measurements on the worker); the worker performance can then be estimated on the basis of the estimated cognition and emotion, and of second learning data. The emotion and cognition estimates allow an accurate estimation of the overall mental state (see e.g. the model discussed above), and the worker performance can thus also be estimated more accurately, such that factory productivity can be more accurately monitored when the human factor is also taken into account. It is significant that this productivity estimation is reached on the basis of objective and repeatable measurements (of the worker activity) that an apparatus can perform, and on specific learning data. Details on the estimation are provided below, but reference is also made to JP2016-252368 filed on 27 December 2016, as well as to the PCT application (reference/docket number 198 759) filed by the same applicant on the same date as the present one, describing for instance how the emotional state can be estimated.

In another illustrative application, the estimated performance can be used to control the production line, so that the latter can be conveniently improved or so that the respective production quality can be better controlled and improved. Also here, significantly, the better productivity/quality is achieved on the basis of objective and repeatable measurements, and on specific learning data.

In other illustrative applications, like for instance assisted driving or healthcare support, higher safety in driving, more accurate healthcare monitoring, or improved health conditions can be reached on the basis of objective and repeatable measurements, and on specific learning data.

Embodiments of the present invention will now be described with reference to the drawings.

Embodiment 1

Principle

For example and as anticipated, factors that may influence the productivity of a production line include 4M (machines, methods, materials, and men) factors. In the present embodiment, the factor "men", which may influence the productivity, may be defined as emotion and cognition based on the neural activity of the brain. The emotion is, for example, human motivation and mood (comfort or discomfort) for an operation, and varies during a relatively short period such as hours or days. The cognition is a human baseline ability. This ability is associated with, for example, human attention to and judgment about an operation, and varies during a relatively long period such as months or years.

[0026] In the present embodiment, information indicating the human activity correlated with the neural activity of the brain, such as vital signs and motion information, is used as a primary indicator (for example, when using regression analysis, an indicator as herein used can be represented by an independent variable; in other words, information indicating human activity may represent independent variable(s) when using regression analysis). The information indicating the activity and the emotion correct value, as for instance input by the worker, are used to estimate the emotion. Examples of the information indicating the activity include vital signs and motion information such as the heart electrical activity, the skin potential activity, the motion, and the amount of exercise. The term emotion correct value herein means a value indicating the emotional state of the person (e.g. worker, driver, or person using a healthcare device) that is considered correct or highly accurate. In other words, the emotion correct value is (preferably highly) accurate information on the emotional state of a person. The emotion correct value can be obtained, in one example, by means of an emotion input device 2. For simplicity, as later described in the example referring to figure 2, the emotion input device 2 can be represented by a device into which the person (e.g. worker) can input his/her current emotion. However, the emotion input device 2 can also be represented, for instance, by a measurement apparatus and/or sensor (or a combination of a plurality of such measurement apparatuses and/or sensors) capable of acquiring an emotion correct value (i.e. highly accurate information on the emotional state) by means of suitable measurements made on the subject; see also the above discussion in relation to figures 20 and 21. In particular and preferably, the emotion correct value is acquired by means of devices suitable for determining such a state with high precision/accuracy (regardless of the size and complexity of the sensor or device used; preferably, such sensors are large and complex devices achieving higher accuracy than other sensors such as those included in wearables). Also, a combination of both an indirect (by means of accurate measurements) and a direct (e.g. by means of the user inputting his/her own state into a device) determination of the emotional state is possible. The emotion correct value herein discussed can be acquired for each of a plurality of workers, as further illustrated later.

[0027] The cognition is estimated using, as primary indicators (when using for example regression analysis, the independent variable(s) may be given by such indicator(s)), the feature quantities of, for example, eye movement and hand movement representing the attention and judgment in the information indicating the human activity. The feature quantities of eye movement and hand movement, and the cognition correct value, are used to estimate the cognition. Examples of the feature quantities representing eye movement include the eye movement speed, the gaze coordinates and the gaze duration, the number of blinks, and changes in the pupil size. Examples of the feature quantities representing hand movement include triaxial acceleration. The term cognition correct value herein means (preferably highly accurate) information indicative of the cognitive state of the person, which information is acquired by means of one or more apparatuses, devices and/or sensors capable of determining whether an operation by the person is as expected, e.g. whether a detected operation (as acquired by such a device/apparatus/sensor) is according to a predetermined pattern and/or template for such operation. An example of such a device/apparatus/sensor is given by a work monitoring camera CM also described later. When using for example regression analysis, the cognition correct values may be represented as dependent variable(s). Thus, when using regression analysis for emotion or cognition, a relationship can be found between dependent variable(s) and independent variable(s), wherein the dependent variable(s) represent the correct values for emotion and, respectively, cognition, and the independent variable(s) represent indications of human activity as appropriately measured.

[0028] In the present embodiment, the emotion learning data and the cognition learning data are preliminarily generated for each worker. These learning data items are generated based on the above correct values (e.g. dependent variables) and primary indicators (e.g. independent variables). A change in the activity of the worker is measured during operation, and the measurement data is used as a primary indicator. This primary indicator and the learning data are used to estimate a change in each of the emotion and the cognition of the worker. In other words, (first) learning data is generated, for instance, by regression analysis between activity indication values (independent variables) and correct values (dependent variables) of emotion and, respectively, cognition, on the basis of data available for one or more persons. Once the learning data has been obtained, the emotion and/or cognition can be estimated on the basis of the (previously generated) learning data and the current activity as detected for a person at the point in time when the emotion and/or cognition needs to be estimated.
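By way of illustration only, the following is a minimal sketch of how such first learning data could be generated by multiple regression analysis; scikit-learn is assumed to be available, and the feature layout, the numerical values, and the variable names are illustrative assumptions rather than values taken from the embodiment.

```python
# Minimal sketch: "first learning data" as two multiple-regression models
# (arousal and valence), fitted on per-window activity variations.
import numpy as np
from sklearn.linear_model import LinearRegression

# Primary indicators: per-window variations of activity feature quantities
# (e.g. heart electrical activity, skin potential, eye movement, motion, exercise).
X = np.array([
    [0.12, -0.3, 1.5, 0.02, 0.8],
    [0.05,  0.1, 0.7, 0.10, 0.4],
    [0.20, -0.1, 2.1, 0.05, 1.1],
    [0.01,  0.4, 0.3, 0.12, 0.2],
])

# Correct values (supervisory data): arousal/valence variations input by the worker.
y_arousal = np.array([20.0, 5.0, 35.0, -10.0])
y_valence = np.array([50.0, 10.0, 40.0, -5.0])

# One regression equation per target; together they act as the emotion learning data.
arousal_model = LinearRegression().fit(X, y_arousal)
valence_model = LinearRegression().fit(X, y_valence)

# Estimation: feed a newly measured window of activity variations.
new_window = np.array([[0.15, -0.2, 1.8, 0.04, 0.9]])
print(arousal_model.predict(new_window), valence_model.predict(new_window))
```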

[0029] In addition, relational expressions representing the correlation between the changes in the emotion and the cognition and a change in the performance (or, more generally, the correlation between emotion and cognition, and performance) in the operation by the worker are preliminarily generated for each worker as learning data for estimating the performance in the operation by the worker. In an example using regression analysis, the performance (or the change in performance) may be represented as dependent variable(s). Information indicating performance or a change in performance may be obtained, for instance, by measuring the speed of producing an item, and/or how many items are produced per hour, and/or the quality in producing item(s), etc., as also explained later. The estimated changes in the worker's emotion and cognition are used as secondary indicators; in the example of regression analysis, the secondary indicator(s) may be represented as independent variable(s). The secondary indicators and the relational expressions are used to estimate a change in the worker's current or future performance in the operation. In other words and as an example, (second) learning data is generated using regression analysis between performance information (as dependent variable(s)) and estimated emotion and/or cognition (as independent variable(s)). Once the (second) learning data is obtained, the actual performance can be estimated based on the emotion and/or cognition as estimated for a person at a certain point in time.
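Purely as an illustration, a corresponding sketch for the second learning data could look as follows; the performance figures and the fitted model are invented for the example and are not part of the embodiment.

```python
# Minimal sketch: "second learning data" as a regression of observed performance
# on the emotion and cognition estimates used as secondary indicators.
import numpy as np
from sklearn.linear_model import LinearRegression

# Secondary indicators per window: [estimated emotion change, estimated cognition change].
secondary = np.array([
    [20.0, 0.8],
    [5.0, 0.5],
    [35.0, 0.9],
    [-10.0, 0.3],
])

# Observed performance in the same windows, e.g. items completed per hour.
performance = np.array([14.0, 11.0, 16.0, 8.0])

performance_model = LinearRegression().fit(secondary, performance)

# Later, performance is estimated from freshly estimated emotion and cognition alone.
print(performance_model.predict(np.array([[12.0, 0.7]])))
```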

[0030] The information indicating the performance in an operation in a production line is typically defined by the quality and the number of products. In the present embodiment, this information is more specifically represented by skill level information and misoperation frequency information. The skill level information is represented by, for example, a difference between a standard operation time and an actual operation time. The misoperation frequency information is represented by, for example, deviations of the actual operation time from an average operation time. Thus, the information indicating the performance in an operation is defined as information representing the degrees of the worker's concentration, fatigue, and skill in the operation.

[0031] In the present embodiment, the information about the difference between the standard operation time and the actual operation time, and the information indicating deviations of the actual operation time from the average operation time, are estimated for each worker as the information indicating the performance in the operation. The estimation results are used to control the line.

What has been explained above for a worker, equally applies to persons like a driver, or a person using a healthcare device.

In the case of a driver, for instance, correct values used for cognition estimation may be represented by how correctly the driving task is executed, which can be obtained e.g. by measuring certain driving parameters such as how closely the vehicle follows certain predetermined routes (e.g. comparing how smoothly the actual driving route corresponds to an ideal route obtained from a navigation system), how smooth the control of the vehicle is (e.g. whether or how often any sudden change of direction occurs), the degree to which the driver recognizes an obstacle, etc. The performance values of a driver (in the sense of performance in executing driving, to be used for obtaining learning data by way of regression analysis) can e.g. be obtained by comparing the distance covered over a certain period with the distance expected for that period, or by checking whether, in travelling between two points, a certain route has been followed compared to predetermined available routes, etc.

In the case of a person using a healthcare assistance device, the correct values for cognition estimation may be obtained by measuring how certain tasks are executed: for instance, how straight and balanced the person's body position is when walking, running or sitting (e.g. compared with predetermined patterns); how smoothly certain movements are made compared with predetermined patterns; etc. The performance values of the person (to be used for obtaining learning data by way of regression analysis) can e.g. be obtained by measuring efficiency and/or quality in completing a certain task or number of tasks, for instance by measuring the distance covered on foot against an expected distance, or by measuring the time for accomplishing a task against a predetermined time (e.g. completing a housecleaning or hobby-related operation, or the number of such operations performed in an hour or a day), etc.

Other values and considerations apply as in the case of a worker.

System Configuration

[0032] A system according to an embodiment of the present invention is a cell production system. The cell production system divides the product manufacturing process into multiple sections. The production line has working areas, called cells, for these sections. In each cell, a worker performs the operation of the assigned section.

[0033] Fig. 1 shows an example cell production system, which includes a U-shaped production line CS. The production line CS includes, for example, three cells C1 , C2, and C3 corresponding to different sections on the course of the products. Workers WK1 , WK2, and WK3 are assigned to the cells C1 , C2, and C3, respectively. In addition, a skilled leader WR is placed to supervise the overall operation on the production line CS. The leader WR has a portable information terminal TM, such as a smartphone or a tablet terminal. The portable information terminal TM is used to display information for managing the production operation provided to the leader WR.

[0034] A part feeder DS and a part feeder controller DC are located most upstream of the production line CS. The part feeder DS feeds various parts for assembly onto the line CS at a specified rate in accordance with a feed instruction issued from the part feeder controller DC. Additionally, the cell C1 , which is a predetermined cell in the production line CS, has a cooperative robot RB. In accordance with an instruction from the part feeder controller DC, the cooperative robot RB assembles a part into a product B1 in cooperation with the part feed rate.

[0035] The cells C1 , C2, and C3 in the production line CS have monitors M01 , M02, and M03, respectively. The monitors M01 , M02, and M03 are used to provide the workers WK1 , WK2, and WK3 with instruction information about their operations and other messages.

[0036] A work monitoring camera CM is installed above the production line CS. The work monitoring camera CM captures images to be used for checking the results of the production operations for the products B1 , B2, and B3 performed by the workers WK1 , WK2, and WK3 in the cells C1 , C2, and C3. The results of the production operations are used as correct values when learning data for cognition estimation is generated. The numbers of monitors, sections, and workers, and the presence or absence of a leader may not be limited to those shown in Fig. 1 . The production operation performed by each worker may also be monitored in any manner other than to use the work monitoring camera CM. For example, the sound, light, and vibrations representing the results of the production operation may be collected, and the collected information may be used to estimate the results of the production operation.

[0037] To estimate the emotion and the cognition of each of the workers WK1 , WK2, and WK3, the workers WK1 , WK2, and WK3 have input and measurement devices SS1 , SS2, and SS3, respectively. The input and measurement devices SS1 , SS2, and SS3 each include an emotion input device 2 for receiving an emotion correct value, a measurement device 3 for measuring the worker's activity used as a primary indicator for estimating the emotion and the cognition, and an eye movement monitoring camera 4.

[0038] The emotion input device 2, which is for example a smartphone or a tablet terminal as shown in Fig. 2, displays an emotion input screen under control with application programs. The emotion input screen shows emotions using a two-dimensional coordinate system with emotional arousal on the vertical axis and emotional valence on the horizontal axis. When a worker plots the position corresponding to his or her current emotion on the emotion input screen, the emotion input device 2 recognizes the coordinates indicating the plot position as information indicating the emotion of the worker.

[0039] This technique of expressing the emotions using arousal and valence on the two-dimensional coordinate system is known as Russell's circumplex model. Fig. 15 schematically shows this model. Fig. 16 is a diagram showing example input results of emotion at particular times obtained through the emotion input device 2. The arousal indicates the emotion being either activated or deactivated and the degree of activation to deactivation, whereas the valence indicates the emotion being either comfortable (pleasant) or uncomfortable (unpleasant) and the degree of being comfortable to uncomfortable.

[0040] The emotion input device 2 transforms the position coordinates detected as the emotion information into the arousal and valence values and the information about the corresponding quadrant of the two-dimensional arousal-valence coordinate system. The resultant data, to which the time stamp data indicating the input date and time is added, is transmitted as emotion input data (hereinafter referred to as scale data) to a production management apparatus 1, which is a work management apparatus, through a network NW using a wireless interface.
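A minimal sketch of this transformation is given below; the quadrant numbering (standard Cartesian convention), the field names, and the data format are assumptions made only for illustration.

```python
# Sketch: turning a point plotted on the arousal-valence screen into scale data.
# Axes follow the description of Fig. 18: both run from -100 to +100.
from datetime import datetime, timezone

def to_scale_data(worker_id: str, valence: float, arousal: float) -> dict:
    """valence = horizontal position, arousal = vertical position, both in [-100, +100]."""
    if valence >= 0 and arousal >= 0:
        quadrant = 1
    elif valence < 0 and arousal >= 0:
        quadrant = 2
    elif valence < 0 and arousal < 0:
        quadrant = 3
    else:
        quadrant = 4
    return {
        "worker_id": worker_id,
        "arousal": arousal,
        "valence": valence,
        "quadrant": quadrant,
        "timestamp": datetime.now(timezone.utc).isoformat(),  # time stamp data
    }

print(to_scale_data("WK1", valence=30.0, arousal=-40.0))
```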

[0041 ] The measurement device 3 is, for example, incorporated in a wearable terminal, and is mounted on a wrist of the worker as shown in Fig. 3. The measurement device 3 may not be incorporated in a wearable terminal, and may be mountable on clothes, a belt, or a helmet. The measurement device 3 measures information indicating human activity correlated with human emotions and cognition. The information indicating human activity includes vital signs and motion information. To measure the vital signs and the motion information, the measurement device 3 includes various vital sign sensors and motion sensors. Examples of the vital sign sensors and the motion sensors include sensors for measuring heart electrical activity H, skin potential activity G, motion BM, and an activity amount Ex.

[0042] The heart electrical activity sensor measures the heart electrical activity H of the worker in predetermined cycles or at selected timing to obtain the waveform data, and outputs the measurement data. The skin potential activity sensor, which is for example a polygraph, measures the skin potential activity G of the worker in predetermined cycles or at selected timing, and outputs the measurement data. The motion sensor, which is for example a triaxial acceleration sensor, measures the motion BM, and outputs the triaxial acceleration measurement data indicating hand movement of the worker. The sensor for measuring the activity amount Ex, which is an activity sensor, outputs the measurement data indicating the intensity of physical activity (metabolic equivalents, or METs) and the amount of physical activity (exercise). Another example of the vital sign sensors may be an electromyograph for measuring electric charge in the muscle.

[0043] The eye movement monitoring camera 4 is a small image sensor, and is mounted on, for example, the cap worn by each of the workers WK1 , WK2, and WK3 as shown in Fig. 3, or on the frame of glasses or goggles. The eye movement monitoring camera 4 captures the eye movement (EM) of the worker, and transmits the captured image data to the production management apparatus 1 as measurement data.

[0044] Each of the measurement device 3 and the eye movement monitoring camera 4 adds the time stamp data indicating the measurement date and time to its measurement data. The measurement device 3 and the eye movement monitoring camera 4 each transmit the measurement data to the production management apparatus 1 through the network NW using a wireless interface.

[0045] The wireless interface complies with, for example, low-power wireless data communication standards such as wireless local area networks (WLANs) and Bluetooth (registered trademark). The interface between the emotion input device 2 and the network NW may be a public mobile communication network, or a signal cable such as a universal serial bus (USB) cable.

[0046] The structure of the production management apparatus 1 will now be described. Fig. 4 is a functional block diagram of the apparatus. The production management apparatus 1 is, for example, a personal computer or a server computer, and includes a control unit 11, a storage unit 12, and an interface unit 13.

[0047] The interface unit 13, which allows data communication in accordance with a communication protocol defined by the network NW, receives the measurement data transmitted from the input and measurement devices SS1, SS2, and SS3 through the network NW. The interface unit 13 transmits display data output from the control unit 11 to the portable information terminal TM and the monitors M01, M02, and M03, and also transmits a control command for the production line CS output from the control unit 11 to the part feeder controller DC.

[0048] The interface unit 13 may also include a man-machine interface function. The man-machine interface function receives, for example, data input from an input device, such as a keyboard or a mouse, and outputs display data input from the control unit 11 to a display (not shown) on which the data will appear. The man-machine interface function may additionally capture the voice of the worker or output sound, or may have other functions.

[0049] The storage unit 12 is a storage medium, and is a readable and writable non-volatile memory, such as a hard disk drive (HDD) or a solid state drive (SSD). The storage unit 12 includes a sensing data storage 121 , a learning data storage 122, and a control history storage 123 as storage areas used in the embodiment.

[0050] The sensing data storage 121 stores data transmitted from the input and measurement devices SS1 , SS2, and SS3 in a manner associated with the identifiers of the workers WK1 , WK2, and WK3 that have transmitted the corresponding data. The transmitted and stored data includes scale data indicating the worker's emotion input through the emotion input device 2, measurement data obtained through the sensors of the measurement device 3, and image data input from the eye movement monitoring camera 4. The sensing data storage 121 also stores image data about the results of the operation for a product transmitted from the work monitoring camera CM.
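A toy, in-memory stand-in for this storage is sketched below; the record layout and the field names are assumptions made purely for illustration.

```python
# Sketch: sensing data grouped by the identifier of the worker that transmitted it.
from collections import defaultdict

sensing_data_storage = defaultdict(list)

def store(worker_id: str, kind: str, payload: dict, timestamp: str) -> None:
    """kind is e.g. 'scale', 'measurement', 'eye_image' or 'work_image'."""
    sensing_data_storage[worker_id].append(
        {"kind": kind, "timestamp": timestamp, "payload": payload}
    )

store("WK1", "scale", {"arousal": 20, "valence": 50, "quadrant": 1}, "2017-09-01T09:00:00Z")
store("WK1", "measurement", {"H": 0.82, "G": 1.4, "BM": [0.1, 0.0, 9.8], "Ex": 3.2},
      "2017-09-01T09:00:05Z")
print(len(sensing_data_storage["WK1"]))
```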

[0051] The learning data storage 122 stores learning data to be used for emotion estimation, learning data to be used for cognition estimation, and learning data to be used for productivity estimation, which are generated by the control unit 11 for each of the workers WK1, WK2, and WK3.

[0052] The control history storage 123 stores information indicating the productivity estimation results generated by the control unit 11 for each of the workers WK1, WK2, and WK3 and the results of the corresponding control for the production line CS as a control history event.

[0053] The control unit 11 includes a central processing unit (CPU) and a working memory. The control unit 11 includes a sensing data obtaining controller 111, a feature quantity extraction unit 112, a productivity estimation unit 113, a line controller 114, and a learning data generation unit 115 as control functions used in the embodiment. Each of these control functions is implemented by the CPU executing the application programs stored in program memory (not shown).

[0054] The sensing data obtaining controller 111 obtains, through the interface unit 13, the data transmitted from each of the input and measurement devices SS1, SS2, and SS3, namely the scale data output from the emotion input device 2, the measurement data output from the measurement device 3, and the image data output from the eye movement monitoring camera 4, and stores the obtained data into the sensing data storage 121. The sensing data obtaining controller 111 also obtains, through the interface unit 13, the work monitoring image data about the results of the operations performed by the workers WK1, WK2, and WK3 transmitted from the work monitoring camera CM, and stores the obtained data into the sensing data storage 121.

[0055] In a learning mode, the feature quantity extraction unit 112 reads, from the sensing data storage 121, the scale data, the measurement data, and the image data for each of the workers WK1, WK2, and WK3 within each of the windows that are arranged at time points chronologically shifted from one another. The feature quantity extraction unit 112 extracts the feature quantities (extracted data, or extracted sensing data) from the read scale data, measurement data, and image data, calculates the variation between the feature quantities, and transmits the calculation results to the learning data generation unit 115.

[0056] The windows each have a predetermined unit duration. The windows are defined in a manner shifted from one another by the above unit duration to avoid overlapping between chronologically consecutive windows, or in a manner shifted by a time duration shorter than the above unit duration to allow overlapping between chronologically consecutive windows. The unit duration of each window may be varied by every predetermined value within a predetermined range.
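For illustration, a minimal sketch of generating such windows is given below; expressing the durations in seconds and the specific values used are assumptions.

```python
# Sketch: analysis windows defined by a unit duration and a chronological shift.
# shift == unit_duration gives non-overlapping windows; a smaller shift gives overlap.

def make_windows(start: float, end: float, unit_duration: float, shift: float):
    """Return (window_start, window_end) pairs covering [start, end)."""
    windows = []
    t = start
    while t + unit_duration <= end:
        windows.append((t, t + unit_duration))
        t += shift
    return windows

# Non-overlapping windows: shift equal to the unit duration.
print(make_windows(0, 3600, unit_duration=600, shift=600))
# Overlapping windows: shift shorter than the unit duration.
print(make_windows(0, 3600, unit_duration=600, shift=300))
```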

[0057] The learning data generation unit 115 performs multiple regression analysis for each of the workers WK1, WK2, and WK3 with the correct values (supervisory data) being the variations among the feature quantities in the scale data for arousal and for valence that are extracted by the feature quantity extraction unit 112, and the variables being the variations among the feature quantities of the measurement data. This generates regression equations for arousal and for valence representing the relationship between the emotion and the feature quantities of the measurement data. The learning data generation unit 115 associates the generated regression equations with window identifiers that indicate the time points of the corresponding windows, and stores the equations into the learning data storage 122 as learning data to be used for emotion estimation.

[0058] The learning data generation unit 115 also performs multiple regression analysis for each of the workers WK1, WK2, and WK3 with the correct values being the data indicating whether the operation results extracted from the captured image data obtained through the work monitoring camera CM suggest a correctly performed operation (e.g. whether the images acquired by the camera are according to a predetermined pattern or template, as further illustrated below), and with the variables being the eye movement data and the hand movement data. The eye movement data is extracted by the feature quantity extraction unit 112 from the captured image data obtained through the eye movement monitoring camera 4. The hand movement data is extracted by the feature quantity extraction unit 112 from the measurement data obtained through the triaxial acceleration sensor included in the measurement device 3. In this manner, the learning data generation unit 115 generates a regression equation for each of the workers WK1, WK2, and WK3 representing the relationship between the cognition, and the eye movement and hand movement of each worker. The learning data generation unit 115 stores the generated regression equation into the learning data storage 122 as learning data to be used for cognition estimation.

[0059] The learning data generation unit 115 further uses the estimated changes in the emotion and the cognition of each of the workers WK1, WK2, and WK3 as secondary indicators, and generates a relational expression for each worker representing the correlation between each secondary indicator and a change in the productivity of each worker. The learning data generation unit 115 stores the generated relational expressions into the learning data storage 122 as learning data to be used for productivity estimation.

[0060] More specifically, skill level information and misoperation frequency information are defined as productivity information. The skill level information is represented by, for example, a difference between a standard operation time and an actual operation time. The misoperation frequency information is represented by, for example, deviations of the actual operation time from an average operation time. The learning data generation unit 115 generates relational expressions for estimating the skill level information and the misoperation frequency information based on the estimates of the changes in the emotion and the cognition, and stores the relational expressions into the learning data storage 122.
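For illustration only, the sketch below computes the two pieces of productivity information directly from observed operation times, as such values could be collected when the relational expressions are learned; the simple formulas and the numbers are assumptions.

```python
# Sketch: skill level as the gap between standard and actual operation time, and
# misoperation frequency as the deviation of the actual times from their average.
import numpy as np

standard_time = 60.0                                   # seconds per operation
actual_times = np.array([62.0, 75.0, 58.0, 90.0, 61.0])

skill_level = actual_times.mean() - standard_time      # assumed: positive = slower than standard
deviations = np.abs(actual_times - actual_times.mean())
misoperation_frequency = deviations.mean()             # assumed: larger = less stable operation

print(skill_level, misoperation_frequency)
```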

[0061] In a productivity estimation mode, the feature quantity extraction unit 112 reads, from the sensing data storage 121, the measurement data and the image data for each of the workers WK1, WK2, and WK3 within each of the windows that are arranged at time points chronologically shifted from one another. The feature quantity extraction unit 112 extracts the changes in the feature quantities from the read measurement data and image data for emotion and cognition estimation, and transmits the changes in the feature quantities to the productivity estimation unit 113.

[0062] For each of the workers WK1, WK2, and WK3, the productivity estimation unit 113 receives the changes in the feature quantities for emotion and cognition estimation extracted by the feature quantity extraction unit 112, and reads the emotion learning data and the cognition learning data from the learning data storage 122. The productivity estimation unit 113 uses the changes in the feature quantities and the emotion and cognition learning data to estimate a change in each of the emotion and the cognition.

[0063] For each of the workers WK1, WK2, and WK3, the productivity estimation unit 113 also reads the relational expressions for productivity estimation from the learning data storage 122. The productivity estimation unit 113 uses the read relational expressions and the estimates of the changes in the emotion and the cognition to estimate the productivity of each of the workers WK1, WK2, and WK3. More specifically, the productivity estimation unit 113 estimates the skill level represented by the difference between the standard operation time and the actual operation time, and the misoperation frequency represented by the deviations of the actual operation time from the average operation time.

[0064] Based on the productivity estimation results obtained for each of the workers WK1, WK2, and WK3 by the productivity estimation unit 113, the line controller 114 determines whether the speed of the production line CS needs to be regulated, and whether the worker WK1, WK2, or WK3 needs to be replaced or to rest. When determining that the speed regulation is needed, the line controller 114 outputs a line speed control instruction to the part feeder controller DC. When determining that the replacement or the rest is needed, the line controller 114 transmits the replacement or rest instruction information to the portable information terminal TM held by the leader WR or to the monitor M01, M02, or M03.

Operation

[0065] The operation of the production management apparatus 1 with the above structure will now be described in association with the operation of the overall system.

(1) Learning Data Generation

Before the process for estimating the productivity of the workers WK1 , WK2, and WK3, the production management apparatus 1 generates, for each of the workers WK1 , WK2, and WK3, the learning data to be used for productivity estimation in the manner described below.

1-1: Generation of Learning Data for Emotion Estimation

[0066] The production management apparatus 1 generates, for each of the workers WK1 , WK2, and WK3, the learning data to be used for emotion estimation in the manner described below. Fig. 5 is a flowchart showing the procedure and its details.

[0067] More specifically, each of the workers WK1, WK2, and WK3 inputs his or her current emotions with the emotion input device 2 at predetermined time intervals or at selected timing while working.

[0068] As described above, the emotion input device 2 displays the emotion of the worker in the two-dimensional coordinate system for emotional arousal and emotional valence, and detects the coordinates of a position plotted by the worker WK1, WK2, or WK3 on the two-dimensional coordinate system. The two-dimensional coordinate system used in the emotion input device 2 has the four quadrants indicated by 1, 2, 3, and 4 as shown in Fig. 17, and the arousal and valence axes each representing values from -100 to +100 with the intersection point as 0, as shown in Fig. 18. The emotion input device 2 transforms the detected coordinates into the information about the corresponding quadrant and the corresponding values on both the arousal and valence axes. The emotion input device 2 adds the time stamp data indicating the input date and time and the identifier (worker ID) of the worker WK1, WK2, or WK3 to the resultant information, and transmits the data to the production management apparatus 1 as scale data. As illustrated above, the emotion input device 2 is not limited to a device into which the worker inputs his/her emotion (which is described here for simplicity only), but in fact also includes devices capable of accurately determining an emotional state on the basis of accurate measurement(s).

[0069] In parallel with this, the measurement device 3 measures the heart electrical activity H, the skin potential activity G, the motion BM, and the activity amount Ex of the worker WK1 , WK2, or WK3 at predetermined time intervals. The measurement data is transmitted to the production management apparatus 1 together with the time stamp data indicating the measurement time and the worker ID of the worker WK1 , WK2, or WK3. Additionally, the eye movement EM of the worker WK1 , WK2, or WK3 is captured by the eye movement monitoring camera 4. The image data is also transmitted to the production management apparatus 1 together with the time stamp data and the identifier (worker ID) of the worker WK1 , WK2, or WK3.

[0070] In step S11, the production management apparatus 1 receives, for each of the workers WK1, WK2, and WK3, the scale data transmitted from the emotion input device 2 through the interface unit 13 as controlled by the sensing data obtaining controller 111, and stores the received scale data into the sensing data storage 121.

[0071] In step S12, the production management apparatus 1 also receives, for each of the workers WK1, WK2, and WK3, the measurement data transmitted from the measurement device 3 and the image data transmitted from the eye movement monitoring camera 4 through the interface unit 13 as controlled by the sensing data obtaining controller 111, and stores the received measurement data and image data into the sensing data storage 121.

[0072] In step S13, when the scale data, the measurement data, and the image data accumulate for a predetermined period (e.g., one day or one week), the production management apparatus 1 generates learning data to be used for emotion estimation, as controlled by the feature quantity extraction unit 112 and the learning data generation unit 115 in the manner described below. Figs. 7 and 8 are flowcharts showing the procedure and its details.

[0073] In step S131, the unit duration of the window Wi (i = 1, 2, 3, ...) is set at an initial value. In step S132, the first window (i = 1) is selected. In step S133, the feature quantity extraction unit 112 reads a plurality of sets of scale data within the first window from the sensing data storage 121. In step S134, the feature quantity extraction unit 112 calculates the variations among the feature quantities for arousal and for valence.

[0074] For example, when scale data K1 and scale data K2 are input within the unit duration of one window as shown in Fig. 18, the variations are calculated as the change from the third to the fourth quadrant, and as the increment of 20 (+20) for arousal and the increment of 50 (+50) for valence. For a change to a diagonally opposite quadrant, for example, for a change from the third to the second quadrant, the variations among the resultant feature quantities may be calculated for arousal and for valence.
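A minimal sketch reproducing this example computation is given below; the concrete coordinates of K1 and K2 are invented so that they yield the +20/+50 increments of the example.

```python
# Sketch: variation between two scale data points falling inside one window
# (quadrant change plus arousal and valence increments, as in Fig. 18).

def scale_variation(k1: dict, k2: dict) -> dict:
    return {
        "quadrant_change": (k1["quadrant"], k2["quadrant"]),
        "delta_arousal": k2["arousal"] - k1["arousal"],
        "delta_valence": k2["valence"] - k1["valence"],
    }

k1 = {"quadrant": 3, "arousal": -50, "valence": -30}  # illustrative coordinates
k2 = {"quadrant": 4, "arousal": -30, "valence": 20}
print(scale_variation(k1, k2))  # quadrant 3 -> 4, arousal +20, valence +50
```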

[0075] In step S135, the feature quantity extraction unit 112 reads the measurement data and image data obtained within the unit duration of the first window, which are the measurement data about the heart electrical activity H, the skin potential activity G, the motion BM, and the activity amount Ex, and the image data about the eye movement EM, from the sensing data storage 121. In step S136, the feature quantity extraction unit 112 extracts the feature quantities from the measurement data and the image data.

[0076] For example, the heart electrical activity H has the feature quantities that are the heartbeat interval (R-R interval, or RRI), and the high frequency components (HF) and the low frequency components (LF) of the power spectrum of the RRI. The skin potential activity G has the feature quantity that is the galvanic skin response (GSR). The motion BM has feature quantities including the hand movement directions and speed. The hand movement directions and speed are calculated based on, for example, the triaxial acceleration measured by the triaxial acceleration sensor. The activity amount Ex has the feature quantities that are the intensity of physical activity (METs) and the exercise (EX). The exercise (EX) is calculated by multiplying the intensity of physical activity (METs) by the activity duration. The eye movement EM has the feature quantities including the eye movement speed, the gaze coordinates and the gaze duration, the number of blinks, and changes in the pupil size.
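By way of illustration, the sketch below computes two of these heart-activity feature quantities, the mean RRI and the LF/HF band powers of its spectrum, from synthetic R-peak times; the 0.04-0.15 Hz and 0.15-0.40 Hz bands are the conventional heart rate variability bands and, like the resampling rate, are assumptions rather than values taken from the embodiment.

```python
# Sketch: RRI and LF/HF power feature quantities from (synthetic) R-peak times.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
# Synthetic R-R intervals with a slow 0.1 Hz oscillation, then cumulative R-peak times.
rri_true = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(120)) + 0.01 * rng.standard_normal(120)
r_peaks = np.concatenate(([0.0], np.cumsum(rri_true)))

rri = np.diff(r_peaks)                       # R-R intervals (RRI) in seconds

# Resample the unevenly spaced RRI series onto an even grid for spectral analysis.
fs = 4.0
t_even = np.arange(r_peaks[1], r_peaks[-1], 1.0 / fs)
rri_even = np.interp(t_even, r_peaks[1:], rri)

f, pxx = welch(rri_even - rri_even.mean(), fs=fs, nperseg=256)
lf_band = (f >= 0.04) & (f < 0.15)
hf_band = (f >= 0.15) & (f < 0.40)
print(rri.mean(), np.trapz(pxx[lf_band], f[lf_band]), np.trapz(pxx[hf_band], f[hf_band]))
```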

[0077] The feature quantity extraction unit 112 calculates the variations among the extracted feature quantities, that are the heart electrical activity H, the skin potential activity G, the motion BM, the activity amount Ex, and the eye movement EM, within the unit duration of the window.

[0078] In step S137, the learning data generation unit 115 generates learning data for arousal and learning data for valence based on the variations calculated in step S134 among the scale data feature quantities and the variations calculated in step S136 among the measurement data and image data feature quantities.

[0079] For example, the learning data generation unit 115 performs multiple regression analysis using the variations among the scale data feature quantities for arousal and for valence as supervisory data, and the variations among the measurement data and image data feature quantities as independent variables, which are primary indicators. The learning data generation unit 115 then generates a regression equation for each of the workers WK1, WK2, and WK3 for arousal and for valence representing the relationship between the change in the emotion of each worker and the changes in the measurement data and image data feature quantities.

[0080] The regression equations corresponding to the i-th window are as follows:

XAi = f(a1Hi, a2Gi, a3EMi, a4BMi, a5Exi), and

XVi = f(a1Hi, a2Gi, a3EMi, a4BMi, a5Exi)     (1)

where XAi is the estimate of the arousal change, XVi is the estimate of the valence change, a1, a2, a3, a4, and a5 are the weighting coefficients for the feature quantities of the measurement data items Hi, Gi, EMi, BMi, and Exi, and f is the sum of the indicators obtained from the feature quantities of the measurement data items Hi, Gi, EMi, BMi, and Exi, which are primary indicators. The weighting coefficients may be determined by using, for example, the weighted average based on the proportions in the population data obtained in the learning stage. Equations (1) are an example of a relationship between the activity and the emotion of a person. In one example, the first learning data (also discussed above) may include, indicate or be based on equations (1) above, representing a relationship between activity and emotion.

[0081] In step S138, the learning data generation unit 115 stores the generated regression equations for arousal and for valence corresponding to the i-th window into the learning data storage 122. In step S139, the learning data generation unit 115 determines whether all the windows Wi have been selected for generating regression equations. When any window remains unselected, the processing returns to step S132, where the unselected window is selected, and the processing in steps S133 to S139 for generating the learning data for emotion estimation is repeated for the next selected window.

[0082] The feature quantity extraction unit 112 and the learning data generation unit 115 change the window unit duration by every predetermined value and the chronological shift of the window by every predetermined amount to determine the optimum window unit duration and the optimum shift. Of all the combinations of the unit durations and the shifts, the learning data generation unit 115 selects a combination that minimizes the difference between the emotion estimates obtained using the regression equations and the emotion information correct values input through the emotion input device 2. The learning data generation unit 115 then sets, for the emotion estimation, the selected window unit duration and the selected shift, as well as the regression equations generated for this combination. The window unit duration and the shift may be fixed.

[0083] An example of the processing of selecting the optimum window will now be described. Fig. 8 is a flowchart showing the procedure and its details.

In step S141, the learning data generation unit 115 calculates the emotion estimates XAi and XVi using the regression equations generated for each window Wi, and calculates the sum of the calculated estimates XAi as X̂A and the sum of the calculated estimates XVi as X̂V. In step S142, the learning data generation unit 115 calculates the differences between the sums of the emotion estimates, X̂A and X̂V, and the sums of the true values, XA and XV, of the emotion information input through the emotion input device 2, in the manner described below.

∑(X̂A - XA) and ∑(X̂V - XV)

The calculation results are stored into the learning data storage 122. To simplify the flowchart, Fig. 8 only shows ∑(X̂A - XA).

[0084] In step S143, the learning data generation unit 115 determines whether changing the window unit duration and the shift has been completed, or in other words, whether regression equations have been generated for all combinations of the window unit durations and the shifts. When this process is incomplete, the processing advances to step S144, in which the unit duration and the shift of the window Wi are changed by the predetermined amount. The processing then returns to step S132 shown in Fig. 7, and the processing in steps S132 to S143 is performed. In this manner, the processing in steps S132 to S144 is repeated until the regression equations have been generated for all the combinations of the window unit durations and the shifts.

[0085] When the regression equations have been generated for all the combinations of the window unit durations and the shifts, the learning data generation unit 115 compares, in step S145, the differences, calculated for all the combinations of the window unit durations and the shifts, between the sums of the emotion information true values, XA and XV, and the sums of the emotion estimates, X̂A and X̂V, which are ∑(X̂A - XA) and ∑(X̂V - XV). The learning data generation unit 115 then selects the combination of the window unit duration and the shift that minimizes the values of ∑(X̂A - XA) and ∑(X̂V - XV).

[0086] In step S146, the learning data generation unit 115 sets the selected combination of the window unit duration and the shift in the feature quantity extraction unit 112. In step S147, the learning data generation unit 115 stores the regression equations corresponding to the selected combination into the learning data storage 122. The process of generating the learning data to be used for emotion estimation ends.
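A minimal sketch of this exhaustive search is given below; fit_and_score is a hypothetical placeholder for the per-combination learning and error computation of steps S132 to S142, and the candidate durations and shifts are invented.

```python
# Sketch: try every combination of window unit duration and shift and keep the one
# whose regression estimates deviate least from the emotion correct values.
import itertools

def fit_and_score(unit_duration: float, shift: float) -> float:
    """Stand-in for steps S132-S142: generate per-window regressions for this
    combination and return the summed |estimate - correct value|."""
    return abs(unit_duration - 900) + abs(shift - 300)   # dummy error surface

candidates = itertools.product([300, 600, 900, 1200],    # unit durations in seconds
                               [150, 300, 600])          # shifts in seconds
best = min(candidates, key=lambda c: fit_and_score(*c))
print("selected (unit duration, shift):", best)
```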

1-2: Generation of Learning Data for Cognition Estimation

[0087] The learning data generation unit 115 generates the learning data to be used for cognition estimation in the manner described below. Fig. 6 is a flowchart showing the procedure and its details.

[0088] More specifically, the motion BM of each of the workers WK1 , WK2, and WK3 indicating hand movement is measured by the triaxial acceleration sensor included in the measurement device 3. The measurement data is then transmitted to the production management apparatus 1 . In parallel with this, the eye movement EM indicating eye movement during operation is captured by the eye movement monitoring camera 4. The captured image data is transmitted to the production management apparatus 1.

[0089] In step S14, the production management apparatus 1 receives, for each of the workers WK1, WK2, and WK3, the measurement data about the motion BM indicating the hand movement transmitted from the measurement device 3 and the image data about the eye movement EM transmitted from the eye movement monitoring camera 4 through the interface unit 13 as controlled by the sensing data obtaining controller 111, and stores the received measurement data and image data into the sensing data storage 121. The measurement data about the motion BM and the image data about the eye movement EM may be the corresponding data obtained during the process of generating the learning data to be used for emotion estimation.

[0090] In the cells C1, C2, and C3 of the production line CS, the results of the operations performed by the workers WK1, WK2, and WK3 are captured by the work monitoring camera CM. The captured image data is transmitted to the production management apparatus 1. In step S15, the production management apparatus 1 receives the image data transmitted from the work monitoring camera CM through the interface unit 13 as controlled by the sensing data obtaining controller 111, and stores the received image data into the sensing data storage 121.

[0091] In step S16, the production management apparatus 1 generates the learning data to be used for cognition estimation as controlled by the feature quantity extraction unit 112 and the learning data generation unit 115 in the manner described below. Fig. 9 is a flowchart showing the procedure and its details.

[0092] In step S161, the production management apparatus 1 selects an operation time period (e.g., one day or one week). In step S162, the feature quantity extraction unit 112 reads the image data indicating the operation results from the sensing data storage 121. In step S163, the feature quantity extraction unit 112 extracts the feature quantities indicating the success or failure in the operation from the read image data indicating the operation results by, for example, pattern recognition (i.e. this is an example of obtaining correct values indicating whether the operation results suggest a correctly performed operation, wherein images taken by a camera are compared to a pattern to establish whether the operation was correctly performed or not). The feature quantities are, for example, represented by the number or incidence of misoperations during the selected time period. The feature quantity extraction unit 112 uses the extracted feature quantities as correct values of the cognition.

[0093] In step S164, the feature quantity extraction unit 112 reads the measurement data obtained by the triaxial acceleration sensor included in the measurement device 3. In step S165, the feature quantity extraction unit 112 extracts the feature quantities indicating the hand movement of the worker from the read measurement data. In parallel with this, the feature quantity extraction unit 112 reads the image data obtained through the eye movement monitoring camera 4 in step S164, and extracts the feature quantities indicating the eye movement of the worker (eye movement EM) from the read image data in step S165. The extracted eye movement EM is represented by, for example, the eye movement speed, the gaze coordinates and the gaze duration, the number of blinks, and changes in the pupil size as described above. The feature quantities of the motion BM and the eye movement EM may be the corresponding feature quantities extracted during the process of generating the learning data to be used for emotion estimation.

[0094] In step S166, the learning data generation unit 115 performs multiple regression analysis with the correct values (supervisory data) being the feature quantities indicating the success or failure in the operation and the variables being the feature quantities indicating the hand movement and the feature quantities indicating the eye movement EM. This generates a regression equation. The learning data generation unit 115 stores the generated regression equation into the learning data storage 122 as learning data to be used for cognition estimation. An example regression equation used for cognition estimation is as follows:

Yi = f(β1EMi, β2BMi)     (2)

where Yi is the estimate of the cognition change, β1 is the weighting coefficient for the feature quantities of the eye movement EMi, β2 is the weighting coefficient for the feature quantities of the motion BMi, and f is the sum of the indicators obtained from the feature quantities of the eye movement EMi and the motion BMi, which are primary indicators. The weighting coefficients may be determined by using, for example, the weighted average based on the proportions in the population data obtained in the learning stage. Equation (2) is an example of a relationship between activity and cognition. In one example, the first learning data (also discussed above) may include, indicate or be based on equation (2) above, indicating in fact a relationship between activity and cognition. In a further example, the first learning data (also discussed above) may include, indicate or be based on equations (1) and (2) above.

[0095] In step S167, the learning data generation unit 115 determines whether all the operation time periods have been selected for generating regression equations. When any operation time period remains unselected, the processing returns to step S161, and the regression equation generation process is repeated. When the regression equations have been generated for all the operation time periods, the learning data generation unit 115 associates, in step S168, the generated regression equations with the information indicating their corresponding operation time periods, and stores the regression equations into the learning data storage 122.

1-3: Generation of Learning Data for Productivity Estimation

[0096] When the learning data for emotion estimation and the learning data for cognition estimation have been generated for each of the workers WK1, WK2, and WK3, the learning data generation unit 115 generates the learning data to be used for productivity estimation in the manner described below.

[0097] More specifically, the learning data generation unit 115 defines the productivity information by using skill level information and misoperation frequency information. The skill level information is represented by, for example, a difference between a standard operation time and an actual operation time. The misoperation frequency information is represented by deviations of the actual operation time from an average operation time.

[0098] The learning data generation unit 115 uses the emotion estimates and the cognition estimates as secondary indicators, and generates a relational expression for estimating the skill level of the worker based on the difference between the current and past secondary indicators. An example of the relationship is described below.

[0099] A skill level Quality-A is expressed using the formula below.

Quality-A = √{(γa1(X2 - x1))²} + √{(γa2(Y2 - y1))²}     (3)

In the formula, x1 is the current emotion estimate, y1 is the current cognition estimate, X2 is the average of past emotion estimates, Y2 is the average of past cognition estimates, γa1 is the weighting coefficient for emotion, and γa2 is the weighting coefficient for cognition.

[0100] The learning data generation unit 115 also uses the emotion estimates and the cognition estimates as secondary indicators, and generates a relational expression for estimating the misoperation frequency of the worker based on the variations among the past and current secondary indicators. An example of the relationship is described below.

[0101] A misoperation frequency Quality-B is expressed using the formula below.

Quality-B = γb1√{((X1 - x1) / ∑(X - xi))²} + γb2√{((Y1 - y1) / ∑(Y - yi))²}     (4)

In the formula, x1 is the current emotion estimate, y1 is the current cognition estimate, X1 is the average of historical emotion estimates, Y1 is the average of historical cognition estimates, γb1 is the weighting coefficient for emotion, and γb2 is the weighting coefficient for cognition.

[0102] The weighting coefficients γa1, γa2, γb1, and γb2 may be determined for each of the workers WK1, WK2, and WK3 by using, for example, multiple regression analysis or questionnaires to the workers WK1, WK2, and WK3. In one example, each of equations (3) and (4), or both, indicates a relationship between the performance, and the emotion and the cognition. In a further example, the second learning data (also discussed above) may include, indicate or be based on equation (3) and/or (4) above, indicating in fact a relationship between the performance, and the emotion and the cognition.
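A direct transcription of relational expressions (3) and (4) is sketched below; the coefficient values are invented, and the denominator sums in (4) are read here as the summed absolute deviations of the historical estimates from their average, which is an interpretation and not stated in the text.

```python
# Sketch of the relational expressions (3) and (4) for productivity estimation.
import numpy as np

def quality_a(x1, y1, x2_avg, y2_avg, ga1=0.6, ga2=0.4):
    """Skill level per eq. (3): weighted gaps between current estimates and past averages."""
    return np.sqrt((ga1 * (x2_avg - x1)) ** 2) + np.sqrt((ga2 * (y2_avg - y1)) ** 2)

def quality_b(x1, y1, x_hist, y_hist, gb1=0.5, gb2=0.5):
    """Misoperation frequency per eq. (4); the denominators are taken (as an assumption)
    to be the summed absolute deviations of the historical estimates from their average."""
    x_hist, y_hist = np.asarray(x_hist), np.asarray(y_hist)
    x_avg, y_avg = x_hist.mean(), y_hist.mean()
    x_spread = np.sum(np.abs(x_avg - x_hist)) or 1.0   # guard against zero spread
    y_spread = np.sum(np.abs(y_avg - y_hist)) or 1.0
    return (gb1 * np.sqrt(((x_avg - x1) / x_spread) ** 2)
            + gb2 * np.sqrt(((y_avg - y1) / y_spread) ** 2))

x_hist, y_hist = [18.0, 25.0, 22.0, 30.0], [0.7, 0.8, 0.75, 0.9]   # past estimates
print(quality_a(x1=12.0, y1=0.6, x2_avg=np.mean(x_hist), y2_avg=np.mean(y_hist)))
print(quality_b(x1=12.0, y1=0.6, x_hist=x_hist, y_hist=y_hist))
```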

(2) Productivity Estimation

[0103] After the learning data for productivity estimation is generated, the production management apparatus 1 uses the learning data to estimate the productivity of the workers WK1, WK2, and WK3 during operation in the manner described below. Fig. 11 is a flowchart showing the estimation process and its details.

2-1: Collecting Worker's Sensing Data

[0104] When detecting an input operation start command in step S21, the production management apparatus 1 specifies an initial part feed rate in the part feeder controller DC in accordance with the preliminarily input information specifying the production amount (e.g., 100 products/day) in step S22. The part feeder controller DC then instructs the part feeder DS to feed the sets of parts for the products to be manufactured to the production line CS at the specified rate. In response to the fed sets of parts, the workers WK1, WK2, and WK3 in their assigned cells start their operations for assembling products.

[0105] During the operation, the measurement device 3 in each of the input and measurement devices SS1 , SS2, and SS3 of the workers WK1 , WK2, and WK3 measures the heart electrical activity H, the skin potential activity G, the motion BM, and the activity amount Ex of the worker at predetermined time intervals or at selected timing. The measurement data is transmitted to the production management apparatus 1 . The eye movement EM of each of the workers WK1 , WK2, and WK3 is also captured by the eye movement monitoring camera 4. The captured image data is transmitted to the production management apparatus 1 .

[0106] In step S23, the production management apparatus 1 receives the measurement data and the image data transmitted from the input and measurement devices SS1, SS2, and SS3 through the interface unit 13 as controlled by the sensing data obtaining controller 111. The production management apparatus 1 stores the received data into the sensing data storage 121.

2-2: Estimating Worker's Emotion

[0107] When determining that a predetermined time (e.g., one hour) has passed in step S24, the production management apparatus 1 selects one of the workers WK1, WK2, and WK3 in step S25. The feature quantity extraction unit 112 then reads the measurement data and the image data associated with the selected worker from the sensing data storage 121, and extracts the feature quantities from both the measurement data and the image data.

[0108] For example, the feature quantity extraction unit 112 extracts the feature quantities of the heart electrical activity Hi, the skin potential activity Gi, the motion BMi, the activity amount Exi, and the eye movement EMi, which are correlated with emotional changes, from the measurement data for the heart electrical activity H, the skin potential activity G, the motion BM, and the activity amount Ex and from the image data for the eye movement EM. In parallel with this, the feature quantity extraction unit 112 extracts the feature quantities correlated with cognition changes from the motion BM measurement data and the eye movement EM image data. The extracted feature quantities are the same as those extracted in the learning data generation process described above, and will not be described in detail.

[0109] In step S26, the production management apparatus 1 estimates emotional changes in the worker as controlled by the productivity estimation unit 113. Fig. 12 is a flowchart showing the procedure and its details.

[0110] In step S261, the productivity estimation unit 113 receives the feature quantities to be used for emotion estimation from the feature quantity extraction unit 112. In step S262, the productivity estimation unit 113 reads, from the learning data storage 122, the regression equations (1) for emotion estimation for arousal and for valence corresponding to the predetermined time period described above. In step S263, the productivity estimation unit 113 calculates the estimates of emotional changes XAi and XVi for the worker in the predetermined time period described above using the feature quantities to be used for the emotion estimation and the regression equations for arousal and for valence.

2-3: Estimating Worker's Cognition

[0111] The feature quantity extraction unit 112 included in the production management apparatus 1 extracts the feature quantities correlated with cognition from each of the motion BMi measurement data and the eye movement EMi image data obtained during the predetermined time described above.

[0112] In step S27, the production management apparatus 1 estimates the cognition of the worker as controlled by the productivity estimation unit 113. Fig. 13 is a flowchart showing the procedure and its details.

[0113] In step S271, the productivity estimation unit 113 receives, from the feature quantity extraction unit 112, the feature quantities of the eye movement EMi and the motion BMi to be used for cognition estimation corresponding to the predetermined time period described above. In step S272, the productivity estimation unit 113 reads, from the learning data storage 122, the regression equation (2) for cognition estimation corresponding to the predetermined time period described above. In step S273, the productivity estimation unit 113 calculates the cognition estimate Yi for the worker using the feature quantities of the eye movement EMi and the motion BMi to be used for the cognition estimation and the regression equation for the cognition estimation.
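The cognition estimate Yi can be computed in the same way; the sketch below again assumes a linear regression equation (2) with coefficients read from the learning data storage, and uses the same hypothetical model representation as above.

def estimate_cognition(em_features, bm_features, cognition_model):
    """Calculate the cognition estimate Yi from eye movement and motion features."""
    combined = {**em_features, **bm_features}
    return cognition_model["intercept"] + sum(
        weight * combined[name]
        for name, weight in cognition_model["coefficients"].items()
    )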

2-4: Productivity Estimation

[0114] In step S28, the production management apparatus 1 estimates the productivity of the worker in the manner described below using the calculated emotional change estimates and the cognition estimates, and the relational expressions (3) and (4) for productivity estimation stored in the learning data storage 122, as controlled by the productivity estimation unit 113.

[0115] In step S281 shown in Fig. 14, the production management apparatus 1 first calculates the difference between the standard operation time and the actual operation time using the relational expression (3), and outputs the calculated difference in operation time as information indicating the skill level Quality-A of the worker. In step S282, the production management apparatus 1 calculates the deviations of the actual operation time from the average operation time using the relational expression (4), and outputs the calculated values as information indicating the misoperation frequency Quality-B of the worker. The skill level Quality-A and the misoperation frequency Quality-B are estimates of the productivity of the worker.
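A hedged sketch of steps S281 and S282: the relational expressions (3) and (4) are assumed here to be linear mappings from the secondary indicators (xAi, xVi, Yi) to the operation-time quantities; their actual form is fixed when the learning data is generated, and the key names are illustrative.

def estimate_productivity(xA, xV, Y, expression_3, expression_4):
    """Return the skill level Quality-A and the misoperation frequency Quality-B."""
    def apply(expr):
        # expr = {"intercept": c0, "arousal": cA, "valence": cV, "cognition": cY}
        return (expr["intercept"] + expr["arousal"] * xA
                + expr["valence"] * xV + expr["cognition"] * Y)
    quality_a = apply(expression_3)  # difference between standard and actual operation time
    quality_b = apply(expression_4)  # deviation of actual operation time from the average
    return quality_a, quality_b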

(3) Controlling Production Line Based on Worker Productivity Estimates

[0116] When obtaining the productivity estimates, the production management apparatus 1 controls the production line CS based on the worker productivity estimates in step S29, as controlled by the line controller 114 in the manner described below by way of non-limiting example.

[0117] In step S291 shown in Fig. 14, the line controller 114 compares the calculated difference in operation time (skill level Quality-A) with a predetermined first threshold. When the comparison shows that the difference in operation time (skill level Quality-A) is larger than the first threshold, the line controller 114 transmits a speed change command to the part feeder controller DC in step S292 to reduce the rate of feeding parts to the production line CS (or a command to change the operating speed of any other component of the line, such as a tooling machine). The part feeder controller DC accordingly lowers the rate of feeding parts from the part feeder DS to the line. In this manner, the speed of the production line CS is adjusted in accordance with the productivity of the workers. This adjustment gives the workers enough time and prevents quality deterioration.

[0118] When the comparison in step S291 shows that the difference in operation time is equal to or smaller than the first threshold, the line controller 114 compares the calculated deviations in operation time (misoperation frequency Quality-B) with a predetermined second threshold in step S293. When the comparison shows that the deviations in operation time (misoperation frequency Quality-B) are larger than the second threshold, the line controller 114 determines that the productivity of the worker is unstable. The line controller 114 then generates message information about a misoperation alert in step S294, and transmits the message information to the portable information terminal TM held by the leader WR. The message information may also be provided directly to the worker(s), for example to a terminal of the worker, to a display visible to the worker on the line, or to an audio reproducing apparatus that reproduces the message or provides other types of stimuli. The leader WR receiving the message information checks the physical and mental health of the worker and, for example, replaces the worker or instructs the worker to rest.
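The two-threshold control of steps S291 to S294 may be summarized as follows; the threshold values, the feeder-controller interface, and the messaging call are illustrative assumptions rather than elements disclosed in the embodiment.

def control_line(quality_a, quality_b, part_feeder_controller, leader_terminal,
                 first_threshold, second_threshold):
    if quality_a > first_threshold:
        # Step S292: slow the part feed (or the speed of another line component).
        part_feeder_controller.send_speed_change(rate_factor=0.8)
    elif quality_b > second_threshold:
        # Steps S293/S294: the worker's productivity is unstable, alert the leader WR.
        leader_terminal.send_message("Misoperation alert: check the worker's condition.")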

[0119] The production management apparatus 1 evaluates the estimates of the productivity of each worker using a plurality of thresholds, and controls the operation of the production line based on the evaluation results. In other words, the production management apparatus 1 stratifies (or standardizes) the operation of the production line in accordance with the estimates of the productivity of the workers. This makes it easier to control the operation of the production line in response to a decrease in worker productivity while maintaining the intended overall productivity.

[0120] When the production management apparatus 1 completes the processing from the emotion estimation to the line control for one worker, the production management apparatus 1 determines, in step S30, whether all the workers have been selected for the processing. When any worker remains unselected, the processing returns to step S25, in which an unselected worker is selected, and the processing in steps S25 to S29 is repeated for the newly selected worker.

[0121] When the processing has been completed for all the workers WK1, WK2, and WK3, the production management apparatus 1 determines whether it has reached the closing time for the production line CS in step S31. At the closing time, the production management apparatus 1 stops the production line CS in step S32.

[0122] When the line control is performed, the line controller 114 generates information indicating the date and time and the details of the line control, and stores the information associated with the worker ID into the control history storage 123. The line control history stored in the control history storage 123 is used, for example, for production management, such as changing the number of products to be manufactured.

Thus, a unit (e.g. a controller) within or coupled to the manufacturing line controls the functioning of at least a component of the manufacturing line (any of the machines or apparatuses included in the line, e.g. a part feeder, a part feeder controller, a tooling machine, etc.) based on an estimation result of the performance of the worker in the operation obtained by the second estimation unit.

Advantageous Effects of Embodiment

[0123] As described in detail in the above embodiment, vital sign measurement data and motion measurement data obtained from the workers WK1, WK2, and WK3 during operation are used as primary indicators. The primary indicators and the learning data generated separately are used to estimate the emotion and the cognition of the worker. The estimated emotion and cognition are used as secondary indicators. The secondary indicators and the relational expressions generated separately are used to estimate the productivity of the worker.

[0124] The productivity of a worker can thus be estimated based on vital sign measurement data and motion measurement data obtained from the worker during operation. This enables production management in accordance with human performance corresponding to emotion and cognition, an approach that has not been available before.

[0125] The measurement data about the heart electrical activity H, the skin potential activity G, the eye movement EM, the motion BM, and the activity amount Ex is used as vital sign measurement data and motion measurement data from the workers WK1, WK2, and WK3. Thus, the data used as primary indicators is obtained from each worker in a noninvasive manner.

[0126] Emotional changes are expressed as arousal and valence variations and the quadrants of the two-dimensional arousal-valence coordinate system. This allows the emotional changes to be estimated easily and accurately.

[0127] The learning data for cognition estimation is generated using, as correct values (supervisory data), the feature quantities indicating the success or failure of the operation extracted from the image data obtained by the work monitoring camera CM, and using, as variables, the feature quantities indicating hand movement and the feature quantities indicating eye movement EM. This allows the worker's cognition about the production operation to be estimated more accurately.

[0128] In one example, a worker is connecting parts. The image data about the operation results is as shown in Fig. 10. In this example, the operation ends with a terminal 53 and a terminal 63 unsuccessfully connected using a lead 73, and a terminal 58 and a terminal 68 left unconnected. In the present embodiment, the supervisory data indicating the worker's cognition is the feature quantities indicating the success or failure of the operation, and the variables are the primary indicators associated with the worker's cognition obtained in parallel within the same time period, in other words, the feature quantities indicating the hand movement of the worker and the feature quantities indicating the eye movement EM. The supervisory data and the variables are used to generate a relational expression for estimating the cognition. With measurement data including the feature quantities indicating hand movement and eye movement, estimating the worker's cognition using this relational expression makes it possible to estimate the likelihood of a misoperation by the worker such as the one shown in Fig. 10.

[0129] The information indicating the productivity of the worker is defined by the skill level represented by a difference between a standard operation time and an actual operation time, and the misoperation frequency represented by deviations of the actual operation time from an average operation time. The worker productivity is estimated with learning data prepared for both the skill level and the misoperation frequency. This allows the productivity of the worker to be accurately estimated in accordance with the assessment indicator at a production site.

Other Embodiments

[0130] The relationship between human emotions and vital signs, or the relationship between human emotions and motion information, may change depending on the date, the day of the week, the season, environmental changes, and other factors. The learning data to be used for emotion estimation may thus be updated regularly or as appropriate. When the difference calculated between a correct value of an emotion and an estimate of the emotion obtained by the productivity estimation unit 113 exceeds a predetermined range of correct values, the learning data stored in the learning data storage 122 may be updated. In this case, the correct value can be estimated based on the trends in the emotion estimates. In another embodiment, the correct value of the emotion may be input regularly by the subject through the emotion input device 2, and the input value may be used.
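A sketch of the update rule described above; the allowable range, the sample buffer, and the refit callback are assumptions introduced only for illustration.

def maybe_update_learning_data(correct_value, estimate, samples, allowed_range, refit):
    """Re-generate the regression model when the estimation error grows too large."""
    if abs(correct_value - estimate) > allowed_range:
        return refit(samples)   # e.g. re-run the regression analysis on the stored samples
    return None                 # keep the learning data currently stored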

[0131] Similarly, when the difference calculated between the correct value of cognition and the estimate of the cognition obtained by the productivity estimation unit 113 exceeds a predetermined range of correct values, the learning data stored in the learning data storage 122 may be updated. In this case, the correct value can be estimated based on the trends in the cognition estimates.

[0132] The relational expression representing the relationship between the productivity and the emotion and cognition may also be modified based on the productivity estimate. In this case as well, the correct value can be estimated based on the trends in the cognition estimates.

[0133] In the embodiment described above, the information indicating the emotion of the worker is input into the production management apparatus 1 through the emotion input device 2, which is a smartphone or a tablet terminal. The information may be input in any other manner. For example, the worker may write his or her emotion information on print media such as a questionnaire form, and may use a scanner to read the emotion information and input the information into the production management apparatus 1.

[0134] Further, a camera may be used to detect the facial expression of the worker. The information about the detected facial expression may then be input into the production management apparatus 1 as emotion information. A microphone may be used to detect the worker's voice. The detection information may then be input into the production management apparatus 1 as emotion information. Emotion information may be collected from a large number of unspecified individuals by using questionnaires, and the average or other representative values of the collected information may be used as population data to correct the emotion information from an individual. Any other technique may be used to input the information indicating human emotions into the production management apparatus 1.

[0135] The above embodiment describes the two-dimensional arousal-valence system for expressing the information about the worker's emotion. Another method may be used to express the worker's emotion information.

[0136] In the embodiment described above, the measurement data items, namely, the heart electrical activity H, the skin potential activity G, the eye movement EM, the motion BM, and the activity amount Ex, are input into the production management apparatus 1 as information indicating the activity of the worker, and all these items are used to estimate the emotions. However, any one or more items of the measurement data may be used to estimate the emotions. For example, the measurement data about the heart electrical activity H, which contributes more to emotion estimation than the other vital signs, may be used alone. Vital signs other than the items used in the embodiment may also be used.

[0137] Additionally, measurement data other than the hand movement and the eye movement may also be used as a primary indicator to estimate the cognition.

[0138] In addition, the number of cells in the production line CS and the types of products assembled in each cell may also be modified in various ways without departing from the scope and spirit of the invention.

[0139] The examples described above include a production line involving an operation performed by a worker. In this production line, the performance of the worker in the operation is estimated based on the worker's emotion and cognition. However, the present invention is not limited to such examples, but is applicable to any system involving a human operation.

[0140] For example, the present invention is applicable to cars, ships, airplanes, or other vehicles operated by a driver or a navigator corresponding to a worker, with the performance of the operation being estimated based on the emotion and the cognition of the driver or navigator. In this case, the cognition of the driver or navigator can be measured based on the degree to which he or she recognizes an obstacle visible in the outside view. The present invention is also applicable to construction machinery, power generation equipment, electrical transformers, or medical devices operated by an operator corresponding to a worker, as well as to control systems for various plants, airplanes, or trains. By way of example, further embodiments are described below.

EMBODIMENT 2

In embodiment 1, a work management apparatus has been presented, which is suitable for managing an operation performed by a worker in a system including at least partially a manufacturing line, and/or for estimating the performance of a worker in executing an operation within a manufacturing line, and/or for controlling at least a part of a manufacturing line. The present embodiment 2 is directed to a drive assisting apparatus for providing vehicle driving assistance, wherein driving assistance is provided, while the driver is driving the vehicle, based on the estimation result of the performance of the driver. The estimation result of the performance of the driver can be obtained as described in embodiment 1, and for instance as represented in Fig. 4 (wherein, in the case of the present embodiment, the productivity estimation unit 113 is substituted by a driving performance estimation unit 113, and the line controller 114 by a controller for providing driving assistance; the same sensors or devices SS1 to SS3 can be used, when conveniently installed in view of the driver position, etc.).

As an example, in the present embodiment, the correct values used for cognition estimation may represent how correctly the driving task is executed, which can be obtained e.g. by measuring certain driving parameters, such as how closely the vehicle follows certain predetermined routes (e.g. comparing how closely the actual driving route corresponds to an ideal route obtained from a map), how smooth the control of the vehicle is (e.g. whether or how often any sudden change of direction occurs), the degree to which the driver recognizes an obstacle, etc. Suitable sensors could be provided (as represented by CM in Fig. 4), including for instance positioning measurement systems, cameras for recognizing driving paths or patterns, vehicle speed sensors, vehicle inertial systems for obtaining information on current driving parameters, etc. The performance values of a driver (in the sense of performance in executing driving, to be used for obtaining learning data by way of regression analysis) can e.g. be obtained by comparing the distance covered in a certain period with an expected distance for that period, or by checking whether a certain route has been followed between two points compared to the predetermined available routes, etc.

The controller providing assistance may provide driving assistance to either one or both of the driver and the vehicle. For example, driving assistance may include active control of the vehicle by an assisting unit during driving: in one example, if the estimated performance is found to correspond to a given performance value, the control unit (or any other unit suitable for automatically or semi-automatically driving the vehicle) may act on components of the vehicle, such as the brakes or the accelerator, to adapt the speed of the vehicle to the current performance of the driver; in case the estimated performance is determined to have another predetermined value, indicating for instance a performance associated with a potentially hazardous situation, the control unit may act on the brakes and/or on the steering wheel to take over control (e.g. an automatic pilot) or to stop the vehicle. Preferably, the driving assistance may include providing the driver of the vehicle with at least a feedback during driving depending on the estimated performance level.
For instance, the feedback may include a message to the driver suggesting that the driver make a stop and take a rest. Another example of driving assistance (or driving assistance feedback) is represented by a sound, melody, music, or audio message in general; in this way, the driver may be alerted so that a hazardous situation is avoided, and alerted in a way that is appropriate to the estimated performance level. Other types of driving assistance feedback are of course suitable. The controller providing driving assistance may be installed in the vehicle. However, the calculation or determination of the driving assistance based on the estimated result may equally be performed within the vehicle or outside the vehicle; in the latter case, the calculated driving assistance is communicated to the control unit within the vehicle, which provides the (externally calculated) driving assistance to other parts of the vehicle and/or to the driver. Reference is also made to embodiment 1 (and the corresponding figures), illustrating details that are equally and optionally applicable to the present embodiment.
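A minimal sketch of the assistance logic of this embodiment; the performance thresholds and the vehicle/HMI interface names are illustrative assumptions, not elements disclosed in the embodiment.

def provide_driving_assistance(estimated_performance, vehicle, driver_hmi,
                               caution_level=0.6, hazard_level=0.3):
    if estimated_performance < hazard_level:
        # Performance associated with a potentially hazardous situation:
        # take over control (e.g. automatic pilot) or stop the vehicle.
        vehicle.take_over_or_stop()
    elif estimated_performance < caution_level:
        # Reduced performance: adapt the vehicle speed and give feedback to the driver.
        vehicle.limit_speed(factor=0.8)
        driver_hmi.play_message("Please consider making a stop and taking a rest.")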

EMBODIMENT 3

Present embodiment 3 is directed to an apparatus for healthcare support of a subject, wherein the apparatus is preferably coupled to the subject. By coupled to the subject it is meant that the apparatus is within range of interaction with the subject, e.g. capable of making measurements on the subject, and/or providing feedback or stimuli to the subject, and/or receiving inputs (e.g. commands) from the subject and providing outputs to the subject. The healthcare support apparatus includes a controller for providing the subject with a healthcare support feedback based on an estimated performance of the subject. The estimated performance refers to the performance in executing an operation by the person. Preferably, the operation includes an operation of a device by the person; the operation may, however, also include a physical or intellectual exercise of the subject. Thus, the operation refers to an action executed by the subject. The estimated performance may be an estimation result of the performance (of the subject when executing the operation), the result being obtained by a performance estimation unit, represented for instance by the second estimation unit illustrated above. More particularly, the estimation result of the performance of the subject can be obtained as described in embodiment 1, and for instance as represented in Fig. 4 (wherein, in the case of the present embodiment, the productivity estimation unit 113 is substituted by a performance estimation unit 113, and the line controller 114 by a controller for providing healthcare support feedback; the same sensors or devices SS1 to SS3 can be used, when conveniently installed with regard to the subject, and preferably with regard to one or more types of operation/action executed by the subject).

As an example, in the present embodiment, the correct values for cognition estimation may be obtained by measuring how one or more tasks (i.e. operations or actions) are executed by the subject: for instance, how straight and balanced the person's body position is when walking, running, or sitting (e.g. compared with predetermined patterns); how smoothly certain movements are made compared with predetermined patterns; etc. This can be obtained for instance by comparing an image (obtained e.g. via the camera CM) with a predetermined pattern, or by making other suitable measurements and comparing them with predetermined values and/or patterns of values. The performance values of the person (to be used for obtaining learning data by way of regression analysis) can e.g. be obtained by measuring efficiency and/or quality in completing a certain task (i.e. the operation or action explained above) or number of tasks, such as measuring the distance covered on foot against an expected distance, or measuring the time taken to accomplish a task against a predetermined time (e.g. completing a housecleaning or hobby-related operation, or the number of such operations performed in an hour or a day), etc.

The healthcare support feedback (in short, feedback) may be represented for instance by one or more messages (in the form of text, audio, and/or video, etc.) suggesting certain activities to undertake or a lifestyle to follow, or by one or more stimulus signals induced on the subject (for instance, an audio/video signal to induce stimulation of the subject, and/or an electric signal inducing stimulation of the subject, etc.). Other types of feedback are of course suitable. Since the performance can be accurately estimated, healthcare feedback can be accurately provided, for instance when it is really needed (e.g. upon reaching a predetermined performance value, which can herein be accurately estimated), or chosen depending on the estimated performance; for instance, if the performance decreases, a particular feedback can be chosen to prompt an improvement of the health condition; when the performance increases, another type of feedback may be given to maintain the same level of performance and to prompt maintenance of good health conditions also in the long term. In this way, it is possible to improve the health condition of a person, or to maintain a (e.g. good) health condition. Also, even when the controller is omitted, it is possible to more accurately estimate the performance of the person, which is an index of the health condition of the person. Thus, the apparatus also allows a better and more accurate monitoring of the health condition of a person. Reference is also made to embodiment 1 (and the corresponding figures), illustrating details that are equally and optionally applicable to the present embodiment.
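A short sketch of how the feedback might be chosen depending on the estimated performance; the comparison against a previous estimate and the message texts are assumptions for illustration only.

def select_healthcare_feedback(current_performance, previous_performance):
    if current_performance < previous_performance:
        # Performance is decreasing: prompt an improvement of the health condition.
        return "Your recent performance has dropped; consider more rest and light exercise."
    # Performance is stable or improving: prompt maintenance of the current condition.
    return "Performance is stable or improving; keep up your current routine."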

OTHER EMBODIMENTS

In general, a performance estimation apparatus is provided comprising a first estimation unit capable of estimating emotion and cognition on the basis of the first learning data discussed above and of information obtained (e.g. measured) about the activity of a person while the person executes a certain action or task. The apparatus also includes a second estimation unit for estimating the performance of the subject (in executing the action) based on second learning data and on the estimated cognition and emotion. In this way, the performance estimation apparatus can accurately monitor the performance of the subject when executing a certain action or task. Embodiments 1 to 3 provide examples of the action/task, including an operation on a system like a production line, operation of a vehicle, or performing a task when using a healthcare supporting device, though any other type of task is also included in the present embodiment, in particular e.g. when the subject interacts with any device.

Optionally, a controller can be included in the performance estimation apparatus, wherein the controller provides an intervention to the person and/or to the device with which the person is interacting (or to the system including the device). The intervention is based on the performance estimation result. The intervention includes an intervention acting on the user (such as providing stimuli, messages, or guidance) and/or an intervention acting on the device with which the user is interacting (or on the system including the device). In this way, the interaction between the person and the device can be improved, since the intervention can be provided at the correct time depending on the accurately estimated performance, or the type of intervention can be accurately chosen depending on the accurately estimated performance. Thus, the interaction between machine and person can be improved in an objective and repeatable way, autonomously by the apparatus.

[0141] The present invention is not limited to the embodiment described above, but may be embodied with the components modified without departing from the scope and spirit of the invention in its implementation. An appropriate combination of the components described in the embodiment may constitute various aspects of the invention. For example, some of the components described in the embodiment may be eliminated. Further, components from different embodiments may be combined as appropriate. Also, even if certain features have been described only with reference to a device, the same features can also be described in terms of a method (e.g. a method according to which the device operates) or of a program (for programming a computer so as to function like the described apparatus features). Similarly, even if a certain feature is described only with reference to a method, the same feature can also be described in terms of a unit or device means (or of computer program instructions) configured to perform the same described method feature.

Further, in the methods described above and below, steps such as obtaining, estimating, controlling, providing driving assistance, providing healthcare support feedback, etc. are defined. It is, however, noted that such steps (or any combination of them) may also be caused or induced by a remote device, for instance by a client computer or a portable terminal, on another device (for instance a server, localized or distributed) that correspondingly performs the actual step. Thus, the mentioned steps are to be understood also as causing to obtain, causing to estimate, causing to control, causing to provide driving assistance, causing to provide healthcare support feedback, etc., such that any combination of them can be caused or induced by a device remote from the device actually performing the respective step.

[0142] The above embodiments may be partially or entirely expressed in, but not limited to, the following forms.

Appendix 1:

A work management apparatus for managing an operation performed by a worker in a system involving the operation performed by the worker, the apparatus comprising at least one hardware processor and a memory,

the memory including

a first storage configured to store first learning data indicating a relationship between an activity and emotion of the worker and a relationship between the activity and cognition of the worker, and

a second storage configured to store second learning data indicating a relationship between the emotion of the worker, the cognition of the worker, and performance of the worker in the operation; and

the at least one hardware processor being configured to

obtain information indicating the activity of the worker during the operation,

estimate the emotion and the cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and the first learning data indicating the relationship between the activity and the emotion of the worker and the relationship between the activity and the cognition of the worker, and

estimate the performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and the second learning data indicating the relationship between the emotion, the cognition, and the performance of the worker in the operation.

[0143]

Appendix 2:

A work management method that is implemented by an apparatus including at least one hardware processor and a memory, the method comprising:

the at least one hardware processor obtaining information indicating an activity of a worker during an operation performed by the worker;

the at least one hardware processor estimating emotion and cognition of the worker during the operation based on the obtained information indicating the activity used as a primary indicator, and first learning data indicating a relationship between the activity and the emotion of the worker and a relationship between the activity and the cognition of the worker; and

the at least one hardware processor estimating performance of the worker in the operation based on the estimated emotion and cognition each used as a secondary indicator, and second learning data indicating a relationship between the performance of the worker in the operation, the emotion of the worker, and the cognition of the worker.

REFERENCE SIGNS LIST

[0144] CS production line

B1, B2, B3 product

C1, C2, C3 cell

WR leader

WK1, WK2, WK3 worker

M01, M02, M03 monitor

TM portable information terminal

DC part feeder controller

DS part feeder

RB cooperative robot

CM work monitoring camera

NW network

SS1, SS2, SS3 input and measurement device

1 production management apparatus

2 emotion input device

3 measurement device

4 eye movement monitoring camera

11 control unit

111 sensing data obtaining controller

112 feature quantity extraction unit

113 productivity estimation unit

114 line controller

115 learning data generation unit

12 storage unit

121 sensing data storage

122 learning data storage

123 control history storage

13 interface unit