GB2623553A - Computer-implemented method for generating personal data on a condition of a person - Google Patents

Computer-implemented method for generating personal data on a condition of a person Download PDF

Info

Publication number
GB2623553A
Authority
GB
United Kingdom
Prior art keywords
person
feature
personal
neural network
perceptual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2215484.3A
Other versions
GB202215484D0 (en)
Inventor
Dhariwal Shashank
Singh Nahar
Uniyal Asmita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive Technologies GmbH
Original Assignee
Continental Automotive Technologies GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Technologies GmbH filed Critical Continental Automotive Technologies GmbH
Priority to GB2215484.3A priority Critical patent/GB2623553A/en
Publication of GB202215484D0 publication Critical patent/GB202215484D0/en
Publication of GB2623553A publication Critical patent/GB2623553A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398Performance of employee with respect to a job function
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/04Manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Biophysics (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Social Psychology (AREA)
  • Public Health (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Animal Behavior & Ethology (AREA)
  • Game Theory and Decision Science (AREA)
  • Educational Technology (AREA)
  • Surgery (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Developmental Disabilities (AREA)
  • Child & Adolescent Psychology (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)

Abstract

A computer-implemented method for generating personal data on a condition of a person. The method comprises feeding monitoring data 14 to a perception layer for determining at least one perceptual feature 58,68,72,76,80,84,92,96,100; the perception layer includes at least one perceptual feature neural network 56,66,70,74,78,82,90,94,98 trained to determine the perceptual feature. A behaviour change 22 of the monitored person is detected based on the perceptual feature, and personal data on a condition of the monitored person is generated by an evaluating machine learning model 24. The evaluating machine learning model 24 evaluates a personal distraction level 120 and/or personal cognitive load level 122. The personal data is analysed with regard to logged personal data 124, and an output message 26 is generated in response to the analysing. The method may provide a mechanism to check whether an operator is in an optimal condition to work or not.

Description

DESCRIPTION
Computer-implemented method for generating personal data on a condition of a person
TECHNICAL FIELD
The invention relates to a computer-implemented method for generating personal data on a condition of a person. The invention further relates to a data processing device, a computer program and a computer-readable data carrier.
BACKGROUND
Manufacturing plants take various measures for quality assurance (QA). QA processes may, for example, rely on decisions of a certified human expert (operator) inspecting, for instance visually, a product regarding its quality. The operator can reduce the number of faulty parts. However, the operator should be in an optimum condition while evaluating the quality of the product. Therefore, plant managers may be instructed to evaluate a performance of the operator and regular trainings and breaks for operators may be scheduled.
Nevertheless, after a QA process, there can still remain a risk of the occurrence of defective products. Once a faulty product or part has passed a test, the quality cannot be restored, thereby impacting overall product quality. Thus, quality deterioration may occur.
Emotion state monitoring has been used in decision making, recruitment, evaluation and education, see for example AU 2007 327 315 A1. Methods such as distraction detection, emotion recognition, activity recognition, mental fatigue recognition, etc. may be used for operator monitoring (EP 3 462 403 A1), for example in healthcare.
SUMMARY OF THE INVENTION
The object of the invention is to improve the quality of products manufactured at a site and to perform a quality assurance process in a more efficient and redundant way.
To achieve this object, the invention provides a computer-implemented method for generating personal data on a condition of a person. A data processing device, a computer program and a computer-readable data carrier are subject-matter of the parallel claims.
Advantageous embodiments of the invention are subject-matter of the dependent claims. In one aspect, the invention provides a computer-implemented method for generating personal data on a condition of a person, the method comprising: a) Monitoring a person at a site by means of at least one monitoring device in order to generate monitoring data indicative for a behaviour of the monitored person; b) Feeding the monitoring data to a perception layer for determining at least one perceptual feature, the perception layer including at least one perceptual feature neural network trained to determine the at least one perceptual feature; c) Detecting a behaviour change of the monitored person based on the at least one perceptual feature and, if a behaviour change of the monitored person is detected: d) Generating personal data on a condition of the monitored person by means of an evaluating machine learning model using as input the monitoring data and/or the at least one perceptual feature, the evaluating machine learning model evaluating a personal distraction level and/or personal cognitive load level; e) Analyzing the personal data with regard to logged personal data; and f) Generating an output message in response to the analyzing.
Preferably, the method further comprises one, several or all of the following: g) Monitoring one or more environmental parameters at the site and determining, by the perception layer, as perceptual feature a thermal comfort feature of the monitored person and/or an illumination feature; h) Monitoring one or more physiological parameters of the person and determining, by the at least one perceptual feature neural network, as perceptual feature a mental fatigue feature of the monitored person; i) Monitoring the person at the site and determining, by the at least one perceptual feature neural network, as perceptual feature a personal identity feature of the monitored person; and j) Monitoring an object which the monitored person is inspecting at the site and determining, by the at least one perceptual feature neural network, as perceptual feature an object identity feature.
Preferably, the monitoring device is configured as one, several or all of the following: an image or video capturing device, a microphone, a physiological parameter sensor, an environmental parameter sensor, a temperature sensor, and a light or brightness sensor.
Preferably, the at least one perceptual feature neural network is trained to perform one, several or all of the following: k1) recognizing a face of a person; k2) recognizing an object a person is inspecting; k3) recognizing a generic emotion of a person; k4) recognizing an activity of a person; k5) estimating a pose of a person; k6) estimating a gaze of a person; k7) estimating a mental fatigue of a person; k8) detecting a drowsiness of a person; k9) tracking a head of a person; and k10) recognizing a spontaneous emotion of a person.
Preferably, step c) comprises one or both of the following: c1) weighting a plurality of perceptual features with perceptual feature weight factors in order to create a cumulative behaviour change level; and c2) comparing the cumulative behaviour change level with a cumulative behaviour change level threshold and detecting the behaviour change of the monitored person based on the comparison.
Preferably, the evaluating machine learning model comprises one, several or all of the following: d1) a machine learning algorithm; d2) a random forest algorithm; d3) a gradient boosting algorithm; d4) a regression algorithm; and d5) an evaluating neural network.
Preferably, the method further comprises one, several or all of the following: l) Logging the personal data on the condition and/or the perceptual features in order to generate the logged personal data; m) Logging times and/or durations of the detected behaviour change; and n) Logging times and/or durations of a presence of the monitored person at and/or of an absence of the monitored person from the site.
Preferably, step e) comprises: e1) Feeding the personal distraction level and/or personal cognitive load level and the logged personal data to a personal condition neural network; and e1.1) Estimating, by the personal condition neural network, a personal distraction level threshold and/or personal cognitive load level threshold; and/or e1.2) Estimating, by the personal condition neural network, a time and/or a duration the monitored person is to leave the site to recover the condition.
Preferably, the output message is indicative for the monitored person when to leave the site and/or when to return to the site, to repeat, change or avoid an activity at the site, and/or to have an unfit condition at the site.
In another aspect, the invention provides a data processing device comprising means for carrying out the method of any of the preceding embodiments.
In another aspect, the invention provides a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any of the preceding embodiments.
In another aspect, the invention provides a computer-readable data carrier having stored thereon the computer program.
Embodiments of the invention preferably have the following advantages and effects: Embodiments of the invention preferably provide a mechanism to check whether an operator is in an optimal condition to work or not. Embodiments of the invention preferably make sure that a quality check is well scrutinized.
An operator may sometimes have a high personal cognitive load. Therefore, embodiments of the invention preferably provide a method that computes features such as personal distraction or personal cognitive load of the operator. The method preferably provides a data-driven solution for QA which monitors human operators involved in the product assessment process.
An idea of preferred embodiments of the invention is to use a visual inspection system that already exists at manufacturing plants or sites. The system preferably can raise an alarm when an operator has a high personal distraction and/or a high personal cognitive load.
Preferred embodiments may improve the quality check of operators and may also help the manufacturing plants to reach Industry 4.0 standards.
An idea of the invention preferably is to provide a method that effectively monitors an operator's behaviour while he/she is visually inspecting a product. The method may comprehend this information and may deliver intelligent insight while monitoring the operator for personal cognitive load. To assess the operator's behaviour, preferred embodiments of the invention use one or more cameras and health sensors.
Artificial-intelligence based models such as support vector machines, decision trees, ensemble models, k-nearest neighbours models, Bayesian networks, or other types of models including linear models and/or nonlinear models may be used. Neural networks may include feed-forward neural networks, convolutional neural networks (CNN), and recurrent neural networks (RNN).
The artificial-intelligence based algorithms used in preferred embodiments of the invention are disclosed and described in the following documents or on the following websites:
[1] Kaiming He et al.: "Deep Residual Learning for Image Recognition"; https://doi.org/10.48550/arXiv.1512.03385;
[2] Christian Szegedy et al.: "Rethinking the Inception Architecture for Computer Vision"; https://doi.org/10.48550/arXiv.1512.00567;
[3] Karen Simonyan and Andrew Zisserman: "Very Deep Convolutional Networks for Large-Scale Image Recognition"; https://doi.org/10.48550/arXiv.1409.1556;
[4] Mingxing Tan et al.: "EfficientDet: Scalable and Efficient Object Detection"; https://doi.org/10.48550/arXiv.1911.09070;
[5] Joseph Redmon and Ali Farhadi: "YOLOv3: An Incremental Improvement"; https://doi.org/10.48550/arXiv.1804.02767;
[6] Ross Girshick et al.: "Rich feature hierarchies for accurate object detection and semantic segmentation"; https://doi.org/10.48550/arXiv.1311.2524;
[7] Wei Liu et al.: "SSD: Single Shot MultiBox Detector"; https://doi.org/10.48550/arXiv.1512.02325;
[8] Jingdong Wang et al.: "Deep High-Resolution Representation Learning for Visual Recognition"; https://doi.org/10.48550/arXiv.1908.07919;
[9] Alexander Toshev and Christian Szegedy: "DeepPose: Human Pose Estimation via Deep Neural Networks"; https://doi.org/10.48550/arXiv.1312.4659;
[10] Zhe Cao, Gines Hidalgo et al.: "OpenPose: Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields"; https://arxiv.org/abs/1812.08008;
[11] Ralf C. Staudemeyer and Eric Rothstein Morris: "Understanding LSTM - a tutorial into Long Short-Term Memory Recurrent Neural Networks"; https://arxiv.org/abs/1909.09586;
[12] Satya P. Singh et al.: "Deep ConvLSTM with self-attention for human activity decoding using wearables"; https://arxiv.org/abs/2005.00698;
[13] Kaipeng Zhang et al.: "Joint Face Detection and Alignment using Multi-task Cascaded Convolutional Networks"; arXiv:1604.02878;
[14] Sara Sabour et al.: "Dynamic Routing Between Capsules"; https://arxiv.org/abs/1710.09829;
[15] "Tiny Full Connected Neural Network Library";
[16] Leo Breiman: "Random Forests";
[17] Leo Breiman: "Random Forests";
[18] Gradient Boosting Classifier (scikit-learn);
[19] Personalized Regression;
[20] Gated Recurrent Unit;
[22] Jonathan Long et al.: "Fully Convolutional Networks for Semantic Segmentation"; https://doi.org/10.48550/arXiv.1411.4038;
[23] Corinna Cortes and Vladimir Vapnik: "Support-vector networks";
[24] Decision Trees; and
[30] Gradient Boosting Regressor.

The neural networks may use this data to compute one, several or all of the following attributes or perceptual features:
- operator profiling based on identification/personal identity;
- generic emotion recognition;
- spontaneous emotion recognition;
- object recognition;
- body-pose recognition;
- activity recognition;
- mental fatigue estimation;
- drowsiness detection, i.e., detection if the operator is drowsy;
- gaze estimation;
- head tracking;
- thermal comfort estimation; and
- illumination/brightness level detection.
Once an operator is identified, in preferred embodiments a unique profile is created and added to a personal database. This profile may be used repeatedly to map the operator's behaviour continuously and may govern various thresholds used across the system.
A mechanism may assign different weights to the perception layer features or perceptual features according to their importance for an operator. For example, drowsiness detection or mental fatigue may be assigned more weightage as compared to pose estimation for an operator. Finally, a cumulative weightage may be computed and fed to a thresholding block.
Since some of the perception layer features or perceptual features may not contribute toward stress singularly, a behaviour change threshold may be defined so that the system may not need to perform further computation. For example, if an operator is active, focussed and emotionally neutral, the system may stop processing at the threshold level and does not need to compute further perceptual features. The above attributes are preferably used to estimate operator behaviour scores. Preferably, the operator behaviour scores are the personal distraction level or score and the personal cognitive load level or score.
A neural network may take the personal distraction score and the cognitive load score along with previous event logs. Previous event logs may be the frequency of previous breaks, the duration of previous breaks, the historic personal distraction levels, and historic personal cognitive load scores, e.g., levels after coming back from breaks. The neural network may take these previous event logs as an input to calculate, preferably, a time required for a break. The neural network may also take the historic data from the logs to suggest better break timings and durations. Machine learning techniques such as random forest, gradient boosting and personalized regression, along with RNN-based architectures such as LSTM, could be used as an architecture.
The neural network may output one, several or all of the following: - time in which an operator should take a break; - duration of the break; -personalized cognitive load threshold value; and - personalized distraction threshold value.
The personalized threshold values may be used to decide whether to fire a warning (for an operator to take a break) or not. For example, if the personal cognitive load score is high, but the personal distraction score is low, this may be indicative for the operator to go for a break. However, if the personal distraction score is high and the personal cognitive load score is low then a warning may be provided, and the operator may reassess the part.
In preferred embodiments, the final output may comprise one, several or all of the following: - warning message; - time in which an operator should take a break; and - duration of the break.
The operator profiles, the personal distraction level, the personal cognitive load level, personalized threshold values, time and duration of breaks, warnings and/or product unique ID may be added to the logs. The logs may be added to the personal database.
Embodiments of the invention preferably provide manufacturing companies with an avenue to audit the quality assurance (QA) with effective data. They may allow plant managers to monitor the performance and to provide essential training for the operators involved in the quality assurance process.
The operators may also benefit from preferred embodiments of the invention, as they are alerted when they are distracted or tired so that they can plan appropriate breaks and come back fresh to their station.
The thermal comfort and illumination features may help in assuring that the operator is able to work at comfortable temperature levels and brightness levels to avoid environmental distraction.
With data collected in the personal database (operator profile, stress score, rejected products, performance reports, etc.) over a longer period of time, a duration for which an operator can work optimally without stress may be estimated.
By employing a mechanism to detect behaviour change, preferred embodiments of the invention reduce computation costs. Companies that depend on third-party suppliers for various components may have a greater trust when QA practices as suggested herein are followed.
The method disclosed herein may also allow the justification of insurance claims, etc. and may result in reduced losses for manufacturing plants.
The method disclosed herein preferably employs a holistic approach to understanding operator behaviour. The evaluation of attributes or perceptual features such as facial expressions, body pose, mental fatigue, drowsiness, gaze estimation, cognitive load, etc. may add credibility and improve quality standards. Evaluation of human behaviour could be useful in manufacturing, healthcare and automotive sectors.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention are now explained in more detail with reference to the accompanying drawings of which Fig. 1 shows a computer-implemented method for generating personal data on a condition of a person according to an embodiment of the invention; Fig. 2 shows the computer-implemented method for generating the personal data on the condition of the person according to a further embodiment of the invention; Fig. 3 shows in detail steps of the method for generating a personal distraction level of the monitored person; and Fig. 4 shows in detail further steps of the method for generating a personal cognitive load level of the monitored person.
DETAILED DESCRIPTION OF EMBODIMENTS
Fig. 1 shows a computer-implemented method for generating personal data 10 on a condition of a person or operator according to an embodiment of the invention.
In a first step, the person is monitored at a site by means of at least one monitoring device 12 in order to generate monitoring data 14 indicative for a behaviour of the monitored person. In the embodiment shown in Fig. 1, the at least one monitoring device 12 is summarized into a sensor array 16 generating the monitoring data 14.
In a second step, the monitoring data 14 is fed to a perception layer 18. The perception layer 18 is configured for determining a plurality of perceptual features 20.
In a third step, based on the determined perceptual features 20, a behaviour change 22 of the monitored person is detected or not.
If a behaviour change 22 of the monitored person is detected, the method includes in a fourth step the generating of the personal data 10 on the condition. The generating of the personal data 10 is performed by means of an evaluating machine learning model or behaviour model 24.
In a fifth step, an output message 26 is generated.
Fig. 2 shows the computer-implemented method for generating the personal data 10 on the condition of the person according to a further embodiment of the invention, wherein the steps of the method described previously are shown in more detail.
As previously described in relation to Fig. 1, in the first step of the method the person is monitored by means of at least one monitoring device 12 in order to generate monitoring data 14.
In the embodiment shown in Fig. 2, the monitoring device 12 or sensor array 16 includes an image or video capturing device 28, a microphone 30, a physiological parameter sensor 32, and an environmental parameter sensor 34. In the embodiment of Fig. 2, the microphone 30 is included in the image or video capturing device 28.
The image or video capturing device 28 can monitor the person at the site. For example, the image or video capturing device 28 may capture one or more images of the monitored person. The image or video capturing device 28 can further monitor an object or product 35 which the monitored person is inspecting or working on. For example, the monitoring device 12 may capture one or more images of the object 35 near the monitored person. The image or video capturing device 28 generates image data 36.
The microphone 30 can monitor the person at the site. For example, the microphone 30 may capture a voice volume and a voice tone of the monitored person. The microphone 30 generates microphone data 38.
The physiological parameter sensor 32 can monitor the person at the site. In particular, the physiological parameter sensor 32 can monitor one or more physiological parameters 40 of the monitored person. For example, the physiological parameter sensor 32 may capture a heart rate, respiration rate, body temperature of the monitored person, or a combination thereof. The physiological parameter sensor 32 generates physiological data 42.
The environmental parameter sensor 34 can monitor an environmental parameter 44 at the site. For example, the environmental parameter sensor 34 may include a temperature sensor 46 and a light sensor 48. The temperature sensor 46 may capture a site temperature 49 at the site. The light sensor 48 may capture an illumination or brightness level 50. The environmental parameter sensor 34 generates environmental data 52.
Thus, the monitoring data 14 may include image data 36, microphone data 38, physiological data 42, and environmental data 52.
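Purely for illustration, the following Python sketch shows one possible way of bundling the sensor outputs into the monitoring data 14; the names MonitoringData and read_sensor_array, as well as the concrete fields and dummy values, are assumptions for the example and not part of the disclosed system.

```python
from dataclasses import dataclass, field
from typing import Dict, List

import numpy as np


@dataclass
class MonitoringData:
    """Container for the monitoring data 14 produced by the sensor array 16."""
    image_data: List[np.ndarray] = field(default_factory=list)          # frames from capturing device 28
    microphone_data: List[np.ndarray] = field(default_factory=list)     # audio chunks from microphone 30
    physiological_data: Dict[str, float] = field(default_factory=dict)  # e.g. heart rate, respiration rate
    environmental_data: Dict[str, float] = field(default_factory=dict)  # e.g. site temperature, brightness


def read_sensor_array() -> MonitoringData:
    """Poll every monitoring device 12 once and assemble one monitoring sample.

    The sensor read-outs are faked here; a real system would query the camera,
    microphone and sensor drivers instead.
    """
    return MonitoringData(
        image_data=[np.zeros((480, 640, 3), dtype=np.uint8)],
        microphone_data=[np.zeros(16000, dtype=np.float32)],
        physiological_data={"heart_rate_bpm": 72.0, "respiration_rate": 15.0, "body_temp_c": 36.6},
        environmental_data={"site_temperature_c": 22.5, "brightness_lux": 450.0},
    )
```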
As previously described in relation to Fig. 1, in the second step of the method the monitoring data 14 is fed to the perception layer 18 for determining the plurality of perceptual features 20.
The perception layer 18 includes at least one perceptual feature neural network 54. In the embodiment shown in Fig. 2, the perception layer 18 includes ten perceptual feature neural networks 54, each trained to determine one of the plurality of perceptual features 20. However, within the scope of the invention, the perception layer 18 may include only a selection of the ten perceptual feature neural networks 54. Also, the at least one perceptual feature neural network 54 may be trained to determine a plurality of the perceptual features 20.
A first perceptual feature neural network 56 is trained to recognize a face of a person. Thus, the first perceptual feature neural network 56 fed with the monitoring data 14, can determine as perceptual feature 20 a personal identity feature 58 of the monitored person.
The first perceptual feature neural network 56 may be based on a convolutional neural network (CNN). For example, the first perceptual feature neural network 56 may be based on ResNet, InceptionV3, or a combination thereof. The first perceptual feature neural network 56 may be trained on a dataset 60 having images containing clearly visible faces. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain face photographs of people collected from the web, wherein each face is labelled with the name of the person pictured. An example for such a dataset 60 is "Labeled Faces in the Wild", which is publicly available. The dataset 60 may be stored in a database 62. The first perceptual feature neural network 56 may analyze the image data 36. The personal identity feature 58 of the monitored person can be a unique identity of the monitored person. Thus, a unique personal database 64 for the monitored person can be created.
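As an illustrative, non-limiting sketch of such an identification step (assuming a recent torchvision installation; in practice a face recognition model fine-tuned on a face dataset would be used), a cropped face can be mapped to an embedding and matched against stored operator profiles. The names face_embedding and identify_operator and the similarity threshold are assumptions for the example.

```python
from typing import Dict, Optional

import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms

# Generic pretrained ResNet used as an embedding backbone, classifier removed.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])


def face_embedding(face_image: Image.Image) -> torch.Tensor:
    """Map a cropped face image to a 512-dimensional embedding vector."""
    with torch.no_grad():
        return backbone(preprocess(face_image).unsqueeze(0)).squeeze(0)


def identify_operator(face_image: Image.Image,
                      profiles: Dict[str, torch.Tensor],
                      threshold: float = 0.8) -> Optional[str]:
    """Return the operator id whose stored embedding is most similar, or None."""
    query = face_embedding(face_image)
    best_id, best_score = None, threshold
    for operator_id, stored in profiles.items():
        score = F.cosine_similarity(query, stored, dim=0).item()
        if score > best_score:
            best_id, best_score = operator_id, score
    return best_id
```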
A second perceptual feature neural network 66 is trained to recognize an object a person is inspecting or working on. Thus, the second perceptual feature neural network 66 fed with the monitoring data 14, can determine as perceptual feature 20 an object identity feature 68 of the object 35 the monitored person is inspecting or working on.
The monitored person may inspect the object 35 visually, manually, or in any other way and the second perceptual feature neural network 66 can determine the object identity feature 68 of the object 35. The second perceptual feature neural network 66 may be based on a CNN. For example, the second perceptual feature neural network 66 may be based on EfficientDet, YOLO, SSD, RCNN, or a combination thereof. The second perceptual feature neural network 66 may be trained on a dataset 60 containing common images of objects appearing in varying shapes, sizes, and orientations. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain images showing different objects, wherein each object instance pictured is labelled and annotated with a segmentation mask. Examples of such a dataset 60 are "COCO" and "CIFAR", both of which are publicly available. The dataset 60 may be stored in the database 62. The second perceptual feature neural network 66 may analyze the image data 36. The object identity feature 68 can be a label and a location of the object 35 the monitored person is inspecting, and a confidence with which the object 35 was detected.
A third perceptual feature neural network 70 is trained to recognize a generic emotion of a person. Thus, the third perceptual feature neural network 70 fed with the monitoring data 14, can determine as perceptual feature 20 a generic emotion feature 72 of the monitored person.
The third perceptual feature neural network 70 may be based on a time-series based model with an evaluation window over a number of frames and may select the generic emotion feature 72 that occurs the maximum number of times within the window. For example, the third perceptual feature neural network 70 may be based on a long short term memory (LSTM) neural network, a recurrent neural network (RNN), or a combination thereof. The third perceptual feature neural network 70 may be trained on a dataset 60 having images containing clearly visible facial emotions. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain facial images collected from the web, wherein each facial image is annotated with a facial expression selected from a group of facial expressions. An example of such a dataset 60 is "AffectNet", which is publicly available. The dataset 60 may be stored in the database 62. The third perceptual feature neural network 70 may analyze the image data 36 and the microphone data 38.
The generic emotion feature 72 of the monitored person can be a rated emotional expression such as anger, disgust, fear, happiness, sadness, and surprise.
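The windowed majority vote described above may, for example, be implemented as follows; this is a minimal sketch, and the class name EmotionWindow and the window size of 30 frames are assumptions for the example.

```python
from collections import Counter, deque
from typing import Deque


class EmotionWindow:
    """Majority vote over the last window_size frame-level emotion predictions.

    Smooths per-frame classifier output into the generic emotion feature 72 by
    selecting the emotion that occurs most often within the evaluation window.
    """

    def __init__(self, window_size: int = 30) -> None:
        self.frames: Deque[str] = deque(maxlen=window_size)

    def update(self, frame_emotion: str) -> str:
        """Add one frame-level prediction and return the dominant emotion so far."""
        self.frames.append(frame_emotion)
        dominant, _count = Counter(self.frames).most_common(1)[0]
        return dominant
```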
A fourth perceptual feature neural network 74 is trained to recognize an activity of a person. Thus, the fourth perceptual feature neural network 74 fed with the monitoring data 14, can determine as perceptual feature 20 an activity feature 76 of the monitored person.
The fourth perceptual feature neural network 74 may be based on an RNN. For example, the fourth perceptual feature neural network 74 may be based on an LSTM neural network, such as DeepConvLSTM. The fourth perceptual feature neural network 74 may be trained on a dataset 60 having images containing humans performing general activities like operating a phone, drinking coffee, etc. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain videos of human activity, wherein each human activity is annotated on a spatio-temporally localized scale. An example of such a dataset 60 is "AVA (Atomic Visual Actions)", which is publicly available. The dataset 60 may be stored in the database 62. The fourth perceptual feature neural network 74 may analyze the image data 36. The activity feature 76 of the monitored person can be a label of an activity of the monitored person.
A fifth perceptual feature neural network 78 is trained to estimate a pose of a person. Thus, the fifth perceptual feature neural network 78 fed with the monitoring data 14, can determine as perceptual feature 20 a pose feature 80 of the monitored person.
The fifth perceptual feature neural network 78 may be based on a CNN. For example, the fifth perceptual feature neural network 78 may be based on High-Resolution Net (HRNet), DeepPose, OpenPose, or a combination thereof. The fifth perceptual feature neural network 78 may be trained on a dataset 60 having images containing human bodies appearing in varying poses and orientations. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain images of people with different body poses. An example of such a dataset 60 is "Human3.6M", which is publicly available. The dataset 60 may be stored in the database 62. The fifth perceptual feature neural network 78 may analyze the image data 36. The pose feature 80 of the monitored person can be an orientation of body parts of the monitored person.
A sixth perceptual feature neural network 82 is trained to estimate a gaze of a person. Thus, the sixth perceptual feature neural network 82 fed with the monitoring data 14, can determine as perceptual feature 20 a gaze feature 84 of the monitored person.
The sixth perceptual feature neural network 82 may be based on a CNN. For example, the sixth perceptual feature neural network 82 may be based on ResNet, InceptionV3, or a combination thereof. The sixth perceptual feature neural network 82 may be trained on a dataset 60 having images containing clearly visible faces.
The dataset 60 may be real or simulated data. For example, the dataset 60 may contain images with subjects in indoor and outdoor environments, wherein each subject is labelled with 3D gaze across a range of head poses and distances. An example of such a dataset 60 is "Gaze360", which is publicly available. The dataset 60 may be stored in the database 62. The sixth perceptual feature neural network 82 may analyze the image data 36. The gaze feature 84 of the monitored person can be an estimate of a gaze direction of the monitored person, i.e., where the monitored person is looking.
A seventh perceptual feature neural network 86 is trained to estimate a mental fatigue of a person. Thus, the seventh perceptual feature neural network 86 fed with the monitoring data 14, can determine as perceptual feature 20 a mental fatigue feature 88 of the monitored person.
The seventh perceptual feature neural network 86 may be based on a CNN. For example, the seventh perceptual feature neural network 86 may be based on EMCNN. The seventh perceptual feature neural network 86 may further be based on an RNN. For example, the seventh perceptual feature neural network 86 may be based on an LSTM neural network. The seventh perceptual feature neural network 86 may be trained on a dataset 60 having images containing human faces, audio samples, and numerical data from physiological sensors such as heart rate, respiratory rate, or body temperature. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain multimodal sensor data from wearable sensors during controlled physical activity sessions. An example of such a dataset is "FatigueSet", which is publicly available. The dataset 60 may be stored in the database 62. The seventh perceptual feature neural network 86 may analyze the image data 36, the microphone data 38, and the physiological data 42. The mental fatigue feature 88 of the monitored person can be a numerical measure of the mental fatigue of the monitored person.
An eighth perceptual feature neural network 90 is trained to detect a drowsiness of a person. Thus, the eighth perceptual feature neural network 90 fed with the monitoring data 14, can determine as perceptual feature 20 a drowsiness feature 92 of the monitored person.
The eighth perceptual feature neural network 90 may be based on a CNN. For example, the eighth perceptual feature neural network 90 may be based on ResNet, InceptionV3, CapsuleNet, or a combination thereof. The eighth perceptual feature neural network 90 may be trained on a dataset 60 having images containing clearly visible faces. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain images of people driving a car, wherein each image is labelled according to whether the driver pictured is yawning or not; publicly available yawn detection datasets exist for this purpose. The dataset 60 may also contain sequences of people driving a car, wherein each sequence is labelled according to whether the driver is distracted in some way; publicly available driver distraction datasets exist for this purpose. The dataset 60 may be stored in the database 62. The eighth perceptual feature neural network 90 may analyze the image data 36. The drowsiness feature 92 of the monitored person can be an estimate whether the monitored person is drowsy or not.
A ninth perceptual feature neural network 94 is trained to track a head of a person. Thus, the ninth perceptual feature neural network 94 fed with the monitoring data 14, can determine as perceptual feature 20 a head movement feature 96 of the monitored person.
The ninth perceptual feature neural network 94 may be based on a CNN. For example, the ninth perceptual feature neural network 94 may be based on ResNet, InceptionV3, or a combination thereof. The ninth perceptual feature neural network 94 may be trained on a dataset 60 having images containing clearly visible faces. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain sequences of subjects performing a task, wherein each sequence is labelled based on head and eye movement. An example of such a dataset 60 is "Gaze-in-Wild", which is publicly available. The dataset 60 may be stored in the database 62. The ninth perceptual feature neural network 94 may analyze the image data 36. The head movement feature 96 of the monitored person can be an estimate of the head movement and head orientation of the monitored person.
A tenth perceptual feature neural network 98 is trained to recognize a spontaneous emotion of a person. Thus, the tenth perceptual feature neural network 98 fed with the monitoring data 14, can determine as perceptual feature 20 a spontaneous emotion feature 100 of the monitored person.
The tenth perceptual feature neural network 98 may be based on a CNN. For example, the tenth perceptual feature neural network 98 may be based on VGG16, ResNet, InceptionV3, or a combination thereof. The tenth perceptual feature neural network 98 may be trained on a dataset 60 having images containing clearly visible facial emotions. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain facial images collected from the web, wherein each facial image is annotated with a facial expression selected from a group of facial expressions. An example of such a dataset 60 is "AffectNet", which is publicly available. The dataset 60 may be stored in the database 62. The tenth perceptual feature neural network 98 may analyze the image data 36 and the microphone data 38. The spontaneous emotion feature 100 of the monitored person can be a rated emotional expression such as anger, disgust, fear, happiness, sadness, and surprise.
As further perceptual features 20, the perception layer 18 can determine a thermal comfort feature 102. This may be performed by a thermal comfort feature neural network 104.
The thermal comfort feature neural network 104 is trained to recognize a condition of mind that expresses satisfaction with the thermal environment. Thus, the thermal comfort feature neural network 104 fed with the monitoring data 14, can determine as perceptual feature 20 the thermal comfort feature 102 of the monitored person.
The thermal comfort feature neural network 104 may be based on a CNN. For example, the thermal comfort feature neural network 104 may be based on ResNet, InceptionV3, CapsuleNet, or a combination thereof. Furthermore, the thermal comfort feature neural network 104 may be based on machine learning algorithms 106 such as a random forest algorithm, a gradient boosting algorithm, a regression algorithm, etc. The thermal comfort feature neural network 104 may be trained on a dataset 60 having images of an environment and data from various environmental parameter sensors. The dataset 60 may be real or simulated data. For example, the dataset 60 may contain sets of objective indoor climatic observations accompanied by subjective evaluations by building occupants who were exposed to them. An example of such a dataset 60 is the "ASHRAE Global Thermal Comfort Database II", which is publicly available. The dataset 60 may be stored in the database 62. The thermal comfort feature neural network 104 may analyze the image data 36 and the environmental data 52. The thermal comfort feature 102 of the monitored person can be a numerical measure of thermal comfort.
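A minimal sketch of such a comfort estimator based on a random forest is shown below; the tiny training set, the choice of input columns and the function name thermal_comfort_feature are purely illustrative assumptions, and a real model would be trained on a comfort database such as the one mentioned above.

```python
from typing import Dict

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: environmental readings paired with subjective
# comfort votes (e.g. on a 7-point thermal sensation scale, 0 = neutral).
# Columns: air temperature [deg C], relative humidity [%], air speed [m/s]
X_train = np.array([
    [21.0, 45.0, 0.10],
    [27.5, 60.0, 0.05],
    [18.0, 30.0, 0.20],
    [23.0, 50.0, 0.10],
])
y_train = np.array([0.0, 1.5, -1.0, 0.2])

comfort_model = RandomForestRegressor(n_estimators=100, random_state=0)
comfort_model.fit(X_train, y_train)


def thermal_comfort_feature(environmental_data: Dict[str, float]) -> float:
    """Estimate the thermal comfort feature 102 from current environmental data 52."""
    sample = np.array([[
        environmental_data["site_temperature_c"],
        environmental_data.get("relative_humidity", 50.0),
        environmental_data.get("air_speed", 0.1),
    ]])
    return float(comfort_model.predict(sample)[0])
```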
As further perceptual feature 20, the perception layer 18 can determine an illumination feature 108.
The perception layer 18 detects a change in the illumination or brightness level 50 of an environment. Thus, the perception layer 18 fed with the environmental data 52, can determine the illumination feature 108.
The perception layer 18 may record the illumination or brightness level 50 captured by the light sensor 48. The perception layer 18 may compare the illumination level 50 to a dataset 60 having data from various lighting conditions.
For example, the dataset 60 may provide different rendered views of indoor light scenes based on professionally designed architectural CAD models. The dataset 60 may be stored in the database 62. The illumination feature 108 can be positive, if a change in the illumination level 50 is detected.
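A simple illustrative sketch of such an illumination change detector is shown below; the class name IlluminationMonitor, the 25% relative change threshold and the exponential baseline tracking are assumptions for the example.

```python
from typing import Optional


class IlluminationMonitor:
    """Set the illumination feature 108 when the brightness level 50 changes markedly."""

    def __init__(self, relative_change_threshold: float = 0.25) -> None:
        self.threshold = relative_change_threshold
        self.baseline_lux: Optional[float] = None

    def update(self, brightness_lux: float) -> bool:
        """Return True (feature positive) if the brightness deviates from the baseline."""
        if self.baseline_lux is None:
            self.baseline_lux = brightness_lux
            return False
        change = abs(brightness_lux - self.baseline_lux) / max(self.baseline_lux, 1e-6)
        # Slowly track the ambient level so gradual daylight drift is not flagged.
        self.baseline_lux = 0.95 * self.baseline_lux + 0.05 * brightness_lux
        return change > self.threshold
```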
As previously described in relation to Fig. 1, in the third step of the method, from the determined perceptual features 20, a behaviour change 22 of the monitored person is detected or not.
Therefore, the determined perceptual features 20 can be respectively weighted by a perceptual feature weight factor 110 in order to calculate a cumulative behaviour change level 112. The following Table shows an example of perceptual feature weight factors 110 encoded on a scale including low, medium and high.
Perceptual feature 20      Weight factor 110
face recognition           High
emotion recognition        High
object recognition         Medium
body pose                  Low
action recognition         Medium
gaze estimation            High
mental fatigue             High
head tracking              High
drowsiness                 High
thermal comfort            Low
illumination               Low

Table: Example of perceptual feature weight factors 110 encoded on a scale including low, medium and high

The skilled person may convert the scale, for example, to a numerical scale of choice. The perceptual feature weight factors 110 may be stored in the database 62.
Furthermore, a cumulative behaviour change level threshold 114 can be defined and stored in the database 62. The cumulative behaviour change level 112 is then compared to the cumulative behaviour change level threshold 114. For example, if the cumulative behaviour change level 112 is above the cumulative behaviour change level threshold 114, the behaviour change 22 is detected.
Additionally or alternatively, one or more perceptual features 20 may be separately compared with a perceptual feature threshold 116 in order to detect the behaviour change 22 of the monitored person. For example, a behaviour change 22 may also be detected, if one or more perceptual features 20 are above their respective perceptual feature thresholds 116. A behaviour change 22 may also be excluded, if one or more perceptual features 20 remain below their respective perceptual feature thresholds 116.
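The weighting and thresholding described above may, purely as an illustration, be sketched as follows; the numerical mapping of the low/medium/high scale to 1/2/3 and the function name detect_behaviour_change are assumptions for the example.

```python
from typing import Dict

# Example weight factors 110 on a numerical scale (low = 1, medium = 2, high = 3),
# mirroring the table above; the mapping to numbers is an assumption.
WEIGHTS: Dict[str, float] = {
    "face_recognition": 3, "emotion": 3, "object_recognition": 2,
    "body_pose": 1, "activity": 2, "gaze": 3, "mental_fatigue": 3,
    "head_tracking": 3, "drowsiness": 3, "thermal_comfort": 1, "illumination": 1,
}


def detect_behaviour_change(features: Dict[str, float],
                            cumulative_threshold: float,
                            feature_thresholds: Dict[str, float]) -> bool:
    """Detect a behaviour change 22 from normalized perceptual features 20 in [0, 1].

    A change is reported if the weighted cumulative level 112 exceeds the
    cumulative threshold 114, or if any single feature exceeds its own
    perceptual feature threshold 116.
    """
    cumulative_level = sum(WEIGHTS.get(name, 1.0) * value for name, value in features.items())
    if cumulative_level > cumulative_threshold:
        return True
    return any(value > feature_thresholds.get(name, float("inf"))
               for name, value in features.items())
```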
In other terms, the perception layer 18 may also determine only a selection of the perceptual features 20 in the second step, or the perception layer 18 may abort determining other perceptual features 20, if one or more perceptual features 20 are above or below their respective perceptual feature thresholds 116. This saves calculation time.
If the behaviour change 22 of the monitored person is detected, in the fourth step the personal data 10 on the condition is generated by means of the evaluating machine learning model 24.
Reference is now made to Fig. 3 and 4.
In order to generate the personal data 10 on the condition, the evaluating machine learning model 24 includes one or more machine learning algorithms 106. For example, the evaluating machine learning model 24 may include a random forest algorithm, a gradient boosting algorithm, a regression algorithm, an evaluating neural network, or a combination thereof.
The evaluating machine learning model 24 gets as input 118 the monitoring data 14 and/or the perceptual features 20 and generates the personal data 10 on the 30 condition.
In the present case, the personal data 10 includes a personal distraction level 120 and a personal cognitive load level 122. However, within the scope of the invention, the personal data 10 may also include only one of the personal distraction level 120 and the personal cognitive load level 122.
Fig. 3 shows in detail steps of the method for generating the personal data 10 on the condition. In Fig. 3, the evaluating machine learning model 24 processes the gaze feature 84, the head movement feature 96, the spontaneous emotion feature 100, the activity feature 76, the pose feature 80, the thermal comfort feature 102, the illumination feature 108, and the personal identity feature 58 and evaluates the personal distraction level or score 120.
Fig. 4 shows in detail further steps of the method for generating the personal data 10 on the condition. In Fig. 4, the evaluating machine learning model 24 processes the gaze feature 84, the head movement feature 96, the generic emotion feature 72, the drowsiness feature 92, the mental fatigue feature 88, and the personal identity feature 58 and evaluates the personal cognitive load level or score 122.
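As a non-limiting sketch of such an evaluating machine learning model 24, two gradient boosting regressors could be trained on the feature subsets of Fig. 3 and Fig. 4; the feature names, the hyperparameters and the helper functions below are assumptions for the example.

```python
from typing import Dict, Tuple

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Feature subsets reflecting Fig. 3 and Fig. 4; names and ordering are assumptions.
DISTRACTION_FEATURES = ["gaze", "head_movement", "spontaneous_emotion",
                        "activity", "pose", "thermal_comfort", "illumination"]
COGNITIVE_LOAD_FEATURES = ["gaze", "head_movement", "generic_emotion",
                           "drowsiness", "mental_fatigue"]


def train_score_model(samples: np.ndarray, labels: np.ndarray) -> GradientBoostingRegressor:
    """Fit one evaluating model 24 (here a gradient boosting regressor) on labelled samples."""
    model = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05, random_state=0)
    model.fit(samples, labels)
    return model


def evaluate_levels(perceptual_features: Dict[str, float],
                    distraction_model: GradientBoostingRegressor,
                    cognitive_model: GradientBoostingRegressor) -> Tuple[float, float]:
    """Return (personal distraction level 120, personal cognitive load level 122)."""
    x_d = np.array([[perceptual_features[name] for name in DISTRACTION_FEATURES]])
    x_c = np.array([[perceptual_features[name] for name in COGNITIVE_LOAD_FEATURES]])
    return float(distraction_model.predict(x_d)[0]), float(cognitive_model.predict(x_c)[0])
```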
Reference is made again to Fig. 2.
The evaluating machine learning model 24 further analyzes the personal distraction level 120 and the personal cognitive load level 122 with regard to logged personal data 124.
The logged personal data 124 may be stored in the personal database 64 and the evaluating machine learning model 24 may have access to the personal database 64.
The logged personal data 124 may include historic personal data 126 on the condition, historic perceptual features 128, and/or historic cumulative behaviour change levels 130. Furthermore, the logged personal data 124 may include times and/or durations of a behaviour change 22 detected at earlier times. Furthermore, the logged personal data 124 may include times and/or durations of a presence of the monitored person at and/or of an absence of the monitored person from the site.
The personal distraction level 120, the personal cognitive load level 122, and the logged personal data 124 are then fed to a personal condition neural network 132.
The personal condition neural network 132 may be based on an RNN. For example, the personal condition neural network 132 may be based on an LSTM neural network. Furthermore, the personal condition neural network 132 may be based on machine learning algorithms 106 such as a random forest algorithm, a gradient boosting algorithm, a regression algorithm, or a combination thereof.
The personal condition neural network 132 is trained to estimate a personal distraction level threshold 134 and a personal cognitive load level threshold 136 based on the logged personal data 124.
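A minimal sketch of such a personal condition neural network 132, here assumed to be an LSTM over a sequence of logged events, is shown below; the input fields, the hidden size and the four-value output head are illustrative assumptions.

```python
import torch
import torch.nn as nn


class PersonalConditionNet(nn.Module):
    """Sketch of a personal condition neural network 132 based on an LSTM.

    Input: a sequence of logged events (per step: distraction level, cognitive
    load level, break duration, time since last break).  Output: estimated break
    start time, break duration, personal distraction level threshold 134 and
    personal cognitive load level threshold 136.
    """

    def __init__(self, input_size: int = 4, hidden_size: int = 32) -> None:
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 4)

    def forward(self, log_sequence: torch.Tensor) -> torch.Tensor:
        # log_sequence: (batch, time, input_size)
        _, (hidden, _) = self.lstm(log_sequence)
        return self.head(hidden[-1])


# Example forward pass with a random sequence of ten logged events.
model = PersonalConditionNet()
outputs = model(torch.randn(1, 10, 4))
break_in_minutes, break_duration, distraction_thr, cognitive_thr = outputs[0].tolist()
```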
For example, if one or both of the personal distraction level 120 and the personal cognitive load level 122 are above the personal distraction level threshold 134 or the personal cognitive load level threshold 136, the output message 26 can be generated in the fifth step of the method.
The output message 26 may be indicative for the monitored person to repeat, change or avoid an activity at the site. The output message 26 may further be indicative for the monitored person to have an unfit condition at the site.
The personal condition neural network 132 is further trained to estimate a time and/or a duration the monitored person is to leave the site to recover the condition and/or when to return to the site.
For example, if the personal cognitive load level 122 is high relative to or above the personal cognitive load level threshold 136 and the personal distraction level is low relative to or below the personal distraction level threshold 134, the output message 26 can be generated in the fifth step of the method, the output message 26 being indicative for the monitored person to leave the site and to take a break for the estimated duration.
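The decision logic described above may be sketched, purely as an illustration, as follows; the exact wording of the messages and the function name generate_output_message are assumptions for the example.

```python
def generate_output_message(distraction_level: float,
                            cognitive_load_level: float,
                            distraction_threshold: float,
                            cognitive_load_threshold: float,
                            break_duration_min: float) -> str:
    """Derive the output message 26 from the personalized thresholds.

    The rules follow the behaviour described above: high cognitive load with low
    distraction suggests a break; high distraction with low cognitive load
    suggests a warning and re-assessment of the inspected part.
    """
    high_load = cognitive_load_level > cognitive_load_threshold
    high_distraction = distraction_level > distraction_threshold
    if high_load and not high_distraction:
        return f"Please leave the site and take a break of about {break_duration_min:.0f} minutes."
    if high_distraction and not high_load:
        return "Warning: distraction detected, please re-assess the current part."
    if high_load and high_distraction:
        return "Condition unfit for inspection, please stop work and take a break."
    return "No action required."
```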
The generated personal data 10 on the condition, for instance the personal distraction level 120 and the personal cognitive load level 122, the perceptual features 20, and/or the cumulative behaviour change level 112 may be logged to the personal database 64 in order to include them in the logged personal data 124.
Furthermore, times and/or durations of the detected behaviour change 22 of the monitored person may be logged.
The invention further provides a data processing device 138 (not shown) comprising means for carrying out the method for generating the personal data 10 on the condition of the monitored person. The invention further provides a computer program 140 (not shown) comprising instructions which, when the program 140 is executed by a computer, cause the computer to carry out the method. The invention further provides a computer-readable data carrier 142 (not shown) having stored thereon the computer program 140.
REFERENCE SIGNS
10 personal data
12 monitoring device
14 monitoring data
16 sensor array
18 perception layer
20 perceptual feature
22 behaviour change
24 evaluating machine learning model/behaviour model
26 output message
28 image or video capturing device
30 microphone
32 physiological parameter sensor
34 environmental parameter sensor
35 object or product
36 image data
38 microphone data
40 physiological parameter
42 physiological data
44 environmental parameter
46 temperature sensor
48 light sensor
49 site temperature
50 illumination or brightness level
52 environmental data
54 perceptual feature neural network
56 first perceptual feature neural network
58 personal identity feature
60 dataset
62 database
64 personal database
66 second perceptual feature neural network
68 object identity feature
70 third perceptual feature neural network
72 generic emotion feature
74 fourth perceptual feature neural network
76 activity feature
78 fifth perceptual feature neural network
80 pose feature
82 sixth perceptual feature neural network
84 gaze feature
86 seventh perceptual feature neural network
88 mental fatigue feature
90 eighth perceptual feature neural network
92 drowsiness feature
94 ninth perceptual feature neural network
96 head movement feature
98 tenth perceptual feature neural network
100 spontaneous emotion feature
102 thermal comfort feature
104 thermal comfort feature neural network
106 machine learning algorithm
108 illumination feature
110 perceptual feature weight factor
112 cumulative behaviour change level
114 cumulative behaviour change level threshold
116 perceptual feature threshold
118 input
120 personal distraction level
122 personal cognitive load level
124 logged personal data
126 historic personal data
128 historic perceptual features
130 historic cumulative behaviour change level
132 personal condition neural network
134 personal distraction level threshold
136 personal cognitive load level threshold
138 data processing device
140 computer program
142 computer-readable data carrier

Claims (12)

  1. A computer-implemented method for generating personal data (10) on a condition of a person, the method comprising: a) Monitoring a person at a site by means of at least one monitoring device (12) in order to generate monitoring data (14) indicative for a behaviour of the monitored person; b) Feeding the monitoring data (14) to a perception layer (18) for determining at least one perceptual feature (20), the perception layer (18) including at least one perceptual feature neural network (54) trained to determine the at least one perceptual feature (20); c) Detecting a behaviour change (22) of the monitored person based on the at least one perceptual feature (20) and, if a behaviour change (22) of the monitored person is detected: d) Generating personal data (10) on a condition of the monitored person by means of an evaluating machine learning model (24) using as input (118) the monitoring data (14) and/or the at least one perceptual feature (20), the evaluating machine learning model (24) evaluating a personal distraction level (120) and/or personal cognitive load level (122) of the monitored person; e) Analyzing the personal data (10) with regard to logged personal data (124); and f) Generating an output message (26) in response to the analyzing.
  2. The method according to claim 1, further comprising one, several or all of the following: g) Monitoring one or more environmental parameters (44) at the site and determining, by the perception layer (18), as perceptual feature (20) a thermal comfort feature (102) of the monitored person and/or an illumination feature (108); h) Monitoring one or more physiological parameters (40) of the person and determining, by the at least one perceptual feature neural network (54), as perceptual feature (20) a mental fatigue feature (88) of the monitored person; i) Monitoring the person at the site and determining, by the at least one perceptual feature neural network (54), as perceptual feature (20) a personal identity feature (58) of the monitored person; and j) Monitoring an object (35) which the monitored person is inspecting at the site and determining, by the at least one perceptual feature neural network (54), as perceptual feature (20) an object identity feature (68).
  3. The method according to claim 1 or 2, characterized in that the monitoring device (12) is configured as one, several or all of the following: an image or video capturing device (28), a microphone (30), a physiological parameter sensor (32), an environmental parameter sensor (34), a temperature sensor (46), and a light or brightness sensor (48).
  4. The method according to any of the preceding claims, characterized in that the at least one perceptual feature neural network (54) is trained to perform one, several or all of the following: k1) recognizing a face of a person; k2) recognizing an object a person is inspecting; k3) recognizing a generic emotion of a person; k4) recognizing an activity of a person; k5) estimating a pose of a person; k6) estimating a gaze of a person; k7) estimating a mental fatigue of a person; k8) detecting a drowsiness of a person; k9) tracking a head of a person; and k10) recognizing a spontaneous emotion of a person.
  5. The method according to any of the preceding claims, characterized in that step c) comprises one or both of the following:
c1) weighting a plurality of perceptual features (20) with perceptual feature weight factors (110) in order to create a cumulative behaviour change level (112); and
c2) comparing the cumulative behaviour change level (112) with a cumulative behaviour change level threshold (114) and detecting the behaviour change (22) of the monitored person based on the comparison.
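Claim 5 leaves the exact aggregation open; a minimal reading of steps c1) and c2) is a weighted sum of per-feature change scores compared against the threshold (114), as sketched below with made-up feature names, weights and numbers.

```python
# Sketch of steps c1)/c2): weighted sum of per-feature change scores (assumed to be
# normalised floats) compared against a threshold (114); weights (110) are illustrative.
def behaviour_change_detected(feature_changes, weights, threshold):
    cumulative = sum(weights.get(name, 1.0) * change
                     for name, change in feature_changes.items())  # level (112)
    return cumulative >= threshold

# Example with made-up numbers:
changes = {"gaze": 0.7, "pose": 0.2, "drowsiness": 0.4}
weights = {"gaze": 0.5, "pose": 0.2, "drowsiness": 0.3}
print(behaviour_change_detected(changes, weights, threshold=0.4))  # True (0.51 >= 0.4)
```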
  6. The method according to any of the preceding claims, characterized in that the evaluating machine learning model (24) comprises one, several or all of the following:
d1) a machine learning algorithm (106);
d2) a random forest algorithm;
d3) a gradient boosting algorithm;
d4) a regression algorithm; and
d5) an evaluating neural network.
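Any of the model families listed in claim 6 could serve as the evaluating machine learning model (24); the scikit-learn sketch below shows options d2) to d4), assuming the perceptual features have been flattened into a fixed-length numeric vector and that labelled distraction scores exist for training (both assumptions, not details from the specification).

```python
# Minimal scikit-learn sketch of options d2)-d4); feature encoding and labels are assumed.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import LinearRegression

X_train = np.random.rand(100, 8)   # 8 flattened perceptual features (placeholder data)
y_train = np.random.rand(100)      # labelled distraction level (placeholder data)

models = {
    "random_forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "gradient_boosting": GradientBoostingRegressor(random_state=0),
    "regression": LinearRegression(),
}
for name, model in models.items():
    model.fit(X_train, y_train)

x_now = np.random.rand(1, 8)
distraction_level = models["random_forest"].predict(x_now)  # stands in for (120)
```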
  7. The method according to any of the preceding claims, further comprising one, several or all of the following:
l) Logging the personal data (10) on the condition and/or the perceptual features (20) in order to generate the logged personal data (124);
m) Logging times and/or durations of the detected behaviour change (22) of the monitored person; and
n) Logging times and/or durations of a presence of the monitored person at and/or of an absence of the monitored person from the site.
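The logging of steps l) to n) can be as simple as appending timestamped records; the in-memory list below is purely illustrative, and a real system would presumably persist the records, e.g. in the database (62).

```python
# Illustrative logging for steps l)-n); field names are placeholders.
from datetime import datetime, timezone

logged_personal_data = []  # stands in for the logged personal data (124)

def log_event(kind, **payload):
    logged_personal_data.append(
        {"time": datetime.now(timezone.utc).isoformat(), "kind": kind, **payload})

log_event("behaviour_change", cumulative_level=0.51)        # step m)
log_event("presence", present=False, duration_minutes=12)   # step n)
```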
  8. The method according to any of the preceding claims, characterized in that step e) comprises:
e1) Feeding the personal distraction level (120) and/or personal cognitive load level (122) and the logged personal data (124) to a personal condition neural network (132); and
e1.1) Estimating, by the personal condition neural network (132), a personal distraction level threshold (134) and/or personal cognitive load level threshold (136); and/or
e1.2) Estimating, by the personal condition neural network (132), a time and/or a duration for which the monitored person is to leave the site in order to recover the condition.
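Step e1) feeds the current levels together with the logged personal data into the personal condition neural network (132); the small PyTorch module below is one conceivable shape for such a network, with the layer sizes and the three outputs (the two thresholds plus a recovery duration) chosen purely for illustration.

```python
# Sketch of step e): a small PyTorch network standing in for the personal condition
# neural network (132); input sizes, architecture and outputs are assumptions.
import torch
import torch.nn as nn

class PersonalConditionNet(nn.Module):
    def __init__(self, n_logged_features=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 + n_logged_features, 32),  # distraction + load + logged data
            nn.ReLU(),
            nn.Linear(32, 3),  # threshold (134), threshold (136), recovery minutes
        )

    def forward(self, distraction, cognitive_load, logged):
        x = torch.cat([distraction, cognitive_load, logged], dim=-1)
        return self.net(x)

model = PersonalConditionNet()
out = model(torch.rand(1, 1), torch.rand(1, 1), torch.rand(1, 16))
distraction_threshold, load_threshold, recovery_minutes = out.unbind(dim=-1)
```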
  9. The method according to any of the preceding claims, characterized in that the output message (26) indicates to the monitored person when to leave the site and/or when to return to the site, whether to repeat, change or avoid an activity at the site, and/or that the monitored person is in an unfit condition at the site.
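The output message (26) of claim 9 can then be derived from the analysis with a simple rule; the wording and the threshold logic below are assumptions for illustration only.

```python
# Illustrative mapping from analysis results to an output message (26); the threshold
# comparison and message wording are assumptions, not taken from the patent.
def make_output_message(distraction_level, distraction_threshold, recovery_minutes):
    if distraction_level >= distraction_threshold:
        return (f"Condition unfit for the site: take a break and return "
                f"in about {recovery_minutes:.0f} minutes.")
    return "Condition within limits: you may continue the current activity."
```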
  10. A data processing device (138) comprising means for carrying out the method of any of the preceding claims.
  11. A computer program (140) comprising instructions which, when the program (140) is executed by a computer, cause the computer to carry out the method of any of claims 1 to 9.
  12. A computer-readable data carrier (142) having stored thereon the computer program (140) of claim 11.
GB2215484.3A 2022-10-20 2022-10-20 Computer-implemented method for generating personal data on a condition of a person Pending GB2623553A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2215484.3A GB2623553A (en) 2022-10-20 2022-10-20 Computer-implemented method for generating personal data on a condition of a person

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2215484.3A GB2623553A (en) 2022-10-20 2022-10-20 Computer-implemented method for generating personal data on a condition of a person

Publications (2)

Publication Number Publication Date
GB202215484D0 GB202215484D0 (en) 2022-12-07
GB2623553A true GB2623553A (en) 2024-04-24

Family

ID=84818459

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2215484.3A Pending GB2623553A (en) 2022-10-20 2022-10-20 Computer-implemented method for generating personal data on a condition of a person

Country Status (1)

Country Link
GB (1) GB2623553A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108960065A (en) * 2018-06-01 2018-12-07 浙江零跑科技有限公司 A vision-based driving behavior detection method
US20190164103A1 (en) * 2017-11-28 2019-05-30 International Business Machines Corporation Maximize human resources efficiency by reducing distractions during high productivity periods
WO2020122986A1 (en) * 2019-06-10 2020-06-18 Huawei Technologies Co.Ltd. Driver attention detection using heat maps
US20220188737A1 (en) * 2020-12-15 2022-06-16 Dassault Aviation System for determining an operational state of an aircrew according to an adaptive task plan and associated method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190164103A1 (en) * 2017-11-28 2019-05-30 International Business Machines Corporation Maximize human resources efficiency by reducing distractions during high productivity periods
CN108960065A (en) * 2018-06-01 2018-12-07 浙江零跑科技有限公司 A vision-based driving behavior detection method
WO2020122986A1 (en) * 2019-06-10 2020-06-18 Huawei Technologies Co.Ltd. Driver attention detection using heat maps
US20220188737A1 (en) * 2020-12-15 2022-06-16 Dassault Aviation System for determining an operational state of an aircrew according to an adaptive task plan and associated method

Also Published As

Publication number Publication date
GB202215484D0 (en) 2022-12-07

Similar Documents

Publication Publication Date Title
JP4401079B2 (en) Subject behavior analysis
RU2711976C1 (en) Method for remote recognition and correction using a virtual reality of a psychoemotional state of a human
Rieth et al. Priming and habituation for faces: Individual differences and inversion effects.
Vivekanandam Evaluation of activity monitoring algorithm based on smart approaches
US20220067519A1 (en) Neural network synthesis architecture using encoder-decoder models
Hosseini et al. Convolution neural network for pain intensity assessment from facial expression
KR102528032B1 (en) Method and system for checking fatigue of pilot before flying
CN112515674A (en) Psychological crisis early warning system
GB2623553A (en) Computer-implemented method for generating personal data on a condition of a person
CN111202534A (en) Emotion prediction method based on group temperature monitoring
CN112036328A (en) Bank customer satisfaction calculation method and device
Jazouli et al. Stereotypical motor movement recognition using microsoft kinect with artificial neural network
Gamage et al. Academic depression detection using behavioral aspects for Sri Lankan university students
Ghamen et al. Positive and negative expressions classification using the belief theory
CN109635778B (en) Risk behavior monitoring and early warning method and system suitable for special population
Chavez-Guerrero et al. Classification of Domestic Dogs Emotional Behavior Using Computer Vision
Ahmed et al. Assisting the autistic with improved facial expression recognition from mixed expressions
Budarapu et al. Early Screening of Autism among Children Using Ensemble Classification Method
Isaeva et al. Making decisions in intelligent video surveillance systems based on modeling the pupillary response of a person
Puteri et al. Micro-sleep detection using combination of haar cascade and convolutional neural network
Sengupta et al. Driver sleep detection: A new and accurate approach
Álvarez et al. Consumer acceptances through facial expressions of encapsulated flavors based on a nanotechnology approach
Arsić et al. System for detecting driver’s drowsiness, fatigue and inattention
US20240212332A1 (en) Fatigue level determination method using multimodal tensor fusion
US20240138762A1 (en) Automated impairment detection system and method