WO2008064431A1 - Method and system for monitoring emotional state changes - Google Patents

Method and system for monitoring emotional state changes

Info

Publication number
WO2008064431A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
emotional state
subject
emotional
cognitive
Application number
PCT/AU2007/001854
Other languages
English (en)
Inventor
Rajiv Khosla
Original Assignee
Latrobe University
Priority claimed from AU2006906746A
Application filed by Latrobe University filed Critical Latrobe University
Priority to AU2007327315A (AU2007327315B2)
Publication of WO2008064431A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition
    • G06V 40/176 Dynamic expression
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/70 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics, for mining of medical data, e.g. analysing previous cases of other patients

Definitions

  • the field of the present invention is the automated monitoring of emotional state changes.
  • An example of an application for the method of the present invention is for monitoring a person's emotional reactions during an interview or assessment process.
  • Facial expressions, vocal expressions, gesture, posture, gait and physiological data such as blood pressure and heart rate can all be used for the purpose of analysing the emotional state of a human subject.
  • Measuring of physiological data typically requires some physical invasion, such as the application of heart rate or blood pressure monitoring equipment, which can be awkward or discomforting for the subject.
  • Non-invasive monitoring of emotional states typically requires a trained person to observe facial expressions, gestures, postures and vocal expressions of a subject and make an educated, but subjective, assessment of the subject's emotional state based on these observations. However, changes in facial expression are often very slight and difficult to observe or qualify.
  • Emotional systems in humans influence many cognitive processes (e.g., decision making, focus and attention, goal generation, categorization).
  • Recognition of emotional information is a key part of human-human communication.
  • the correlation between a person's expression and their words or actions can influence the level of confidence or trust another person has in the first person.
  • the level of confidence in the subject's responses to questions can be influenced by observation of their expressions and reactions when providing the response; the interviewer gets a sense of whether the person is lying or withholding information based on these observations. However, this is a subjective assessment, dependent on the skill of the interviewer.
  • a known example of an objective measure of a person's reactions during an interview is a polygraph or "lie- detector" test occasionally used during police interviews where physiological reactions are monitored and used to indicate whether a person is being truthful.
  • this is an invasive process requiring the subject to be connected to various physiological monitoring equipment.
  • interviewing is a complex social interaction between candidate and interviewer.
  • the interviewing process assesses applicant interests, motivation or affinity for a particular job.
  • personal qualities such as oral communication, decisiveness and manner of self-presentation are evaluated in the interview.
  • a problem with the interview process is that decisions are based on subjective analysis, by the interviewer, of the candidate's reactions.
  • IT Information technology
  • the DISC personal profiling system is a personality behavioural testing profile using a four dimensional model of normal behaviour in an assessment, inventory, and survey format either as a self-scored paper or online version.
  • the four behavioural dimensions are: Dominance, Influence, Steadiness and Conscientiousness/Compliance (DISC).
  • four-dimensional human behaviour can be studied using a two-axis model based on a person's actions in a favourable or an unfavourable environment, providing observational methods to demonstrate how four primary emotions are related to a logical analysis of neurological results.
  • the DISC personal profile system can present a plan helping to understand candidates and others in a specific environment. Through using these profiles it is possible to understand their behaviour and temperament and identify the environment most conducive for personal and organisational success. At the same time, they can learn about the ways others differ, and the different environments people need for maximum productivity and teamwork in an organisation. Research evidence supports the conclusion that the most effective candidates are those who know themselves, recognize the demands of a situation, and adapt strategies to meet those needs. By maximizing strengths and minimizing weaknesses, these profiles can ensure that individuals are able to perform to potential. These personal profiles enable the candidates to identify their behavioural, character and temperament profiles, capitalise on their strengths, anticipate and minimise potential problems and conflict and read and understand others better.
  • the DISC model does not measure personalities but rather behaviour in a specific situation. Unlike other profiles, the DISC model assumes that candidates have the ability to choose their preferred style and behave in a manner they have some control over. Although everyone tends to have a default behaviour style, by being aware of the process they are able to understand themselves and those they deal with to ensure they can gain maximum benefit out of any situation. Thus, DISC has been used in many areas including: team development, leadership, change management, negotiation, sales and conflicts.
  • Myers-Briggs Personality Type Indicator ® is a questionnaire based on the psychological teachings of Carl Jung.
  • the Myers-Briggs Personality Type Indicator ® measures a person's preferences, using four basic scales with opposite poles. The four scales are: (1) Extraversion/Introversion (describes where people prefer to focus their attention and get their energy: from the outer world of people and activity, or their inner world of ideas and experiences); (2) Sensate/Intuitive (describes how people prefer to take in information: focused on what is real and actual, or on patterns and meanings in data); (3) Thinking/Feeling (describes how people prefer to make decisions: based on logical analysis, or guided by concern for their impact on others); and (4) Judging/Perceiving (describes how people prefer to deal with the outer world: in a planned, orderly way, or in a flexible, spontaneous way).
  • the Indicator is a very useful tool to enlarge and deepen candidates' self-knowledge and understanding of their behaviour. It can be of real benefit to them in making informed life-choices and in relationship building. From a recruiter's perspective it can be used to help understand the candidate's personality and preferences. It is not a test; there are no right or wrong answers. This is an "indicator" of candidates' personality as they see the "real them".
  • a method for monitoring emotional state changes in a subject based on facial expressions comprising the steps of: capturing first facial image data of the subject and second facial image data of the subject at first and second times respectively; and processing the first and second facial image data to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data.
  • a system for monitoring emotional state changes in a subject based on facial expressions comprising: an image capturer adapted to capture first facial image data and second facial image data of the subject at first and second times respectively; and a processor adapted to process the first and second facial image data to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data.
  • a series of facial images of the subject is electronically captured as facial image data, wherein each facial image is separated from the next by a designated time period.
  • a series of facial images may be captured by a visual image recorder as frames of a video stream.
  • the first and second facial image data may be selected from this series of facial images.
  • processing of the first and second image data includes identifying changes between the first and second facial image data using an image processor, and analysing the changes between the facial image data for the first and the second facial images to produce the emotional state change data.
  • the emotional state change data characterises the direction of the emotional state change (positive or negative).
  • the emotional state change data also characterises the intensity of the emotional state change.
  • the facial images may be normalised to compensate for changes in head placement between the first and second images.
  • geometric normalisation based on tracking the eyes or the relative location of facial features or positions on the face or head may be used, such as ears, nose, forehead, or hair line.
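The disclosure does not give code for this step; the following is a minimal sketch of eye-based geometric normalisation using OpenCV, assuming the eye centres have already been located by a detector. The canonical target coordinates and output size are illustrative.

```python
import cv2
import numpy as np

def normalise_face(image, left_eye, right_eye, size=(180, 200)):
    """Rotate, scale and translate a face image so the detected eye
    centres land on fixed canonical positions, compensating for
    changes in head placement between captures."""
    dst_left, dst_right = (54, 80), (126, 80)  # illustrative targets
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = np.degrees(np.arctan2(dy, dx))       # eye-line angle
    scale = (dst_right[0] - dst_left[0]) / np.hypot(dx, dy)
    centre = ((left_eye[0] + right_eye[0]) / 2.0,
              (left_eye[1] + right_eye[1]) / 2.0)
    matrix = cv2.getRotationMatrix2D(centre, angle, scale)
    # Shift so the midpoint between the eyes lands midway between
    # the canonical target positions.
    matrix[0, 2] += (dst_left[0] + dst_right[0]) / 2.0 - centre[0]
    matrix[1, 2] += dst_left[1] - centre[1]
    return cv2.warpAffine(image, matrix, size)
```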
  • Changes between facial images may be identified and characterised to provide emotional state data using a number of image processing techniques and analysis.
  • Gabor wavelets are used to identify changes between facial images, and these changes are analysed using a neural network to provide the emotional state change data.
  • the neural network can be trained to classify emotional state changes .
  • changes between the facial images indicative of emotion states are determined using an optical flow algorithm to track displacement of selected facial features or selected pixels, the selected pixels being indicative of facial features associated with emotional expression.
  • fuzzy logic rules can be used to determine emotional state change data based on the direction and magnitude of the displacement for each feature.
  • emotional state change data can include data indicating the time an emotional state change occurred. This time may be a relative or absolute time. For example time-stamping of the start of a sequence of images can be used to determine the time of an emotional state change relative to the start time of the sequence, or recording the time each individual image within the sequence was recorded can be used to provide an absolute or actual time the emotional state change occurred.
  • a method for correlating emotional state changes with a subject's cognitive response to stimulus comprising the steps of: applying a stimulus to the subject to evoke a cognitive response from the subject; monitoring the subject during the time the stimulus is applied to obtain emotional state change data indicative of emotional reaction of the subject to the stimulus; and associating the emotional state change data with the cognitive response to the stimulus.
  • a system for correlating emotional state changes with a subject's cognitive response to stimulus comprising: a monitor adapted to monitor the subject on application of a stimulus to evoke a cognitive response to obtain emotional state change data indicative of the emotional reaction of the subject to the stimulus; and a processor adapted to associate the emotional state change with the cognitive response to the stimulus.
  • the monitor captures data indicative of the subject's emotional state changes during application of the stimulus.
  • the captured data is then analysed to provide emotional state change data for the subject during the time the stimulus was applied.
  • a sequence of stimuli is applied to the subject, and the subject's emotional reactions monitored for the duration of the application of the sequence and the recorded data analysed to provide emotional state change data associated with each stimulus in the sequence.
  • the recorded data may include time-stamp data indicating the timing of the application of each stimulus.
  • the relative timing of each emotional state change from the time where the first stimulus was applied can be compared with the relative timing of the application of each stimulus in the sequence.
  • the cognitive response may also be recorded and the timing of each cognitive response also used for correlation of the cognitive and emotional responses. For example, the time taken between the application of a stimulus, such as asking a question, and making of a cognitive response, such as answering the question, can be used.
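As an illustration of this correlation step, the sketch below aligns time-stamped stimuli, cognitive responses and emotional state changes on a common clock. The Event structure, its field names and the five-second association window are assumptions for the example rather than part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str      # "stimulus", "response" or "emotion_change"
    time: float    # seconds from the start of the recording
    payload: dict  # e.g. question id, answer text, change direction

def correlate(events, window=5.0):
    """Attach to each stimulus the first cognitive response and any
    emotional state changes seen within `window` seconds, plus the
    response latency (time from stimulus to response)."""
    events = sorted(events, key=lambda e: e.time)
    correlated = []
    for s in (e for e in events if e.kind == "stimulus"):
        response = next((e for e in events if e.kind == "response"
                         and 0 <= e.time - s.time <= window), None)
        changes = [e for e in events if e.kind == "emotion_change"
                   and 0 <= e.time - s.time <= window]
        correlated.append({
            "stimulus": s.payload,
            "response": response.payload if response else None,
            "latency": response.time - s.time if response else None,
            "emotion_changes": [c.payload for c in changes],
        })
    return correlated
```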
  • a cognitive response is a response made by the subject employing their knowledge, perception or conception.
  • a cognitive response is a response made consciously, generally employing reason.
  • the cognitive response may be verbal or written, may be keystrokes made on a computer keyboard or the keypad of a user device, or may be a selection from a number of objects/colours/letters/sounds etc.
  • the form of cognitive responses may be related to the stimuli.
  • a cognitive response may be non-verbal, such as a gesture, for example raising an arm or finger in response to a question or instruction, or pointing to a selected item from a set.
  • Cognitive responses may include a subject's actions and interactions with their environment, for example selection of items, links or articles to view in an on-line environment, or actions taken in a physical environment.
  • a predetermined sequence of stimuli can be prepared and applied to the subject, such as a survey, game, website or structured environment such as a simulation.
  • This sequence may also be adapted or modified.
  • the order or structure of the sequence may be modified to avoid repetition and learning of the sequence.
  • the sequence may be dynamically adapted in response to responses given during the course of the sequence.
  • the sequence of stimuli may involve the interaction between the subject and their environment, for example the reaction of a medical patient to pain they experience or the effect of treatments, or a person performing an Internet search and the web sites they enter. In these cases both the cognitive actions of the individual who is the subject of the monitoring and their emotional reactions are captured and time stamped for correlation.
  • a third party's involvement such as the questions asked or treatment applied by a doctor can also be monitored and time stamped for correlation with the subject's responses.
  • the subject's reactions to stimuli are monitored by recording facial images
  • the above described method and system for monitoring emotional state changes using facial images can be used to obtain emotional state change data.
  • the synchronisation of the emotional state changes with the cognitive response by the subject can be based on the timing of the visual image recording.
  • the selection of images, to compare for determining emotional state changes, out of a series of images can be based on the timing of the application of the stimulus and/or the subject's response.
  • data can be presented correlating the subject's cognitive response to a stimulus with the emotional change data exhibiting the emotional change experienced by the subject during the application and response to the stimulus.
  • this data can be simply presented as an output for analysis by a person.
  • the emotional state change data can be applied to qualify a subject's cognitive response.
  • a series of cognitive and emotional responses can be analysed to determine trends in emotional state changes in comparison with cognitive responses.
  • Analysis of correlated cognitive response data and emotional state change data may also include comparison of data between individual subjects, among a group of subjects, or between a subject and model response data and emotional data.
  • the method and system for correlating emotional state changes with a subject's response to stimulus can further be used for modelling correlated cognitive and emotional responses.
  • an emotional response model may be developed based on a typical or desired individual subject, for example a model employee for a particular role.
  • such a method for developing an emotional response model comprises the steps of: selecting a model subject; applying a predetermined sequence of stimuli to the subject to elicit a series of cognitive responses from the subject; monitoring the cognitive responses of the subject to each stimulus and recording the cognitive response data; monitoring the subject during the time when each stimulus is applied and automatically recording data indicative of emotional reaction of the subject to each stimulus; analysing the recorded data to provide emotional state change data for the subject during the time when the stimulus was applied for each stimulus; correlating the cognitive response data with the emotional state change data for each stimulus; and developing an emotional response model for the sequence of stimuli based on the correlated data.
  • Modelling correlated cognitive and emotional responses can also be based on a number of subjects using the method above applied to a number of subjects and performing further modelling analysis.
  • the further modelling analysis may vary depending on the number of subjects.
  • a model may be developed based on statistical norms within the sample. The above steps can be applied to each subject in the sample and statistical analysis performed on the data acquired from all the subjects.
  • subjects may be classified into groups based on particular trends or similarities in individual subject's cognitive data and/or emotional response data and the model based on particular trends within a group. For example, out of a sample of subjects who all complete the above steps, the data can be analysed to select a group of subjects who all showed similar cognitive responses. The emotional responses can also be analysed to identify similarities across the group or to qualify cognitive responses. A model can be developed based on the combined cognitive and emotional responses for the group. The analysis performed to develop such a model can vary depending on the size of the group and the context.
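One plausible realisation of this grouping step, not specified in the source, is to cluster subjects on numerically coded response vectors; the sketch below uses k-means from scikit-learn on stand-in data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in data: one row per subject, one column per survey question,
# each entry a numerically coded cognitive response.
responses = rng.integers(1, 5, size=(30, 17)).astype(float)

# Group subjects whose cognitive responses follow similar patterns;
# a per-group model can then be built from each cluster's members.
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(responses)
for g in range(3):
    members = responses[groups == g]
    print(f"group {g}: {len(members)} subjects, "
          f"mean profile {members.mean(axis=0).round(2)}")
```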
  • the sequence of stimuli may be a survey relating to job behaviour and attitudes, which may produce very distinct trends in both cognitive and emotional responses. In contrast, in a holiday destination selection context reactions may show greater variation across a sample of subjects, as the selection of a holiday destination is subject to whim, whereas responses to a job-related survey are based on expectations of a particular job role or task.
  • a method of acquiring cognitive and emotional state change data comprising the steps of: monitoring a subject's cognitive actions while performing a task; capturing cognitive data associated with the task; monitoring the subject's emotional reactions while performing the task; capturing data indicative of the subject's emotional state changes during the task; analysing the captured data to provide emotional state change data for the subject during the time when the task is being performed; and correlating the cognitive data with the emotional state change data for each cognitive response.
  • the cognitive and emotional state change data can then be applied in a decision making context.
  • a system adapted to present correlated cognitive and emotional state change data for use in decision making, the system comprising: a controller arranged to apply a decision making model for executing a plurality of decision making phases and applying rules in accordance with the model; and a plurality of agents each adapted to perform a function for use in the decision making process, at least one of the agents being adapted to provide emotional state change data associated with a decision making context, the controller and agents being implemented such that the controller is adapted to request actions be performed by the plurality of agents in accordance with the decision making model.
  • the controller controls the execution of a number of decision making phases, such as: a pre-processing phase for data acquisition and manipulation of raw data; a context elicitation phase for analysis of the data to identify a situation with a problem to be solved; a situation interpretation labelling phase for identifying the required tasks and data associated with the problem to be solved according to a predetermined decision making model; a situation action phase for executing tasks associated with the identified problem based on the data; and a situation adaptation phase for monitoring results from task execution and reactions from outside the system to those results, including emotional reactions, enabling these external reactions to be fed back into the system for adaptation of the predetermined model.
  • the controller and agents are arranged in a layered architecture, the layers comprising: a reactive layer including agents for raw data acquisition and basic data manipulation; an intelligent technology layer including agents for data mining and identification of patterns in the acquired data; a cognitive sensemaking layer for coordinating the use of agents associated with other layers and interpretation of data in accordance with a decision making context; an affective sensemaking layer including agents for the acquisition and analysis of emotional state change data; a situation adaptation layer including agents for monitoring decision making results and reactions thereto and agents for analysing the reactions in the context of the situation for feedback into the decision making context and adaptation of the decision making model; a distribution coordination layer including agents for coordinating communication between user and agent; and an object layer including domain agents for use by the agents of the other layers to facilitate data processing and presentation for the user.
  • the cognitive sensemaking layer coordinates the activity of the various layer agents in accordance with a decision making model associated with a context.
  • Agents include procedures implemented in software and/or hardware for performing functions, for example interaction between the decision making system and the user environment, acquisition of data, or analysis of data according to the given function. For example, one agent may acquire facial images from a system user and analyse these to provide emotional state change data; another agent may record information input to the system from another source, such as historical data for an individual from a database or other source; another agent may perform analysis to compare current information input to the system by the user with the historical data.
  • the use or calling of the agents is controlled by the cognitive layer based on the decision making context and the decision making rules being applied, with the parameter or data provided to the agents by the system also based on the decision making rules .
  • each agent performs a specific function which is independent of the decision making context.
  • the decision making context, such as determining credit card approval, suggesting holiday destinations, or prescribing medical treatments, determines the decision making model and rules which are applied to call the agents and to use the results of the agents' functions in the decision making context.
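As an illustration only, the layered controller/agent arrangement described above might be wired together as in the following sketch. The class names, the registry structure, and the representation of the decision making model as an ordered list of (layer, agent, request) phases are assumptions made for this example; the source specifies the architecture only at the functional level.

```python
class Agent:
    """An agent performs one specific, context-independent function."""
    def perform(self, request, data):
        raise NotImplementedError

class Controller:
    """Executes the phases of a decision making model, calling on
    agents registered under the named layers."""
    def __init__(self):
        self.layers = {}  # layer name -> {agent name -> Agent}

    def register(self, layer, name, agent):
        self.layers.setdefault(layer, {})[name] = agent

    def run(self, phases, data):
        # `phases` encodes the decision making model as an ordered
        # sequence of (layer, agent, request) steps, e.g. pre-processing,
        # context elicitation, interpretation, action, adaptation.
        for layer, name, request in phases:
            data = self.layers[layer][name].perform(request, data)
        return data
```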
  • Figure 1 illustrates an example of a system for monitoring emotional state changes.
  • Figure 2 is a functional block diagram of the system of Figure 1.
  • Figure 3 shows the affect space model used for mapping emotional states.
  • Figure 4 shows an example of image processing and analysis according to one embodiment of the invention using Gabor wavelets for image processing and a neural network for analysis to determine emotional state changes .
  • Figure 5 is an example of a feature identification mask used in an embodiment of the invention using an optical flow algorithm for image processing.
  • Figure 6 is an example of the results of the optical flow algorithm indicating the difference in facial features between two facial images .
  • Figure 7 is an illustration of examples of relative movements of facial features mapped to emotional state changes.
  • Figures 8a, b & c illustrate a series of emotional state changes over time with consistent intensities.
  • Figures 9a, b & c illustrate a series of emotional state changes over time having varying intensities.
  • Figure 10a, b & c illustrate examples of transient emotional state transitions exhibited during a transition from one absolute emotional state to another.
  • Figure 11 is a functional block diagram of a system for correlating a subject's emotional state changes with their cognitive responses.
  • Figure 12 illustrates a salesperson behavioural model mapped onto fuzzy categories for the application of fuzzy rules.
  • Figure 13 illustrates Maslow's hierarchy of personal needs overlayed with transitory behaviour categories of the salesperson behavioural model .
  • Figure 14 illustrates transitory behaviour categories of the salesperson model .
  • Figure 15 illustrates the steps for determining a sales candidate's primary fuzzy selling behaviour.
  • Figure 16 illustrates a sales candidate's cognitive responses to a number of questions relating to competition.
  • Figure 17 illustrates a candidate's emotional state changes experienced while answering a number of questions relating to competition.
  • Figure 18 illustrates a candidate's cognitive responses and associated emotional state changes for questions relating to success and failure mapped onto a common axis.
  • Figure 19a illustrates parallel responses between a subject and a benchmark.
  • Figure 19b illustrates opposing responses between a subject and a benchmark.
  • Figure 20a illustrates comparison of a sales candidate's emotional state changes with those of a benchmark.
  • Figure 20b illustrates comparison of a sales candidate's cognitive responses with those of a benchmark.
  • Figure 20c illustrates comparison of a sales candidate's emotional state changes over the course of an entire survey with those of a benchmark.
  • Figure 21 illustrates an example of an emotionally intelligent sales recruitment system.
  • Figure 22 illustrates an example of a system for automatically providing feedback regarding the emotional states of students in a lecture context.
  • Figure 23 illustrates an example of a system for monitoring critical event operator's cognitive and emotional responses .
  • Figure 24 illustrates an example of a system for monitoring a person's behaviour and emotional state changes during on-line holiday planning.
  • Figure 25 illustrates an embodiment of a system adapted for decision making and adaptive decision making using a combination of monitored cognitive and emotional reactions.
  • Figure 26 illustrates a typical object-based domain for banking products .
  • Figure 27 illustrates an example of an application of the decision making system of figure 25 applied within a banking product domain.
  • Figure 28 illustrates an example of situation-adaptation layer agents for credit card approval.
  • Figure 29 illustrates the performance comparison of the neural network back propagation agent without the adaptation applied and after adaptation.
  • the system 100 comprises a visual image recorder 120 and a processor 130.
  • the visual image recorder 120 is adapted to capture a series of facial images 200 of the subject 110 wherein each facial image is separated from the next by a known short period of time.
  • the processor 130 is adapted to identify changes between a first 210a and a second 210n facial image from the series 200, and analyse the changes in facial images 210a, 210n to provide emotional state change data for the subject for the period of time between the capture of the first facial image 210a and the second facial image 210n.
  • the first step in the monitoring of emotional state changes is the capturing of a sequence 200 of facial images 210a to 210n of the subject 110, wherein each image is separated by a known short period of time.
  • the visual image recorder 120 can be a video camera which captures images and digitally stores the images either within memory within the camera or an external memory device such as the memory of a computer linked to the camera.
  • a digital video camera will capture a predetermined number of images, also known as frames, per second. Typically the number of frames per second is constant for a recording period. The number of frames per second captured may vary depending on the type of camera or be variable for a camera depending on a user selected property. Thus where the images are captured using a digital video camera the length of time between the capture of each image in the sequence will be known.
  • a first and a second image from a sequence of images are compared and changes between the two images identified.
  • the first and second images can be sequential images captured one after the other, for example image 210b and image 210c, or separated by a number of images, for example first image 210a and second image 210n.
  • the selection of the images may be based on the timing of the images or the time separation of the images .
  • the first and second images may be selected based on an external trigger, such as a first image captured immediately before an external stimulus is applied and a second image captured after the stimulus is applied.
  • the second image may be captured half a second after the stimulus is applied.
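Selecting the image pair around a stimulus reduces to an index lookup over per-frame timestamps, as in this small sketch (the half-second offset mirrors the example above):

```python
def frames_around_stimulus(timestamps, stimulus_time, after=0.5):
    """Return the index of the last frame captured before the stimulus
    and the first frame at least `after` seconds after it."""
    first = max(i for i, t in enumerate(timestamps) if t < stimulus_time)
    second = min(i for i, t in enumerate(timestamps)
                 if t >= stimulus_time + after)
    return first, second

# e.g. a 10 frames-per-second recording with a question asked 2.04 s in:
times = [i / 10.0 for i in range(60)]
print(frames_around_stimulus(times, 2.04))  # -> (20, 26)
```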
  • Image processing 220 identifies changes between the first and second images. These facial image changes 230 are analysed 240 to provide emotional state change data 250 for the subject for the period of time between the capture of the first facial image 210a and the second facial image 210n.
  • the emotional state change data characterises the emotional state change using at least one measure, such as a change in the direction, positive or negative, or in the intensity of the emotional state change.
  • Facial expressions are an important physiological indicator of human emotions.
  • An affect space model 300 used by psychologists is shown in figure 3.
  • the model 300 uses three dimensions each represented on an axis of the model.
  • the first dimension is Valence, which is measured on a scale of pleasure (+) to displeasure (-), as represented on the valence axis 320.
  • the second dimension is Arousal, which is measured on a scale of excited/aroused (+) to sleepy (-), as represented on the arousal axis 310.
  • the third dimension is Stance which is measured on a scale of high confidence (+) to low confidence (-), as represented on the stance axis 330.
  • Facial expressions correspond to affect states such as happy 340, surprise 350 and bored 360.
  • the affect space model 300 characterises absolute emotional states based on the three dimensional measures, valence, arousal and stance. Versions of the affect space model have mapped affect states onto facial expressions.
  • Embodiments of the present invention are concerned with determining the change in emotional state rather than determining the absolute emotional state of the subject.
  • the emotional state change data can be a measure only indicating change in one dimension, for example whether the emotional state change is in a positive or negative direction, or neutral in the case of no change.
  • the emotional state change may be a vector indicating a direction (+ve or -ve) and intensity of the change .
  • the changes in facial image are mapped to a vector indicating the transition between two states of an affect space model 300.
  • a transition between the emotional states of anger 355 and fear 365 is indicated by a reduction in the level of arousal, without a change in valence, whereas a transition between anger 355 and defensive 390 also includes a negative valence change and also some negative stance change.
  • a number of methods can be used to identify changes between the first and second images and to analyse these changes to provide emotional state change data.
  • the methods used for analysing the identified facial image changes to determine the emotional state change can vary depending on the image processing method used to identify the facial image changes .
  • Gabor wavelet filters at varying scales and orientations have been used to represent facial expression images.
  • the image processing step described with reference to Figure 4, uses Gabor wavelets to identify the facial image changes between subsequent images 410a, 410b and 410c.
  • the image processing step 220 of this embodiment includes the steps of pre-processing to obtain difference images between frames in a video stream, and feature representation using Gabor wavelets .
  • the analysis step 240 includes classification into positive, negative or neutral emotional states by a neural network.
  • Pre-processing can include the optional steps of finding the eye coordinates or other facial features in the video stream, and geometric normalisation centred on the eyes (or other facial features) prior to generation of difference images.
  • n was selected to be 3 i.e. the subtracted image was the image three frames earlier.
  • image 410a is the first selected image
  • image 410b is the image 3 frames after image 410a
  • image 410c is the image 3 frames after image 410b.
  • the video was recorded at 10 frames per second in this example, so the time difference between images 410a and 410b, and between 410b and 410c, was approximately 300 milliseconds.
  • Figure 4 shows the difference image 420a, 420b, 420c between two subsequent images of a subject determined using Gabor wavelets, and the subject's emotional state response 430a, 430b and 430c determined based on a difference image.
  • the difference image 420b represents a plurality of vectors of Gabor jets calculated at points on a grid overlaid on the images 410a and 410b after normalisation.
  • a Gabor jet is the magnitude of the complex-valued Gabor wavelet filter output which is applied to determine the difference between the first 410a and second 410b images.
  • the filters used were at three scales an octave apart and at six orientations within each scale. The bandwidth of the masks was approximately an octave.
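A sketch of this processing chain is given below: a difference image between two frames followed by a bank of Gabor filters at three scales an octave apart and six orientations per scale. The kernel size, base wavelength, sigma-to-wavelength ratio and grid spacing are assumptions, and only the real filter response is used, so it illustrates the pipeline rather than reproducing the patent's exact filters.

```python
import cv2
import numpy as np

def gabor_bank(scales=3, orientations=6, base_wavelength=4.0):
    """Filters at three scales an octave apart, six orientations per
    scale; kernel size and sigma/wavelength ratio are assumptions."""
    kernels = []
    for s in range(scales):
        lambd = base_wavelength * (2 ** s)  # octave spacing
        for o in range(orientations):
            theta = o * np.pi / orientations
            kernels.append(cv2.getGaborKernel(
                ksize=(21, 21), sigma=0.56 * lambd, theta=theta,
                lambd=lambd, gamma=1.0, psi=0))
    return kernels

def gabor_jet_vector(frame_t, frame_t_minus_n, grid_step=16):
    """Difference image between two (grayscale) frames, then filter
    responses sampled on a coarse grid and concatenated into one
    feature vector. Only the real response is used here; a full Gabor
    jet would also combine the quadrature (psi = pi/2) response."""
    diff = cv2.absdiff(frame_t, frame_t_minus_n).astype(np.float32)
    jets = []
    for kernel in gabor_bank():
        response = cv2.filter2D(diff, cv2.CV_32F, kernel)
        jets.append(np.abs(response[::grid_step, ::grid_step]).ravel())
    return np.concatenate(jets)
```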
  • the analysis of the difference images 420a, 420b and 420c is performed using a neural network to determine any emotional state change based on the change in facial expression between two subsequent images.
  • the neural net was trained on a selection of image sequences from the Cohn-Kanade facial expression database.
  • the Cohn-Kanade facial expression database provides samples of facial images representing a variety of absolute emotional states and facial expressions assumed during transitions from one emotional state to another. The sequences of images in this database are taken from video and so it was assumed the frame rate is approximately 30 frames per second.
  • n was selected to be 8 representing a time difference of approximately 270 milliseconds.
  • the sequences selected were the ones that represented joy, anger and expressions ranging therebetween which were classed into positive and negative emotion respectively.
  • Difference images representing no change, i.e. relatively flat images, were generated artificially and were classed as neutral emotion, i.e. no change in emotional state.
  • the neural net architecture was 1296 input nodes, corresponding to the dimensionality of the Gabor jets vector, 10 hidden nodes and 3 output nodes. Back propagation was the training algorithm used.
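A network of the stated shape can be sketched with scikit-learn's MLPClassifier, which is trained by gradient-descent back-propagation. The random stand-in data below only demonstrates the dimensionality (1296 inputs, 10 hidden nodes, 3 output classes); the patent's training used Cohn-Kanade difference images.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Stand-in data shaped like the description: 1296-dimensional Gabor
# jet vectors with labels 0 = negative, 1 = positive, 2 = neutral.
X = rng.normal(size=(300, 1296))
y = rng.integers(0, 3, size=300)

# 1296 inputs -> 10 hidden nodes -> 3 outputs, trained by
# gradient-descent back-propagation.
net = MLPClassifier(hidden_layer_sizes=(10,), solver="sgd",
                    learning_rate_init=0.01, max_iter=500)
net.fit(X, y)
print(net.predict(X[:5]))        # class per difference image
print(net.predict_proba(X[:5]))  # relative strength of each class
```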
  • the output of the network was visualised by an image 430a, 430b and 430c where different areas of the image 431, 432 and 433 represented different classes and/or relative strength of the three different classes namely neutral 433, positive 432 or negative 431 emotion state change.
  • the three output nodes of the neural net, representing the three classes, were mapped to red, green and blue for negative, positive and neutral change respectively.
  • the top half of the visualisation image 434 was a colour representing all three classes' colours mixed together.
  • the bottom half of the image was divided into three equal areas 431, 432, 433, each devoted to one of the three colours/classes. Ideally it would be expected to display only one of red, green or blue.
  • the image processing method uses an optical flow algorithm to track changes in facial features associated with emotional expression.
  • Optical flow tracks the displacement of selected facial features or selected areas of an image based on the changes in pixels when comparing two images, in this case consecutive images in a video stream.
  • the selection of features or areas of the image is based on the Facial Action Coding System (FACS) developed by Ekman and Friesen, which is a method of measuring facial activity in terms of facial muscle movements.
  • FACS consists of over 45 distinct action units corresponding to a distinct muscle or muscle group. More than 7,000 different action unit combinations have been observed.
  • Although FACS has been criticized as capturing only a spatial description of facial activity and ignoring the temporal component, it is perhaps the most widely used language to describe facial activity at the muscle level.
  • the subtle variations are usefully modelled by tracking the eye shape and movement, eyebrow movement, and cheek and lip movement.
  • the facial action units associated with the eyes, eyebrows, cheeks and lips analysed in this project include (but are not limited to) the inner eyebrow raiser, outer eyebrow raiser, eyebrow lowerer, upper eyelid raiser, eyelid tightener, lower lip depressor and cheek raiser action units and their combinations.
  • these facial action units are tracked/recognised and mapped to positive and negative emotional states.
  • To track subtle changes we utilise the video stream's inherent temporal relationship between consecutive images (corresponding to consecutive questions) in the stream and recognise the facial action units in consecutive images with time. We then extract the facial changes from the facial units of say two consecutive images for analysis to model the functional relationship between extracted facial changes from facial units and positive and negative emotional states.
  • the particular algorithm used was the Lucas & Kanade algorithm, an efficient sparse optical flow algorithm.
  • the changes in facial features of interest are directly tracked.
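A minimal sketch of this sparse tracking, using OpenCV's pyramidal implementation of the Lucas & Kanade algorithm, is shown below. The window size and pyramid depth are illustrative choices; the resulting per-point displacements can then be normalised against the forehead and nose reference points described further below.

```python
import cv2
import numpy as np

def track_points(prev_gray, next_gray, points):
    """Track mask points from one grayscale frame to the next with
    pyramidal Lucas & Kanade sparse optical flow."""
    pts = np.float32(points).reshape(-1, 1, 2)
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts, None, winSize=(15, 15), maxLevel=2)
    new_pts = new_pts.reshape(-1, 2)
    return new_pts, new_pts - np.float32(points), status.ravel()

def relative_motion(displacement, ref_indices=(0, 1)):
    """Subtract the mean motion of the reference points (forehead and
    nose in the example below) so that head movement is not mistaken
    for a change in expression."""
    head_motion = displacement[list(ref_indices)].mean(axis=0)
    return displacement - head_motion
```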
  • An example of this embodiment is described with reference to figures 5 and 6.
  • the video stream (as in the previous example) was used unprocessed as an input to the optical flow algorithm.
  • a template or mask of points to be tracked on the face(s) in the video stream was initialised for the sparse optical flow algorithm to track, as shown in image 500 of figure 5.
  • the tracking points mask used to generate the results in this example was initialised by the use of object detectors to locate faces as a whole and facial features as reference points for points mask placement.
  • This example tracks whether a person's emotional state is changing towards a positive or negative expression.
  • a simple mask consisting of eight points 501-508 was generated, as shown on image 500 in figure 5.
  • Two points 501, 502 on areas of the face which are relatively inert with respect to movement during facial expression changes were used as reference points.
  • the other six tracking points' 503-508 motions were calculated relative to the two reference points.
  • One of the two reference points 501 is on the forehead and the other reference point 502 is on the nose.
  • the tracking points 503-508 of the mask are placed to correspond to features or areas of the face which move when facial expressions change.
  • the tracking points are right eyebrow 503, left eyebrow 504, right cheek 505, left cheek 506, right upper chin 507 and left upper chin 508. It was found empirically that these six points were sufficient to differentiate between the two facial expression classes of interest in this example; more or fewer tracking points may be used in alternative embodiments depending on the application.
  • the relative motions of the points 501-508, as calculated by optical flow, are shown in figure 6.
  • each tracked point 501-508 is represented in figure 6 by a cross hair 601-608. Superimposed on these cross hairs are small circles 611-618 representing the relative position of the corresponding point 501-508 in the current image, so the displacement of the circle 611-618 from the centre of the cross hair 601-608 represents the relative motion of that point 501-508 from image to image.
  • the cross hair 620 is a superimposition of all the points 501-508, with the relative movements represented as blob 629 which corresponds to a superimposition of circles 611, 612 and 615-618, and circles 623 and 624 which correspond to circles 613 and 614. This represents that both the left eyebrow tracking point 504 and right eyebrow tracking point 503 moved further between the two images than any of the other feature tracking points.
  • the axis 630 is used to represent a gauge of positive or negative emotion predicted by the analysis of the points' 501-508 motions; positive being up, negative being down, and relative intensity indicated by the distance from the centre of the axis.
  • the points 501-508 from each different area have signature motions when a change in facial expression occurs between consecutive frames in the video stream.
  • the direction component indicates the class of facial expression represented by that point for the area it inhabits, and the displacement represents the intensity.
  • the left eyebrow 504 and right eyebrow 503 have a downward and slightly inward direction of motion when a glare, such as exhibited in an expression of anger, is tracked, and an upward movement when the eyebrows are raised in an expression of surprise or fear (more input from other points' relative motions would be needed to distinguish between surprise and fear).
  • These changes are analysed using fuzzy logic to model the observed movements and produce an indication 640 of relative positive or negative emotion displayed on the gauge 630.
  • the movement of each feature was characterised by the angle and magnitude of the movement in 2 dimensions.
  • the angle was simplified in this example to UP or DOWN and the magnitude classified as HIGH or LOW.
  • the Sugeno method was used for inferencing and defuzzification by weighted average.
  • Figure 7 illustrates the relative movements of facial features mapped onto the cheek axis 710, eyebrow axis 720 and mouth axis 730 to show the combinations of movements indicating emotional states.
  • This diagram illustrates that it is combinations of movements, rather than movements of a single feature, which the fuzzy rules use to determine an emotional state change.
  • neutral 760 indicating no emotional state change.
  • where the mouth moves with a high magnitude in combination with a high magnitude cheek movement in an upward direction and low movement of the eyebrows in either direction, this is deemed a high magnitude positive emotional state change 740.
  • the direction of the mouth or eyebrow movements is not considered significant; only the magnitude is significant.
  • both the direction and magnitude of the cheek movement are significant. For example, consider the difference between your face moving from a bland or neutral expression to a smile or to a laugh. In a smile the mouth, measured in our example based on the movement of the upper chin, moves upward; however, when laughing the mouth opens, so the movement of the upper chin is down. Either movement direction, in combination with an upward cheek movement and little movement in the eyebrows, can be interpreted as a positive emotional response of high magnitude.
  • a negative emotional state change 750 is indicated by a combination of low magnitude cheek and mouth movement in either direction with high magnitude downward eyebrow movement.
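Under the stated simplifications (UP/DOWN direction, HIGH/LOW magnitude, Sugeno inference, weighted-average defuzzification), the rules just described could be sketched as a zero-order Sugeno system like the following. The membership pivot and the crisp consequent values are assumptions for illustration.

```python
def high(magnitude, pivot=4.0):
    """Ramp membership of a movement magnitude (pixels) in HIGH;
    LOW is taken as the complement. The pivot is illustrative."""
    return max(0.0, min(magnitude / pivot, 1.0))

def sugeno_emotion(cheek, eyebrow, mouth):
    """Zero-order Sugeno inference. Each feature is a (dy, magnitude)
    pair with dy > 0 meaning UP; rules mirror the combinations in the
    text, consequents are crisp (+1 positive, -1 negative, 0 neutral)."""
    up = lambda dy: 1.0 if dy > 0 else 0.0
    rules = [
        # high mouth + high upward cheek + low eyebrow -> positive
        (min(high(mouth[1]), high(cheek[1]) * up(cheek[0]),
             1 - high(eyebrow[1])), +1.0),
        # low cheek and mouth + high downward eyebrow -> negative
        (min(1 - high(cheek[1]), 1 - high(mouth[1]),
             high(eyebrow[1]) * (1 - up(eyebrow[0]))), -1.0),
        # little movement anywhere -> neutral
        (min(1 - high(cheek[1]), 1 - high(eyebrow[1]),
             1 - high(mouth[1])), 0.0),
    ]
    total = sum(w for w, _ in rules)
    # Weighted-average defuzzification onto the -1..+1 gauge.
    return sum(w * z for w, z in rules) / total if total else 0.0

print(sugeno_emotion(cheek=(3, 5), eyebrow=(1, 0.5), mouth=(-2, 5)))
```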
  • the reference points on the forehead 501 and nose 502 can be used to normalise the measured movements of the tracking points 503-508 to compensate for any head movements .
  • the above examples describe two ways in which emotional state changes can be automatically determined from facial images.
  • the complexity of the image processing and emotional state analysis may be varied. For example more facial features may be tracked, more complex neural networks and fuzzy rules used, other aspects of facial expressions such as facial hue (for example flush or pallor) , body language such as head angles or posture may also be monitored, or verbal indicators such as tone, speed, pitch and modulation may also be monitored. All these alternatives and variations are contemplated within the scope of the present invention.
  • the emotional state change data may include data indicating the time an emotional state change occurred in addition to the data characterising the emotional state change, such as magnitude and direction.
  • Further, trends in a person's emotional state changes over time can give an indication of the person's emotional state change style. Over a relatively long duration, for example, someone may exhibit generally high magnitude emotional state change responses. Over a relatively short duration, for example while the person changes from one absolute emotional state to another, monitoring the transient changes within the absolute change can provide a measure of the absolute emotional state change, for example the stance measure for the level of confidence a person exhibits, which can add a further dimension to the emotional state change data.
  • Examples of trends in emotional state changes over time are shown in figures 8a-c and 9a-c.
  • the example in figure 8a shows emotional state changes of high magnitude in both positive and negative directions so the person being monitored appears to have a consistently high magnitude of emotional state change.
  • in figure 8b the person exhibits moderate or medium intensity emotional state changes, and in figure 8c the person exhibits only low intensity emotional state changes.
  • These examples can be used to classify a person's emotional state change style, for example very expressive for 8a and controlled or minimally expressive for 8c. Knowledge of this style trend can be of use when comparing the responses of two or more subjects.
  • the magnitude of the emotional state changes can be weighted or normalised when wishing to compare responses of more and less expressive subjects.
  • Figure 9a shows a series of emotional state changes where the magnitude of the changes is relatively constant; whether the magnitude is high or low may influence the interpretation of this trend. For example, suppose the emotional state changes of a person watching a romantic movie are being monitored: if the magnitude of the emotional state changes is low, this may indicate that the person is simply not interested or emotionally engaged with what they are watching, whereas if the emotional state changes are high, this may indicate that they are engaged and responding emotionally to what they are watching.
  • Figure 9b shows decreasing magnitude of emotional state changes over time.
  • Figure 9c is an example of a person's emotional state change intensity varying irregularly over time. Depending on the context this may indicate that the person lacks emotional stability, for example where, in the absence of any stimulus to evoke an emotional response, the person goes through a number of varying emotional state changes.
  • Alternatively, the varying level of emotional state change may indicate varying levels of emotional engagement with a variety of stimuli; for example, when looking at pictures of a variety of holiday destinations, some may be appealing and evoke a strong positive emotional response, whereas others may evoke less strong, neutral or negative responses of varying intensities.
  • When responding to a stimulus, such as looking at a photograph or answering a question, a person may experience a number of transient emotional state changes during a transition from one absolute emotional state to another, or back to the starting absolute state. Examples of such transient emotional state changes are shown in figures 10a to 10c.
  • The plot of the transient emotional state changes in figure 10a shows a single peak indicating a single emotional state change during the monitoring period.
  • Figures 10b and 10c show multiple peaks within the monitoring period, indicating the subject has shown more than one emotional state change during the monitoring period; this may indicate some confusion or lack of confidence leading to several emotional state changes.
  • the number and duration of the emotional state changes can indicate the level of confidence or confusion with a greater number of emotional state changes indicating lower confidence or greater confusion.
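As a toy illustration of this count, assuming the monitored signal is available as a sampled series of emotional state change magnitudes (the peak threshold below is hypothetical):

```python
def count_transient_changes(series, threshold=0.2):
    """Count peaks in a sampled emotional-state-change signal; more
    peaks within one monitoring period suggests lower confidence or
    greater confusion."""
    peaks = 0
    for prev, cur, nxt in zip(series, series[1:], series[2:]):
        if cur > prev and cur >= nxt and cur > threshold:
            peaks += 1
    return peaks

print(count_transient_changes([0, 0.6, 0.1, 0, 0.5, 0.1, 0]))  # -> 2
```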
  • Emotional state change data can include data indicating the time an emotional state change occurred. This time may be a relative or absolute time. For example time- stamping of the start of a sequence of images can be used to determine the time of an emotional state change relative to the start time of the sequence, or recording the time each individual image within the sequence was recorded can be used to provide an absolute or actual time the emotional state change occurred.
  • Embodiments of the present invention provide a method and system for correlating emotional state changes with a subject's cognitive response to stimulus.
  • the subject 1110 is monitored, using monitor 1120 during the time the stimulus is applied and data indicative of emotional reaction of the subject to the stimulus automatically recorded.
  • the data is then analysed, by the processor 1130, to provide emotional state change data for the subject during the time the stimulus was applied, for associating the emotional state change data with the cognitive response to the stimulus.
  • the stimulus may be applied to the subject via the processor or by another means, for example manually or via another processor.
  • the data indicative of emotional reactions of the subject can be facial image data, which is then analysed as described above, and/or other data indicative of a person's emotional reactions such as posture, vocal changes, physiological changes etc. These alternatives are all considered within the scope of the present invention.
  • a sequence of stimuli is applied to the subject and the subject's emotional reactions monitored for the duration of the application of the sequence and the recorded data analysed to provide emotional state change data associated with each stimulus in the sequence.
  • the recorded data may include time-stamp data indicating the timing of the application of each stimulus.
  • the relative timing of each emotional state change from the time where the first stimulus was applied can be compared with the relative timing of the application of each stimulus in the sequence.
  • the cognitive response may also be recorded by the system 1100 and the timing of each cognitive response also used for correlation of the cognitive and emotional responses. For example, the time when a stimulus, such as asking a question, is applied could be recorded by the system 1100.
  • the time taken by the subject 1110 between the asking of the question and making of a cognitive response, such as answering the question can be included in the recorded data for analysis.
  • the above described method and system for monitoring emotional state changes using facial images can be used to obtain emotional state change data.
  • the synchronisation of the emotional state changes with the cognitive response by the subject can be based on the timing of the visual image recording.
  • the selection of images, to compare for determining emotional state changes, out of a series of images can be based on the timing of the application of the stimulus and/or the subject's response.
  • FSRBS Fuzzy Sales Recruitment and Benchmarking System
  • AHP Analytical Hierarchy Process
  • Soft computing techniques for selling behaviour categorisation and benchmarking of salespersons.
  • This system uses a survey of specially formulated questions to determine a recruitment candidate's selling behavioural style based on the answers (cognitive responses) given by the candidate to the survey questions. A detailed explanation of this behavioural model is given below.
  • the behavioural model for classification of a salesperson's behavioural style in this example has two dimensions, namely 'Warm-Hostile' and 'Submissive-Dominant'. This model has been used based upon interactions with senior managers in the sales and human resources arena in the consumer and manufacturing industries in Australia.
  • Warmth is regard for others. A warm person is optimistic and willing to place confidence in others. Hostility is lack of regard for others, the attitude that other people matter less than oneself. A hostile person rarely trusts others.
• Submission is the disposition to let others take the lead in personal encounters. It includes traits like dependence, unassertiveness, and passiveness. Dominance is the drive to take control in face-to-face situations. It includes a cluster of traits like initiative, forcefulness, and independence.
• The Submissive-Dominant and Warm-Hostile dimensions give rise to four broad groups of salespersons and customers, i.e., Dominant-Hostile (DH), Submissive-Hostile (SH), Submissive-Warm (SW), and Dominant-Warm (DW).
  • DH Dominant-Hostile
  • SH Submissive-Hostile
  • SW Submissive-Warm
  • DW Dominant-Warm
• The linguistic variables provide information on the intensity (the extent to which a candidate's behaviour belongs to a fuzzy category within each category).
• In terms of the selling behaviour model, an SH salesperson is motivated by the needs of stability and security. An SW salesperson is motivated strongly by social needs and to a lesser extent by security and esteem needs. Thus, the SW salesperson sees the world as warm and accepting. The DH salesperson is motivated by strong needs of independence and self-esteem. They exhibit their needs through a strong desire to succeed. On the other hand, a DW salesperson is motivated strongly by needs of self-realisation and independence. In order to succeed, the DW salesperson controls the situation. Therefore, a DW salesperson is thorough with product knowledge, competition and other essential skills in sales.
• The purpose of the fuzzy selling behaviour model is to evaluate the primary selling behaviour category of a sales candidate before recruitment.
• The seventeen (17) areas are: selling as a profession, assertiveness, decisiveness, prospecting, product, customers, competition, success and failure, boss, peers, rules and regulations, expenses and reports, training, job satisfaction, view about people and relationship with non-selling departments.
• The Analytical Hierarchy Process (AHP) is a powerful tool used in flexible decision making processes to help set priorities.
• The AHP helps make the best decision when both qualitative and quantitative aspects of a decision need to be considered.
• The application of the AHP in this context covers the four objectives to be considered: competition, selling, customer and product.
• The first step in the AHP is pair-wise comparison.
• The AHP decides the relative importance of the objectives. This is done by comparing each pair of objectives and ranking them on a scale from 1 to 10. Comparing objective i and objective j gives a value a_ij, as in the sketch below.
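• A brief sketch of the pair-wise comparison step. The 1-10 comparison scale and the four objectives follow the text above; the geometric-mean method for deriving priority weights is one standard AHP technique, assumed here rather than mandated by the specification, and the matrix values are illustrative:

```python
import numpy as np

# Pairwise comparison matrix for the four objectives named in the text:
# competition, selling, customer, product. Entry A[i, j] expresses how much
# more important objective i is than objective j; the matrix is reciprocal,
# so A[j, i] = 1 / A[i, j]. The values here are purely illustrative.
objectives = ["competition", "selling", "customer", "product"]
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 4.0, 1.0],
    [1/5, 1/4, 1.0, 1/2],
    [1/2, 1.0, 2.0, 1.0],
])

# Derive priority weights with the geometric-mean method: the normalised
# row geometric means approximate the principal eigenvector of the matrix.
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

for name, w in zip(objectives, weights):
    print(f"{name:12s} {w:.3f}")
```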
• The purpose of using the fuzzy selling behaviour model is to determine the primary selling behaviour category of a sales candidate.
• The primary selling behaviour category according to the model accounts for the selling behaviour of a sales candidate for the majority of the time in their interactions with customers.
• Seventeen areas are used to design questions for evaluating the primary selling behaviour category of the sales candidate.
• At least four (4) questions, one corresponding to each quadrant (DH, SH, SW and DW) of the fuzzy selling behaviour model, are designed for each area.
• The questions are designed to contradict each other in order to elicit a pattern of commitment in the candidate's answers. It is envisaged that this pattern of commitment will be skewed towards the candidate's primary behaviour category.
• Attributes related to each of these areas with respect to the different behavioural categories have been determined.
• The attributes of each of these areas have been designed in the form of questions.
• The other parameters that have been kept in view while designing the questions are: a) the tone of the various questions; b) the length of each question; c) the total number of questions; d) the ordering of the five answer options; and e) the pattern of questions.
• The negative tone of the questions is neutralised as much as possible without losing the actual meaning of the questions. This has been done by underplaying the negative tone of such questions and by introducing suitable justifications in the question itself.
• The questions are designed with five answer options to provide for quick answers.
• The five answer options are shown in Table 4:
• The answering option sequence shown in Table 4 has provided the best results.
• "Yes" and "No" are the first two options, to capture the snap, immediate instinct of the candidate after reading the question. Additionally, this results in higher commitment by the candidate towards a certain answering pattern or behavioural category.
• The weighting of the "Not Sure" option has been kept low because it does not contribute meaningfully towards a behavioural category or commitment towards an answering pattern.
• A sample set of four questions related to the area of competition is shown in Table 5; each question is related to one of the four behavioural categories.
• A total of 76 questions have been designed for the salesperson recruitment survey.
• The survey is carefully constructed in order to make an assessment of the cognitive responses of a candidate to determine their primary behaviour category. However, each individual answer in the survey can also be reviewed and data presented indicating the progressive responses of the candidate, for example as shown in Figure 16. As the candidate answers the survey, their emotional reactions can be monitored to record and track the emotional state changes associated with each question in the survey.
• The survey may also be constructed to enable it to be modified within set limits dependent on the analysis model, such as changing the order of the questions or having interchangeable or replacement questions, to enable the same model to be used for analysis of the acquired data regardless of the version of the survey applied.
• The display of the results may also be structured to accommodate any variation in the survey, for example aggregation of results for various categories rather than presenting responses to individual questions where the questions or question order change.
• The monitoring and display of emotional state changes can be performed in real time, as the survey is being answered, or the data, such as visual images, recorded for later analysis and correlation.
• The recorded data can be time stamped for later synchronisation with the timing of the cognitive answers.
• The recorded data can include both the data indicative of the subject's emotional response recorded by the monitor and the actual cognitive responses, either simultaneously recorded by the monitor or added to the recorded data by the processor.
• Data can be presented correlating a candidate's cognitive response with the emotional state change experienced by the subject while answering the question, for example as shown in Figures 16 and 17 for individual questions, and in Figure 18, which shows the emotional and cognitive responses represented along one axis.
• This data can simply be presented as an output for analysis by a person, for example to highlight inconsistencies or unexpected emotional responses compared with cognitive responses. In Figure 18, the candidate experiences a high intensity negative emotional state change while answering a question positively for the Dominant-Warm category; this may indicate the candidate is not comfortable with the cognitive answer given.
• The number and duration of a person's emotional state changes while answering a single question can also be indicative of confusion regarding the question or lack of commitment to the answer given.
• Some embodiments of the system can perform further analysis on the correlated cognitive response data and emotional state change data, for example qualifying the cognitive response based on the associated emotional state change.
• A series of cognitive and emotional responses can be analysed to determine trends in emotional state changes in comparison with cognitive responses. Again, these emotional state change trends may be used to qualify cognitive responses.
• Additional fuzzy rules can be applied to compare a number of cognitive and emotional responses, such as comparing the emotional state change responses for all the questions with a positive cognitive answer in a particular category. Where there are consistent positive emotional responses as well as positive cognitive responses, this can be an indicator of a strong preference for the particular behavioural style, whereas inconsistent emotional responses may indicate the candidate does not have a strong preference for this style. A sketch of this kind of rule is given below.
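• A sketch of this kind of rule, with made-up encodings: each record pairs the cognitive answer to a question in one behaviour category with the signed intensity of the emotional state change recorded while answering.

```python
# Hypothetical records for the Dominant-Warm category questions; emotion
# is a signed intensity in [-1, 1] derived from the state change data.
dw_answers = [
    {"cognitive": "yes", "emotion": 0.8},
    {"cognitive": "yes", "emotion": 0.6},
    {"cognitive": "yes", "emotion": -0.7},  # emotionally inconsistent answer
    {"cognitive": "yes", "emotion": 0.5},
]

def preference_strength(answers):
    """Qualify positive cognitive answers by the consistency of the
    accompanying emotional state changes."""
    positives = [a["emotion"] for a in answers if a["cognitive"] == "yes"]
    if not positives:
        return "no preference indicated"
    consistent = sum(1 for e in positives if e > 0) / len(positives)
    if consistent >= 0.75:
        return "strong preference"
    if consistent >= 0.5:
        return "weak preference (mixed emotional responses)"
    return "cognitive answers not supported emotionally"

print(preference_strength(dw_answers))  # strong preference (3 of 4 consistent)
```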
• The pruning of contradictory and superfluous answers is a method to establish the primary behaviour category.
• The example described below relates to the analysis and pruning of cognitive answers only to determine the overall selling behavioural style; however, it is envisaged that embodiments also include analysis of emotional state changes.
• Emotional state changes can be used as supplementary information when analysing contradictory responses, or to identify contradictory answers.
• Emotional state changes could also be used to gather insights as to the preferences of candidates who appear to be in transition between two styles.
• The general guideline is firstly to determine the highest aggregated raw (unpruned) score over all 17 areas of evaluation in one of the four behaviour quadrants (DH, SH, SW and DW).
• The aggregated score in a particular behaviour quadrant helps to establish the personal need level of the sales candidate. For example, based on Abraham Maslow's model, the highest aggregated/cumulative score in the DH quadrant will correspond to independence and control needs.
• The emotional state change data may also be taken into consideration for this assessment, as the fulfilment or denial of needs is often associated with emotional reactions.
• The answers in a particular area which reflect a higher need level than the need level corresponding to the behavioural quadrant (for example, DH) with the highest aggregated score are considered superfluous answers.
• These superfluous answers reflect what a candidate would like to be as against what they actually are.
• The emotional state changes associated with these answers can be a secondary indicator of their superfluous or aspirational nature.
• Based on the fuzzy selling behaviour model, a candidate may provide masked answers in a particular area which contradict the answers corresponding to the initial predominant category based on the highest aggregated raw (unpruned) score.
• The experience of domain experts in practice is also used to determine masked or contradictory answers, and again emotional state change data may be applied to identify contradictions.
• The removal of the contradictory and superfluous masked answers results in re-adjustment of the scores across all four quadrants.
• The end product of this pruning exercise is a set of pruned scores in the four selling behaviour quadrants DH, SH, SW and DW.
• The pruned scores in the four behaviour categories, namely DH, SH, SW and DW, are used to compute the primary fuzzy selling behaviour category.
• The fuzzy selling behaviour model is employed for this computation.
• The primary fuzzy selling behaviour category indicates that for the majority of the time, the sales candidate's selling behaviour will be determined by the attributes related to the overall selling behaviour category.
• The pruned scores in the DH, SH, SW and DW primary behaviour categories are represented on the diagonal of each quadrant of the fuzzy selling behaviour model.
• The computation method involves projection of the pruned scores (DH, SH, SW and DW categories) from the diagonal of each quadrant onto the two dimensions of the fuzzy selling behaviour model (either the Hostile-Warm axis or the Dominant-Submissive axis).
• The projected score values are added or subtracted depending on their directional offset from the origin (0,0) of the dimension axis. For example, if one projected score (with reference to the origin) lies on the hostile end of the Hostile-Warm axis and the other lies on the warm end, the two scores are subtracted from each other. If, however, the two scores both lie on the hostile or warm end with reference to the origin, then they are added together. It may be noted that, through projection of the pruned scores, the two dimensions of the model are able to determine the intensity (Low, Medium or High) of the sales candidate's resultant score in the selling behaviour category (SH, SW, DH and DW).
• Step 1: Project the four pruned scores along the Hostile-Warm axis.
• Step 2: Project the four pruned scores along the Dominant-Submissive axis.
• Step 3: Use the Pythagorean Theorem to compute the resultant score, r = √(x² + y²), where x and y are the net projected scores along the Hostile-Warm and Dominant-Submissive axes respectively.
• Step 4: Compute the percentage value of the resultant score.
• Step 5: Determine the fuzzy behaviour category according to the fuzzy membership function and fuzzy categorisation rule.
• Step 6: Determine the fuzzy membership of the sales candidate in the other categories using the fuzzy categorisation rule. A worked sketch of these steps is given below.
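• A worked sketch of Steps 1-6, assuming the usual geometry for the projections (a score on a quadrant diagonal contributes score/√2 to each axis, signed by quadrant); the maximum score used for the percentage step and the fuzzy intensity cut-offs are illustrative assumptions, not values from the specification:

```python
import math

# Pruned scores on the diagonal of each quadrant (illustrative values).
scores = {"DH": 30.0, "SH": 10.0, "SW": 15.0, "DW": 45.0}
MAX_SCORE = 76.0  # assumed maximum attainable score, for the percentage step

# Steps 1-2: project the diagonal scores onto the two axes. A score on a
# 45-degree diagonal contributes score/sqrt(2) to each axis; the sign
# follows the quadrant (Warm and Dominant positive, Hostile and Submissive
# negative), so opposing ends subtract and matching ends add.
c = 1 / math.sqrt(2)
warm_hostile = c * (scores["SW"] + scores["DW"] - scores["SH"] - scores["DH"])
dom_sub      = c * (scores["DH"] + scores["DW"] - scores["SH"] - scores["SW"])

# Step 3: Pythagorean resultant, r = sqrt(x^2 + y^2).
resultant = math.hypot(warm_hostile, dom_sub)

# Step 4: percentage value of the resultant score.
pct = 100 * resultant / MAX_SCORE

# Steps 5-6: behaviour category from the axis signs, intensity from an
# assumed fuzzy membership over the percentage (cut-offs illustrative).
category = ("D" if dom_sub >= 0 else "S") + ("W" if warm_hostile >= 0 else "H")
intensity = "Low" if pct < 33 else ("Medium" if pct < 66 else "High")

print(category, intensity, f"{pct:.1f}%")  # e.g. DW Medium 50.1%
```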
  • Analysis of correlated cognitive response data and emotional state change data may also include comparison of data between individual subjects, among a group of subjects, or between a subject and model response data and emotional data.
• Benchmarking is an important component of recruitment. "Benchmarking" is the process of comparing and measuring an organisation, system, process and/or product against recognised leaders anywhere in the world, to gain information that will help the organisation take action to improve its performance. In simple words, benchmarking is a standard of performance. In a recruitment context, benchmarking helps the sales/HR managers to compare the behaviour profiles of a prospective sales candidate with the existing most successful salespersons (benchmarks) in their organisation. The benchmarking profile of the successful salesperson in a given organisation can be considered a cultural or person fit profile for comparison. It may be noted that the behaviour profile is constructed from the pruned scores.
• Figure 19a and Figure 19b illustrate the comparison between the selling behaviour profile of one candidate 1910 and the selling behaviour profile of a benchmark salesperson 1930, and between the selling behaviour profile of another candidate 1920 and the benchmark salesperson 1930 in an organisation, respectively.
• The relationships are assessed by analysing the shapes of the behaviour profiles.
• Profiles 1910 and 1930 are parallel and in almost perfect correlation despite the differences in basal expression level and scale. This strong correlation implies that the selling behaviour profiles of the two individuals have a similar shape and similar scores in most selling behaviour categories.
• The parallel profiles indicate a behavioural and cultural fit. The degree of closeness of the parallel profiles indicates a tight or loose coupling of behavioural and cultural fit; a sketch of this shape comparison is given below.
• Profiles 1920 and 1930, which intersect or cross each other, represent significant variations in the selling behaviour of the two subjects being compared.
• Selling behaviour profiles which intersect with the benchmark profile, as shown in Figure 19b, can be used as a way of building a varied mix of salespersons in an organisation. This mix can be used to cater for changing business needs and culture over time.
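• A minimal sketch of one way to quantify the parallel-versus-intersecting distinction, using Pearson correlation as a shape-similarity measure over the four pruned category scores; the profile values are illustrative:

```python
import numpy as np

# Behaviour profiles built from pruned scores across the four categories
# (DH, SH, SW, DW); values are illustrative.
benchmark  = np.array([20.0, 10.0, 35.0, 55.0])
candidate1 = np.array([15.0,  6.0, 28.0, 46.0])   # roughly parallel shape
candidate2 = np.array([50.0, 30.0, 20.0, 12.0])   # crosses the benchmark

def shape_similarity(candidate, benchmark):
    """Pearson correlation of the two profiles: near +1 means parallel
    profiles (behavioural/cultural fit) despite differences in level or
    scale; low or negative values indicate intersecting profiles."""
    return np.corrcoef(candidate, benchmark)[0, 1]

for name, cand in [("candidate 1", candidate1), ("candidate 2", candidate2)]:
    print(name, f"correlation with benchmark: {shape_similarity(cand, benchmark):+.2f}")
```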
  • Embodiments of the present invention enable emotional state change data to be incorporated into the benchmark data.
• The method and system for correlating emotional state changes with a subject's response to stimulus can further be used for modelling correlated cognitive and emotional responses.
• A model may be developed based on a typical or desired individual subject, for example a model employee for a particular role.
• A model can be developed by selecting a model subject and applying a predetermined sequence of stimuli to the subject to elicit a series of cognitive responses from the subject.
• The subject is monitored and their cognitive response and data indicative of their emotional response recorded for each stimulus.
• The data can then be analysed to provide emotional state change data for the subject during the time each stimulus was applied, and the emotional state change data correlated with the cognitive responses.
• The top salesperson in an organisation may be desirable to use as a model. This person can complete the candidate survey to enable their cognitive responses and associated emotional state changes to be recorded; this then becomes the model or benchmark to be used for comparison with recruitment candidates.
• Modelling correlated cognitive and emotional responses can also be based on a number of subjects, using the method above applied to each subject, such as a group of salespeople, and performing further modelling analysis.
• The further modelling analysis may vary depending on the number of subjects.
• A model may be developed based on statistical norms within the sample.
• The above steps can be applied to each subject in the sample and statistical analysis performed on the data acquired from all the subjects. For example, each subject completes a sales recruitment survey and the dominant primary behavioural style is determined, along with the norm for the cognitive and emotional state change for each question, to provide the benchmark data.
• Subjects may be classified into groups based on particular trends or similarities in individual subjects' cognitive data and/or emotional response data, and the model based on particular trends within a group. For example, a group of subjects all answer the sales recruitment survey and both cognitive and emotional responses are recorded. The recorded data can be analysed to select a group of subjects who all showed similar cognitive responses.
• The emotional responses of this group of subjects can then be analysed and a model developed based on the combined cognitive and emotional responses for the group.
• The analysis performed to develop such a model can vary depending on the size of the group and the context.
• The sequence of stimuli may be a survey relating to job behaviour and attitudes, which may produce very distinct trends in both cognitive and emotional responses; these may be converted to a model by averaging the responses across the group, as in the sketch below.
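• A minimal sketch of this averaging approach, with hypothetical answer and intensity encodings:

```python
import numpy as np

# Hypothetical recorded data: rows are subjects in the model group, columns
# are survey questions. One matrix of cognitive answers (yes = 1, no = 0)
# and one of emotional state change intensities (-1..+1) per question.
cognitive = np.array([[1, 1, 0, 1],
                      [1, 0, 0, 1],
                      [1, 1, 0, 1]])
emotional = np.array([[0.7,  0.4, -0.5, 0.6],
                      [0.6, -0.1, -0.4, 0.8],
                      [0.9,  0.3, -0.6, 0.7]])

# The benchmark model is the per-question average response across the
# group, as suggested above for distinct, consistent trends.
model = {
    "cognitive_norm": cognitive.mean(axis=0),
    "emotional_norm": emotional.mean(axis=0),
}
print(model)
```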
• Models can be developed based on actual monitored emotional responses of individual subjects, correlated with cognitive responses of one or more individuals to a selected sequence of stimuli.
• The sequence of stimuli can be selected based on the cognitive responses of interest or relevance for a particular context, and by recording the actual cognitive and emotional responses of selected subjects the emotional state change data for the model is acquired.
• This enables models to be easily developed for any context without necessarily requiring detailed psychological analysis of each context. For example, it is not necessary for every sales department to be reviewed and analysed using psychological profiling to develop a model; the model can be developed simply by applying the same survey to a sample of "model" salespeople from each team, and thus a model specific to each team is developed automatically using the system.
• Embodiments of the model development are applicable in a variety of contexts and with varied analysis applicable for each context; however, all the applicable alternatives are considered within the scope of the present invention.
• The developed model can be used for benchmarking both cognitive and emotional responses.
• The cognitive and emotional responses of the subject and the benchmark can be represented on the same axes for visual comparison.
• Some examples of systems utilising the above modelling and comparison are shown in Figures 21 to 24.
• The example of Figure 21 is a sales recruitment system 2100.
• The candidate's cognitive text inputs 2110, in response to questions, are input to an intelligent selling behaviour evaluation component 2120 (for example as described above). The selling behaviour profile 2135 is developed by the selling behaviour profiling component 2130; this selling behaviour can then be benchmarked against a model by the selling behaviour benchmarking and comparison component 2140.
• A video stream 2160 of the candidate, recorded while they answered the questions, is also input.
• The facial image extraction component 2170 extracts the relevant facial images to analyse to determine emotional state changes for each answer. These images are analysed and processed (as described above) by the facial image processing component 2180, and the emotional state change data is extracted and correlated with the cognitive answers by the emotional state extraction and profiling component 2190. This emotional behavioural profile 2155 can then be benchmarked against the model by the emotional profile benchmarking component 2145.
• The results are presented by the emotional selling behaviour profile visualisation component 2150 for ease of interpretation by the recruiter, for example comparison graphs similar to Figure 20c showing how the profile compares to the benchmark of a good salesperson 2157, and reports showing the sales candidate's selling behaviour category and scores 2125. Further reports identifying discrepancies, either between the emotional and cognitive responses of the candidate or between the candidate and the model, can also be produced.
• A system 2200 for automatically providing feedback in an education context is shown in Figure 22.
• A video of a lecture 2210 and a video of one or more students 2220 are input to the system 2200.
• The lecture video stream is broken down into sections based on topics or blocks of time by the lecture breakdown component 2215, and student facial images are extracted from the student video stream 2220 by the student group facial image extraction component 2230 for each lecture segment.
• The group facial image processing component 2240 analyses the student facial images to determine emotional state changes for the lecture segment, which are then analysed by the intelligent emotional state extraction and profiling component 2250.
• The emotional state changes may be from one or more students from the group, and are analysed to determine trends, such as the level of engagement over the course of the lecture based on the intensity of emotional responses from one or more of the students. Alternatively, differences in emotional reaction between students may be identified for highlighting to the lecturer, as this may indicate some students are confused or there are mixed reactions to a topic; for example, the topic may need to be revisited for clarification or interactive student debate.
• The trends in the group engagement levels or profiles can be related to particular topics by synchronisation with the lecture segments by the synchronisation component 2260.
• The student emotional profile visualisation component 2270 can present the emotional responses for interpretation by the lecturer. For example, where the lecture is being given live, the lecturer may be provided with a real time lecture adaptation decision aid 2290, such as a monitor showing the average engagement level of the students in real time, thus enabling the lecturer to either delve further into a topic if the students are showing a high level of engagement, or take action such as changing topic or lecturing style where students are showing a lack of, or reducing, engagement with a topic.
• A lecture may be pre-recorded, say for online delivery over the internet, in several sections, or may include supplementary learning tools such as games or quizzes associated with the lecture topics.
• A falling level of engagement from an online student at certain points in the lecture could trigger a change in topic or a break to play a game or do a quiz.
• Such decisions could be made by a real-time pro-active lecture delivery adaptation and quality assurance component 2280.
• The system can also output information for quality assurance 2295 purposes, to be reviewed by the lecturer or colleagues, or compared with similar emotional state profiling information from other lectures on similar topics to determine which lecturing or learning style is most effective for different groups of students. A sketch of the segment-level engagement logic is given below.
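• A minimal sketch of the per-segment engagement logic described above; the intensity scores, segment names and adaptation threshold are illustrative assumptions:

```python
# Hypothetical engagement readings: per-segment emotional intensity scores
# (0..1) for each monitored student, derived from facial image analysis.
segments = {
    "intro":       [0.8, 0.7, 0.9],
    "core_topic":  [0.6, 0.5, 0.7],
    "derivations": [0.3, 0.2, 0.4],
}

THRESHOLD = 0.5  # assumed cut-off below which adaptation is suggested

for name, levels in segments.items():
    avg = sum(levels) / len(levels)  # average engagement across the group
    action = "continue" if avg >= THRESHOLD else "adapt (change topic/style, quiz)"
    print(f"{name:12s} avg engagement {avg:.2f} -> {action}")
```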
• A system 2300 for monitoring critical event operators is illustrated in Figure 23.
• This system may be used for monitoring a person in a live situation or in a simulator.
• The critical event operator may be a fire-fighting team leader, an emergency response coordinator in a power station or factory, a commander of a battleship or platoon, a fighter pilot, a paramedic, a racing car driver, etc.
• This system could be applied in any area where a person is required to make cognitive decisions in response to ongoing events in an important and fluid situation.
• The operator's cognitive responses 2310 and a video stream of the operator during the situation 2320 are input to the system.
• Facial images associated with cognitive responses are extracted and analysed, to determine the emotional state changes experienced by the operator while making cognitive decisions, by the facial image extraction component 2330, facial image processing component 2335, and emotional state extraction component 2340.
• The system is adapted to analyse the cognitive behaviour 2315 and compare this with a benchmark using the cognitive behaviour component 2360, and also to benchmark the operator's emotional profile 2350, which can be presented for comparison 2370.
• The profile can be monitored for signs of stress and to determine where stress, particularly emotional stress, affects the cognitive behaviour of the operator.
• The stimuli can include a structured simulation, such as a game, event, emergency flight or battlefield simulation, or a real life event.
• The simulation may be designed to test particular aspects of people's abilities and resilience, and such simulations or sets of stimuli may be modified based on the requirements of a situation. It is envisaged that the stimuli may be modified by replacement of the entire sequence; for example, it may be necessary to change the simulation if its content becomes known, either through repeated use or breach of security. Alternatively, parts of the sequence may be modified in real time, during the execution of the simulation, based on feedback from the subject, for example if it appears too easy or in response to a particular cognitive or emotional reaction, or to avoid familiarity, for example by randomly changing the order of stimuli so that what is coming next cannot be predicted.
• The system 2400 illustrated in Figure 24 is used to monitor a person's browsing behaviour and emotional state changes when planning a holiday using the Internet.
• The person's browsing behaviour on the e-tourism web site 2450 is tracked, for example which pages are being viewed or which thumbnail pictures are expanded for better viewing.
• The person's eye movement or gaze is also tracked to monitor what they are viewing, for example how long a picture is viewed or how often the gaze returns to the picture.
• The facial expression component 2420 of the system 2400 uses a device such as a webcam to acquire a video image stream 2415 to capture the person's facial expressions, and analyses the visual images to monitor, using the above techniques, the person's nonverbal emotional responses to what is being viewed.
• This data can be used by the e-tourism web site 2450 to tailor the information presented to the person, for example by giving priority to or highlighting destinations, accommodation, tours etc. similar to those to which the person has shown a positive emotional reaction.
• The emotional reactions of people can also be monitored in order to provide feedback to service suppliers. For example, where a great number of people have a positive reaction when looking at a picture of an accommodation venue and then an extreme negative reaction (compared to the reaction for other accommodation venues of similar standard) when viewing the price, this may indicate that the accommodation is overpriced.
• Embodiments may also be applied in applications where interaction occurs between people, one being the subject and the other being a person providing the stimuli.
• The subject can be a patient and the third party providing the stimuli a doctor.
• The patient can be monitored to determine emotional state changes, for example reactions to questions, pain, treatment or medication.
• The stimuli may be applied externally, for example by the doctor asking a question or administering a drug.
• The stimuli affecting the patient's emotional state may be applied by the patient themselves, for example pain, pain caused by attempted movement, frustration caused by an inability to move, or distress caused by environmental factors such as light, noise or heat.
• Embodiments of the present invention can be of value when accumulating data, such as monitoring browsing behaviour or acquiring survey data, as the emotional state changes associated with cognitive responses add a further dimension to the data accumulated and thus enable more detailed and subtle analysis of the cognitive results, by qualifying those results based on the emotional reactions associated therewith.
  • Research in data mining has historically been driven by design and refinement of data and data/web mining algorithms.
• The data mining algorithm driven approach has primarily focused on predictive accuracy (given a set of training data) and other technology-driven outcomes.
• The embodiment described herein is directed towards a more context-centred and utility-centred approach compared to technology-centred data mining approaches.
• Data mining has its hurdles: the 'meanings' are not suggested by the data or the computers; they are imposed on the data by human beings. This problem is further exacerbated by the fact that data mining technologies are largely designed based on technology-push models as against strategy-pull models driven by business managers.
• In a strategy-pull model, business managers make sense of a new situation by constructing meaning or knowledge based on their cognitive constructs and adapting these cognitive constructs to the dynamics of the business situation.
• The managers may honor as well as reject pre-specified meanings and outcomes mined using historical data (as is the case in a technology-push model).
• The cognitive constructs also help to establish the semantic context in which data mining systems are used and interpreted.
• Embodiments of the present invention enable emotional state change data to be automatically gathered and correlated with cognitive data for analysis, such that intelligent technologies can account for emotional factors while delivering knowledge-driven business outcomes to organisations.
• The context model applied in this embodiment is divided into three main categories: i) Social Context: this describes the varied social units that structure work and information, organisations and teams, communities and their distinctive social processes and practices;
• ii) Semantic Context: this describes the individual interpretation of a situation based upon an existing system of cognitive frameworks and constructs, goals and tasks; it represents the personal meaning or sense ascribed to information related to a certain task or situation. It is also called sensemaking. This definition can also be extended to group interpretation with some provisos; and
• iii) Pragmatic Context: the process of translating the personal interpretation or meaning into a specific behaviour or action, moderated by the interaction of an individual's rational characteristics with their affective (emotional) characteristics. This also includes the need for adaptation and interpretation of meaning in terms of the dynamic and evolving environment surrounding business situations and the spatio-temporal context (location and time) as applicable.
• Semantic context describes the individual interpretation of a situation based upon existing (or learnt) cognitive models, goals and tasks related to the situation; it represents the personal meaning or sense ascribed to information related to a certain task or situation. This description is theoretically underpinned in the areas of sensemaking and naturalistic decision making, which, as the name suggests, is about constructing (or interpreting) meaning or making sense of a given situation. The process of making sense involves an interplay of action and interpretation rather than the influence of evaluation on choice.
  • Knowledge acts as an interpretant to turn data into information.
• The new information causes some level of dissonance, prompting the question "What's the story here?".
• In resolving this dissonance we create knowledge. Knowledge is created through a sensemaking process.
• The chain connecting the abstract with the personal is also called a pattern or a schema.
• Intelligent sensemaking involves identifying and retrieving a set of patterns or schemas and adapting those patterns or schemas to a given situation.
• The underlying assumption here is that ignorance and knowledge coexist, which means that adaptive sensemaking both honors and rejects the past.
• Nurses (and physicians), like everyone else, make sense by acting thinkingly, which means they simultaneously interpret their knowledge with trusted frameworks or cognitive structures, yet mistrust those very same frameworks by testing new frameworks and new interpretations.
• In all work, people face evolving disorder. Progressive changes through time in work stipulate that a seemingly correct action "back then" is becoming an incorrect action now.
• Context-aware feedback may be both situational and affective.
• Sensemaking and intelligent data mining technologies are integrated in this embodiment to provide adaptive context-aware data mining systems.
• Semantic and pragmatic context issues can also be modelled in the context-aware data mining architecture.
• The context-awareness in particular is captured at three levels, namely cognitive, affective and situational.
  • An embodiment of the present invention provides a system for applying a sensemaking model for automated decision making or decision making support.
• The framework of the system is illustrated in Figure 25.
• The system comprises a controller, in which is defined a decision making model for executing a plurality of decision making phases and applying rules in accordance with the model, and a plurality of agents each adapted to perform a function for use in the decision making process.
• The controller and agents are implemented in a processor.
• The controller is adapted to request that actions be performed by the plurality of agents in accordance with the decision making model.
• At least one of the agents is adapted to provide emotional state change data associated with a situation for the decision making process.
• The controller and agents are arranged in a seven layer architecture.
• The layers are:
• Reactive layer - including agents for raw data acquisition and basic data manipulation;
• Intelligent technology layer - including agents for data mining and identification of patterns in the acquired data;
• Cognitive sensemaking layer - for coordinating the use of agents associated with the other layers and interpretation of data in accordance with a decision making context;
• Affective sensemaking layer - including agents for the acquisition and analysis of emotional state change data;
• Situation adaptation layer - including agents for monitoring decision making results and reactions thereto, and agents for analysing the reactions in the context of the situation for feedback into the decision making context and adaptation of the decision making model;
• Distribution coordination layer - including agents for coordinating communication between user and agents;
• Object layer - including domain agents for use by the agents of the other layers to facilitate data processing and presentation for the user. A minimal skeleton of this layered arrangement is sketched below.
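• A minimal skeleton of the layered agent arrangement (hypothetical class names; only three of the seven layers are shown, and the placeholder act() bodies stand in for the real processing):

```python
from abc import ABC, abstractmethod

class Agent(ABC):
    @abstractmethod
    def act(self, data: dict) -> dict:
        ...

class DataAcquisitionAgent(Agent):   # reactive layer
    def act(self, data):
        return {**data, "raw": "acquired"}

class PatternMiningAgent(Agent):     # intelligent technology layer
    def act(self, data):
        return {**data, "patterns": ["pattern-1"]}   # placeholder mining step

class AffectAgent(Agent):            # affective sensemaking layer
    def act(self, data):
        return {**data, "affect": "negative"}        # placeholder analysis

class Controller:
    """Stands in for the cognitive sensemaking layer: it coordinates the
    other layers' agents through the decision making phases in order."""
    def __init__(self, agents):
        self.agents = agents   # ordered by phase

    def run(self, data):
        for agent in self.agents:
            data = agent.act(data)
        return data

controller = Controller([DataAcquisitionAgent(), PatternMiningAgent(), AffectAgent()])
print(controller.run({"source": "survey video stream"}))
```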
• The cognitive sensemaking layer coordinates the activity of the various layer agents in accordance with a decision making model associated with a context.
• The controller controls the execution of a number of decision making phases.
• The first phase involves data acquisition 2521 and pre-processing 2522, such as manipulation of raw data and improving data quality, performed by the reactive layer.
• The manipulation of raw data can involve basic decision making 2523 in accordance with fixed rules, such as data conversion from one format to another, or data quality improvement such as filtering to remove errors.
• Action which does not involve learning may be taken by the reactive layer agents 2524, such as display of manipulated data.
• The reactive layer consists of agents which represent stimulus-response phenomena in a user defined environment.
• The agents in this layer may include data aggregation agents, data transformation agents and data visualisation agents, which may not need learning.
• The second phase is a context elicitation phase 2530 for analysis of the data to identify a situation with a problem to be solved, for example based on patterns in the data, and for determining the decision making model to be applied.
• This phase is coordinated by the cognitive sensemaking layer 2510.
• The context elicitation phase makes use of agents from the intelligent technology layer for determination of patterns in the data to identify the context for which a decision making model is to be applied.
• The intelligent technology layer contains data mining agents which involve learning to find patterns in data. This layer includes clustering, fusion (e.g., fuzzy-neuro), genetic algorithm (GA) and other agents. At the procedural level, a set of rules or patterns directs inference.
• This set may be large, but is always closed, i.e., it corresponds to pre-formed, pre-determined insights and learnt patterns of behaviour.
• This level is represented by the intelligent technology layer.
• The context elicitation phase elicits context in a given situation by defining a set of orthogonal task based contexts based on the data patterns and inference output from the intelligent technology layer agents, identifies the decision making context, and selects the appropriate model for this context, which defines the set of rules and tasks to be applied for the context.
• The decision making model can include tasks for execution based on the situation and rules for making decisions based on the data.
• The third phase is a situation interpretation labelling phase 2540 for identifying the required tasks and data associated with the problem to be solved according to a predetermined decision making model.
• This phase determines functional deployment labels within a context for the situation under study and defines the selection knowledge for navigating between functional deployment labels. Defining functional labels includes identifying categories within each orthogonal context relevant to a situation, and determining conflict resolution knowledge between the various functional deployment labels within each context.
• The fourth phase, controlled by the cognitive sensemaking layer 2510, is a situation action phase 2550 for executing the tasks associated with the identified problem based on the data.
• This situation action phase 2550 applies the data to define and model outcomes or action related instances of interest to the user based on the rules for the context. These outcomes and actions can be output to a user or other system.
• The fifth phase, controlled by the cognitive sensemaking layer 2510, is a situation adaptation phase 2560 for monitoring the results of task execution and the reactions from outside the system to those results, including emotional reactions. Agents from the affective sensemaking layer 2555 and the situation adaptation layer 2565 are utilised to monitor and analyse these results.
• The affective agents in the sensemaking (affective) layer 2555 model the affective characteristics (e.g., negative/positive emotional state) which are used for interpretation of the user's feedback and actions in a given situation. This then forms an integral part of the user action and affect profiling agent over time.
• The situation-adaptation layer 2565 consists of situation monitoring and situation adaptation agents, which monitor the result of the action of the system on the user/environment in a given situation (e.g., acceptance/rejection of a recommendation or prediction by the user/environment) and incorporate this feedback to adapt the actions of the situation-action phase agents.
• The goals of the situation adaptation phase are to adapt existing actions to a new situation, based on feedback on the actions from the user/environment and affective feedback from the user, and to construct and explore new situation-action pathways, for example by defining and modelling situation monitoring parameters, situation adaptation parameters, and the user's affective (emotion) parameters, and using these models to adapt the context model for future application.
• This situation adaptation phase enables the external reaction to decisions to be fed back into the system so the model can be adapted on an ongoing basis in the environment in which it is deployed. Models may also be adapted for future application based on the external reactions.
• A distribution and coordination layer 2570 is also provided, comprising agents which process data on behalf of agents in other layers, in real time and in a distributed manner, to meet the real-time and speed-up needs of applications.
• The coordination layer agents are used to coordinate communication between the user and the agents in the sensemaking (cognitive), optimisation and sensemaking (affective/emotion) layers of the architecture.
• The coordination layer agents are also used to coordinate communication between agents in the seven layers and to maintain a blackboard type global system state representation.
• An object layer 2580 is used to represent the ontology of the domain objects which are manipulated by the sensemaking (cognitive) layer agents defined by a user in a particular situation.
• The monitoring of reactions in the situation adaptation phase enables data relating to the external reaction to the results of the task execution to be fed back into the system for automatic adjustment or refinement of the decision making model based on reaction patterns.
• The decision making model may comprise a set of rules for manipulation of data and algorithms to apply to the data to achieve a result, such as a credit card approval or disapproval.
• The result (approved or disapproved) and some key data can be presented to a controlling person (underwriter/manager) for vetting or approval of the automatically generated decision.
• Where the controlling person overturns the automatic decision, the reasons for the overturning can be fed back into the system for analysis and, over time, the rules adjusted in accordance with trends or patterns distinguished from the reasons for overturning decisions.
• The data fed back in relation to overturning decisions can include emotional state change data reflecting the emotional state changes exhibited by the overruling person while making the decision to overrule the automatic decision. For example, based on the emotional state changes of the overruling person, it can be determined whether the overruling was a clear decision that the person was comfortable with (positive state change) or uncomfortable with (negative state change), or a decision they were unsure about or which required substantial deliberation (multiple state changes during the decision making process and possibly a long time to make the decision).
• This emotional state change data, in combination with the cognitive decision and the data input to both the automated decision and the overruling decision, can be analysed, and the emotional state change data used to weight the other cognitive data during analysis for rule adaptation, as in the sketch below.
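• A minimal sketch of using emotional state change data to weight overruled decisions before rule adaptation; the record fields and weighting scheme are illustrative assumptions:

```python
# Hypothetical overruling records: the overruler's cognitive decision plus
# a summary of their emotional state changes while making it.
overrules = [
    {"case": 101, "decision": "approve", "state_change": "positive", "n_changes": 1},
    {"case": 102, "decision": "approve", "state_change": "negative", "n_changes": 1},
    {"case": 103, "decision": "approve", "state_change": "mixed",    "n_changes": 5},
]

def adaptation_weight(record):
    """Weight an overruled decision for rule adaptation: a comfortable,
    clear overrule (positive state change, single change) counts most;
    an unsure one (many state changes) counts least."""
    base = {"positive": 1.0, "negative": 0.6, "mixed": 0.4}[record["state_change"]]
    return base / record["n_changes"]

for r in overrules:
    print(r["case"], f"adaptation weight = {adaptation_weight(r):.2f}")
```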
• This example illustrates: i) how a relationship manager in a financial institution engages in a sensemaking process using cognitive schema or constructs in a customer resource management (CRM) situation, and how their cognitive structure contextualises and leverages the use of intelligent data mining technologies; ii) that the schema adopted by the relationship manager is more consistent with a strategy-pull approach rather than a schema which may be perpetuated by objects and relationships defined by a technology-push model approach; and iii) how situation adaptation (in simple form) is modelled using the agents in the situation-adaptation layer and the intelligent technology layer of an embodiment of the present invention.
  • CRM customer resource management
• Figure 27 shows the application of the five sensemaking (cognitive) layer agents.
• The purpose of the sensemaking (cognitive) layer is to help relationship managers to model a CRM situation.
• Figure 27 represents the construction level of the three behaviour levels.
• The pre-processing, context elicitation, situation interpretation, situation-action and situation-adaptation agents assist a relationship manager to systematise and reduce dissonance in a CRM situation.
• The dashed line in Figure 27 represents the situation action pathway related to credit card approval.
• The shaded components in the situation construction structure represent where different layers (and their corresponding agents) have been leveraged by the sensemaking layer agents.
• The reactive layer agents are used by the CRM pre-processing phase agent.
• The situation adaptation layer agents are leveraged by the situation adaptation phase agents.
• The intelligent technology layer agents are leveraged by the situation-action phase agents and situation adaptation phase agents.
• The arrows in Figure 27 represent two-way communication between the sensemaking layer agents.
• The labels shown in Figure 27 are constructed by the relationship manager in a given CRM situation. These labels can be changed over time and can be constructed differently by different relationship managers in different CRM situations.
• The user action and affect profiling agent maintains a record of the situation-action pathways adopted by the user in a given CRM situation and the corresponding affective responses, if applicable (the next section illustrates this aspect).
• The affect response feature has applications in critical event situations and warfare, or wherever affective responses play an integral role in modelling situation-action pathways.
• Figure 28 shows a sample implementation of the situation-adaptation layer agents for credit card approval.
• The credit card approval process assesses the credit level of customers based on their past commitments to the financial institution, economic ability, and demographic information.
• The commitments to the financial institution can include the period the customer has been retained by the institution.
• The economic ability of the customer includes the job status of the customer (with or without a job), years working in the current company (if any), the amount of money frequently deposited with the bank, the monthly loan payment amount, etc.
• Demographic information includes gender, age, location etc., used to comprehensively judge the personal integrity of the customer. For example, some regions have much higher risks than others for credit card approval. In this case, customers living in a high risk area find it hard to get approval for a credit card from a bank/financial institution.
• A1-A15 are the variables; A16 in the table represents the credit card approval status (1 = yes, 0 = no).
• The data set in Table 4 is used at the procedural level for training a neural network (BP) prediction agent, as shown in Figure 28; a training sketch is given below.
  • BP neural network
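• A minimal training sketch. The Table 4 values are not reproduced here, so the data is randomly generated with the same shape (15 input variables A1-A15 and an approval flag A16); scikit-learn's MLPClassifier, a multilayer perceptron trained with backpropagation, stands in for the BP prediction agent of Figure 28:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Illustrative stand-in for the historical data set: 200 customers, 15
# variables (commitments, economic ability, demographics), approval flag.
rng = np.random.default_rng(0)
X = rng.random((200, 15))
# Toy labelling rule: approve when the "economic ability" block is strong.
y = (X[:, 3:8].mean(axis=1) > 0.5).astype(int)

# Backpropagation-trained multilayer perceptron as the prediction agent.
agent = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
agent.fit(X[:150], y[:150])          # train on historical records
print("hold-out accuracy:", agent.score(X[150:], y[150:]))
```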
• The situation adaptation agent is responsible for adapting the weight parameters of the NN credit card approval prediction agent. These parameters may be changed by the situation-adaptation agent to improve the performance of the NN prediction agent. Predictions produced by the prediction agent are, of course, based on the data from the database of historical data. The prediction results, in terms of their acceptance/rejection, can be assessed manually (by the manager) or by the situation monitoring agent (refer to Table 5 for its definition) shown in Figure 28 (once it has been trained on the manager's feedback over time).
• The neural network in the situation monitoring agent compares the system's prediction with the human user/manager approval to learn the approval behaviour of the human counterpart in a CRM situation. Initially, the feedback is provided by the manager, and the situation monitoring agent models the gap between the predicted variable and its acceptance/rejection by the relationship manager. Over time, with enough training/learning based on the manager's feedback, the situation monitoring agent's performance becomes comparable to the human agent, and it takes over most of the situation assessment jobs from its human counterpart.
• The GA based adaptation process optimises the weights of the neural network, adopting chromosome selection, crossover, and mutation, so as to improve the predictive behaviour of the NN credit card approval agent on an ongoing basis based on the feedback, as in the sketch below.
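• A minimal sketch of such GA-based weight adaptation, here over a single linear unit rather than the full network, with illustrative selection, one-point crossover and mutation operators:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the prediction agent's weights: a single linear unit over
# the 15 inputs; fitness is accuracy on illustrative feedback data.
X = rng.random((100, 15))
y = (X[:, 3:8].mean(axis=1) > 0.5).astype(int)

def fitness(w):
    pred = (X @ w[:-1] + w[-1] > 0).astype(int)   # last weight is the bias
    return (pred == y).mean()

pop = rng.normal(size=(30, 16))                    # chromosome = 16 weights
for gen in range(50):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]        # selection: keep the fittest
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, 15)
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        child += rng.normal(scale=0.1, size=16) * (rng.random(16) < 0.2)  # mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("best accuracy after adaptation:", fitness(best))
```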
• Figure 29 shows the performance comparison of the neural network back propagation (BP) agent before and after adaptation.
  • BP neural network back propagation

Abstract

A method and system are disclosed for monitoring emotional state changes in a subject based on facial expressions, by capturing first and then second facial image data of the subject at a first and then a second time, and processing the first and second facial image data to produce emotional state change data for the subject for the period between the capture of the first facial image data and the second facial image data. In some embodiments, a sequence of stimuli is applied to the subject and emotional state change data correlated with each stimulus is acquired. The method and system can be used for modelling the relationships between cognitive and emotional responses. The models and the analysis of the subject's emotional state changes can be employed in various contexts such as decision making in management, recruitment, assessment, teaching and data mining.
PCT/AU2007/001854 2006-12-01 2007-11-30 Procédé et système de surveillance des changements d'état émotionnel WO2008064431A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2007327315A AU2007327315B2 (en) 2006-12-01 2007-11-30 Method and system for monitoring emotional state changes

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2006906746 2006-12-01
AU2006906746A AU2006906746A0 (en) 2006-12-01 Method and system for monitoring emotional state changes

Publications (1)

Publication Number Publication Date
WO2008064431A1 true WO2008064431A1 (fr) 2008-06-05

Family

ID=39467366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2007/001854 WO2008064431A1 (fr) 2006-12-01 2007-11-30 Procédé et système de surveillance des changements d'état émotionnel

Country Status (2)

Country Link
AU (1) AU2007327315B2 (fr)
WO (1) WO2008064431A1 (fr)

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120083668A1 (en) * 2010-09-30 2012-04-05 Anantha Pradeep Systems and methods to modify a characteristic of a user device based on a neurological and/or physiological measurement
WO2012052880A3 (fr) * 2010-10-19 2012-06-21 Koninklijke Philips Electronics N.V. Anxiety monitoring
TWI402777B (zh) * 2009-08-04 2013-07-21 Sinew System Tech Co Ltd Management Method of Real Estate in Community Building
NL1039419C2 (nl) * 2012-02-28 2013-09-02 Allprofs Group B V Method for analysing a video recording
WO2014066871A1 (fr) * 2012-10-27 2014-05-01 Affectiva, Inc. Sporadic collection of transient affect data
CN104871531A (zh) * 2012-12-20 2015-08-26 皇家飞利浦有限公司 Monitoring a waiting area
WO2016049234A1 (fr) * 2014-09-23 2016-03-31 Icahn School Of Medicine At Mount Sinai Systems and methods for treating a psychiatric disorder
US9336535B2 (en) 2010-05-12 2016-05-10 The Nielsen Company (Us), Llc Neuro-response data synchronization
CN105559804A (zh) * 2015-12-23 2016-05-11 上海矽昌通信技术有限公司 Mood butler system based on multiple types of monitoring
WO2016077177A1 (fr) * 2014-11-10 2016-05-19 Intel Corporation Indication sociale basée sur une observation en contexte
US9454646B2 (en) 2010-04-19 2016-09-27 The Nielsen Company (Us), Llc Short imagery task (SIT) research method
WO2016189202A1 (fr) * 2015-05-26 2016-12-01 Seniortek Oy Monitoring system and method
US9521960B2 (en) 2007-10-31 2016-12-20 The Nielsen Company (Us), Llc Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
US9560984B2 (en) 2009-10-29 2017-02-07 The Nielsen Company (Us), Llc Analysis of controlled and automatic attention for introduction of stimulus material
US9569986B2 (en) 2012-02-27 2017-02-14 The Nielsen Company (Us), Llc System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
US9571877B2 (en) 2007-10-02 2017-02-14 The Nielsen Company (Us), Llc Systems and methods to determine media effectiveness
EP3155961A1 (fr) * 2015-10-14 2017-04-19 Panasonic Intellectual Property Corporation of America Procédé d'estimation d'émotion, appareil d'estimation d'émotion et support d'enregistrement de programme
CN107320090A (zh) * 2017-06-28 2017-11-07 广东数相智能科技有限公司 Sudden illness monitoring system and method
US20170364929A1 (en) * 2016-06-17 2017-12-21 Sanjiv Ferreira Method and system for identifying, aggregating & transforming emotional states of a user using a temporal phase topology framework
US9886981B2 (en) 2007-05-01 2018-02-06 The Nielsen Company (Us), Llc Neuro-feedback based stimulus compression device
US9936250B2 (en) 2015-05-19 2018-04-03 The Nielsen Company (Us), Llc Methods and apparatus to adjust content presented to an individual
EP3361467A1 (fr) * 2017-02-14 2018-08-15 Find Solution Artificial Intelligence Limited Apprentissage adaptatif et interactif et système de gestion d'apprentissage faisant appel au suivi du visage et la détection d'émotions et procédés associés
CN108596760A (zh) * 2018-05-14 2018-09-28 平安普惠企业管理有限公司 Loan risk assessment method and server
CN108632555A (zh) * 2017-03-16 2018-10-09 卡西欧计算机株式会社 Moving image processing apparatus, moving image processing method, and recording medium
US10127572B2 (en) 2007-08-28 2018-11-13 The Nielsen Company, (US), LLC Stimulus placement system using subject neuro-response measurements
US10140628B2 (en) 2007-08-29 2018-11-27 The Nielsen Company, (US), LLC Content based selection and meta tagging of advertisement breaks
WO2018218286A1 (fr) * 2017-05-29 2018-12-06 Saltor Pty Ltd Method and system for abnormality detection
US10216983B2 (en) 2016-12-06 2019-02-26 General Electric Company Techniques for assessing group level cognitive states
JP2019072371A (ja) 2017-10-18 2019-05-16 株式会社日立製作所 System and method for evaluating actions performed to achieve communication
CN109770918A (zh) * 2017-11-13 2019-05-21 株式会社何嘉 Emotion analysis apparatus and method, and machine-readable storage medium storing a program for the method
US20190179970A1 (en) * 2017-12-07 2019-06-13 International Business Machines Corporation Cognitive human interaction and behavior advisor
CN110210289A (zh) * 2019-04-19 2019-09-06 平安科技(深圳)有限公司 情绪识别方法、装置、计算机可读存储介质及电子设备
RU2700537C1 (ru) * 2019-02-04 2019-09-17 Общество с ограниченной ответственностью "КВАРТА ВК" Способ определения эмоционального состояния человека
KR20200005986A (ko) * 2018-07-09 2020-01-17 주식회사 두브레인 얼굴인식을 이용한 인지장애 진단 시스템 및 방법
WO2020039152A2 (fr) 2018-08-24 2020-02-27 Pls Experience Système multimédia comportant un équipement matériel d'interaction homme-machine et un ordinateur
US10580031B2 (en) 2007-05-16 2020-03-03 The Nielsen Company (Us), Llc Neuro-physiology and neuro-behavioral based stimulus targeting system
CN111183455A (zh) * 2017-08-29 2020-05-19 互曼人工智能科技(上海)有限公司 图像数据处理系统与方法
US10679241B2 (en) 2007-03-29 2020-06-09 The Nielsen Company (Us), Llc Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data
US10733625B2 (en) 2007-07-30 2020-08-04 The Nielsen Company (Us), Llc Neuro-response stimulus and stimulus attribute resonance estimator
CN112515674A (zh) * 2020-11-30 2021-03-19 重庆工程职业技术学院 心理危机预警系统
US10963895B2 (en) 2007-09-20 2021-03-30 Nielsen Consumer Llc Personalized content delivery using neuro-response priming data
US10987015B2 (en) 2009-08-24 2021-04-27 Nielsen Consumer Llc Dry electrodes for electroencephalography
CN113164119A (zh) * 2018-11-21 2021-07-23 法雷奥热系统公司 与机动车辆的乘员交互的系统
CN113239794A (zh) * 2021-05-11 2021-08-10 西北工业大学 一种面向在线学习的学习状态自动识别方法
CN113255530A (zh) * 2021-05-31 2021-08-13 合肥工业大学 基于注意力的多通道数据融合网络架构及数据处理方法
US11222199B2 (en) 2018-12-05 2022-01-11 International Business Machines Corporation Automatically suggesting behavioral adjustments during video conferences
WO2022067372A1 (fr) * 2020-09-29 2022-04-07 Human Centred Innovations Pty Ltd Robot social virtuel et physique à caractéristiques humanoïdes
CN114971658A (zh) * 2022-07-29 2022-08-30 四川安洵信息技术有限公司 一种反诈宣传方法、系统、电子设备以及存储介质
US11481788B2 (en) 2009-10-29 2022-10-25 Nielsen Consumer Llc Generating ratings predictions using neuro-response data
EP4099280A1 (fr) * 2021-06-04 2022-12-07 Tata Consultancy Services Limited Procédé et système de détection de niveau de confiance à partir des caractéristiques de l' il
CN115547501A (zh) * 2022-11-24 2022-12-30 国能大渡河大数据服务有限公司 一种结合工作特征的员工情绪感知方法及系统
US11704681B2 (en) 2009-03-24 2023-07-18 Nielsen Consumer Llc Neurological profiles for market matching and stimulus presentation

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10770072B2 (en) 2018-12-10 2020-09-08 International Business Machines Corporation Cognitive triggering of human interaction strategies to facilitate collaboration, productivity, and learning

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050089206A1 (en) * 2003-10-23 2005-04-28 Rice Robert R. Robust and low cost optical system for sensing stress, emotion and deception in human subjects
US20050289582A1 (en) * 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
JP2006069358A (ja) * 2004-09-01 2006-03-16 Fuji Heavy Ind Ltd Vehicle driving assistance device
EP1667049A2 (fr) * 2004-12-03 2006-06-07 Invacare International Sàrl Facial feature analysis system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BLACK ET AL.: "Recognizing Facial Expressions in Image Sequences Using Local Parameterized Models of Image Motion", INTERNATIONAL JOURNAL OF COMPUTER VISION, vol. 25, no. 1, 1997, pages 23 - 48, XP000723753, DOI: doi:10.1023/A:1007977618277 *
KHOSLA R. ET AL.: "Behaviour Profiling Based on Psychological Data and Emotional States", LECTURE NOTES IN COMPUTER SCIENCE, vol. 3215, 2004 *
TERZOPOULOS ET AL.: "Analysis and synthesis of facial image sequences using physical and anatomical models", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 15, no. 6, 1993, pages 569 - 579, XP000369961, DOI: doi:10.1109/34.216726 *

Cited By (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11250465B2 (en) 2007-03-29 2022-02-15 Nielsen Consumer Llc Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data
US11790393B2 (en) 2007-03-29 2023-10-17 Nielsen Consumer Llc Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data
US10679241B2 (en) 2007-03-29 2020-06-09 The Nielsen Company (Us), Llc Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data
US9886981B2 (en) 2007-05-01 2018-02-06 The Nielsen Company (Us), Llc Neuro-feedback based stimulus compression device
US10580031B2 (en) 2007-05-16 2020-03-03 The Nielsen Company (Us), Llc Neuro-physiology and neuro-behavioral based stimulus targeting system
US11049134B2 (en) 2007-05-16 2021-06-29 Nielsen Consumer Llc Neuro-physiology and neuro-behavioral based stimulus targeting system
US11763340B2 (en) 2007-07-30 2023-09-19 Nielsen Consumer Llc Neuro-response stimulus and stimulus attribute resonance estimator
US10733625B2 (en) 2007-07-30 2020-08-04 The Nielsen Company (Us), Llc Neuro-response stimulus and stimulus attribute resonance estimator
US11244345B2 (en) 2007-07-30 2022-02-08 Nielsen Consumer Llc Neuro-response stimulus and stimulus attribute resonance estimator
US11488198B2 (en) 2007-08-28 2022-11-01 Nielsen Consumer Llc Stimulus placement system using subject neuro-response measurements
US10937051B2 (en) 2007-08-28 2021-03-02 The Nielsen Company (Us), Llc Stimulus placement system using subject neuro-response measurements
US10127572B2 (en) 2007-08-28 2018-11-13 The Nielsen Company, (US), LLC Stimulus placement system using subject neuro-response measurements
US11023920B2 (en) 2007-08-29 2021-06-01 Nielsen Consumer Llc Content based selection and meta tagging of advertisement breaks
US10140628B2 (en) 2007-08-29 2018-11-27 The Nielsen Company, (US), LLC Content based selection and meta tagging of advertisement breaks
US11610223B2 (en) 2007-08-29 2023-03-21 Nielsen Consumer Llc Content based selection and meta tagging of advertisement breaks
US10963895B2 (en) 2007-09-20 2021-03-30 Nielsen Consumer Llc Personalized content delivery using neuro-response priming data
US9571877B2 (en) 2007-10-02 2017-02-14 The Nielsen Company (Us), Llc Systems and methods to determine media effectiveness
US9894399B2 (en) 2007-10-02 2018-02-13 The Nielsen Company (Us), Llc Systems and methods to determine media effectiveness
US9521960B2 (en) 2007-10-31 2016-12-20 The Nielsen Company (Us), Llc Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
US11250447B2 (en) 2007-10-31 2022-02-15 Nielsen Consumer Llc Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
US10580018B2 (en) 2007-10-31 2020-03-03 The Nielsen Company (Us), Llc Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
US11704681B2 (en) 2009-03-24 2023-07-18 Nielsen Consumer Llc Neurological profiles for market matching and stimulus presentation
TWI402777B (zh) * 2009-08-04 2013-07-21 Sinew System Tech Co Ltd Management Method of Real Estate in Community Building
US10987015B2 (en) 2009-08-24 2021-04-27 Nielsen Consumer Llc Dry electrodes for electroencephalography
US9560984B2 (en) 2009-10-29 2017-02-07 The Nielsen Company (Us), Llc Analysis of controlled and automatic attention for introduction of stimulus material
US10068248B2 (en) 2009-10-29 2018-09-04 The Nielsen Company (Us), Llc Analysis of controlled and automatic attention for introduction of stimulus material
US11669858B2 (en) 2009-10-29 2023-06-06 Nielsen Consumer Llc Analysis of controlled and automatic attention for introduction of stimulus material
US11170400B2 (en) 2009-10-29 2021-11-09 Nielsen Consumer Llc Analysis of controlled and automatic attention for introduction of stimulus material
US10269036B2 (en) 2009-10-29 2019-04-23 The Nielsen Company (Us), Llc Analysis of controlled and automatic attention for introduction of stimulus material
US11481788B2 (en) 2009-10-29 2022-10-25 Nielsen Consumer Llc Generating ratings predictions using neuro-response data
US11200964B2 (en) 2010-04-19 2021-12-14 Nielsen Consumer Llc Short imagery task (SIT) research method
US10248195B2 (en) 2010-04-19 2019-04-02 The Nielsen Company (Us), Llc. Short imagery task (SIT) research method
US9454646B2 (en) 2010-04-19 2016-09-27 The Nielsen Company (Us), Llc Short imagery task (SIT) research method
US9336535B2 (en) 2010-05-12 2016-05-10 The Nielsen Company (Us), Llc Neuro-response data synchronization
US20120083668A1 (en) * 2010-09-30 2012-04-05 Anantha Pradeep Systems and methods to modify a characteristic of a user device based on a neurological and/or physiological measurement
WO2012052880A3 (fr) * 2010-10-19 2012-06-21 Koninklijke Philips Electronics N.V. Anxiety monitoring
CN103167831A (zh) * 2010-10-19 2013-06-19 Koninklijke Philips Electronics N.V. Anxiety monitoring
US9569986B2 (en) 2012-02-27 2017-02-14 The Nielsen Company (Us), Llc System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
US10881348B2 (en) 2012-02-27 2021-01-05 The Nielsen Company (Us), Llc System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
NL1039419C2 (nl) * 2012-02-28 2013-09-02 Allprofs Group B V Method for the analysis of a video recording.
WO2014066871A1 (fr) * 2012-10-27 2014-05-01 Affectiva, Inc. Sporadic collection of transient affect data
CN104871531A (zh) * 2012-12-20 2015-08-26 Koninklijke Philips N.V. Monitoring a waiting area
US20150324634A1 (en) * 2012-12-20 2015-11-12 Koninklijke Philips N.V. Monitoring a waiting area
WO2016049234A1 (fr) * 2014-09-23 2016-03-31 Icahn School Of Medicine At Mount Sinai Systems and methods for treating a psychiatric disorder
US10898131B2 (en) 2014-09-23 2021-01-26 Icahn School Of Medicine At Mount Sinai Systems and methods for treating a psychiatric disorder
US10123737B2 (en) 2014-09-23 2018-11-13 Icahn School Of Medicine At Mount Sinai Systems and methods for treating a psychiatric disorder
US11903725B2 (en) 2014-09-23 2024-02-20 Icahn School of Medicine at Mount Sinai Systems and methods for treating a psychiatric disorder
WO2016077177A1 (fr) * 2014-11-10 2016-05-19 Intel Corporation Social indication based on in-context observation
US11290779B2 (en) 2015-05-19 2022-03-29 Nielsen Consumer Llc Methods and apparatus to adjust content presented to an individual
US9936250B2 (en) 2015-05-19 2018-04-03 The Nielsen Company (Us), Llc Methods and apparatus to adjust content presented to an individual
US10771844B2 (en) 2015-05-19 2020-09-08 The Nielsen Company (Us), Llc Methods and apparatus to adjust content presented to an individual
WO2016189202A1 (fr) * 2015-05-26 2016-12-01 Seniortek Oy Monitoring system and method
EP3155961A1 (fr) * 2015-10-14 2017-04-19 Panasonic Intellectual Property Corporation of America Emotion estimation method, emotion estimation apparatus, and program recording medium
CN105559804A (zh) * 2015-12-23 2016-05-11 上海矽昌通信技术有限公司 A mood butler system based on multiple types of monitoring
US20170364929A1 (en) * 2016-06-17 2017-12-21 Sanjiv Ferreira Method and system for identifying, aggregating & transforming emotional states of a user using a temporal phase topology framework
US10216983B2 (en) 2016-12-06 2019-02-26 General Electric Company Techniques for assessing group level cognitive states
EP3361467A1 (fr) * 2017-02-14 2018-08-15 Find Solution Artificial Intelligence Limited Adaptive and interactive learning and learning management system using facial tracking and emotion detection, and associated methods
CN108632555B (zh) * 2017-03-16 2021-01-26 Casio Computer Co., Ltd. Moving image processing device, moving image processing method, and recording medium
CN108632555A (zh) * 2017-03-16 2018-10-09 Casio Computer Co., Ltd. Moving image processing device, moving image processing method, and recording medium
WO2018218286A1 (fr) * 2017-05-29 2018-12-06 Saltor Pty Ltd Method and system for anomaly detection
CN107320090A (zh) * 2017-06-28 2017-11-07 广东数相智能科技有限公司 A sudden-illness monitoring system and method
CN111183455A (zh) * 2017-08-29 2020-05-19 互曼人工智能科技(上海)有限公司 Image data processing system and method
JP2019072371A (ja) * 2017-10-18 2019-05-16 Hitachi, Ltd. System and method for evaluating actions taken to facilitate communication
CN109770918A (zh) * 2017-11-13 2019-05-21 株式会社何嘉 Emotion analysis device and method, and machine-readable storage medium storing a program for the method
US20190179970A1 (en) * 2017-12-07 2019-06-13 International Business Machines Corporation Cognitive human interaction and behavior advisor
CN108596760A (zh) * 2018-05-14 2018-09-28 平安普惠企业管理有限公司 Loan risk assessment method and server
KR20200005986A (ko) * 2018-07-09 2020-01-17 주식회사 두브레인 System and method for diagnosing cognitive impairment using facial recognition
KR102166010B1 (ko) * 2018-07-09 2020-10-15 주식회사 두브레인 Method and system for determining cognitive impairment using facial recognition
WO2020039152A3 (fr) * 2018-08-24 2020-05-14 Pls Experience Multimedia system comprising human-machine interaction hardware and a computer
WO2020039152A2 (fr) 2018-08-24 2020-02-27 Pls Experience Multimedia system comprising human-machine interaction hardware and a computer
FR3085221A1 (fr) * 2018-08-24 2020-02-28 Pls Experience Multimedia system comprising human-machine interaction hardware and a computer
CN113164119A (zh) * 2018-11-21 2021-07-23 Valeo Systèmes Thermiques System for interacting with an occupant of a motor vehicle
US11222199B2 (en) 2018-12-05 2022-01-11 International Business Machines Corporation Automatically suggesting behavioral adjustments during video conferences
RU2700537C1 (ru) * 2019-02-04 2019-09-17 Limited Liability Company "КВАРТА ВК" Method for determining the emotional state of a person
CN110210289A (zh) * 2019-04-19 2019-09-06 平安科技(深圳)有限公司 Emotion recognition method and apparatus, computer-readable storage medium, and electronic device
WO2022067372A1 (fr) * 2020-09-29 2022-04-07 Human Centred Innovations Pty Ltd Virtual and physical social robot with humanoid features
CN112515674B (zh) * 2020-11-30 2023-07-07 Chongqing Vocational Institute of Engineering Psychological crisis early-warning system
CN112515674A (zh) * 2020-11-30 2021-03-19 Chongqing Vocational Institute of Engineering Psychological crisis early-warning system
CN113239794B (zh) * 2021-05-11 2023-05-23 Northwestern Polytechnical University An automatic learning-state recognition method for online learning
CN113239794A (zh) * 2021-05-11 2021-08-10 Northwestern Polytechnical University An automatic learning-state recognition method for online learning
CN113255530A (zh) * 2021-05-31 2021-08-13 Hefei University of Technology Attention-based multi-channel data fusion network architecture and data processing method
CN113255530B (zh) * 2021-05-31 2024-03-29 Hefei University of Technology Attention-based multi-channel data fusion network architecture and data processing method
EP4099280A1 (fr) * 2021-06-04 2022-12-07 Tata Consultancy Services Limited Method and system for detecting confidence level from eye characteristics
CN114971658A (zh) * 2022-07-29 2022-08-30 四川安洵信息技术有限公司 An anti-fraud awareness method and system, electronic device, and storage medium
CN115547501A (zh) * 2022-11-24 2022-12-30 国能大渡河大数据服务有限公司 An employee emotion perception method and system incorporating work characteristics

Also Published As

Publication number Publication date
AU2007327315A1 (en) 2008-06-05
AU2007327315B2 (en) 2013-07-04

Similar Documents

Publication Publication Date Title
AU2007327315B2 (en) Method and system for monitoring emotional state changes
Bernieri et al. Interactional synchrony and rapport: Measuring synchrony in displays devoid of sound and facial affect
Lisetti et al. MAUI: a multimodal affective user interface
Derrick et al. Design principles for special purpose, embodied, conversational intelligence with environmental sensors (SPECIES) agents
US20220392625A1 (en) Method and system for an interface to provide activity recommendations
Rizzo et al. Detection and computational analysis of psychological signals using a virtual human interviewing agent
Rasipuram et al. Automatic multimodal assessment of soft skills in social interactions: a review
Feng et al. Engagement evaluation for autism intervention by robots based on dynamic bayesian network and expert elicitation
Giannakos et al. Sensor-based analytics in education: Lessons learned from research in multimodal learning analytics
Khalid et al. Determinants of trust in human-robot interaction: Modeling, measuring, and predicting
Asher et al. Eliciting tacit knowledge in professions based on interpersonal interactions
Wagner et al. Psychological modeling of humans by assistive robots
Remland et al. Uses and consequences of nonverbal communication in the context of organizational life
Schneeberger et al. Towards a deeper modeling of emotions: The Deep method and its application on shame
Zeyda et al. Your body tells more than words–Predicting perceived meeting productivity through body signals
Yates Affective intelligence in built environments
Ghazy The evolution of well-being approach within the Industry 5.0 concept
Khaled et al. The evolution of well-being approach within the Industry 5.0 concept
Connors et al. Movement Pattern Analysis (MPA): decoding individual differences in embodied decision making
Dianiska et al. Communication Objectives Model (COM): A Taxonomy of Face-to-Face Communication Objectives to Inform Tele-Presence Technology Adoption
Patulny ‘The New Economy and the Privilege of Feeling’: Towards a Theory of Emotional Structuration
Keary Affective Computing for Emotion Detection using Vision and Wearable Sensors
Taheri Multimodal Multisensor attention modelling
Jarvie Making sense of employment after a cardiac arrhythmia diagnosis
Törmänen Emotion regulation in collaborative learning: students’ affective states as conditions for socially shared regulation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
  Ref document number: 07815654
  Country of ref document: EP
  Kind code of ref document: A1
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase
  Ref country code: DE
WWE Wipo information: entry into national phase
  Ref document number: 2007327315
  Country of ref document: AU
ENP Entry into the national phase
  Ref document number: 2007327315
  Country of ref document: AU
  Date of ref document: 20071130
  Kind code of ref document: A
122 Ep: pct application non-entry in european phase
  Ref document number: 07815654
  Country of ref document: EP
  Kind code of ref document: A1