AU2007327315B2 - Method and system for monitoring emotional state changes - Google Patents

Method and system for monitoring emotional state changes

Info

Publication number
AU2007327315B2
Authority
AU
Australia
Prior art keywords
data
emotional state
subject
emotional
state change
Prior art date
Legal status
Active
Application number
AU2007327315A
Other versions
AU2007327315A1 (en)
Inventor
Rajiv Khosla
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from AU2006906746
Application filed by Individual
Priority to AU2007327315A
Publication of AU2007327315A1
Application granted
Publication of AU2007327315B2
Assigned to LA TROBE UNIVERSITY (amendment of patent request; Assignors: LATROBE UNIVERSITY)
Assigned to KHOSLA, RAJIV (request for assignment; Assignors: LA TROBE UNIVERSITY)
Legal status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition
    • G06V 40/176 Dynamic expression
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/70 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Physiology (AREA)
  • Hospice & Palliative Care (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Evolutionary Computation (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Signal Processing (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A method and system are provided for monitoring emotional state changes in a subject based on facial expressions. First facial image data and second facial image data of the subject are captured at first and second times respectively, and the first and second facial image data are processed to produce emotional state change data for the subject for the period of time between the two captures. In some embodiments a sequence of stimuli is applied to the subject and emotional state change data correlated with each stimulus is acquired. The method and system can be used for modelling relationships between cognitive and emotional responses. Models and subject emotional state change analysis can be employed in a variety of contexts such as management decision making, recruitment, evaluation, education and data mining.

Description

METHOD AND SYSTEM FOR MONITORING EMOTIONAL STATE CHANGES

Field of the invention

The field of the present invention is the automated monitoring of emotional state changes. An example of an application for the method of the present invention is monitoring a person's emotional reactions during an interview or assessment process.

Background

Facial expressions, vocal expressions, gesture, posture, gait and physiological data such as blood pressure and heart rate can all be used to analyse the emotional state of a human subject. Measuring physiological data typically requires some physical intrusion, such as the application of heart rate or blood pressure monitoring equipment, which can be awkward or discomforting for the subject. Non-invasive monitoring of emotional states typically requires a trained person to observe the facial expressions, gestures, postures and vocal expressions of a subject and make an educated, but subjective, assessment of the subject's emotional state based on these observations. However, changes in facial expression are often very slight and difficult to observe or qualify.

Emotional systems in humans influence many cognitive processes (e.g. decision making, focus and attention, goal generation, categorisation). Recognition of emotional information is a key part of human-human communication. In some contexts, such as a conversation or interview, the correlation between a person's expression and their words or actions can influence the level of confidence or trust another person has in them. For example, in a police interview the level of confidence in the subject's responses to questions can be influenced by observation of their expressive reactions when providing each response, and the interviewer gets a sense of whether the person is lying or withholding information based on these observations. However, this is a subjective assessment and is dependent on the skill of the interviewer. A known example of an objective measure of a person's reactions during an interview is the polygraph or "lie detector" test occasionally used during police interviews, where physiological reactions are monitored and used to indicate whether a person is being truthful. However, this is an invasive process requiring the subject to be connected to various physiological monitoring equipment.

Looking at the context of recruitment, statistics show that the most widely used tool for selecting a candidate is the interview. Eighty-five to ninety-six percent of organizations rely on this mode as their main recruitment strategy. An interview is a complex social interaction between candidate and interviewer. The interviewing process assesses applicant interests, motivation or affinity for a particular job. Personal qualities such as oral communication, decisiveness and manner of self-presentation are evaluated in the interview. However, a problem with the interview process is that decisions are based on the interviewer's subjective analysis of the candidate's reactions.

Social psychology research shows that attitude similarity may lead to interpersonal attraction. Research has shown that interviewers prefer candidates whom they like personally and perceive to be similar to them. Another study found that interviewer behaviour has a significant biasing effect on performance ratings.
Non-verbal communication such as eye contact, head movement, posture, smiling, speech fluency, body orientation and voice modulation influences the rating of candidates in the interview. Further, the interview is primarily based on information provided in the candidate's resume. It does not include information, for example, about the behavioural capabilities of the candidate. Also, the success of this approach is limited by the subjectivity and mood of the interviewer and the time constraints within which the decision is made. The time constraint of an interview does not facilitate complete evaluation of the sales candidate.

In order to introduce some objective measures and consistency into the recruitment process, some organizations employ psychometric techniques and aptitude tests. Aptitude tests, cognitive tests, intelligence tests and personality tests can be used in selecting candidates. These tests evaluate variables such as aptitudes, achievements, attitudes and other cognitive and non-cognitive characteristics, as well as personality traits.

Information technology (IT) has been implemented in crucial HR functions such as recruitment, selection, training and performance appraisal. Psychometric testing uses IT in a limited way. The DISC® Classic and the Myers-Briggs Type Indicator® are the most commonly used indicators for personality testing within the IT paradigm.

The DISC personal profiling system is a personality and behavioural testing profile using a four-dimensional model of normal behaviour in an assessment, inventory and survey format, either as a self-scored paper or online version. The four dimensions of behaviour are: Dominance, Influence, Steadiness and Conscientiousness/Compliance (DISC). The four-dimensional human behaviour can be studied using a two-axis model based on a person's actions in a favourable or an unfavourable environment, providing observational methods to demonstrate how four primary emotions are related to a logical analysis of neurological results.

The DISC personal profile system can present a plan to help understand candidates and others in a specific environment. Through using these profiles it is possible to understand their behaviour and temperament and identify the environment most conducive to personal and organisational success. At the same time, candidates can learn about the ways others differ, and the different environments people need for maximum productivity and teamwork in an organisation. Research evidence supports the conclusion that the most effective candidates are those who know themselves, recognise the demands of a situation, and adapt strategies to meet those needs. By maximising strengths and minimising weaknesses, these profiles can help ensure that individuals are able to perform to potential. These personal profiles enable candidates to identify their behavioural, character and temperament profiles, capitalise on their strengths, anticipate and minimise potential problems and conflict, and read and understand others better.

The DISC model does not measure personality as such but rather behaviour in a specific situation. Unlike other profiles, the DISC model assumes that candidates have the ability to choose their preferred style and behave in a manner over which they have some control.
Although everyone tends to have a default behaviour style, by being aware of the process they are able to understand themselves and those they deal with, to ensure they can gain maximum benefit from any situation. Thus, DISC has been used in many areas including team development, leadership, change management, negotiation, sales and conflict resolution.

The Myers-Briggs Type Indicator® is a questionnaire based on the psychological teachings of Carl Jung. The Myers-Briggs Type Indicator® measures a person's preferences using four basic scales with opposite poles. The four scales are: (1) Extraversion/Introversion (describes where people prefer to focus their attention and get their energy: from the outer world of people and activity or their inner world of ideas and experiences); (2) Sensing/Intuition (describes how people prefer to take in information: focused on what is real and actual or on patterns and meanings in data); (3) Thinking/Feeling (describes how people prefer to make decisions: based on logical analysis or guided by concern for their impact on others); and (4) Judging/Perceiving (describes how people prefer to deal with the outer world: in a planned, orderly way or in a flexible, spontaneous way).

The Indicator is a very useful tool to enlarge and deepen candidates' self-knowledge and understanding of their behaviour. It can be of real benefit to them in making informed life choices and in relationship building. From a recruiter's perspective it can be used to help understand the candidate's personality and preferences. It is not a test; there are no right or wrong answers. It is an "indicator" of candidates' personality as they see the "real them".

There are limitations to the value of these techniques. These techniques do not yield an absolute score. Performance on these tests is typically relative, and scores have significance only against some reference. Further, psychometric techniques use indirect questions for evaluation. These questions may not be well understood by the candidates, leading to unpredictable results, or by the interviewers, leading to poor interpretation of results. For these reasons psychometric tests can be viewed as an ineffective tool for predicting a person's emotional behaviour in a workplace.

There is a need for a method of obtaining data indicating how a person actually reacts to a given situation or question, to objectively acquire information indicative of the person's attitude or emotional reaction to the situation. To this end it is desirable to be able to monitor changes in emotional state using observable physical reactions, such as changes in facial expression, in a non-invasive manner. Further, it is desirable to provide a means for applying the acquired information in a decision making context.

Summary of the invention

According to a first aspect of the present invention there is provided a method for monitoring emotional state changes in a subject based on facial expressions, the method comprising the steps of:
capturing first facial image data of the subject and second facial image data of the subject at first and second times respectively; and
processing the first and second facial image data to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data.
According to another aspect of the present invention there is provided a system for monitoring emotional state changes in a subject based on facial expressions, the system comprising:
an image capturer adapted to capture first facial image data and second facial image data of the subject at first and second times respectively; and
a processor adapted to process the first and second facial image data to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data.

Preferably a series of facial images of the subject is electronically captured as facial image data, wherein each facial image is separated from the next by a designated time period. For example, a series of facial images may be captured by a visual image recorder as frames of a video stream. The first and second facial image data may be selected from this series of facial images.

Preferably processing of the first and second image data includes identifying changes between the first and second facial image data using an image processor, and analysing the changes between the facial image data for the first and second facial images to produce the emotional state change data.

Preferably the emotional state change data characterises the direction of the emotional state change (positive or negative). Preferably the emotional state change data also characterises the intensity of the emotional state change.

In an embodiment the facial images are normalised to compensate for changes in head placement between the first and second images. For example, geometric normalisation based on tracking the eyes or the relative location of facial features or positions on the face or head, such as the ears, nose, forehead or hair line, may be used.

Changes between facial images may be identified and characterised to provide emotional state data using a number of image processing and analysis techniques. In one embodiment of the present invention Gabor wavelets are used to identify changes between facial images and these changes are analysed using a neural network to provide the emotional state change data. For example, the neural network can be trained to classify emotional state changes.

In another embodiment of the present invention changes between the facial images indicative of emotional states are determined using an optical flow algorithm to track displacement of selected facial features or selected pixels, the selected pixels being indicative of facial features associated with emotional expression. For example, fuzzy logic rules can be used to determine emotional state change data based on the direction and magnitude of the displacement of each feature.

Optionally the emotional state change data can include data indicating the time an emotional state change occurred. This time may be relative or absolute. For example, time-stamping the start of a sequence of images can be used to determine the time of an emotional state change relative to the start time of the sequence, or recording the time each individual image within the sequence was recorded can be used to provide an absolute or actual time the emotional state change occurred.
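By way of illustration only, the following sketch shows one way the geometric normalisation option described above might be implemented in Python with OpenCV. The function name, canonical eye positions and output size are invented for this example; the eye coordinates could come from any eye locator.

import cv2
import numpy as np

def normalise_face(gray, left_eye, right_eye, size=128):
    # Rotate, scale and translate the face image so the eye centres land at
    # fixed canonical positions, compensating for head placement changes
    # between the first and second captured images.
    dst_left = (0.3 * size, 0.35 * size)    # canonical eye positions
    dst_right = (0.7 * size, 0.35 * size)   # (an arbitrary choice)

    # Rotation angle and scale that map the detected eye pair onto the
    # canonical pair.
    dx, dy = right_eye[0] - left_eye[0], right_eye[1] - left_eye[1]
    angle = np.degrees(np.arctan2(dy, dx))
    scale = (dst_right[0] - dst_left[0]) / np.hypot(dx, dy)

    # Rotate and scale about the midpoint between the eyes, then shift that
    # midpoint to the canonical midpoint.
    centre = ((left_eye[0] + right_eye[0]) / 2.0,
              (left_eye[1] + right_eye[1]) / 2.0)
    M = cv2.getRotationMatrix2D(centre, angle, scale)
    M[0, 2] += (dst_left[0] + dst_right[0]) / 2.0 - centre[0]
    M[1, 2] += (dst_left[1] + dst_right[1]) / 2.0 - centre[1]
    return cv2.warpAffine(gray, M, (size, size))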
According to another aspect of the present invention there is provided a method for correlating emotional state changes with a subject's cognitive response to a stimulus, the method comprising the steps of:
applying a stimulus to the subject to evoke a cognitive response from the subject;
monitoring the subject during the time the stimulus is applied to obtain emotional state change data indicative of the emotional reaction of the subject to the stimulus; and
associating the emotional state change data with the cognitive response to the stimulus.
According to another aspect of the present invention there is provided a system for correlating emotional state changes with a subject's cognitive response to a stimulus, the system comprising:
a monitor adapted to monitor the subject on application of a stimulus to evoke a cognitive response, to obtain emotional state change data indicative of the emotional reaction of the subject to the stimulus; and
a processor adapted to associate the emotional state change with the cognitive response to the stimulus.

Preferably the monitor captures data indicative of the subject's emotional state changes during application of the stimulus. The captured data is then analysed to provide emotional state change data for the subject during the time the stimulus was applied.

Preferably a sequence of stimuli is applied to the subject, the subject's emotional reactions are monitored for the duration of the application of the sequence, and the recorded data is analysed to provide emotional state change data associated with each stimulus in the sequence. For example, the recorded data may include time-stamp data indicating the timing of the application of each stimulus. Alternatively, where the timing of the application of each stimulus in the sequence is known, the relative timing of each emotional state change from the time the first stimulus was applied can be compared with the relative timing of the application of each stimulus in the sequence. The cognitive response may also be recorded, and the timing of each cognitive response also used for correlation of the cognitive and emotional responses. For example, the time taken between the application of a stimulus, such as asking a question, and the making of a cognitive response, such as answering the question, can be used.

A cognitive response is a response made by the subject employing their knowledge, perception or conception. For example, a cognitive response is a response made consciously, generally employing reason; the cognitive response may be verbal, written, key strokes made on a computer keyboard or the key pad of a user device, or a selection from a number of objects/colours/letters/sounds, etc. The form of cognitive responses may be related to the stimuli. In some instances a cognitive response may be non-verbal, such as a gesture, for example raising an arm or finger in response to a question or instruction, or pointing to a selected item from a set. Cognitive responses may include a subject's actions and interactions with their environment, for example selections of items, links or articles to view in an on-line environment, or actions taken in a physical environment.

A predetermined sequence of stimuli can be prepared and applied to the subject, such as a survey, game, website or structured environment such as a simulation. This sequence may also be adapted or modified. For example, the order or structure of the sequence may be modified to avoid repetition and learning of the sequence. In another example, the sequence may be dynamically adapted in response to responses given during the course of the sequence. Alternatively the sequence of stimuli may involve the interaction between the subject and their environment, for example the reaction of a medical patient to pain they experience or the effect of treatments, or a person performing an Internet search and the web sites they enter.
In these cases both the cognitive actions of the individual who is the subject of the monitoring and their emotional reactions are captured and time stamped for correlation. Also, where appropriate, a third party's involvement, such as the questions asked or treatment applied by a doctor, can also be monitored and time stamped for correlation with the subject's responses.
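As a concrete illustration of this time-stamp correlation, the hypothetical sketch below pairs each stimulus with the emotional state changes whose time stamps fall in the window between that stimulus and the next; all data structures and field names are invented.

from dataclasses import dataclass

@dataclass
class Stimulus:
    time: float    # when the stimulus (e.g. a question) was applied
    label: str

@dataclass
class StateChange:
    time: float        # when the change was detected in the image stream
    direction: int     # +1 positive, -1 negative, 0 neutral
    intensity: float

def correlate(stimuli, changes):
    # Group state changes into the window between each stimulus and the
    # next, using a common clock (e.g. seconds from the start of recording).
    stimuli = sorted(stimuli, key=lambda s: s.time)
    grouped = {}
    for i, stim in enumerate(stimuli):
        end = stimuli[i + 1].time if i + 1 < len(stimuli) else float("inf")
        grouped[stim.label] = [c for c in changes if stim.time <= c.time < end]
    return grouped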
In an embodiment where the subject's reactions to stimuli are monitored by recording facial images, the above described method and system for monitoring emotional state changes using facial images can be used to obtain emotional state change data. The synchronisation of the emotional state changes with the cognitive responses of the subject can be based on the timing of the visual image recording. For example, the selection of images out of a series of images, to compare for determining emotional state changes, can be based on the timing of the application of the stimulus and/or the subject's response.

Preferably, data can be presented correlating the subject's cognitive response to a stimulus with the emotional state change data exhibiting the emotional change experienced by the subject during the application of, and response to, the stimulus. In some instances this data can simply be presented as an output for analysis by a person. Alternatively further analysis can be performed on the correlated cognitive response data and emotional state change data. For example, the emotional state change data can be applied to qualify a subject's cognitive response. In another example a series of cognitive and emotional responses can be analysed to determine trends in emotional state changes in comparison with cognitive responses.

Analysis of correlated cognitive response data and emotional state change data may also include comparison of data between individual subjects, among a group of subjects, or between a subject and model response data and emotional data.

The method and system for correlating emotional state changes with a subject's response to stimulus can further be used for modelling correlated cognitive and emotional responses. For example, an emotional response model may be developed based on a typical or desired individual subject, for example a model employee for a particular role. Preferably such a method for developing an emotional response model comprises the steps of:
selecting a model subject;
applying a predetermined sequence of stimuli to the subject to elicit a series of cognitive responses from the subject;
monitoring the cognitive responses of the subject to each stimulus and recording the cognitive response data;
monitoring the subject during the time when each stimulus is applied and automatically recording data indicative of the emotional reaction of the subject to each stimulus;
analysing the recorded data to provide emotional state change data for the subject during the time when each stimulus was applied;
correlating the cognitive response data with the emotional state change data for each stimulus; and
developing an emotional response model for the sequence of stimuli based on the correlated data.

Modelling correlated cognitive and emotional responses can also be based on a number of subjects, using the method above applied to a number of subjects and performing further modelling analysis. The further modelling analysis may vary depending on the number of subjects.

In one embodiment, where enough subjects are monitored to provide a statistically valid sample, a model may be developed based on statistical norms within the sample, as sketched below. The above steps can be applied to each subject in the sample and statistical analysis performed on the data acquired from all the subjects.
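One plausible reading of a model based on statistical norms is per-stimulus averaging across the sample; the sketch below (all names invented) builds such a benchmark as a per-stimulus mean and standard deviation of the emotional responses.

import statistics

def build_benchmark(responses):
    # responses[subject][stimulus] is a signed emotional state change
    # intensity for that subject and stimulus.
    stimuli = next(iter(responses.values())).keys()
    model = {}
    for s in stimuli:
        values = [subject[s] for subject in responses.values()]
        model[s] = {
            "mean": statistics.fmean(values),
            "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
        }
    return model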
In another embodiment subjects may be classified into groups based on particular trends or similarities in individual subjects' cognitive data and/or emotional response data, and the model based on particular trends within a group. For example, out of a sample of subjects who all complete the above steps, the data can be analysed to select a group of subjects who all showed similar cognitive responses. The emotional responses can also be analysed to identify similarities across the group or to qualify cognitive responses. A model can be developed based on the combined cognitive and emotional responses for the group. The analysis performed to develop such a model can vary depending on the size of the group and the context. For example, in a recruitment context the sequence of stimuli may be a survey relating to job behaviour and attitudes, which may produce very distinct trends in both cognitive and emotional responses, in contrast to a holiday destination selection context where reactions may show greater variation across a sample of subjects, as the selection of a holiday destination is subject to whim whereas responses to a job related survey are based on expectations of a particular job role or task.

According to another aspect of the present invention there is provided a method of acquiring cognitive and emotional state change data, the method comprising the steps of:
monitoring a subject's cognitive actions while performing a task;
capturing cognitive data associated with the task;
monitoring the subject's emotional reactions while performing the task;
capturing data indicative of the subject's emotional state changes during the task;
analysing the captured data to provide emotional state change data for the subject during the time when the task is being performed; and
correlating the cognitive data with the emotional state change data for each cognitive response.

The cognitive and emotional state change data can then be applied in a decision making context.

According to another aspect of the present invention there is provided a system adapted to present correlated cognitive and emotional state change data for use in decision making, the system comprising:
a controller arranged to apply a decision making model for executing a plurality of decision making phases and applying rules in accordance with the model; and
a plurality of agents each adapted to perform a function for use in the decision making process, at least one of the agents being adapted to provide emotional state change data associated with a decision making context,
the controller and agents being implemented such that the controller is adapted to request actions be performed by the plurality of agents in accordance with the decision making model.
For example, the controller controls the execution of a number of decision making phases such as:
a pre-processing phase for data acquisition and manipulation of raw data;
a context elicitation phase for analysis of the data to identify a situation with a problem to be solved;
a situation interpretation labelling phase for identifying the required tasks and data associated with the problem to be solved according to a predetermined decision making model;
a situation action phase for executing tasks associated with the identified problem based on the data; and
a situation adaptation phase for monitoring results from task execution and reactions from outside the system to the results, including emotional reactions, enabling these external reactions to be fed back into the system for adaptation of the predetermined model based on the external reactions.

Preferably the controller and agents are arranged in a layered architecture, the layers comprising:
a reactive layer including agents for raw data acquisition and basic data manipulation;
an intelligent technology layer including agents for data mining and identification of patterns in the acquired data;
a cognitive sensemaking layer for coordinating the use of agents associated with other layers and interpretation of data in accordance with a decision making context;
an affective sensemaking layer including agents for the acquisition and analysis of emotional state change data;
a situation adaptation layer including agents for monitoring decision making results and reactions thereto, and agents for analysing the reactions in the context of the situation for feedback into the decision making context and adaptation of the decision making model;
a distribution coordination layer including agents for coordinating communication between user and agent; and
an object layer including domain agents for use by the agents of the other layers to facilitate data processing and presentation for the user.

In this model the cognitive sensemaking layer coordinates the activity of the various layer agents in accordance with a decision making model associated with a context. Agents include procedures implemented in software and/or hardware for performing functions, for example interaction between the decision making system and the user environment, acquisition of data, or analysis of data according to the given function. For example, one agent may acquire facial images from a system user and analyse these to provide emotional state change data, another agent may record information input to the system from another source, such as historical data for an individual from a database or other source, and another agent may perform analysis to compare current information input to the system by the user with the historical data. The use or calling of the agents is controlled by the cognitive layer based on the decision making context and the decision making rules being applied, with the parameters or data provided to the agents by the system also based on the decision making rules.

Preferably each agent performs a specific function which is independent of the decision making context.
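A minimal sketch of this controller/agent pattern follows; all class and method names are invented. Each agent exposes one context-independent function, and the controller sequences the agents according to the phases of a decision making model.

class Agent:
    def perform(self, data):
        raise NotImplementedError

class HistoryAgent(Agent):
    def perform(self, data):
        # Would fetch historical records for the subject from a database.
        return {"history": []}

class FaceChangeAgent(Agent):
    def perform(self, data):
        # Would invoke the facial image processing described earlier and
        # return emotional state change data.
        return {"emotion_change": ("positive", 0.8)}

class Controller:
    # The decision making model is an ordered list of (phase, agent) pairs,
    # e.g. pre-processing, context elicitation, situation action, adaptation.
    def __init__(self, model):
        self.model = model

    def run(self, data):
        results = {}
        for phase, agent in self.model:
            results[phase] = agent.perform(data)
        return results

controller = Controller([("pre-processing", HistoryAgent()),
                         ("affective", FaceChangeAgent())])
outcome = controller.run({"subject": "candidate-1"})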
The decision making context, such as determining credit card approval, suggesting holiday destinations, prescribing medical treatments, etc., determines the decision making model and rules which are applied to call the agents and to use the results of the agents' functions in the decision making context.

Brief description of the drawings

Figure 1 illustrates an example of a system for monitoring emotional state changes.
Figure 2 is a functional block diagram of the system of Figure 1.
Figure 3 shows the affect space model used for mapping emotional states.
Figure 4 shows an example of image processing and analysis according to one embodiment of the invention using Gabor wavelets for image processing and a neural network for analysis to determine emotional state changes.
Figure 5 is an example of a feature identification mask used in an embodiment of the invention using an optical flow algorithm for image processing.
Figure 6 is an example of the results of the optical flow algorithm indicating the difference in facial features between two facial images.
Figure 7 is an illustration of examples of relative movements of facial features mapped to emotional state changes.
Figures 8a, b and c illustrate a series of emotional state changes over time with consistent intensities.
Figures 9a, b and c illustrate a series of emotional state changes over time having varying intensities.
Figures 10a, b and c illustrate examples of transient emotional state transitions exhibited during a transition from one absolute emotional state to another.
Figure 11 is a functional block diagram of a system for correlating a subject's emotional state changes with their cognitive responses.
Figure 12 illustrates a salesperson behavioural model mapped onto fuzzy categories for the application of fuzzy rules.
Figure 13 illustrates Maslow's hierarchy of personal needs overlaid with transitory behaviour categories of the salesperson behavioural model.
Figure 14 illustrates transitory behaviour categories of the salesperson model.
Figure 15 illustrates the steps for determining a sales candidate's primary fuzzy selling behaviour.
Figure 16 illustrates a sales candidate's cognitive responses to a number of questions relating to competition.
Figure 17 illustrates a candidate's emotional state changes experienced while answering a number of questions relating to competition.
Figure 18 illustrates a candidate's cognitive responses and associated emotional state changes for questions relating to success and failure mapped onto a common axis.
Figure 19a illustrates parallel responses between a subject and a benchmark.
Figure 19b illustrates opposing responses between a subject and a benchmark.
Figure 20a illustrates comparison of a sales candidate's emotional state changes with those of a benchmark.
Figure 20b illustrates comparison of a sales candidate's cognitive responses with those of a benchmark.
Figure 20c illustrates comparison of a sales candidate's emotional state changes over the course of an entire survey with those of a benchmark.
Figure 21 illustrates an example of an emotionally intelligent sales recruitment system.
Figure 22 illustrates an example of a system for automatically providing feedback regarding the emotional states of students in a lecture context.
Figure 23 illustrates an example of a system for monitoring a critical event operator's cognitive and emotional responses.
Figure 24 illustrates an example of a system for monitoring a person's behaviour and emotional state changes during on-line holiday planning.
Figure 25 illustrates an embodiment of a system adapted for decision making and adaptive decision making using a combination of monitored cognitive and emotional reactions.
Figure 26 illustrates a typical object-based domain for banking products.
Figure 27 illustrates an example of an application of the decision making system of Figure 25 applied within a banking product domain.
Figure 28 illustrates an example of situation-adaptation layer agents for credit card approval.
Figure 29 illustrates the performance comparison of the neural network back propagation agent without the adaptation applied and after adaptation.

Detailed description

The description below has been separated into a number of sub-headings for ease of location of information by the reader and clarity of description only. The use of these sub-headings should not be considered to limit the scope of the invention to the embodiments described. Many alternatives to and permutations of the combinations of features described are possible, and all such alternatives are considered to fall within the scope of the present invention.

Monitoring of emotional state changes based on facial expressions

One way of monitoring emotional state changes is based on monitoring changes in facial expressions. An embodiment of a system and method for monitoring emotional state changes in a subject based on facial expressions is illustrated in Figures 1 and 2. The system 100 comprises a visual image recorder 120 and a processor 130. The visual image recorder 120 is adapted to capture a series of facial images 200 of the subject 110, wherein each facial image is separated from the next by a known short period of time. The processor 130 is adapted to identify changes between a first 210a and a second 210n facial image from the series 200, and analyse the changes in the facial images 210a, 210n to provide emotional state change data for the subject for the period of time between the capture of the first facial image 210a and the second facial image 210n.

The first step in the monitoring of emotional state changes is the capture of a sequence 200 of facial images 210a to 210n of the subject 110, wherein each image is separated by a known short period of time. For example, the visual image recorder 120 can be a video camera which captures images and digitally stores them either within memory within the camera or in an external memory device such as the memory of a computer linked to the camera. A digital video camera will capture a predetermined number of images, also known as frames, per second. Typically the number of frames per second is constant for a recording period. The number of frames per second captured may vary depending on the type of camera, or be variable for a camera depending on a user-selected property.
Thus where the images are captured using a digital video camera, the length of time between the capture of each image in the sequence will be known.

A first and a second image from a sequence of images are compared and changes between the two images identified. The first and second images can be sequential images captured one after the other, for example image 210b and image 210c, or separated by a number of images, for example first image 210a and second image 210n. The selection of the images may be based on the timing of the images or the time separation of the images. Alternatively the first and second images may be selected based on an external trigger, such as a first image captured immediately before an external stimulus is applied and a second image captured after the stimulus is applied. For example, the second image may be captured half a second after the stimulus is applied.

Image processing 220 identifies changes between the first and second images. These facial image changes 230 are analysed 240 to provide emotional state change data 250 for the subject for the period of time between the capture of the first facial image 210a and the second facial image 210n.

The emotional state change data characterises the emotional state change using at least one measure, such as the direction, positive or negative, or the intensity of the emotional state change.
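To make the timing concrete, the short sketch below (function names invented) maps capture times to frame indices for a constant frame rate and selects a pair of frames around an external stimulus, such as one frame immediately before the stimulus and one frame half a second after it.

def frame_index(t, fps):
    # Frame captured closest to time t, assuming a constant frame rate and
    # times measured from the start of the recording.
    return int(round(t * fps))

def select_pair(stimulus_time, fps, lag=0.5):
    # First image immediately before the stimulus, second image lag seconds
    # after it.
    first = max(frame_index(stimulus_time, fps) - 1, 0)
    second = frame_index(stimulus_time + lag, fps)
    return first, second

# At 10 frames per second and lag=0.5 the second image is five frames after
# the stimulus was applied.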
Facial expressions are an important physiological indicator of human emotions. An affect space model 300 used by psychologists is shown in Figure 3. The model 300 uses three dimensions, each represented on an axis of the model. The first dimension is Valence, measured on a scale of pleasure (+) to displeasure (-), as represented on the valence axis 320. The second dimension is Arousal, measured on a scale of excited/aroused (+) to sleepy (-), as represented on the arousal axis 310. The third dimension is Stance, measured on a scale of high confidence (+) to low confidence (-), as represented on the stance axis 330. Facial expressions correspond to affect states such as happy 340, surprise 350 and bored 360.

The affect space model 300 characterises absolute emotional states based on the three dimensional measures: valence, arousal and stance. Versions of the affect space model have mapped affect states onto facial expressions. Psychologists point out that facial expression alone may not be an accurate indicator of the emotional state of a person, but changes in facial expression may indicate a change in emotional state.

Embodiments of the present invention are concerned with determining the change in emotional state rather than determining the absolute emotional state of the subject. The emotional state change data can be a measure indicating change in only one dimension, for example whether the emotional state change is in a positive or negative direction, or neutral in the case of no change. Alternatively the emotional state change may be a vector indicating a direction (+ve or -ve) and intensity of the change. For example, in one embodiment of the invention the changes in the facial image are mapped to a vector indicating the transition between two states of an affect space model 300. For example, a transition between the emotional states of anger 355 and fear 365 is indicated by a reduction in the level of arousal without a change in valence, whereas a transition between anger 355 and defensive 390 also includes a negative valence change and some negative stance change. There are a number of ways such changes can be determined based on facial expressions.

A number of methods can be used to identify changes between the first and second images and to analyse these changes to provide emotional state change data. The methods used for analysing the identified facial image changes to determine the emotional state change can vary depending on the image processing method used to identify the facial image changes.

Two common techniques used for feature representation in face recognition and analysis are eigenvectors and Gabor wavelets. Gabor wavelet filters at varying scales and orientations have been used to represent facial expression images. In one embodiment of the present invention the image processing step, described with reference to Figure 4, uses Gabor wavelets to identify the facial image changes between subsequent images 410a, 410b and 410c.

The image processing step 220 of this embodiment includes the steps of pre-processing to obtain difference images between frames in a video stream, and feature representation using Gabor wavelets. The analysis step 240 includes classification into positive, negative or neutral emotional states by a neural network.
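As a toy illustration of the affect space representation discussed above, the sketch below treats an emotional state as a point in (valence, arousal, stance) space and a state change as the vector between two such points; the axis conventions follow Figure 3 and everything else is invented.

import math

def state_change(before, after):
    # before and after are (valence, arousal, stance) points; the change is
    # the vector between them.
    delta = tuple(a - b for a, b in zip(after, before))
    intensity = math.sqrt(sum(d * d for d in delta))
    # The sign of the valence component gives a simple one-dimensional
    # positive/negative/neutral direction measure.
    if delta[0] > 0:
        direction = "positive"
    elif delta[0] < 0:
        direction = "negative"
    else:
        direction = "neutral"
    return direction, intensity, delta

# e.g. an anger-to-fear transition: arousal drops, valence roughly unchanged.
print(state_change(before=(-0.6, 0.7, 0.2), after=(-0.6, 0.2, 0.1)))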
Pre-processing can include the optional steps of finding the eye coordinates or other facial features in the video stream, and geometric normalisation centred on the eyes (or other facial features) prior to generation of the difference images. To expedite the location of the eyes for the purposes of the example shown in Figure 4, a simple method of placing two dots 401, 402 on the candidate's forehead while recording video was used. Eye coordinates were estimated relative to these dots 401, 402.

Define the current image I_t and the image n frames earlier I_(t-n); then we have the difference image:

    I_d = I_t - I_(t-n)    (Eq. 1)

Here n was selected to be 3, i.e. the subtracted image was the image three frames earlier. For example, in Figure 4, where image 410a is the first selected image, image 410b is the image 3 frames after image 410a, and similarly image 410c is the image 3 frames after image 410b. The video was recorded at 10 frames per second in this example, so the time difference between images 410a and 410b, and between images 410b and 410c, was approximately 300 milliseconds. Figure 4 shows the difference images 420a, 420b, 420c between two subsequent images of a subject determined using Gabor wavelets, and the subject's emotional state responses 430a, 430b and 430c determined based on each difference image.

The difference image 420b represents a plurality of vectors of Gabor jets calculated at points on a grid overlaid on the images 420a and 420b after normalisation. A Gabor jet is the magnitude of the complex-valued Gabor wavelet filter output which is applied to determine the difference between the first 410a and second 410b images. The filters used were at three scales an octave apart and at six orientations within each scale. The bandwidth of the masks was approximately an octave.
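The following sketch suggests how such features might be computed with OpenCV. The kernel size, wavelengths and grid step are illustrative choices, not the patent's parameters, and the complex Gabor response is approximated from an even/odd phase pair of real kernels.

import cv2
import numpy as np

def difference_image(frames, t, n=3):
    # Eq. 1: the current frame minus the frame n frames earlier, in int16 so
    # negative differences are preserved.
    return frames[t].astype(np.int16) - frames[t - n].astype(np.int16)

def gabor_jets(diff_img, grid_step=16, wavelengths=(4, 8, 16), n_orient=6):
    # Jet = magnitude of the complex Gabor response, sampled on a grid over
    # the (normalised) difference image; three scales an octave apart and
    # six orientations per scale, as in the example above.
    feats = []
    for lam in wavelengths:
        for k in range(n_orient):
            theta = k * np.pi / n_orient
            even = cv2.getGaborKernel((31, 31), lam / 2.0, theta, lam, 0.5,
                                      psi=0)
            odd = cv2.getGaborKernel((31, 31), lam / 2.0, theta, lam, 0.5,
                                     psi=np.pi / 2)
            re = cv2.filter2D(diff_img, cv2.CV_32F, even)
            im = cv2.filter2D(diff_img, cv2.CV_32F, odd)
            mag = cv2.magnitude(re, im)
            feats.append(mag[::grid_step, ::grid_step].ravel())
    return np.concatenate(feats)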
For display purposes the output of the network was visualised by an image 430a, 430b and 430c where different 30 areas of the image 431, 432 and 433 represented different classes and/or relative strength of the three different classes namely neutral 433, positive 432 or negative 431 emotion state change. The output of the neural net, three nodes representing the three classes, where mapped to red, 35 green and blue for negative, positive and neutral change respectively. The top half of the visualisation image 434 was a colour representing all three classes' colour mixed WO 2008/064431 PCT/AU2007/001854 - 26 together. The bottom half of the image was divided up into three equal areas 431, 432, 433, each devoted to one of the three colour/classes. Ideally it would be expected to display either of red, green or blue. However mixing all 5 three classes in the top half if the display image makes it easy to judge where the neural network classification contained an overlap. Once such example 430c with negative emotion state classification 431 is shown in Figure 4. The negative emotion state is represented by the red colour in 10 visualisation image (next to the difference image) is shown in Figure 4. Alternative methods for representing or indicating the emotional state change may also be used. In the Gabor wavelet example shown only the direction, 15 positive or negative or no emotional state change, neutral, are determined. This model does not clearly indicate the intensity of the change. For example the subject has becoming only slightly happier would be represented as a positive emotional state change, the 20 subject becoming a lot happier would also be represented as a positive emotional state change, the output in this example does not distinguish between a slight and a large emotional state change. However, a more complex neural network having a greater number of output nodes could be 25 used to provide more detailed emotional state change information, such as an emotional state change vector indicating the direction and magnitude of the emotional state change. Alternatively a different analysis method may also be used. 30 In another embodiment the image processing method uses an optical flow algorithm to track changes in facial features associated with emotional expression. 35 Optical flow tracks the displacement of selected facial features or selected areas of an image based on the changes in pixels when comparing two images, in this case WO 2008/064431 PCT/AU2007/001854 - 27 consecutive images in a video stream. The selection of features or areas of the image is based on the Facial Action Coding System (FACS) developed by Ekman and Friesen, is a method of measuring facial activity in terms 5 of facial muscle movements. FACS consists of over 45 distinct action units corresponding to a distinct muscle or muscle group. More than 7,000 different action unit combinations have been observed. Though FACS has been criticized as only capturing a spatial description of 10 facial activity and ignoring the temporal component, it is perhaps the most widely used language to describe facial activity at the muscle level. The subtle variations are usefully modelled by tracking 15 the eye shape and movement, eyebrow movement, and cheek and lip movement. 
The facial action units associated with eyes, eyebrows, cheek and lips have been analysed in this project include (but are not limited to) inner eyebrow raiser, outer eyebrow raiser, eyebrow lowerer, upper eye 20 lid raiser, eye lid tightener, depressing lower lip facial, cheek raiser action units and their combinations. In this embodiment these facial action units are tracked/recognised and mapped and mapped to positive and negative emotional states. To track subtle changes we 25 utilise the video stream's inherent temporal relationship between consecutive images (corresponding to consecutive questions) in the stream and recognise the facial action units in consecutive images with time. We then extract the facial changes from the facial units of say two 30 consecutive images for analysis to model the functional relationship between extracted facial changes from facial units and positive and negative emotional states. The particular algorithm used was the Lucas & Kanade 35 algorithm and is an efficient, sparse optical flow algorithm. In the optical flow method the changes in facial features of interest are directly tracked. An WO 2008/064431 PCT/AU2007/001854 - 28 example of this embodiment is described with reference to figures 5 and 6. The video stream (as in the previous example) was used 5 unprocessed as an input to the optical flow algorithm. A template or mask of points to be tracked on the face(s) in the video stream was initialised for the sparse optical flow algorithm to track, as shown in image 500 of figure 5. The tracking points mask used to generate the results 10 in this example was initialised by the use of object detectors to locate faces as a whole and facial features as reference points for points mask placement. In this example we are seeking to track emotional state in 15 the range between positive to negative and variations therebetween. To do this we mapped a number of facial expressions, for example of joy and anger, to positive and negative emotional states respectively. For this example the positive and negative classifications were based on 20 the positive and negative quadrants of the affect space model, however, and alternative model may also be applied. This example tracks whether a person's emotional state is changing towards a positive or negative expression. A 25 simple mask was generated as shown on image 500 in figure 5. This mask consists of eight points 501-508. Two reference points 501, 502 on areas which are relatively inert with respect to movement in facial expression changes were used as reference points. The other six 30 tracking points' 503-508 motions were calculated relative to the two reference points. One of the two reference points 501 is on the forehead and the other reference point 502 is on the nose. The tracking points 503-508 of the mask are placed to correspond to features or areas of 35 the face which move when facial expressions change. In the example shown the tracking points are right eyebrow 503, left eyebrow 504, right cheek 505, left cheek 506, WO 2008/064431 PCT/AU2007/001854 - 29 right upper chin 507 and left upper chin 508. It was found empirically that the six points were sufficient to differentiate between the two facial expressions we were interested in for this example more or less tracking 5 points may be used in alternative embodiments depending on the application. The relative motions of the points 501-508, as calculated by optical flow, are shown in figure 6. 
Diagonally from 10 top left to bottom right there are a set of cross hairs 601-608 representing the relative centres of the points 501-508 in the previous image; forehead 601, nose 602, right eyebrow 603, left eyebrow 604, right cheek 605, left cheek 606, right upper chin (approximately) 607 and left 15 upper chin (approximately) 608. Superimposed on these cross hairs are small circles 611-618 representing the relative position of the corresponding point 501-508 in the current image. So the displacement of the circle 611 618 from the centre of the cross hair 601-608 represents 20 the relative motion of that point 501-508 from image to image. The cross hair 620 is a superimposition of all the points 501-508, with the relative movements represented as blob 629 which corresponds to a superimposition of circles 611, 612 and 615-618, and circles 623 and 624 which 25 correspond to circles 613 and 614. This represents that both the left eyebrow tracking point 504 and right eyebrow tracking point 503 moved further between the two images than any of the other feature tracking points. 30 The axis 630 is used to represent a gauge of positive or negative emotion predicted by the analysis of the points' 501-508 motions; positive being up, negative being down and relative intensity indicated by the distance from the centre of the axis. 35 The points 501-508 from each different area have signature motions when a change in facial expression occurs between WO 2008/064431 PCT/AU2007/001854 - 30 consecutive frames in the video stream. Generally the direction component indicates the class of facial expression component, represented by that point for the specified area the point inhabits, and the displacement 5 represents the intensity. For example the left eyebrow 504 and right eyebrow 503 have a downward and slightly inward direction of motion when a glare, such as exhibited in an expression of anger, is tracked and an upward movement when the eyebrows are raised in an expression of 10 surprise or fear (more input from other points relative motions would be needed to distinguish between surprise and fear). These changes are analysed using fuzzy logic to model these observed movements to produce an indication 640 of relative positive or negative emotion displayed by 15 this guage 630. In this example, the movement of each feature was characterised by the angle and magnitude of the movement in 2 dimensions. The angle was simplified in this example 20 to UP or DOWN and the magnitude classified as HIGH or LOW. The Sugeno method was used for inferencing and defuzzification by weighted average. An example of some fuzzy rules used for the analysis of 25 the movement of tracking points 503-508 to emotional state changes are as follows: if left eyebrow angle == down and 30 left eyebrow magnitude == high and right eyebrow angle == down and right eyebrow magnitude == high and left cheek magnitude == low and right cheek magnitude == low and 35 left mouth magnitude == low and right mouth magnitude == low then WO 2008/064431 PCT/AU2007/001854 - 31 negative emotion (High). if 5 all magnitudes == low then neutral 10 if left eyebrow magnitude == low and right eyebrow magnitude == low and left cheek angle == up and left cheek magnitude == high and 15 right cheek angle == up and right cheek magnitude == high then positive emotion (High). 
    if
        left eyebrow angle == up and
        left eyebrow magnitude == high and
        right eyebrow angle == up and
        right eyebrow magnitude == high and
        left cheek magnitude == low and
        right cheek magnitude == low and
        left mouth angle == down and
        left mouth magnitude == high and
        right mouth angle == down and
        right mouth magnitude == high
    then
        negative emotion (High).

Figure 7 illustrates the relative movements of facial features mapped onto cheek axis 710, eyebrow axis 720 and mouth axis 730 to show the combinations of movements indicating emotional states. This diagram illustrates that it is movements in combination, as used by the fuzzy rules, rather than movements of any single feature, which determine an emotional state change. In the example shown, where the magnitude of all of the mouth, cheek and eyebrow movements is low, this is deemed neutral 760, indicating no emotional state change. Where the mouth moves with a high magnitude in combination with a high magnitude cheek movement in an upward direction and low movement of the eyebrows in either direction, this is deemed a high magnitude positive emotional state change 740. In this assessment the direction of the mouth or eyebrow movements is not considered significant, only the magnitude; however, both the direction and the magnitude of the cheek movement are significant. For example, consider the difference between your face moving from a bland or neutral expression to a smile or to a laugh. In a smile the mouth, measured in our example based on the movement of the upper chin, moves upward, whereas when laughing the mouth opens so the movement of the upper chin is down. So either movement direction, in combination with an upward cheek movement and little movement in the eyebrows, can be interpreted as a positive emotional response of high magnitude. A negative emotional state change 750 is indicated by a combination of low magnitude cheek and mouth movement in either direction with high magnitude downward eyebrow movement.

The example described herein was limited in the scope of expressions to which the algorithm is sensitive, and so the analysis of the points' 503-508 motion is simplified. The motion signatures from the set of six points 503-508 are enough to differentiate between expressions changing towards positive or negative emotional states or neutrality. Essentially the directions of the points in combination were used to classify the expression, and the displacements in combination were used to indicate intensity.

The reference points on the forehead 501 and nose 502 can be used to normalise the measured movements of the tracking points 503-508 to compensate for any head movements.
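A minimal sketch of this style of rule evaluation follows. It assumes a simple two-level grading of movement magnitudes into LOW/HIGH memberships and a Sugeno-style weighted average for defuzzification; the membership breakpoints, feature names and rule outputs are illustrative assumptions rather than values disclosed in this example.

    def grade(magnitude, low_end=2.0, high_end=8.0):
        """Fuzzy membership of a movement magnitude (in pixels) in LOW and
        HIGH. The breakpoints are assumed for illustration."""
        if magnitude <= low_end:
            return {"low": 1.0, "high": 0.0}
        if magnitude >= high_end:
            return {"low": 0.0, "high": 1.0}
        h = (magnitude - low_end) / (high_end - low_end)
        return {"low": 1.0 - h, "high": h}

    def emotion_change(features):
        """features maps names such as 'left_eyebrow', 'right_cheek' or
        'left_mouth' to (angle, magnitude) pairs, where angle is 'up' or
        'down'. Returns a crisp score in [-1, 1]; negative values indicate
        a negative emotional state change, positive values a positive one."""
        g = {name: grade(mag) for name, (ang, mag) in features.items()}
        ang = {name: a for name, (a, _) in features.items()}

        # Rule: both eyebrows down with high magnitude, cheeks and mouth low
        # -> negative emotion (rule output -1).
        r_neg = min(
            g["left_eyebrow"]["high"] if ang["left_eyebrow"] == "down" else 0.0,
            g["right_eyebrow"]["high"] if ang["right_eyebrow"] == "down" else 0.0,
            g["left_cheek"]["low"], g["right_cheek"]["low"],
            g["left_mouth"]["low"], g["right_mouth"]["low"])
        # Rule: all magnitudes low -> neutral (rule output 0).
        r_neu = min(m["low"] for m in g.values())
        # Rule: eyebrows low, both cheeks up with high magnitude
        # -> positive emotion (rule output +1).
        r_pos = min(
            g["left_eyebrow"]["low"], g["right_eyebrow"]["low"],
            g["left_cheek"]["high"] if ang["left_cheek"] == "up" else 0.0,
            g["right_cheek"]["high"] if ang["right_cheek"] == "up" else 0.0)

        # Sugeno defuzzification: weighted average of the rule outputs.
        total = r_neg + r_neu + r_pos
        return 0.0 if total == 0 else (r_pos - r_neg) / total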
The above examples describe two ways in which emotional state changes can be automatically determined from facial images. Depending on the speed at which emotional state changes are to be monitored, for example in real time from live image capture of a subject or from a pre-recorded video, or in non real time from pre-recorded images, or depending on the processing capacity of the system, the complexity of the image processing and emotional state analysis may be varied. For example more facial features may be tracked, more complex neural networks and fuzzy rules may be used, other aspects of facial expressions such as facial hue (for example flush or pallor) or body language such as head angles or posture may also be monitored, or verbal indicators such as tone, speed, pitch and modulation may also be monitored. All these alternatives and variations are contemplated within the scope of the present invention.

Changes in emotional state can be tracked over time; thus both the rate of change of emotional states and the relative magnitude of the emotional state changes can be monitored. For example the emotional state change data may include data indicating the time an emotional state change occurred in addition to the data characterising the emotional state change, such as magnitude and direction.

Further, trends in a person's emotional state changes over time can also give an indication of the person's emotional state change style. This may be over a relatively long duration of time, for example someone who exhibits generally high magnitude emotional state change responses, or over a relatively short duration of time, for example during the period of the person changing from one absolute emotional state to another. Monitoring the transient changes within this absolute change can provide a measure of the absolute emotional state change, for example a stance measure of the level of confidence a person exhibits, which can add a further dimension to the emotional state change data.

Examples of trends in emotional state changes over time are shown in figures 8a-c and 9a-c. The example in figure 8a shows emotional state changes of high magnitude in both positive and negative directions, so the person being monitored appears to have a consistently high magnitude of emotional state change. In figure 8b the person exhibits moderate or medium intensity emotional state changes, and in figure 8c the person is exhibiting only low intensity emotional state changes. These examples can be used to classify a person's emotional state change style, for example very expressive for 8a and controlled or minimally expressive for 8c. Knowledge of this style trend can be of use when comparing the responses of two or more subjects. For example, the magnitude of the emotional state changes can be weighted or normalised when wishing to compare the responses of more and less expressive subjects.

Figures 9a to 9c give examples of variation in emotional state changes over time; the variation in the magnitude of the emotional state changes may be of interest in some situations. Figure 9a shows a series of emotional state changes where the magnitude of the changes is relatively constant; whether the magnitude is high or low may influence the interpretation of this trend. For example, consider that the emotional state changes of a person watching a romantic movie are being monitored: if the magnitude of the emotional state changes is low this may indicate that the person is simply not interested in or emotionally engaged with whatever they are watching, whereas if the emotional state changes are high then this may indicate that they are engaged and responding emotionally to what they are watching. Figure 9b shows decreasing magnitude of emotional state changes over time.
This may indicate a person is calming and becoming less emotionally reactive to a situation, for example a person relaxing from a state of anxiety over the course of an interview process, or decreasing engagement, for example losing interest in a movie they are watching. Figure 9c is an example of a person's emotional state change intensity varying irregularly over time. Depending on the context this may indicate a person lacks emotional stability, for example where, in the absence of stimulus to evoke an emotional response, a person is going through a number of varying emotional state changes. Alternatively the varying level of emotional state change may indicate varying levels of emotional engagement with a variety of stimuli, for example when looking at pictures of a variety of holiday destinations: some may be appealing and evoke a strong positive emotional response, whereas others may evoke less strong, neutral or negative responses of varying intensities.

When responding to a stimulus such as looking at a photograph or answering a question, a person may experience a number of transient emotional state changes during a transition from one absolute emotional state to another or back to the starting absolute state. Examples of such transient emotional state changes are shown in figures 10a-10c. The plot of the transient emotional state changes in figure 10a shows a single peak, indicating a single emotional state change during the monitoring period. Figures 10b and 10c show multiple peaks within the monitoring period, indicating the subject has shown more than one emotional state change during the monitoring period; this may indicate some confusion or lack of confidence leading to several emotional state changes. The number and duration of the emotional state changes can indicate the level of confidence or confusion, with a greater number of emotional state changes indicating lower confidence or greater confusion.

All of these emotional state change trends can be interpreted differently depending on the context evoking the emotional response. Interpretation of the emotional state changes may also involve analysis of a combination of the above trends.

Emotional state change data can include data indicating the time an emotional state change occurred. This time may be a relative or absolute time. For example time stamping of the start of a sequence of images can be used to determine the time of an emotional state change relative to the start time of the sequence, or recording the time each individual image within the sequence was recorded can be used to provide an absolute or actual time the emotional state change occurred.

Correlation of emotional and cognitive responses

Embodiments of the present invention provide a method and system for correlating emotional state changes with a subject's cognitive response to stimulus. One such embodiment will now be described with reference to figure 11. The subject 1110 is monitored, using monitor 1120, during the time the stimulus is applied, and data indicative of the emotional reaction of the subject to the stimulus is automatically recorded. The data is then analysed, by the processor 1130, to provide emotional state change data for the subject during the time the stimulus was applied, for associating the emotional state change data with the cognitive response to the stimulus.
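As an illustrative sketch only, time-stamped emotional state change data and its association with stimuli could be organised as follows. The record fields and the interval-matching rule are assumptions made for illustration, not structures disclosed in the specification.

    from dataclasses import dataclass

    @dataclass
    class EmotionalStateChange:
        time: float       # seconds, relative or absolute as discussed above
        magnitude: float  # relative intensity of the change
        direction: str    # "positive" or "negative"

    @dataclass
    class Stimulus:
        applied_at: float   # time the stimulus (e.g. a question) was applied
        response_at: float  # time the cognitive response (answer) was given
        response: str       # the recorded cognitive response

    def correlate(stimuli, changes):
        """Associate each emotional state change with the stimulus whose
        question-to-answer interval contains the change's time stamp."""
        correlated = [(s, []) for s in stimuli]
        for change in changes:
            for stimulus, matched in correlated:
                if stimulus.applied_at <= change.time <= stimulus.response_at:
                    matched.append(change)
                    break
        return correlated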
The stimulus may be applied to the subject via the processor or by another means, for example manually or via another processor. The data indicative of emotional reactions of the subject can be facial image data, which is then analysed as described above, and/or other data indicative of a person's emotional reactions such as posture, vocal change, physiological changes etc. These alternatives are all considered within the scope of the present invention.

In one embodiment a sequence of stimuli is applied to the subject, the subject's emotional reactions are monitored for the duration of the application of the sequence, and the recorded data is analysed to provide emotional state change data associated with each stimulus in the sequence. For example, the recorded data may include time-stamp data indicating the timing of the application of each stimulus. Alternatively, where the timing of the application of each stimulus in the sequence is known, the relative timing of each emotional state change from the time when the first stimulus was applied can be compared with the relative timing of the application of each stimulus in the sequence. The cognitive response may also be recorded by the system 1100, and the timing of each cognitive response also used for correlation of the cognitive and emotional responses. For example, the time when a stimulus, such as asking a question, is applied could be recorded by the system 1100. The time taken by the subject 1110 between the asking of the question and the making of a cognitive response, such as answering the question, can be included in the recorded data for analysis.

In an embodiment where the subject's reactions to stimuli are monitored by recording facial images, the above described method and system for monitoring emotional state changes using facial images can be used to obtain emotional state change data. The synchronisation of the emotional state changes with the cognitive response by the subject can be based on the timing of the visual image recording. For example, the selection of images, to compare for determining emotional state changes, out of a series of images can be based on the timing of the application of the stimulus and/or the subject's response.

An example of an embodiment will now be described for application in a recruiting context, in particular recruitment of salespeople.

Research by the inventor has developed a Fuzzy Sales Recruitment and Benchmarking System (FSRBS), which employs a selling behaviour model, AHP (Analytical Hierarchy Process), expert system and soft computing techniques for selling behaviour categorisation and benchmarking of salespersons. This system uses a survey of specially formulated questions to determine a recruitment candidate's selling behavioural style based on the answers, the cognitive responses, given by the candidate to the survey questions. A detailed explanation of this behavioural model is given below.

The behavioural model for classification of a salesperson's behavioural style in this example has two dimensions, namely 'Warm-Hostile' and 'Submissive-Dominant'. This model has been used based upon interactions with senior managers in the sales and human resources arena in the consumer and manufacturing industries in Australia.
The reasons for using this particular model are:

a) The domain experts found it less complex than other models;

b) They found it easy to relate to as it mimicked their way of thinking for typifying/categorising salesperson behaviours; and

c) They found this model close to sales training programs they had undertaken.

Some of the typical salesperson characteristics that emerge from this behavioural model are described in table 1 below:

    Dominant-Hostile: The salesperson must impose their will on the customer by superior determination and strength. Selling is a struggle the salesperson must win.

    Dominant-Warm: Sales are made when customers become convinced that they can satisfy a need by buying. The salesperson's job is to demonstrate to the customer that their product would best satisfy the customer's need.

    Submissive-Hostile: Customers buy only when they are ready to buy. Since persuasion does not work, the salesperson's job is to take their order when the customer is ready to give it.

    Submissive-Warm: People buy from salespersons they like. Once a prospect becomes a friend, it is only reasonable that he should also become a customer.

    Table 1: Salesperson Behaviour Profile

Warmth is regard for others. A warm person is optimistic and willing to place confidence in others. Hostility is lack of regard for others, the attitude that other people matter less than oneself. A hostile person rarely trusts others. Submission is the disposition to let others take the lead in personal encounters. It includes traits like dependence, unassertiveness, and passiveness. Dominance is the drive to take control in face-to-face situations. It includes a cluster of traits like initiative, forcefulness, and independence. The two dimensions Submissive-Dominant and Warm-Hostile give rise to four broad groups of salespersons and customers, i.e., Dominant-Hostile (DH), Submissive-Hostile (SH), Submissive-Warm (SW), and Dominant-Warm (DW).

The four behaviour profiles described in table 1 are extreme caricatures, and most salespersons in practice will not belong totally to any one category. Therefore, fuzzy granulation has been introduced. That is, we have extended the model by introducing the fuzzy categories 'High, Med, and Low' within each category; the linguistic variables "High, Med, Low" also represent selling behaviour intensity in a particular category. So, as shown in figure 12, we have twelve clusters: SH(High), SH(Medium), SH(Low), SW(High), SW(Medium), SW(Low), DH(High), DH(Medium), DH(Low), DW(High), DW(Medium) and DW(Low). Qualitatively, the linguistic variables provide information on the intensity, or the extent to which a candidate's behaviour belongs to a fuzzy category within each category.

In addition to the 12 fuzzy categories shown in Figure 12, three transitory behaviour categories are identified. It may be noted that the selling behaviour categories are related to Abraham Maslow's model of needs and motivation. Figure 13 illustrates the hierarchy of personal needs linked with the human motivation model. Abraham Maslow said that the needs must be satisfied in the given order shown in Figure 13; that is, the most basic needs at the bottom level must be satisfied before the needs of the other levels are satisfied. In terms of the selling behaviour model, a SH salesperson is motivated by the needs of stability and security. A SW salesperson is motivated strongly by social needs and to a lesser extent by security and esteem needs. Thus, the SW salesperson sees the world as warm and accepting.
The DH salesperson is motivated by strong needs of independence and self-esteem. They exhibit their needs through a strong desire to succeed. On the other hand, a DW salesperson is motivated strongly by needs of self-realisation and independence. In order to succeed, the DW salesperson controls the situation. Therefore, a DW salesperson is thorough with product knowledge, competition and other essential skills in sales.

Additionally, in practice it has also been found that salespersons exhibit transitory selling behaviour. That is, in FSRBS transitory selling behaviour is reflected when the pruned scores in two categories (for example, SH and SW, or DH and DW, or SW and DH) are close to each other. This reflects that their need levels are moving between two categories (for example, security (SH) to social (SW)). The fuzzy selling behaviour model needs to be enhanced to incorporate these transitory behaviour categories. The enhancements are shown in Figure 14.

The purpose of the fuzzy selling behaviour model is to evaluate the primary selling behaviour category of a sales candidate before recruitment. In order to evaluate a sales candidate's selling behaviour category, seventeen (17) areas are used. These areas are selling as a profession, assertiveness, decisiveness, prospecting, product, customers, competition, success and failure, boss, peers, rules and regulations, expenses and reports, training, job satisfaction, view about people and relationship with non-selling departments.

In order to quantify the varying degree of importance attached to the different areas of selling behaviour by the domain experts (sales managers), weights have been assigned to them on a scale of 1 to 10. The Analytical Hierarchy Process (AHP) technique has been used to assign weights to each of the 17 areas.

The Analytical Hierarchy Process is a powerful tool used in flexible decision making processes to help set priorities. The AHP makes the best decision when both qualitative and quantitative aspects of a decision need to be considered. The application of the AHP in this context is to the four objectives to be considered: training, selling, customer and product.

The first step in the AHP is pair-wise comparison, by which the AHP decides the relative importance of the objectives. This is done by comparing each pair of objectives and ranking them on a scale from 1 to 9. Comparing objective i and objective j gives a value aij as follows:

    1        Objectives i and j are of equal importance
    3        Objective i is weakly more important than j
    5        Objective i is strongly more important than j
    7        Objective i is very strongly more important than j
    9        Objective i is absolutely more important than j
    2,4,6,8  Intermediate values

Therefore aii = 1, and if aij = k, then aji = 1/k. This is illustrated in Table 2, which shows the preferences on the objectives.

                Training   Selling   Customer   Product
    Training    1          1/5       1/3        1/2
    Selling     5          1         2          4
    Customer    3          1/2       1          3
    Product     2          1/4       1/3        1

    Table 2: Preferences on Objectives

Next, the AHP calculates the overall weight of each objective. These weights lie between 0 and 1 and the total weights add up to 1. The calculation is done by dividing each entry by the sum of its column (for example, in the Training column a11/(1+5+3+2) = 0.09) and then averaging the normalised entries across each row. The resulting weights (rounded) are shown in Table 3:

                Weight
    Training    0.09
    Selling     0.50
    Customer    0.30
    Product     0.13

    Table 3: Weights on Objectives

The results show that 50 percent of the domain experts' objective weight is on selling, 30 percent on customer, 13 percent on product and 9 percent on training.
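The column normalise-and-average calculation behind Table 3 can be sketched as follows. This is an illustrative reconstruction of the standard AHP weight derivation using the Table 2 values, not code from the specification.

    # AHP weight derivation from the Table 2 pairwise comparison matrix:
    # normalise each column to sum to 1, then average across each row.
    matrix = [
        # Training  Selling  Customer  Product
        [1.0,       1/5,     1/3,      1/2],   # Training
        [5.0,       1.0,     2.0,      4.0],   # Selling
        [3.0,       1/2,     1.0,      3.0],   # Customer
        [2.0,       1/4,     1/3,      1.0],   # Product
    ]

    column_sums = [sum(row[j] for row in matrix) for j in range(4)]
    normalised = [[row[j] / column_sums[j] for j in range(4)] for row in matrix]
    weights = [sum(row) / 4 for row in normalised]

    for name, w in zip(["Training", "Selling", "Customer", "Product"], weights):
        print(f"{name}: {w:.2f}")
    # Prints approximately 0.09, 0.50, 0.29 and 0.13, matching the rounded
    # percentages reported for Table 3.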
The purpose of using the fuzzy selling behaviour model is to determine the primary selling behaviour category of a sales candidate. The primary selling behaviour category according to the model accounts for the selling behaviour of a sales candidate for the majority of the time in their interaction with customers.

In this context, the seventeen areas are used to design questions for evaluating the primary selling behaviour category of the sales candidate. At least four (4) questions, one corresponding to each quadrant (DH, SH, SW and DW) of the fuzzy selling behaviour model, are designed for each area. The questions are designed to contradict each other to facilitate a pattern of commitment in the answers by the candidate. It is envisaged that the pattern of commitment is skewed towards the behaviour category.

After determining the different areas and their weights, attributes related to each of these areas with respect to the different behavioural categories have been determined. The attributes of each of these areas have been designed in the form of questions. The other parameters that have been kept in view while designing the questions are:

a) What is going to be the tone of the various questions?

b) What is going to be the length of each question?

c) What is going to be the total number of questions?

d) What is going to be the ordering of the five answer options? and

e) What is going to be the pattern of questions?

Questions related to different behaviour categories have negative undertones. It is likely that salespersons would answer such questions with apparent negative undertones in the negative, even if they felt that the particular question in fact confirmed one of their traits. Thus, in order to get accurate feedback and increase the effectiveness of the questionnaire, the negative tone of the questions is neutralised as much as possible without losing the actual meaning of the questions. This has been done by underplaying the negative tone of such questions and by introducing suitable justifications in the question itself.
The questions are designed with five answer options to provide for quick answers. The five answer options are shown in table 4:

    YES                     100%
    NO                      0%
    To a Large Extent Yes   75%
    To a Large Extent No    25%
    Not Sure                33%

    Table 4: Answering Options

The answering option sequence shown in table 4 has provided us with the best results. "Yes" and "No" are the first two options to capture the snap, immediate instinct of the candidate after reading the question. Additionally, it also results in higher commitment by the candidate towards a certain answering pattern or behavioural category. The weighting of the "Not Sure" option has been kept low because it does not contribute meaningfully towards a behavioural category or towards commitment to an answering pattern.

A sample set of four questions related to the area of competition is shown in table 5, and each question is related to one of the four behavioural categories. A total of 76 questions have been designed for the salesperson recruitment survey.

    Question 1 (Dominant-Hostile): In sales, the law of the jungle prevails. It's either you or the competitor. You relish defeating your competitors, and fight them hard, using every available weapon.
    Question 2 (Dominant-Warm): The best chance to outwork and outsell competitors is by keeping abreast of competitive activity and having sound knowledge of your product.

    Question 3 (Submissive-Hostile): You may not be aggressive otherwise, but when it comes to competition you are just the opposite. You spend a good deal of your time explaining to the customer why he should not buy from the competitor.

    Question 4 (Submissive-Warm): You do not believe in being aggressive towards your competitors. Competitors are people like you and there is room for everybody.

    Table 5: Questions for the "competition" category

The survey is carefully constructed in order to make an assessment of the cognitive responses of a candidate to determine their primary behaviour category. However, each individual answer in the survey can also be reviewed, and data presented indicating the progressive responses of the candidate, for example as shown in figure 16.

As the candidate answers the survey, their emotional reactions can be monitored to record and track the emotional state changes associated with each question in the survey.

The survey may also be constructed to enable it to be modified within set limits dependent on the analysis model, such as changing the order of the questions or having interchangeable or replacement questions, to enable the same model to be used for analysis of the acquired data regardless of the version of the survey applied. The display of the results may also be structured to accommodate any variation in the survey, for example by aggregating results for various categories rather than presenting responses to individual questions where the questions or question order changes.

The monitoring and display of emotional state changes can be performed in real time, as the survey is being answered, or the data, such as visual images, can be recorded for later analysis and correlation. For example the recorded data can be time stamped for later synchronisation with the timing of cognitive answers. Alternatively the recorded data can include both the data indicative of the subject's emotional response recorded by the monitor and the actual cognitive responses, either simultaneously recorded by the monitor or added to the recorded data by the processor.

Data can be presented correlating a candidate's cognitive response with the emotional state change experienced by the subject while answering the question, for example as shown in figures 16 and 17 for individual questions, and in Figure 18 showing the emotional and cognitive responses represented along one axis.

In some instances this data can simply be presented as an output for analysis by a person, for example to highlight inconsistencies or unexpected emotional responses compared with cognitive responses, such as is shown in Figure 18 where the candidate experiences a high intensity negative emotional state change while answering a question positively for the Dominant-Warm category; this may indicate the candidate is not comfortable with the cognitive answer given.

Also, as described above, the number and duration of a person's emotional state changes while answering a single question can be indicative of confusion regarding the question or lack of commitment to the answer given.

Some embodiments of the system can perform further analysis on the correlated cognitive response data and emotional state change data.
For example, the cognitive response may be qualified based on the associated emotional state change. In another example a series of cognitive and emotional responses can be analysed to determine trends in emotional state changes in comparison with cognitive responses. Again these emotional state change trends may be used to qualify cognitive responses. For example, additional fuzzy rules can be applied to compare a number of cognitive and emotional responses, such as comparing the emotional state change responses for all the questions with a positive cognitive answer in a particular category. Where there are consistent positive emotional responses as well as positive cognitive responses, this can be an indicator of a strong preference for the particular behavioural style, whereas inconsistent emotional responses may indicate the candidate does not have a strong preference for this style.

In the salesperson recruitment example the pruning of contradictory and superfluous answers is a method to establish the primary behaviour category. The example described below relates to the analysis and pruning of cognitive answers only to determine the overall selling behavioural style; however it is envisaged that embodiments also include analysis of emotional state changes. For example, emotional state changes can be used as supplementary information when analysing contradictory responses, or to identify contradictory answers.
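As a sketch of such qualification, reusing the illustrative record fields from the earlier correlation sketch, a positive cognitive answer accompanied by a strong negative emotional state change (or vice versa) might be flagged for review as follows; the threshold and the answer.value field are assumptions for illustration.

    def qualify_answers(correlated, threshold=0.6):
        """correlated is a list of (answer, changes) pairs, where answer.value
        is the answer weighting (e.g. 1.0 for YES, 0.0 for NO) and changes are
        the emotional state changes recorded while that question was answered.
        Returns answers flagged as possibly contradictory or low-commitment."""
        flagged = []
        for answer, changes in correlated:
            strong_neg = any(c.direction == "negative" and c.magnitude >= threshold
                             for c in changes)
            strong_pos = any(c.direction == "positive" and c.magnitude >= threshold
                             for c in changes)
            if answer.value >= 0.75 and strong_neg:
                flagged.append((answer, "positive answer, strong negative emotion"))
            elif answer.value <= 0.25 and strong_pos:
                flagged.append((answer, "negative answer, strong positive emotion"))
            elif len(changes) > 2:
                # Several transient changes during one answer may indicate
                # confusion or lack of commitment, as described above.
                flagged.append((answer, "multiple emotional state changes"))
        return flagged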
Alternatively, emotional state changes could also be used to gather insights as to preference for candidates who appear to be in transition between two styles.

One of the main issues to contend with in practice is the pruning of the contradictory and superfluous answers which candidates provide to various questions. The four criteria chosen for pruning out the contradictory and superfluous answers are:

1. The sales candidate's initial predominant category (the category based on the highest aggregated raw (unpruned) score);

2. Personal need level (based on Abraham Maslow's model);

3. The domain expert's experience of what sorts of contradictory answers sales candidates with a particular initial predominant category engage in; and

4. Accounting for secondary or masked behaviour (based on the behavioural model).

Although all the permutations and combinations of applying the four criteria are difficult to outline, the general guideline is firstly to determine the highest aggregated raw (unpruned) score over all the 17 areas of evaluation in one of the four behaviour quadrants (DH, SH, SW and DW). The aggregated score in a particular behaviour quadrant helps to establish the personal need level of the sales candidate. For example, based on Abraham Maslow's model, a highest aggregated/cumulative score in the DH quadrant will correspond to independence and control needs. The emotional state change data may also be taken into consideration for this assessment, as the fulfilling or denial of needs is often associated with emotional reactions. Thus, the answers in a particular area which reflect a higher need level than the need level corresponding to the behavioural quadrant (for example, DH) with the highest aggregated score are considered as superfluous answers. These superfluous answers reflect what a candidate would like to be as against what they actually are. The emotional state changes associated with these answers can be a secondary indicator of their superfluous or aspirational nature. On the other hand, based on the fuzzy selling behaviour model, a candidate may provide masked answers in a particular area which contradict the answer corresponding to the initial predominant category based on the highest aggregated raw (unpruned) score. The experience of the domain experts in practice is also used to determine masked or contradictory answers, and again emotional state change data may also be applied to identify contradictions.

The removal of the contradictory and superfluous masked answers results in re-adjustment of the scores in all four quadrants. The end product from this pruning exercise is a set of pruned scores in the four selling behaviour quadrants DH, SH, SW and DW.

The pruned scores in the four behaviour categories, namely DH, SH, SW and DW, are used to compute the primary fuzzy selling behaviour category. In order to compute the primary (or resultant) actual values of the fuzzy selling behaviour category score, the fuzzy selling behaviour model is employed. Qualitatively speaking, it may be noted that the primary fuzzy selling behaviour category indicates that, for the majority of the time, the sales candidate's selling behaviour will be determined by the attributes related to the overall selling behaviour category.
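A highly simplified sketch of the superfluous-answer pruning step is given below. It assumes the need-level ordering of the quadrants described earlier (SH security, SW social, DH esteem/independence, DW self-realisation) and a simple per-answer score layout; the expert-judgment and masked-answer criteria are not modelled here.

    # Maslow-style need levels of the quadrants, as described above.
    NEED_LEVEL = {"SH": 1, "SW": 2, "DH": 3, "DW": 4}

    def prune_superfluous(answers):
        """answers: list of (quadrant, score) pairs, one per answered question.
        Answers reflecting a higher need level than the initial predominant
        quadrant are treated as superfluous and removed before the scores
        are re-aggregated."""
        raw = {q: 0.0 for q in NEED_LEVEL}
        for quadrant, score in answers:
            raw[quadrant] += score
        predominant = max(raw, key=raw.get)  # initial predominant category

        kept = [(q, s) for q, s in answers
                if NEED_LEVEL[q] <= NEED_LEVEL[predominant]]

        pruned = {q: 0.0 for q in NEED_LEVEL}
        for quadrant, score in kept:
            pruned[quadrant] += score
        return pruned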
The pruned scores in the DH, SH, SW and DW primary behaviour categories are represented diagonally on each quadrant of the fuzzy selling behaviour model. The computation method involves projection of the pruned scores (DH, SH, SW and DW categories) from the diagonal of each quadrant onto the two dimensions of the fuzzy selling behaviour model (either the Hostile-Warm axis or the Dominant-Submissive axis). After projection onto the two dimensions, the projected score values are added or subtracted depending on the directional offset from the origin (0,0) of the dimension axis. For example, if one projected score (with reference to the origin) lies on the hostile end of the Hostile-Warm axis and the other lies on the warm end of the Hostile-Warm axis, then the two scores are subtracted from each other. If, however, the two scores lie on the same end, hostile or warm, with reference to the origin, then they are added together.

Similarly, if one projected score (with reference to the origin) lies on the dominant end of the Dominant-Submissive axis and the other lies on the submissive end of the Dominant-Submissive axis, then the two scores are subtracted from each other. If, however, the two scores lie on the same end with reference to the origin, then they are added together. It may be noted here that through projection of the pruned scores onto the two dimensions of the model, the intensity (Low, Medium and High) of the sales candidate's resultant score in the selling behaviour category (SH, SW, DH and DW) can be determined. After these computations, the score is projected back onto the diagonal (in the quadrant representing the maximum score category) using the Pythagorean theorem to determine the final primary/resultant score in the particular fuzzy behaviour category. Six steps are undertaken to compute the primary fuzzy selling behaviour score, as illustrated in figure 15:

Step 1: Project the four pruned scores along the Hostile-Warm axis. Compute the score using
x = Mod[((SW + DW) - (SH + DH)) * Cos 45°]
(where Mod denotes the modulus or magnitude of the score).

Step 2: Project the four pruned scores along the Dominant-Submissive axis. Compute the score using
y = Mod[((DH + DW) - (SH + SW)) * Sin 45°]

Step 3: Use the Pythagorean theorem to compute the resultant score using the formula √(x² + y²).

Step 4: Compute the percentage value of the resultant score.

Step 5: Determine the fuzzy behaviour category according to the fuzzy membership function and fuzzy categorisation rule.

Step 6: Determine the fuzzy membership of the sales candidate in other categories using the fuzzy categorisation rule.
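Steps 1 to 4 can be sketched as follows, taking SW and DW as the warm quadrants and DH and DW as the dominant quadrants, consistent with the model's axes. Note that Step 1's formula is garbled in the source text and its form here is inferred from the symmetry with Step 2; the percentage normalisation constant is an assumption for illustration.

    import math

    def resultant_score(pruned, max_axis_score=100.0):
        """Compute the primary fuzzy selling behaviour score (Steps 1-4) from
        pruned quadrant scores. pruned maps "DH", "SH", "SW" and "DW" to the
        pruned scores; max_axis_score is an assumed normalisation constant."""
        c45 = math.cos(math.radians(45))  # cos 45 degrees == sin 45 degrees
        # Step 1: projection on the Hostile-Warm axis (warm minus hostile).
        x = abs(((pruned["SW"] + pruned["DW"]) -
                 (pruned["SH"] + pruned["DH"])) * c45)
        # Step 2: projection on the Dominant-Submissive axis
        # (dominant minus submissive).
        y = abs(((pruned["DH"] + pruned["DW"]) -
                 (pruned["SH"] + pruned["SW"])) * c45)
        # Step 3: Pythagorean theorem to project back onto the diagonal.
        resultant = math.hypot(x, y)
        # Step 4: percentage value of the resultant score.
        return 100.0 * resultant / max_axis_score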
Benchmarking of correlated cognitive and emotional responses

Analysis of correlated cognitive response data and emotional state change data may also include comparison of data between individual subjects, among a group of subjects, or between a subject and model response data and emotional data.

Benchmarking is an important component of recruitment. "Benchmarking" is the process of comparing and measuring an organisation, system, process and/or product against recognised leaders anywhere in the world to gain information that will help the organisation take action to improve its performance. In simple words, benchmarking is a standard of performance.

In a recruitment context, benchmarking helps the sales/HR managers to compare the behaviour profiles of a prospective sales candidate with the existing most successful salespersons (the benchmark or benchmarks) in their organisation. The benchmarking profile of the successful salesperson in a given organisation can be considered as a cultural or person fit profile for comparison. It may be noted that the behaviour profile is constructed from the pruned scores.

For example, Figures 19a and 19b illustrate the comparison between the selling behaviour profile of one candidate 1910 and the selling behaviour profile of a benchmark salesperson 1930, and between the selling behaviour profile of another candidate 1920 and the benchmark salesperson 1930 in an organisation, respectively. The relationships are assessed by analysing the shapes of the behaviour profiles.

In Figure 19a, the profiles 1910 and 1930 are parallel and are in almost perfect correlation despite the differences in basal expression level and scale. This strong correlation implies that the selling behaviour profiles of the two individuals have similar shapes and similar scores in most selling behaviour categories. The parallel profiles indicate a behavioural and cultural fit. The degree of closeness of the parallel profiles indicates a tight or loose coupling of behavioural and cultural fit.

On the other hand, the profiles 1920 and 1930, which intersect or cross each other, represent significant variations in the selling behaviour of the two subjects being compared. On the positive side, selling behaviour profiles which intersect with the benchmark profile, as shown in Figure 19b, can be used as a way of building a varied mix of salespersons in an organisation. This mix can be used to cater to changing business needs and culture over time.

Embodiments of the present invention enable emotional state change data to be incorporated into the benchmark data. The method and system for correlating emotional state changes with a subject's response to stimulus can further be used for modelling correlated cognitive and emotional responses. For example a model may be developed based on a typical or desired individual subject, for example a model employee for a particular role.

For example a model can be developed by selecting a model subject and applying a predetermined sequence of stimuli to the subject to elicit a series of cognitive responses from the subject. The subject is monitored and their cognitive response and data indicative of their emotional response recorded for each stimulus. The data can then be analysed to provide emotional state change data for the subject during the time when each stimulus was applied, and the emotional state change data correlated with the cognitive responses. For example, the top salesperson in an organisation may be desirable to use as a model. This person can complete the candidate survey to enable their cognitive responses and associated emotional state changes to be recorded, and this then becomes the model or benchmark to be used for comparison with recruitment candidates.

Modelling correlated cognitive and emotional responses can also be based on a number of subjects, using the method above applied to a number of subjects, such as a group of salespeople, and performing further modelling analysis. The further modelling analysis may vary depending on the number of subjects.
For example, where enough subjects are monitored to provide a statistically valid sample, a model may be developed based on statistical norms within the sample. The above steps can be applied to each subject in the sample and statistical analysis performed on the data acquired from all the subjects. For example, each subject completes a sales recruitment survey and the dominant primary behavioural style is determined, along with the norm for the cognitive and emotional state change for each question, to provide the benchmark data.
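A sketch of deriving such per-question norms, and of assessing the parallel-profile fit of Figures 19a and 19b, is given below. Using a Pearson correlation coefficient as the measure of 'parallel' profiles is an assumption for illustration; the specification describes the comparison qualitatively.

    from statistics import mean

    def benchmark_norms(sample):
        """sample: one entry per subject, each a list of (cognitive, emotional)
        response values, one pair per survey question. Returns the per-question
        mean cognitive and emotional responses as the benchmark norms."""
        n_questions = len(sample[0])
        return [(mean(subject[q][0] for subject in sample),
                 mean(subject[q][1] for subject in sample))
                for q in range(n_questions)]

    def profile_correlation(candidate, benchmark):
        """Pearson correlation between a candidate's category profile and the
        benchmark profile. Values near 1 suggest parallel profiles (a fit, as
        in Figure 19a) even where basal levels and scales differ; low or
        negative values suggest intersecting profiles (as in Figure 19b)."""
        mx, my = mean(candidate), mean(benchmark)
        cov = sum((x - mx) * (y - my) for x, y in zip(candidate, benchmark))
        sx = sum((x - mx) ** 2 for x in candidate) ** 0.5
        sy = sum((y - my) ** 2 for y in benchmark) ** 0.5
        return cov / (sx * sy) if sx and sy else 0.0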
In another embodiment subjects may be classified into groups based on particular trends or similarities in individual subjects' cognitive data and/or emotional response data, and the model based on particular trends within a group. For example, a group of subjects all answer the sales recruitment survey and both cognitive and emotional responses are recorded. The recorded data can be analysed to select a group of subjects who all showed similar cognitive responses. The emotional responses of this group of subjects can then be analysed and a model developed based on the combined cognitive and emotional responses for the group. The analysis performed to develop such a model can vary depending on the size of the group and the context. For example, in a recruitment context the sequence of stimuli may be a survey relating to job behaviour and attitudes, which may produce very distinct trends in both cognitive and emotional responses that may be converted to a model by averaging the responses across the group.

In contrast, in the context of selecting a holiday destination, emotional reactions may show great variation across a sample of subjects, as the selection of a holiday destination is often highly emotive, dependent on many and varying personal influences, and subject to whim with no readily anticipated response. Thus in this context factors other than similarities in responses may be used in developing a model. For example demographic information, or the response to a particular question, such as whether the subject prefers "outdoor" or "shopping" oriented holidays, may be used to divide a group of subjects for further modelling analysis.

The advantage of such models is that they can be developed based on actual monitored emotional responses of individual subjects, correlated with the cognitive responses of one or more individuals to a selected sequence of stimuli. The sequence of stimuli can be selected based on the cognitive responses of interest or relevance for a particular context, and by recording the actual cognitive and emotional responses of selected subjects the emotional state change data for the model is acquired. This enables models to be easily developed for any context without necessarily requiring detailed psychological analysis of each context. For example, it is not necessary for every sales department to be reviewed and analysed using psychological profiling to develop a model; the model can be developed simply by applying the same survey to a sample of "model" salespeople from each team, so that a model specific to each team is developed automatically using the system.

Embodiments of the model development are applicable in a variety of contexts and with varied analysis applicable for each context; all the applicable alternatives are considered within the scope of the present invention.

As discussed above, the developed model can be used for benchmarking both cognitive and emotional responses. For example, as shown in Figures 20a-c, the cognitive and emotional responses of the subject and the benchmark can be represented on the same axes for visual comparison.

Some examples of systems utilising the above modelling and comparison are shown in figures 21 to 24. The example of Figure 21 is a sales recruitment system 2100. The candidate's cognitive text inputs 2110 in response to questions are input to an intelligent selling behaviour evaluation component 2120 (for example as described above), and the selling behaviour profile 2135 is developed by the selling behaviour profiling component 2130; this selling behaviour profile can then be benchmarked against a model by the selling behaviour benchmarking and comparison component 2140. In parallel to the analysis of the candidate's cognitive responses, a video stream 2160 of the candidate, recorded while they answered the questions, is input. The facial image extraction component 2170 extracts the relevant facial images to be analysed to determine emotional state changes for each answer; these images are analysed and processed (as described above) by the facial image processing component 2180, and the emotional state change data is extracted and correlated with the cognitive answers by the emotional state extraction and profiling component 2190. This emotional behavioural profile 2155 can then be benchmarked against the model by the emotional profile benchmarking component 2145. The results of the above analysis and benchmarking can be presented by the emotional selling behaviour profile visualisation component 2150 for ease of interpretation by the recruiter, for example as comparison graphs similar to Figure 20c showing how the profile compares to the benchmark of a good salesperson 2157. Reports showing the sales candidate's selling behaviour category and scores 2125, and further reports identifying discrepancies either between the emotional and cognitive responses of the candidate or between the candidate and the model, can also be produced.

A system 2200 for automatically providing feedback in an education context is shown in Figure 22. In this system 2200 a video of a lecture 2210 and a video of one or more students 2220 are input to the system 2200. The lecture video stream is broken down into sections based on topics or blocks of time by the lecture breakdown component 2215, and student facial images are extracted from the student video stream 2220 by the student group facial image extraction component 2230 for each lecture segment. The group facial image processing component 2240 analyses the student facial images to determine emotional state changes for the lecture segment, which are then analysed by the intelligent emotional state extraction and profiling component 2250. The emotional state changes may be from one or more students from the group, and may be analysed to determine trends, such as the level of engagement over the course of the lecture based on the intensity of emotional responses from one or more of the students. Alternatively, differences in emotional reaction between students may be identified for highlighting to the lecturer, as this may indicate that some students are confused or that there are mixed reactions to a topic; for example, the topic may need to be revisited for clarification or interactive student debate. The trends in the group engagement levels or profiles can be related to particular topics by synchronisation with the lecture segments by the synchronisation component 2260.

The student emotional profile visualisation component 2270 can present the emotional responses for interpretation by the lecturer. For example, where the lecture is being given live, the lecturer may be provided with a real time lecture adaptation decision aid 2290, such as a monitor showing the average engagement level of the students in real time, thus enabling the lecturer either to delve further into a topic if the students are showing a high level of engagement, or to take action such as changing topic or lecturing style where students are showing a lack of engagement or reducing engagement with a topic. Alternatively, where a lecture is pre-recorded, say for online delivery over the internet, in several sections, or includes supplementary learning tools such as games or quizzes associated with the lecture topics, a falling level of engagement from an online student at certain points in the lecture could trigger a change in topic or a break to play a game or do a quiz; such decisions could be made by a real-time pro-active lecture delivery adaptation and quality assurance component 2280. The system can also output information for quality assurance purposes 2295 to be reviewed by the lecturer or colleagues, or compared with similar emotional state profiling information from other lectures on similar topics to determine which lecturing or learning style is most effective for different groups of students.

A system 2300 for monitoring critical event operators is illustrated in Figure 23. This system may be used for monitoring a person in a live situation or in a simulator. For example the critical event operator may be a fire fighting team leader, an emergency response coordinator in a power station or factory, a commander of a battleship or platoon, a fighter pilot, a paramedic, a racing car driver, etc. This system could be applied in any area where a person is required to make cognitive decisions in response to ongoing events in an important and fluid situation. In this system the operator's cognitive responses 2310 and a video stream of the operator during the situation 2320 are input to the system. Similarly to the systems described above, facial images associated with cognitive responses are extracted and analysed to determine the emotional state changes experienced by the operator while making cognitive decisions, using the facial image extraction component 2330, facial image processing component 2335 and emotional state extraction component 2340. The system is adapted to analyse the cognitive behaviour 2315 and compare this with a benchmark using the cognitive behaviour component 2360, and also to benchmark the operator's emotional profile 2350, which can be presented for comparison 2370. The profile can be monitored for signs of stress and to determine where stress, particularly emotional stress, affects the cognitive behaviour of the operator. In this system the stimuli can include a structured simulation, such as a game, event, emergency, flight or battlefield simulation, or a real life event. Where a simulation is being used, the simulation may be designed to test particular aspects of people's abilities and resilience; such simulations or sets of stimuli may be modified based on the requirements of a situation. It is envisaged that the stimuli may be modified by replacement of the entire sequence; for example, it may be necessary to change the simulation if the content of the simulation becomes known, either through repeated use or through a breach of security. Alternatively parts of the sequence may be modified, either in real-time during the execution of the simulation based on feedback from the subject, for example if it appears too easy or in response to a particular cognitive or emotional reaction, or to avoid familiarity, for example by randomly changing the order of stimuli so that what is coming next cannot be predicted.

The system 2400 illustrated in figure 24 is used to monitor a person's browsing behaviour and emotional state changes when planning a holiday using the Internet. In this system the navigation behaviour 2470 of the person (traveller 2450) is tracked, for example which pages are being viewed or which thumbnail pictures are expanded for better viewing. Optionally the person's eye movement or gaze is also tracked to monitor what they are viewing, for example how long a picture is viewed or how often the gaze returns to the picture. Simultaneously, the facial expression component 2420 of the system 2400 uses a device such as a webcam to acquire a video image stream 2415 to capture the person's facial expressions, and analyses the visual images to monitor, using the above techniques, the person's non-verbal emotional responses to what is being viewed. This data can be used by the e-tourism web site 2450 to tailor the information presented to the person, for example by giving priority to, or highlighting, destinations, accommodation, tours etc. similar to those to which the person has shown a positive emotional reaction. Alternatively the emotional reactions of people can be monitored in order to provide feedback to service suppliers. For example, where a great number of people have a positive reaction when looking at a picture of one accommodation venue, then an extreme negative reaction (compared to the reaction for other accommodation venues of a similar standard) when viewing the price, this may indicate that the accommodation is overpriced.

Embodiments may also be applied in applications where interaction occurs between people, one being the subject and the other being a person providing the stimuli. For example, in a medical context the subject can be a patient and the third party providing the stimuli a doctor. The patient can be monitored to determine emotional state changes, for example reactions to questions, pain, treatment or medication. The stimuli may be applied externally, for example by the doctor asking a question or administering a drug. Alternatively the stimuli affecting the patient's emotional state may be generated by the patient themselves, for example pain, pain caused by attempted movement, frustration caused by an inability to move, or distress caused by environmental factors such as light, noise or heat.

Monitoring patterns of behaviour and application for decision making

Embodiments of the present invention can be of value when accumulating data, such as when monitoring browsing behaviour or acquiring survey data, as the emotional state changes associated with cognitive responses add a further dimension to the data accumulated and thus enable a more detailed and subtle analysis of the cognitive results by qualifying those results based on the emotional reactions associated therewith.
Data mining, as the name suggests, is the art and science of uncovering hidden patterns. Research in data mining has historically been driven by the design and refinement of data and data/web mining algorithms. The data mining algorithm driven approach has primarily focused on predictive accuracy (given a set of training data) and other technology-driven outcomes. The embodiment described herein is directed towards a more context-centred and utility-centred approach compared to technology-centred data mining approaches.

Further, data mining has its hurdles: the 'meanings' are not suggested by the data or the computers; they are imposed on the data by human beings. This problem is further exacerbated by the fact that data mining technologies are largely designed based on technology-push models as against strategy-pull models driven by business managers. In a strategy-pull model business managers make sense of a new situation by constructing meaning or knowledge based on their cognitive constructs and adapting these cognitive constructs to the dynamics of the business situation. In the process of applying and adapting the cognitive constructs or schemas, the managers may honour as well as reject pre-specified meanings and outcomes mined using historical data (as is the case in a technology-push model). The cognitive constructs also help to establish the semantic context in which data mining systems are used and interpreted.

From a social and pragmatic context, people's interpretation of meaning and application of knowledge is not entirely based on their cold cognitive scripts or rules, but is mediated by their emotional attitudes. Recent developments and the focus of organisations in the area of emotional intelligence have established the role of emotions in human decision making. Embodiments of the present invention enable emotional state change data to be automatically gathered and correlated with cognitive data for analysis, such that intelligent technologies can account for emotional factors while delivering knowledge-driven business outcomes to organisations.

Existing research has not effectively combined cognitive behaviour with the emotional or affective characteristics of users for meaningful interpretation of knowledge and decision making, and work in data mining has largely overlooked developing a social, semantic and pragmatic context based approach while addressing algorithmic and technology issues in the design of data mining systems.

Data when processed assumes an informational value. If applied within a particular context it becomes knowledge which is of value to an enterprise; otherwise it may only be viewed as the accumulation of additional information and data. The term context is used in the broadest possible sense; it encompasses any information that might be useful for defining the user's situation. There are potentially many more types of contextual information available than what is used to define a given situation.
The context model applied in this embodiment is divided into three main categories:

i) Social context: this describes the varied social units that structure work and information, organisations and teams, communities, and their distinctive social processes and practices;

ii) Semantic context: this describes the individual interpretation of a situation based upon an existing system of cognitive frameworks and constructs, goals and tasks; it represents the personal meaning or sense ascribed to information related to a certain task or situation. It is also called sensemaking. This definition can also be extended to group interpretation with some provisos; and

iii) Pragmatic context: this is the process of translating the personal interpretation or meaning into a specific behaviour or action, which is moderated by the interaction of an individual's rational characteristics with their affective (emotional) characteristics. It also includes the need for adaptation and interpretation of meaning in terms of the dynamic and evolving environment surrounding business situations and the spatio-temporal context (location and time) as applicable.

This model is applicable where there exists a goal or problem in any situation. It would be futile to identify a situation unless there is some task connected to it, no matter how mundane.

Semantic context describes the individual interpretation of a situation based upon existing (or learnt) cognitive models, goals and tasks related to the situation; it represents the personal meaning or sense ascribed to information related to a certain task or situation. This description is theoretically underpinned in the areas of sensemaking and naturalistic decision making which, as the names suggest, are about constructing (or interpreting) meaning, or making sense, of a given situation. The process of making sense involves an interplay of action and interpretation rather than the influence of evaluation on choice.

Knowledge acts as an interpretant to turn data into information. In a given situation, we may encounter familiar as well as unfamiliar or new information. The new information causes some level of dissonance, prompting the question "What's the story here?". In the process of resolving this dissonance we create knowledge. Knowledge is created through a sensemaking process.

However, the sensemaking process takes place in a context. Data to one person is someone else's information. For example, an investment banker might stare at a computer screen of numbers which would look to most people like raw data. To the investment banker, however, slight changes in the numbers convey messages which act as information they might convert to knowledge (via sensemaking) and act upon. Thus context is a key ingredient acting as an underlay to all three concepts of data, information and knowledge.

For the purpose of interpreting, constructing meaning and resolving the dissonance, people engage in organised sensemaking, which involves the use of cognitive constructs for labelling and categorising to stabilise the streaming of experience. The process of labelling and categorisation involves connecting abstract and impersonal concepts with concrete and personal concepts which are amenable to functional deployment.
For example, functional deployment may involve diagnostic labels in medicine that suggest a plausible action or treatment, or decision-related labels, such as credit card approval, which suggest plausible approval or disapproval of a credit card application. The chain connecting the abstract with the personal is also called a pattern or a schema. Intelligent sensemaking involves identifying and retrieving a set of patterns or schemas and adapting those patterns or schemas to a given situation. The underlying assumption here is that ignorance and knowledge coexist, which means that adaptive sensemaking both honours and rejects the past.

For example, nurses (and physicians), like everyone else, make sense by acting thinkingly, which means they simultaneously interpret their knowledge with trusted frameworks or cognitive structures, yet mistrust those very same frameworks by testing new frameworks and new interpretations. In other words, as in medical work, in all work people face evolving disorder. Progressive changes through time in work stipulate that a seemingly correct action "back then" is becoming an incorrect action now. This establishes the need for the cognitive structure to adapt to the evolving disorder and change. Situation comprehension using cognitive constructs can also be seen as a way of identifying and retrieving these schemas in a manner that is cost effective (in terms of time) and efficient (in terms of resources required). These constructs also help to determine the leverage points where intelligent data mining technologies may be applied.

From another perspective, the recent emergence of the area of emotional intelligence has clearly established the role of affect or emotions in human decision making. In the context of sensemaking it has helped to clarify questions such as whether intra-organizational institutions are better portrayed as cold cognitive scripts built around rules or as hot emotional attitudes built around values. Since sensemaking involves an interplay between interpretation and action, and action is mediated by the affective characteristics of an individual, these affective characteristics should be factored into meaningful or context-based interpretation of knowledge. That is, the same set of numbers may be interpreted by one investment banker somewhat differently than by another investment banker depending on their emotional attitudes. In other words, personal meaning is mediated through the affective characteristics of users.

Thus, based on context-aware feedback (situational and affective) from the user, sensemaking and intelligent data mining technologies (and software artefacts like objects and agents) are integrated in this embodiment to provide adaptive context-aware data mining systems. Semantic and pragmatic context issues can also be modelled in the context-aware data mining architecture. The context awareness in particular is captured at three levels, namely cognitive, affective and situational.

An embodiment of the present invention provides a system for applying a sensemaking model for automated decision making or decision making support. The framework of the system is illustrated in Figure 25.

The system comprises a controller in which is defined a decision making model for executing a plurality of decision making phases and applying rules in accordance with the model, and a plurality of agents each adapted to perform a function for use in the decision making process.
The controller and agents are implemented in a processor. The controller is adapted to request actions be performed by the plurality of agents in accordance with the decision making model. At least one of the agents is adapted to provide emotional state change data associated with a situation for the decision making process.

In the embodiment described, the controller and agents are arranged in a seven layer architecture. The layers are:

1. Reactive layer - including agents for raw data acquisition and basic data manipulation;
2. Intelligent technology layer - including agents for data mining and identification of patterns in the acquired data;
3. Cognitive sensemaking layer - for coordinating the use of agents associated with other layers and interpretation of data in accordance with a decision making context;
4. Affective sensemaking layer - including agents for the acquisition and analysis of emotional state change data;
5. Situation adaptation layer - including agents for monitoring decision making results and reactions thereto, and agents for analysing the reactions in the context of the situation for feedback into the decision making context and adaptation of the decision making model;
6. Distribution coordination layer - including agents for coordinating communication between user and agent; and
7. Object layer - including domain agents for use by the agents of the other layers to facilitate data processing and presentation for the user.

In this model the cognitive sensemaking layer coordinates the activity of the various layer agents in accordance with a decision making model associated with a context.

For example, as illustrated in Figure 25, the controller controls the execution of a number of decision making phases. The first phase involves data acquisition 2521 and pre-processing 2522, such as manipulation of raw data and improvement of data quality, performed by the reactive layer. The manipulation of raw data can involve basic decision making 2523 in accordance with fixed rules, such as data conversion from one format to another, or data quality improvement such as filtering to remove errors. Action which does not involve learning may be taken by reactive layer agents 2524, such as display of manipulated data. The reactive layer consists of agents which represent stimulus-response phenomena in a user-defined environment. The agents in this layer may include data aggregation agents, data transformation agents and data visualization agents, which may not need learning.

The second phase is a context elicitation phase 2530 for analysis of the data to identify a situation with a problem to be solved, for example based on patterns in the data, and for determining the decision making model to be applied. This phase is coordinated by the cognitive sensemaking layer 2510. The context elicitation phase makes use of agents from the intelligent technology layer for determination of patterns in the data to identify the context for which a decision making model is to be applied. The intelligent technology layer contains data mining agents which involve learning to find patterns from data. This layer includes clustering, fusion (e.g., fuzzy-neuro), genetic algorithm (GA) and other agents. At the procedural level, a set of rules or patterns directs inference. This set may be large, but is always closed, i.e., it corresponds to pre-formed, pre-determined insights and learnt patterns of behaviour.
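A minimal sketch of how such a controller might dispatch work across the seven layers is given below; the dispatch logic and all identifiers are illustrative assumptions, not the implementation described in the embodiment:

from enum import Enum, auto
from typing import Callable, Dict, List

class Layer(Enum):
    REACTIVE = auto()                   # raw data acquisition and manipulation
    INTELLIGENT_TECHNOLOGY = auto()     # data mining and pattern identification
    COGNITIVE_SENSEMAKING = auto()      # coordination and interpretation
    AFFECTIVE_SENSEMAKING = auto()      # emotional state change analysis
    SITUATION_ADAPTATION = auto()       # monitoring results and reactions
    DISTRIBUTION_COORDINATION = auto()  # user/agent communication
    OBJECT = auto()                     # domain ontology objects

class Agent:
    def __init__(self, name: str, layer: Layer,
                 action: Callable[[dict], dict]):
        self.name, self.layer, self.action = name, layer, action

class Controller:
    # Requests actions from agents in accordance with the decision
    # making model, layer by layer.
    def __init__(self) -> None:
        self.agents: Dict[Layer, List[Agent]] = {layer: [] for layer in Layer}

    def register(self, agent: Agent) -> None:
        self.agents[agent.layer].append(agent)

    def run_layer(self, layer: Layer, data: dict) -> dict:
        for agent in self.agents[layer]:
            data = agent.action(data)
        return data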
This level is represented by the intelligent technology layer. The context elicitation phase elicits context in a given situation by defining a set of orthogonal task based contexts based on the data patterns and inference output from the intelligent technology layer agents, identifies the decision making context, and selects the appropriate model for this context, which defines the set of rules and tasks to be applied for the context. The decision making model can include tasks for execution based on the situation and rules for making decisions based on the data.

The third phase, also controlled by the cognitive sensemaking layer, is a situation interpretation labelling phase 2540 for identifying the required tasks and data associated with the problem to be solved according to a predetermined decision making model. This phase determines functional deployment labels within a context for the situation under study and defines the selection knowledge for navigating between functional deployment labels. Defining functional labels includes identifying categories within each orthogonal context relevant to a situation, and determining conflict resolution knowledge between the various functional deployment labels within each context.

The fourth phase, controlled by the cognitive sensemaking layer 2510, is a situation action phase 2550 for executing the tasks associated with the identified problem based on the data. This situation action phase 2550 applies the data to define and model outcomes or action-related instances of interest to the user based on the rules for the context. These outcomes and actions can be output to a user or other system.

The fifth phase, controlled by the cognitive sensemaking layer 2510, is a situation adaptation phase 2560 for monitoring results from task execution and reactions external to the system to those results, including emotional reactions. Agents from the affective sensemaking layer 2555 and the situation adaptation layer 2565 are utilised to monitor and analyse these results.

The affective agents in the sensemaking (affective) layer 2555 model the affective characteristics (e.g., negative/positive emotional state) which are used for interpretation of the user's feedback and actions in a given situation. This then forms an integral part of the user action and affect profiling agent over time.

The situation-adaptation layer 2565 consists of situation monitoring and situation adaptation agents which monitor the result of the action of the system on the user/environment in a given situation (e.g., acceptance or rejection of a recommendation or prediction by the user/environment) and incorporate this feedback to adapt the actions of the situation-action phase agents.

The goals of the situation adaptation phase are to adapt existing actions to a new situation based on feedback on the actions from the user/environment and affective feedback from the user, and to construct and explore new situation-action pathways, for example by defining and modelling situation monitoring parameters, situation adaptation parameters and the user's affective (emotion) parameters, and using these models to adapt the context model for future application. This situation adaptation phase enables the external reaction to decisions to be fed back into the system so the model can be adapted on an ongoing basis in the environment in which it is deployed.
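The five phases can be read as one feedback cycle. A compact sketch of such a cycle follows; every function referenced through the model dictionary is a placeholder assumption standing in for the corresponding layer's agents:

from typing import Any, Callable, Dict

def decision_cycle(raw: Dict[str, Any],
                   model: Dict[str, Callable]) -> Dict[str, Any]:
    # Phase 1: reactive layer cleans and converts the raw data.
    data = {k: v for k, v in raw.items() if v is not None}
    # Phase 2: elicit the decision making context from data patterns.
    context = model["elicit_context"](data)
    # Phase 3: select the tasks (functional deployment labels) for it.
    tasks = model["tasks_for"](context)
    # Phase 4: execute the tasks to produce outcomes/actions.
    outcome = {name: task(data) for name, task in tasks.items()}
    # Phase 5: fold external reactions, including emotional state change
    # data, back into the model for adaptation.
    feedback = model["collect_feedback"](outcome)
    model["adapt"](context, outcome, feedback)
    return outcome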
Models may also be adapted for future application based on the external reactions.

A distribution and coordination layer 2570 is also provided, comprising agents which process data on behalf of agents in other layers in real time and in a distributed manner to meet the real-time and speed-up needs of applications. The coordination layer agents are used to coordinate communication between the user and the agents in the sensemaking (cognitive), optimization and sensemaking (affective/emotion) layers of the architecture. The coordination layer agents are also used to coordinate communication between agents in the seven layers and to maintain a blackboard type global system state representation. An object layer 2580 is used to represent the ontology of the domain objects which are manipulated by the sensemaking (cognitive) layer agents defined by a user in a particular situation.

The monitoring of reactions in the situation adaptation phase enables data relating to the external reaction to the results of the task execution to be fed back into the system for automatic adjustment or refinement of the decision making model based on reaction patterns. For example, the decision making model may comprise a set of rules for manipulation of data and algorithms to apply to the data to achieve a result, such as a credit card approval or disapproval. The result (approved or disapproved) and some key data can be presented to a controlling person (underwriter/manager) for vetting or approval of the automatically generated decision. Where automatically generated decisions are overruled by a controlling person, the reasons for the overturning can be fed back into the system for analysis and, over time, the rules adjusted in accordance with trends or patterns distinguished from the reasons for overturning decisions.

The data fed back in relation to overturned decisions can include emotional state change data reflecting the emotional state changes exhibited by the overruling person while making the decisions to overrule the automatic decision. For example, based on the emotional state changes of the overruling person, it can be determined whether the overruling was a clear decision that the person was comfortable with (positive state change) or uncomfortable with (negative state change), or a decision they were unsure about or that required substantial deliberation (multiple state changes during the decision making process and possibly a long time to make the decision). This emotional state change data, in combination with the cognitive decision and the data input to both the automated decision and the overruling decision, can be analysed, and the emotional state change data used to weight the other cognitive data during analysis for rule adaptation.
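One way the state change data might be turned into such a weight for rule adaptation is sketched below; the thresholds and the weighting scheme are assumptions for illustration only:

from typing import List

def overrule_weight(state_changes: List[float],
                    decision_seconds: float) -> float:
    # state_changes holds the signed direction (and intensity) of each
    # emotional state change detected while the person overruled the
    # automatic decision: positive values for positive shifts, negative
    # values for negative shifts.
    if not state_changes:
        return 0.5                                   # no affective evidence
    mean_shift = sum(state_changes) / len(state_changes)
    w = 0.5 + 0.5 * max(-1.0, min(1.0, mean_shift))  # comfort raises weight
    # Many state changes or a long deliberation suggest an unsure
    # decision, so shrink the weight back towards neutral.
    if len(state_changes) > 3 or decision_seconds > 60.0:
        w = 0.5 + (w - 0.5) * 0.5
    return w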
The following description illustrates: i) how a relationship manager in a financial institution engages in a sensemaking process using cognitive schemas or constructs in a customer relationship management (CRM) situation, and how their cognitive structure contextualises and leverages the use of intelligent data mining technologies; ii) that the schema adopted by the relationship manager is more consistent with a strategy-pull approach than with a schema which may be perpetuated by objects and relationships defined by a technology-push approach; and iii) how situation adaptation (in a simple form) is modelled using the agents in the situation-adaptation layer and the intelligent technology layer of an embodiment of the present invention.

A typical object-based domain ontology of banking products in a financial institution is shown in Figure 26.

Figure 27 shows the application of the five sensemaking (cognitive) layer agents. The purpose of the sensemaking (cognitive) layer is to help relationship managers to model a CRM situation. Figure 27 represents the construction level of the three behaviour levels. The pre-processing, context elicitation, situation interpretation, situation-action and situation-adaptation agents assist a relationship manager to systematize and reduce dissonance in a CRM situation. The dashed line in Figure 27 represents the situation action pathway related to credit card approval. The shaded components in the situation construction structure represent where different layers (and their corresponding agents) have been leveraged by sensemaking layer agents in the situation construction structure. For example, the reactive layer agents are used by the CRM preprocessing phase agent. The situation adaptation layer agents are leveraged by the situation adaptation phase agents. The intelligent technology layer agents are leveraged by the situation-action phase agents and the situation adaptation phase agents. The arrows in Figure 27 represent two-way communication between the sensemaking layer agents. The labels shown in Figure 27 are constructed by the relationship manager in a given CRM situation. These labels can be changed over time and can be constructed differently by different relationship managers in different CRM situations. The user action and affect profiling agent maintains a record of the situation-action pathways adopted by the user in a given CRM situation and the corresponding affective responses, if applicable (the next section will illustrate this aspect). These situation-action and affect profiles can be reused by the managers in future similar situations or for exploring a combination of situation-action pathways. The affect response feature has applications in critical event situations and warfare, or wherever affective responses play an integral role in modelling situation-action pathways.

Figure 28 shows a sample implementation of the situation adaptation layer agents for credit card approval. The credit card approval process assesses the credit level of customers based on their past commitments to the financial institution, economic ability and demographic information. The commitments to the financial institution can include the period retained with the institution.
The economic ability of the customer includes the job status of the customer (with or without a job), years working in the current company (if any), the amount of money frequently deposited with the bank, the monthly loan payment amount, etc. Demographic information includes gender, age, location, etc., to comprehensively judge the personal integrity of the customer. For example, some regions have much higher risks than others for credit card approval. In this case, customers living in a high risk area find it harder to get approval for a credit card from a bank or financial institution.
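A sketch of how such applicant variables could be encoded and scaled into the normalised form used by the prediction agent follows; the field names and the min-max bounds are illustrative assumptions:

def normalise_applicant(record: dict) -> list:
    # Assumed upper bounds used for simple min-max scaling into [0, 1].
    caps = {
        "years_with_institution": 30.0,
        "years_in_current_job": 40.0,
        "monthly_deposit": 20000.0,
        "monthly_loan_payment": 10000.0,
        "age": 100.0,
    }
    vec = [min(float(record.get(k, 0.0)) / cap, 1.0)
           for k, cap in caps.items()]
    vec.append(1.0 if record.get("employed") else 0.0)          # job status
    vec.append(1.0 if record.get("high_risk_region") else 0.0)  # location risk
    return vec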
Name: Credit Card Situation Monitoring Agent
Goals: Model the gap between the predicted situation action and actual feedback by the user/environment in a situation
Tasks: Identify and quantify user feedback (either discrete, linguistic, or continuous) for situation monitoring; model the feedback over time by developing a functional relation between situation-action and user feedback (e.g., acceptance/rejection of the NN Credit Approval Prediction agent)
Task Constraints: Dynamic business environment; feedback quality constrained by Relationship Manager
Precondition: Existing NN Credit Card Prediction Agent
Postcondition: Getting feedback from users or agents in the environment
Communicates With: Peer agents, coordination agent, intelligent technology layer agents

Table 5: Credit Card Situation Adaptation Agent Definition

Fifteen variables, shown in Table 4, are used to assess whether or not to approve a customer for a credit card. In Table 4, A1-A15 are the variables and A16 represents the credit card approval status (1 - yes, 0 - no). The data set in Table 4 is used at the procedural level for training a neural network back propagation (BP) prediction agent as shown in Figure 28.

The situation adaptation agent is responsible for adapting the weight parameters of the NN credit card approval prediction agent used for prediction. These parameters may be changed by the situation-adaptation agent to improve the performance of the NN prediction agent. Predictions produced by the prediction agent are, of course, based on the data from the database of historical data. The prediction results, in terms of their acceptance or rejection, can be assessed manually (by the manager) or by the situation monitoring agent (refer to Table 5 for its definition) shown in Figure 28, once it has been trained on the manager's feedback over time. The neural network in the situation monitoring agent compares the system's prediction with the human user/manager approval to learn the approval behaviour of the human counterpart in a CRM situation. Initially, the feedback is provided by the manager and the situation monitoring agent models the gap between the predicted variable and its acceptance or rejection by the relationship manager. Over time, with enough training and learning based on the manager's feedback, the situation monitoring agent's performance becomes comparable to that of the human agent and it takes over most of the situation assessment jobs from its human counterpart. It also triggers the situation adaptation agent in the case of negative feedback from the relationship manager in a CRM situation.

The GA based adaptation process optimises the weights of the neural network, adopting chromosome selection, crossover and mutation so as to improve the predictive behaviour of the NN credit card approval agent on an ongoing basis, based on this feedback.

Table 6: Normalised credit card approval data sample (partial; normalised values in [0, 1] for input variables A1-A15 and approval status A16 for each customer record).
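A minimal sketch of such a GA loop over the network's weight vector is given below; the population size, mutation rate and one-point crossover are assumptions, and the fitness function is taken to be the prediction accuracy of the network run with the candidate weights:

import random
from typing import Callable, List

def ga_optimise(n_weights: int,
                fitness: Callable[[List[float]], float],
                pop_size: int = 30, generations: int = 50,
                mutation_rate: float = 0.05) -> List[float]:
    # Each chromosome is one candidate weight vector for the BP network.
    pop = [[random.uniform(-1.0, 1.0) for _ in range(n_weights)]
           for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]             # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_weights)      # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_weights):                # per-gene mutation
                if random.random() < mutation_rate:
                    child[i] += random.gauss(0.0, 0.1)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)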
Figure 29 shows the performance comparison of the neural network back propagation (BP) agent without adaptation and after adaptation. In Figure 29, we can observe that when the optimized BP network is trained for 200 epochs or less, its prediction is poorer than that of the un-optimized BP network. At 400 epochs, the adapted BP network gives its best prediction of 94.7%, and after 500 epochs of training the adapted BP network shows over-fitting, with prediction falling to 91.7%.

As can be observed from the embodiments described herein, aspects of the present invention can be employed across many applications and in a variety of permutations and combinations. All such applications and modifications are considered within the scope of the present invention.

It is to be understood that, if any prior art publication is referred to herein, such reference does not constitute an admission that the publication forms a part of the common general knowledge in the art, in any country.

In the preceding description, except where the context requires otherwise due to express language or necessary implication, the word "comprise" or variations such as "comprises" or "comprising" is used in an inclusive sense, i.e. to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.

Claims (30)

1. A method for monitoring emotional state changes in a subject based on facial expressions, the method comprising the steps of:
a) capturing first facial image data of the subject and second facial image data of the subject at first and second times respectively;
b) identifying points in the facial image data indicative of selected facial features associated with emotional expression;
c) tracking relative displacement of the identified points between the first and second facial image data; and
d) processing the tracked relative displacement of the identified points using fuzzy logic to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data, wherein the emotional state change data characterises the relative direction of the emotional state change independent of an absolute emotional state of the subject.

2. A method for correlating emotional state changes with a subject's response to stimulus, the method comprising the steps of:
applying a stimulus to the subject to evoke a response from the subject;
monitoring the subject during the time the stimulus is applied to obtain emotional state change data indicative of emotional reaction of the subject to the stimulus, wherein the emotional state change data characterises the relative direction and optionally intensity of the emotional state change independent of an absolute emotional state of the subject; and
associating the emotional state change data with the response to the stimulus.

3. A method for developing an emotional response model, the method comprising the steps of:
selecting a model subject;
applying a predetermined sequence of stimuli to the subject to elicit a series of cognitive responses from the subject;
monitoring the cognitive responses of the subject to each stimulus and recording the cognitive response data;
monitoring the subject during application of each stimulus to obtain data indicative of emotional reaction of the subject to each stimulus;
analysing the data to provide emotional state change data for the subject during the time when the stimulus was applied for each stimulus, wherein the emotional state change data characterises the relative direction of the emotional state change independent of an absolute emotional state of the subject;
correlating the cognitive response data with the emotional state change data for each stimulus; and
developing an emotional response model for the sequence of stimuli based on the correlated data.

4. A method as claimed in claim 2 wherein the stimulus is a task and the step of monitoring the subject comprises monitoring the subject's cognitive actions and associated emotional reactions during the time while the task is being performed.
5. A method as claimed in any one of claims 2 to 4 wherein the step of monitoring the subject at the time the stimulus is applied to obtain emotional state change data uses the method of claim 1.

6. A method as claimed in claim 1 or claim 5 wherein the facial images are normalised to compensate for changes in head placement between the first and second images using geometric normalisation based on tracking the relative location of facial features or positions on the face or head.

7. A method as claimed in claim 6 wherein a series of facial images of the subject are electronically captured as facial image data, wherein each facial image is separated from the next by a designated time period and the first and second facial image data are selected from the series of facial images, and wherein
the step of identifying points in the facial image data indicative of a selected feature comprises the steps of:
b1) automatically locating the selected facial features in the facial image data;
b2) identifying points indicative of the facial feature associated with emotional expression; and
b3) selecting a set of pixels for each point;
and the step of tracking relative displacement of the identified points between first and second image data comprises, for each point, the steps of:
c1) identifying a set of pixels in the second image data for the point which correspond to the selected set of pixels for the point in the first image data; and
c2) comparing the relative position of the selected and corresponding sets of pixels between the first image data and the second image data.

8. A method as claimed in claim 7 further comprising the steps of:
e) repeating steps a) to d) one or more times to obtain a sequence of emotional state change data samples for a subject; and
f) processing the sequence of emotional state change data samples to identify trends for the sequence.

9. A method as claimed in claim 8 further comprising the step of:
g) determining one or more emotional state change characteristics for a subject based on the identified trends.

10. A method as claimed in any one of claims 2 to 5 and claims 6 to 7 where dependent upon claim 5 wherein the response includes a cognitive response and the method comprises applying a sequence of stimuli to the subject, monitoring the subject's emotional reactions for the duration of the application of the sequence, and analysing the captured data to provide emotional state change data associated with each stimulus in the sequence.
11. A method as claimed in claim 10 wherein the sequence of stimuli is one of a survey, game, website, e-learning course material, a face-to-face or remote lecture, health care activities, customer relationship management activities or a structured environment such as a simulation.

12. A method as claimed in claim 10 or claim 11 wherein the sequence is dynamically adapted in response to responses given during the course of the sequence.

13. A method as claimed in any one of claims 10 to 12 wherein the sequence of stimuli involves interaction between the subject and their environment.

14. A method as claimed in any one of claims 10 to 13 further comprising analysing the correlated cognitive response data and emotional state change data, wherein the emotional state change data is applied to qualify a subject's cognitive response.

15. A method as claimed in claim 14 wherein a series of cognitive and emotional responses is analysed to determine trends in emotional state changes in comparison with cognitive responses.

16. A method as claimed in claim 14 or claim 15 wherein analysis of correlated cognitive response data and emotional state change data includes comparison of data between individual subjects, among a group of subjects, or between a subject and model response data and emotional data.
17. A method as claimed in any one of claims 2 to 16 further comprising analysing correlated sets of emotional state changes for a subject, wherein the correlated sets include a first set of emotional state changes occurring over a first time interval and a second set of emotional state changes occurring over a second time interval, wherein the first time interval is a relatively long time interval compared with the second time interval, and wherein the second time interval is a time interval proximate a cognitive response.

18. A method as claimed in claim 3 or claim 5 where dependent upon claim 3 wherein a model is developed based on the correlated cognitive and emotional responses of a number of subjects sufficient to provide a statistically valid sample and includes analysing the correlated data of each subject in developing the emotional response model for the sequence of stimuli, and developing the emotional response model based on statistical norms within the sample.

19. A method as claimed in claim 18 further comprising classifying subjects into groups based on particular trends or similarities in one or more of individual subjects' cognitive data and emotional response data, and developing the emotional response model based on particular trends within a group.

20. A system for monitoring emotional state changes in a subject based on facial expressions, the system comprising:
an image capturer adapted to capture first facial image data and second facial image data of the subject at first and second times respectively; and
a processor adapted to:
identify points in the facial image data indicative of selected facial features associated with emotional expression;
track relative displacement of the identified points between the first and second facial image data; and
process the tracked relative displacement of the identified points using fuzzy logic to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data, wherein the emotional state change data characterises the relative direction and optionally intensity of the emotional state change independent of an absolute emotional state of the subject.
21. A system for correlating emotional state changes with a subject's response to stimulus, the system comprising:
a monitor adapted to monitor the subject during an application of stimulus to the subject to evoke a response from the subject to obtain emotional state change data indicative of emotional reaction of the subject to the stimulus, wherein the emotional state change data characterises the relative direction and optionally intensity of the emotional state change independent of an absolute emotional state of the subject; and
a processor adapted to associate the emotional state change with the response to the stimulus.

22. A system as claimed in claim 21 wherein the monitor captures first facial image data of the subject and second facial image data of the subject at first and second times respectively during the time the stimulus is applied, identifies points in the facial image data indicative of selected facial features associated with emotional expression, tracks relative displacement of the identified points between the first and second facial image data, and processes the tracked relative displacement of the identified points using fuzzy logic to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data.

23. A system as claimed in any one of claims 20 to 22 wherein the first and second facial images are normalised to compensate for changes in head placement between the first and second facial images using geometric normalisation based on tracking the eyes or the relative location of facial features or positions on the face or head.

24. A system as claimed in any one of claims 19 to 23 wherein the image capturer or monitor includes a visual image recorder and a series of facial images is captured as frames of a video stream, and wherein the first and second facial image data are selected from the series of facial images.
25. A system as claimed in claim 24 wherein changes between the first and second facial images indicative of emotional states are determined using an optical flow algorithm to track relative displacement of selected pixels indicative of facial features associated with emotional expression.

26. A system as claimed in any one of claims 20 to 25 wherein a sequence of stimuli is applied to the subject, and the subject's emotional reactions are monitored for the duration of the application of the sequence and the captured data analysed to provide emotional state change data associated with each stimulus in the sequence.

27. A system as claimed in claim 26 wherein the sequence is dynamically adapted in response to responses given during the course of the sequence.

28. A system adapted to present correlated cognitive and emotional state change data for use in decision making, the system comprising:
a controller arranged to apply a decision making model for executing a plurality of decision making phases and applying rules in accordance with the model; and
a plurality of agents each adapted to perform a function for use in the decision making process, at least one of the agents being adapted to provide emotional state change data associated with a decision making context, wherein the emotional state change data characterises the relative direction of an emotional state change independent of an absolute emotional state,
the controller and agents being implemented such that the controller is adapted to request actions be performed by the plurality of agents in accordance with the decision making model.
29. A system as claimed in claim 28 wherein the controller and agents are arranged in a layered architecture, the layers comprising:
a reactive layer including agents for raw data acquisition and basic data manipulation;
an intelligent technology layer including agents for data mining and identification of patterns in the acquired data;
a cognitive sensemaking layer for coordinating the use of agents associated with other layers and interpretation of data in accordance with a decision making context;
an affective sensemaking layer including agents for the acquisition and analysis of emotional state change data;
a situation adaptation layer including agents for monitoring decision making results and reactions thereto and agents for analysing the reactions in the context of the situation for feedback into the decision making context and adaptation of the decision making model;
a distribution coordination layer including agents for coordinating communication between user and agent; and
an object layer including domain agents for use by the agents of the other layers to facilitate data processing and presentation for the user,
wherein the controller controls the execution of a number of decision making phases such as: a pre-processing phase for data acquisition and manipulation of raw data; a context elicitation phase for analysis of the data to identify a situation with a problem to be solved; a situation interpretation labelling phase for identifying the required tasks and data associated with the problem to be solved according to a predetermined decision making model; a situation action phase for executing tasks associated with the identified problem based on the data; and a situation adaptation phase for monitoring results from task execution and reactions external to the system to the results, including emotional reactions, enabling these external reactions to be fed back into the system for adaptation of the predetermined model based on the external reactions, and
wherein the agents include procedures implemented in software and/or hardware for performing one or more of the functions of: interaction between the decision making system and user environment; acquisition of data; or analysing data according to the given function.

30. A system as claimed in claim 29 wherein an agent acquires first and second facial images from a system user, identifies points in the facial image data indicative of selected facial features associated with emotional expression, tracks relative displacement of the identified points between the first and second facial image data, and processes the tracked relative displacement of the identified points using fuzzy logic to produce emotional state change data for the subject for the period of time between the capture of the first facial image data and the second facial image data, wherein the emotional state change data characterises the relative direction and magnitude of emotional state changes independent of any absolute emotional state of the subject, and correlates sets of one or more emotional state changes to apply in decision making, wherein each set of emotional state changes includes the magnitude and direction of one or more emotional state changes occurring within a given time interval.
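To make the fuzzy processing of step d) of claim 1 concrete, a toy sketch is given below; the membership functions, the two rules and the tracked point names are illustrative assumptions rather than the claimed fuzzy system:

from typing import Dict, Tuple

def memberships(x: float, scale: float = 5.0) -> Dict[str, float]:
    # Triangular fuzzy memberships for a signed vertical displacement
    # (in pixels) of a tracked facial point between the two images.
    return {
        "down": max(0.0, min(1.0, -x / scale)),
        "still": max(0.0, 1.0 - abs(x) / scale),
        "up": max(0.0, min(1.0, x / scale)),
    }

def emotional_state_change(disp: Dict[str, float]) -> Tuple[str, float]:
    # disp maps tracked points (e.g. mouth corners, inner brows) to their
    # relative vertical displacement between first and second facial
    # image data.
    mouth = memberships(disp.get("mouth_corner", 0.0))
    brow = memberships(disp.get("inner_brow", 0.0))
    # Rule 1: mouth corners up and brows not lowered -> positive change.
    positive = min(mouth["up"], 1.0 - brow["down"])
    # Rule 2: mouth corners down and brows lowered -> negative change.
    negative = min(mouth["down"], brow["down"])
    if positive > negative:
        return "positive", positive
    if negative > positive:
        return "negative", negative
    return "no_change", mouth["still"]

Note that only the relative direction (and optionally intensity) of the change is produced; no absolute emotional state is inferred, consistent with the claims.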
AU2007327315A 2006-12-01 2007-11-30 Method and system for monitoring emotional state changes Active AU2007327315B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2007327315A AU2007327315B2 (en) 2006-12-01 2007-11-30 Method and system for monitoring emotional state changes

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
AU2006906746 2006-12-01
AU2006906746A AU2006906746A0 (en) 2006-12-01 Method and system for monitoring emotional state changes
PCT/AU2007/001854 WO2008064431A1 (en) 2006-12-01 2007-11-30 Method and system for monitoring emotional state changes
AU2007327315A AU2007327315B2 (en) 2006-12-01 2007-11-30 Method and system for monitoring emotional state changes

Publications (2)

Publication Number Publication Date
AU2007327315A1 AU2007327315A1 (en) 2008-06-05
AU2007327315B2 true AU2007327315B2 (en) 2013-07-04

Family

ID=39467366

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2007327315A Active AU2007327315B2 (en) 2006-12-01 2007-11-30 Method and system for monitoring emotional state changes

Country Status (2)

Country Link
AU (1) AU2007327315B2 (en)
WO (1) WO2008064431A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10770072B2 (en) 2018-12-10 2020-09-08 International Business Machines Corporation Cognitive triggering of human interaction strategies to facilitate collaboration, productivity, and learning

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090030717A1 (en) 2007-03-29 2009-01-29 Neurofocus, Inc. Intra-modality synthesis of central nervous system, autonomic nervous system, and effector data
WO2008137581A1 (en) 2007-05-01 2008-11-13 Neurofocus, Inc. Neuro-feedback based stimulus compression device
US8392253B2 (en) 2007-05-16 2013-03-05 The Nielsen Company (Us), Llc Neuro-physiology and neuro-behavioral based stimulus targeting system
JP5542051B2 (en) 2007-07-30 2014-07-09 ニューロフォーカス・インコーポレーテッド System, method, and apparatus for performing neural response stimulation and stimulation attribute resonance estimation
US8386313B2 (en) 2007-08-28 2013-02-26 The Nielsen Company (Us), Llc Stimulus placement system using subject neuro-response measurements
US8392255B2 (en) 2007-08-29 2013-03-05 The Nielsen Company (Us), Llc Content based selection and meta tagging of advertisement breaks
US20090083129A1 (en) 2007-09-20 2009-03-26 Neurofocus, Inc. Personalized content delivery using neuro-response priming data
US8151292B2 (en) 2007-10-02 2012-04-03 Emsense Corporation System for remote access to media, and reaction and survey data from viewers of the media
WO2009059246A1 (en) 2007-10-31 2009-05-07 Emsense Corporation Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
US20100250325A1 (en) 2009-03-24 2010-09-30 Neurofocus, Inc. Neurological profiles for market matching and stimulus presentation
TWI402777B (en) * 2009-08-04 2013-07-21 Sinew System Tech Co Ltd Management Method of Real Estate in Community Building
US10987015B2 (en) 2009-08-24 2021-04-27 Nielsen Consumer Llc Dry electrodes for electroencephalography
US9560984B2 (en) 2009-10-29 2017-02-07 The Nielsen Company (Us), Llc Analysis of controlled and automatic attention for introduction of stimulus material
US20110106750A1 (en) 2009-10-29 2011-05-05 Neurofocus, Inc. Generating ratings predictions using neuro-response data
WO2011133548A2 (en) 2010-04-19 2011-10-27 Innerscope Research, Inc. Short imagery task (sit) research method
US8655428B2 (en) 2010-05-12 2014-02-18 The Nielsen Company (Us), Llc Neuro-response data synchronization
US20120083668A1 (en) * 2010-09-30 2012-04-05 Anantha Pradeep Systems and methods to modify a characteristic of a user device based on a neurological and/or physiological measurement
EP2629664B1 (en) * 2010-10-19 2015-12-30 Koninklijke Philips N.V. Anxiety monitoring
US9569986B2 (en) 2012-02-27 2017-02-14 The Nielsen Company (Us), Llc System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
NL1039419C2 (en) * 2012-02-28 2013-09-02 Allprofs Group B V METHOD FOR ANALYSIS OF A VIDEO RECORDING.
WO2014066871A1 (en) * 2012-10-27 2014-05-01 Affectiva, Inc. Sporadic collection of mobile affect data
WO2014097052A1 (en) * 2012-12-20 2014-06-26 Koninklijke Philips N.V. Monitoring a waiting area
WO2016049234A1 (en) 2014-09-23 2016-03-31 Icahn School Of Medicine At Mount Sinai Systems and methods for treating a psychiatric disorder
US20160128617A1 (en) * 2014-11-10 2016-05-12 Intel Corporation Social cuing based on in-context observation
US9936250B2 (en) 2015-05-19 2018-04-03 The Nielsen Company (Us), Llc Methods and apparatus to adjust content presented to an individual
FI126359B (en) * 2015-05-26 2016-10-31 Seniortek Oy Control system and method
JP6985005B2 (en) * 2015-10-14 2021-12-22 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Emotion estimation method, emotion estimation device, and recording medium on which the program is recorded.
CN105559804A (en) * 2015-12-23 2016-05-11 上海矽昌通信技术有限公司 Mood manager system based on multiple monitoring
US20170364929A1 (en) * 2016-06-17 2017-12-21 Sanjiv Ferreira Method and system for identifying, aggregating & transforming emotional states of a user using a temporal phase topology framework
US10216983B2 (en) 2016-12-06 2019-02-26 General Electric Company Techniques for assessing group level cognitive states
US20200046277A1 (en) * 2017-02-14 2020-02-13 Yuen Lee Viola Lam Interactive and adaptive learning and neurocognitive disorder diagnosis systems using face tracking and emotion detection with associated methods
JP6520975B2 (en) * 2017-03-16 2019-05-29 カシオ計算機株式会社 Moving image processing apparatus, moving image processing method and program
WO2018218286A1 (en) * 2017-05-29 2018-12-06 Saltor Pty Ltd Method and system for abnormality detection
CN107320090A (en) * 2017-06-28 2017-11-07 广东数相智能科技有限公司 A kind of burst disease monitor system and method
GB201713829D0 (en) * 2017-08-29 2017-10-11 We Are Human Ltd Image data processing system and method
JP6910919B2 (en) * 2017-10-18 2021-07-28 株式会社日立製作所 How to evaluate the system and actions to be taken to communicate
KR102106517B1 (en) * 2017-11-13 2020-05-06 주식회사 하가 Apparatus for analyzing emotion of examinee, method thereof and computer recordable medium storing program to perform the method
US20190179970A1 (en) * 2017-12-07 2019-06-13 International Business Machines Corporation Cognitive human interaction and behavior advisor
CN108596760A (en) * 2018-05-14 2018-09-28 平安普惠企业管理有限公司 loan risk evaluation method and server
KR102166011B1 (en) * 2018-07-09 2020-10-15 주식회사 두브레인 System and method for determining cognitive impairment using touch input
FR3085221B1 (en) * 2018-08-24 2020-09-04 Pls Experience MULTIMEDIA SYSTEM INCLUDING HUMAN-MACHINE INTERACTION HARDWARE EQUIPMENT AND A COMPUTER
FR3088604B1 (en) * 2018-11-21 2021-07-23 Valeo Systemes Thermiques Interactive system with an occupant of a motor vehicle
US11222199B2 (en) 2018-12-05 2022-01-11 International Business Machines Corporation Automatically suggesting behavioral adjustments during video conferences
RU2700537C1 (en) * 2019-02-04 2019-09-17 Общество с ограниченной ответственностью "КВАРТА ВК" Method for human emotional state determining
CN110210289A (en) * 2019-04-19 2019-09-06 平安科技(深圳)有限公司 Emotion identification method, apparatus, computer readable storage medium and electronic equipment
AU2021352445A1 (en) * 2020-09-29 2023-06-08 Human Centred Innovations Pty Ltd Virtual and physical social robot with humanoid features
CN112515674B (en) * 2020-11-30 2023-07-07 重庆工程职业技术学院 Psychological crisis early warning system
CN113239794B (en) * 2021-05-11 2023-05-23 西北工业大学 Online learning-oriented learning state automatic identification method
CN113255530B (en) * 2021-05-31 2024-03-29 合肥工业大学 Attention-based multichannel data fusion network architecture and data processing method
EP4099280A1 (en) * 2021-06-04 2022-12-07 Tata Consultancy Services Limited Method and system for confidence level detection from eye features
CN114971658B (en) * 2022-07-29 2022-11-04 四川安洵信息技术有限公司 Anti-fraud propaganda method, system, electronic equipment and storage medium
CN115547501B (en) * 2022-11-24 2023-04-07 国能大渡河大数据服务有限公司 Employee emotion perception method and system combining working characteristics


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1667049A3 (en) * 2004-12-03 2007-03-28 Invacare International Sàrl Facial feature analysis system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050089206A1 (en) * 2003-10-23 2005-04-28 Rice Robert R. Robust and low cost optical system for sensing stress, emotion and deception in human subjects
US20050289582A1 (en) * 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
JP2006069358A (en) * 2004-09-01 2006-03-16 Fuji Heavy Ind Ltd Drive assist device for vehicle

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Black et al., 'Recognizing Facial Expressions in Image Sequences Using Local Parameterized Models of Image Motion', International Journal of Computer Vision, 1997, Vol. 25(1), pages 23-48. *
Khosla R. et al., 'Behaviour Profiling Based on Psychological Data and Emotional States', 2004, published in LNAI 'Knowledge-Based Intelligent Information and Engineering Systems', Vol. 3215, pages 772-779. *
Terzopoulos et al., 'Analysis and synthesis of facial image sequences using physical and anatomical models', IEEE Transactions on Pattern Analysis and Machine Intelligence, 1993, Vol. 15(6), pages 569-579. *


Also Published As

Publication number Publication date
WO2008064431A1 (en) 2008-06-05
AU2007327315A1 (en) 2008-06-05

Similar Documents

Publication Publication Date Title
AU2007327315B2 (en) Method and system for monitoring emotional state changes
Meißner et al. The promise of eye-tracking methodology in organizational research: A taxonomy, review, and future avenues
Derrick et al. Design principles for special purpose, embodied, conversational intelligence with environmental sensors (SPECIES) agents
US20220392625A1 (en) Method and system for an interface to provide activity recommendations
Simpson et al. Sociosexuality and relationship initiation: An ethological perspective of nonverbal behavior
Elias et al. Teaching the foundations of social decision making and problem solving in the elementary school
Lussier et al. Social anxiety and salesperson performance: The roles of mindful acceptance and perceived sales manager support
Peluso et al. Emotional expression
Giannakos et al. Sensor-based analytics in education: Lessons learned from research in multimodal learning analytics
Khalid et al. Determinants of trust in human-robot interaction: Modeling, measuring, and predicting
Kerns Leader life-span experience management: A practice-oriented approach
Glavey Exploring Preservice Teachers' Affective Response to Disruptive Student Behavior in an Immersive Simulation Classroom
Yates Affective intelligence in built environments
Ghazy The evolution of well-being approach within the Industry 5.0 concept
Khaled et al. THE EVOLUTION OF WELL-BEING APPROACH WITHIN THE INDUSTRY 5.0 CONCEPT
Rikkink The Game of balancing leadership behaviors: a qualitative study to disclose how leaders tailor leadership styles to be effective leaders in different kinds of situations
Connors et al. Movement Pattern Analysis (MPA): decoding individual differences in embodied decision making
Costa Procrastinate no more. How to overcome procrastination with machine learning. An exploration of design as a bridge between data science and human beings
Griffith Interactions Between Humans, Virtual Agent Characters and Virtual Avatars
Horan Participant preference in interventions in occupational health psychology: Potential implications for autonomy
Vettel et al. 2018 Human variability workshop: Insights to drive scientific innovations for human state detection and prediction
Jin Understanding operator engagement in safety-critical Chinese motorway traffic control rooms [Redacted]
Duhr Effects of non-verbal communication on leader effectiveness during task-and relation oriented behavior
Banire Attentional Model for Detecting Attention in Children with Autism Spectrum Disorder
Oggiano Staticity and Chronicity

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
PC Assignment registered

Owner name: KHOSLA, RAJIV

Free format text: FORMER OWNER WAS: LA TROBE UNIVERSITY

MK14 Patent ceased section 143(a) (annual fees not paid) or expired
NA Applications received for extensions of time, section 223

Free format text: AN APPLICATION TO EXTEND THE TIME FROM 30 NOV 2021 TO 30 JUN 2022 IN WHICH TO PAY A RENEWAL FEE HAS BEEN FILED

NB Applications allowed - extensions of time section 223(2)

Free format text: THE TIME IN WHICH TO PAY A RENEWAL FEE HAS BEEN EXTENDED TO 30 JUN 2022