GB2310377A - Processing device for monitoring psychological condition - Google Patents

Processing device for monitoring psychological condition

Info

Publication number
GB2310377A
Authority
GB
United Kingdom
Prior art keywords
user
processing device
recognition system
output
psychological condition
Prior art date
Legal status
Withdrawn
Application number
GB9603949A
Other versions
GB9603949D0 (en)
Inventor
Andrew Gerald Stove
Current Assignee
Koninklijke Philips NV
Original Assignee
Philips Electronics NV
Priority date
Filing date
Publication date
Application filed by Philips Electronics NV filed Critical Philips Electronics NV
Priority to GB9603949A priority Critical patent/GB2310377A/en
Publication of GB9603949D0 publication Critical patent/GB9603949D0/en
Publication of GB2310377A publication Critical patent/GB2310377A/en
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/486 Bio-feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Psychiatry (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Educational Technology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The device 20 has a recognition system 21, particularly an image recognition system such as a video camera, for generating data 32 representing the psychological condition of the user, the data being used to adapt the interface between the user and the device so that an output is provided to the user which aims to improve this condition, namely the mood of the user. The user is monitored to determine whether such improvement has taken place and the adaptation to the interface may be reversed if the user's mood has not improved as desired. The image recognition system may identify visual features of the user and/or the positions of the hands or limbs, such features/positions being representative of the psychological state. Skin conductivity sensors may instead be employed as the recognition system.

Description

DESCRIPTION

PROCESSING DEVICE WITH USER MONITORING

This invention relates to a processing device having a man-machine interface including a user monitoring system. In particular, the invention concerns a processing device which alters its interface with a user depending upon the way in which the user operates the device.
Systems have been proposed with man-machine interfaces which may adapt according to the behaviour of the user. For example, the article "Adaptive Personalised Interfaces - A Question of Viability" in "Behaviour and Information Technology", Volume 4, pages 31 - 45, describes a system for finding telephone numbers which automatically adapts the order of its menus to ensure that the entries for which a given user searches most often are found most quickly.
This system, however, fails to identify whether the changes made to the interface are in line with the user's desires or expectations.
The invention therefore aims to provide a processing device having a man-machine interface which is adaptive so as to operate according to the desires of the user, but without requiring conscious updating from the user.
According to the invention, there is provided a processing device having a man-machine interface comprising a user input and an output to the user, the device further comprising a non-intrusive recognition system for generating data representative of a psychological condition of the user, wherein the output to the user is varied by the processing device in dependence upon the data, in order to obtain a change in the psychological condition, the condition being monitored by the recognition system in order to determine whether the change has taken place.
The non-intrusive recognition system enables the determination of the user's satisfaction with the interface. As an example of a possible recognition system, systems for detecting changes in the stress level of a human being are known, for example lie detectors. Generally, such a device comprises a number of electrodes to be held by a person undergoing interrogation and a bridge circuit for measuring the variation in skin conductivity between the electrodes. The skin conductivity is known to increase during periods of anxiety. European patent serial number 0 545 497 discloses a driver monitoring system for motor vehicles which adapts the information presented to a user depending upon a stress level of the driver determined using conductivity-based stress sensors.
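The conductivity-based stress sensing described above can be sketched as follows. This is a minimal illustration only: the thresholds, units and function name are hypothetical and do not appear in the patent, which specifies only that skin conductivity increases during periods of anxiety.

```python
def stress_level(conductance_microsiemens, baseline_microsiemens):
    """Map a skin-conductance reading to a coarse stress estimate.

    Skin conductivity rises during anxiety, so the ratio of the current
    reading to the user's resting baseline serves as a simple indicator.
    The 1.2 and 1.5 thresholds are illustrative assumptions.
    """
    ratio = conductance_microsiemens / baseline_microsiemens
    if ratio < 1.2:
        return "calm"
    if ratio < 1.5:
        return "alert"
    return "anxious"
```

A bridge circuit of the kind described would supply the conductance readings; the baseline would be established while the user is at rest.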
Although the use of skin conductivity sensors may comprise a non-intrusive recognition system, since the user is not required to participate actively in generating input, such a system may not be desirable from a practical point of view. Requiring contact between the user and the recognition system may be undesirable, since this constrains the user's freedom.
Furthermore, when the user is constantly in contact with the recognition system, he or she may consciously or subconsciously adapt his or her interaction with the processing device.
Therefore, although the invention may employ conductivity sensors, it is preferred that the recognition system comprises a non-contact recognition system, such as an image recognition system for obtaining images of the user, the image recognition system having means for identifying visual features of the user which are representative of the psychological state of the user. The use of a visual system also enables a wider range of information concerning the psychological condition of the user to be identified.
Various face recognition systems are under development which can be used to recognise expressions such as happiness, sorrow, anger or fear. One such approach for a face identification system is disclosed in an article entitled "Automatic Face Identification System Using Flexible Appearance Models" published in Image and Vision Computing, Vol. 13, No. 5, June 1995.
The use of an image recognition system enables the user to be less aware of the physical presence of the recognition system so that he may interact more freely with the processing device.
The visual features identified may comprise features of the face as described above, and/or positions or movements of the hands or limbs of the user. Thus, the recognition system may identify facial expressions or hand gestures.
The user input may be for inputting control commands in response to variables presented to the user by the output, and the variation to the output may then comprise selection from alternative values of the variables for presentation to the user. For example, the variables presented to the user may comprise parameters incorporated into a game, and the psychological condition of the user may determine the subsequent progression of the game through selection of the values for the variables which are subsequently presented to the user. For example, the output may comprise situations presented to the user in a "role play" type game, with the path through the "role play" game being adapted according to the psychological state of the player.
The variables presented to the user may comprise sets of options for the control of the processing device, and the variation to the outputs may then comprise selection from alternative sets of options. In this case, the user may be presented with various options for controlling a program, such as a word processing program, and the psychological state of the user (which may indicate the user is in a hurry or finds the response of the processing device too slow) will determine the type of options or menus presented to the user.
The workload of the user may also have an influence on the adaptation of the interface, since his or her behaviour and mood change according to this workload.
The outputs to the user may also be varied in dependence upon information concerning the conscious inputs to the processing device supplied by the user. In this way, the way in which the processing device is being used may further enable adaptation of the interface with the user.
Embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings, in which: Figure 1 shows the process of conventional interaction with the machine, and Figure 2 shows, in schematic form, a processing device in accordance with the invention.
In the preferred embodiment of the invention, a processing device includes a video camera, the signals from which are processed by a processing unit so as to identify facial expressions and hand gestures of the user. The signals are used to generate data representative of a psychological condition of the user so as to update the form of the interaction of the processing device 20 with the user.
Figure 1 shows the conventional interaction between a user and a machine, such as a computer running a software program. The user generates an input 10 which is supplied to an input/output portion 12 of the processing device 20. The processing unit 14 decodes the user's input and from this determines the action to be taken by the processing device 20. An appropriate response 16 is generated which may be displayed to the user through a display device 18. The input 10 by the user may involve the use of a keyboard, mouse, light pen or other input device.
Systems have been devised which analyse the inputs 10 of the user in order to determine how the user is operating the processing device 20. In this way, the response 16 fed to the user may be adapted according to the user's perceived needs. However, it is not possible to determine unobtrusively (namely without requiring an input from the user) from such a system whether the user is satisfied with the evolution of his or her interface with the processing device 20. Such adaptivity of the interface will of course be advantageous when it makes the system's behaviour correspond more closely with the needs of the user. However, this adaptivity may be dangerous if the user does not understand how the interface is being altered.
The need therefore arises for improved interaction between the user and the processing device 20. This improved interaction could allow greater use to be made of the wide range of functionality that microcontrollers now give household electronics goods. In order to enable the user to make the best use of the functionality of such goods, the machine should have a knowledge of the particular user so that the machine may respond to the needs and wishes of a particular user.
Figure 2 shows a processing device according to the invention in which the interaction between the user and the processing device 20 depends upon the psychological state of the user.
A video camera 22 obtains images of the user which are supplied to a processing unit 24 which can extract information 26 relating to hand movements of the user and information 28 relating to the facial expression of the user, using separate algorithms. Analysis of these physiological aspects of the user by an interpretation unit 30 enables data 32 representative of a psychological condition of the user to be supplied to the remainder of the processing device 20.
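The pipeline of Figure 2, from extracted hand and face information through the interpretation unit 30 to the mood data 32, can be sketched as below. The feature names and the simple scoring rule are illustrative assumptions, not taken from the patent.

```python
def interpret(hand_features, face_features):
    """Combine hand-movement and facial-expression cues into a single
    mood label, mirroring the role of interpretation unit 30.

    Both arguments are dictionaries of boolean cues; all keys and the
    scoring scheme are hypothetical.
    """
    score = 0
    if face_features.get("mouth_corners_up"):
        score += 1  # a smile suggests contentment
    if face_features.get("brow_lowered"):
        score -= 1  # a frown suggests displeasure
    if hand_features.get("rapid_repetitive_movement"):
        score -= 1  # e.g. drumming fingers suggests impatience
    if score > 0:
        return "content"
    if score < 0:
        return "frustrated"
    return "neutral"
```

In the device, separate algorithms in processing unit 24 would populate the two feature dictionaries from the video signal before this interpretation step runs.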
The use of the video camera 22 enables the recognition system 21 to rely upon non-intrusive inputs. In other words, the user continues conscious dialogue with the processing device 20 and at the same time the recognition system 21 receives information generated subconsciously by the user. This avoids the need to interrupt the user's conventional dialogue with the machine.
Subconscious information can be received because human beings communicate with one another in many ways in addition to the speaking of words. For example, emblems may be used consciously to replace words, for example a pointing indication or the showing of a number of fingers to represent a number. Systems have been proposed for recognition of hand positions in an interface device. This enables the user to avoid physical contact with the processing unit if this is desired. However, such emblems are conscious signals generated by the user, and therefore replace the more conventional input to the machine. The present invention is concerned with non-emblematic gestures (for example events or actions which are subconscious or triggered subconsciously, such as yawning or the expression of certain emotions). Various systems have been devised for obtaining information concerning the visual facial expressions of a subject.
Such face identification techniques can be divided into two main categories: those employing geometrical features and those using grey-level information.
Techniques based on geometrical features use a number of dimensional measurements, or the locations of a number of control points for classification.
However, geometrical features are expression and 3D orientation dependent, so that explicit methods of normalisation must be employed. Generally, this involves the normalisation of geometrical distances with respect to distances invariant to expression, such as the distance between the eyes.
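The normalisation step described here, dividing geometrical distances by an expression-invariant distance such as the inter-ocular distance, can be sketched as follows. The landmark names and the particular ratios computed are illustrative assumptions.

```python
import math

def normalised_features(landmarks):
    """Normalise facial distances by the inter-ocular distance, which
    is invariant to expression, so the features become independent of
    image scale. Landmark keys map to (x, y) tuples; names are
    hypothetical.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    eye_dist = dist(landmarks["left_eye"], landmarks["right_eye"])
    return {
        "mouth_width": dist(landmarks["mouth_left"],
                            landmarks["mouth_right"]) / eye_dist,
        "eye_to_mouth": dist(landmarks["left_eye"],
                             landmarks["mouth_left"]) / eye_dist,
    }
```

Because every measurement is divided by the inter-ocular distance, scaling all landmark coordinates by the same factor (as happens when the user moves towards or away from the camera) leaves the features unchanged.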
Face identification using grey-level techniques involves analysis of the components of face images. One method involves approximating any face by a weighted sum of "Eigenfaces".
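The "weighted sum of Eigenfaces" idea can be illustrated in miniature: a centred face vector is projected onto a set of orthonormal eigenface vectors to obtain its weights, and the weighted sum plus the mean face reconstructs the original. This is a toy pure-Python sketch with invented tiny dimensions, not the method of any cited system.

```python
def eigenface_weights(face, mean_face, eigenfaces):
    """Project a face vector onto orthonormal eigenfaces: the weight
    for each eigenface is the dot product with the mean-centred face."""
    centred = [f - m for f, m in zip(face, mean_face)]
    return [sum(c * e for c, e in zip(centred, ef)) for ef in eigenfaces]

def reconstruct(mean_face, eigenfaces, weights):
    """Approximate a face as the mean face plus the weighted sum of
    eigenfaces, as the grey-level approach describes."""
    face = list(mean_face)
    for w, ef in zip(weights, eigenfaces):
        face = [f + w * e for f, e in zip(face, ef)]
    return face
```

A real system would learn the mean face and eigenfaces from a training set of face images via principal component analysis; here they are simply given.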
The article entitled "Face Recognition by Computer" by I. Craw and P. Cameron in Proc. Br. Machine Vision Conf., Springer-Verlag, Berlin 1992, pages 409 - 507, discloses systems based upon each of these approaches.
Such face recognition enables data 32 representing the psychological condition of the user to be generated. This condition is effectively the mood of the user, such as anger, fear or happiness, which can be determined from the visual information. The data 32 is fed to the processing unit 34 which generates the output to the user depending upon the psychological condition of the user. The input/output interface 12 supplies the generated output to the display device 18.
Furthermore, in the preferred embodiment shown, although not essential, the conscious input 10 of the user may also be supplied to an adaptation device 36 from the input/output interface 12, since it may be desirable to adapt the intended meaning of the user's input 10 according to his or her psychological condition. Thus, the adaptation device 36 receives the data 32 representing the psychological condition of the user, together with the conscious user input 10. From these inputs, a moderated input is generated by the adaptation device 36 and is fed to the processing unit 34 so that the processing unit 34 generates the response of the processing device 20 depending upon the psychological condition of the user, the actual user input 10, as well as the moderated interpretation 38 of the user input 10.
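The role of adaptation device 36, moderating a conscious command in the light of the user's mood, might be sketched as below. The command names, mood labels and moderation policy are all hypothetical; the patent specifies only that the intended meaning of the input may be adapted according to the psychological condition.

```python
def moderate_input(command, mood):
    """Sketch of adaptation device 36: produce a moderated
    interpretation of a conscious command given the user's mood.

    Returns a dictionary describing the action and whether a
    confirmation dialogue should be shown.
    """
    if command == "delete":
        # A frustrated or hurried user is spared the confirmation
        # dialogue; otherwise confirmation is requested for safety.
        return {"action": "delete", "confirm": mood != "frustrated"}
    return {"action": command, "confirm": False}
```

The moderated interpretation 38 would then be passed to processing unit 34 alongside the raw input 10 and the mood data 32.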
In accordance with the invention, the output supplied to the user through the display 18 is adapted according to the psychological condition of the user.
Furthermore, when the man-machine interface is adapted in this way, the response of the user, in terms of his or her emotional state, is monitored. In this way, if the adaptation of the interface does not have the desired effect of improving the mood of the user, as estimated by the psychological condition evaluated by the recognition system, then the adaptation effected may be reversed. Thus, the aim of the adaptation of the interface may be to improve the user's mood rather than to alter the interface in a way which is only perceived to be beneficial to the user.
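The adapt-monitor-reverse cycle described above can be expressed as a short sketch. The settings keys, mood labels and mood ranking are illustrative assumptions; the patent specifies only that an adaptation is reversed if the monitored mood does not improve.

```python
def adapt_and_monitor(settings, change, measure_mood, ranking):
    """Apply an interface adaptation, then keep or reverse it
    according to whether the user's monitored mood improved.

    `measure_mood` is a callable returning a mood label (standing in
    for the recognition system); `ranking` orders moods from worst
    to best. Returns the settings and whether the change was kept.
    """
    before = measure_mood()
    original = {k: settings[k] for k in change if k in settings}
    settings.update(change)                  # adapt the interface
    after = measure_mood()
    if ranking.index(after) <= ranking.index(before):
        settings.update(original)            # mood did not improve:
        for k in change:                     # reverse the adaptation
            if k not in original:
                settings.pop(k, None)
        return settings, False
    return settings, True
```

In practice the second mood measurement would follow some period of interaction with the adapted interface rather than occurring immediately.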
The processing device of the invention may have applications in various fields. Almost all of the current input from humans to computers, either explicitly or when the computer is embedded in other devices, is concerned either with intellectual activities (such as work or solving puzzles) or with games which require rapid reactions. The system of the invention may be applied to systems in both of these categories. For example, in a game situation, the adaptation to the interface may comprise selecting an appropriate situation within the game for presentation to the user. Thus, the level of difficulty or the type of information presented to the user may be continuously selected in order to maintain the user's interest. In a problem solving environment, one example of the use of the device according to the invention is the tailoring of menus to the specific abilities of the user. For example, an experienced user of a particular software package may become frustrated or bored when too many safety procedures are incorporated into the interface with the user. Therefore, an experienced user of the software may prefer more rapid access into different areas of the software without repeatedly being asked, for example, to give confirmation that a certain piece of work is to be deleted or is not to be saved.
Such frustration would be determined by the analysis of the psychological condition of the user. These considerations apply equally to the control of domestic appliances, such as video recorders, where a menu type environment is provided.
A further situation where the processing device of the invention may be applied is where a choice of sources of entertainment is presented to the user, such as choices of television channels, of pieces of music or of literary pieces such as poetry. In this case, menus may be narrowed by the processing device to the type of work which is appropriate for the user's particular mood.
Furthermore, a user's enjoyment of a particular category of entertainment (for example poems of a certain type) may be estimated according to whether emotions are displayed by the user which are expected from those entertainment sources. The emotions expected for a particular piece of entertainment may be evaluated from keywords associated with the work, and the interface may again be adapted according to the user's enjoyment, as expressed by his or her psychological state.
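The keyword-based enjoyment estimate described above might look like the following sketch. The keyword-to-emotion table and the scoring rule are invented for illustration; the patent states only that expected emotions may be evaluated from keywords associated with the work.

```python
# Hypothetical mapping from keywords attached to a work to the
# emotion that work is expected to evoke in the user.
EMOTION_KEYWORDS = {
    "elegy": "sorrow",
    "melancholy": "sorrow",
    "comic": "happiness",
    "triumphant": "happiness",
}

def estimated_enjoyment(work_keywords, displayed_emotions):
    """Estimate enjoyment as the fraction of expected emotions that
    the recognition system actually observed the user display.
    Returns None when the work's keywords yield no expectation.
    """
    expected = {EMOTION_KEYWORDS[k] for k in work_keywords
                if k in EMOTION_KEYWORDS}
    if not expected:
        return None
    matched = expected & set(displayed_emotions)
    return len(matched) / len(expected)
```

A high score would indicate that the entertainment category suits the user's current mood, and the menus could then be narrowed towards similar works.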
The embodiment of the invention described relies upon the use of a video camera, since this provides a very non-intrusive method of receiving subconscious data from the user, and enables a wide range of information to be obtained. Of course, other types of sensor may be employed which do not require a conscious input from the user, such as pressure sensors or conductivity sensors. Systems which employ speech recognition may also, in future, include analysis of the tone of the user's voice. This may also be used to evaluate the psychological condition of the user without requiring considered input from the user.
Although the preferred embodiment described requires a video camera, this will not significantly increase the expense of more advanced processing systems. In particular, the trend for integration of functions into single devices may lead to the integration of a computer system with, for example, a video phone which already requires a video camera. Thus, integration of a processing device as a controller for a video phone would allow the use of the video phone video camera.
From reading the present disclosure, other modifications will be apparent to persons skilled in the art. Such modifications may involve other features which are already known in the design and use of electrical or electronic circuits and component parts thereof and which may be used instead of or in addition to features already described herein. Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present application also includes any novel feature or any novel combination of features disclosed herein either explicitly or implicitly or any generalisation of one or more of those features which would be obvious to persons skilled in the art, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.

Claims (8)

1. A processing device having a man-machine interface comprising a user input and an output to the user, the device further comprising a non-intrusive recognition system for generating data representative of a psychological condition of the user, wherein the output to the user is varied by the processing device in dependence upon the data, in order to obtain a change in the psychological condition, the condition being monitored by the recognition system in order to determine whether the change has taken place.
2. A processing device as claimed in claim 1, wherein the recognition system comprises an image recognition system for obtaining images of the user, the image recognition system having means for identifying visual features of the user which are representative of the psychological condition of the user.
3. A processing device as claimed in claim 2, wherein the visual features comprise visual features of the face of the user.
4. A processing device as claimed in claim 2 or 3, wherein the visual features comprise positions or movements of one or both hands of the user.
5. A processing device as claimed in any preceding claim, wherein the user input is for inputting control commands in response to variables presented to the user by the output, and the variation to the output comprises selection from alternative values of the variables for presentation to the user.
6. A processing device as claimed in claim 5, wherein the variables presented to the user comprise sets of options for the control of the processing device, and the variation to the output comprises selection from alternative sets of options for presentation to the user.
7. A processing device as claimed in any preceding claim, wherein the output to the user is also varied in dependence upon information concerning the inputs to the processing device supplied by the user.
8. A processing device substantially as described herein with reference to and as shown in the accompanying drawings.
GB9603949A 1996-02-24 1996-02-24 Processing device for monitoring psychological condition Withdrawn GB2310377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9603949A GB2310377A (en) 1996-02-24 1996-02-24 Processing device for monitoring psychological condition


Publications (2)

Publication Number Publication Date
GB9603949D0 GB9603949D0 (en) 1996-04-24
GB2310377A true GB2310377A (en) 1997-08-27

Family

ID=10789350

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9603949A Withdrawn GB2310377A (en) 1996-02-24 1996-02-24 Processing device for monitoring psychological condition

Country Status (1)

Country Link
GB (1) GB2310377A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993002622A1 (en) * 1991-08-07 1993-02-18 Software Solutions Limited Operation of computer systems
US5343871A (en) * 1992-03-13 1994-09-06 Mindscope Incorporated Method and apparatus for biofeedback
WO1995002989A1 (en) * 1993-07-20 1995-02-02 Ultramind International Limited Video display apparatus
WO1997001984A1 (en) * 1995-06-30 1997-01-23 Samuel Ron Speech-based biofeedback method and system


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999028809A2 (en) * 1997-11-28 1999-06-10 Egon Stephan Method for directing a dialogue between an individual or a number of users and a computer
WO1999028809A3 (en) * 1997-11-28 1999-07-22 Egon Stephan Method for directing a dialogue between an individual or a number of users and a computer
WO2001087158A1 (en) * 2000-05-12 2001-11-22 Commonwealth Scientific And Industrial Research Organisation Computer diagnosis and screening of psychological and physical disorders
WO2004064638A1 (en) * 2003-01-24 2004-08-05 Pedro Monagas Asensio Mood analysing device for mammals
US11623517B2 (en) * 2006-11-09 2023-04-11 SmartDriven Systems, Inc. Vehicle exception event management systems
US9261967B2 (en) 2011-02-21 2016-02-16 Koninklijke Philips N.V. Gesture recognition system
US11884255B2 (en) 2013-11-11 2024-01-30 Smartdrive Systems, Inc. Vehicle fuel consumption monitor and feedback systems
US11734964B2 (en) 2014-02-21 2023-08-22 Smartdrive Systems, Inc. System and method to detect execution of driving maneuvers
WO2017162506A1 (en) 2016-03-22 2017-09-28 Koninklijke Philips N.V. Using visual context to timely trigger measuring physiological parameters
CN108882853A (en) * 2016-03-22 2018-11-23 皇家飞利浦有限公司 Measurement physiological parameter is triggered in time using visual context
US10813593B2 (en) 2016-03-22 2020-10-27 Koninklijke Philips N.V. Using visual context to timely trigger measuring physiological parameters
CN108882853B (en) * 2016-03-22 2021-11-09 皇家飞利浦有限公司 Triggering measurement of physiological parameters in time using visual context

Also Published As

Publication number Publication date
GB9603949D0 (en) 1996-04-24


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)