WO2012117335A2 - System and method for operating and/or controlling a functional unit and/or an application based on head movement - Google Patents

System and method for operating and/or controlling a functional unit and/or an application based on head movement

Info

Publication number
WO2012117335A2
WO2012117335A2 (PCT/IB2012/050895)
Authority
WO
WIPO (PCT)
Prior art keywords
mood
head
user
movements
functional unit
Prior art date
Application number
PCT/IB2012/050895
Other languages
French (fr)
Other versions
WO2012117335A3 (en)
Inventor
Marjolein Dimmie VAN DER ZWAAG
Michelle Leigh THRASHER
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2012117335A2 publication Critical patent/WO2012117335A2/en
Publication of WO2012117335A3 publication Critical patent/WO2012117335A3/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801: Arrangements specially adapted to be attached to or worn on the body surface
    • A61B 5/6802: Sensor mounted on worn items
    • A61B 5/6803: Head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1116: Determining posture transitions
    • A61B 5/1126: Measuring movement using a particular sensing technique
    • A61B 5/1128: Measuring movement using a particular sensing technique using image analysis
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a system and a corresponding method for operating and/or controlling a functional unit and/or an application. To enable the functional unit and/or the application to be operated and/or controlled according to an accurate mental state of the user, the proposed system comprises head movement detection means (11) for detecting movements of the head of a user (1), mood determining means (12) for determining a mood of the user (1) taking into account the detected movements of the head of the user (1), and processing means (13) for operating and/or controlling at least one function of the functional unit and/or the application dependent on the determined mood.

Description

SYSTEM AND METHOD FOR OPERATING AND/OR CONTROLLING A FUNCTIONAL UNIT AND/OR AN APPLICATION BASED ON HEAD MOVEMENT
FIELD OF THE INVENTION
The present invention relates to a system and a corresponding method for operating and/or controlling a functional unit, in particular an output device.
BACKGROUND OF THE INVENTION
Unobtrusive mood detection via body posture has not been a widely studied topic so far. However, the neurological underpinning of moods shows that moods are reflected in skeletal muscular system activity, which can be observed e.g. via posture detection, image processing or facial expression analysis.
The relation between mood and body posture is typically investigated in two ways. In psychology, actors are asked to express certain moods, and the presence of several characteristics of the resulting posture is then documented. A relatively new method instead uses computer animations of humans whose joints are bent in several combinations; participants indicate what mood the animations express, and the typical joint bends belonging to certain moods are afterwards derived automatically.
From these measures three facets can be distinguished for the coding of the postures to indicate mood: body movements, postures, and gestures. For emotions, the most significant differences arose for different types of hand and arm postures and movements, and not for head and body posture. For moods, on the other hand, all three facets provide information. Table 1 provides an overview of the posture characteristics which are found to characterize positive and negative affect. Table 1 further shows an overview of the characteristics which are found to indicate happy and sad emotions.
Table 1
Posture (sad emotions): upper body collapsed/closed; weight transfer back; chest bent forward; head downward; shoulders downward; withdrawal; back of hands sideways; arms near body; small elbow bend; hands close to the chest.
Posture (happy emotions): erected body posture; weight transfer backward; head backwards; lifted shoulders; approach; back of hands sideways; arms up; large elbow bend; arms and hands in an open position.
Movement (sad emotions): least overall energy spent; least activity; less energy used.
Movement (happy emotions): more activity; more energy used; higher peak velocity of movements.
The posture differences indicated in Table 1 seem to follow a distinctive pattern, and therefore the use of posture is promising for mood recognition. However, there have been no studies reporting posture measurements in daily-life settings, so the ecological validity of the posture features still has to be estimated. For example, actors tend to exaggerate their postures since their intent is to express an emotion; posture features found in these studies might therefore be more difficult to distinguish in real-life settings.
WO 2010/070463 A1 discloses a system and a method for detecting a person's mood by measuring and interpreting the person's body reaction to a partner's touch. The system comprises a body sign detecting device and a body contact detector generating a contact signal indicative for the occurrence of body contact between the person and a partner. The system further comprises a signal processor and an output device controlled by the processor. In an analyzing mode the processor analyses the response signal from the body sign detecting device, determines the person's mood and, depending on the outcome of the determination, operates the output device.
The paper by F.H. Yusoff et al., "Temporal Modeling of Head Movement during Facial Expression via Dual Pivot Tracking System-A Case of Disgust", IMECS Proceedings of the International MultiConference of Engineers and Computer Scientists 2009, Vol. 1, pages 927-931, discloses a method to track the movement of the head of a subject by using a Dual Pivot Head Tracking (DPHT) system. DPHT is applied to subjects depicting disgust.
SUMMARY OF THE INVENTION
It is an object of the present invention to provide a system and a corresponding method for operating and/or controlling a functional unit, in particular an output device, which can be operated and/or controlled according to an accurate mental state of the user.
In a first aspect of the present invention a system for operating and/or controlling a functional unit, in particular an output device, is presented that comprises
a head movement detection means for detecting movements of the head of a user,
a mood determining means for determining a mood of the user taking into account the detected movements of the head of the user, and
a processing means for operating and/or controlling at least one function of the functional unit and/or the application dependent on the determined mood.
In a further aspect of the present invention a corresponding method is presented as well as a computer program for implementing said method.
Preferred embodiments of the invention are defined in the dependent claims. It shall be understood that the claimed method and the claimed computer program have similar and/or identical preferred embodiments as the claimed system and as defined in the dependent claims.
The invention is based on the idea to use movement information of the user's head, which in most cases is easy to obtain, in order to determine information about the user's current mood (also called user's state hereinafter). By this means an easy and completely unobtrusive way of monitoring the mood of the user can be provided. The determined mood can then be used to control and/or operate devices or applications to keep the user in the same mood or change it in a desired direction (e.g. from sad to happy).
In an embodiment of the invention the system comprises accumulating means for accumulating a count indicative for a movement activity of the head over a selected time period, wherein the mood determining means is adapted to determine the mood using the accumulation result. Thus, a way to quantify a movement activity level of the user is created. The movement activity level is suitable to indicate the mental state of the user.
In another embodiment of the invention the accumulating means are adapted to accumulate single movements of the head completed since a selected time. This is a way to detect the overall amount of movements of the head completed within a selected time period. This is a simple but highly reliable way to determine the mental state and/or the mood. The inventors of the present invention have found proof for the relationship between the amount of head movements and the personal state of mind / mood.
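The patent does not specify an implementation, but a minimal software sketch of such an accumulating means could look as follows, assuming a stream of 3-axis accelerometer samples; the sampling rate, window length and movement threshold used here are illustrative assumptions only.

```python
import math

def count_head_movements(samples, threshold=0.5, rate_hz=50, window_s=60):
    """Count discrete head movements within the selected time period.

    `samples` is a sequence of (ax, ay, az) accelerometer readings in g.
    A movement is counted each time the gravity-compensated magnitude
    rises above `threshold` after having been below it (a rising edge).
    Threshold, rate and window are assumptions, not values from the patent.
    """
    recent = list(samples)[-int(rate_hz * window_s):]  # selected time period only
    count, above = 0, False
    for ax, ay, az in recent:
        magnitude = abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)
        if magnitude > threshold and not above:
            count += 1          # one completed head movement detected
            above = True
        elif magnitude <= threshold:
            above = False
    return count
```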
Advantageously, the head movement detection means includes a movement sensor of the type whose sensitivity is dependent on accelerations of the head, the movements being calculated based on the detected accelerations. Thus, an option is provided to employ highly reliable and miniaturized accelerometers for the desired purpose.
According to an embodiment of the invention the movement sensor includes an accelerometer for attachment to the head of the user or for attachment to a device that is provided for attachment to the head of the user, in particular a helmet, head set or headphone. Semiconductor elements may be used incorporating such an accelerometer in addition to other circuitry. Such a simple single chip solution has favorable cost implications and provides a compact implementation for the movement detection means required.
In an embodiment of the invention the head movement detection means comprises a camera and the mood determining means is adapted to analyze data registered by the camera for the detection of the head movements. By analyzing the data from the camera, passive movements of the head (i.e. the head is moved along by a movement, e.g. of the torso) can easily be distinguished from active movements of the head. In this way interference effects due to passive head movements can be eliminated. In other words, to a certain extent the "noise" resulting from passive head movement activity is suppressed, thereby enabling reliable observation of the true head movement signal. The camera further has the advantage of being insensitive to acoustical interferences (e.g. caused by the beat of a provided music song). The movement may be tracked by the camera using the above mentioned DPHT method.
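As a rough illustration of camera-based head movement detection (not the DPHT method itself), the sketch below tracks the centre of a detected face with OpenCV and accumulates its frame-to-frame displacement. The camera index, duration and Haar cascade are assumptions, and separating passive from active head movement would additionally require tracking the torso or background.

```python
import cv2

def head_movement_from_webcam(seconds=10, fps=15):
    """Accumulate the displacement of the detected face centre in pixels.

    Only a coarse stand-in for the DPHT tracking referenced in the text;
    a real system would also compensate for torso/body motion in order
    to suppress passive head movements.
    """
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)          # assumed: default webcam
    prev, total = None, 0.0
    for _ in range(int(seconds * fps)):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # largest face
        centre = (x + w / 2.0, y + h / 2.0)
        if prev is not None:
            total += ((centre[0] - prev[0]) ** 2 +
                      (centre[1] - prev[1]) ** 2) ** 0.5
        prev = centre
    cap.release()
    return total
```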
Upon a significant change of the mood of the user it may be advantageous to interrupt the current operation mode of the functional device or the application and change to another operation mode. For instance, when the mood changes it may be advantageous to interrupt a currently played music song and to start playing a more suitable music piece. Therefore, according to an embodiment of the invention, a monitoring means is provided for monitoring a temporal course of a quantity characterizing the head movement for a change, wherein the mood determining means is adapted to recognize a change of the mood using information about the change of said quantity and wherein said processing means is adapted to operate and/or control the functional unit and/or the application using information about the change of the state of the mood.
According to another embodiment of the invention an output device is provided for outputting information being operated depending on the determined mood. This makes it possible to sensitively provide information that fits the current mood of the user. The mood can be detected without the user having to indicate anything him/herself. The determined or detected mood can basically be used as input to any application, or for example be fed back directly to the user to increase mood awareness. The user's mood can thus be detected unobtrusively and used as input for any application. Moreover, knowing the mood of the user can be important or helpful in many applications; e.g. a personal music selection device can automatically tell whether the user likes a piece of music, without explicitly asking the user. In a web-communication setting an automatic interpretation of a certain person's state of mind can be made.
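A minimal sketch of such a monitoring means, assuming the monitored quantity is a head-movement rate reported once per minute; the window length and the relative-change threshold are illustrative assumptions.

```python
from collections import deque

class MovementChangeMonitor:
    """Watch a head-movement quantity (e.g. movements per minute) for a
    significant change, which a processor could use as a trigger to
    interrupt the current song and start a more suitable one."""

    def __init__(self, window=5, rel_change=0.5):
        self.history = deque(maxlen=window)   # recent values of the quantity
        self.rel_change = rel_change          # assumed significance threshold

    def update(self, value):
        """Add a new observation and return True if it deviates strongly
        from the recent baseline."""
        changed = False
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            if baseline > 0 and abs(value - baseline) / baseline > self.rel_change:
                changed = True
        self.history.append(value)
        return changed
```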
Audio information, in particular in the form of a provided music song, is highly suitable to stimulate or influence the mood of a subject. Therefore the output device advantageously comprises an audio output device. This makes it possible to automatically select e.g. a certain music piece or song. Data of music rating systems that are available on the internet can now be uploaded automatically. Alternatively or in addition a visual output device, e.g. a monitor or TV, may be provided for outputting desired visual data to stimulate or influence the mood.
According to yet another embodiment of the invention the output device is adapted to output signals that are suitable to influence the mood in a predetermined way.
Knowledge of the user's mood can be used in various applications to initiate a mood change by changing the operation of an output device. Output devices that can be thought of are music output devices, light sources, or game console platforms. As a positive mood can have health advantages, users can benefit if the mood is directed to a positive state. For example, the mood can be directed to a state that is most convenient for the user to work in; this can improve work productivity. Measuring head movement during desk work activities is feasible via a webcam or an accelerometer, e.g. in a headphone. Another possibility is the use of automatic mood detection by head movement detection in entertainment systems. The mood of the gamer can be taken into account to change the game features. Another example application is the music-directs-your-mood concept: based on the effect a song has on the user's mood, songs can be selected to direct the user to a desired mood state.
According to a preferred embodiment of the invention a learning unit is provided for collecting and evaluating data related to the user's response to an output of information, the learning unit being adapted to learn based on the evaluated data which output information leads to an improved response. Such a learning process successively expands the capability of the system to direct the user's mood in a desired way.
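One way such a learning unit might be sketched in software is a running average of the head-movement response per song; the song identifiers and the averaging rule are assumptions, since the patent only states that the unit learns which output information leads to an improved response.

```python
class SongEffectLearner:
    """Illustrative learning unit: remembers, per song, how much head
    movement its playback evoked on average."""

    def __init__(self):
        self._stats = {}                     # song id -> (mean count, n)

    def record(self, song_id, movement_count):
        """Store the user's response (a head-movement count) to one playback."""
        mean, n = self._stats.get(song_id, (0.0, 0))
        self._stats[song_id] = ((mean * n + movement_count) / (n + 1), n + 1)

    def expected_effect(self, song_id):
        """Learned estimate of how strongly the song evokes head movement."""
        return self._stats.get(song_id, (0.0, 0))[0]
```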
Advantageously, the movement detection means and/or the mood determining means are integrated in or at a headphone or head set. Through this integration the number of devices to be carried is cut down and the headphone can e.g. easily be put on or taken off in one operation.
In an embodiment the mood determining means further receives an additional input from other means for determining one or more parameters including information about the user's mood. The detected head movements could thus be combined with other objective parameters of user state, e.g. physiological measures. Integration of these additional signals leads to a more robust or, in some situations, more reliable estimation of the user's mood by the mood determining means.
Further, in an embodiment the system comprises communication means for communicating the determined mood, e.g. to the user, to another person and/or to another functional unit or application. Hence, the user state derived from the head movements could be communicated not only to the user himself/herself but also to other persons, e.g. via an online application, computer or smartphone. In this way the other person can decide whether or not to take action upon this information (e.g. by calling or sending a text message) toward the user, e.g. to facilitate a change of the user state or to raise awareness of the user state.
Still further, in an embodiment display means are provided for displaying the determined mood of the user over time. Thus, the system which derived the user state from head movements can show the user state over the last time period (minutes/hours/days), e.g. as a kind of diary with bars indicating the user state. This diary makes it easier for the user to recognize his/her (mood) state patterns as he/she can now relate current state to the patterns shown over time. This leads to increased awareness of user state which could successively induce behavioral change.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter. In the following drawings
Fig. 1 shows a graph of the total time spent on head movements per minute, displayed along the ordinate, versus time along the abscissa,
Fig. 2 shows a block diagram of the general layout of the system according to the present invention,
Fig. 3 shows a block diagram illustrating an information flow during performing the method according to the invention in a first operation mode,
Fig. 4 shows a block diagram illustrating an information flow during performing the method according to the invention in a second operation mode,
Fig. 5 shows two embodiments of a headphone designed according to the invention,
Fig. 6 shows a block diagram of a first embodiment of the system according to the present invention,
Fig. 7 shows a block diagram of a second embodiment of the system according to the present invention,
Fig. 8 shows a block diagram of a third embodiment of the system according to the present invention,
Fig. 9 shows a block diagram of a fourth embodiment of the system according to the present invention, and
Fig. 10 shows a block diagram of a fifth embodiment of the system according to the present invention.
DETAILED DESCRIPTION OF THE INVENTION
In a study of the inventors of the present invention moods were induced with music in an eight minute induction period in a situation mimicking an office situation. The results of this study are illustrated in Fig. 1, showing the amount of time in each minute spent on head movements during the mood induction as a function of time. The first curve C1 in the graph is assigned to a subject in a positive high energetic mood, while the second curve in the graph is assigned to a subject in a negative low energetic mood. These results clearly indicate that head movement/tapping is significantly higher in positive energetic states compared to negative low energetic mood states.
Based on this study the conclusion has been drawn that the mood can be derived by detecting the amount of head movements. Head movement can objectively be measured by use of different means, e.g. via an accelerometer in headphones or via image processing of camera images. In desk situations the camera can, for instance, be placed in or on the monitor of the PC.
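Purely as an illustration of this conclusion (not as a validated classifier), head-movement activity could be mapped to a coarse two-class mood estimate; the cutoff value below is an arbitrary assumption, not a figure reported in the study.

```python
def estimate_mood(seconds_moving_per_minute, cutoff=5.0):
    """Coarse two-class mood estimate from head-movement activity.

    Reflects the reported observation that head movement is higher in
    positive high-energy states than in negative low-energy states.
    The 5 s/min cutoff is an illustrative assumption.
    """
    if seconds_moving_per_minute >= cutoff:
        return "positive / high energetic"
    return "negative / low energetic"
```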
Fig. 2 illustrates a system according to the present invention in a simple block diagram. The system 10 is provided for operating and/or controlling a functional unit 20 (e.g. a portion of another device such as a light or sound control system of a room or an area) and/or an application 21 (e.g. a computer program controlling the output of music or visual effects), in particular an output device 22 (e.g. an audio device, such as an MP3 player). Generally, the system 10 comprises a head movement detector 11 for detecting movements of the head of a user 1, a mood determining unit 12 (e.g. a computer or processor, or a dedicated device for this purpose) for determining a mood of the user 1 taking into account the detected movements of the head of the user 1, and a processor 13 for operating and/or controlling at least one function of the functional unit 20 and/or the application 21 dependent on the determined mood.
Fig. 3 shows a simplified block diagram illustrating an information flow when performing the method according to the invention in a first operation mode. In this mode the mood of the user is determined by the mood determining unit 12 based on information about the overall amount of movements of the user's head completed within a predetermined time period. A bidirectional communication link is employed between the processor 13 and the mood determining unit 12. The processor 13 operates an output device 22 in order to direct the user's mood. In the first operation mode the user does not interface with the system.
Fig. 4 shows a simplified block diagram illustrating an information flow when performing the method according to the invention in a second operation mode. In the second operation mode the user 1 is involved or included in a loop of information flow between the several entities of the system 10 according to the invention. The mood determining unit 12 again determines the mood of the user 1 based on information about the overall amount of movements of the user's head completed within a predetermined time period. Information about the current mood of the user 1 is then transferred to the processor 13 that controls an output device 22, e.g. an audio output device, depending on the outcome of the mood determination.
Preferably, a first interface 14 is arranged in the processor 13. Via the first interface 14 information is transferred from the processor 13 to the user 1, who in turn can transfer information via a second interface 15 arranged in the mood determining unit 12. The mood determining unit 12 records information which is entered by the user 1. For instance, the user 1 can enter data indicative of how she/he wants to feel. The audio output device 20, e.g. a music player, can learn which effect each song in a database of the system has on the head movement of the user. Then, if the user 1 for instance selects to feel more positive, music can be selected that has been shown to increase head movement and thus to improve the mood. On the other hand, when the user wants to feel more negative, songs can be presented that have been shown not to evoke head movements.
The system may take the form of a wire-bound or wireless headphone or head set or may be included therein. Fig. 5A shows a side perspective view of a first embodiment of such a headphone 30a designed according to the invention. The headphone 30a includes a pair of earphones 31, 32 adapted to fit over a user's left and right outer ears (not shown). The earphones 31, 32 are connected to one another by a headband 33 which is adapted to fit over the top of the user's head (not shown). An accelerometer 34, which represents the head movement detector 11 in this embodiment, is integrally disposed with the headband 33. The accelerometer 34 is coupled with a processor 35, which is also included in the headphone 30a according to this embodiment. The processor 35 represents the mood determining unit 12 and detects a mood of the user using the output signals of the accelerometer 34. Another processor 36, which is also included in the headphone 30a, is provided to control and/or operate an output device, e.g. an external audio source 23 connected to the headphone 30a via cable or through an RF link 24, dependent on the detected mood, e.g. based on stored user settings, a desired user control scheme or a default setting.
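A self-contained sketch of the song selection step in this second operation mode; the dictionary of learned per-song effects, its keys and its values are hypothetical.

```python
def select_song(song_effects, feel_more_positive):
    """Pick the song expected to raise head movement when the user wants
    to feel more positive, and the least activating song otherwise.

    `song_effects` maps a song id to the average head movement it has
    been observed to evoke (hypothetical values, learned elsewhere).
    """
    pick = max if feel_more_positive else min
    return pick(song_effects, key=song_effects.get)

# Hypothetical usage
effects = {"upbeat_track": 37, "calm_track": 4}
assert select_song(effects, True) == "upbeat_track"
assert select_song(effects, False) == "calm_track"
```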
Another embodiment of such a headphone 30b is shown in Fig. 5B. In this embodiment the processor 36, and preferably also the processor 35, are arranged externally (i.e. physically separate from the headphone 30b), e.g. in the external audio source 23 or in an RF reception unit (not shown) coupled to the headphone 30b via cable or wirelessly. The detected head movements are thus transmitted from the accelerometer 34 to the processor 35 via an RF transmitter 37 in the headphone 30b and an RF receiver 25 in the audio source 23.
There are various modifications and additions possible for the system 10 according to the present invention. Optionally, as shown in the embodiment of the system 10a depicted in Fig. 6, an accumulating unit 16 is provided for accumulating a count indicative for a movement activity of the head of the user 1 over a selected time period, wherein the mood determining unit 12 is adapted to determine the mood using the accumulation result. Preferably, the accumulating unit 16 is adapted to accumulate single movements of the head completed since a selected time.
Further, as also shown in Fig. 6, the head movement detector 11 includes (alternatively or in addition to the accumulating unit 16) a movement sensor 17 of the type whose sensitivity is dependent on accelerations of the head, the movements being calculated based on the detected accelerations. Preferably, the movement sensor 17 includes an accelerometer 18 for attachment to the head of the user 1 or for attachment to a device, e.g. a helmet, head set or headphone (30, as shown in Fig. 5), that is provided for attachment to the head of the user. Such movement sensors 17 and accelerometers 18 are commonly known and are thus not explained here in more detail.
In an alternative embodiment of a system 10b according to the present invention shown in Fig. 7 the head movement detector 11 comprises a camera 19, which may either be fixed to the user's head or mounted at a fixed location such that it monitors the user's head (e.g. a webcam built into or fixed to a computer monitor, or a camera built into a mobile phone). The mood determining unit 12 is adapted to analyze data registered by the camera 19 for the detection of the head movements.
Further, as also shown in Fig. 7, the system 10b optionally comprises a monitoring unit 40 for monitoring a temporal course of a quantity characterizing the head movement for a change. Said quantity may, for instance, be the repetition rate, repetition time, speed, intensity, etc. of the user's head movement. Based thereon the mood determining unit 12 is able to recognize a change of the user's mood using information about the change of said quantity. The processor 13 then operates and/or controls the functional unit 20 and/or the application 21 using information about the change of the state of the mood.
It shall be noted that in other embodiments the functional unit 20, the application 21 and/or the output device 22 are considered part of the system. In an embodiment the output device 22 is adapted for outputting information being operated depending on the determined mood. For instance, the output device 22 may comprise an audio output device and/or a visual output device and may be adapted to output signals that are suitable to influence the mood in a predetermined way.
Still further, as also shown in Fig. 7, a learning unit 41 is provided for collecting and evaluating data related to the user's response to an output of information. The learning unit 41 is adapted to learn based on the evaluated data which output information leads to an improved response, which information can be used as an additional input to the processor 13 for controlling the devices 20-22.
Optionally, acceleration identification means 42 can be provided to identify accelerations which are not caused by head movements (e.g. by the beat of a provided music song) so that only the remaining true movement activity induced acceleration signals are taken into account when determining the mood.
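As one possible (assumed) realization of such an acceleration identification means, accelerations occurring at the known beat frequency of the currently played song could be suppressed with a notch filter before the movement count is formed; the sampling rate, Q factor and the choice of a notch filter are all assumptions, not details given in the patent.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

def remove_beat_component(accel, beat_hz, fs=50.0, q=5.0):
    """Suppress accelerations at the song's beat frequency so that mainly
    self-generated head movement remains in the signal.

    `accel` is a 1-D array of acceleration samples, `beat_hz` the beat
    frequency assumed to be known from the music player, `fs` the
    (assumed) sampling rate in Hz.
    """
    b, a = iirnotch(beat_hz, q, fs=fs)        # narrow notch at the beat frequency
    return filtfilt(b, a, np.asarray(accel, dtype=float))
```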
Fig. 8 shows another embodiment of a system 10c according to the present invention. In this embodiment the mood determining means further receives an additional input from other means 43 for determining one or more parameters including information about the user's mood. The detected head movements could thus be combined with other objective parameters of user state, e.g. physiological measures (for instance from an EEG or other sensors, such as a respiration sensor 44 indicating the user's respiration rate, a blood pressure sensor 45 indicating the user's blood pressure and/or a heart rate sensor 46 indicating the user's heart rate). Integration of these additional signals leads to a more robust or, in some situations, more reliable estimation of the user's mood by the mood determining means. It shall be noted that the elements 43 to 46 may be part of the system 10c but may also be external elements as shown in Fig. 8.
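A trivial sketch of how such a combination might be computed, assuming each modality has already been reduced to a normalised mood score in [0, 1]; the normalisation and the equal default weighting are assumptions.

```python
def fuse_mood_scores(head_movement_score, physio_scores, weights=None):
    """Weighted average of the head-movement based mood score and scores
    derived from other sensors (respiration, blood pressure, heart rate).
    Higher values stand for a more positive/energetic state here."""
    scores = [head_movement_score] + list(physio_scores)
    if weights is None:
        weights = [1.0] * len(scores)              # assumed equal weighting
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Hypothetical example: head movement, respiration, heart rate
print(fuse_mood_scores(0.8, [0.6, 0.7]))           # -> approximately 0.70
```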
Further, in the embodiment of the system 10d depicted in Fig. 9 a communication unit 47 is provided for communicating the determined mood, e.g. to the user, to another person and/or to another functional unit or application. Hence, the user state derived from the head movements could be communicated not only to the user himself/herself but also to other persons, e.g. via an online application, computer or smartphone. In this way the other person can decide whether or not to take action upon this information (e.g. by calling or sending a text message) toward the user, e.g. to facilitate a change of the user state or to raise awareness of the user state.
Still further, in the embodiment of the system 10e depicted in Fig. 10 display means 48, e.g. a small screen or a display, are provided for displaying the determined mood of the user over time. Thus, the user state can be shown over the last time period (minutes/hours/days), e.g. as a kind of diary with bars indicating the user state. This diary makes it easier for the user to recognize his/her (mood) state patterns as he/she can now relate the current state to the patterns shown over time. This leads to increased awareness of the user state, which could successively induce behavioral change.
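A toy sketch of such a diary-style display, rendering hourly mood scores as text bars; the scores, hour labels and bar width are purely illustrative.

```python
def print_mood_diary(hourly_scores, width=20):
    """Render the determined mood over the last hours as simple bars.

    `hourly_scores` is a list of (label, score) pairs with scores in
    [0, 1]; higher means a more positive/energetic determined mood.
    """
    for label, score in hourly_scores:
        bar = "#" * int(round(score * width))
        print(f"{label:>5}  {bar:<{width}}  {score:.2f}")

# Hypothetical diary of the last three hours
print_mood_diary([("09:00", 0.30), ("10:00", 0.55), ("11:00", 0.80)])
```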
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims.
In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single element or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
A computer program may be stored / distributed on a suitable non-transitory medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. System for operating and/or controlling a functional unit and/or an application, in particular an output device, said system (10, 10a, 10b) comprising:
a head movement detection means (11) for detecting movements of the head of a user (1),
- a mood determining means (12) for determining a mood of the user (1) taking into account the detected movements of the head of the user (1), and
a processing means (13) for operating and/or controlling at least one function of the functional unit and/or the application dependent on the determined mood.
2. System according to claim 1, further comprising an accumulating means (16) for accumulating a count indicative for a movement activity of the head over a selected time period, wherein the mood determining means (12) is adapted to determine the mood using the accumulation result.
3. System according to claim 2, wherein the accumulating means (16) is adapted to accumulate single movements of the head completed since a selected time.
4. System according to claim 1, wherein the head movement detection means (11) includes a movement sensor (17) of the type whose sensitivity is dependent on accelerations of the head, the movements being calculated based on the detected accelerations.
5. System according to claim 4, wherein the movement sensor (17) includes an accelerometer (18) for attachment to the head of the user (1) or for attachment to a device (30) that is provided for attachment to the head of the user, in particular a helmet, head set or headphone.
6. System according to claim 1, wherein the head movement detection means (11) comprises a camera (19) and wherein the mood determining means (12) is adapted to analyze data registered by the camera for the detection of the head movements.
7. System according to claim 1, further comprising a monitoring means (40) for monitoring a temporal course of a quantity characterizing the head movement for a change, wherein the mood determining means (12) is adapted to recognize a change of the mood using information about the change of said quantity and wherein said processing means (13) is adapted to operate and/or control the functional unit and/or the application using information about the change of the state of the mood.
8. System according to claim 1, further comprising an output device (20, 21, 22), in particular an audio output device and/or visual output device, for outputting information being operated depending on the determined mood.
9. System according to claim 8, wherein the output device (20, 21, 22) is adapted to output signals that are suitable to influence the mood in a predetermined way.
10. System according to claim 1, further comprising a learning unit (41) for collecting and evaluating data related to the user's response to an output of information, the learning unit being adapted to learn based on the evaluated data which output information leads to an improved response.
11. System according to claim 1, further comprising an input interface (16) for entering information into the system.
12. System according to claim 1, wherein the head movement detection means (11) and/or the mood determining means (12) are integrated in or at a headphone (30) or head set.
13. System according to claim 1, further comprising an acceleration identification means (42) for identifying accelerations which are not caused by head movements.
14. Method for operating and/or controlling a functional unit and/or an application, in particular an output device, said method comprising the steps of:
- detecting movements of the head of a user (1),
- determining a mood of the user (1) taking into account the detected movements of the head of the user (1), and
- operating and/or controlling at least one function of the functional unit and/or the application dependent on the determined mood.
15. Computer program comprising program code means for causing a computer to carry out the steps of the method as claimed in claim 14 when said computer program is carried out on the computer.
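For illustration only, the following non-normative Python sketch walks through the method of claims 14 and 15 in combination with the accumulation of claim 2: head movements are counted from accelerometer samples over a time window, the count is mapped to a coarse mood estimate, and one function of an output device is adapted accordingly. The gravity removal, the 0.3 g threshold and the count-to-mood mapping are assumptions chosen for the example, not values taken from the disclosure.

```python
# Illustrative, non-normative sketch: count head movements from accelerometer
# samples over a window, map the count to a coarse mood estimate and adapt one
# function of an output device depending on the determined mood.
import math

def count_head_movements(samples, threshold_g: float = 0.3) -> int:
    """samples: iterable of (ax, ay, az) accelerations in units of g."""
    count, in_movement = 0, False
    for ax, ay, az in samples:
        # Subtract 1 g to remove gravity from the magnitude (assumed approach).
        magnitude = abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)
        if magnitude > threshold_g and not in_movement:
            count, in_movement = count + 1, True      # rising edge = one movement
        elif magnitude <= threshold_g:
            in_movement = False
    return count

def determine_mood(movement_count: int) -> str:
    # Very coarse example mapping: much head movement taken as a more agitated state.
    if movement_count > 20:
        return "agitated"
    if movement_count > 5:
        return "active"
    return "calm"

def control_output(mood: str) -> str:
    # Example control action: select music intended to influence the mood.
    playlist = {"agitated": "slow, calming tracks",
                "active": "neutral tracks",
                "calm": "user's favourite tracks"}
    return playlist[mood]

# 50 resting samples followed by 8 short head movements (synthetic data).
window = [(0.0, 0.0, 1.0)] * 50 + [(0.9, 0.2, 1.0), (0.0, 0.0, 1.0)] * 8
mood = determine_mood(count_head_movements(window))
print(mood, "->", control_output(mood))
```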
PCT/IB2012/050895 2011-03-01 2012-02-27 System and method for operating and/or controlling a functional unit and/or an application based on head movement WO2012117335A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP11156412.6 2011-03-01
EP11156412 2011-03-01

Publications (2)

Publication Number Publication Date
WO2012117335A2 true WO2012117335A2 (en) 2012-09-07
WO2012117335A3 WO2012117335A3 (en) 2013-01-31

Family

ID=45809364

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2012/050895 WO2012117335A2 (en) 2011-03-01 2012-02-27 System and method for operating and/or controlling a functional unit and/or an application based on head movement

Country Status (1)

Country Link
WO (1) WO2012117335A2 (en)



Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008520311A (en) * 2004-11-23 2008-06-19 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Depression detection device
CA2605239A1 (en) * 2005-05-02 2006-11-09 University Of Virginia Patent Foundation Systems, devices, and methods for interpreting movement
GB2437250C (en) * 2006-04-18 2012-08-15 Iti Scotland Ltd Method and system for monitoring the condition of livestock

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010070463A1 (en) 2008-12-15 2010-06-24 Koninklijke Philips Electronics N.V. Method and device for automatically creating a romantic atmosphere, and method and system for mood detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
H. YUSOFF ET AL.: "Temporal Modeling of Head Movement during Facial Expression via Dual Pivot Tracking System-A Case of Disgust", IMECS PROCEEDINGS OF THE INTERNATIONAL MULTICONFERENCE OF ENGINEERS AND COMPUTER SCIENTISTS, vol. 1, 2009, pages 927 - 931

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110049724A (en) * 2016-12-14 2019-07-23 三菱电机株式会社 Condition estimating device

Also Published As

Publication number Publication date
WO2012117335A3 (en) 2013-01-31

Similar Documents

Publication Publication Date Title
US11839473B2 (en) Systems and methods for estimating and predicting emotional states and affects and providing real time feedback
US10735831B2 (en) System and method communicating biofeedback to a user through a wearable device
JP4481682B2 (en) Information processing apparatus and control method thereof
CN103561652B (en) Method and system for assisting patients
US20140085101A1 (en) Devices and methods to facilitate affective feedback using wearable computing devices
TWI432994B (en) Apparatus and method for sensory feedback
US20130245396A1 (en) Mental state analysis using wearable-camera devices
US11986300B2 (en) Systems and methods for estimating and predicting emotional states and affects and providing real time feedback
KR20160044269A (en) Health management method and system using a wearable device
CN104023802B (en) Use the control of the electronic installation of neural analysis
KR102089002B1 (en) Method and wearable device for providing feedback on action
JP6742380B2 (en) Electronic device
JP7207468B2 (en) Output control device, output control method and program
WO2019086856A1 (en) Systems and methods for combining and analysing human states
CN111656304A (en) Communication method and system
JP2019030557A (en) Presentation device, presentation method, emotion estimation server, emotion estimation method, and emotion estimation system
WO2014116826A1 (en) Mobile, neurally-assisted personal assistant
KR102048551B1 (en) System and Method for Virtual reality rehabilitation training using Smart device
Hänsel et al. Wearable computing for health and fitness: exploring the relationship between data and human behaviour
Mimouna et al. A survey of human action recognition using accelerometer data
WO2012117335A2 (en) System and method for operating and/or controlling a functional unit and/or an application based on head movement
KR102323576B1 (en) System for Determining Dementia Using Public Wireless Medical Device Based on IoT
KR20190047644A (en) Method and wearable device for providing feedback on exercise
Nalepa et al. AfCAI systems: Affective Computing with Context Awareness for Ambient Intelligence. Research proposal.
Peter et al. Physiological sensing for affective computing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12707655

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12707655

Country of ref document: EP

Kind code of ref document: A2