JP2007068620A - Psychological condition measuring apparatus - Google Patents

Psychological condition measuring apparatus

Info

Publication number
JP2007068620A
JP2007068620A JP2005256231A
Authority
JP
Japan
Prior art keywords
psychological state
subject
temperature
data
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2005256231A
Other languages
Japanese (ja)
Inventor
Akihiro Baba
Daisaku Horie
Takayuki Ito
Yoshiyuki Toso
孝幸 伊藤
大作 保理江
善行 十都
昭洋 馬場
Original Assignee
Konica Minolta Holdings Inc
コニカミノルタホールディングス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Holdings Inc, コニカミノルタホールディングス株式会社 filed Critical Konica Minolta Holdings Inc
Priority to JP2005256231A priority Critical patent/JP2007068620A/en
Publication of JP2007068620A publication Critical patent/JP2007068620A/en
Pending legal-status Critical Current

Abstract

PROBLEM TO BE SOLVED: To provide a psychological state measuring apparatus capable of classifying a subject's psychological state with high accuracy and, based on that classification, accurately measuring the level, that is, the strength, of the psychological state in the subject.
SOLUTION: A psychological state measuring device comprising: means for acquiring image information of a subject; means for classifying the psychological state of the subject using the image information; means for acquiring physiological information of the subject; and psychological state level calculating means for calculating the level of the psychological state of the subject based on the physiological information and the psychological state classification result.
[Selection] Figure 7

Description

  The present invention relates to a psychological state measuring apparatus that classifies a psychological state of a subject and measures the level of the psychological state.

  In recent years, in fields such as marketing and product design support, there is increasing demand for knowing what impressions and emotions customers and users psychologically have toward products and services. Knowing the customer's psychological state helps in designing and selling products and services that satisfy customers.

  Moreover, if the customer's psychological state can be determined automatically with a measuring device, without asking the customer questions or obtaining comments directly, the scope and effectiveness of such information can be greatly expanded.

  Other situations in which automatically determining a person's psychological state with a measuring device is highly useful include, for example, the user interface field, such as adapting a GUI (Graphical User Interface) according to the user's difficulties. There are also medical and nursing applications, such as daily health checks and automatic nurse calls triggered by detecting fatigue or distress in persons with speech disabilities, including conditions of which the persons themselves are unaware.

  In order to satisfy such needs, means for measuring and estimating the psychological state of customers and the like are necessary, and various methods have been studied and various applications have been attempted.

  Various techniques for recognizing and discriminating an expression from a face image of a subject have been announced. It is well known that a subject's facial expression reflects his or her psychological state. Methods that use the subject's facial expression have therefore been studied as a means of determining the subject's psychological state.

  A technique has been proposed for measuring a predetermined psychological state and its level from facial expressions, using image-based facial expression recognition (see, for example, Patent Document 1).

  In Patent Document 1, average face image data corresponding to a predetermined psychological state, acquired from a large number of subjects, are stored in advance for each stage as facial expression data that change stepwise with the level of the psychological state. The average face images representing these stepwise expressions are collated with the face image of the subject being measured, and the expression stage represented by the best-matching average face image is output as the level of the predetermined psychological state of the subject.

  Since this method requires only photographing a face image, it places no particular burden on the subject. Once face image data of stepwise-changing expressions have been collected in advance for each predetermined psychological state, the facial expression can easily be recognized and the level of the predetermined psychological state measured simply by collating the subject's image with the stepwise average face images.
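As an illustration, the stepwise collation described above can be sketched as follows. This is a minimal sketch, not the actual method of Patent Document 1: the feature vectors, the number of stages, and the function names are hypothetical placeholders for the stored average face images and the matching process.

```python
from math import dist

# Hypothetical stepwise templates: average facial-feature vectors for one
# predetermined psychological state at five expression levels.
# The values are illustrative only, not real measurement data.
STEPWISE_TEMPLATES = {
    1: (0.1, 0.0, 0.1),
    2: (0.3, 0.1, 0.2),
    3: (0.5, 0.2, 0.4),
    4: (0.7, 0.4, 0.6),
    5: (0.9, 0.6, 0.8),
}

def level_by_average_face_matching(subject_features):
    """Return the expression stage whose average-face template lies closest
    (in Euclidean distance) to the subject's feature vector."""
    return min(STEPWISE_TEMPLATES,
               key=lambda lvl: dist(subject_features, STEPWISE_TEMPLATES[lvl]))
```

For example, a subject whose features fall near the stage-3 template would be reported as level 3 for that psychological state.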

  However, the relationship between the level of a psychological state and the degree of change in facial expression actually differs from subject to subject. In other words, even if matching against an average face can determine the psychological state from the facial expression, it is difficult to determine the level, that is, the strength, of the psychological state accurately from the degree of expression change alone.

  On the other hand, it is well known that when a subject is in a predetermined psychological state, the physiological state of the subject's body changes according to the level of that psychological state. For example, data obtained by measuring physiological reactions such as the subject's brain waves or blood flow are known to reflect the level of the predetermined psychological state that influenced them. Methods that use data representing the physiological state of the subject's body have therefore been studied as a means of determining the level, that is, the strength, of the subject's psychological state.

  A technique has been proposed for measuring a plurality of data representing the physiological state of a subject and displaying and outputting the level of a predetermined psychological state based on those data (see, for example, Patent Document 2).

  In Patent Document 2, a plurality of data representing the physiological state of a care recipient are measured and acquired, and the level of a predetermined psychological state is presented through an icon display or similar output. This informs nurses and caregivers whether an abnormal state exists and how to respond, thereby reducing their burden.

  However, while the method using data representing a physiological state is highly accurate at detecting a change from the normal state to some psychological state, its accuracy is insufficient for identifying which psychological state it is.

  For example, biometric measurements are common in fields such as sports and medicine, but such physiological data are effective for measuring the level only when the subject can be assumed to be in a specific psychological situation, for example through an applied external stimulus. When the psychological state itself cannot be identified, its level cannot be obtained.

  In addition, a technique using the subject's body temperature data as data representing the physiological state has been proposed. Measuring physiological responses such as brain waves or blood flow requires attaching sensors to the subject, which restricts the subject's behavior. Body temperature data, by contrast, allow the level, that is, the strength, of the subject's predetermined psychological state to be measured without contact and without such constraints.

  Obtaining body temperature data does not necessarily require touching the subject's body: the temperature of each part of the subject can be measured from a temperature image captured with infrared light, so physiological data can be collected without burdening the subject. If the correlation between the body temperature of each body part and the level of a predetermined psychological state, gathered from a large number of subjects, is prepared in advance as reference data, the level of the predetermined psychological state can easily be measured simply by collating against those data.

However, as with other physiological data, while this approach is highly accurate at detecting a change from the normal state to some psychological state, its accuracy is insufficient for identifying which psychological state it is.
JP 2001-43345 A
JP 2004-49309 A

  As described above, a technique that recognizes a subject's facial expression from face image information and measures the psychological state generally lacks sufficient accuracy to determine the level, even when the psychological state itself can be determined. Conversely, a method using physiological data lacks sufficient accuracy to obtain the level of the psychological state unless the psychological state has first been identified.

  The present invention solves the above problems, and aims to provide a psychological state measuring apparatus that can accurately classify the psychological state of a subject into one or more predetermined psychological states and, based on that classification, accurately measure the level, that is, the strength, of the psychological state in the subject.

  The present invention has the following features in order to solve the above problems.

  (1) A psychological state measuring device comprising: image information acquisition means for acquiring image information of a subject who is the target of psychological state measurement; first dictionary data that associates the image information with a predetermined psychological state; psychological state classification means for classifying the psychological state of the subject into the predetermined psychological state based on the image information, with reference to the first dictionary data; physiological information acquisition means for acquiring physiological information of the subject; and psychological state level calculating means for calculating, based on the physiological information, the level of the psychological state of the subject as classified by the psychological state classification means.

  (2) The psychological state measuring apparatus according to (1), wherein the image information acquired by the image information acquisition means is face image information of the subject, and the psychological state classification means has a function of classifying the psychological state of the subject based on the facial expression in the face image information.

  (3) The psychological state measuring apparatus according to (2), wherein the facial expression of the face image information is information relating to the subject's blinking, gaze duration, or pupil size.

  (4) The psychological state measuring apparatus according to any one of (1) to (3), wherein the physiological information acquired by the physiological information acquisition means is information relating to the body temperature of the subject.

  (5) The psychological state measuring apparatus according to any one of (1) to (4), comprising second dictionary data that associates the physiological information acquired by the physiological information acquisition means with the predetermined psychological state, wherein the psychological state classification means has a function of classifying the psychological state of the subject based on the image information and the physiological information of the subject, with reference to the second dictionary data.

  (6) The psychological state measuring apparatus according to any one of (1) to (5), wherein the predetermined psychological state is a psychological state representing an emotion of the subject toward an arbitrarily set object.

  (7) The psychological state measuring apparatus according to any one of (1) to (5), wherein the predetermined psychological state is a psychological state representing a physiological state of the subject's body.

  (8) The psychological state measuring apparatus according to any one of (1) to (7), comprising third dictionary data that associates the physiological information acquired by the physiological information acquisition means with the level of the psychological state corresponding to the predetermined psychological state, wherein the psychological state level calculation means has a function of calculating, with reference to the third dictionary data and based on the physiological information of the subject, the level of the psychological state of the subject as classified by the psychological state classification means.

  According to the present invention, by acquiring the subject's image information and the subject's physiological information and integrating the two, the subject's psychological state can be classified accurately, and on that basis it is possible to provide a psychological state measuring apparatus capable of accurately measuring the level, that is, the strength, of the psychological state in the subject.

  Embodiments according to the present invention will be described with reference to the drawings.

  The present embodiment measures the psychological state of a test subject indirectly, without directly questioning the subject or obtaining comments from the subject, using observable data: for example, a visible image of the subject's appearance, or physiological data that can be measured without touching the subject.

  There are many applications that require measuring the psychological state without bothering the subject. As an embodiment of the present invention, the case of investigating the psychological impressions of a subject watching program content, such as television, will be described as an example.

(Example of application using psychological state measuring device)
FIG. 1 is a diagram showing a television program viewing information survey, an example of an application that uses a psychological state measuring apparatus according to an embodiment of the present invention to measure what psychological state a viewer of a television program is in during viewing.

  In FIG. 1, 10 is a viewer of a television program, corresponding to the subject of the psychological state measurement. Reference numeral 11a denotes an input device, such as a camera, for image information relating to the subject; it also serves as an input device for body temperature images, that is, for physiological information. Reference numeral 13 denotes an apparatus that provides content such as programs to the viewer; here, a television is used.

  Reference numeral 14a denotes a data collection device. The psychological state measuring device may be incorporated in the data collection device 14a, in the image input device 11a, or in both.

  Conventionally, the data collection device 14a could collect only data indicating whether content such as a television program was being viewed, that is, whether the television was turned on and which program was being watched. In the present embodiment, data are also collected by measuring the viewer's impression of the program, that is, the psychological state of the subject. In the application example shown in FIG. 1, not only the conventional program audience rating but also data such as program preference can be collected automatically.

  The configuration of the psychological state measuring apparatus according to this embodiment will be described with reference to FIGS. 2 and 4. FIG. 2 shows a configuration example in which an image input device is included as hardware among the components of the psychological state measuring device, and FIG. 4 shows a configuration example in which it is not.

(Configuration example of psychological state measuring device)
FIG. 2 is a block diagram showing the configuration of the psychological state measuring apparatus in its minimum form, comprising an input unit and a processing unit. In FIG. 2, the input unit is shown as hardware that directly images the subject.

  In FIG. 2, 10 again denotes the subject. An input unit 11 includes a visible image input unit 111 and a temperature image input unit 112; the input mechanism for the visible image and the temperature image will be described later. A processing unit 12 performs psychological state measurement processing using the visible image data and the temperature image data; this processing will also be described later. The psychological state measuring apparatus 1 comprises the input unit 11 and the processing unit 12.

  When the psychological state measuring apparatus 1 has the configuration of FIG. 2, its input unit 11 corresponds to 11a in FIG. 1, and its processing unit 12 is included in the data collection apparatus 14a of FIG. 1. Alternatively, the processing unit 12 may be included in 11a of FIG. 1.

  In the configuration of FIG. 2, the visible image input unit 111 functions as an image information acquisition unit, and acquires visible image data (for example, face image information, hereinafter also referred to as face image data) as image information. The temperature image input unit 112 functions as a physiological information acquisition unit, and acquires temperature image data (for example, information related to body temperature, hereinafter also referred to as body temperature information or body temperature data) as physiological information. The processing unit 12 functions as a psychological state classification unit and a psychological state level calculation unit. The processing unit 12 also includes dictionary data. The detailed configuration of the processing unit will be described later.

(Example of input form of visible image and temperature image)
The input of the visible image and the temperature image will be described using FIG. 3. FIG. 3 is a diagram illustrating an example of a psychological state measuring apparatus having an input unit that can capture both a visible image and a temperature image simultaneously with a single imaging apparatus.

  In FIG. 3, as in FIG. 2, 1 denotes the psychological state measuring device and 12 denotes the processing unit. The input unit 11 of FIG. 2 corresponds to the components other than the processing unit 12, that is, the optical system 31, the half mirror 32, the visible image input unit 111, and the temperature image input unit 112.

  The light beam entering through the optical system 31 is split by the half mirror 32 and is incident on both the visible image input unit 111 and the temperature image input unit 112, so that an optical image of the subject is formed on both imaging elements (34a and 34b).

  The visible image input unit 111 includes an infrared light region cut filter 33a and an image sensor 34a, and the temperature image input unit 112 includes a visible light region cut filter 33b and an image sensor 34b.

  The infrared cut filter 33a blocks the infrared region and passes only the visible region, forming a visible image on the image sensor 34a, which acquires the visible image data and provides it to the processing unit 12. The visible-light cut filter 33b blocks the visible region and passes only the infrared region, forming an infrared image, that is, a temperature image, on the image sensor 34b, which acquires the temperature image data and provides it to the processing unit 12.

  In this way, a visible image and a temperature image can be acquired and processed simultaneously. Next, an embodiment of a psychological state measuring apparatus that does not itself include an input device as hardware for directly photographing the subject will be described.

(Overall configuration example of application form including psychological state measuring device)
FIG. 4 is a block diagram showing an example of the overall configuration corresponding to the television program viewing information survey of FIG. 1. In FIG. 4, an input device serving as hardware that directly photographs the subject is provided separately, and the psychological state measuring device is shown as receiving the resulting image data.

  In FIG. 4, 10 again denotes the subject. Reference numeral 11 denotes an input unit, which includes a visible image input unit 111 and a temperature image input unit 112, as in FIG. 2. Reference numeral 13 denotes an apparatus for providing content, typically a television. Reference numeral 131 denotes the content being viewed, that is, a television program. A data collecting unit 14 collects the visible image data and temperature image data obtained by the input unit 11, as well as the content data obtained from the television 13.

  Reference numeral 15 denotes a data processing unit that performs a process of measuring the psychological state of the subject 10 with respect to the content 131 being viewed based on the data of the data collection unit 14. The data processing unit 15 includes a psychological state measurement unit 2, a recording unit 151, a content data processing unit 152, and an output unit 153.

  The psychological state measurement unit 2 further includes an input unit 21 that acquires visible image data and temperature image data from the data collection unit 14 and a processing unit 22 that performs a psychological state measurement process based on the data. In FIG. 4, the psychological state measuring unit 2 functions as a psychological state measuring device. A more detailed configuration of the psychological state measuring unit 2 will be described later with reference to FIG.

  The recording unit 151 records the results of the psychological state measurement unit 2 and sends them to the content data processing unit 152, which combines the data on the content being viewed with the psychological state level measured by the psychological state measurement unit 2 and summarizes them as favorability data. The result is output by the output unit 153 as necessary.
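As an illustration of how the content data processing unit 152 might summarize measurements into favorability data, the following is a minimal sketch; the record format (channel, timestamp, level) and the averaging rule are assumptions made for illustration, not the actual processing of this embodiment.

```python
from collections import defaultdict

def summarize_favorability(records):
    """Combine (channel, timestamp, level) measurements into an average
    psychological-state level per channel, as a simple stand-in for
    summarizing favorability data per program."""
    totals = defaultdict(lambda: [0.0, 0])  # channel -> [sum of levels, count]
    for channel, _timestamp, level in records:
        totals[channel][0] += level
        totals[channel][1] += 1
    return {ch: s / n for ch, (s, n) in totals.items()}
```

For example, two measurements of levels 4 and 6 on one channel would be summarized as an average favorability of 5.0 for that channel.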

  When the psychological state measuring apparatus 2 has the configuration shown in FIG. 4, its components, the input unit 21 and the processing unit 22, are included in the data collection device 14a of FIG. 1. That is, the data collection unit 14 and the entire data processing unit 15 of FIG. 4 are included in the data collection device 14a of FIG. 1.

(Detailed configuration example of the psychological state measuring device)
The more detailed configuration of the psychological state measuring apparatus 2 in FIG. 4 will be described with reference to FIG. 5, a block diagram showing that configuration. Parts (a), (b), and (c) of FIG. 5 are three configuration examples that have the same components but differ slightly in the relationships between them.

  In FIGS. 5A, 5B, and 5C, reference numeral 211 denotes a visible image input unit that obtains visible image data of the subject, and 212 denotes a temperature image input unit that acquires the subject's temperature image data. The visible image input unit 211 functions as an image information acquisition unit and acquires a visible image (for example, face image information) as image information. The temperature image input unit 212 functions as a physiological information acquisition unit and acquires a temperature image (for example, body temperature information) as physiological information. The visible image input unit 211 and the temperature image input unit 212 correspond to the input unit 21 in FIG. 4.

  When measuring a psychological state from a visible image, the psychological state is usually classified from the facial expression using face image information. Although the processing operation is described later, in this embodiment the processing is based on face image information.

  As for the temperature image, the subject's body temperature information is used as the physiological information. If facial skin temperature is used in particular, this processing is also based on facial temperature image data, which makes it possible to acquire the face visible image and temperature image data at the same time. In the following description, both the visible image and the temperature image are face images.

  In FIGS. 5A, 5B, and 5C, reference numeral 221 denotes a face part position specifying unit that specifies the positions of the face and facial parts within the visible image and the temperature image. This processing is necessary both for quantifying the facial expression as feature values and for extracting the skin temperature at specific positions on the face.

  In FIGS. 5A, 5B, and 5C, reference numeral 222 denotes a psychological state classification unit, which classifies the subject into a predetermined psychological state based on the visible image data in which the face part positions have been specified, that is, the face image data. In some cases, the classification may instead be based on the temperature image data, that is, the body temperature data. The psychological state classification unit 222 functions as the psychological state classification means.

  Reference numeral 224 denotes first dictionary data for collating the face image information obtained from the visible image data; it may also include second dictionary data for collating the body temperature information obtained from the temperature image data. The psychological state classification unit 222 classifies the psychological state with reference to these dictionary data.

  In FIGS. 5A, 5B, and 5C, reference numeral 223 denotes a psychological state level calculation unit, which calculates the level of the psychological state based on the psychological state classified by the psychological state classification unit 222 and the body temperature information obtained from the temperature image. The psychological state level calculation unit 223 functions as the psychological state level calculation means.

  Reference numeral 225 denotes third dictionary data for collating the body temperature information obtained from the temperature image data. In addition to deriving the level directly from the body temperature information, the psychological state level calculation unit 223 may calculate the level of the classified psychological state with reference to these dictionary data.

  The face part position specifying unit 221, the psychological state classification unit 222, the psychological state level calculation unit 223, and the first and second dictionary data 224 and third dictionary data 225 correspond to the processing unit 22 in FIG. 4.

  Accordingly, assuming the scenario of FIG. 1, differences in the subject's feelings toward the content being viewed, such as whether it is found pleasant, exciting, or disgusting and uncomfortable, can be measured as numerical levels based on the output of the psychological state level calculation unit 223.

  Next, before explaining the difference between (a), (b), and (c) in FIG. 5, the outline of the processing procedure of psychological state measurement in the case of FIG. 1 will be described.

(1) The channel being viewed at a given time is recorded by the audience rating measurement mechanism in the data collection device. (Data collection unit 14)
This is the same as a conventional audience rating survey. In this embodiment, the following psychological state measurement is added to it.

(2) A visible image and a temperature image are input at fixed time intervals using the image input device. (Visible image input unit 211 and temperature image input unit 212)
The image input device described above can also acquire the visible image and the temperature image at the same time. The differences among (a), (b), and (c) in FIG. 5 are described later.

(3) The face position is specified from the visible image and the temperature image. (Face part position specifying unit 221)
A known face detection technique can be used here, but the positioning accuracy can also be improved by devices such as the following: confirming the presence of a face from information that the television's display screen is reflected in the image, or that a remote control operation has been performed; or giving precise position information of the face in advance.
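As a rough illustration of locating a face region in a temperature image, the following sketch simply thresholds pixels lying in a typical skin-temperature range and returns their bounding box. The thresholds and the bounding-box rule are illustrative assumptions only; an actual implementation would use a known face detection technique, as noted above.

```python
def locate_face_region(temp_image, t_min=30.0, t_max=38.0):
    """Return the bounding box (top, left, bottom, right) of pixels whose
    temperature lies in a typical skin range, or None if no such pixels
    exist. A naive stand-in for the face part position specifying unit."""
    rows = [r for r, row in enumerate(temp_image)
            if any(t_min <= t <= t_max for t in row)]
    cols = [c for row in temp_image
            for c, t in enumerate(row) if t_min <= t <= t_max]
    if not rows:
        return None
    return (min(rows), min(cols), max(rows) + 1, max(cols) + 1)
```

This illustrates why the temperature image alone can help specify the face position: facial skin stands out from the cooler background.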

(4) The psychological state is classified from the facial expression using the visible image. (Psychological state classification unit 222)
A known expression recognition technique based on face image information can be used. A typical method is to prepare representative face images of the expressions corresponding to the psychological states set in advance for classification, collate the subject's image to find which is closest, and classify accordingly.
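The typical collation method just described can be sketched as follows; the psychological state labels and feature vectors are hypothetical stand-ins for the first dictionary data, not actual recognition data.

```python
from math import dist

# Hypothetical first dictionary data: one representative expression-feature
# vector per predetermined psychological state (illustrative values only).
EXPRESSION_DICTIONARY = {
    "pleasant":   (0.8, 0.2, 0.1),
    "unpleasant": (0.1, 0.7, 0.3),
    "neutral":    (0.4, 0.4, 0.4),
}

def classify_psychological_state(face_features):
    """Return the predetermined psychological state whose representative
    expression vector lies closest to the subject's face features."""
    return min(EXPRESSION_DICTIONARY,
               key=lambda state: dist(face_features, EXPRESSION_DICTIONARY[state]))
```

The classification result then selects which psychological state's level is quantified in step (5).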

(5) The level of the psychological state is measured using the temperature image. (Psychological state level calculation unit 223)
It is known that body temperature varies with the level of the psychological state, and this is exploited. For example, when a person feels stress, the skin temperature at the nose decreases relative to the surrounding skin temperature, such as at the forehead, while body temperature rises when the person is excited or furious. For each psychological state, the level is quantified from the nose and forehead temperatures and their difference.
In addition, since body temperature differs among individuals, accuracy can be improved by using the nose-forehead temperature difference and values relative to the forehead temperature, normalized against a body temperature measurement taken at the start of watching the television program.
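As a numerical illustration of step (5), the following sketch quantifies a stress level from the nose-forehead temperature difference, normalized against a baseline taken at the start of viewing. The formula and the scale factor are illustrative assumptions, not the actual calculation of this embodiment.

```python
def stress_level(nose_temp, forehead_temp,
                 baseline_nose, baseline_forehead, scale=10.0):
    """Quantify a stress level from the drop in nose skin temperature
    relative to the forehead, normalized against the subject's baseline
    measured at the start of viewing (scale factor is illustrative)."""
    current_diff = forehead_temp - nose_temp       # widens under stress
    baseline_diff = baseline_forehead - baseline_nose
    # Only report the increase over the subject's own baseline, clamped at 0.
    return max(0.0, (current_diff - baseline_diff) * scale)
```

Normalizing against the subject's own baseline is what compensates for the individual differences in body temperature mentioned above.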

  The differences among (a), (b), and (c) in FIG. 5 lie in how the visible image and the temperature image are used, as follows.

  FIG. 5A is a block diagram for the case where the psychological state is classified from the visible image. The figure shows the case where the positions of the face and facial parts are calculated from the visible image alone; although not shown, the facial part positions can also be specified using the temperature image.

  FIG. 5B is a block diagram for the case where the psychological state is classified from the temperature image. The figure shows the case where the positions of the face and facial parts are likewise calculated from the temperature image alone; although not shown, the facial part positions can also be specified using the visible image.

  FIG. 5C is a block diagram for the case where the facial part positions are specified from both the visible image and the temperature image. The figure shows the case where the psychological state classification is calculated from the visible image alone; although not shown, the classification can also use both the visible image and the temperature image.

  Next, the processing flow of the television program viewing information survey in this embodiment, and the psychological state measurement processing performed within it, will be described with reference to FIGS. 6 and 7. FIG. 6 is a flowchart showing the processing operation of the TV program viewing information survey in the overall configuration described above. FIG. 7 is a flowchart showing the psychological state measurement processing operation in the psychological state measuring device configuration of FIG. 5.

(Flow showing the processing operation of TV program viewing information survey)
In FIG. 6, the viewing information acquisition process starts when the information acquisition function is turned ON in step S11. A switch (not shown) for turning the information acquisition function ON and OFF is provided on the data collection device 14a or in the data collection unit 14.

  In step S12, it is determined whether or not the television apparatus 13 is turned on. The data collection device 14a or the data collection unit 14 is connected to the television device 13 so that information such as a viewing channel and its time can be acquired at any time.

  If it is determined in step S12 that the television device 13 is turned ON (step S12: YES), that is, if a television program is being viewed, step S13 is executed. If the television device 13 is not turned ON (step S12: NO), that is, if no television program is being viewed, step S12 is repeated until the television device 13 is turned ON.

  In step S13, the data collection unit 14 collects image and content data. For the images, the face of the subject 10 watching television is photographed as a visible image and a temperature image by the visible image input unit 111 and the temperature image input unit 112 of the image input unit 11, respectively. As the content data, the broadcast channel of the television program being watched and the corresponding time data are collected.

  Next, in step S14, the psychological state measuring unit 2 acquires the visible image and the temperature image from the data collecting unit 14, and performs the process of measuring the psychological state of the subject 10. Details of this processing operation will be described later with reference to the flowchart of FIG. 7.

  When the psychological state measurement result is obtained in step S14, the recording unit 151 records it in step S15. At this stage, the content data acquired by the content data processing unit 152 may also be recorded in the recording unit 151 in association with the psychological state measurement result, or may be output to the output unit 153 as the case may be.

  When the processing of the result is finished, the process sleeps in step S16 until the next acquisition timing. That is, the viewing information is sampled at fixed intervals; once the viewing information has been acquired, processing is suspended until the next acquisition timing after the fixed time has elapsed. The sampling method is not limited to fixed time intervals; any sampling method may be used.
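The fixed-interval sampling described above can be sketched as a simple polling loop. This is an illustrative sketch only; the function names (`run_viewing_survey`, `acquire_viewing_info`, `function_on`) are hypothetical stand-ins for the embodiment's steps S12 to S17, not names from the specification:

```python
import time

def run_viewing_survey(acquire_viewing_info, function_on, interval_sec=60.0):
    # Hypothetical sketch of the polling loop of steps S12-S17:
    # while the information acquisition function is ON, acquire the
    # viewing information, then sleep until the next acquisition timing.
    while function_on():
        acquire_viewing_info()
        time.sleep(interval_sec)  # suspend until the next sampling time
```

As the text notes, any other sampling trigger could replace the fixed `interval_sec`.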

  At the next viewing information acquisition timing, it is determined in step S17 whether the information acquisition function is OFF. If the information acquisition function is OFF (step S17: YES), the viewing information acquisition process ends.

  If the information acquisition function is not OFF in the determination of step S17 (step S17: NO), that is, if the information acquisition function remains ON, the process returns to step S12, where it is again determined whether the television device 13 is turned ON. After the television device 13 turns ON (step S12: YES), the data collection and psychological state measurement processing from step S13 onward are repeated.

(Operational flow of psychological state measurement processing)
Next, the psychological state measurement processing flow performed in step S14 of FIG. 6 will be described with reference to FIG. 7. FIG. 7 is a flowchart showing the psychological state measurement processing operation in the psychological state measuring device configuration of FIG. 5.

  When the psychological state measurement process is started, image data is first acquired by the psychological state measuring unit 2 in step S21. The visible image input unit 211 acquires visible image data from the data collection unit 14, and the temperature image input unit 212 similarly acquires temperature image data from the data collection unit 14.

  Next, in step S22, the face part position specifying process by the face part position specifying unit 221 is performed. The positions of the facial parts can be specified using either or both of the temperature image and the visible image.

To give an example of the processing when both a temperature image and a visible image are used:
(1) a region within a certain temperature range (for example, 30 degrees or more and less than 50 degrees) is extracted from the temperature image;
(2) a foreground candidate region is extracted from the visible image by the background subtraction method; and
(3) the face position region is specified by an AND operation on the two regions.

  Since the face is specified in a form that reflects the positions of the eyes, nose, mouth, and so on, once the face has been specified, the positions of the nose and the forehead can easily be specified from information on their position and size relative to the face.
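As a minimal sketch of steps (1) to (3) above, assuming the images are given as equally-sized 2-D grids; the function name and thresholds are illustrative choices, not values from the embodiment:

```python
def locate_face_region(temp_img, visible_img, background_img,
                       t_low=30.0, t_high=50.0, diff_thresh=30):
    # temp_img:       2-D list of temperatures in degrees Celsius
    # visible_img:    2-D list of grayscale intensities (current frame)
    # background_img: 2-D list captured without the subject
    rows, cols = len(temp_img), len(temp_img[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # (1) temperature within the plausible skin range
            in_range = t_low <= temp_img[r][c] < t_high
            # (2) foreground candidate by background subtraction
            foreground = abs(visible_img[r][c] - background_img[r][c]) > diff_thresh
            # (3) AND of both conditions gives the face-candidate region
            mask[r][c] = in_range and foreground
    return mask
```

In practice the resulting mask would be cleaned up (e.g. by taking the largest connected component) before locating the nose and forehead within it.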

  Next, the psychological state classification process by the psychological state classification unit 222 is performed in step S23. Details of the psychological state classification process performed in step S23 of FIG. 7 will be described later with reference to FIG. 8, which illustrates the psychological state classification processing operation.

  In step S24, the psychological state level calculation unit 223 performs a body temperature information numerical calculation process based on the temperature image data. This is a process of calculating body temperature information necessary for calculating the level of the psychological state from the temperature image in accordance with the psychological state classification result in step S23.

  As will be described later, in a psychological state corresponding to disgust or discomfort, for example, the skin temperature near the nose is lower than that of the forehead, while in a psychological state corresponding to favorable feeling, excitement, and the like, the overall body temperature rises. Accordingly, the forehead temperature and the temperature difference between the nose and the forehead may be calculated.

  For example, let Ta = the average body temperature in the forehead region and Tb = Ta minus the average body temperature in the nose region, and take the values of Ta and Tb immediately after the start of viewing information acquisition (Ta0 and Tb0) as the average body temperatures in the normal state. The relative values with respect to them, that is, Ta' = Ta - Ta0 and Tb' = Tb - Tb0, are then calculated. This body temperature information is used in the subsequent psychological state level calculation process.
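The calculation of Ta' and Tb' can be sketched as below; the function name and the way the region temperatures are passed in as flat lists of samples are assumptions made for illustration:

```python
def relative_body_temps(forehead_temps, nose_temps, ta0, tb0):
    # Ta: average body temperature over the forehead region
    ta = sum(forehead_temps) / len(forehead_temps)
    # Tb: Ta minus the average body temperature over the nose region
    tb = ta - sum(nose_temps) / len(nose_temps)
    # Ta' = Ta - Ta0 and Tb' = Tb - Tb0, relative to the baseline
    # values (ta0, tb0) recorded just after viewing acquisition starts
    return ta - ta0, tb - tb0
```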

  At this point, since the psychological state classification result has already been obtained, the calculation of body temperature information may be limited to the values actually needed. For example, if a psychological state corresponding to disgust or discomfort has been obtained, only Tb' need be calculated and Ta' may be omitted.

  It is also possible to use body temperature information in the psychological state classification process. For example, Ta' may be calculated, and the classification may be guided toward a psychological state corresponding to disgust or discomfort depending on its numerical level. In that case, the body temperature information numerical calculation process of step S24 may be performed before the psychological state classification process of step S23, or a process for correcting the psychological state classification result may be executed afterwards.

  In step S25, the psychological state level calculation unit 223 performs the psychological state level calculation process based on the body temperature information. Which body temperature information is used depends on the psychological state classification result. For example, Ta'/Tha is calculated for a psychological state corresponding to likability, excitement, laughter, and the like; Tb'/Thb is calculated for a psychological state corresponding to disgust, discomfort, sadness, and the like; and for a psychological state corresponding to the normal state, 0 is output as the level value. Here, Tha and Thb are normalization coefficients, which can be obtained empirically by measuring a plurality of subjects at the time of system development.
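A sketch of this branch on the classification result follows. The state labels and the default coefficient values are illustrative placeholders; Tha and Thb would be determined empirically, as stated above:

```python
def psych_state_level(state, ta_rel, tb_rel, tha=1.5, thb=1.0):
    # state: 'A' = normal, 'B' = disgust/discomfort/sadness,
    #        'C' = likability/excitement/laughter
    if state == 'C':
        return ta_rel / tha   # overall temperature rise Ta', normalized by Tha
    if state == 'B':
        return tb_rel / thb   # nose-forehead difference Tb', normalized by Thb
    return 0.0                # normal state: level value is 0
```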

  This level calculation method using body temperature information is only an example, and the present invention is not limited to it. Dictionary data (third dictionary data) associating body temperature information with the psychological state level for each psychological state may be prepared in advance and collated for the determination. Various other methods can also be employed, such as calculation using a nonlinear relational expression, reference to a table, or level settings sampled in advance by statistical methods. Findings on the relationship between physiological information and psychological state level in each psychological state are expected to accumulate in the field of cognitive science, and such findings can also be used.

  When the calculation of the psychological state level is completed as described above, the psychological state measurement process ends. Since both the psychological state classification result and the level value for that state have now been calculated, the data processing unit 15 then records these psychological state measurement results, in association with the content data as appropriate.

(Contents of psychological state classification process)
Next, the contents of the psychological state classification process performed in step S23 of FIG. 7 will be described using FIG. 8. FIG. 8 is a diagram illustrating the operation of the psychological state classification process performed by the psychological state classification unit 222 of FIG. 5.

  In the present embodiment, the psychological state classification process classifies the psychological state from the facial expression using face image information. In addition to the expression of the face as a whole, information such as blinking, gaze duration, and pupil size is also effective. Other elements effective for psychological state classification may be used as well. In any case, those elements must be extracted from the image.

  Taking the expression of the entire face as an example, average facial expression data representative of each of the preset psychological states is prepared as dictionary data relating to facial expressions (first dictionary data). Determining the closest expression by collating this dictionary data with the subject's face image data, and thereby determining the psychological state represented by that expression, is a common technique. Various methods have been developed for such facial expression recognition, and the facial expression dictionary data can be created according to any of various known techniques.

  In FIG. 8, 40a denotes the dictionary data. Here, three psychological states, A, B, and C, are set in advance for the classification of psychological states. 41a is psychological state A, corresponding to the normal state; 41b is psychological state B, corresponding to disgust, discomfort, and sadness; 41c is psychological state C, corresponding to favorable feeling, excitement, and laughter.

  The kind and number of preset psychological states are arbitrary. Depending on the purpose, psychological states representing the subject's emotions toward an arbitrarily set object such as a TV program may be set, as described above, or psychological states representing the physiological state of the subject's body, such as sleepiness, fatigue, concentration, or good or bad mood, may be set.

  For each of the three preset psychological states A, B, and C, a representative facial expression is set, and feature vector data representing each expression is registered as dictionary data A relating to facial expressions 42a, dictionary data B relating to facial expressions 42b, and dictionary data C relating to facial expressions 42c. These 42a, 42b, and 42c constitute the first dictionary data.

  A procedure for classifying the psychological state by collating with the dictionary data 40a will be described.

  When face image data as a visible image is obtained in step S31, the facial expression evaluation site is sampled in step S32, and face image information is obtained. This may be an expression of the entire face, or an expression focusing on a specific part such as the eyes or mouth.

  In step S33, the data of each facial expression is calculated as feature vector data of the same form as the prepared first dictionary data 42a, 42b, and 42c so that they can be collated. In step S37, the data is collated with the dictionary data 40a.

  In step S38, the closest dictionary data is selected from the dictionary data A, B, and C relating to facial expressions, and the corresponding psychological state 41a, 41b, or 41c, that is, psychological state A, B, or C, is determined as the psychological state classification result.
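The collation of steps S33, S37, and S38 amounts to a nearest-neighbor match between feature vectors. The following sketch assumes Euclidean distance and a plain dict for the first dictionary data; both are assumptions, since the embodiment does not fix a particular distance measure or data layout:

```python
def classify_expression(feature_vec, first_dictionary):
    # first_dictionary: psychological state -> representative feature vector
    # (dictionary data A, B, C relating to facial expressions)
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
    # step S38: pick the state whose registered vector is closest
    return min(first_dictionary,
               key=lambda s: dist(feature_vec, first_dictionary[s]))
```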

  Next, the case where body temperature information is also used for the psychological state classification, in addition to the processing described for FIG. 8, will be described with reference to FIG. 9. FIG. 9, like FIG. 8, is a diagram illustrating an operation of the psychological state classification process performed by the psychological state classification unit 222 of FIG. 5.

  In FIG. 9, 40b is dictionary data. The difference from FIG. 8 is that second dictionary data relating to body temperature is added as described below.

  That is, for the classification of psychological states, dictionary data relating to body temperature (second dictionary data) can be registered together with the first dictionary data relating to facial expressions. In that case, it is preferable either to collate against the facial expression dictionary data and the body temperature dictionary data separately, or to create and collate integrated dictionary data combining both, and to determine the closest psychological state overall.

  A procedure for classifying the psychological state by collating with the dictionary data 40b will be described. In addition to the procedure for collating face image information with the first dictionary data described in FIG. 8, the following procedure is added.

  When body temperature information is also used for the psychological state classification, temperature image data is obtained in step S34, the parts used to evaluate body temperature are sampled in step S35, and body temperature information, for example for the nose and the forehead, is obtained.

  In step S36, body temperature data is calculated. This may be multidimensional data based on a plurality of parts, or the body temperature information calculated in the body temperature information numerical calculation process described above (step S24). In the dictionary data 40b, second dictionary data 43a, 43b, and 43c relating to body temperature, corresponding to this body temperature data, are prepared.

  In step S37, the body temperature data calculated in step S36 is collated with the dictionary data 40b. Since the facial expression data calculated in step S33 is also collated with the dictionary data 40b, in this case the collation results for the facial expression data and the body temperature data are combined in step S38, and the psychological state 41a, 41b, or 41c closest to both, that is, psychological state A, B, or C, is determined as the psychological state classification result.
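Combining the two collation results can be sketched as a weighted sum of the two distances. The additive combination and the weight values are assumptions made for illustration, since the embodiment only requires that the overall closest state be chosen:

```python
def classify_combined(expr_vec, temp_vec, expr_dict, temp_dict,
                      w_expr=0.7, w_temp=0.3):
    # expr_dict / temp_dict: state -> representative facial-expression /
    # body-temperature vector (first and second dictionary data)
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
    def score(state):  # smaller = closer overall (step S38)
        return (w_expr * dist(expr_vec, expr_dict[state])
                + w_temp * dist(temp_vec, temp_dict[state]))
    return min(expr_dict, key=score)
```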

  If the dictionary data 40b is also to be used for level calculation based on body temperature information, a plurality of levels of dictionary data may be registered as third dictionary data within the dictionary data A relating to body temperature 43a, the dictionary data B relating to body temperature 43b, and the dictionary data C relating to body temperature 43c, according to the level, that is, the strength, of each psychological state. The level of each psychological state can then be calculated by collating this body temperature dictionary data (third dictionary data) with the body temperature information.
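Level calculation by collation with the third dictionary data can be sketched the same way as the classification; the nested-dict layout of the per-level registered vectors is an assumption for illustration:

```python
def level_by_collation(state, temp_vec, third_dict):
    # third_dict: state -> {level: representative body-temperature vector}
    # (third dictionary data registered per level, i.e. per strength)
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
    levels = third_dict[state]
    # return the level whose registered vector is closest to the
    # observed body temperature data
    return min(levels, key=lambda lv: dist(temp_vec, levels[lv]))
```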

(When using temperature images for psychological classification)
The situations in which the temperature image contributes to the classification of the psychological state will be described with reference to FIG. 10. FIG. 10 shows temperature image data obtained by acquiring a temperature image of the subject's face in psychological state A, B, or C. FIG. 10(a) shows the psychological state corresponding to the normal state, FIG. 10(b) the psychological state corresponding to disgust, discomfort, and sadness, and FIG. 10(c) the psychological state corresponding to favorable feeling, excitement, and laughter.

  FIG. 10(a) shows the normal state; in the case of a TV program viewing information survey, it is represented by the body temperature distribution immediately before the start of TV viewing. The temperature difference between the nose and the forehead is small, and the body temperature is generally stable. For the overall body temperature, it is appropriate to use as body temperature information the temperature of the forehead, whose temperature distribution changes relatively little compared with the surrounding areas.

  FIG. 10(b) shows the psychological state corresponding to disgust, discomfort, and sadness; in the case of a TV program viewing information survey, it can be regarded as the body temperature distribution when the viewer has a poor liking for the program being viewed. The temperature near the nose (see N in the figure) is lower than that of the other parts. It is reasonable to use as body temperature information the temperature difference between the nose and the relatively stable forehead.

  FIG. 10(c) shows the psychological state corresponding to favorable feeling, excitement, and laughter; in the case of a TV program viewing information survey, it can be regarded as the body temperature distribution when the viewer has a good feeling toward the program being viewed. The body temperature tends to rise as a whole. It is reasonable to use the relatively stable forehead temperature as body temperature information.

  As described above, the level of each psychological state can be calculated using the body temperature information. Moreover, as also described above, body temperature information is effective not only for determining the level of the psychological state but also for its classification. Of course, not all psychological states can be classified from body temperature information alone, but when it is used to complement classification based on facial expressions, the accuracy of the psychological state classification can be improved.

  According to the above embodiment, by acquiring both image information and physiological information of the subject and integrating the two, the psychological state of the subject can be classified with high accuracy, and on that basis the level, that is, the strength, of the subject's psychological state can be measured accurately; a psychological state measuring apparatus with these capabilities can thus be provided.

  The embodiment described above is illustrative in all respects and not restrictive. The scope of the present invention is defined by the terms of the claims rather than by the above description, and is intended to include all modifications within the scope and meaning equivalent to the terms of the claims.

FIG. 1 is a diagram showing an application system (television program viewing information survey) using a psychological state measuring apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram showing a configuration example of a psychological state measuring apparatus with minimum components according to an embodiment of the present invention.
FIG. 3 is a block diagram showing an example of the input mechanism for the visible image and the temperature image in the psychological state measuring apparatus according to the embodiment.
FIG. 4 is a block diagram showing an example of the overall configuration of an application system including the psychological state measuring apparatus according to the embodiment.
FIG. 5 is a block diagram showing a more detailed configuration example of the psychological state measuring apparatus according to the embodiment.
FIG. 6 is a flowchart showing a processing operation example of the application system (TV program viewing information survey) including the psychological state measuring apparatus according to the embodiment.
FIG. 7 is a flowchart showing an example of the psychological state measurement processing operation in the psychological state measuring apparatus configuration of FIG. 5.
FIG. 8 is a diagram showing an operation example of the psychological state classification process using the first dictionary data, performed in the psychological state classification unit of FIG. 5.
FIG. 9 is a diagram showing another operation example of the psychological state classification process performed in the psychological state classification unit of FIG. 5.
FIG. 10 shows examples of temperature image data obtained by acquiring temperature images of the subject's face in three different psychological states.

Explanation of symbols

1, 2 Psychological state measuring device
10 Subject (viewer of TV program)
11a Image input device
11, 21 Input unit
12, 22 Processing unit
13 Content providing device (television)
14a Data collection device
14 Data collection unit
15 Data processing unit
31 Optical system
32 Half mirror
33a, 33b Filter
34a, 34b Image sensor
40a, 40b Dictionary data
41a, 41b, 41c Psychological state classification
42a, 42b, 42c Dictionary data relating to facial expressions
43a, 43b, 43c Dictionary data relating to body temperature
111, 211 Visible image input unit
112, 212 Temperature image input unit
131 Content (TV program)
151 Recording unit
152 Content data processing unit
153 Output unit
221 Face part position specifying unit
222 Psychological state classification unit
223 Psychological state level calculation unit
224 First (and second) dictionary data
225 Third dictionary data

Claims (8)

  1. Image information acquisition means for acquiring image information of a subject who is a subject of psychological state measurement;
    First dictionary data associating the image information with a predetermined psychological state;
    Psychological state classification means for referring to the first dictionary data and classifying the psychological state of the subject into the predetermined psychological state based on the image information;
    Physiological information acquisition means for acquiring physiological information of the subject;
    A psychological state measuring device comprising: psychological state level calculating means for calculating, based on the physiological information of the subject, the level of the psychological state of the subject classified by the psychological state classifying means.
  2. The psychological state measuring apparatus according to claim 1, wherein the image information acquired by the image information acquisition means is face image information of the subject, and the psychological state classification means has a function of classifying the psychological state of the subject based on the facial expression in the face image information.
  3. The psychological state measuring apparatus according to claim 2, wherein the facial expression of the facial image information is information related to a subject's blink, gaze time, or pupil size.
  4. The psychological state measuring apparatus according to any one of claims 1 to 3, wherein the physiological information acquired by the physiological information acquiring means is information on the body temperature of the subject.
  5. The psychological state measuring apparatus according to any one of claims 1 to 4, comprising second dictionary data associating the physiological information acquired by the physiological information acquisition means with the predetermined psychological state, wherein the psychological state classification means has a function of referring to the second dictionary data and classifying the psychological state of the subject based on the image information and the physiological information of the subject.
  6. The psychological state measuring apparatus according to any one of claims 1 to 5, wherein the predetermined psychological state is a psychological state representing an emotion of a subject with respect to an arbitrarily set object.
  7. The psychological state measuring apparatus according to any one of claims 1 to 5, wherein the predetermined psychological state is a psychological state representing a physiological state of the body of the subject.
  8. The psychological state measuring apparatus according to any one of claims 1 to 7, comprising third dictionary data associating the physiological information acquired by the physiological information acquisition means with a psychological state level corresponding to the predetermined psychological state, wherein the psychological state level calculating means has a function of referring to the third dictionary data and calculating, based on the physiological information of the subject, the level of the psychological state into which the subject has been classified by the psychological state classifying means.
JP2005256231A 2005-09-05 2005-09-05 Psychological condition measuring apparatus Pending JP2007068620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005256231A JP2007068620A (en) 2005-09-05 2005-09-05 Psychological condition measuring apparatus


Publications (1)

Publication Number Publication Date
JP2007068620A true JP2007068620A (en) 2007-03-22

Family

ID=37930616

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005256231A Pending JP2007068620A (en) 2005-09-05 2005-09-05 Psychological condition measuring apparatus

Country Status (1)

Country Link
JP (1) JP2007068620A (en)


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1970672A2 (en) 2007-03-16 2008-09-17 Okuma Corporation Position detector
JP2009254498A (en) * 2008-04-15 2009-11-05 Olympus Corp Diagnosis-observation device
JP2010094493A (en) * 2008-09-22 2010-04-30 Koichi Kikuchi System for deciding viewer's feeling on viewing scene
DE102009042893A1 (en) 2008-09-26 2010-04-01 Okuma Corporation, Niwa Detection device for position of circular table of machine tool, has correction unit correcting offset of signals based on correction value of offset, as correction value corresponds to absolute position and is selected from storage unit
KR101133845B1 (en) * 2010-10-19 2012-04-06 성균관대학교산학협력단 Method for creating brainwave signal database
WO2014054293A1 (en) * 2012-10-05 2014-04-10 パナソニック株式会社 Drowsiness estimation device, drowsiness estimation method, and computer-readable non-transient recording medium
JPWO2014054293A1 (en) * 2012-10-05 2016-08-25 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Sleepiness estimation device, sleepiness estimation method, computer-readable non-transitory recording medium
US9501704B2 (en) 2012-10-05 2016-11-22 Panasonic Intellectual Property Corporation Of America Drowsiness estimation device, drowsiness estimation method, and computer-readable non-transient recording medium
JP2016093313A (en) * 2014-11-13 2016-05-26 大和ハウス工業株式会社 Psychological state estimation method, psychological state estimation system, and care system using psychological state estimation method
JP2018517958A (en) * 2015-04-02 2018-07-05 ハートフロー, インコーポレイテッド System and method for predicting perfusion injury from physiological, anatomical and patient characteristics
