WO2023162410A1 - Emotion correction device, emotion estimation device, emotion compensation method, emotion estimation method, and program - Google Patents

Emotion correction device, emotion estimation device, emotion compensation method, emotion estimation method, and program

Info

Publication number
WO2023162410A1
Authority
WO
WIPO (PCT)
Prior art keywords
emotion
correction
reference value
correction reference
intensity
Prior art date
Application number
PCT/JP2022/045615
Other languages
French (fr)
Japanese (ja)
Inventor
聡平 武智
Original Assignee
NEC Solution Innovators, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Solution Innovators, Ltd.
Publication of WO2023162410A1 publication Critical patent/WO2023162410A1/en

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/906 Clustering; Classification

Definitions

  • the present invention relates to an emotion correction device, an emotion estimation device, an emotion correction method, an emotion estimation method, and a program.
  • FACS (Facial Action Coding Systems) is a technique that encodes facial muscle movements, called Action Units (AU), for machine discrimination. When emotions are estimated by such a method, however, the accuracy of emotion estimation is lowered under the influence of the shape of an individual's face, the individual's expression habits, and the like, which are unrelated to the expression of emotion.
  • an object of the present invention is to provide an emotion correction device capable of correcting emotion information in consideration of the characteristics of the subject.
  • the emotion correction device of the present invention includes an emotion information acquisition unit, a correction reference value calculation unit, and a correction unit
  • the emotion information acquisition unit acquires emotion information based on vital data
  • the emotion information includes emotion types and emotion intensity for each emotion type
  • the correction reference value calculation unit calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion
  • the correction unit corrects the emotion information based on the correction reference value.
  • the emotion estimation device of the present invention includes a vital data acquisition unit, an emotion estimation unit, and an emotion correction unit,
  • the vital data acquisition unit acquires the subject's vital data
  • the emotion estimation unit estimates an emotion of the subject based on the vital data
  • the emotion correction unit corrects the estimated emotion
  • the emotion correction unit is the emotion correction device of the present invention.
  • the emotion correction method of the present invention includes an emotion information acquisition step, a correction reference value calculation step, and a correction step,
  • the emotion information acquisition step acquires emotion information based on vital data
  • the emotion information includes emotion types and emotion intensity for each emotion type
  • the correction reference value calculating step calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion
  • the correction step corrects the emotion information based on the correction reference value.
  • the emotion estimation method of the present invention includes a vital data acquisition step, an emotion estimation step, and an emotion correction step
  • the vital data acquisition step acquires the subject's vital data
  • the emotion estimation step estimates the subject's emotion based on the vital data
  • the emotion correction step corrects the estimated emotion
  • the emotion correction step is the emotion correction method of the present invention.
  • a first program of the present invention includes an emotion information acquisition procedure, a correction reference value calculation procedure, and a correction procedure
  • the emotion information acquisition procedure acquires emotion information based on vital data
  • the emotion information includes emotion types and emotion intensity for each emotion type
  • the correction reference value calculation procedure calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion
  • the correction procedure corrects the emotion information based on the correction reference value
  • each of the above procedures is executed by a computer.
  • emotions can be corrected in consideration of the subject's characteristics.
  • FIG. 1 is a block diagram showing an example configuration of an emotion correction device according to a first embodiment.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the emotion correction device according to the first embodiment;
  • FIG. 3 is a flow chart showing an example of processing in the emotion correction device according to the first embodiment.
  • FIGS. 4A and 4B are graphs showing an example of clustering results of emotion intensity for each emotion type.
  • FIG. 5 is a flow chart showing a specific example of processing in the correction unit of the emotion correction device according to the first embodiment.
  • FIG. 6 is a schematic diagram for explaining emotion estimation based on facial expressions.
  • FIG. 7 is a block diagram showing an example configuration of an emotion estimation device according to the second embodiment.
  • FIG. 8 is a block diagram showing an example of the hardware configuration of the emotion estimation device according to the second embodiment.
  • FIG. 9 is a flow chart showing an example of processing in the emotion estimation device of the second embodiment.
  • FIG. 1 is a block diagram showing an example configuration of an emotion correction device 10 of this embodiment.
  • the device 10 includes an emotion information acquisition section 11 , a correction reference value calculation section 12 and a correction section 13 .
  • the device 10 may also include a storage unit, although not shown.
  • the device 10 may be, for example, a single device including each of the above units, or may be a device to which each of the above units can be connected via a communication network. Further, the device 10 can be connected to an external device, which will be described later, via the communication network.
  • the communication network is not particularly limited, and a known network can be used; it may be wired or wireless, for example.
  • examples of the communication network include the Internet, the WWW (World Wide Web), telephone lines, LAN (Local Area Network), SAN (Storage Area Network), DTN (Delay Tolerant Networking), LPWA (Low Power Wide Area), L5G (local 5G), and the like.
  • wireless communication examples include Wi-Fi (registered trademark), Bluetooth (registered trademark), local 5G, and LPWA.
  • the wireless communication may be a form in which each device communicates directly (Ad Hoc communication), infrastructure communication, indirect communication via an access point, or the like.
  • the device 10 may be incorporated in a server as a system, for example. Further, the device 10 may be, for example, a personal computer (PC, for example, desktop type, notebook type), a smart phone, a tablet terminal, etc. in which the program of the present invention is installed.
  • the device 10 may be, for example, in a form of cloud computing, edge computing, or the like, in which at least one of the units is on the server and the other units are on the terminal.
  • the device 10 includes, for example, a CPU 101, a memory 102, a bus 103, a storage device 104, an input device 106, an output device 107, a communication device 108, and the like. Each unit of the device 10 is interconnected via a bus 103 by each interface (I/F).
  • the CPU 101 operates in cooperation with other components by means of a controller (system controller, I/O controller, etc.) and takes charge of overall control of the device 10 .
  • the CPU 101 executes, for example, the program 105 of the present invention and other programs, and reads and writes various information.
  • the CPU 101 functions as an emotion information acquisition unit 11 , a correction reference value calculation unit 12 , and a correction unit 13 .
  • the device 10 includes a CPU as a computing device, but may include other computing devices such as a GPU (Graphics Processing Unit) or an APU (Accelerated Processing Unit), or a combination of the CPU and these.
  • the bus 103 can also be connected to external devices, for example.
  • examples of the external device include an emotion estimation device such as the emotion estimation device of the present invention described later, an external storage device (an external database, etc.), a printer, an external input device, an external display device, and an external imaging device.
  • the device 10 can be connected to an external network (the communication line network) by means of a communication device 108 connected to the bus 103, and can also be connected to other devices via the external network.
  • the memory 102 is, for example, a main memory (main storage device).
  • when the CPU 101 performs processing, the memory 102 reads various operating programs, such as the program of the present invention stored in the storage device 104 described later, and the CPU 101 receives the data from the memory 102 and executes the program.
  • the main memory is, for example, RAM (random access memory).
  • the memory 102 may be, for example, a ROM (read only memory).
  • the storage device 104 is also called a so-called auxiliary storage device, for example, in contrast to the main memory (main storage device). As described above, the storage device 104 stores operating programs including the program of the present invention. Storage device 104 may be, for example, a combination of a recording medium and a drive that reads from and writes to the recording medium.
  • the recording medium is not particularly limited, and may be, for example, a built-in type or an external type; examples include HD (hard disk), CD-ROM, CD-R, CD-RW, MO, DVD, flash memory, and memory cards.
  • the storage device 104 may be, for example, a hard disk drive (HDD), in which a recording medium and a drive are integrated, or a solid state drive (SSD). If the device 10 includes the storage section, for example, the storage device 104 functions as the storage section.
  • the storage unit can store, for example, the acquired emotion information, the calculated correction reference value, and the like.
  • the memory 102 and the storage device 104 can also store various information, such as log information, information acquired from an external database (not shown) or an external device, information generated by the device 10, and information used when the device 10 executes processing.
  • in this case, the memory 102 and the storage device 104 may store at least one piece of information selected from the group consisting of, for example, the subject's emotion information, the subject's identification information, the subject's attribute information, the correction reference value, and the corrected emotion information, which will be described later. At least part of the information may be stored in an external server other than the memory 102 and the storage device 104, or may be distributed and stored across a plurality of terminals using blockchain technology or the like.
  • the emotion correction method of this embodiment is implemented as follows, for example, using the emotion correction device 10 of FIG. 1 or FIG. 2. It should be noted that the emotion correction method of this embodiment is not limited to the use of the emotion correction device 10 of FIG. 1 or FIG. 2.
  • the emotion information acquisition unit 11 acquires emotion information based on vital data (S1, emotion information acquisition step).
  • the emotion information is, for example, information indicating that the subject's emotion is presumed to have occurred based on the subject's vital data, and includes the emotion type and the emotion intensity for each emotion type.
  • the emotion information acquisition unit 11 may acquire the emotion information, via a communication network, from an emotion estimation engine capable of estimating emotions based on vital data, or may acquire the emotion information from an external database in which emotion information estimated by the emotion estimation engine is recorded.
  • the emotion information acquisition unit 11 may acquire the subject's vital data, estimate the subject's emotion based on the vital data, and acquire the subject's emotion information. In this case, the emotion information acquisition unit 11 is also called an emotion estimation unit, for example.
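To make the data handled in S1 concrete, the following Python sketch models emotion information as described above: a set of emotion types with an intensity per type, optionally tied to the subject's identification information. All names here (`EmotionInfo`, `acquire_emotion_info`, `estimation_engine.estimate`) are hypothetical illustrations; the patent does not define a concrete data format or engine API.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class EmotionInfo:
    """Hypothetical container for the emotion information of S1:
    emotion types mapped to an emotion intensity, plus optional metadata."""
    intensities: Dict[str, float]   # e.g. {"joy": 6.0, "sadness": 1.5}
    subject_id: str = ""            # optional identification information
    timestamp: str = ""             # optional date/time of estimation

def acquire_emotion_info(estimation_engine, vital_data) -> EmotionInfo:
    """Emotion information acquisition step (S1): delegate the estimation of
    per-emotion intensities to an external emotion estimation engine."""
    scores = estimation_engine.estimate(vital_data)  # assumed engine method
    return EmotionInfo(intensities=scores)
```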
  • the vital data is, for example, Action Unit (AU) information that encodes the movement of the facial muscles of the subject; image information such as the subject's facial expressions; vocalization information such as voiced and unvoiced sounds; text information such as utterance content; blood pressure; heart rate; and the like.
  • the emotion estimation engine is not particularly limited, and can be appropriately selected according to the type of the vital data, for example. Specific examples of the emotion estimation engine include engines that estimate emotions from image information such as facial expressions (e.g., Affdex, Microsoft Azure Face API, Amazon Rekognition, Realeyes, User Local facial expression estimation AI); engines that estimate emotions from voice information (e.g., STEmotion, Empath, BeyondVerbal's emotion estimation software, IBM Watson Tone Analyzer, User Local speech emotion recognition AI); engines that estimate emotions from text information by natural language processing (e.g., User Local text emotion recognition AI); and engines that estimate emotions from vital data (e.g., the NEC emotion analysis solution).
  • the combination of the vital data and the emotion estimation engine is preferably, for example, the AU information and an emotion estimation engine that uses Facial Action Coding Systems (FACS) that can use the AU information, but is not limited to this.
  • the emotion estimation engine may be, for example, external to the device 10 or may be part of the device 10. In the following description, the case of performing emotion estimation based on the AU information will be described as an example.
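As a concrete illustration of AU-based input: in FACS, facial muscle movements are coded as numbered Action Units, and combinations such as AU6 (cheek raiser) with AU12 (lip corner puller) are commonly associated with smiling in the FACS literature. The toy sketch below shows one plausible shape for AU information; the AU-to-emotion scoring is a deliberately simplified illustration, not the patent's method or any particular engine's behavior.

```python
# One plausible encoding of AU information: AU number -> activation (0.0-1.0).
au_frame = {1: 0.1, 4: 0.0, 6: 0.8, 12: 0.9, 15: 0.0}

def naive_joy_score(au: dict) -> float:
    """Toy score: average activation of AU6 and AU12, the units commonly
    associated with smiling. A real engine would use a learned model."""
    return (au.get(6, 0.0) + au.get(12, 0.0)) / 2.0

print(naive_joy_score(au_frame))  # 0.85 for the frame above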
  • the type of emotion may be, for example, one type, or two or more types.
  • Specific examples of the types of emotions are not particularly limited, and include, for example, emotions that can be estimated by the emotion estimation engine.
  • specific examples of the types of emotions include joy, anger, sadness, fear, disgust, surprise, and contempt.
  • the emotional intensity is, for example, information indicating the intensity of each type of emotion of the subject, and is also called an emotional value or an emotional score.
  • the emotion intensity is, for example, a score calculated by the emotion estimation engine based on the vital data.
  • the score may be, for example, qualitative information (e.g., whether or not the emotion occurred), quantitative information (e.g., to what extent the emotion occurred), or probability-of-occurrence information for each emotion.
  • the score may be, for example, an absolute evaluation for each type of emotion, or a relative evaluation with other emotions (whether the emotion is stronger or weaker than other emotions).
  • the emotional information may include other information, for example.
  • examples of the other information include the subject's identification information, the subject's attribute information, the date and time at which the emotion was estimated, the place at which the emotion was estimated, and the situation in which the emotion was estimated.
  • the identification information of the subject is information that can identify the subject, and includes, for example, the subject's name, an image of the subject, the ID of the subject's terminal, and the like.
  • the attribute information includes, for example, the subject's sex, age, affiliation, position in an organization, and the like.
  • the subject's attribute information may be associated with, for example, the above-described subject's identification information.
  • the information on the date, place, and situation when the emotion is estimated includes, for example, information on the date, place, and situation when the vital data, which is the basis for estimating the emotion, is acquired.
  • when the emotion information includes the other information, the emotion information and the other information are preferably associated with each other.
  • the correction reference value calculation unit 12 calculates a correction reference value based on the appearance frequency of the emotion intensity for each type of emotion (S2, correction reference value calculation step).
  • the appearance frequency is, for example, the number of times an emotion appears (also referred to as the number of occurrences) per unit time.
  • the unit time is not particularly limited, and any unit can be set according to the purpose.
  • the correction reference value calculation unit 12 may calculate the correction reference value based on the appearance frequencies of all the acquired emotion intensities, or based on the appearance frequencies of only some of the emotion intensities.
  • in the latter case, the correction reference value calculation unit 12 can, for example, cluster the appearance frequencies of emotion intensities exceeding 0 for each type of emotion, and set the highest emotion intensity in the cluster having the largest area as the correction reference value. Since the correction reference value is calculated based on the appearance frequency of the individual subject's emotion intensity, it is also called an individual bias, for example.
  • in the clustering, for example, an approximate curve of the appearance frequency is created, and each cluster may be the section from one rising start point of the approximate curve (if there is no rising start point, the minimum value of the emotion intensity may be used) to the next rising start point (also referred to as the end point of the descent; if there is no next rising start point, the maximum value of the emotion intensity may be used).
  • the correction reference value calculation unit 12 selects the cluster having the largest area among the clusters, and sets, for example, the maximum value of the emotion intensity in the selected cluster as the correction reference value. Specifically, the correction reference value calculation unit 12 can, for example, select the second of the first to third clusters shown in FIG. 4(A) and use its highest emotion intensity, "8", as the correction reference value.
  • similarly, the correction reference value calculation unit 12 can, for example, select the second of the first and second clusters shown in FIG. 4(B) and use its highest emotion intensity as the correction reference value. A sketch of this calculation is given below.
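Read literally, the passage above suggests the following procedure: build an appearance-frequency histogram of the intensities observed for one emotion type, split it into clusters at each point where the frequency starts rising again after a descent, pick the cluster with the largest area, and return that cluster's highest intensity. The Python sketch below is a minimal implementation under that reading, with hypothetical names; the patent itself does not prescribe a specific algorithm (it works on an approximate curve, while this sketch works directly on the histogram).

```python
from collections import Counter
from typing import List

def correction_reference_value(intensities: List[int]) -> int:
    """Correction reference value calculation (S2) for one emotion type:
    cluster the appearance frequencies of intensities > 0, then return the
    highest intensity in the cluster with the largest area."""
    counts = Counter(i for i in intensities if i > 0)
    if not counts:
        return 0
    levels = sorted(counts)                 # observed intensity values
    freqs = [counts[v] for v in levels]     # appearance frequency per value

    # Split into clusters at each "rising start point": a position where the
    # frequency starts increasing again after having descended.
    clusters, current, descended = [], [0], False
    for k in range(1, len(levels)):
        if freqs[k] > freqs[k - 1] and descended:
            clusters.append(current)        # descent ended at the previous value
            current, descended = [k], False
        else:
            descended = descended or freqs[k] < freqs[k - 1]
            current.append(k)
    clusters.append(current)

    # Cluster "area" approximated as the total frequency it covers.
    best = max(clusters, key=lambda c: sum(freqs[k] for k in c))
    return levels[best[-1]]                 # highest intensity in that cluster

# Example: a large low-intensity cluster and a smaller high-intensity one;
# the larger cluster wins and its top intensity becomes the reference value.
print(correction_reference_value([1, 1, 2, 2, 2, 3, 7, 8, 8, 9]))  # -> 7
```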
  • the device 10 may include the storage unit, and the storage unit may store the acquired emotion information.
  • the emotional information includes, for example, the identification information of the subject, and the storage unit associates and stores the identification information of the subject and the emotional information.
  • the correction reference value calculation unit 12 may calculate the correction reference value based on, for example, the emotion information acquired in S1 and the emotion information stored in the storage unit. According to this aspect, for example, since the storage section can accumulate emotion information that is the subject's individual characteristics, it is possible to correct the subject's emotion more accurately.
  • next, the correction unit 13 corrects the emotion information based on the correction reference value (S3, correction step). FIG. 5 is a flow chart showing an example of the processing of S3 (S31, S32, S33) by the correction unit 13.
  • the correction unit 13 determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value (S31). If the emotion intensity does not exceed the correction reference value (S31, No), the correction unit 13 corrects the emotion intensity to 0, for example (S32).
  • on the other hand, if the emotion intensity exceeds the correction reference value (S31, Yes), the correction unit 13 corrects the emotion intensity based on the correction reference value, for example (S33).
  • the correction of the emotion intensity based on the correction reference value can be performed by the following formula (1), but is not limited to this:

    corrected emotion intensity = (emotion intensity output by the emotion estimation engine − correction reference value) / (maximum value of the emotion intensity included in the emotion information − correction reference value) × maximum value of the emotion intensity included in the emotion information … (1)
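Putting S31 to S33 together with formula (1), a minimal sketch of the correction step might look as follows. `max_intensity` stands for the maximum value of the emotion intensity included in the emotion information; the function name is a hypothetical illustration, and the sketch assumes `max_intensity` is strictly greater than the correction reference value.

```python
def correct_intensity(intensity: float, ref: float, max_intensity: float) -> float:
    """Correction step (S3): intensities at or below the correction reference
    value ref are set to 0 (S31/S32); intensities above it are rescaled by
    formula (1) so that (ref, max_intensity] maps onto the full scale (S33).
    Assumes max_intensity > ref."""
    if intensity <= ref:                       # S31 "No" branch
        return 0.0                             # S32
    return (intensity - ref) / (max_intensity - ref) * max_intensity  # S33

# Worked example of formula (1): with ref = 8 on a 0-10 scale, an engine
# output of 9 becomes (9 - 8) / (10 - 8) * 10 = 5.0.
print(correct_intensity(9.0, ref=8.0, max_intensity=10.0))  # 5.0
```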
  • emotional information estimated based on the vital data is affected by individual characteristics.
  • in emotion estimation based on the AU information, for example, as shown in FIG. 6(A), a person whose mouth corners are usually raised is easily misjudged as showing a feeling of joy, regardless of the person's actual feelings, even when the person is not actually happy. Conversely, as shown in FIG. 6(B), a person whose mouth corners are usually lowered is easily misjudged as showing a feeling of sadness or disgust, regardless of the person's actual feelings, even when the person is not actually sad.
  • the influence of such personal characteristics is the same in general emotion estimation based on vital data.
  • in contrast, in the emotion correction device of this embodiment, the correction reference value calculation unit can calculate the correction reference value based on the appearance frequency of the emotion intensity for each type of emotion, and the emotion information can be corrected based on the correction reference value. Therefore, according to the emotion correction device of the present invention, more accurate emotion estimation is possible, for example, in consideration of individual characteristics.
  • Embodiment 2 is an example of the emotion estimation device of the present invention.
  • FIG. 7 is a block diagram showing the configuration of an example of the emotion estimation device of this embodiment.
  • the emotion estimation device 20 of this embodiment includes, for example, a vital data acquisition unit 21, an emotion estimation unit 22, and an emotion correction unit 10.
  • the emotion correction unit 10 is, for example, the same as the emotion correction device 10 of Embodiment 1, and its description can be incorporated. Therefore, in the following description, "emotion correction unit 10" can be read as, for example, "emotion correction device 10".
  • the emotion estimation device 20 may include, for example, a storage unit.
  • the emotion estimation device 20 may be in the form of cloud computing, edge computing, or the like, in which at least one of the units is on the server and the other units are on the terminal, for example.
  • as another form, an emotion estimation system 200 may be mentioned in which the emotion estimation device 20, including the vital data acquisition unit 21 and the emotion estimation unit 22, is connected via the communication network 30 to the emotion correction device 10 of Embodiment 1, which serves as the emotion correction unit 10.
  • the vital data acquisition unit 21 acquires the subject's vital data
  • the emotion estimation unit 22 estimates the subject's emotion based on the vital data
  • the emotion correction unit 10 corrects the estimated emotion.
  • FIG. 8 illustrates a block diagram of the hardware configuration of the emotion estimation device 20.
  • the emotion estimation device 20 includes, for example, a CPU 201, a memory 202, a bus 203, a storage device 204, an input device 206, an output device 207, a communication device 208, and the like.
  • for each component of the emotion estimation device 20, the description of the corresponding component of the emotion correction device 10 can be incorporated.
  • Each part of emotion estimation device 20 is connected via bus 203 by each interface (I/F).
  • CPU 201 functions as vital data acquisition section 21 and emotion estimation section 22 .
  • the emotion correction unit 10 (emotion correction device 10) is connected to the bus 203, for example, but the present invention is not limited to this.
  • the emotion correction unit 10 may, for example, be connected via the communication network 30 by the communication device 208 and caused to execute each process of the emotion correction method of Embodiment 1.
  • alternatively, a program for this purpose may be stored in the storage device 204 or the memory 202, and the CPU 201 may function as the emotion correction unit 10 by reading and executing the program.
  • next, the emotion estimation method of this embodiment will be described based on the flowchart of FIG. 9. The emotion estimation method of this embodiment is implemented as follows, for example, using the emotion estimation device 20 of FIGS. 7 and 8.
  • the emotion estimation method of this embodiment is not limited to the use of the emotion estimation device 20 of FIGS. 7 and 8.
  • the vital data acquisition unit 21 of the emotion estimation device 20 acquires the subject's vital data (S11, vital data acquisition step).
  • the vital data acquisition unit 21 may acquire the vital data from various external devices, such as cameras and sensors, connected via a communication network, or may acquire the vital data from an external database in which the vital data is stored.
  • the vital data is, for example, as described above.
  • the emotion estimation unit 22 of the emotion estimation device 20 estimates the subject's emotion based on the vital data (S12, emotion estimation step).
  • the processing by the emotion estimation unit 22 can be performed, for example, in the same manner as the emotion estimation engine.
  • the emotion correction unit 10 performs S1 to S3 in the same manner as the processes S1 to S3 in the first embodiment (S13, emotion correction step).
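The flow of S11 to S13 can be wired together as a small pipeline, sketched below under the same assumptions as the earlier snippets. It reuses the hypothetical `correct_intensity` function shown above, and `sensor` and `engine` are stand-ins for the external camera/sensor and the emotion estimation engine; none of these names are APIs specified by the patent.

```python
class EmotionEstimationPipeline:
    """Hypothetical sketch of the emotion estimation device 20:
    vital data acquisition (S11) -> emotion estimation (S12) ->
    emotion correction per Embodiment 1 (S13)."""

    def __init__(self, sensor, engine, reference_values, max_intensity=10.0):
        self.sensor = sensor                      # vital data source (S11)
        self.engine = engine                      # emotion estimation engine (S12)
        self.reference_values = reference_values  # per-emotion correction reference values
        self.max_intensity = max_intensity        # maximum emotion intensity

    def run(self) -> dict:
        vital_data = self.sensor.read()           # S11: acquire vital data
        raw = self.engine.estimate(vital_data)    # S12: e.g. {"joy": 9.0, ...}
        return {                                  # S13: apply S1-S3 of Embodiment 1
            emotion: correct_intensity(score,
                                       self.reference_values.get(emotion, 0.0),
                                       self.max_intensity)
            for emotion, score in raw.items()
        }
```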
  • according to the emotion estimation device of this embodiment, the subject's emotion estimated based on vital data can be corrected. Therefore, the emotion estimation device of the present invention can, for example, estimate emotions in consideration of individual characteristics.
  • a first program of the present embodiment is a program for causing a computer to execute each step of the emotion correction method described above.
  • the program of the present embodiment is a program for causing a computer to execute an emotion information acquisition procedure, a correction reference value calculation procedure, and a correction procedure.
  • the emotion information acquisition procedure acquires emotion information based on vital data,
  • the emotion information includes emotion types and emotion intensity for each emotion type
  • the correction reference value calculation procedure calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion,
  • the correction procedure corrects the emotion information based on the correction reference value.
  • the first program of the present embodiment can also be said to be a program that causes a computer to function as an emotion information acquisition procedure, a correction reference value calculation procedure, and a correction procedure.
  • the description of the emotion correction device and emotion correction method of the present invention can be used.
  • “procedure” can be read as “processing”.
  • the program of this embodiment may be recorded on a computer-readable recording medium, for example.
  • the recording medium is, for example, a non-transitory computer-readable storage medium.
  • the recording medium is not particularly limited; examples include semiconductor media (e.g., memory cards), optical discs (e.g., CD-R/CD-RW, DVD-R/DVD-RW, BD-R/BD-RE), magneto-optical discs (MO), and floppy (registered trademark) disks (FD).
  • the program of the present embodiment may be distributed from an external computer, for example.
  • the "distribution" may be, for example, distribution via a communication network or distribution via a device connected by wire.
  • the program of this embodiment may be installed on the device to which it is distributed and then executed, or may be executed without being installed.
  • a second program of the present embodiment is a program for causing a computer to execute each step of the emotion estimation method described above.
  • the program of this embodiment is a program for causing a computer to execute a vital data acquisition procedure, an emotion estimation procedure, and an emotion correction procedure.
  • the vital data acquisition procedure acquires the subject's vital data
  • the emotion estimation procedure estimates the subject's emotion based on the vital data
  • the emotion correction procedure corrects the estimated emotion
  • the emotion correction procedure is executed by the first program.
  • the second program of the present embodiment can also be said to be a program that causes a computer to function as a vital data acquisition procedure, an emotion estimation procedure, and an emotion correction procedure.
  • the description of the emotion estimation device and emotion estimation method of the present invention can be used.
  • “procedure” can be read as “processing”.
  • the program of this embodiment may be recorded on a computer-readable recording medium, for example.
  • the recording medium is, for example, a non-transitory computer-readable storage medium.
  • the recording medium is not particularly limited; examples include semiconductor media (e.g., memory cards), optical discs (e.g., CD-R/CD-RW, DVD-R/DVD-RW, BD-R/BD-RE), magneto-optical discs (MO), and floppy (registered trademark) disks (FD).
  • the program of the present embodiment may be distributed from an external computer, for example.
  • the "distribution" may be, for example, distribution via a communication network or distribution via a device connected by wire.
  • the program of this embodiment may be installed on the device to which it is distributed and then executed, or may be executed without being installed.
  • (Appendix 1) An emotion correction device including an emotion information acquisition unit, a correction reference value calculation unit, and a correction unit, wherein
  • the emotion information acquisition unit acquires emotion information based on vital data
  • the emotion information includes emotion types and emotion intensity for each emotion type
  • the correction reference value calculation unit calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion
  • the correction unit corrects the emotion information based on the correction reference value.
  • the correction unit determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and if the emotion intensity is equal to or greater than the correction reference value, corrects the emotion intensity based on the correction reference value.
  • the vital data acquisition unit acquires the subject's vital data
  • the emotion estimation unit estimates an emotion of the subject based on the vital data
  • the emotion correction unit corrects the estimated emotion
  • an emotion estimation device, wherein the emotion correction unit is the emotion correction device according to any one of Appendices 1 to 6.
  • the emotion information acquisition step acquires emotion information based on vital data
  • the emotion information includes emotion types and emotion intensity for each emotion type
  • the correction reference value calculating step calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion
  • an emotion correction method, wherein the correction step corrects the emotion information based on the correction reference value.
  • (Appendix 9) The emotion correction method according to Appendix 8, wherein, in the correction reference value calculation step, the appearance frequencies of emotion intensities exceeding 0 are clustered for each type of emotion, and the highest emotion intensity in the cluster having the maximum area is set as the correction reference value.
  • the vital data acquisition step acquires the subject's vital data
  • the emotion estimation step estimates the subject's emotion based on the vital data
  • the emotion correction step corrects the estimated emotion
  • an emotion estimation method, wherein the emotion correction step is the emotion correction method according to any one of Appendices 8 to 13.
  • the emotion information acquisition step acquires emotion information based on vital data
  • the emotion information includes emotion types and emotion intensity for each emotion type
  • the correction reference value calculation step calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion
  • a program, wherein the correction procedure corrects the emotion information based on the correction reference value, and each of the above procedures is executed by a computer.
  • the program as described above, wherein the correction reference value calculation procedure clusters the appearance frequencies of emotion intensities exceeding 0 for each type of emotion, and sets the highest emotion intensity in the cluster having the maximum area as the correction reference value.
  • the vital data acquisition procedure acquires the subject's vital data
  • the emotion estimation procedure estimates the subject's emotion based on the vital data
  • the emotion correction procedure corrects the estimated emotion
  • a program, wherein the emotion correction procedure is executed by the program according to any one of Appendices 15 to 20.
  • (Appendix 22) A computer-readable recording medium recording the program according to any one of appendices 15 to 21.
  • according to the present invention, emotions can be corrected in consideration of the subject's characteristics. Therefore, the present invention is widely useful in various fields that utilize the estimation of individual emotions.
  • 10 emotion correction device (emotion correction unit)
  • 11 emotion information acquisition unit
  • 12 correction reference value calculation unit
  • 13 correction unit
  • 101 CPU
  • 20 emotion estimation device
  • 21 vital data acquisition unit
  • 22 emotion estimation unit
  • 201 CPU

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • Social Psychology (AREA)
  • Pathology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Provided is an emotion correction device for correcting emotion information upon taking characteristics of an individual into consideration. This emotion correction device includes an emotion information acquisition unit, a correction reference value calculation unit, and a correction unit. The emotion information acquisition unit acquires emotion information based on vital data. The emotion information includes the type of emotion and the emotional intensity for each type of emotion. The correction reference value calculation unit calculates, for each type of emotion, a correction reference value on the basis of the frequency of appearance of the emotional intensity. The correction unit corrects the emotion information on the basis of the correction reference value.

Description

Emotion correction device, emotion estimation device, emotion correction method, emotion estimation method, and program
 The present invention relates to an emotion correction device, an emotion estimation device, an emotion correction method, an emotion estimation method, and a program.
 In recent years, AI has been used to estimate a subject's emotions from vital data such as facial expressions and voice. As one such technique, Facial Action Coding Systems (FACS) is known (Non-Patent Document 1).
 FACS is a technique that encodes facial muscle movements, called Action Units (AU), for machine discrimination. However, when emotions are estimated by such a method, the accuracy of emotion estimation is lowered under the influence of the shape of an individual's face, the individual's expression habits, and the like, which are unrelated to the expression of emotion.
 Therefore, an object of the present invention is to provide an emotion correction device capable of correcting emotion information in consideration of the characteristics of the subject.
 In order to achieve the above object, the emotion correction device of the present invention includes an emotion information acquisition unit, a correction reference value calculation unit, and a correction unit.
 The emotion information acquisition unit acquires emotion information based on vital data.
 The emotion information includes emotion types and an emotion intensity for each emotion type.
 The correction reference value calculation unit calculates, for each type of emotion, a correction reference value based on the appearance frequency of the emotion intensity.
 The correction unit corrects the emotion information based on the correction reference value.
 The emotion estimation device of the present invention includes a vital data acquisition unit, an emotion estimation unit, and an emotion correction unit.
 The vital data acquisition unit acquires the subject's vital data.
 The emotion estimation unit estimates the subject's emotion based on the vital data.
 The emotion correction unit corrects the estimated emotion.
 The emotion correction unit is the emotion correction device of the present invention.
 The emotion correction method of the present invention includes an emotion information acquisition step, a correction reference value calculation step, and a correction step.
 The emotion information acquisition step acquires emotion information based on vital data.
 The emotion information includes emotion types and an emotion intensity for each emotion type.
 The correction reference value calculation step calculates, for each type of emotion, a correction reference value based on the appearance frequency of the emotion intensity.
 The correction step corrects the emotion information based on the correction reference value.
 The emotion estimation method of the present invention includes a vital data acquisition step, an emotion estimation step, and an emotion correction step.
 The vital data acquisition step acquires the subject's vital data.
 The emotion estimation step estimates the subject's emotion based on the vital data.
 The emotion correction step corrects the estimated emotion.
 The emotion correction step is the emotion correction method of the present invention.
 A first program of the present invention includes an emotion information acquisition procedure, a correction reference value calculation procedure, and a correction procedure.
 The emotion information acquisition procedure acquires emotion information based on vital data.
 The emotion information includes emotion types and an emotion intensity for each emotion type.
 The correction reference value calculation procedure calculates, for each type of emotion, a correction reference value based on the appearance frequency of the emotion intensity.
 The correction procedure corrects the emotion information based on the correction reference value.
 Each of the above procedures is executed by a computer.
 According to the present invention, emotions can be corrected in consideration of the subject's characteristics.
 FIG. 1 is a block diagram showing an example configuration of the emotion correction device of Embodiment 1. FIG. 2 is a block diagram showing an example of the hardware configuration of the emotion correction device of Embodiment 1. FIG. 3 is a flowchart showing an example of processing in the emotion correction device of Embodiment 1. FIGS. 4(A) and 4(B) are graphs showing an example of clustering results of emotion intensity for each emotion type. FIG. 5 is a flowchart showing a specific example of processing in the correction unit of the emotion correction device of Embodiment 1. FIG. 6 is a schematic diagram for explaining emotion estimation based on facial expressions. FIG. 7 is a block diagram showing an example configuration of the emotion estimation device of Embodiment 2. FIG. 8 is a block diagram showing an example of the hardware configuration of the emotion estimation device of Embodiment 2. FIG. 9 is a flowchart showing an example of processing in the emotion estimation device of Embodiment 2.
 An embodiment of the present invention will be described with reference to the drawings. The present invention is not limited to the following embodiments. In the figures below, the same reference numerals are given to the same parts. Unless otherwise specified, the descriptions of the embodiments can incorporate one another, and the configurations of the embodiments can be combined.
[Embodiment 1]
 FIG. 1 is a block diagram showing an example configuration of the emotion correction device 10 of this embodiment. As shown in FIG. 1, the device 10 includes an emotion information acquisition unit 11, a correction reference value calculation unit 12, and a correction unit 13. The device 10 may also include a storage unit, although not shown.
 The device 10 may be, for example, a single device including each of the above units, or a device whose units can be connected via a communication network. The device 10 can also be connected to an external device, described later, via the communication network. The communication network is not particularly limited; a known network can be used, and it may be wired or wireless. Examples of the communication network include the Internet, the WWW (World Wide Web), telephone lines, LAN (Local Area Network), SAN (Storage Area Network), DTN (Delay Tolerant Networking), LPWA (Low Power Wide Area), and L5G (local 5G). Examples of wireless communication include Wi-Fi (registered trademark), Bluetooth (registered trademark), local 5G, and LPWA. The wireless communication may be a form in which the devices communicate directly (Ad Hoc communication), infrastructure communication, or indirect communication via an access point. The device 10 may, for example, be incorporated in a server as a system. The device 10 may also be, for example, a personal computer (PC, e.g., desktop or notebook type), a smartphone, or a tablet terminal in which the program of the present invention is installed. The device 10 may also take a form such as cloud computing or edge computing, in which at least one of the units is on a server and the other units are on a terminal.
 FIG. 2 illustrates a block diagram of the hardware configuration of the device 10. The device 10 includes, for example, a CPU 101, a memory 102, a bus 103, a storage device 104, an input device 106, an output device 107, and a communication device 108. Each unit of the device 10 is interconnected via the bus 103 through respective interfaces (I/F).
 The CPU 101 operates in cooperation with the other components by means of controllers (a system controller, an I/O controller, etc.) and takes charge of the overall control of the device 10. In the device 10, the CPU 101 executes, for example, the program 105 of the present invention and other programs, and reads and writes various information. Specifically, for example, the CPU 101 functions as the emotion information acquisition unit 11, the correction reference value calculation unit 12, and the correction unit 13. The device 10 includes a CPU as a computing device, but may include other computing devices such as a GPU (Graphics Processing Unit) or an APU (Accelerated Processing Unit), or a combination of the CPU and these.
 The bus 103 can also be connected to external devices. Examples of the external devices include an emotion estimation device such as the emotion estimation device of the present invention described later, an external storage device (an external database, etc.), a printer, an external input device, an external display device, and an external imaging device. The device 10 can be connected to an external network (the communication network) by means of the communication device 108 connected to the bus 103, and can also be connected to other devices via the external network.
 The memory 102 is, for example, a main memory (main storage device). When the CPU 101 performs processing, the memory 102 reads various operating programs, such as the program of the present invention stored in the storage device 104 described later, and the CPU 101 receives the data from the memory 102 and executes the program. The main memory is, for example, a RAM (random access memory). The memory 102 may also be, for example, a ROM (read-only memory).
 The storage device 104 is also called an auxiliary storage device, in contrast to the main memory (main storage device). As described above, the storage device 104 stores operating programs including the program of the present invention. The storage device 104 may be, for example, a combination of a recording medium and a drive that reads from and writes to the recording medium. The recording medium is not particularly limited, and may be, for example, a built-in type or an external type; examples include HD (hard disk), CD-ROM, CD-R, CD-RW, MO, DVD, flash memory, and memory cards. The storage device 104 may also be, for example, a hard disk drive (HDD), in which a recording medium and a drive are integrated, or a solid state drive (SSD). If the device 10 includes the storage unit, the storage device 104 functions, for example, as the storage unit. The storage unit can store, for example, the acquired emotion information, the calculated correction reference value, and the like.
 In the device 10, the memory 102 and the storage device 104 can also store various information, such as log information, information acquired from an external database (not shown) or an external device, information generated by the device 10, and information used when the device 10 executes processing. In this case, the memory 102 and the storage device 104 may store at least one piece of information selected from the group consisting of, for example, the subject's emotion information, the subject's identification information, the subject's attribute information, the correction reference value, and the corrected emotion information, described later. At least part of the information may be stored in an external server other than the memory 102 and the storage device 104, or may be distributed and stored across a plurality of terminals using blockchain technology or the like.
 The device 10 further includes, for example, an input device 106 and an output device 107. Examples of the input device 106 include pointing devices such as a touch panel, a track pad, and a mouse; a keyboard; imaging means such as a camera and a scanner; card readers such as an IC card reader and a magnetic card reader; and voice input means such as a microphone. Examples of the output device 107 include display devices such as an LED display and a liquid crystal display; audio output devices such as a speaker; and a printer. In Embodiment 1, the input device 106 and the output device 107 are configured separately, but they may be configured integrally, like a touch panel display.
 つぎに、本実施形態の感情補正方法の一例を、図3のフローチャートに基づき説明する。本実施形態の感情補正方法は、例えば、図1又は図2の感情補正装置10を用いて、次のように実施する。なお、本実施形態の感情補正方法は、図1又は図2の感情補正装置10の使用には限定されない。 Next, an example of the emotion correction method of this embodiment will be described based on the flowchart of FIG. The emotion correction method of this embodiment is implemented as follows using, for example, the emotion correction device 10 of FIG. 1 or FIG. It should be noted that the emotion correction method of this embodiment is not limited to the use of the emotion correction device 10 of FIG. 1 or FIG.
 まず、感情情報取得部11により、バイタルデータに基づく感情情報を取得する(S1、感情情報取得工程)。前記感情情報は、例えば、対象者のバイタルデータに基づいて、対象者の感情が発生したと推定されたことを示す情報であり、感情の種類と、感情の種類毎の感情強度とを含む。感情情報取得部11は、通信回線網を介してバイタルデータに基づく感情推定が可能な感情推定エンジンから前記感情情報を取得してもよいし、前記感情推定エンジンが推定した感情情報が記録された外部のデータベースから前記感情情報を取得してもよい。また、感情情報取得部11は、例えば、対象者のバイタルデータを取得し、前記バイタルデータに基づき、対象者の感情を推定して前記対象者の感情情報を取得してもよい。この場合、感情情報取得部11は、例えば、感情推定部ともいう。 First, the emotion information acquisition unit 11 acquires emotion information based on vital data (S1, emotion information acquisition step). The emotion information is, for example, information indicating that the subject's emotion is presumed to have occurred based on the subject's vital data, and includes the emotion type and the emotion intensity for each emotion type. The emotion information acquisition unit 11 may acquire the emotion information from an emotion estimation engine capable of estimating emotions based on vital data via a communication network, or the emotion information estimated by the emotion estimation engine may be recorded. The emotion information may be acquired from an external database. The emotion information acquisition unit 11 may acquire the subject's vital data, estimate the subject's emotion based on the vital data, and acquire the subject's emotion information. In this case, the emotion information acquisition unit 11 is also called an emotion estimation unit, for example.
 前記バイタルデータは、例えば、対象者の表情筋の動作を符号化したAction Unit(AU)情報;対象者の表情等の画像情報;有声音および無声音等の発声情報;発話内容等のテキスト情報;血圧、心拍数;等があげられる。前記感情推定エンジンは、特に制限されず、例えば、前記バイタルデータの種類に応じて適宜選択できる。前記感情推定エンジンの具体例としては、例えば、Affdex、Microsoft Azure Face API、Amazon Rekognition、Realeyes、User Local 表情推定AI等の表情等の画像情報に基づいて感情を推定するもの;STEmotion、Empath、BeyondVerbal社の感情推定ソフトウェア、IBM Watson Tone Analyzer、User Local 音声感情認識AI等の音声情報に基づいて感情を推定するもの;User Local テキスト感情認識AI等のテキスト情報に基づいて自然言語処理にて感情を推定するもの;NEC 感情分析ソリューション等のバイタルデータに基づいて感情を推定するもの;等があげられる。前記バイタルデータ及び感情推定エンジンの組み合わせは、例えば、前記AU情報と、前記AU情報を利用可能なFacial Action Coding Systems(FACS)を利用する感情推定エンジンであることが好ましいが、これには限定されない。前記感情推定エンジンは、例えば、本装置10外部の構成であってもよいし、本装置10の構成であってもよい。以下の説明においては、前記AU情報に基づく感情推定を行う場合を例に挙げて説明する。 The vital data is, for example, Action Unit (AU) information that encodes the movement of facial muscles of the subject; image information such as facial expressions of the subject; vocalization information such as voiced and unvoiced sounds; text information such as utterance content; Blood pressure, heart rate; The emotion estimation engine is not particularly limited, and can be appropriately selected according to the type of the vital data, for example. Specific examples of the emotion estimation engine include those that estimate emotions based on image information such as facial expressions such as Affdex, Microsoft Azure Face API, Amazon Rekognition, Realeyes, User Local facial expression estimation AI; STEmotion, Empath, BeyondVerbal Emotion Estimation Software, IBM Watson Tone Analyzer, User Local Speech Emotion Recognition AI, etc. that estimate emotions based on voice information; User Local Text Emotion Recognition AI, etc. that estimate emotions by natural language processing based on text information Estimating; estimating emotions based on vital data such as NEC Emotion Analysis Solution; The combination of the vital data and the emotion estimation engine is preferably, for example, the AU information and an emotion estimation engine that uses Facial Action Coding Systems (FACS) that can use the AU information, but is not limited to this. . The emotion estimation engine may be, for example, a configuration outside the device 10 or a configuration of the device 10 . In the following description, the case of performing emotion estimation based on the AU information will be described as an example.
 The emotion types may be, for example, one type, or two or more types. Specific examples of the emotion types are not particularly limited and include, for example, emotions that can be estimated by the emotion estimation engine, such as joy, anger, sadness, fear, disgust, surprise, and contempt.
 The emotion intensity is, for example, information indicating the strength of each emotion type of the subject, and is also referred to as an emotion value or an emotion score. The emotion intensity is, for example, a score calculated by the emotion estimation engine based on the vital data. The score may be, for example, qualitative information (e.g., whether or not the emotion occurred), quantitative information (e.g., to what degree the emotion occurred), or information on the occurrence probability of each emotion. The score may also be, for example, an absolute evaluation for each emotion type, or a relative evaluation against other emotions (whether the emotion is stronger or weaker than the other emotions).
 The emotion information may include, for example, other information. Examples of the other information include identification information of the subject, attribute information of the subject, information on the date and time when the emotion was estimated, information on the place where the emotion was estimated, and information on the situation in which the emotion was estimated. The identification information of the subject is information capable of identifying the subject, such as the subject's name, a captured image of the subject, or the ID of the subject's terminal. The attribute information includes, for example, the subject's sex, age, affiliation, and position in an organization, and may be linked to the above-described identification information of the subject. The information on the date and time, place, and situation of the emotion estimation includes, for example, information on the date and time, place, and situation in which the vital data serving as the basis for the emotion estimation was acquired. When the emotion information includes the other information, the emotion information and the other information are preferably linked to each other.
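 For illustration only (this structure and its field names are hypothetical and not part of the disclosure), an emotion information record of the kind described above could be represented as follows:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class EmotionInfo:
    """One acquired emotion information record (hypothetical layout).

    intensities holds the emotion intensity for each emotion type,
    e.g. {"joy": 8, "sadness": 0}; the optional fields mirror the
    "other information" described above and are linked to the record.
    """
    intensities: dict[str, float]          # emotion type -> emotion intensity
    subject_id: str | None = None          # identification information
    attributes: dict[str, str] = field(default_factory=dict)  # sex, age, ...
    estimated_at: datetime | None = None   # when the vital data was acquired
    place: str | None = None
    situation: str | None = None
```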
 Next, the correction reference value calculation unit 12 calculates, for each emotion type, a correction reference value based on the appearance frequency of the emotion intensity (S2, correction reference value calculation step). The appearance frequency is, for example, the number of times an emotion appears (also referred to as the number of occurrences) per unit time. The unit time is not particularly limited, and any unit can be set according to the purpose. The correction reference value calculation unit 12 may calculate the correction reference value based on the appearance frequencies of all acquired emotion intensities, or based on the appearance frequencies of only some of them. In the latter case, the correction reference value calculation unit 12 can, for example, cluster, for each emotion type, the appearance frequencies of emotion intensities exceeding 0, and set the highest emotion intensity in the cluster having the largest area as the correction reference value. Since the correction reference value is calculated based on the appearance frequency of the individual subject's emotion intensities, it is also referred to as an individual bias, for example.
 A specific example of the processing by the correction reference value calculation unit 12 will be described with reference to FIG. 4; the processing is in no way limited to the following description. The correction reference value calculation unit 12, for example, clusters the appearance frequencies (numbers of appearances) of emotion intensities exceeding 0, in order of emotion intensity, based on the acquired emotion information. FIGS. 4(A) and 4(B) are graphs showing examples of clustering results of emotion intensity for each emotion type. In the clustering, for example, as shown in FIG. 4(A), each cluster may be the interval from a rising start point of the appearance frequency (or the minimum emotion intensity if there is no rising start point) to the next rising start point (also called the end point of the descent; or the maximum emotion intensity if there is no next rising start point). Alternatively, as shown in FIG. 4(B), an approximate curve of the appearance frequency may be created, and each cluster may be the interval from a rising start point of the approximate curve (or the minimum emotion intensity if there is no rising start point) to the next rising start point (also called the end point of the descent; or the maximum emotion intensity if there is no next rising start point). Next, the correction reference value calculation unit 12 selects, from these clusters, the cluster having the largest area, and sets, for example, the maximum emotion intensity in the selected cluster as the correction reference value. Specifically, from the first to third clusters shown in FIG. 4(A), the correction reference value calculation unit 12 can select the second cluster and set the maximum emotion intensity in the second cluster, namely "8", as the correction reference value. Likewise, from the first and second clusters shown in FIG. 4(B), it can select the second cluster and set the maximum emotion intensity in the second cluster, namely "12", as the correction reference value. In the following description, for convenience, the case where "8" is calculated as the correction reference value from the clusters shown in FIG. 4(A) is taken as an example.
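 A minimal sketch of this cluster-and-select computation for one emotion type follows (an illustration under stated assumptions, not the disclosed implementation; the rising-start-point detection in particular is one possible reading of FIG. 4(A)):

```python
def correction_reference_value(freq_by_intensity):
    """Correction reference value (individual bias) for one emotion type.

    freq_by_intensity maps each emotion intensity (> 0) to its appearance
    frequency. Clusters are the intervals between successive rising start
    points of the frequency series, as in FIG. 4(A); the highest intensity
    in the largest-area cluster is returned.
    """
    intensities = sorted(freq_by_intensity)
    freqs = [freq_by_intensity[i] for i in intensities]

    # Rising start points: indices from which the frequency starts to rise.
    starts = [k for k in range(len(freqs) - 1)
              if freqs[k] < freqs[k + 1] and (k == 0 or freqs[k] <= freqs[k - 1])]
    if not starts or starts[0] != 0:
        starts.insert(0, 0)               # fall back to the minimum intensity
    bounds = starts + [len(freqs) - 1]    # last cluster ends at the maximum

    # Approximate each cluster's area by the sum of its frequencies,
    # then return the highest intensity of the largest-area cluster.
    best_hi, best_area = 0, -1
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        area = sum(freqs[lo:hi + 1])
        if area > best_area:
            best_area, best_hi = area, hi
    return intensities[best_hi]
```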
 The device 10 may include the storage unit, and the storage unit may store the acquired emotion information. This makes it possible, for example, to build a database representing the occurrence frequency of emotions for each subject, that is, the individual characteristics of the subject. The emotion information preferably includes, for example, the identification information of the subject, and the storage unit preferably stores the identification information of the subject and the emotion information in association with each other. In this case, the correction reference value calculation unit 12 may calculate the correction reference value based on, for example, the emotion information acquired in S1 and the emotion information stored in the storage unit. According to this aspect, the storage unit can accumulate emotion information representing the subject's individual characteristics, so the subject's emotion can be corrected more accurately.
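 As a further illustration (a hypothetical layout, not a disclosed schema), the storage unit's per-subject accumulation could be kept as frequency histograms that feed directly into the sketch above:

```python
from collections import defaultdict

# Hypothetical store: subject ID -> emotion type -> {intensity: count}.
# Passing store[subject_id][emotion_type] to correction_reference_value()
# (see the sketch above) yields that subject's current individual bias.
store = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))

def record(subject_id, emotion_type, intensity):
    """Accumulate one acquired emotion observation (S1) for a subject."""
    if intensity > 0:                 # only intensities above 0 are clustered
        store[subject_id][emotion_type][intensity] += 1
```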
 Next, the correction unit 13 corrects the emotion information based on the correction reference value (S3, correction step). An example of the processing of the correction unit 13 will be described with reference to the flowchart of FIG. 5, which shows an example of the processing S3 (S31, S32, S33) by the correction unit 13. First, the correction unit 13 determines, for example, whether the emotion intensity of the acquired emotion information is equal to or greater than the correction reference value (S31). If the emotion intensity is less than the correction reference value (S31: No), the correction unit 13 corrects the emotion intensity to 0, for example (S32). If the emotion intensity is equal to or greater than the correction reference value (S31: Yes), the correction unit 13 corrects the emotion intensity based on the correction reference value, for example (S33). The correction based on the correction reference value can be performed by, but is not limited to, the following formula (1):

(emotion value finally output by the emotion estimation engine − correction reference value) / (maximum emotion intensity included in the emotion information − correction reference value) × maximum emotion intensity included in the emotion information … (1)
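 A minimal sketch of this correction step (formula (1) plus the S31 to S33 branching; the maximum intensity of 100 is an assumption for illustration and should match the engine actually used):

```python
def correct_intensity(raw, reference, max_intensity=100.0):
    """Correct one emotion intensity per S31-S33 and formula (1).

    raw:           emotion value finally output by the emotion estimation engine
    reference:     correction reference value for this emotion type
    max_intensity: maximum emotion intensity in the emotion information
                   (assumed strictly greater than reference)
    """
    if raw < reference:
        return 0.0                                        # S31: No -> S32
    # S31: Yes -> S33: rescale the remaining range back to the full scale
    return (raw - reference) / (max_intensity - reference) * max_intensity
```

 With the reference value "8" from FIG. 4(A) and a 0-100 scale, a raw value of 8 maps to 0, and a raw value of 54 maps to (54 − 8) / 92 × 100 ≈ 50.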
 As described above, emotion information estimated based on vital data is affected by individual characteristics. As a specific example, in emotion estimation based on the AU information, a person whose mouth corners are habitually raised, as shown in FIG. 6(A), is likely to be misjudged as expressing joy regardless of the person's actual emotion, even when the person is not actually pleased. Conversely, a person whose mouth corners are habitually lowered, as shown in FIG. 6(B), is likely to be misjudged as expressing sadness or disgust regardless of the person's actual emotion, even when the person is not actually sad. The influence of such individual characteristics applies not only to emotion estimation based on AU information but also to emotion estimation based on vital data in general. In the emotion correction device of the present invention, the correction reference value calculation unit calculates, for each emotion type, a correction reference value based on the appearance frequency of the emotion intensity, and the emotion information can be corrected based on the correction reference value. Therefore, according to the emotion correction device of the present invention, more accurate emotion estimation can be performed based on, for example, individual characteristics.
[Embodiment 2]
 Embodiment 2 is an example of the emotion estimation device of the present invention.
 The emotion estimation device of this embodiment will be described with reference to FIG. 7, which is a block diagram showing the configuration of an example of the emotion estimation device of this embodiment. As shown in FIG. 7(A), the emotion estimation device 20 of this embodiment includes, for example, a vital data acquisition unit 21, an emotion estimation unit 22, and an emotion correction unit 10. The emotion correction unit 10 is, for example, the same as the emotion correction device 10 of Embodiment 1, and its description is incorporated here. Accordingly, in the following description, "emotion correction unit 10" can be read as "emotion correction device 10". Although not shown, the emotion estimation device 20 may also include, for example, a storage unit.
 The emotion estimation device 20 may be, for example, a single device including each of the above units, or a device in which the units can be connected via a communication network. The emotion estimation device 20 can also be connected to an external device, described later, via a communication network; the communication network is, for example, as described above. The emotion estimation device 20 may be incorporated into a server as a system, for example. The emotion estimation device 20 may also be, for example, a personal computer (PC, e.g., desktop or notebook), a smartphone, or a tablet terminal on which the program of the present invention is installed. Furthermore, the emotion estimation device 20 may take the form of cloud computing, edge computing, or the like, in which at least one of the units is on a server and the other units are on a terminal. As a specific example, as shown in FIG. 7(B), there is a form of an emotion estimation system 200 in which the emotion estimation device 20 including the vital data acquisition unit 21 and the emotion estimation unit 22 is connected, via a communication network 30, to the emotion correction device 10 of Embodiment 1 serving as the emotion correction unit 10.
 In the emotion estimation device of Embodiment 2, for example, the vital data acquisition unit 21 acquires the subject's vital data, the emotion estimation unit 22 estimates the subject's emotion based on the vital data, and the emotion correction unit 10 corrects the estimated emotion.
 FIG. 8 illustrates a block diagram of the hardware configuration of the emotion estimation device 20. As shown in FIG. 8, the emotion estimation device 20 includes, for example, a CPU 201, a memory 202, a bus 203, a storage device 204, an input device 206, an output device 207, and a communication device 208. The description of each component of the emotion correction device 10 applies to the corresponding component of the emotion estimation device 20. The units of the emotion estimation device 20 are connected via the bus 203 through their respective interfaces (I/F). In the emotion estimation device 20, the CPU 201 functions as the vital data acquisition unit 21 and the emotion estimation unit 22. Although FIG. 8 shows a form in which the emotion correction unit 10 (emotion correction device 10) is connected to the bus 203, the present invention is not limited to this. In the emotion estimation device 20, the emotion correction unit 10 may, for example, be connected via the communication network 30 by the communication device 208; alternatively, a program for executing each process of the emotion correction method of Embodiment 1 may be stored in the storage device 204 or the memory 202, and the CPU 201 may function as the emotion correction unit 10 by reading and executing the program.
 Next, an example of the emotion estimation method of this embodiment will be described based on the flowchart of FIG. 9. The emotion estimation method of this embodiment is implemented, for example, using the emotion estimation device 20 of FIGS. 7 and 8 as follows, but is not limited to the use of the emotion estimation device 20 of FIGS. 7 and 8.
 First, the vital data acquisition unit 21 of the emotion estimation device 20 acquires the subject's vital data (S11, vital data acquisition step). The vital data acquisition unit 21 may, for example, acquire the vital data from various external devices such as cameras and sensors connected via a communication network, or from an external database in which the vital data is stored. The vital data is, for example, as described above.
 Next, the emotion estimation unit 22 of the emotion estimation device 20 estimates the subject's emotion based on the vital data (S12, emotion estimation step). The processing by the emotion estimation unit 22 can be performed, for example, in the same manner as the emotion estimation engine described above. Alternatively, the emotion estimation unit 22 may, for example, transmit the vital data acquired in S11 to an external emotion estimation engine via a communication network and estimate the subject's emotion by acquiring the emotion information estimated by that engine.
 Then, the emotion correction unit 10 performs S1 to S3 in the same manner as the processes S1 to S3 of Embodiment 1 (S13, emotion correction step).
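 Putting S11 to S13 together, a minimal end-to-end sketch follows (the engine interface and data shapes are assumptions, reusing correct_intensity from the earlier sketch):

```python
def estimate_corrected_emotions(vital_data, reference_by_type, estimate_emotions):
    """S11-S13: estimate emotions from one observation, then correct them.

    vital_data:        e.g., AU information acquired in S11
    reference_by_type: correction reference value per emotion type (S2)
    estimate_emotions: callable standing in for any emotion estimation
                       engine (S12); assumed to return a mapping of
                       emotion type -> intensity on a 0-100 scale
    """
    raw = estimate_emotions(vital_data)                          # S12
    return {emotion: correct_intensity(value, reference_by_type.get(emotion, 0.0))
            for emotion, value in raw.items()}                   # S13 (S31-S33)
```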
 According to the emotion estimation device of this embodiment, the subject's emotion estimated based on vital data can be corrected. The emotion estimation device of the present invention therefore enables, for example, emotion estimation that takes individual characteristics into account.
[Embodiment 3]
 The first program of this embodiment is a program for causing a computer to execute each step of the emotion correction method described above. Specifically, the program of this embodiment is a program for causing a computer to execute an emotion information acquisition procedure, a correction reference value calculation procedure, and a correction procedure.
The emotion information acquisition procedure acquires emotion information based on vital data,
the emotion information includes emotion types and an emotion intensity for each emotion type,
the correction reference value calculation procedure calculates, for each emotion type, a correction reference value based on the appearance frequency of the emotion intensity, and
the correction procedure corrects the emotion information based on the correction reference value.
 The first program of this embodiment can also be described as a program that causes a computer to function through the emotion information acquisition procedure, the correction reference value calculation procedure, and the correction procedure.
 For the first program of this embodiment, the descriptions of the emotion correction device and emotion correction method of the present invention are incorporated here. In each of the above procedures, "procedure" can be read as "process". The program of this embodiment may be recorded on, for example, a computer-readable recording medium, such as a non-transitory computer-readable storage medium. The recording medium is not particularly limited, and examples include a random access memory (RAM), a read-only memory (ROM), a hard disk (HD), a flash memory (e.g., an SSD (Solid State Drive), a USB flash memory, or an SD/SDHC card), an optical disc (e.g., CD-R/CD-RW, DVD-R/DVD-RW, or BD-R/BD-RE), a magneto-optical disk (MO), and a floppy (registered trademark) disk (FD). The program of this embodiment (also referred to as, for example, a programming product or a program product) may be distributed from, for example, an external computer. The "distribution" may be, for example, distribution via a communication network or distribution via a device connected by wire. The program of this embodiment may be installed on the device to which it is distributed and then executed, or may be executed without being installed.
[Embodiment 4]
 The second program of this embodiment is a program for causing a computer to execute each step of the emotion estimation method described above. Specifically, the program of this embodiment is a program for causing a computer to execute a vital data acquisition procedure, an emotion estimation procedure, and an emotion correction procedure.
The vital data acquisition procedure acquires the subject's vital data,
the emotion estimation procedure estimates the subject's emotion based on the vital data,
the emotion correction procedure corrects the estimated emotion, and
the emotion correction procedure is executed by the first program.
 The second program of this embodiment can also be described as a program that causes a computer to function through the vital data acquisition procedure, the emotion estimation procedure, and the emotion correction procedure.
 For the second program of this embodiment, the descriptions of the emotion estimation device and emotion estimation method of the present invention are incorporated here. In each of the above procedures, "procedure" can be read as "process". The description of recording media and distribution forms given above for the first program applies equally to the program of this embodiment.
 Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the above embodiments. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
 This application claims priority based on Japanese Patent Application No. 2022-029674 filed on February 28, 2022, the entire disclosure of which is incorporated herein.
<Appendix>
Some or all of the above-described embodiments can be described as the following notes, but are not limited to the following.
(Appendix 1)
including an emotion information acquisition unit, a correction reference value calculation unit, and a correction unit,
The emotion information acquisition unit acquires emotion information based on vital data,
The emotion information includes emotion types and emotion intensity for each emotion type,
The correction reference value calculation unit calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion,
The emotion correction device, wherein the correction unit corrects the emotion information based on the correction reference value.
(Appendix 2)
The emotion correction device according to appendix 1, wherein the correction reference value calculation unit clusters, for each type of emotion, the appearance frequencies of emotion intensities exceeding 0, and sets the highest emotion intensity in the cluster having the largest area as the correction reference value.
(Appendix 3)
including a storage unit,
The storage unit stores the acquired emotion information,
The emotion correction device according to appendix 1 or 2, wherein the correction reference value calculation unit calculates the correction reference value based on acquired emotion information and stored emotion information.
(Appendix 4)
The emotion correction device according to any one of appendices 1 to 3, wherein the correction unit determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity to 0 when the emotion intensity is less than the correction reference value.
(Appendix 5)
The emotion correction device according to any one of appendices 1 to 4, wherein the correction unit determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity based on the correction reference value when the emotion intensity is equal to or greater than the correction reference value.
(Appendix 6)
The emotion correction device according to any one of appendices 1 to 5, wherein the vital data is facial expression or voice of the subject.
(Appendix 7)
including a vital data acquisition unit, an emotion estimation unit, and an emotion correction unit,
The vital data acquisition unit acquires the subject's vital data,
The emotion estimation unit estimates an emotion of the subject based on the vital data,
The emotion correction unit corrects the estimated emotion,
The emotion correction unit is the emotion correction device according to any one of appendices 1 to 6,
emotion estimation device.
(Appendix 8)
including an emotion information acquisition step, a correction reference value calculation step, and a correction step,
The emotion information acquisition step acquires emotion information based on vital data,
The emotion information includes emotion types and emotion intensity for each emotion type,
The correction reference value calculating step calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion,
The emotion correction method, wherein the correction step corrects the emotion information based on the correction reference value.
(Appendix 9)
The emotion correction method according to appendix 8, wherein the correction reference value calculating step clusters, for each type of emotion, the appearance frequencies of emotion intensities exceeding 0, and sets the highest emotion intensity in the cluster having the largest area as the correction reference value.
(Appendix 10)
including a storage step;
The storing step stores the acquired emotion information,
The emotion correction method according to appendix 8 or 9, wherein the correction reference value calculating step calculates the correction reference value based on acquired emotion information and stored emotion information.
(Appendix 11)
The emotion correction method according to any one of appendices 8 to 10, wherein the correction step determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity to 0 when the emotion intensity is less than the correction reference value.
(Appendix 12)
The emotion correction method according to any one of appendices 8 to 11, wherein the correction step determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity based on the correction reference value when the emotion intensity is equal to or greater than the correction reference value.
(Appendix 13)
The emotion correction method according to any one of appendices 8 to 12, wherein the vital data is facial expression or voice of the subject.
(Appendix 14)
including a vital data acquisition process, an emotion estimation process, and an emotion correction process,
The vital data acquisition step acquires the subject's vital data,
The emotion estimation step estimates the subject's emotion based on the vital data,
The emotion correction step corrects the estimated emotion,
The emotion correction step is the emotion correction method according to any one of Appendices 8 to 13,
emotion estimation method.
(Appendix 15)
Including emotion information acquisition procedure, correction reference value calculation procedure, and correction procedure,
The emotion information acquisition step acquires emotion information based on vital data,
The emotion information includes emotion types and emotion intensity for each emotion type,
The correction reference value calculation step calculates a correction reference value based on the frequency of appearance of the emotion intensity for each type of emotion,
The correction procedure corrects the emotion information based on the correction reference value,
A program in which each of the above procedures is executed by a computer.
(Appendix 16)
The program according to appendix 15, wherein the correction reference value calculation procedure clusters, for each type of emotion, the appearance frequencies of emotion intensities exceeding 0, and sets the highest emotion intensity in the cluster having the largest area as the correction reference value.
(Appendix 17)
including a memory procedure;
The storing procedure stores the acquired emotional information,
The program according to appendix 15 or 16, wherein the correction reference value calculation procedure calculates the correction reference value based on acquired emotion information and stored emotion information.
(Appendix 18)
The program according to any one of appendices 15 to 17, wherein the correction procedure determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity to 0 when the emotion intensity is less than the correction reference value.
(Appendix 19)
The program according to any one of appendices 15 to 18, wherein the correction procedure determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity based on the correction reference value when the emotion intensity is equal to or greater than the correction reference value.
(Appendix 20)
The program according to any one of appendices 15 to 19, wherein the vital data is the subject's facial expression or voice.
(Appendix 21)
Including a vital data acquisition procedure, an emotion estimation procedure, and an emotion correction procedure,
The vital data acquisition step acquires the subject's vital data,
The emotion estimation procedure estimates the subject's emotion based on the vital data,
The emotion correction procedure corrects the estimated emotion,
A program, wherein the emotion correction procedure is executed by the program according to any one of appendices 15 to 20.
(Appendix 22)
A computer-readable recording medium recording the program according to any one of appendices 15 to 21.
 According to the present invention, emotions can be corrected in consideration of the subject's characteristics. The present invention is therefore widely useful in various fields that utilize the estimation of individual emotions.
10 Emotion correction device
11 Emotion information acquisition unit
12 Correction reference value calculation unit
13 Correction unit
101 CPU
102 Memory
103 Bus
104 Storage device
105 Program
106 Input device
107 Output device
108 Communication device
20 Emotion estimation device
21 Vital data acquisition unit
22 Emotion estimation unit
10 Emotion correction unit
201 CPU
202 Memory
203 Bus
204 Storage device
205 Program
206 Input device
207 Output device
208 Communication device
30 Communication network

Claims (22)

  1. An emotion correction device comprising an emotion information acquisition unit, a correction reference value calculation unit, and a correction unit, wherein
    the emotion information acquisition unit acquires emotion information based on vital data,
    the emotion information includes emotion types and an emotion intensity for each emotion type,
    the correction reference value calculation unit calculates, for each emotion type, a correction reference value based on the appearance frequency of the emotion intensity, and
    the correction unit corrects the emotion information based on the correction reference value.
  2. The emotion correction device according to claim 1, wherein the correction reference value calculation unit clusters, for each emotion type, the appearance frequencies of emotion intensities exceeding 0, and sets the highest emotion intensity in the cluster having the largest area as the correction reference value.
  3. The emotion correction device according to claim 1 or 2, further comprising a storage unit, wherein
    the storage unit stores the acquired emotion information, and
    the correction reference value calculation unit calculates the correction reference value based on the acquired emotion information and the stored emotion information.
  4. The emotion correction device according to any one of claims 1 to 3, wherein the correction unit determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity to 0 when the emotion intensity is less than the correction reference value.
  5. The emotion correction device according to any one of claims 1 to 4, wherein the correction unit determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity based on the correction reference value when the emotion intensity is equal to or greater than the correction reference value.
  6. The emotion correction device according to any one of claims 1 to 5, wherein the vital data is a facial expression or voice of a subject.
  7. An emotion estimation device comprising a vital data acquisition unit, an emotion estimation unit, and an emotion correction unit, wherein
    the vital data acquisition unit acquires vital data of a subject,
    the emotion estimation unit estimates an emotion of the subject based on the vital data,
    the emotion correction unit corrects the estimated emotion, and
    the emotion correction unit is the emotion correction device according to any one of claims 1 to 6.
  8. An emotion correction method comprising an emotion information acquisition step, a correction reference value calculation step, and a correction step, wherein
    the emotion information acquisition step acquires emotion information based on vital data,
    the emotion information includes emotion types and an emotion intensity for each emotion type,
    the correction reference value calculation step calculates, for each emotion type, a correction reference value based on the appearance frequency of the emotion intensity, and
    the correction step corrects the emotion information based on the correction reference value.
  9. The emotion correction method according to claim 8, wherein the correction reference value calculation step clusters, for each emotion type, the appearance frequencies of emotion intensities exceeding 0, and sets the highest emotion intensity in the cluster having the largest area as the correction reference value.
  10. The emotion correction method according to claim 8 or 9, further comprising a storage step, wherein
    the storage step stores the acquired emotion information, and
    the correction reference value calculation step calculates the correction reference value based on the acquired emotion information and the stored emotion information.
  11. The emotion correction method according to any one of claims 8 to 10, wherein the correction step determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity to 0 when the emotion intensity is less than the correction reference value.
  12. The emotion correction method according to any one of claims 8 to 11, wherein the correction step determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity based on the correction reference value when the emotion intensity is equal to or greater than the correction reference value.
  13. The emotion correction method according to any one of claims 8 to 12, wherein the vital data is a facial expression or voice of the subject.
  14. An emotion estimation method comprising a vital data acquisition step, an emotion estimation step, and an emotion correction step, wherein
    the vital data acquisition step acquires vital data of a subject,
    the emotion estimation step estimates an emotion of the subject based on the vital data,
    the emotion correction step corrects the estimated emotion, and
    the emotion correction step is performed by the emotion correction method according to any one of claims 8 to 13.
  15. A program comprising an emotion information acquisition procedure, a correction reference value calculation procedure, and a correction procedure, wherein
    the emotion information acquisition procedure acquires emotion information based on vital data,
    the emotion information includes emotion types and an emotion intensity for each emotion type,
    the correction reference value calculation procedure calculates, for each emotion type, a correction reference value based on the appearance frequency of the emotion intensity,
    the correction procedure corrects the emotion information based on the correction reference value, and
    each of the procedures is executed by a computer.
  16. The program according to claim 15, wherein the correction reference value calculation procedure clusters, for each emotion type, the appearance frequencies of emotion intensities exceeding 0, and sets the highest emotion intensity in the cluster having the largest area as the correction reference value.
  17. The program according to claim 15 or 16, further comprising a storage procedure, wherein
    the storage procedure stores the acquired emotion information, and
    the correction reference value calculation procedure calculates the correction reference value based on the acquired emotion information and the stored emotion information.
  18. The program according to any one of claims 15 to 17, wherein the correction procedure determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity to 0 when the emotion intensity is less than the correction reference value.
  19. The program according to any one of claims 15 to 18, wherein the correction procedure determines whether the emotion intensity of the acquired emotion information exceeds the correction reference value, and corrects the emotion intensity based on the correction reference value when the emotion intensity is equal to or greater than the correction reference value.
  20. The program according to any one of claims 15 to 19, wherein the vital data is the subject's facial expression or voice.
  21. A program comprising a vital data acquisition procedure, an emotion estimation procedure, and an emotion correction procedure, wherein
    the vital data acquisition procedure acquires vital data of a subject,
    the emotion estimation procedure estimates an emotion of the subject based on the vital data,
    the emotion correction procedure corrects the estimated emotion, and
    the emotion correction procedure is executed by the program according to any one of claims 15 to 20.
  22. A computer-readable recording medium recording the program according to any one of claims 15 to 21.

PCT/JP2022/045615 2022-02-28 2022-12-12 Emotion correction device, emotion estimation device, emotion compensation method, emotion estimation method, and program WO2023162410A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022029674 2022-02-28
JP2022-029674 2022-02-28

Publications (1)

Publication Number Publication Date
WO2023162410A1 true WO2023162410A1 (en) 2023-08-31

Family

ID=87765479

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/045615 WO2023162410A1 (en) 2022-02-28 2022-12-12 Emotion correction device, emotion estimation device, emotion compensation method, emotion estimation method, and program

Country Status (1)

Country Link
WO (1) WO2023162410A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010054568A (en) * 2008-08-26 2010-03-11 Oki Electric Ind Co Ltd Emotional identification device, method and program
JP2017073107A (en) * 2015-10-08 2017-04-13 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Control method for information presentation device, and information presentation device
JP2020103462A (en) * 2018-12-26 2020-07-09 トヨタ紡織株式会社 Emotion estimation device, environment providing system, vehicle, emotion estimation method, and information processing program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAKESHI SOHEI: "Correction of emotion estimation considering individual's characteristic facial expressions", PROCEEDINGS OF THE 84TH NATIONAL CONVENTION OF IPSJ, 17 February 2022 (2022-02-17), pages 4 - 4-22, XP093088087 *

Similar Documents

Publication Publication Date Title
US20180129647A1 (en) Systems and methods for dynamically collecting and evaluating potential imprecise characteristics for creating precise characteristics
KR102276415B1 (en) Apparatus and method for predicting/recognizing occurrence of personal concerned context
US9760767B1 (en) Rating applications based on emotional states
US20210271864A1 (en) Applying multi-channel communication metrics and semantic analysis to human interaction data extraction
WO2019017922A1 (en) Automated speech coaching systems and methods
US11741986B2 (en) System and method for passive subject specific monitoring
JP7160095B2 (en) ATTRIBUTE IDENTIFIER, ATTRIBUTE IDENTIFICATION METHOD, AND PROGRAM
KR20220018461A (en) server that operates a platform that analyzes voice and generates events
US20200245949A1 (en) Forecasting Mood Changes from Digital Biomarkers
WO2023162410A1 (en) Emotion correction device, emotion estimation device, emotion compensation method, emotion estimation method, and program
WO2021128847A1 (en) Terminal interaction method and apparatus, computer device, and storage medium
US20180342240A1 (en) System and method for assessing audio files for transcription services
JP4715704B2 (en) Speech recognition apparatus and speech recognition program
US11301615B2 (en) Information processing device using recognition difficulty score and information processing method
JP7485454B2 (en) Sign language translation processing device, sign language translation processing system, sign language translation processing method, program, and recording medium
JP2023073620A (en) Emotional movement estimate apparatus, emotional movement estimation method, program, and recording medium
WO2023176144A1 (en) Living body detection support device, facial authentication device, living body detection support method, facial authentication method, program, and recording medium
JP2023073619A (en) Emotional difference detection apparatus, emotional difference detection method, and program
US20210007704A1 (en) Detecting subjects with disordered breathing
JP2022077831A (en) Question estimation device, learned model generation device, question estimation method, production method of learned model, program and recording medium
JPWO2019167848A1 (en) Data conversion system, data conversion method and program
WO2023181272A1 (en) Information processing device, information processing method, and recording medium
US20170105681A1 (en) Method and device for non-invasive monitoring of physiological parameters
US20230130263A1 (en) Method For Recognizing Abnormal Sleep Audio Clip, Electronic Device
US20230148264A1 (en) Information processing apparatus, method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22928927

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2024502848

Country of ref document: JP

Kind code of ref document: A