WO2019146767A1 - Emotional analysis system - Google Patents

Emotional analysis system Download PDF

Info

Publication number
WO2019146767A1
WO2019146767A1 PCT/JP2019/002570
Authority
WO
WIPO (PCT)
Prior art keywords
subject
emotion
unit
message
analysis system
Prior art date
Application number
PCT/JP2019/002570
Other languages
French (fr)
Japanese (ja)
Inventor
久和 正岡
Original Assignee
久和 正岡
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 久和 正岡
Publication of WO2019146767A1 publication Critical patent/WO2019146767A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state

Definitions

  • the present disclosure relates to a system for analyzing a subject's emotions.
  • Patent Document 1 and Patent Document 2 disclose techniques for estimating a subject's emotion.
  • the purpose of this disclosure is to provide an emotion analysis system that analyzes and reports the subject's emotions. Another object of the present disclosure is to provide an emotion analysis system that intuitively reports the subject's emotions in a form anyone can understand. Yet another object is to provide an emotion analysis system that supports improvement of the subject's emotions. A further object is to provide an emotion analysis system that modifies subsequent improvement measures based on the subject's emotion and on the change in emotion after an improvement measure was provided.
  • the present disclosure includes the following configurations.
  • an emotion analysis system comprising: a detection unit for detecting the heart rate and body temperature of a subject; an acquisition unit for acquiring the heart rate and body temperature detected by the detection unit and analyzing the emotion of the subject; and a notification unit for giving notification of the emotion analyzed by the acquisition unit.
  • the notification unit classifies the emotions analyzed by the acquisition unit into a plurality of categories and comprises at least one of: a display unit that displays the classified emotions as visual information including at least one of color, numbers, characters, symbols, pictures, and patterns; a sound output unit that outputs them as auditory information including at least one of sound, voice, and music; or a tactile sensation output unit that outputs them as tactile information including vibration, shape, and surface roughness.
  • the notification unit notifies an improvement measure for improving the emotion analyzed by the acquisition unit.
  • the notification unit notifies a maintenance measure for maintaining the emotion analyzed by the acquisition unit.
  • the acquisition unit analyzes the subject's emotion based on the Russell circle, and when the emotion is determined to lie in the discomfort area in the left half of the Russell circle, the notification unit outputs at least one of visual information, auditory information, and tactile information that moves the subject's emotion from the left half toward the pleasure area in the right half.
  • the acquisition unit analyzes the subject's emotion based on the Russell circle, and the notification unit outputs at least one of visual information, auditory information, and tactile information that moves the subject's emotion toward the radial center of the Russell circle.
  • the notification unit outputs a message prompting a change from the subject's current action to a different action when improving the emotion, and/or a message that directly or indirectly gives a reward to the subject when maintaining the emotion.
  • An emotion analysis system comprising an acquisition unit for acquiring the heart rate and body temperature and analyzing the emotion, and a display unit for displaying the analyzed emotion.
  • the wearable terminal includes the detection unit and the display unit, and the portable terminal includes the acquisition unit and the display unit
  • the emotion analysis system further includes a communication unit on which the wearable terminal and the portable terminal communicate by either wireless communication or wired communication.
  • the portable terminal is provided with a plurality of verbal messages representing emotions, and a message matching the analyzed emotion is automatically displayed visually on the display units of the wearable terminal and the portable terminal, and/or output by voice through a sound output unit.
  • the subject can input a reply (a subject message) from the terminal in response to the message displayed on the wearable terminal, the portable terminal, or both, or on another separate display means, and the emotion analysis system comprises a conversation providing unit that enables the portable terminal to send a reply to that subject message.
  • the emotion analysis system 10 includes a detection unit 11 (DET), an acquisition unit 12 (ACC), and a notification unit 13 (NOT).
  • DET detection unit 11
  • ACC acquisition unit 12
  • NOT notification unit 13
  • the detection unit 11 for detecting the heart rate and the body temperature of the subject can be provided by the wearable terminal 1a worn on the subject or the wearable measurement instrument 1b worn on the subject.
  • the wearable terminal 1a can be provided, for example, by a detachable watch-type terminal.
  • the wearable measuring instrument 1b can be provided, for example, by a terminal taped to the skin, or an implant implanted subcutaneously.
  • the detection unit 11 accumulates the subject's heart rate and body temperature over a fixed period of time. Such data storage is realized by a non-transitory tangible memory device.
  • the acquisition unit 12 can be provided by a so-called computer system.
  • the computer systems can be arranged centrally or decentrally.
  • the acquisition unit 12 can be provided, for example, on the wearable terminal 1a worn on the subject.
  • the acquisition unit 12 can be provided on the wearable terminal 1a and various terminal devices 2a, 2b, and 3a (see FIGS. 6 and 9) disclosed in the embodiments described later.
  • the acquisition unit 12 may be provided in a device different from the subsequent notification unit 13.
  • one or more acquisition units 12 may be provided for one subject.
  • the acquisition unit 12 can be provided in a terminal device available to the user.
  • the notification unit 13 classifies the emotions analyzed by the acquisition unit 12 into a plurality of categories and reports them.
  • the notification unit 13 can be provided in the wearable terminal 1a.
  • the notification unit 13 can be provided to the wearable terminal 1a and various terminal devices 2a, 2b, and 3a disclosed in the embodiments described later.
  • One or more notification units 13 may be provided for one subject.
  • the notification unit 13 can be provided in a terminal device available to the user.
  • the notification unit 13 can include a display unit that displays classified emotions using visual information including at least one of color, numbers, characters, symbols, pictures, and patterns.
  • the notification unit 13 can include a sound output unit that outputs classified emotions as auditory information including at least one of sound, voice, and music.
  • the notification unit 13 can include a tactile sensation output unit that outputs classified emotions as tactile information including vibration, shape, and surface roughness.
  • the notification unit 13 can include at least one of a display unit, a sound output unit, and a tactile sensation output unit.
  • the notification unit 13 preferably includes at least a display unit. Furthermore, in this embodiment, it is desirable to provide a display that assigns different emotions to different colors and displays emotions in color, in order to convey emotions intuitively and intelligibly. The notification unit 13 may also notify an improvement measure for improving the emotion analyzed by the acquisition unit 12, and may provide a message supporting improvement of the subject's emotion through the display unit, the sound output unit, or the tactile sensation output unit.
  • This embodiment has a wearable terminal 1a provided with a display unit 1c for displaying the analyzed emotion.
  • the detection unit 11 may be provided as the wearable measurement device 1b, worn separately from the wearable terminal 1a on the skin of the subject's body, for example near the heart.
  • the wearable terminal 1a and the wearable measurement device 1b can be connected by any communication means, for example, wireless communication, so that heart rate and body temperature data can be transmitted and received mutually.
  • the communication means is provided by the communication unit 1d provided in the wearable terminal 1a and the communication unit 1e provided in the wearable measurement instrument 1b.
  • the emotion analysis method is performed, for example, as follows.
  • the wearable terminal 1a or the wearable measurement device 1b functions as the detection unit 11.
  • the detection unit 11 measures the heart rate and the skin temperature of the subject.
  • the acquisition unit 12 analyzes the subject's emotion based on the chart of FIG. 2 from the measured heart rate and skin temperature. In this embodiment, analysis is performed by the computer system of the wearable terminal 1a.
  • the notification unit 13 displays the emotion analyzed by the acquisition unit 12 on the display unit 1c of the wearable terminal 1a.
  • the display unit 1c may be a light emitter such as an LED, or may be a screen using liquid crystal or the like.
  • One display aspect uses the color corresponding to the emotion as the basic element, together with a number, symbol, or character indicating the emotion.
  • the wearable terminal 1a also has a tactile sensation output unit provided by a vibration motor (a so-called vibrator).
  • when the subject's emotion enters the caution zone or the danger zone shown in FIG. 2, the notification unit 13 of the wearable terminal 1a vibrates in a distinct pattern for each zone, so that the transition to the caution zone or the danger zone is intuitively indicated to the subject.
  • in the caution zone, the notification unit 13 lights the color of the emotion continuously; when the emotion enters the danger zone, the color is blinked intermittently.
  • the subject can objectively know his or her own emotion, and a person who looks at the notification unit 13, such as a supporter of the subject, can also know the subject's emotion.
  • the wearable type refers to a type that remains in constant contact with the subject's body and can detect data such as heart rate and body temperature without any deliberate act of measurement.
  • the emotion analysis system 10 includes the detection unit 11 (DET), the acquisition unit 12 (ACC), and the notification unit 13 (NOT).
  • the detection unit 11 measures the heartbeat and the temperature of the subject 31 (EXP).
  • the acquisition unit 12 includes a Russell circle calculation unit 14 (CAL) that calculates coordinates on the Russell circle, and an analysis unit 15 (ANA) that analyzes emotions using those coordinates as the main elements.
  • CAL Russell circle calculation unit 14
  • ANA analysis unit 15
  • CRE notification data generation unit 16
  • the notification data generation unit 16 generates notification data for notifying the analyzed emotion in a color that is easy for anyone to understand.
  • the emotion analysis system also includes an advice generation unit 17 (ADV) for functioning as an emotion analysis improvement system.
  • the advice generation unit 17 generates a visual or audio message to improve emotions. That is, the advice generation unit 17 generates a message effective for moving the current coordinates on the Russell circle toward the origin in the radial direction. If the current coordinates lie in the discomfort area in the left half of the circle, it generates a message effective for moving them toward the pleasure area in the right half.
  • These messages inform the subject of his or her feelings at that time and encourage some other action, for example: "You seem to be feeling unstable now." "Let's move away from where you are and take three deep breaths." "Let's stop what you are doing and take three deep breaths."
  • when the emotion is in the pleasure area, the advice generation unit 17 generates a message effective for maintaining the state in that area and, further, for moving the coordinates closer to the origin in the radial direction. These messages report the subject's emotion at that time, acknowledge the emotion-maintenance efforts so far, and encourage their continuation. For example: "You have been feeling calm for over two hours." "This is your third-longest record of maintaining your emotions; a great result." "How about having your favorite treat?" "Let's take three deep breaths to keep this feeling as long as possible."
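The branching just described, improvement messages for the discomfort half of the Russell circle and maintenance messages for the pleasure half, can be sketched as follows. This is only an illustrative reading, not the patented implementation: the coordinate convention (x < 0 meaning displeasure) and the one-entry message lists are assumptions, with texts taken from the examples quoted above.

```python
# Minimal sketch of the advice generation unit 17's branching, assuming the
# pleasure-displeasure axis is x, with x < 0 the discomfort (left) half.
IMPROVE = ["Let's stop what you are doing and take three deep breaths."]
MAINTAIN = ["Let's take three deep breaths to keep this feeling as long as possible."]

def pick_message(x: float) -> str:
    """Return an improvement message when the coordinates lie in the left
    half of the Russell circle, otherwise a maintenance message."""
    return IMPROVE[0] if x < 0 else MAINTAIN[0]
```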
  • the notification unit 13 not only indicates the analyzed emotion of the subject 31 but also notifies a message.
  • this message includes an action change message and / or a reward message. Additionally, the message may additionally include a recommendation action message.
  • the notification unit 13 outputs an action change message prompting a change from the current action of the subject 31 to another different action.
  • the action change message is given to the subject 31 directly or indirectly.
  • the action change message may be given to the supporter 32.
  • the supporter 32 having received the action change message can work on the subject 31 to change the action of the subject 31.
  • the notification unit 13 outputs a reward message that directly or indirectly gives a reward to the subject 31.
  • the reward message may be given directly to the subject 31. Further, the reward message may be given to the supporter 32. In this case, a reward may be given to the subject 31 from the supporter 32.
  • the award includes at least one of a word that compliments the subject 31, a favorite item that the subject 31 likes, an article that the subject 31 likes, an image, and the like.
  • the notification unit 13 may output a recommended action message indicating an action of the subject 31 effective to improve the feeling.
  • when maintaining an emotion, the notification unit 13 may output a recommended action message indicating an action of the subject 31 effective for maintaining the emotion.
  • the recommended action message is, for example, "take a deep breath".
  • the messages are preset and stored in the storage device. For example, a plurality of messages for returning anger's emotions to calm emotions are stored in advance.
  • the advice generation unit 17 generates messages in order of increasing strength, according to strength data set in advance for each message.
  • the advice generation unit 17 uses the data calculated by the Russell circle calculation unit 14 to generate a message.
  • the advice generation unit 17 uses the notification unit 13 to notify users, including the subject and the supporter.
  • this emotion analysis system comprises a learning unit 18 (LEA).
  • the learning unit 18 evaluates the effect of a previously generated (old) message based on the feedback information and changes subsequently generated (new) messages.
  • the change of the message based on such feedback information is called learning control.
  • the change of the message is performed by selecting among previously stored messages.
  • the change of the message may be given by, for example, a change in the strength of the message's wording.
  • the change of the message may also be implemented by additionally storing new messages obtained through data integration with another emotion analysis system, and using or switching to those new messages.
  • One piece of feedback information can be obtained from the change in the coordinates of the most recent emotion on the Russell circle after the old message. The effectiveness of the old message is evaluated by whether the change from the old coordinates to the new coordinates approaches the target coordinates or moves away from them.
  • the learning unit 18 obtains coordinate information from the Russell circle calculation unit 14.
  • One piece of feedback information is obtained from the subject's declaration, ie, input to the system.
  • One piece of feedback information is obtained by a declaration from the supporter 32 (SUP), that is, an input to the system. By making the declaration from the supporter 32 one of feedback information, objective learning becomes possible.
  • the declaration from the user including the subject 31 or the supporter 32 is input from the teaching unit 19 (TEA) to the learning unit 18.
  • the emotion improvement function by providing a message can be adapted to the individuality of the subject. Specifically, the message is adjusted so that emotion can be more easily improved. Also, by providing such a learning control function, the message is adjusted so that the desired change in emotion occurs.
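The feedback evaluation above, judging an old message by whether the emotion coordinates moved toward the target, might be sketched like this. The function name and the default target at the origin are illustrative assumptions; the text only fixes the criterion of approaching versus leaving the target coordinates.

```python
import math

def message_was_effective(old_xy, new_xy, target_xy=(0.0, 0.0)) -> bool:
    """Judge an old message effective when the most recent emotion
    coordinates on the Russell circle ended up closer to the target
    coordinates than the coordinates before the message."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(new_xy, target_xy) < dist(old_xy, target_xy)
```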
  • a part or the whole of the notification unit 13 may be provided by artificial intelligence 20 (ATI).
  • the emotion analysis system provides analysis of emotions, selection and display of messages, and transmission and reception of messages by a server system equipped with an artificial intelligence 20.
  • This server system may be placed in independent hardware or may be placed on the cloud.
  • the history information including the message provided from the artificial intelligence 20 to the subject 31 and the feedback information corresponding to the message are all recorded in the artificial intelligence 20.
  • the effect of a message maintaining the emotion in the safety zone, or the effect of a message returning the emotion in the caution zone or the danger zone to the safety zone, is automatically accumulated.
  • the artificial intelligence 20 integrates learning data of a plurality of emotion analysis systems 21 (EANAS) regarding a plurality of subjects 33 (EXPA).
  • EANAS emotion analysis systems 21
  • EXPA subjects 33
  • the answer from the artificial intelligence to a certain subject is selected from learning data collected for that specific subject 33 and from learning data collected from other subjects. This makes it possible to create and send more relevant verbal messages faster.
  • the artificial intelligence 20 includes an environment detection unit 22 (EDET) that detects an environment in which the subject 31 is placed.
  • the environment detection unit 22 detects information of the environment in which the subject 31 is placed.
  • the environmental information includes at least one of date and time, temperature, humidity, barometric pressure, ambient noise level and sound quality, ambient people, and ambient brightness.
  • the artificial intelligence 20 providing the learning unit 18 relates the change in coordinates, that is, the change in emotion, to the environment by monitoring the change in coordinates on the Russell's circle and the environment. Thereby, the change in emotion is predicted from the change in environment, and the predicted change in emotion is notified to the subject. As a result, it is possible to prevent excessive swings of emotions in advance. Furthermore, artificial intelligence 20 may automatically generate a new message by correlating the change of emotion with the change of environment.
  • a normal value STD is calculated for each of the heart rate and body temperature from the data detected from the subject.
  • the value measured while the subject is at rest may be used as the normal value; for the body temperature, for example, the value at the time of awakening may be used.
  • the maximum and minimum values of the subject's heart rate and body temperature are measured.
  • the maximum and minimum values are the maximum and minimum heart rate or body temperature in the daily life of the subject, for example the maximum or minimum value recorded during daily measurements. These values are regarded as provisional maximum and minimum values; from the next day onward, if a reading exceeds the provisional maximum or falls below the provisional minimum, the previous maximum or minimum value is updated. If no reading does, the maximum and minimum values up to the previous day are used without updating.
  • the normal value and the maximum and minimum values may be given as initial values. Then, from the following equation, the ratio of the current measurement value to the maximum value and the minimum value can be obtained for each of the heart rate and the temperature based on the normal value.
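The equation itself is not reproduced in the extracted text above. One plausible reading, kept here strictly as an assumption, scales the deviation of the current reading from the normal value by the span from the normal value to the recorded maximum (or minimum); the daily update rule for the provisional extremes is also sketched.

```python
def update_extremes(value: float, vmax: float, vmin: float) -> tuple[float, float]:
    # Daily rule from the text: a provisional extreme is replaced only when
    # a new reading goes beyond it; otherwise the previous values are kept.
    return max(value, vmax), min(value, vmin)

def ratio(current: float, normal: float, vmax: float, vmin: float) -> float:
    # Assumed form of the unreproduced equation: signed deviation from the
    # normal value, scaled by the span to the relevant extreme (roughly -1..1).
    span = (vmax - normal) if current >= normal else (normal - vmin)
    return 0.0 if span == 0 else (current - normal) / span
```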
  • the proportions thus determined are plotted on a graph with the heart rate on the vertical axis and the body temperature on the horizontal axis. Furthermore, the graph is superimposed on the Russell circle model.
  • in the Russell circle model, human emotions are classified into eight types: (1) arousal ARS, (2) distress DST, (3) misery MIS, (4) depression DEP, (5) sleepiness SLP, (6) contentment CNT, (7) pleasure PLS, and (8) excitement EXC.
  • these emotions are arranged in 45-degree steps on the circle: 0 degrees: pleasure, 45 degrees: excitement, 90 degrees: arousal, 135 degrees: distress, 180 degrees: misery, 225 degrees: depression, 270 degrees: sleepiness, 315 degrees: contentment.
  • the position of the plot on the circle model indicates one of the eight emotions. Therefore, in this embodiment, the emotion of the subject is analyzed as one of a plurality of categories based on the point determined by the subject's heart rate and body temperature on the Russell circle model.
  • the radius of the circle is divided, for example, into three zones according to the ratio of the heart rate and body temperature measurements to the maximum value MAX and the minimum value MIN: for example, 0% to 25% is a safety zone STB, 25% to 70% is a caution zone CAT, and 70% or more is a danger zone WAR.
  • the division ratio and the number of divisions may be changed, for example to four divisions, according to the subject's psychological and physical condition.
  • each emotion may be assigned a number, and its degree may be displayed as a percentage (for example, 65%). In the safety zone, however, the color of "pleasure" is displayed regardless of the emotion.
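Under the layout just described, eight emotions at 45-degree steps and a radius split at 25% and 70%, classification could look like the sketch below. The emotion labels follow the abbreviations used in the text (ARS, DST, MIS, DEP, SLP, CNT, PLS, EXC); the function signature and the [-1, 1] normalization are illustrative assumptions, not the patented algorithm.

```python
import math

# Eight emotions in 45-degree steps, 0 degrees = pleasure, counter-clockwise,
# following the ordering given in the text.
EMOTIONS = ["pleasure", "excitement", "arousal", "distress",
            "misery", "depression", "sleepiness", "contentment"]

def classify(temp_ratio: float, hr_ratio: float) -> tuple[str, str]:
    """Map the body-temperature ratio (x-axis) and heart-rate ratio
    (y-axis), each roughly in [-1, 1], to an (emotion, zone) pair."""
    angle = math.degrees(math.atan2(hr_ratio, temp_ratio)) % 360
    sector = int(((angle + 22.5) % 360) // 45)   # 45-degree sector index
    radius = min(math.hypot(temp_ratio, hr_ratio), 1.0)
    if radius < 0.25:
        zone = "safety"    # STB
    elif radius < 0.70:
        zone = "caution"   # CAT
    else:
        zone = "danger"    # WAR
    return EMOTIONS[sector], zone
```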
  • FIG. 3 shows a flow of processing S161 for analyzing and improving emotions.
  • the emotion analysis system detects biological information of the subject, such as heart rate, body temperature, blood oxygen concentration, and facial expression.
  • the emotion analysis system analyzes the subject's emotion using the detected biological information.
  • the emotion analysis system determines whether the analyzed emotion is bad or good. This judgment is based on the coordinates indicated by the subject's biological information on the Russell circle. In step S165 or step S167, the emotion analysis system gives a notification so that the subject can recognize his or her own emotion objectively.
  • the emotion analysis system provides a message to improve the emotion into a good emotion in step S166. That is, if the subject's emotion lies in the left half of the Russell circle, a message is provided to move the subject's emotion toward the right half of the Russell circle.
  • the emotion analysis system provides a message to the subject to maintain a good emotion in step S168 if the analyzed emotion is a good emotion such as pleasure or satisfaction. That is, if the subject's emotion is shown in the right half area of the Russell circle, a message is provided to maintain the subject's emotion in the right half area of the Russell circle.
  • the provided message can be regarded as a positive response to the subject, who is not merely informed.
  • the notification in step S165 or step S167 and the provision of the message in step S166 or step S168 are performed by the notification unit 13.
  • FIG. 4 shows the process from S162 to S167 in more detail.
  • the system inputs the subject's heart rate and body temperature.
  • the system accumulates heart rate and body temperature in step S152.
  • the system calculates the above-described normal value and the maximum and minimum values.
  • the system analyzes the subject's current emotion from the heart rate and the temperature in step S154.
  • the system reports the subject's emotion in step S155.
  • in step S156, the system identifies a color pre-assigned to the current emotion.
  • the system displays the identified color on the notification unit 13 in step S157.
  • the notification unit 13 may display not only the color of the analyzed emotion but also the Russell circle model. Furthermore, the notification unit 13 may display a plot based on the subject's heart rate and body temperature. Specifically, the notification unit 13 displays the Russell circle model as a line drawing, displays the plot as a point, and further displays the color indicating the emotion at that time as the background color. In step S158, the system selects and switches between continuous lighting and blinking of the background color according to the plot position on the Russell circle model. This processing for the emotion analysis method is performed repeatedly.
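Steps S156 to S158, looking up the emotion's pre-assigned color and then choosing continuous lighting or blinking by zone, can be condensed into a small sketch. The color table below is a hypothetical example (the patent assigns colors to emotions without fixing a mapping), and the safety-zone rule of always showing the "pleasure" color follows the embodiment's description elsewhere.

```python
# Hypothetical color assignments; the embodiment only requires that
# different emotions map to different colors.
EMOTION_COLORS = {"pleasure": "green", "distress": "purple", "misery": "blue"}

def display_mode(emotion: str, zone: str) -> tuple[str, str]:
    """Return (color, lighting mode) for the notification unit 13:
    the safety zone always shows the "pleasure" color, and only the
    danger zone blinks."""
    if zone == "safety":
        color = EMOTION_COLORS["pleasure"]
    else:
        color = EMOTION_COLORS.get(emotion, "white")
    mode = "blinking" if zone == "danger" else "continuous"
    return color, mode
```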
  • the subject 31 can objectively grasp his or her emotion and its degree. In particular, anyone can intuitively grasp the emotion from the color of the notification unit 13. Therefore, when the subject 31 has ASD (Autism Spectrum Disorder), it is possible to objectively recognize changes in one's emotions. Notably, the subject 31 can act on this objective recognition before the emotions become difficult to control, and so prevent them from becoming difficult to control. When the subject 31 is a typically developing person, countermeasures can be taken before his or her stress level rises. As a result, the emotion analysis system can reduce the psychological and mental burden on the subject 31 in daily life.
  • ASD Autism Spectrum Disorder
  • the emotion analysis system analyzes emotions, displays them objectively as visual information including color, and makes the subject 31 or the supporter 32 aware of them. Furthermore, the emotion analysis system not only reports the analyzed emotions but also makes the subject 31 and the supporter 32 aware of deterioration in the emotions.
  • the emotion analysis system not only makes the emotions recognized but also provides additional information to maintain good emotions and / or additional information to improve the deteriorated emotions.
  • to provide the additional information, a plurality of items of additional information may be stored in advance in a storage medium such as a semiconductor memory and then selected and reproduced. The additional information may also be learned in actual use, added by the user, or added or selected via a network such as the Internet.
  • the additional information is provided as information that can be understood by the user, such as text and voice.
  • the additional information may be in the form of an aid to draw the user's attention.
  • the additional information may include, for example, text and/or voice presented with a pictorial image, such as a cartoon character, that attracts the attention of the subject 31.
  • a pictorial image such as a cartoon character can draw the attention of the subject 31 even when the subject's 31 emotion is getting worse.
  • the supporter 32 can also grasp the emotion of the subject 31.
  • the supporter 32 includes a guardian of the subject 31, a helper of the subject 31, or an employee of a support organization. Therefore, it is possible to provide effective support and assistance that are in line with the feelings of the subject 31 and that are not wasted.
  • the second embodiment includes, in addition to the first embodiment, a portable terminal 2a such as a smartphone or a tablet.
  • the wearable terminal 1a, the wearable measuring instrument 1b, and the portable terminal 2a are connected by any communication means, for example, wireless communication, and can transmit and receive heart rate and body temperature data to each other.
  • an example of the wearable terminal 1a or the portable terminal and its display unit 1c is shown in the figure.
  • a circular portion 1c1 is present at the top of the display unit 1c, and the analyzed emotion is displayed in color here.
  • a message to the subject for the purpose of feeling improvement or feeling maintenance is displayed in the subject-oriented message column 1c2.
  • the subject's reply message in response is displayed in field 1c3.
  • the circular portion 1c1 may display the Russell circle model of FIG. 2, to which heart rate and temperature indices are added.
  • FIG. 8 shows an example in which the detection unit 11 and the display unit 1c are mounted on the wearable terminal 1a in the second embodiment.
  • the wired communication means 41 is, for example, a USB cable.
  • one example of the operation is to analyze the subject's emotion also with the portable terminal 2a.
  • the heart rate and skin temperature of the subject are measured by the wearable terminal 1a or the wearable measurement device 1b.
  • the emotion of the subject is analyzed by the portable terminal 2a based on the concept of FIG.
  • the analyzed emotion is displayed on both the portable terminal 2a and the wearable terminal 1a, or one of these display units.
  • Yet another operation example is to use another device 2b such as a screen of a personal computer or a television monitor instead of the portable terminal 2a or in addition to the portable terminal 2a.
  • the emotion analyzed by the portable terminal 2a is also displayed on another device 2b such as a screen of a personal computer or a television monitor. This enables the supporters around the subject to confirm the subject's emotions.
  • the colors of the subject's emotion displayed on the portable terminal 2a or on another device 2b, such as a personal computer screen or a television monitor, include red, orange, purple, yellow, blue, light blue, pink, and green.
  • the current emotion of the subject is displayed on the wearable terminal 1a, the portable terminal 2a, or another device 2b such as a personal computer screen or a television monitor, together with a verbal message indicating the emotion.
  • the portable terminal 2a is provided with an audio output unit and has a function of outputting verbal messages as audio. For example, if the analyzed emotion of the subject is "pain, 50%", a message such as "Now you are experiencing moderate pain." is displayed. A plurality of these messages are stored in advance in the portable terminal 2a, and an appropriate one is automatically selected and presented according to the analyzed emotion. These messages may be varied at fixed time intervals, for example every 10 minutes. In addition to the verbal message, a picture message may be displayed. The verbal message may also be conveyed to the subject by voice through the audio output unit.
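The selection and timed rotation of stored messages might be organized as below. The message table, key format, and function name are illustrative assumptions, with texts modeled on the "pain, 50%" example above; the caller would advance the cycle at the fixed interval (e.g. every 10 minutes).

```python
import itertools

# Hypothetical message store keyed by (emotion, degree in percent).
MESSAGES = {
    ("pain", 50): [
        "Now you are experiencing moderate pain.",
        "It's about time to take action to ease the pain.",
    ],
}

def message_stream(emotion: str, percent: int):
    """Cycle endlessly through the messages stored for this emotion level,
    so a different one can be shown at each fixed interval."""
    return itertools.cycle(MESSAGES.get((emotion, percent), ["..."]))
```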
  • a message is provided to prompt an action to maintain the state.
  • the message includes a language message or a picture message.
  • the message is displayed on the wearable terminal 1a, the portable terminal 2a, or another device 2b such as a screen of a personal computer or a television monitor.
  • a message for returning the emotion to the safe zone is provided.
  • the message includes a language message or a picture message.
• the message is displayed on the wearable terminal 1a, the portable terminal 2a, or another device 2b such as the screen of a personal computer or a television monitor. For example, if the subject's emotion is analyzed as "pain, 50%", a message such as "It's about time to take action to get rid of the pain." is displayed. These messages may be varied each time they are displayed, at fixed intervals, until the subject's emotion enters the safe zone.
• the system continuously obtains the heart rate and body temperature and continuously judges the emotion, and can thereby quantify whether, and to what extent, the displayed message had an effect on improving the subject's emotion.
• the system memorizes each displayed message and its effect, so that thereafter messages with a higher emotion-improving effect can be displayed sooner. This is called "learning" in this embodiment.
• the notification unit 13 reports an improvement measure for improving the emotion analyzed by the acquisition unit 12. Not only is the emotion displayed; the improvement measure is also reported. This allows users, including the subject and/or supporters, to improve the subject's emotion.
• the wording of the message, or the character through which the message is delivered, can be selected and changed according to the subject's preference: for example, for children, adults, or elderly people, for men or women, or in a friendly tone, an elder's tone, a grandchild's tone, a servant's tone, a teacher's tone, and so on.
• the conditions for these messages are stored in advance in the notification unit 13. The user selects the tone of the message, for example, via the operation unit of the portable terminal 2a.
• the subject can input a subject message as a reply or answer.
  • the wearable terminal 1a worn by the subject or the portable terminal 2a carried by the subject has a voice input function or a text input function.
  • the voice input function can be provided, for example, by a microphone and a voice recognition device.
• a text input function can be provided by a keyboard.
  • the portable terminal 2a can further send a message (reply) to the subject message.
  • the portable terminal 2a has a conversation providing unit for generating a reply to the subject message.
  • the conversation providing unit provides a reply in which the subject feels as if he or she is in conversation with a human.
• the conversation providing unit can use a variety of devices according to the degree of conversation it provides, from bot devices that merely select one of several prepared responses to devices called artificial intelligence. This lets the subject send and receive messages as if conversing with a human being.
  • the conversation providing unit may be a communication line communicably connecting the subject and the operator.
• A ninth embodiment is shown in FIG. 9.
• the analysis of emotions and the selection and display of messages in the second to eighth embodiments are performed by a server system 3a equipped with artificial intelligence.
• this server system may reside on independent hardware, or may be located on the cloud.
• the artificial intelligence learns the appropriate length, degree of abstraction, expression, and wording of messages for the individual subject, allowing the artificial intelligence to create messages.
• the artificial intelligence's learning data for a plurality of subjects is integrated.
• the learning data is collected via a network and made available as a data set usable for a particular subject.
• the reply to a subject is selected by the artificial intelligence from among learning data collected from a plurality of other subjects.
• the use of the emotion analysis system by a plurality of subjects, the integration of the plural sets of learning data, and the use of the integrated learning data for a specific subject are included. This makes it possible to create and send more appropriate language messages faster.
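The "learning" behavior described in these operation examples (memorize each displayed message together with its measured effect, then prefer the messages that improved the emotion most) can be sketched as follows. This is an illustrative model only, not the disclosed implementation; the class name, the scoring scheme, and the example messages are assumptions.

```python
class MessageLearner:
    """Tracks how much each improvement message moved the subject's
    emotion toward the safe zone, and prefers effective messages."""

    def __init__(self, messages):
        # accumulated effect score per message; start neutral
        self.scores = {m: 0.0 for m in messages}

    def record_effect(self, message, distance_before, distance_after):
        # a positive effect means the emotion moved closer to the
        # safe zone (smaller radial distance on the Russell circumplex)
        self.scores[message] += distance_before - distance_after

    def pick(self):
        # choose the message with the highest accumulated effect
        return max(self.scores, key=self.scores.get)


learner = MessageLearner([
    "Take three deep breaths.",
    "Step away from where you are.",
])
learner.record_effect("Take three deep breaths.", 0.8, 0.5)       # helped
learner.record_effect("Step away from where you are.", 0.8, 0.9)  # did not
best = learner.pick()  # -> "Take three deep breaths."
```

Integrating learning data from a plurality of subjects, as in the ninth embodiment, would amount to merging the score tables of several such learners before picking a message for a specific subject.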

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Hospice & Palliative Care (AREA)
  • Pathology (AREA)
  • Developmental Disabilities (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Physics & Mathematics (AREA)
  • Child & Adolescent Psychology (AREA)
  • Biophysics (AREA)
  • Educational Technology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Provided is an emotional analysis system that analyzes the emotions of a subject and then performs a notification concerning the emotions. The emotional analysis system (10) comprises a detection unit (11) that detects the heart rate and temperature of the subject. The emotional analysis system (10) comprises an acquisition unit (12) that acquires the heart rate and temperature detected by the detection unit (11), and analyzes the emotions of the subject. The emotional analysis system (10) comprises a notification unit (13) that categorizes the emotions obtained as a result of the analysis by the acquisition unit (12) into a plurality of categories, and performs a notification concerning the emotions. The notification unit (13) comprises a display unit. The emotions are categorized into a plurality of categories. The plurality of emotions are allocated a plurality of different colors. The notification unit (13) uses the display unit to display the colors corresponding to the emotions.

Description

Emotion analysis system
Cross-reference to related applications
This application is based on Japanese Patent Application No. 2018-11827 filed on January 26, 2018, the contents of which are incorporated herein by reference.
The present disclosure relates to a system for analyzing a subject's emotions.
Patent Document 1 and Patent Document 2 disclose techniques for estimating a subject's emotion.
Patent Document 1: JP 2017-86992 A. Patent Document 2: JP 2012-059107 A.
An object of this disclosure is to provide an emotion analysis system that analyzes a subject's emotions and reports the result. Another object is to provide an emotion analysis system that reports the subject's emotions intuitively, in a form anyone can understand. Yet another object is to provide an emotion analysis system that supports improvement of the subject's emotions. A further object is to provide an emotion analysis system that revises subsequent improvement measures based on the subject's emotion and on how the emotion changed after an earlier improvement measure was provided.
To solve the above problems, the present disclosure includes the following configurations.
An emotion analysis system comprising: a detection unit that detects a subject's heart rate and body temperature; an acquisition unit that acquires the heart rate and body temperature detected by the detection unit and analyzes the subject's emotion; and a notification unit that reports the emotion analyzed by the acquisition unit to a user.
In the above emotion analysis system, the notification unit classifies the emotions analyzed by the acquisition unit into a plurality of categories and comprises at least one of: a display unit that presents the classified emotion as visual information including at least one of colors, numbers, characters, symbols, pictures, and patterns; a sound output unit that outputs it as auditory information including at least one of sound, voice, and music; and a tactile output unit that outputs it as tactile information including vibration, shape, and surface roughness.
In the above emotion analysis system, the notification unit reports an improvement measure for improving the emotion analyzed by the acquisition unit.
In the above emotion analysis system, the notification unit reports a maintenance measure for maintaining the emotion analyzed by the acquisition unit.
In the above emotion analysis system, the acquisition unit analyzes the subject's emotion based on the Russell circumplex, and when the emotion is judged to lie in the unpleasant region in the left half of the circumplex, the notification unit outputs at least one of visual, auditory, and tactile information intended to shift the subject's emotion from the left half toward the pleasant region in the right half.
In the above emotion analysis system, the acquisition unit analyzes the subject's emotion based on the Russell circumplex, and the notification unit outputs at least one of visual, auditory, and tactile information intended to shift the subject's emotion toward the radial center of the circumplex.
An emotion analysis system in which the notification unit outputs, when improving an emotion, a message prompting the subject to change from the current action to a different action, and/or outputs, when maintaining an emotion, a message that directly or indirectly rewards the subject.
An emotion analysis system having a wearable terminal worn by the subject and a portable terminal that exchanges electronic signals and data with that terminal wirelessly or by wire, the system comprising: a detection unit that detects the subject's heart rate and body temperature; an acquisition unit that acquires the heart rate and body temperature and analyzes the emotion; and display units that display the analyzed emotion. The wearable terminal carries the detection unit and a display unit, the portable terminal carries the acquisition unit and a display unit, and the wearable terminal and the portable terminal further have communication units that communicate by either wireless or wired communication.
An emotion analysis system in which the portable terminal stores a plurality of language messages expressing emotions, automatically displays the message matching the analyzed emotion on the display units of the wearable terminal and the portable terminal, and/or comprises a sound output unit that outputs the message by voice.
An emotion analysis system in which, in response to a message displayed on the wearable terminal, the portable terminal, or both, or on another separate display means, the subject can input a subject message as a reply or answer from the terminal, and which comprises a conversation providing unit that allows the portable terminal to send a further reply to the subject message.
FIG. 1 shows the basic configuration of the emotion analysis system. FIG. 2 is the Russell circumplex used to analyze emotions from the subject's heart rate and skin temperature. FIG. 3 is a flowchart showing the basic functions of the emotion analysis system. FIG. 4 is a flowchart showing the emotion analysis method. FIG. 5 is a block diagram of the first embodiment. FIG. 6 is a block diagram of the second embodiment. FIG. 7 shows examples of the wearable terminal or portable terminal and its display unit. FIG. 8 shows examples of the wearable terminal and portable terminal and their connection. FIG. 9 is a block diagram of the ninth embodiment.
Embodiments of the present disclosure will be described below with reference to the drawings. In the following embodiments, parts that are identical or equivalent to each other are given the same reference numerals.
(First Embodiment)
The emotion analysis system according to the first embodiment will be described with reference to FIGS. 1 to 5.
The basic configuration of the emotion analysis system is shown in FIG. 1. An emotion analysis system that also functions as an emotion improvement system is shown here, but a configuration without that function may also be used. As shown in FIG. 1, the emotion analysis system 10 (EANA) includes a detection unit 11 (DET), an acquisition unit 12 (ACC), and a notification unit 13 (NOT).
The detection unit 11, which detects the subject's heart rate and body temperature, can be provided by a wearable terminal 1a worn by the subject or by a wearable measurement device 1b worn by the subject. The wearable terminal 1a can be provided, for example, by a detachable wristwatch-type terminal. The wearable measurement device 1b can be provided, for example, by a terminal taped to the skin or by an implant embedded under the skin. The detection unit 11 accumulates the subject's heart rate and body temperature over a fixed period. Such data accumulation is realized by a non-transitory tangible memory device.
The acquisition unit 12 can be provided by a so-called computer system. The computer system can be arranged centrally or in a distributed fashion. The acquisition unit 12 can be provided, for example, on the wearable terminal 1a worn by the subject. It can also be provided on the wearable terminal 1a and on the various terminal devices 2a, 2b, and 3a disclosed in the embodiments described later (see FIGS. 6 and 9). The acquisition unit 12 may be provided in a device separate from the downstream notification unit 13. One or more acquisition units 12 may be provided for a single subject. The acquisition unit 12 can be provided in a terminal device available to the user.
The notification unit 13 classifies the emotions analyzed by the acquisition unit 12 into a plurality of categories and reports them. The notification unit 13 can be provided on the wearable terminal 1a, and also on the various terminal devices 2a, 2b, and 3a disclosed in the embodiments described later. One or more notification units 13 may be provided for a single subject. The notification unit 13 can be provided in a terminal device available to the user.
The notification unit 13 can include a display unit that displays the classified emotion as visual information including at least one of colors, numbers, characters, symbols, pictures, and patterns. It can include a sound output unit that outputs the classified emotion as auditory information including at least one of sound, voice, and music. It can include a tactile output unit that outputs the classified emotion as tactile information including vibration, shape, and surface roughness. The notification unit 13 can include at least one of the display unit, the sound output unit, and the tactile output unit.
The notification unit 13 preferably includes at least a display unit. Further, in this embodiment, so that anyone can understand the emotion intuitively, it is desirable to provide a display that assigns different emotions to different colors and shows the emotion as a color. The notification unit 13 may also report an improvement measure for improving the emotion analyzed by the acquisition unit 12, and may provide a message supporting improvement of the subject's emotion through the display unit, the sound output unit, or the tactile output unit.
As shown in FIG. 5, this embodiment has a wearable terminal 1a provided with a display unit 1c that displays the analyzed emotion. The detection unit 11 may also be worn, as the wearable measurement device 1b, separately from the wearable terminal 1a on the subject's body, for example on the skin near the heart. In this case the wearable terminal 1a and the wearable measurement device 1b are connected by some communication means, for example wireless communication, so that heart rate and body temperature data can be exchanged. The communication means is provided by a communication unit 1d in the wearable terminal 1a and a communication unit 1e in the wearable measurement device 1b.
The emotion analysis method is performed, for example, as follows. The wearable terminal 1a or the wearable measurement device 1b functions as the detection unit 11. The detection unit 11 measures the subject's heart rate and skin temperature. From the measured heart rate and skin temperature, the acquisition unit 12 analyzes the subject's emotion based on the chart of FIG. 2. In this embodiment, the analysis is performed by the computer system of the wearable terminal 1a.
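The disclosure does not specify the exact mapping from the two measurements to a point on the Russell circumplex of FIG. 2, but the analysis step can be sketched as follows. The baselines, spans, and the assumption that arousal tracks heart rate while valence tracks skin temperature are all illustrative, not taken from the source.

```python
import math

def russell_coordinates(heart_rate, skin_temp,
                        hr_baseline=70.0, hr_span=50.0,
                        temp_baseline=33.0, temp_span=2.0):
    """Map heart rate and skin temperature to a point on the
    Russell circumplex (valence on the x-axis, arousal on the
    y-axis), each normalized to roughly [-1, 1].

    The constants and the measurement-to-axis assignment are
    assumptions made for illustration only."""
    arousal = max(-1.0, min(1.0, (heart_rate - hr_baseline) / hr_span))
    valence = max(-1.0, min(1.0, (skin_temp - temp_baseline) / temp_span))
    radius = math.hypot(valence, arousal)  # distance from the calm center
    return valence, arousal, radius


v, a, r = russell_coordinates(heart_rate=95, skin_temp=32.0)
# v < 0: unpleasant (left half); a > 0: aroused (upper half)
```

The radial distance `r` is what the later zone logic (safe, caution, danger) and the advice generation would operate on.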
The notification unit 13 displays the emotion analyzed by the acquisition unit 12 on the display unit 1c of the wearable terminal 1a. The display unit 1c may be a light emitter such as an LED, or a screen using liquid crystal or the like. The display takes the color corresponding to the emotion as its basic element, supplemented by numbers, symbols, characters, and the like indicating the emotion.
The wearable terminal 1a also has a tactile output unit provided by a motor called a vibrator. When the subject's emotion enters the caution zone or the danger zone of FIG. 2, the notification unit 13 of the wearable terminal 1a vibrates briefly in a distinct pattern for each zone, so that the transition into the caution zone or danger zone is conveyed to the subject intuitively.
In the caution zone, the notification unit 13 lights the color of the emotion in that zone continuously; upon entering the danger zone, it blinks the color, displaying it intermittently. In this way the subject can objectively know his or her own emotion, and anyone looking at the notification unit 13, such as a supporter of the subject, can know the subject's emotion.
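The zone-dependent display behavior just described (continuous light in the caution zone, blinking in the danger zone) can be sketched as a small decision function. The zone thresholds on the circumplex radius are illustrative assumptions; the disclosure defines the zones only graphically in FIG. 2.

```python
def display_mode(radius, caution=0.5, danger=0.8):
    """Choose how the emotion color is shown from the radial
    distance on the Russell circumplex: continuous light in the
    caution zone, blinking in the danger zone. The numeric
    thresholds are assumed for illustration."""
    if radius >= danger:
        return "blink"   # danger zone: intermittent display
    if radius >= caution:
        return "solid"   # caution zone: continuous display
    return "off"         # safe zone: no alert indication
```

A terminal's display loop would call this on each new measurement and drive the LED or screen accordingly.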
Here, "wearable" refers to a device kept in constant contact with the subject's body so that data such as heart rate and body temperature can be detected even when the subject has no intention of taking a measurement.
<Basic configuration of the system>
As described above, the emotion analysis system 10 (EANA) includes the detection unit 11 (DET), the acquisition unit 12 (ACC), and the notification unit 13 (NOT). The detection unit 11 measures the heartbeat and body temperature of the subject 31 (EXP). The acquisition unit 12 has a Russell circumplex calculation unit 14 (CAL) that calculates coordinates on the Russell circumplex, an analysis unit 15 (ANA) that analyzes the emotion with the circumplex coordinates as its main input, and a notification data generation unit 16 (CRE) that generates notification data from the analyzed emotion. The notification data generation unit 16 generates notification data for reporting the analyzed emotion in colors that anyone can easily understand.
The emotion analysis system also includes an advice generation unit 17 (ADV) so that it can function as an emotion improvement system. The advice generation unit 17 generates a visual or audio message for improving the emotion. That is, it generates a message considered effective for moving the current coordinates on the Russell circumplex toward the origin in the radial direction. If the current coordinates lie in the unpleasant region in the left half of the circumplex, it generates a message considered effective for moving them toward the pleasant region in the right half. These messages tell the subject his or her current emotion and encourage some other action, for example: "You seem to be feeling unsettled right now." "Step away from where you are and take three deep breaths." "Pause what you are doing and take three deep breaths."
When the current coordinates on the Russell circumplex are in the pleasant region in the right half, the advice generation unit 17 generates a message considered effective for keeping the coordinates in that region and, further, for moving them toward the origin in the radial direction. These messages tell the subject his or her current emotion, acknowledge the effort made so far to maintain it, and encourage its continuation, for example: "You have stayed calm for more than two hours now." "This is your third-longest record of maintaining your emotion. A great achievement." "How about having your usual favorite snack?" "Take three deep breaths to keep this state of mind as long as possible."
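The two advice rules above (unpleasant left half: prompt a different action; pleasant right half: acknowledge and maintain, nudging toward the center) can be sketched as a selection function. The messages, the inner-zone threshold, and the function name are illustrative assumptions rather than the disclosed implementation.

```python
def choose_advice(valence, radius, inner=0.5):
    """Pick an advice message from the Russell circumplex position:
    negative valence (left half) prompts a change of action; positive
    valence (right half) acknowledges the state and encourages keeping
    it, with a calming nudge when intensity is still high. The message
    texts and the 'inner' threshold are assumed for illustration."""
    if valence < 0:
        # unpleasant region: prompt a different action
        return "Pause what you are doing and take three deep breaths."
    if radius > inner:
        # pleasant but intense: encourage moving toward the calm center
        return "Take three deep breaths to keep this state of mind."
    # pleasant and calm: acknowledge and reward the maintained state
    return "You have stayed calm. A great achievement."
```

In the full system, the advice generation unit 17 would pass the chosen message to the notification unit 13 for display or voice output.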
As a result, the notification unit 13 not only indicates the analyzed emotion of the subject 31 but also reports a message. This message includes an action change message and/or a reward message, and may additionally include a recommended action message.
To improve the analyzed emotion, the notification unit 13 outputs an action change message prompting a change from the subject 31's current action to a different action. The action change message is given to the subject 31 directly or indirectly. It may also be given to a supporter 32, who, upon receiving it, can encourage the subject 31 to change the action.
To maintain the analyzed emotion, the notification unit 13 outputs a reward message that directly or indirectly rewards the subject 31. The reward message may be given directly to the subject 31, or to the supporter 32, in which case the supporter 32 may give the reward to the subject 31. The reward includes at least one of words praising the subject 31, a treat the subject 31 likes, an article the subject 31 likes, an image, and the like.
When improving the emotion, the notification unit 13 may output a recommended action message indicating an action of the subject 31 that is effective for improving the emotion. Likewise, when maintaining the emotion, it may output a recommended action message indicating an action effective for maintaining it. The recommended action message is exemplified by "deep breathing".
The messages are set in advance and stored in a storage device. For example, a plurality of messages for returning an angry emotion to a calm one is stored in advance. The advice generation unit 17 generates messages so as to raise their strength in order, for example according to strength data set in advance for each message. To generate a message, the advice generation unit 17 uses the data calculated by the Russell circumplex calculation unit 14, and it uses the notification unit 13 to report the message to users, including the subject and supporters.
The emotion analysis system further includes a learning unit 18 (LEA). The learning unit 18 evaluates the effectiveness of a previously generated message based on feedback information and changes the messages generated subsequently. Changing messages based on such feedback information is called learning control. A message change is performed by selecting among pre-stored messages, and may be expressed, for example, as a change in the forcefulness of the wording. It may also be performed by additionally storing new messages obtained through data integration with other emotion analysis systems and using them, or by storing such new messages and switching over to them.
One item of feedback information can be obtained from the change in the coordinates of the latest emotion on the Russell circumplex after the previous message. The effectiveness of the previous message is evaluated by whether the change from the old coordinates to the new coordinates approaches the target coordinates or moves away from them. To obtain this feedback information, the learning unit 18 acquires coordinate information from the Russell circumplex calculation unit 14. Another item of feedback information is obtained from the subject's own report, that is, input into the system; yet another is obtained from a report by the supporter 32 (SUP). Using the supporter's report as feedback enables objective learning. Reports from users, including the subject 31 and the supporter 32, are input from a teaching unit 19 (TEA) to the learning unit 18.
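The coordinate-based feedback just described (did the emotion move toward the target coordinates after the message, or away from them?) can be sketched numerically. The function name and the choice of the origin as the default target are assumptions for illustration.

```python
import math

def message_effect(old_coord, new_coord, target=(0.0, 0.0)):
    """Evaluate a previously sent message from the movement of the
    subject's Russell-circumplex coordinates: approaching the target
    coordinates counts as a positive effect, moving away as negative.
    The origin is assumed as the default target for illustration."""
    def dist(p):
        return math.hypot(p[0] - target[0], p[1] - target[1])
    # positive -> the message helped; negative -> it did not
    return dist(old_coord) - dist(new_coord)


# emotion moved from an unpleasant, aroused point toward the center
effect = message_effect((-0.6, 0.8), (-0.3, 0.4))
```

The learning unit 18 would accumulate such effect values per message and use them to adjust which messages are generated next.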
By providing such a learning control function, the emotion improvement function realized by providing messages can be adapted to the individual personality of the subject. Specifically, messages are tuned so that the subject's emotion improves more readily. The learning control function also tunes messages so that desirable emotional changes occur.
Part or all of the notification unit 13 may be provided by an artificial intelligence 20 (ATI). In this case, the emotion analysis system performs the emotion analysis, the selection and display of messages, and the transmission and reception of messages on a server system equipped with the artificial intelligence 20. This server system may reside on dedicated hardware or on the cloud. History information, including the messages the artificial intelligence 20 has provided to the subject 31 and the feedback information corresponding to each message, is all recorded in the artificial intelligence 20. As a result, the effectiveness of messages that keep an emotion within the safe zone, and of messages that return an emotion from the caution zone or danger zone to the safe zone, is accumulated automatically. Messages that are highly effective at keeping the subject's emotion in a good state, and messages that are highly effective at moving the subject's emotion into a good region, are thereby learned. The artificial intelligence is configured to increase the frequency with which these highly effective messages are reproduced. Through this learning, the artificial intelligence 20 can also compose messages itself.
The artificial intelligence 20 integrates the learning data of a plurality of emotion analysis systems 21 (EANAS) covering a plurality of subjects 33 (EXPA). A response from the artificial intelligence to a given subject is selected both from the learning data collected for that particular subject 33 and from the learning data collected from the other subjects. This allows more appropriate verbal messages to be composed and sent more quickly.
The artificial intelligence 20 includes an environment detection unit 22 (EDET) that detects the environment in which the subject 31 is placed. The environmental information includes at least one of the date and time, the temperature, the humidity, the barometric pressure, the ambient noise level and quality, the number of people nearby, and the ambient brightness. The artificial intelligence 20, which provides the learning unit 18, monitors both the coordinate changes on Russell's circumplex and the environment, and thereby associates coordinate changes, that is, emotional changes, with the environment. This makes it possible to predict an emotional change from an environmental change and to notify the subject of the predicted change, which can prevent excessively large emotional swings before they occur. Furthermore, the artificial intelligence 20 may automatically generate new messages by associating emotional changes with environmental changes.
 <Method for analyzing emotions>
 Figure 2 illustrates the approach of analyzing emotion from the subject's biological information, here heart rate HR and body temperature BT. First, a normal value STD is calculated for each of the heart rate and body temperature data detected from the subject. A value measured while the subject is at rest may be used as the normal value; for body temperature, for example, the value at the time of waking may be used. The maximum and minimum values of the subject's heart rate and body temperature are also measured.
The maximum and minimum values are the largest and smallest heart rate or body temperature observed in the subject's daily life, for example the largest or smallest value recorded over one day of measurement. These values are treated as provisional maximum and minimum values; from the next day onward, whenever a measurement exceeds the provisional maximum or falls below the provisional minimum, the stored maximum or minimum is updated. Otherwise, the maximum and minimum values recorded up to the previous day are kept without updating. The normal value and the maximum and minimum values may also be given as initial values. Then, using the following equations, the position of the current measurement relative to the maximum and minimum values, with the normal value as the reference, is computed for each of the heart rate and the body temperature.
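The running update of the provisional maximum and minimum values described above can be sketched as follows. Names and values are illustrative only.

```python
def update_extremes(measurements, max_so_far, min_so_far):
    """Update the provisional max/min with one day's measurements.
    If no measurement exceeds the stored extremes, they are kept."""
    day_max, day_min = max(measurements), min(measurements)
    return max(max_so_far, day_max), min(min_so_far, day_min)

hr_max, hr_min = 72, 72  # provisional values taken from the first day
hr_max, hr_min = update_extremes([68, 75, 110, 64], hr_max, hr_min)  # -> (110, 64)
```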
If the measured value is greater than or equal to the normal value:
(measured value - normal value) / (maximum value - normal value) × 100 (%)   (Equation 1)
If the measured value is less than the normal value:
(measured value - normal value) / (minimum value - normal value) × 100 (%)   (Equation 2)
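Equations 1 and 2 could be implemented as a single function like the sketch below. Two assumptions are made explicit here: the "/100" in the printed equations is read as "× 100", since the results are used as percentages by the zone thresholds later in the text, and the example values are invented for illustration.

```python
def percent_of_range(measured, normal, maximum, minimum):
    """Position of the current measurement relative to the subject's
    recorded extremes, as a percentage, with the normal value at 0%."""
    if measured >= normal:   # Equation 1: measure against the recorded maximum
        return (measured - normal) / (maximum - normal) * 100.0
    else:                    # Equation 2: measure against the recorded minimum
        return (measured - normal) / (minimum - normal) * 100.0

# Heart rate with normal value 70, recorded max 140, recorded min 50:
percent_of_range(105, 70, 140, 50)  # -> 50.0 (halfway to the recorded max)
percent_of_range(60, 70, 140, 50)   # -> 50.0 (halfway to the recorded min)
```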
The ratios thus obtained are plotted on a graph with heart rate on the vertical axis and body temperature on the horizontal axis. This graph is then superimposed on Russell's circumplex model. According to Russell's circumplex model, human emotions are classified into eight types: (1) arousal ARS, (2) distress DST, (3) misery MIS, (4) depression DEP, (5) sleepiness SLP, (6) contentment CNT, (7) pleasure PLS, and (8) excitement EXC. These emotions are arranged on a circle in 45-degree steps: 0 degrees pleasure, 45 degrees excitement, 90 degrees arousal, 135 degrees distress, 180 degrees misery, 225 degrees depression, 270 degrees sleepiness, and 315 degrees contentment.
It is known from psychological research that human body temperature rises during pleasant emotional states and falls during unpleasant ones (for example, sadness). It is known from biological research that human heart rate rises during arousal and falls during sleepiness. The relationship between human emotion, heart rate, and body temperature can therefore be arranged as shown in Figure 2. Each emotion axis is assigned a sector from +22.5 degrees to -22.5 degrees around it, that is, a width of 45 degrees. The region into which the plotted point falls is judged to be the subject's emotion at that time. The zero point of the axes indicates that both the heart rate and the body temperature are at their normal values.
The position of the plotted point on the circumplex model thus indicates one of the eight emotions. In this embodiment, therefore, the subject's emotion is analyzed as one of a plurality of categories based on the intersection point of the subject's heart rate and body temperature on Russell's circumplex model.
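The mapping from a plotted point to one of the eight 45-degree sectors can be sketched as follows. The axis convention (body-temperature percentage on the x-axis, heart-rate percentage on the y-axis, 0 degrees at the positive x-axis, counterclockwise) follows the arrangement of Figure 2 as described above; the function name and the English emotion labels are illustrative.

```python
import math

# Sector centers every 45 degrees, starting at 0 degrees = pleasure.
EMOTIONS = ["pleasure", "excitement", "arousal", "distress",
            "misery", "depression", "sleepiness", "contentment"]

def classify_emotion(bt_pct, hr_pct):
    """Return the emotion whose 45-degree sector contains the point.
    Each emotion axis covers +/-22.5 degrees around its center."""
    angle = math.degrees(math.atan2(hr_pct, bt_pct)) % 360
    return EMOTIONS[int(((angle + 22.5) % 360) // 45)]

classify_emotion(50, 0)    # -> 'pleasure'  (warm, normal heart rate)
classify_emotion(0, 50)    # -> 'arousal'   (normal temperature, fast pulse)
classify_emotion(-40, 40)  # -> 'distress'  (cold and fast pulse)
```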
In addition, the radius of the disc is divided, for example, into three zones according to the ratio of the heart rate and body temperature measurements to the maximum value MAX and minimum value MIN: for example, 0% to 25% is the safe zone STB, 25% to 70% is the caution zone CAT, and 70% or more is the danger zone WAR. The division ratios and the number of divisions may be changed, for example to four divisions, according to the subject's psychological and physical condition.
Eight colors are assigned to the eight analyzed emotions. For example, distress is red, misery is orange, depression is purple, sleepiness is yellow, contentment is blue, pleasure is green, excitement is pink, and arousal is light blue. Alternatively, a number may be assigned to each emotion, and the zone may be indicated by a percentage value (for example, 65%). However, while the point is within the safe zone, the color for "pleasure" is displayed regardless of the emotion.
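The zone split and the color assignment, including the rule that the safe zone always shows the "pleasure" color, can be sketched together as follows. The thresholds are the example values from the text and the emotion labels follow the standard circumplex terms; both may be adjusted per subject.

```python
COLORS = {"distress": "red", "misery": "orange", "depression": "purple",
          "sleepiness": "yellow", "contentment": "blue", "pleasure": "green",
          "excitement": "pink", "arousal": "light blue"}

def zone_of(radius_pct):
    """Three-way split of the disc radius (example thresholds)."""
    if radius_pct < 25:
        return "safe"
    if radius_pct < 70:
        return "caution"
    return "danger"

def display_color(emotion, radius_pct):
    # Inside the safe zone every emotion is shown in the pleasure color.
    if zone_of(radius_pct) == "safe":
        return COLORS["pleasure"]
    return COLORS[emotion]

display_color("distress", 10)  # -> 'green' (safe zone overrides the emotion color)
display_color("distress", 80)  # -> 'red'   (danger zone shows the emotion color)
```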
 <Basic functions of the emotion analysis system>
 Figure 3 shows the flow of the process S161 for analyzing and improving emotions. In step S162, the emotion analysis system detects the subject's biological information, for example heart rate, body temperature, blood oxygen level, and facial expression. In step S163, the system analyzes the subject's emotion using the detected biological information. In step S164, the system judges whether the analyzed emotion is bad or good; this judgment is based on the coordinates indicated by the subject's biological information on Russell's circumplex. In step S165 or step S167, the system issues a notification so that the subject can recognize his or her own emotion objectively.
If the analyzed emotion is a bad emotion such as anger or sadness, the emotion analysis system provides, in step S166, a message intended to improve that emotion into a good one. That is, when the subject's emotion falls in the left half of Russell's circumplex, the system provides a message for moving the subject's emotion to the right on the circumplex.
If the analyzed emotion is a good emotion such as joy or contentment, the emotion analysis system provides, in step S168, a message to the subject for maintaining the good emotion. That is, when the subject's emotion falls in the right half of Russell's circumplex, the system provides a message for keeping the subject's emotion within the right half.
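The branch described above, improve when the emotion is in the left half of the circumplex and maintain when it is in the right half, reduces to the sign of the body-temperature axis under the arrangement of Figure 2 (pleasant emotions on the warm, right side). A minimal sketch, with an illustrative function name:

```python
def select_message_kind(bt_pct):
    """Left half of the circumplex (body temperature below normal)
    holds the unpleasant emotions; the right half the pleasant ones."""
    return "improve" if bt_pct < 0 else "maintain"

select_message_kind(-30)  # -> 'improve'  (e.g. distress, misery)
select_message_kind(40)   # -> 'maintain' (e.g. pleasure, contentment)
```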
The provided messages can be regarded as active interventions toward the subject rather than mere notifications. The notifications in steps S165 and S167 and the provision of messages in steps S166 and S168 are performed by the notification unit 13.
 <Flow of emotion analysis>
 Figure 4 shows the processing from S162 to S167 in more detail. First, in step S151, the system inputs the subject's heart rate and body temperature. In step S152, the system accumulates the heart rate and body temperature data. In step S153, the system calculates the normal value and the maximum and minimum values described above. In step S154, the system analyzes the subject's current emotion from the heart rate and the body temperature.
In step S155, the system reports the subject's emotion. In step S156, the system identifies the color assigned in advance to the current emotion. In step S157, the system displays the identified color on the notification unit 13.
At this time, the notification unit 13 may display not only the color of the analyzed emotion but also Russell's circumplex model itself, together with the plotted point based on the subject's heart rate and body temperature. Specifically, the notification unit 13 displays the circumplex model as a line drawing, the plot as a point, and the color indicating the current emotion as the background color. In step S158, the system selects and switches between continuous lighting and blinking of the background color according to the plot position on the circumplex model. The processing of this emotion analysis method is executed repeatedly.
With this emotion analysis system, the subject 31 can objectively grasp his or her own emotion and its intensity. In particular, the color shown by the notification unit 13 lets anyone grasp the emotion intuitively. Thus, when the subject 31 has ASD (autism spectrum disorder), the system can make the subject objectively aware of changes in his or her own emotions. In pronounced cases, the subject 31 can objectively recognize an emotional change before the emotion becomes difficult to control, and can take action to avoid falling into such a state. When the subject 31 is a neurotypical person, he or she can respond to rising stress before it becomes severe. As a result, the emotion analysis system can reduce the psychological and mental burden on the subject 31 in daily life. In this way, the emotion analysis system analyzes emotions, displays them objectively through visual information including color, and makes the subject 31 or the supporter 32 aware of them. Furthermore, the system does not merely report the analyzed emotion; it also makes the subject 31 and the supporter 32 aware of a deterioration of the emotion.
Furthermore, the emotion analysis system does not merely make emotions recognizable; it also provides additional information for maintaining a good emotion and/or improving a deteriorated one. To provide this additional information, a plurality of items may be stored in advance on a storage medium such as a semiconductor memory and selected and reproduced from there. Additional information may also be learned during actual use, added by the user, or added or selected via a network such as the Internet.
The additional information is provided in a form the user can understand, such as text or voice. It may also take auxiliary forms designed to attract the user's attention. For example, the additional information may include text and/or voice presented together with a pictorial image, such as a cartoon character, that easily attracts the attention of the subject 31. A pictorial image such as a cartoon character can draw the subject's attention even while the subject's emotion is deteriorating.
When the subject 31 is a young person, such as an elementary school student with ASD, or a dementia patient, the supporter 32 can also grasp the subject's emotion. The supporter 32 includes a guardian of the subject 31, a caregiver of the subject 31, or a staff member of a support organization. This makes it possible to provide effective support and assistance that matches the subject's feelings without wasted effort.
 (Second embodiment)
 A second embodiment is shown in Figure 6. In addition to the configuration of the first embodiment, it includes a portable terminal 2a such as a smartphone or tablet. The wearable terminal 1a and the wearable measuring instrument 1b are connected to the portable terminal 2a by some communication means, for example wireless communication, and can exchange heart rate and body temperature data with it.
Figure 7 shows an example of the wearable terminal 1a or the portable terminal and its display unit 1c. At the top of the display unit 1c there is, for example, a circular portion 1c1, where the analyzed emotion is displayed in color. Below the circular portion 1c1 are message fields 1c2 and 1c3 for displaying verbal messages. A message to the subject aimed at improving or maintaining the emotion is displayed in the subject message field 1c2, and the subject's input message in response to it is displayed in field 1c3. The circular portion 1c1 may also display Russell's circumplex model of Figure 2 with the heart rate and body temperature indices added.
Figure 8 shows an example of the second embodiment in which the detection unit 11 and the display unit 1c are mounted on the wearable terminal 1a.
Figure 8 also shows an example in which the wearable terminal 1a and the portable terminal 2a are connected by a wired communication means 41, for example a USB cable.
One operation example, in addition to that of the first embodiment, is to analyze the subject's emotion on the portable terminal 2a as well. The subject's heart rate and skin temperature are measured by the wearable terminal 1a or the wearable measuring instrument 1b. From the measured heart rate and skin temperature, the portable terminal 2a analyzes the subject's emotion following the approach of Figure 2. The analyzed emotion is displayed on the display units of both the portable terminal 2a and the wearable terminal 1a, or of either one.
Yet another operation example uses other equipment 2b, such as a personal computer screen or a television monitor, instead of or in addition to the portable terminal 2a. The emotion analyzed on the portable terminal 2a is also displayed on the other equipment 2b, allowing supporters near the subject to check the subject's emotion.
 (Third embodiment)
 In the third embodiment, the colors of the subject's emotion displayed in the second embodiment on the portable terminal 2a or on other equipment 2b, such as a personal computer screen or a television monitor, include red, orange, purple, yellow, blue, light blue, pink, and green.
 (Fourth embodiment)
 In the fourth embodiment, building on the second or third embodiment, a verbal message indicating the subject's current emotion is displayed on the wearable terminal 1a, the portable terminal 2a, or other equipment 2b such as a personal computer screen or a television monitor. The portable terminal 2a also includes an audio output unit and has a function of outputting verbal messages as speech. For example, if the subject's analyzed emotion is "distress, 50%", a message such as "You are currently feeling moderate distress." is displayed. A plurality of such messages are stored in advance in the portable terminal 2a, and an appropriate one is automatically selected within the terminal according to the analyzed emotion and delivered. A different message may be displayed each time at fixed intervals, for example every 10 minutes. A pictorial message may be displayed in addition to the verbal message, and the verbal message may also be conveyed to the subject as speech through the audio output unit.
 (Fifth embodiment)
 In addition to the fourth embodiment, when the subject's emotion is judged to be in the safe zone, a message is provided that encourages behavior to maintain that state. The message may be verbal or pictorial, and is displayed on the wearable terminal 1a, the portable terminal 2a, or other equipment 2b such as a personal computer screen or a television monitor.
 (Sixth embodiment)
 In addition to the fourth embodiment, when the subject's emotion is judged to be in the caution zone or the danger zone, a message for returning the emotion to the safe zone is provided. The message may be verbal or pictorial, and is displayed on the wearable terminal 1a, the portable terminal 2a, or other equipment 2b such as a personal computer screen or a television monitor. For example, if the subject's analyzed emotion is "distress, 50%", a message such as "Now is a good time to take action to relieve your distress. Take a deep breath, do some light exercise, and calm yourself." is shown. A different such message may be displayed at fixed intervals until the subject's emotion returns to the safe zone.
At this time, by continuously acquiring the heart rate and body temperature and continuously judging the emotion, the system can quantitatively measure whether, and to what extent, a displayed message was effective at improving the subject's emotion. By repeating the cycle of measurement, emotion analysis, message display, measurement, emotion analysis, and next message display, the system memorizes the displayed messages and their effects, and can subsequently display messages with a higher emotion-improving effect in a shorter time. In this embodiment, this is called "learning".
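The measurement, analysis, and display cycle that scores each message could be sketched as follows. The running-average scoring and the class design are illustrative choices; the text specifies only that effects are memorized and more effective messages are shown sooner.

```python
from collections import defaultdict

class MessageLearner:
    """Keep a running effectiveness score per message and prefer
    the highest-scoring message on the next cycle."""
    def __init__(self, messages):
        self.scores = {m: 0.0 for m in messages}
        self.counts = defaultdict(int)

    def pick(self):
        return max(self.scores, key=self.scores.get)

    def record(self, message, improvement):
        # improvement: how far the emotion moved toward the safe zone
        # after this message, measured on the next analysis cycle.
        self.counts[message] += 1
        n = self.counts[message]
        self.scores[message] += (improvement - self.scores[message]) / n

learner = MessageLearner(["breathe deeply", "take a short walk"])
learner.record("take a short walk", improvement=12.0)
learner.pick()  # -> 'take a short walk'
```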
In this embodiment, the notification unit 13 reports a remedy for improving the emotion analyzed by the acquisition unit 12. Rather than merely displaying the emotion, a remedy is reported, which allows users, including the subject and/or the supporter, to improve the subject's emotion.
 (Seventh embodiment)
 In the fourth, fifth, and sixth embodiments, the wording of the messages, or the character that delivers them, can be selected and changed to match the subject's preferences: for example, styles for children, adults, or the elderly; styles for men or women; or the tone of a close friend, an elder, a grandchild, a servant, or a teacher. These message styles are stored in advance in the notification unit 13, and the user selects one, for example through the operation unit of the portable terminal 2a.
 (Eighth embodiment)
 In the seventh embodiment, the subject can input a subject message, such as a reply or answer, in response to a message displayed on the display unit of the wearable terminal 1a, the display unit of the portable terminal 2a, or other equipment 2b (a separate display means) such as a personal computer screen or a television monitor. For this input, the wearable terminal 1a worn by the subject or the portable terminal 2a carried by the subject has a voice input function or a text input function. The voice input function can be provided, for example, by a microphone and a speech recognition device; the text input function can be provided by a keyboard. Furthermore, the portable terminal 2a can send a further message (a reply) in response to the subject message. For this purpose, the portable terminal 2a has a conversation providing unit that generates replies to subject messages so that the subject feels as if conversing with a human. Depending on the level of conversation to be provided, the conversation providing unit can range from a bot device that simply selects one of a plurality of prepared replies to a device called an artificial intelligence. The subject thereby exchanges messages as if conversing with a human being. The conversation providing unit may also be a communication line that communicably connects the subject and a human operator.
 (Ninth embodiment)
 A ninth embodiment is shown in Figure 9. The emotion analysis and the selection and display of messages in the second to eighth embodiments are performed by a server system 3a equipped with artificial intelligence. This server system may reside on dedicated hardware or on the cloud.
All exchanges of messages and subject messages between the subject and the artificial intelligence are recorded in the artificial intelligence. From the relationship between the messages and the subject messages, the effectiveness of maintaining a calm emotion, or of returning an emotion in the caution zone or danger zone to the safe zone, is automatically evaluated and accumulated.
By having the artificial intelligence repeatedly exchange messages and subject messages with the subject, the artificial intelligence learns messages of the length, level of abstraction, expression, and wording appropriate to each individual subject, so that the artificial intelligence can compose messages itself.
 (Tenth embodiment)
 In this embodiment, the artificial intelligence's learning data for a plurality of subjects in the ninth embodiment is integrated. The learning data is collected over a network and made available as a data set usable for a particular subject. A response to a given subject is selected by the artificial intelligence not only from a data set prepared for that subject, for example through reinforcement learning, but also from the learning data collected from a plurality of other subjects. This embodiment thus includes the use of the emotion analysis system by a plurality of subjects, the integration of their learning data, and the use of the integrated learning data for a particular subject. As a result, more appropriate verbal messages can be composed and sent more quickly.
 (Other Embodiments)
 Although the present disclosure has been described with reference to the embodiments above, it is not limited to those embodiments and encompasses various modifications and variations within an equivalent range. In addition, various combinations and forms, as well as other combinations and forms that include only one element of them, or more or fewer elements, also fall within the scope and spirit of the present disclosure.

Claims (10)

  1.  An emotion analysis system comprising:
      a detection unit that detects a heart rate and a body temperature of a subject;
      an acquisition unit that acquires the heart rate and the body temperature detected by the detection unit and analyzes an emotion of the subject; and
      a notification unit that reports the emotion analyzed by the acquisition unit to a user.
  2.  The emotion analysis system according to claim 1, wherein the notification unit classifies the emotions analyzed by the acquisition unit into a plurality of categories and comprises at least one of: a display unit that displays the classified emotion as visual information including at least one of a color, a number, a character, a symbol, a picture, and a pattern; a sound output unit that outputs the classified emotion as auditory information including at least one of a sound, a voice, and music; and a tactile output unit that outputs the classified emotion as tactile information including vibration, shape, and surface roughness.
  3.  The emotion analysis system according to claim 1 or 2, wherein the notification unit reports an improvement measure for improving the emotion analyzed by the acquisition unit.
  4.  The emotion analysis system according to any one of claims 1 to 3, wherein the notification unit reports a maintenance measure for maintaining the emotion analyzed by the acquisition unit.
  5.  The emotion analysis system according to any one of claims 1 to 4, wherein
      the acquisition unit analyzes the subject's emotion based on the Russell circumplex, and
      when the emotion is determined to lie in the unpleasant region forming the left half of the Russell circumplex, the notification unit outputs at least one of visual information, auditory information, and tactile information that moves the subject's emotion from the left half toward the pleasant region in the right half.
  6.  The emotion analysis system according to any one of claims 1 to 5, wherein
      the acquisition unit analyzes the subject's emotion based on the Russell circumplex, and
      the notification unit outputs at least one of visual information, auditory information, and tactile information that moves the subject's emotion toward the radial center of the Russell circumplex.
  7.  The emotion analysis system according to any one of claims 1 to 6, wherein, when improving the emotion, the notification unit outputs a message prompting the subject to change from a current action to a different action, and/or, when maintaining the emotion, outputs a message giving the subject a reward directly or indirectly.
  8.  An emotion analysis system comprising:
      a wearable terminal worn by a subject;
      a portable terminal that exchanges electronic signals and data with the wearable terminal wirelessly or by wire;
      a detection unit that detects a heart rate and a body temperature of the subject;
      an acquisition unit that acquires the heart rate and the body temperature and analyzes an emotion; and
      a display unit that displays the analyzed emotion, wherein
      the wearable terminal carries the detection unit and the display unit,
      the portable terminal carries the acquisition unit and the display unit, and
      the wearable terminal and the portable terminal each carry a communication unit that communicates either wirelessly or by wire.
  9.  The emotion analysis system according to claim 8, wherein the portable terminal holds a plurality of linguistic messages expressing emotions, and
      a message selected from among them according to the subject's emotion is automatically and visually displayed on the display units of the wearable terminal and the portable terminal, and/or
      the system comprises a sound output unit that outputs the message by voice.
  10.  The emotion analysis system according to claim 9, wherein the subject can input, from a terminal, a subject message replying or answering the message displayed on the wearable terminal, the portable terminal, or both, or on separate display means, and
      the system comprises a conversation providing unit that enables the portable terminal to send a further reply to the subject message.
PCT/JP2019/002570 2018-01-26 2019-01-25 Emotional analysis system WO2019146767A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-011827 2018-01-26
JP2018011827A JP2021052812A (en) 2018-01-26 2018-01-26 Emotion analysis system

Publications (1)

Publication Number Publication Date
WO2019146767A1 true WO2019146767A1 (en) 2019-08-01

Family

ID=67394988

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/002570 WO2019146767A1 (en) 2018-01-26 2019-01-25 Emotional analysis system

Country Status (2)

Country Link
JP (1) JP2021052812A (en)
WO (1) WO2019146767A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023080019A1 (en) * 2021-11-04 2023-05-11 オムロン株式会社 Bioinformation processing device, bioinformation processing method, and program
WO2023203961A1 (en) * 2022-04-18 2023-10-26 パナソニックIpマネジメント株式会社 Psychological state display system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130023738A1 (en) * 2011-07-22 2013-01-24 Hon Hai Precision Industry Co., Ltd. Mobile phone for health inspection and method using same
CN104905803A (en) * 2015-07-01 2015-09-16 京东方科技集团股份有限公司 Wearable electronic device and emotion monitoring method thereof
JP2016036500A (en) * 2014-08-07 2016-03-22 シャープ株式会社 Voice output device, network system, voice output method, and voice output program
JP2017182530A (en) * 2016-03-31 2017-10-05 大日本印刷株式会社 Search system, search method, server and program


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4113425A4 (en) * 2020-02-28 2024-05-29 Astellas Pharma Inc Wearable appliance, information processing device, information processing system, and program
WO2021250731A1 (en) * 2020-06-08 2021-12-16 日本電信電話株式会社 Message selection device, message presentation device, message selection method, and message selection program
WO2021250730A1 (en) * 2020-06-08 2021-12-16 日本電信電話株式会社 Message generating device, message presenting device, message generating method, and message generating program
JP7400968B2 (en) 2020-06-08 2023-12-19 日本電信電話株式会社 Message generation device, message presentation device, message generation method, and message generation program
JP7435768B2 (en) 2020-06-08 2024-02-21 日本電信電話株式会社 Message selection device, message presentation device, message selection method, and message selection program
WO2024004440A1 (en) * 2022-06-30 2024-01-04 ソニーグループ株式会社 Generation device, generation method, reproduction device, and reproduction method

Also Published As

Publication number Publication date
JP2021052812A (en) 2021-04-08

Similar Documents

Publication Publication Date Title
WO2019146767A1 (en) Emotional analysis system
TWI622953B (en) A computer implemented method,in a computer device, for mind tuning to preferred state of mind of a user, and a system for providing a preferred state of mind of a user
Guntupalli et al. Emotional and physiological responses of fluent listeners while watching the speech of adults who stutter
US20150351655A1 (en) Adaptive brain training computer system and method
JP2018517996A (en) Infant caregiver system, infant data aggregation system, aggregation of observations about infant data, and aggregation of guesses about infant data
US20220280085A1 (en) System and Method for Patient Monitoring
CN111656304A (en) Communication method and system
US11779275B2 (en) Multi-sensory, assistive wearable technology, and method of providing sensory relief using same
Pryss et al. A personalized sensor support tool for the training of mindful walking
JP2018517995A (en) Analysis of aggregate infant measurement data, prediction of infant sleep patterns, derivation of infant models based on observations related to infant data, creation of infant models using inference, and derivation of infant development models
JP2022059140A (en) Information processing device and program
Shannon Responses to infant-directed singing in infants of mothers with depressive symptoms
JP2018525079A (en) Remote aggregation of measurement data from multiple infant monitoring systems, remote aggregation of data related to the effects of environmental conditions on infants, determination of infant developmental age relative to biological age, and social media recognition of completion of infant learning content Offer
JP5987238B2 (en) Multimodal tracking system and program
WO2023034347A1 (en) Multi-sensory, assistive wearable technology, and method of providing sensory relief using same
JPWO2019220745A1 (en) Information processing system, information processing method, and recording medium
CN116419778A (en) Training system, training device and training with interactive auxiliary features
Obe et al. An affective-based E-healthcare system framework
Alhassan et al. Admemento: a prototype of activity reminder and assessment tools for patients with alzheimer’s disease
WO2019155010A1 (en) Method, apparatus and system for providing a measure to resolve an uncomfortable or undesired physiological condition of a person
Li et al. Rethinking pain communication of patients with Alzheimer’s disease through E-textile interaction design
US20230338698A1 (en) Multi-sensory, assistive wearable technology, and method of providing sensory relief using same
WO2023199422A1 (en) Inner state inference device, inner state inference method, and storage medium
Attard Mental state examination
Zhang et al. Peripheral display for conveying real-time pain condition of persons with severe intellectual disabilities to their caregivers

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19743966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19743966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP
