WO2020255630A1 - Biological information analysis device, biological information analysis system, biological information analysis program, and biological information analysis method - Google Patents

Biological information analysis device, biological information analysis system, biological information analysis program, and biological information analysis method Download PDF

Info

Publication number
WO2020255630A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
information
subject
mental state
action
Prior art date
Application number
PCT/JP2020/020391
Other languages
French (fr)
Japanese (ja)
Inventor
翔 アドナース 高橋
昌泰 藤岡
克敏 沢田
俊 青木
伸敏 小林
Original Assignee
JSR Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by JSR Corporation
Priority to JP2021527494A (published as JPWO2020255630A1)
Publication of WO2020255630A1

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00: ICT specially adapted for the handling or processing of patient-related medical or healthcare data

Definitions

  • An embodiment of the present invention relates to a biological information analyzer, a biological information analysis system, a biological information analysis program, and a biological information analysis method.
  • An object of the present invention is to provide a biometric information analyzer, a biometric information analysis system, a biometric information analysis program, and a biometric information analysis method capable of accurately grasping the mental state of a subject.
  • One aspect of the present invention is a biological information analyzer including an acquisition unit that acquires biological information including autonomic nerve data of a subject and behavior record information in which a plurality of actions of the subject are recorded, and a calculation unit that calculates mental state information for each behavior of the subject based on the biological information and the behavior record information.
  • Another aspect of the present invention is a biometric information analysis system including a server device and a display terminal. The server device includes an acquisition unit that acquires biometric information including autonomic nerve data of the subject and behavior record information in which a plurality of actions of the subject are recorded, a calculation unit that calculates mental state information for each action of the subject based on the biometric information and the behavior record information, and an output control unit that transmits the mental state information for each action of the subject to the display terminal. The display terminal includes a display control unit that receives the mental state information for each action of the subject transmitted from the server device and displays the received mental state information.
  • Another aspect of the present invention is a biological information analysis program that causes a computer to execute processes of acquiring biological information including autonomic nerve data of a subject and behavior record information in which a plurality of actions of the subject are recorded, and calculating mental state information for each behavior of the subject based on the biological information and the behavior record information.
  • Another aspect of the present invention is a biological information analysis method including acquiring biological information including autonomic nerve data of a subject and behavior record information in which a plurality of actions of the subject are recorded, and calculating mental state information for each behavior of the subject based on the biological information and the behavior record information.
  • According to the embodiments, it is possible to provide a biological information analysis device capable of accurately grasping the mental state of a subject.
  • FIG. 1 is a diagram showing an example of a schematic configuration of an information processing apparatus according to an embodiment.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the information processing apparatus according to the embodiment.
  • FIG. 3 is a flowchart showing an operation example of the information processing apparatus according to the embodiment.
  • FIG. 4 is a diagram for explaining the processing of the acquisition unit.
  • FIG. 5 is a diagram for explaining the processing of the acquisition unit.
  • FIG. 6 is a diagram for explaining the processing of the acquisition unit.
  • FIG. 7 is a diagram for explaining the processing of the acquisition unit.
  • FIG. 8 is a diagram for explaining the processing of the acquisition unit.
  • FIG. 9 is a diagram for explaining the processing of the acquisition unit.
  • FIG. 10 is a diagram for explaining the processing of the calculation unit.
  • FIG. 11 is a diagram for explaining the processing of the generation unit.
  • FIG. 12 is a diagram for explaining the processing of the generation unit.
  • FIG. 13 is a diagram for explaining the processing of the generation unit.
  • FIG. 14 is a diagram for explaining the processing of the output control unit.
  • FIG. 15 is a diagram for explaining the processing of the output control unit.
  • FIG. 16 is a diagram for explaining the processing of the output control unit.
  • FIG. 17 is a diagram for explaining the processing of the output control unit.
  • FIG. 18 is a diagram for explaining the processing of the output control unit.
  • FIG. 19 is a diagram for explaining the processing of the output control unit.
  • FIG. 20 is a diagram showing an example of a schematic configuration of the information processing apparatus according to the second embodiment.
  • FIG. 21 is a diagram showing processing during learning and operation performed by the information processing apparatus according to the second embodiment.
  • FIG. 22 is a flowchart showing an operation example of the information processing apparatus according to the second embodiment.
  • FIG. 23 is a diagram showing an example of a schematic configuration of the system according to the embodiment.
  • FIG. 24 is a diagram for explaining a trained model according to each Example and Comparative Example.
  • Hereinafter, the biometric information analyzer, the biometric information analysis system, the biometric information analysis program, and the biometric information analysis method according to the embodiment will be described with reference to the drawings.
  • The embodiments are not limited by the following description.
  • each embodiment can be combined with other embodiments or conventional techniques as long as the processing contents do not conflict with each other.
  • In the following, a case will be described in which the present embodiment is applied to psychological counseling, that is, a case in which the information processing device according to the present embodiment is used by a person who provides psychological counseling (for example, a counselor or a doctor) to grasp the mental state of a subject undergoing psychological counseling.
  • the present embodiment is not limited to psychological counseling, and can be applied in various situations for grasping the mental state of the subject.
  • FIG. 1 is a diagram showing an example of a schematic configuration of an information processing apparatus according to an embodiment.
  • The information processing apparatus 20 according to the embodiment can communicate, directly or indirectly, with the sensor 11 and the mobile terminal 12 via a network such as a LAN (Local Area Network) or a WAN (Wide Area Network).
  • the information processing device 20 is an example of a biological information analyzer.
  • The sensor 11 and the mobile terminal 12 do not have to be always connected to the information processing device 20. For example, it is sufficient that they are connected when a session with the target person is held, so that various information can be exchanged. Further, when information is exchanged via an external recording medium or the like, the sensor 11 and the mobile terminal 12 need not be connected to the information processing device 20 at all.
  • the sensor 11 is a sensor device worn by the target person.
  • the sensor 11 is a multi-sensor device having a plurality of sensor functions.
  • the sensor 11 includes functions of an electrocardiograph, an optical pulse wave meter, a body motion meter, and a thermistor.
  • the electrocardiograph measures time-series electrocardiographic waveform data.
  • the optical pulse wave meter measures time-series pulse wave data.
  • the body motion meter measures time-series acceleration data.
  • the thermistor also measures time-series temperature data.
  • the sensor 11 records various measured measurement data in a memory inside the apparatus in association with the measurement start time.
  • known techniques can be appropriately selected and applied.
  • the sensor 11 is basically always worn by the target person.
  • the subject removes the sensor 11 when the session is performed, and hands the removed sensor 11 to an operator (typically, a counselor or a doctor) of the information processing device 20.
  • the operator of the information processing device 20 transfers the measurement data recorded after the previous session from the sensor 11 received from the target person to the information processing device 20. The details of the process of acquiring biological information from the measurement data will be described later.
  • the sensor 11 is not necessarily limited to the multi-sensor device, and may at least have an electrocardiograph function. Further, the sensor 11 is not necessarily limited to being always worn, and may be removed within a range that does not affect the biological information analysis process described later. Further, the transfer timing of the measurement data recorded in the sensor 11 is not necessarily limited to the session, and may be performed periodically (for example, once a day) via the network, for example.
  • the mobile terminal 12 is, for example, a smartphone owned by the target person, and a recording application for collecting behavior record information in which the target person's behavior is recorded is introduced (installed).
  • the action record information is automatically or manually recorded in the memory inside the device of the mobile terminal 12.
  • the action record information recorded in the mobile terminal 12 is transferred to the information processing device 20 at an arbitrary timing. The details of the process of acquiring the action record information will be described later.
  • the mobile terminal 12 is not necessarily limited to a smartphone, and may be any information processing device such as a tablet or a personal computer into which an application for collecting action record information can be introduced. However, due to the nature of collecting behavior record information, it is preferable that the target person is a device (mobile terminal) that can be carried on a daily basis.
  • the information processing device 20 is, for example, a computer such as a personal computer or a workstation.
  • the information processing device 20 calculates mental state information indicating the mental state of the target person based on the information acquired by the sensor 11 and the mobile terminal 12.
  • the information processing device 20 includes an acquisition unit 201, a calculation unit 202, a generation unit 203, and an output control unit 204.
  • the functions of the information processing device 20 are not limited to the acquisition unit 201, the calculation unit 202, the generation unit 203, and the output control unit 204.
  • the acquisition unit 201, the calculation unit 202, the generation unit 203, and the output control unit 204 will be described later.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the information processing apparatus 20 according to the embodiment.
  • the information processing device 20 includes a CPU (Central Processing Unit) 21, a ROM (Read Only Memory) 22, a RAM (Random Access Memory) 23, an auxiliary storage device 24, an input device 25, a display device 26, and an external I/F (Interface) 27.
  • the CPU 21 is a processor (processing circuit) that comprehensively controls the operation of the information processing device 20 by executing a program and realizes various functions of the information processing device 20. Various functions of the information processing device 20 will be described later.
  • the ROM 22 is a non-volatile memory and stores various data (information written at the manufacturing stage of the information processing device 20) including a program for activating the information processing device 20.
  • the RAM 23 is a volatile memory having a working area of the CPU 21.
  • the auxiliary storage device 24 stores various data such as a program executed by the CPU 21.
  • the auxiliary storage device 24 is composed of, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), or the like.
  • the input device 25 is a device for an operator using the information processing device 20 to perform various operations.
  • the input device 25 is composed of, for example, a mouse, a keyboard, a touch panel, or hardware keys.
  • the operator corresponds to, for example, a medical person such as a doctor.
  • the display device 26 displays various information.
  • the display device 26 displays image data, model data, a GUI (Graphical User Interface) for receiving various operations from an operator, a medical image, and the like.
  • the display device 26 is composed of, for example, a liquid crystal display, an organic EL (Electro Luminescence) display, or a cathode ray tube display.
  • the input device 25 and the display device 26 may be integrally configured, for example, in the form of a touch panel.
  • the external I/F 27 is an interface for connecting (communicating) with an external device such as the sensor 11 or the mobile terminal 12.
  • FIG. 3 is a flowchart showing an operation example of the information processing apparatus 20 according to the embodiment. In the description of FIG. 3, the description will be made with reference to FIGS. 4 to 19.
  • the biometric information analysis process shown in FIG. 3 can be executed at an arbitrary timing, but it is preferably executed every time a session for the target person is performed.
  • the acquisition unit 201 acquires the biological information of the target person (step S101). For example, the acquisition unit 201 acquires the biological information of the target person by generating the biological information of the target person based on various measurement data collected by the sensor 11.
  • the biological information is information including, for example, autonomic nerve data, total power data, body movement data, body temperature data, pulse data, and the like.
  • the autonomic nerve data is time-series information of the activity index of the autonomic nerve, and includes sympathetic nerve data, parasympathetic nerve data, and total power data.
  • the sympathetic nerve data is time-series information of the sympathetic nerve activity index (Sympathetic Nervous System: SNS level).
  • the parasympathetic nerve data is time-series information of the parasympathetic nerve activity index (ParaSympathetic Nervous System: PSNS level).
  • the total power data is data in which sympathetic nerve data and parasympathetic nerve data are integrated, and corresponds to, for example, the sum of sympathetic nerve data and parasympathetic nerve data.
  • Body movement data is time-series information indicating the intensity of body movement.
  • the body temperature data is time-series information of the body temperature on the body surface (the place where the sensor 11 is attached).
  • the pulse data is time-series information of the pulse rate.
  • FIGS. 4 and 5 are diagrams for explaining the processing of the acquisition unit 201.
  • FIG. 4 shows an example of electrocardiographic waveform data (electrocardiogram) measured by the sensor 11.
  • FIG. 5 shows an example of autonomic nerve data generated by the acquisition unit 201.
  • The acquisition unit 201 acquires the autonomic nerve data by generating it from the electrocardiographic waveform data showing the time-series electrocardiographic waveform. Specifically, the acquisition unit 201 detects the positions of the R waves from the electrocardiographic waveform data shown in FIG. 4, and generates a heartbeat interval fluctuation time series (RRI time series) by plotting the intervals between the detected R waves (RR intervals: RRI) over time. Then, the acquisition unit 201 calculates the power spectrum from the RRI time series and integrates the power in a predetermined frequency region. As an example, the acquisition unit 201 calculates the integrated value of the power in the low frequency band (0.05 Hz to 0.15 Hz) as the SNS level (upper part of FIG. 5).
  • the acquisition unit 201 generates sympathetic nerve data and parasympathetic nerve data, respectively.
  • the acquisition unit 201 acquires the total power data based on the sympathetic nerve data and the parasympathetic nerve data. For example, the acquisition unit 201 acquires total power data by adding the SNS level and the PSNS level corresponding to each time.
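The spectral processing described above (RRI time series, power spectrum, band integration, and total power as the sum of the SNS and PSNS levels) can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the 4 Hz resampling rate and the 0.15 Hz to 0.4 Hz parasympathetic band are common HRV conventions assumed here, while the 0.05 Hz to 0.15 Hz sympathetic band is taken from the text.

```python
import numpy as np

def band_power(rri_ms, fs=4.0, band=(0.05, 0.15)):
    """Integrate the power of an RRI time series over a frequency band.

    rri_ms: successive R-R intervals in milliseconds.
    fs: resampling rate in Hz (assumed; the patent does not specify one).
    """
    t = np.cumsum(rri_ms) / 1000.0             # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)    # uniform time grid
    rri_u = np.interp(grid, t, rri_ms)         # evenly resampled RRI series
    rri_u = rri_u - rri_u.mean()               # remove the DC component
    spec = np.abs(np.fft.rfft(rri_u)) ** 2 / len(rri_u)
    freqs = np.fft.rfftfreq(len(rri_u), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(spec[mask].sum())

def autonomic_levels(rri_ms):
    # SNS level: low-frequency band stated in the text (0.05 to 0.15 Hz).
    sns = band_power(rri_ms, band=(0.05, 0.15))
    # PSNS level: high-frequency band (0.15 to 0.4 Hz), an assumed convention.
    psns = band_power(rri_ms, band=(0.15, 0.40))
    # Total power, per the text, is the sum of the two levels.
    return sns, psns, sns + psns
```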
  • the acquisition unit 201 acquires the biological information of the target person based on various measurement data collected by the sensor 11.
  • the above-mentioned methods for acquiring sympathetic nerve data, parasympathetic nerve data, and total power data are merely examples, and are not limited to the above description.
  • known techniques can be appropriately selected and applied.
  • The acquisition unit 201 can also acquire other biological information based on the various measurement data collected by the sensor 11. For example, the acquisition unit 201 can acquire time-series data of the heart rate and the respiratory rate from the electrocardiographic waveform data. In addition, the acquisition unit 201 can acquire percutaneous arterial oxygen saturation (SpO2) and pulse time-series data from the pulse wave data, body movement data from the acceleration data, and body temperature data from the temperature data. As methods for acquiring these data, known techniques can be appropriately selected and applied.
  • the acquisition unit 201 acquires the behavior record information of the target person (step S102).
  • the acquisition unit 201 acquires the action record information from the mobile terminal 12 of the target person.
  • The action record information includes action data indicating the content of each action, time data indicating the time when each action was performed, place data indicating the place where each action was performed, medication history data indicating the medication history, and mood history data indicating the mood when each action was performed.
  • the behavior data is represented by stepwise classification information.
  • FIGS. 6 and 7 are diagrams for explaining the processing of the acquisition unit 201.
  • FIG. 6 shows an example of action record information recorded by the mobile terminal 12.
  • FIG. 7 shows an example of classification of behavior data.
  • the mobile terminal 12 collects behavior record information including behavior data, time data, place data, medication history data, and mood history data.
  • For example, the second record associates the action data "breakfast", the time data "7:00, duration 1 hour 15 minutes", the place data "home", the medication history data "drug name: AAA, medication time 7:40", and the mood history data "facial expression mark (smile)". That is, the second record shows that the subject was having breakfast at home for 1 hour and 15 minutes from 7:00, at which time the subject was in a good mood.
  • In this way, the mobile terminal 12 collects behavior record information including behavior data, time data, place data, medication history data, and mood history data.
  • the behavior data is preferably represented by stepwise classification information as shown in FIG.
  • behavioral data is divided into major, middle, and minor categories.
  • the major classification "work” includes the middle classification “meeting” and “desk work”.
  • the middle category “desk work” includes the minor categories "report material creation” and "document arrangement”.
  • Although the classification information of the behavior data is defined in advance, it is preferable that the subject can add, delete, and edit the classification information.
  • the action data, the time data, and the place data are automatically input by the mobile terminal 12 based on the schedule.
  • the mobile terminal 12 acquires the schedule information of the target person from the schedule application introduced in the mobile terminal 12.
  • This schedule information includes information such as the action, date and time, and location planned by the subject. Therefore, the mobile terminal 12 extracts the action data, the time data, and the place data from the schedule information, and records the extracted information as the action record information.
  • the behavior data, time data, and location data automatically recorded by the mobile terminal 12 can be manually modified.
  • the location data may be automatically modified based on the coordinate information acquired by the GPS (Global Positioning System) function provided in the mobile terminal 12.
  • Each piece of location data and the GPS coordinate information of that location are associated in advance and registered in the recording application. When the location data corresponding to the coordinate information actually acquired by the GPS function differs from the location data based on the schedule information, the mobile terminal 12 discards the location data based on the schedule information and records the location data corresponding to the coordinate information instead. If the coordinate information indicates that the subject is moving, "movement" is input as the location data and the action data.
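The location-override logic described above (registered place coordinates, discarding the schedule-based location when GPS disagrees, and recording "movement" while the coordinates are changing) might be sketched as follows. This is an illustrative sketch, not the patent's implementation; the function name, data shapes, and the coordinate tolerance `eps` are hypothetical.

```python
def resolve_location(schedule_loc, coords, registry, eps=1e-3):
    """Pick the location label for one action period.

    coords: (latitude, longitude) samples taken during the action.
    registry: mapping of registered place name -> (latitude, longitude).
    """
    lats = [c[0] for c in coords]
    lons = [c[1] for c in coords]
    # If the coordinates are moving, record "movement" instead of a place.
    if max(lats) - min(lats) > eps or max(lons) - min(lons) > eps:
        return "movement"
    # Otherwise look up the registered place matching the measured position;
    # a match overrides the schedule-based location if the two differ.
    lat, lon = coords[0]
    for place, (plat, plon) in registry.items():
        if abs(plat - lat) <= eps and abs(plon - lon) <= eps:
            return place
    return schedule_loc
```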
  • medication history data and mood history data are manually input.
  • the subject activates a recording application each time he / she takes a medicine, and inputs the name of the medicine taken and the time of taking the medicine.
  • the subject selects mood history data corresponding to the mood at the time of performing the action.
  • the medication history data and mood history data can also be manually corrected.
  • the mood history data is not limited to the facial expression mark, and may be represented by text information indicating the mood such as "sleepy" or "relaxed” and numerical information indicating the degree thereof.
  • the mobile terminal 12 collects the behavior record information of the target person. Then, the mobile terminal 12 transfers the collected action record information to the information processing device 20.
  • the transfer timing of the action record information can be set arbitrarily. For example, the mobile terminal 12 may periodically transfer the action record information, or may transfer it at the request of the target person or the operator of the information processing device 20.
  • the acquisition unit 201 acquires the behavior record information of the target person from the mobile terminal 12.
  • the action record information of the target person can be corrected based on the body movement data acquired from the sensor 11.
  • For example, the acquisition unit 201 acquires the sleep start time and end time of the subject based on the intensity of the body movement in the body movement data. If the start time and end time of sleep based on the body movement data differ from the start time and end time of sleep in the action record information, the start time and end time of sleep in the action record information are discarded and overwritten with the start time and end time of sleep based on the body movement data.
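One way to derive a sleep interval from body-movement intensity, as described above, is to take the longest contiguous run of low-activity samples. The following is a sketch under stated assumptions: the patent does not specify the detection rule, and the sampling interval and threshold here are hypothetical.

```python
def sleep_interval(activity, threshold):
    """Return (start, end) indices (half-open) of the longest contiguous run
    of body-movement samples whose intensity is below `threshold`."""
    best = (0, 0)
    start = None
    # A sentinel sample at the threshold closes any run still open at the end.
    for i, a in enumerate(list(activity) + [threshold]):
        if a < threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start > best[1] - best[0]:
                best = (start, i)
            start = None
    return best
```

The returned indices would then be converted to clock times and used to overwrite the sleep start and end times in the action record information when the two disagree.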
  • the acquisition unit 201 acquires the question / answer information of the target person (step S103). For example, the acquisition unit 201 acquires the mental state answer data regarding the mental state of the target person and the session evaluation answer data regarding the evaluation of the session performed for the target person as the question and answer information.
  • FIGS. 8 and 9 are diagrams for explaining the processing of the acquisition unit 201.
  • FIG. 8 shows an example of mental state response data.
  • FIG. 9 shows an example of session evaluation response data.
  • the operator of the information processing device 20 distributes a questionnaire corresponding to each item of the mental state response data (FIG. 8) and the session evaluation response data (FIG. 9) to the target person each time a session is performed. Then, when the operator receives the questionnaire in which the answer by the target person is entered, the operator inputs the answer (question answer information) entered in the received questionnaire into the information processing device 20.
  • the question-and-answer information may be input by image recognition for the data scanned by a camera or the like, or may be manually input by the operator.
  • the acquisition unit 201 acquires the question / answer information including the mental state answer data and the session evaluation answer data.
  • the calculation unit 202 calculates the mental state information based on the biological information, the action record information, and the question / answer information (step S104). For example, the calculation unit 202 specifies the partial data corresponding to the period in which each action is performed among the time-series autonomic nerve data, and calculates the statistical value of the specified partial data as the mental state information.
  • FIG. 10 is a diagram for explaining the processing of the calculation unit 202.
  • In the "XXX" fields of FIG. 10, numerical values indicating the mental state information calculated by the calculation unit 202 are entered.
  • the calculation unit 202 calculates the mental state information in each action based on various information acquired by the acquisition unit 201.
  • the mental state information is, for example, a representative value of sympathetic nerve data, parasympathetic nerve data, and total power data in each action.
  • For example, the calculation unit 202 links the time series of the biological information with the time series of the action record information by collating the measurement start time of the sensor 11 with the time data included in the action record information. Then, the calculation unit 202 specifies the partial data corresponding to the period in which each action was performed in the time-series autonomic nerve data. Specifically, the calculation unit 202 specifies, among the time-series sympathetic nerve data, the sympathetic nerve data in the period corresponding to the period "24:00 to 7:00" in which the behavior data "sleep" was performed. Then, the calculation unit 202 calculates the average value of the sympathetic nerve data for the specified period as the mental state information of FIG. 10. The calculation unit 202 calculates the parasympathetic nerve data and the total power data in the same manner as the sympathetic nerve data.
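The period-matching and averaging described above can be sketched minimally as follows. This is not the patent's implementation: one autonomic sample per minute starting at the sensor's measurement start time is an assumption, and the function and variable names are illustrative.

```python
from datetime import datetime
from statistics import mean

def mental_state_per_action(samples, measure_start, actions):
    """samples: one autonomic-nerve value (e.g. SNS level) per minute,
    beginning at `measure_start`.
    actions: (label, begin, end) tuples from the action record information.
    Returns the mean sample value within each action's period."""
    result = {}
    for label, begin, end in actions:
        # Convert each action's clock times into sample indices.
        i = int((begin - measure_start).total_seconds() // 60)
        j = int((end - measure_start).total_seconds() // 60)
        result[label] = mean(samples[i:j])
    return result
```

The same slicing would be repeated for the parasympathetic and total-power series, and any other statistic (peak, median, standard deviation) could replace the mean.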
  • the calculation unit 202 calculates the mental state information in each action based on the biological information and the action record information.
  • Here, the case where the "mean value" is calculated as the representative value of the sympathetic nerve data in each behavior has been described, but the present invention is not limited to this, and any statistical value such as the peak value, the median value, or the standard deviation can be calculated.
  • the calculation unit 202 calculates the mental state score of the subject based on the mental state response data. For example, the calculation unit 202 calculates the mental state score related to the mental state of the subject by inputting the score for each item included in the mental state answer data into an arbitrary function.
  • the calculation unit 202 calculates the session evaluation score of the session based on the session evaluation response data. For example, the calculation unit 202 calculates the session evaluation score related to the session evaluation by inputting the score for each item included in the session evaluation response data into an arbitrary function.
  • the calculation method of the mental state score and the session evaluation score can be appropriately changed according to the question items included in each question answer information.
  • the generation unit 203 generates a comment based on the mental state information (step S105). For example, the generation unit 203 generates at least one of a comment and an image that evaluates the mental state in each action based on the mental state information in each action.
  • FIGS. 11, 12, and 13 are diagrams for explaining the processing of the generation unit 203.
  • FIG. 11 shows an example of a template for each comment.
  • FIG. 12 shows an example of a comment generated by the generation unit 203.
  • FIG. 13 shows an example of the comprehensive determination generated by the generation unit 203.
  • the auxiliary storage device 24 stores information in which the template of each comment and the generation condition of each comment are associated with each of the plurality of comments.
  • For example, the auxiliary storage device 24 stores information in which the template "XX% of the period from XX:XX to XX:XX was good quality sleep" is associated with the sympathetic nerve data generation condition "less than threshold A", the parasympathetic nerve data generation condition "threshold B or more", and the behavior data generation condition "sleep". This information indicates that, when there is an action for which the sympathetic nerve data is "less than threshold A", the parasympathetic nerve data is "threshold B or more", and the behavior data is "sleep", a comment is generated based on the template "XX% of the period from XX:XX to XX:XX was good quality sleep".
  • the generation unit 203 determines whether or not the generation condition of each comment is satisfied for each action included in the action record information.
  • the generation unit 203 reads the template of the comment from the auxiliary storage device 24. Then, the generation unit 203 generates a comment of the action satisfying the generation condition based on the read template.
  • For example, when there is an action for which the sympathetic nerve data is "less than threshold A", the parasympathetic nerve data is "threshold B or more", and the behavior data is "sleep", the generation unit 203 reads the comment template "XX% of XX:XX to XX:XX is good quality sleep". Then, the generation unit 203 calculates the start time, the end time, and the ratio of good quality sleep of the "sleep" that satisfies the generation condition, and inputs the calculated information into the template to generate the comment "29% of 23:05 to 6:00 is good quality sleep" (FIG. 12). The ratio of good quality sleep is calculated as, for example, the proportion of the sleep period during which the PSNS level is equal to or higher than a predetermined value.
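The template-based generation rule above can be sketched in code. The following is a minimal Python sketch; the threshold values, data field names, and the good-sleep criterion (share of sleep samples with a PSNS level at or above a cutoff) are all illustrative assumptions, not values from the embodiment.

```python
# Illustrative sketch of the comment-generation rule described above.
# THRESHOLD_A, THRESHOLD_B, PSNS_GOOD, and the field names are assumed.
THRESHOLD_A = 2.0   # sympathetic (SNS) data must be below this
THRESHOLD_B = 1.5   # parasympathetic (PSNS) data must be at or above this
PSNS_GOOD = 2.5     # PSNS level counted as "good quality sleep"

def sleep_comment(action, psns_series):
    """Return a comment if the action satisfies the generation condition
    (SNS < A, PSNS >= B, behavior == "sleep"); otherwise return None."""
    if not (action["behavior"] == "sleep"
            and action["sns"] < THRESHOLD_A
            and action["psns"] >= THRESHOLD_B):
        return None
    good = sum(1 for v in psns_series if v >= PSNS_GOOD)
    ratio = round(100 * good / len(psns_series))
    return (f"{ratio}% of {action['start']} to {action['end']} "
            "is good quality sleep")

action = {"behavior": "sleep", "sns": 1.2, "psns": 1.8,
          "start": "23:05", "end": "6:00"}
print(sleep_comment(action, [2.6, 2.1, 2.7, 1.9, 2.4, 2.8, 2.0]))
# -> "43% of 23:05 to 6:00 is good quality sleep"
```

The embodiment inserts the calculated values into a template string stored in the auxiliary storage device 24; the hard-coded f-string here merely stands in for that stored template.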
  • the generation unit 203 generates a comprehensive judgment of a common action performed a plurality of times within a certain period, based on the mental state information in each of those occurrences. For example, if the behavior "sleep" was recorded 7 times between the previous session and the current session, the generation unit 203 makes a "comprehensive judgment of sleep" based on the mental state information of the 7 sleeps. Specifically, the generation unit 203 inputs the sympathetic nerve data and the parasympathetic nerve data of the seven sleeps into an arbitrary function and makes a comprehensive judgment based on the four-stage score of "0" to "3" shown in FIG. 13. Then, the generation unit 203 generates a comprehensive judgment comment corresponding to the score. The comprehensive judgment comments are stored in the auxiliary storage device 24 in advance.
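The embodiment states only that the sleeps' sympathetic and parasympathetic data are fed into an arbitrary function and mapped to the four-stage score "0" to "3". The sketch below fills in that arbitrary function with an assumed rule (mean PSNS-minus-SNS balance, binned by assumed edges); the bin boundaries and comments are illustrative.

```python
# Hedged sketch of the comprehensive sleep judgment: the scoring function
# is left arbitrary in the embodiment, so this example averages the
# PSNS - SNS balance over the period's sleeps and bins it into 0..3.
def comprehensive_sleep_score(sleeps):
    """sleeps: list of (sns, psns) pairs, one per sleep in the period."""
    balance = sum(psns - sns for sns, psns in sleeps) / len(sleeps)
    edges = [-0.5, 0.0, 0.5]                       # assumed bin boundaries
    return sum(1 for e in edges if balance > e)    # four-stage score 0..3

# Assumed comments corresponding to each score
COMMENTS = {0: "Sleep quality was poor overall.",
            1: "Sleep quality was somewhat poor.",
            2: "Sleep quality was fairly good.",
            3: "Sleep quality was good overall."}

week = [(1.2, 1.8), (1.5, 1.4), (1.1, 2.0), (1.3, 1.7),
        (1.6, 1.5), (1.2, 1.9), (1.4, 1.6)]        # 7 nights of (SNS, PSNS)
score = comprehensive_sleep_score(week)
print(score, COMMENTS[score])
```

Looking up the comment by score mirrors the embodiment's retrieval of a pre-stored comprehensive judgment comment from the auxiliary storage device 24.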
  • the output control unit 204 outputs the mental state information and the comment (step S106). For example, the output control unit 204 outputs information in which action data indicating each action, a period during which each action is performed, and mental state information in each action are associated with each other.
  • FIGS. 14, 15, and 16 are diagrams for explaining the processing of the output control unit 204.
  • the vertical axis corresponds to the SNS level and the horizontal axis corresponds to the PSNS level.
  • the size of the circle corresponds to the size of the period (time) in which each action is performed.
  • the output control unit 204 displays the autonomic nerve data of each action.
  • the output control unit 204 plots four behaviors: work, movement, meals, and sleep. As a result, the operator can easily grasp the differences in the SNS level and the PSNS level between actions. For example, the operator can see that the SNS level during work is higher than during movement, meals, and sleep.
  • the output control unit 204 simultaneously displays the latest autonomic nerve data and the past autonomic nerve data for each action.
  • the output control unit 204 plots the autonomic nerve data in the current work, the autonomic nerve data in the previous work, and the autonomic nerve data in the work two times before.
  • the operator can easily grasp the difference between the current mental state and the past mental state in each action. For example, the operator finds that the SNS level at work is higher than before.
  • the output control unit 204 simultaneously displays standard autonomic nerve data in each action.
  • the output control unit 204 plots the autonomic nerve data in each behavior of the target person and the autonomic nerve data of a standard person of the same generation as the target person.
  • the operator can easily grasp the difference between the mental state of the subject and the standard person. For example, the operator can see that the SNS level of the subject is high in any of the behaviors of work, movement, eating, and sleeping. In addition, the operator can see that the sleep time of the subject is shorter than that of the standard person.
  • the output control unit 204 displays the mental state score and the session evaluation score as the mental state information.
  • FIGS. 17 and 18 are diagrams for explaining the processing of the output control unit 204.
  • the vertical axis corresponds to the index of each data.
  • the horizontal axis corresponds to the number of sessions.
  • the vertical axis is plotted with the index of each data at the time of the first session as 100.
  • the output control unit 204 displays the mental state score and the total power data.
  • This total power data is a representative value of the total power data for the period (for example, one week) divided by each session.
  • the operator can simultaneously view the objective mental state information detected by the sensor 11 together with the mental state score.
  • The mental state score may include inconsistencies and falsehoods depending on the subject's mental state at the time of answering; however, since the operator can simultaneously view objective information indicating the mental state underlying such inconsistencies and falsehoods, this can be useful for advice at the next session.
  • the output control unit 204 displays the session evaluation score and the total power data.
  • This total power data is a representative value of the total power data for the period (for example, one week) divided by each session.
  • the operator can simultaneously view the objective mental state information detected by the sensor 11 together with the session evaluation score.
  • The mental state score may include inconsistencies and falsehoods depending on the subject's mental state at the time of answering; however, since the operator can simultaneously view objective information indicating the mental state underlying such inconsistencies and falsehoods, this can be useful when deciding the content of the next session.
  • the output control unit 204 displays comments related to each action in association with the action record information.
  • FIG. 19 is a diagram for explaining the processing of the output control unit 204.
  • FIG. 19 illustrates information in which biological information including arrhythmia, heart rate, physical activity, sympathetic nerve, and parasympathetic nerve is associated with behavior record information including behavior history and mood history.
  • the horizontal axis corresponds to time.
  • the output control unit 204 collates the measurement start time by the sensor 11 with the time data included in the action record information to obtain a time series of biometric information and a time series of action record information. Display in association with each other. Then, the output control unit 204 displays a comment regarding each action in association with the action record information.
  • The comment generated by the generation unit 203 is attached to the action that satisfies the comment's generation condition. For example, the output control unit 204 displays the comment "29% of 23:05 to 6:00 is good quality sleep" in association with the "sleep" that ended at "6:00" on "2018/10/19". In addition, the output control unit 204 displays the comment "I am excited during the meal" in association with the "dinner" that ended at "20:00" on "2018/10/18".
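The collation of the sensor's measurement start time with the time data in the action record, which aligns the two time series for display, can be sketched as follows. The sampling interval and data layout are assumptions for illustration.

```python
# Sketch of associating the biometric time series with the action record:
# the sensor's measurement start time converts sample indices into clock
# times, which are then matched against each action's period.
from datetime import datetime, timedelta

def samples_for_action(start_time, interval_s, n_samples, action):
    """Return the sample indices whose timestamps fall inside the
    action's [begin, end) period."""
    begin, end = action["begin"], action["end"]
    idx = []
    for i in range(n_samples):
        t = start_time + timedelta(seconds=i * interval_s)
        if begin <= t < end:
            idx.append(i)
    return idx

# Assumed: measurement starts at 19:00, one sample every 5 minutes
start = datetime(2018, 10, 18, 19, 0)
dinner = {"begin": datetime(2018, 10, 18, 19, 30),
          "end": datetime(2018, 10, 18, 20, 0)}
print(samples_for_action(start, 300, 24, dinner))  # -> [6, 7, 8, 9, 10, 11]
```

The returned indices select the biometric samples to display alongside (and to evaluate against) the corresponding action in the action record.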
  • the output control unit 204 outputs mental state information and comments.
  • the output control unit 204 may transmit the mental state information and the comment to an external device, or may store the mental state information and the comment in a memory or a recording medium.
  • FIGS. 14 to 19 are merely examples and do not require that all display forms be shown at the same time.
  • the operator can appropriately select and display the display modes illustrated in FIGS. 14 to 19 as necessary.
  • the display items shown in FIGS. 14 to 19 can be changed by the operator.
  • the total power data shown in FIGS. 17 and 18 may be replaced with other mental state information, or a plurality of mental state information may be displayed.
  • any statistical value such as a mean value, a peak value, a median value, and a standard deviation can be used.
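A representative value of the total power data over a session period can be computed with any of the statistics named above. A small sketch follows; the weekly values and the choice of mean as default are assumptions.

```python
# Sketch of computing a representative value of total power data for a
# session period. Mean is used by default, but as noted above any
# statistic (peak, median, standard deviation, ...) may be chosen.
import statistics

def representative_total_power(tp_values, stat="mean"):
    funcs = {"mean": statistics.mean, "median": statistics.median,
             "peak": max, "stdev": statistics.stdev}
    return funcs[stat](tp_values)

# Assumed one-week series of daily total power values
week_tp = [980.0, 1010.0, 995.0, 1020.0, 1000.0, 990.0, 1005.0]
print(representative_total_power(week_tp))           # mean -> 1000.0
print(representative_total_power(week_tp, "peak"))   # max -> 1020.0
```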
  • the acquisition unit 201 acquires the biological information including the autonomic nerve data of the subject and the action record information in which a plurality of actions of the subject are recorded.
  • the calculation unit 202 calculates the mental state information in each action of the subject based on the biological information and the action record information. According to this, the operator can accurately grasp the mental state of the subject.
  • The mental state can be objectively quantified based on the electrocardiographic waveform data collected by the sensor 11, so the mental state of the subject can be accurately grasped.
  • In the information processing device 20, since the mental state in each action is quantified, it becomes easy to identify the actions that cause stress. Therefore, the operator can give more appropriate advice to the subject.
  • the generation unit 203 adjusts the generation condition of each comment stored in the auxiliary storage device 24 according to the total power data of the target person. For example, when the total power data of the target person is larger than the standard data, the generation unit 203 increases the threshold value included in the generation condition. Further, when the total power data of the target person is smaller than the standard data, the generation unit 203 reduces the threshold value included in the generation condition. Then, the generation unit 203 determines whether or not the generation condition of each comment is satisfied by using the generation condition after adjustment. As a result, the information processing device 20 can appropriately add comments according to the size of the individual autonomic nerve data.
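The per-subject threshold adjustment can be sketched as follows. The proportional scaling rule is an assumption; the embodiment says only that the threshold is raised when the subject's total power exceeds the standard data and lowered when it falls short.

```python
# Assumed sketch of adjusting a generation-condition threshold according
# to the subject's total power relative to standard data.
def adjust_threshold(base_threshold, subject_tp, standard_tp):
    # Proportional scaling: larger total power -> larger threshold,
    # smaller total power -> smaller threshold.
    return base_threshold * (subject_tp / standard_tp)

print(adjust_threshold(2.0, subject_tp=1200.0, standard_tp=1000.0))  # raised
print(adjust_threshold(2.0, subject_tp=800.0, standard_tp=1000.0))   # lowered
```

The adjusted threshold is then used in place of the stored one when checking each comment's generation condition, so comments track the size of the individual's autonomic nerve data.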
  • The case has been described in which the mental state information is calculated based on the biological information, the action record information, and the question-and-answer information, and the comment is generated based on the calculated mental state information; however, the embodiment is not limited to this.
  • it is also possible to generate a mental state determination result by inputting biological information and behavior record information into a learned model created by machine learning.
  • the mental state determination result is obtained by generating (determining) data corresponding to the mental state response data and the session evaluation response data by the trained model.
  • For a plurality of subjects (second subjects) having a specific mental tendency, the information processing device 50 constructs a trained model using biometric information including the autonomic nerve data of each subject, behavior record information in which a plurality of behaviors of each subject are recorded, and question-and-answer information answered by each subject.
  • the information processing device 50 determines the mental state of the arbitrary target person by inputting the biological information and the action record information of the arbitrary target person (first target person) into the constructed learned model. Generates the mental state judgment result.
  • the first target person may be included in the second target person.
  • the "second target person” is, for example, a target person including a person having a specific mental tendency.
  • "Specific mental tendencies" include mental disorders such as schizophrenia, mood disorders (depression, manic-depressive illness), panic disorder, eating disorders, developmental disorders, dementia, epilepsy, addiction, and higher brain dysfunction.
  • When persons having a mental disorder are used as the second subjects, the accuracy of the above-mentioned trained model is likely to improve; in particular, when persons having depression are the second subjects, the accuracy of the trained model is likely to improve.
  • the second target person is not limited to a person having a mental disorder, and may be a person classified according to a mental tendency such as preference.
  • the second subject may be a person who is involved in psychological counseling.
  • FIG. 20 is a diagram showing an example of a schematic configuration of the information processing apparatus according to the second embodiment.
  • the information processing device 50 according to the embodiment is in a state where it can directly or indirectly communicate with the sensor 11 and the mobile terminal 12 via a network such as a LAN or WAN.
  • the information processing device 50 is an example of a biological information analyzer. Further, since the sensor 11 and the mobile terminal 12 are the same as the sensor 11 and the mobile terminal 12 shown in FIG. 1, the description thereof will be omitted.
  • the information processing device 50 includes an acquisition unit 501, a determination unit 502, and an output control unit 503. Since the acquisition unit 501 and the output control unit 503 are the same as the acquisition unit 201 and the output control unit 204 shown in FIG. 1, the description thereof will be omitted. Further, since the hardware configuration of the information processing device 50 is the same as the hardware configuration of the information processing device 20 shown in FIG. 2, the description thereof will be omitted.
  • the determination unit 502 will be described later.
  • FIG. 21 is a diagram showing processing during learning and operation performed by the information processing apparatus 50 according to the second embodiment.
  • the subject X shown in FIG. 21 is an example of the first subject.
  • the subjects S-1 to S-N are examples of the second subject.
  • The information processing apparatus 50 performs machine learning using, as training data, the biometric information, the behavior record information, and the question/answer information (mental state answer data and session evaluation answer data) of a plurality of subjects S-1 to S-N (for example, a plurality of depression-prone persons and a plurality of healthy persons).
  • In this machine learning, the biometric information and the action record information are the input data, and the question/answer information (mental state answer data and session evaluation answer data) is the correct answer data.
  • As a result, a trained model is constructed that outputs the mental state determination result of a diagnosis subject when the biological information and the behavior record information of the diagnosis subject are input.
  • the trained model is stored in a storage device (for example, ROM 22, RAM 23, auxiliary storage device 24, etc.).
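The learning-time flow (input: biometric and behavior-record features; correct answers: question-and-answer labels) and the operation-time flow can be sketched as follows. A nearest-centroid classifier stands in for the embodiment's unspecified learning method, and the three features (mean SNS, mean PSNS, sleep hours per day) and the two labels are illustrative assumptions.

```python
# Pure-Python sketch of the learning/operation flow described above.
# The classifier, features, and labels are assumptions for illustration.
def train(samples):
    """samples: list of (feature_vector, label); returns label -> centroid."""
    groups = {}
    for x, y in samples:
        groups.setdefault(y, []).append(x)
    return {y: [sum(col) / len(xs) for col in zip(*xs)]
            for y, xs in groups.items()}

def predict(model, x):
    """Return the label whose centroid is closest to feature vector x."""
    def sqdist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(model, key=lambda y: sqdist(model[y], x))

# Learning: features and questionnaire-derived labels for subjects S-1..S-4
training = [([2.1, 1.0, 5.5], "depressive"), ([2.3, 0.9, 5.0], "depressive"),
            ([1.2, 1.8, 7.5], "healthy"), ([1.0, 2.0, 8.0], "healthy")]
model = train(training)

# Operation: determine the mental state of a new subject X
print(predict(model, [2.0, 1.1, 5.8]))  # -> "depressive"
```

At operation time only the biometric and behavior-record features of subject X are needed; the questionnaire labels were consumed during learning, which is what lets the embodiment omit question-and-answer entry for the diagnosis subject.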
  • the information processing device 50 inputs the biological information and the action record information of the subject X, who is the diagnosis target, into the learned model constructed at the time of learning. Then, the trained model is made to output the mental state determination result of the subject X.
  • the subject X is, for example, a person suspected of having a depressive tendency. Then, the information processing device 50 presents the mental state determination result of the target person X output from the learned model to the operator (or the target person X).
  • the biometric information is the same information as the biometric information described in the first embodiment.
  • the biological information may include all of autonomic nerve data, body temperature data, pulse data, and body movement data, or may include only arbitrary data.
  • the target period of the biological information can be arbitrarily set such as several hours, several days, and several weeks.
  • the action record information is the same information as the action record information described in the first embodiment.
  • the behavior record information may include all of the behavior data, the time data, and the mood history data, or may include only arbitrary data.
  • the target period of the action record information can be arbitrarily set, but it is preferably the same as the target period of the biometric information.
  • the question-and-answer information is the same information as the question-and-answer information described in the first embodiment.
  • the question-and-answer information may include both the mental state answer data and the session evaluation answer data, or may include only one of the data.
  • The question-and-answer information may be acquired once, as a tendency over the whole of an arbitrary target period (preferably the same as the target period of the biological information), or may be acquired an arbitrary number of times at arbitrary timings; acquiring it once as a tendency over the whole period is preferable.
  • The process of constructing the trained model is executed by, for example, the determination unit 502. That is, for a plurality of second subjects having a specific mental tendency, the determination unit 502 acquires the biological information including the autonomic nerve data of each second subject, the behavior record information in which a plurality of actions of each second subject are recorded, and the question-and-answer information answered by each second subject.
  • the biological information, the behavior record information, and the question / answer information of the second subject are collected in advance and stored in a storage device (for example, ROM 22, RAM 23, auxiliary storage device 24, etc.).
  • the determination unit 502 acquires the information by reading the biological information, the action record information, and the question / answer information of the second subject from the storage device.
  • the determination unit 502 constructs a learned model by performing machine learning using the acquired biological information, action record information, and question / answer information.
  • the process of constructing the trained model is not limited to the determination unit 502, and can be executed by any processing unit (processor).
  • FIG. 22 is a flowchart showing an operation example of the information processing apparatus 50 according to the second embodiment.
  • the mental state determination process shown in FIG. 22 is a process for determining whether or not the subject X suspected of having a depressive tendency or the like has a depressive tendency or the like.
  • the mental state determination process can be executed at any timing.
  • the acquisition unit 501 acquires the biological information of the subject X (step S201).
  • the acquisition unit 501 acquires the biological information of the subject X by generating the biological information of the subject X based on various measurement data collected by the sensor 11.
  • the acquisition unit 501 acquires the action record information of the target person X (step S202). For example, the acquisition unit 501 acquires the action record information from the mobile terminal 12 of the target person X.
  • The determination unit 502 generates the mental state determination result of the subject X by inputting the biological information and the action record information of the subject X into the trained model. For example, the determination unit 502 reads out the trained model stored in the storage device. Then, the determination unit 502 inputs the biological information and the action record information acquired by the acquisition unit 501 into the read trained model, causing the trained model to output the mental state determination result of the subject X. After that, the determination unit 502 sends the mental state determination result output from the trained model to the output control unit 503.
  • the mental state judgment result output from the trained model includes the same type of information as the question and answer information used for constructing the trained model. That is, when the question and answer information used in machine learning includes both the mental state answer data and the session evaluation answer data, the mental state judgment result output from the trained model is the mental state answer data and the session evaluation answer. Includes both data.
  • Similarly, when the question-and-answer information used in machine learning includes only the mental state answer data, the mental state determination result output from the trained model includes the mental state answer data; when it includes only the session evaluation answer data, the result includes the session evaluation answer data.
  • the output control unit 503 outputs the mental state determination result of the subject X.
  • the output control unit 503 causes the display device 26 to display the mental state determination result.
  • the output destination of the mental state determination result is not limited to the display device 26, and may be transmitted to, for example, an external device, or may be stored in a memory or a recording medium.
  • the acquisition unit 501 acquires the biological information including the autonomic nerve data of the subject and the action record information in which a plurality of actions of the subject are recorded.
  • the determination unit 502 generates a mental state determination result of the subject by inputting the biological information and the behavior record information of the subject into the trained model. According to this, the operator can accurately grasp the mental state of the subject.
  • Even if the subject does not enter question-and-answer information, the mental state determination result corresponding to the mental state answer data and/or the session evaluation answer data can be automatically generated (determined) by the trained model. Therefore, the operator can easily grasp whether or not the subject has a specific mental tendency by viewing the mental state determination result output from the information processing device 50.
  • the mental state determination process can be applied to a person who is not suspected of having a depressive tendency or the like.
  • The subject X is also referred to as the first subject or the determination subject.
  • the first target person may be any one of a plurality of second target persons.
  • the trained model is constructed by the information processing device 50
  • the embodiment is not limited to this.
  • the trained model may be constructed by an external device different from the information processing device 50.
  • the information processing device 50 acquires the trained model constructed by the external device and executes the above-mentioned mental state determination process.
  • The information processing device 50 described in the second embodiment can not only present the mental state determination result to the operator as it is, but can also calculate and present a "match rate" with the question-and-answer information of depression-prone persons and the like.
  • the determination unit 502 collates the mental state determination result of the subject X with the representative question-and-answer information of a depressed person or the like.
  • As the representative question-and-answer information of depression-prone persons and the like, the question-and-answer information of any one depression-prone person may be used as it is, or a representative value of the question-and-answer information of a plurality of depression-prone persons may be used.
  • the determination unit 502 collates each item included in the mental state determination result with each item included in the question and answer information, and calculates the number of matching items. Then, the determination unit 502 calculates the "match rate" by dividing the number of matching items by the total number of items.
  • the calculation method of the "match rate” described here is just an example, and any calculation method can be applied.
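The item-by-item comparison and division by the total item count can be sketched as follows; the item names and answers are illustrative.

```python
# Sketch of the match-rate computation described above: compare each item
# of the generated determination result against the representative
# question-and-answer information, then divide matches by total items.
def match_rate(determination, reference):
    """Both arguments map question items to answers; returns a 0-1 rate."""
    matches = sum(1 for item, ans in reference.items()
                  if determination.get(item) == ans)
    return matches / len(reference)

reference = {"q1": "yes", "q2": "no", "q3": "yes", "q4": "no"}
result    = {"q1": "yes", "q2": "no", "q3": "no",  "q4": "no"}
print(match_rate(result, reference))  # 3 of 4 items match -> 0.75
```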
  • the output control unit 503 outputs the "match rate" calculated by the determination unit 502.
  • the information processing device 50 can support the diagnosis by the operator (medical worker such as a doctor). For example, the operator can accurately and easily diagnose whether or not the subject has a specific mental tendency by viewing the match rate.
  • the mental state determination result does not necessarily have to be output.
  • The information processing device 50 described in the first modification of the second embodiment can not only present the match rate to the operator as it is, but can also present a suggestion of the possibility that the subject is a depression-prone person or the like (a suggestion of a possible mental tendency).
  • the suggestion of the possibility of mental tendency may be information that indicates the degree of possibility of depressive tendency step by step.
  • the mental tendency determination result may be information in which the possibility of a depressive tendency is determined in three stages of "high", "medium", and "low" according to the level of the match rate.
  • In this case, two threshold values are used.
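The two-threshold, three-stage suggestion can be sketched as follows; the threshold values are assumptions.

```python
# Sketch of the three-stage suggestion: two assumed thresholds split the
# match rate into "high" / "medium" / "low" possibility of a depressive
# tendency.
def tendency_suggestion(rate, low=0.4, high=0.7):
    if rate >= high:
        return "high"
    if rate >= low:
        return "medium"
    return "low"

print(tendency_suggestion(0.75))  # -> "high"
```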
  • the information presented to the operator that suggests the possibility of mental tendency is for the purpose of supporting the diagnosis, not the diagnosis result itself. That is, the processing of the information processing device 50 does not correspond to a medical practice.
  • the trained model (mental state determination process) described in the second embodiment may be applied to the information processing device 20 shown in the first embodiment.
  • the information processing device 20 can output the mental state information and the comment even if the subject himself / herself does not enter the question / answer information.
  • the information processing apparatus 20 executes the process of step S203 of FIG. 22 instead of the process of step S103 of FIG. 3. That is, the information processing device 20 generates the mental state determination result of the subject by inputting the biological information and the action record information acquired in steps S101 and S102 of FIG. 3 into the trained model.
  • the information processing device 20 uses the mental state determination result instead of the question answer information. That is, the information processing device 20 calculates the mental state information based on the biological information, the action record information, and the mental state determination result. Since the mental state determination result is data corresponding to the mental state answer data and the session evaluation answer data, it can be used instead of the question answer information.
  • the information processing apparatus 20 can output the mental state information and the comment by executing the same processing as the processing of step S105 and step S106 of FIG.
  • Sensor attachable to the body surface: In the above-described embodiment, it was explained that a known technique can be appropriately selected and applied to the sensor 11. However, in order to collect biological information over a long period (several days to several weeks), it is preferable to apply a sensor that can be attached to the body surface.
  • The sensor attached to the body surface is preferably a wearable sensor whose circuit is housed in a flexible device that does not break when bent, so that it can flexibly follow the movement of the body.
  • It is also preferable that the sensor can be attached to the body surface with an adhesive, such as an acrylic resin, that does not cause discomfort, has high medical-grade characteristics, and conforms to biological safety tests.
  • the acquisition unit 201 acquires biometric information based on the data collected by the sensor 11 that can be attached to the body surface of the subject.
  • the acquisition unit 201 can collect both the biological information during sleep and the biological information during awakening without omission, so that it is possible to continuously collect biological information over a long period of time.
  • the biological information used in this embodiment does not necessarily have to be continuous data, but it is preferably continuous data.
  • As the biological information, continuous data of 24 hours or more is preferable, continuous data of 72 hours or more is more preferable, continuous data of 144 hours or more is even more preferable, and continuous data of 168 hours or more is most preferable.
  • Each function included in the information processing apparatus 20 according to the above-described embodiment may be provided as a system.
  • FIG. 23 is a diagram showing an example of a schematic configuration of the system according to the embodiment.
  • the system 1 according to the embodiment includes a server device 30 and a display terminal 40.
  • the server device 30 and the display terminal 40 are in a state of being able to communicate directly or indirectly with each other via a network such as a LAN (Local Area Network) or a WAN (Wide Area Network).
  • the system 1 is an example of a biological information analysis system.
  • the sensor 11 and the mobile terminal 12 are in a state of being able to communicate directly or indirectly with the server device 30 via the network, they do not have to be always connected to the server device 30. Since the sensor 11 and the mobile terminal 12 have basically the same configuration as the sensor 11 and the mobile terminal 12 shown in FIG. 1, description thereof will be omitted.
  • the server device 30 is provided in, for example, a service center that provides biometric information analysis processing as a cloud service.
  • the server device 30 includes an acquisition unit 301, a calculation unit 302, a generation unit 303, and an output control unit 304.
  • the processing contents of the acquisition unit 301, the calculation unit 302, the generation unit 303, and the output control unit 304 are basically the same as those of the acquisition unit 201, the calculation unit 202, the generation unit 203, and the output control unit 204 shown in FIG. 1, so their description will be omitted.
  • the display terminal 40 is, for example, an information processing terminal used by a user of a cloud service, and is, for example, a terminal provided with a display device such as a personal computer, a workstation, a smartphone, or a tablet.
  • the user of the cloud service corresponds to a person who provides psychological counseling (for example, a counselor, a doctor, etc.).
  • the output control unit 304 of the server device 30 transmits the mental state information in each action of the subject to the display terminal 40.
  • the display control unit 401 of the display terminal 40 receives the mental state information in each action of the target person transmitted from the server device 30, and displays the mental state information in each action of the received target person. According to this, the user can accurately grasp the mental state of the subject.
  • Each component of each device shown in the figures is a functional concept and does not necessarily have to be physically configured as shown. That is, the specific form of distribution and integration of each device is not limited to the illustrated form, and all or a part thereof can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions. Further, each processing function performed by each device may be realized by a CPU and a program analyzed and executed by the CPU, or may be realized as hardware by wired logic.
  • The biometric information analysis method described in the above embodiment can be realized by executing a biometric information analysis program prepared in advance on a computer such as a personal computer or a workstation.
  • This biometric information analysis program can be distributed via a network such as the Internet.
  • this biometric information analysis program may be recorded on a computer-readable recording medium such as a hard disk, flexible disk (FD), CD-ROM, MO, or DVD, and executed by being read from the recording medium by a computer.
  • the mental state of the subject can be accurately grasped.
  • Examples 1 to 5, Comparative Example 1: By the method described in the second embodiment, trained models were created that output mental state response data and session evaluation response data upon input of biological information and behavior record information. By varying the types of data used for machine learning, a trained model was created for each of Examples 1 to 5 and Comparative Example 1 shown in FIG. 24. Note that FIG. 24 is a diagram for explaining the trained model according to each Example and Comparative Example.
  • the input information (training data) used to create the trained model will be explained.
  • “biological information”, “behavior record information”, “mental state response data”, and “session evaluation response data” were used as input information.
  • input information for one week (168 hours), including weekdays and holidays, was collected from 15 subjects with depression and 15 healthy subjects.
  • as the biological information, autonomic nerve data, body temperature data, pulse data, and body movement data were used.
  • the autonomic nerve data included sympathetic nerve data, parasympathetic nerve data, and total power data, and each was generated from electrocardiographic waveform data.
  • the body temperature data was generated from the temperature data.
  • the pulse data is time series information of the pulse.
  • the body movement data indicates the amount of physical activity and was generated from the acceleration data.
  • the electrocardiographic waveform data, body temperature data, pulse wave data, and acceleration data were collected over time by a sensor in a form that can be worn on the body surface of the subject (for example, a wearable device of the form described in JP-T-2017-510390).
  • as the behavior record information, the behavior data, time data, and mood history data shown in FIG. 6 were used.
  • the behavior record information was collected by having each subject record it over the target period (the above 7 days).
  • the mental state response data and the session evaluation response data were collected by having each subject record them once per target period, as the overall tendency for each Example and Comparative Example.
  • the trained models according to the Examples and Comparative Example were created using, as training data, the various data corresponding to the marks in FIG. 24.
  • the trained models according to each Example and Comparative Example were created by machine learning using the biometric information and action record information as "input data" and the mental state response data and session evaluation response data as "correct answer data".
  • a random forest was used as a machine learning method.
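The random-forest training setup described above can be sketched as follows. This is only an illustrative sketch under stated assumptions, not the patent's actual implementation: the feature layout, dataset size, and label encoding are hypothetical stand-ins for the biometric/action-record "input data" and the questionnaire-style "correct answer data".

```python
# Illustrative sketch: a random forest mapping biometric + action-record
# features ("input data") to a questionnaire response item ("correct
# answer data"). All names, shapes, and encodings here are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical input data: one row per subject/period, with columns such
# as hourly SNS, PSNS, and total-power summaries plus encoded action records.
n_samples, n_features = 30, 24
X = rng.normal(size=(n_samples, n_features))

# Hypothetical correct-answer data: one categorical questionnaire item
# (e.g. 0 = low, 1 = middle, 2 = high).
y = rng.integers(0, 3, size=n_samples)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# At operation time, new biometric/action-record features go in and a
# predicted questionnaire response comes out, one per input row.
pred = model.predict(X[:5])
print(pred.shape)
```

In the experiments described here, a separate model would be trained per Example with the data types selected for that Example; the random seed and synthetic data above exist only to make the sketch runnable.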
  • the trained model according to Example 1 was created using training data (input data and correct answer data) for 24 hours (1 day) corresponding to weekdays (Monday to Friday).
  • autonomic nerve data was used as biological information, and machine learning was performed without using body temperature data, pulse data, and body movement data.
  • the trained model according to Example 2 was created using training data for 24 hours (1 day) corresponding to holidays (Saturday, Sunday).
  • autonomic nerve data was used as biological information, and machine learning was performed without using body temperature data, pulse data, and body movement data.
  • the trained model according to Example 3 was created using training data for 24 hours (1 day) corresponding to weekdays.
  • the trained model according to Example 4 was created using training data for 72 hours (3 days) corresponding to weekdays.
  • the training data for these three days is, for example, a combination of training data for Monday, Wednesday, and Friday.
  • the trained model according to Example 5 was created using training data for 168 hours (one week) corresponding to weekdays and holidays.
  • the trained model related to the comparative example was created using training data for 24 hours (1 day) corresponding to weekdays.
  • pulse data was used as biometric information, and machine learning was performed without using autonomic nerve data, body temperature data, and body movement data.
  • the mental state response data and session evaluation response data used for evaluation were collected by having each subject record them once after the target period had elapsed. Note that the reason these data were collected once after each target period is that they are considered to capture the tendency throughout that period. In addition, when the training data for Monday, Wednesday, and Friday were combined, the mental state response data and session evaluation response data collected at the end of the data collection on Friday were adopted.
  • the matching rate was determined from the ratio of matching items among the plurality of items included in the mental state response data and the session evaluation response data. Specifically, a trial was determined to be a "match" when 70% or more of the items matched. The matching rate was then evaluated as the number of trials determined to match (the number of matches) out of 10 trials. Specifically, "AAA" indicates a matching rate of 90% or more; "AA", 85% or more and less than 90%; "A", 80% or more and less than 85%; "B", 75% or more and less than 80%; and "C", less than 75%.
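The 70% item-match threshold and the AAA-to-C grading described above can be expressed as two short helpers. A minimal sketch; the function names and the example trial data are hypothetical, not from the patent:

```python
# Minimal sketch of the evaluation scheme described above. Function
# names and the example data are hypothetical illustrations.

def is_match(predicted, actual, threshold=0.7):
    """A trial counts as a "match" when >= 70% of the questionnaire
    items in the predicted and actual response data agree."""
    agree = sum(p == a for p, a in zip(predicted, actual))
    return agree / len(actual) >= threshold

def grade(matching_rate):
    """Map a matching rate (matching trials / total trials) to a grade."""
    if matching_rate >= 0.90:
        return "AAA"
    if matching_rate >= 0.85:
        return "AA"
    if matching_rate >= 0.80:
        return "A"
    if matching_rate >= 0.75:
        return "B"
    return "C"

# Hypothetical example: 9 matching trials out of 10 gives a rate of 0.9.
trials = [True] * 9 + [False]
rate = sum(trials) / len(trials)
print(grade(rate))  # AAA
```

Under this reading, the grades reported for each Example correspond directly to how many of the 10 evaluation trials cleared the 70% item-agreement threshold.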
  • the degree of matching of the trained model according to Example 1 was "A".
  • the degree of matching of the trained model according to Example 2 was “B”.
  • the degree of matching of the trained model according to Example 3 was "AA”.
  • the degree of matching of the trained model according to Example 4 was "AA”.
  • the degree of matching of the trained model according to Example 5 was "AAA”.
  • the degree of matching of the trained model according to the comparative example was "C”.
  • the degree of matching of the trained model according to Example 5 was higher than that of the other trained models. From this result, it was suggested that the input data at the time of learning and operation preferably covers 7 days (168 hours) or more. That is, it was suggested that it is preferable for the acquisition unit 201 to acquire the biological information and the action record information for 7 days or more, and for the calculation unit 202 to calculate the mental state information based on the biological information and the action record information for 7 days or more.
  • it was also suggested that the input data during learning and operation preferably includes both weekdays and holidays. That is, it is preferable for the acquisition unit 201 to acquire the biological information and the action record information for several days or more including weekdays and holidays, and for the calculation unit 202 to calculate the mental state information based on that data.
  • the degree of matching of the trained models according to Examples 3 and 4 was higher than the degree of matching of the trained models according to Examples 1 and 2. From this result, it was suggested that the input data during learning and operation preferably includes body temperature data, pulse data, and body movement data as biological information.
  • the degree of matching of the trained model according to Example 1 was higher than that of the trained model according to Example 2. From this result, it was suggested that weekday data is more suitable than holiday data as input data during learning and operation.
  • the comparative example is an example in which only pulse data, which can be collected by a relatively large number of wearable devices, is used.
  • the degree of matching of the trained models according to the comparative example was lower than the degree of matching of any of the trained models according to Examples 1 to 5. From this result, it was suggested that it is preferable that the input data during learning and operation include autonomic nerve data as biometric information.

Abstract

A biological information analysis device (2) according to an embodiment comprises an acquisition unit (201) and a calculation unit (202). The acquisition unit acquires biological information including autonomic nerve data on a subject and behavior record information in which a plurality of behaviors of the subject are recorded. The calculation unit calculates mental state information in each behavior of the subject on the basis of the biological information and the behavior record information.

Description

Biological information analysis device, biological information analysis system, biological information analysis program, and biological information analysis method
 An embodiment of the present invention relates to a biological information analysis device, a biological information analysis system, a biological information analysis program, and a biological information analysis method.
 In the field of psychological counseling, it is required to grasp the mental state of the subject. For this reason, for example, each time a counselor conducts a session (interview) with the subject, the subject is asked to answer a questionnaire regarding his or her mental state, and the answers are analyzed to estimate the subject's mental state.
JP-A-2018-153244
 Here, at sites such as those described above, grasping the mental state more accurately has in recent years become necessary, due to the increase in the number of counseling sessions and the diversification of the attributes of counseling subjects.
 An object of the present invention is to provide a biological information analysis device, a biological information analysis system, a biological information analysis program, and a biological information analysis method capable of accurately grasping the mental state of a subject.
 In order to solve the above problems and achieve the object, the present invention is a biological information analysis device comprising: an acquisition unit that acquires biological information including autonomic nerve data of a subject and action record information in which a plurality of actions of the subject are recorded; and a calculation unit that calculates mental state information for each action of the subject based on the biological information and the action record information.
 The present invention is also a biological information analysis system including a server device and a display terminal. The server device includes: an acquisition unit that acquires biological information including autonomic nerve data of a subject and action record information in which a plurality of actions of the subject are recorded; a calculation unit that calculates mental state information for each action of the subject based on the biological information and the action record information; and an output control unit that transmits the mental state information for each action of the subject to the display terminal. The display terminal includes a display control unit that receives the mental state information for each action of the subject transmitted from the server device and displays the received mental state information for each action of the subject.
 The present invention is also a biological information analysis program that causes a computer to execute processes of: acquiring biological information including autonomic nerve data of a subject and action record information in which a plurality of actions of the subject are recorded; and calculating mental state information for each action of the subject based on the biological information and the action record information.
 The present invention is also a biological information analysis method including: acquiring biological information including autonomic nerve data of a subject and action record information in which a plurality of actions of the subject are recorded; and calculating mental state information for each action of the subject based on the biological information and the action record information.
 According to the present invention, it is possible to provide a biological information analysis device, a biological information analysis system, a biological information analysis program, and a biological information analysis method capable of accurately grasping the mental state of a subject.
FIG. 1 is a diagram showing an example of a schematic configuration of an information processing apparatus according to an embodiment.
FIG. 2 is a diagram showing an example of the hardware configuration of the information processing apparatus according to the embodiment.
FIG. 3 is a flowchart showing an operation example of the information processing apparatus according to the embodiment.
FIG. 4 is a diagram for explaining the processing of the acquisition unit.
FIG. 5 is a diagram for explaining the processing of the acquisition unit.
FIG. 6 is a diagram for explaining the processing of the acquisition unit.
FIG. 7 is a diagram for explaining the processing of the acquisition unit.
FIG. 8 is a diagram for explaining the processing of the acquisition unit.
FIG. 9 is a diagram for explaining the processing of the acquisition unit.
FIG. 10 is a diagram for explaining the processing of the calculation unit.
FIG. 11 is a diagram for explaining the processing of the generation unit.
FIG. 12 is a diagram for explaining the processing of the generation unit.
FIG. 13 is a diagram for explaining the processing of the generation unit.
FIG. 14 is a diagram for explaining the processing of the output control unit.
FIG. 15 is a diagram for explaining the processing of the output control unit.
FIG. 16 is a diagram for explaining the processing of the output control unit.
FIG. 17 is a diagram for explaining the processing of the output control unit.
FIG. 18 is a diagram for explaining the processing of the output control unit.
FIG. 19 is a diagram for explaining the processing of the output control unit.
FIG. 20 is a diagram showing an example of a schematic configuration of the information processing apparatus according to the second embodiment.
FIG. 21 is a diagram showing processing during learning and operation performed by the information processing apparatus according to the second embodiment.
FIG. 22 is a flowchart showing an operation example of the information processing apparatus according to the second embodiment.
FIG. 23 is a diagram showing an example of a schematic configuration of the system according to the embodiment.
FIG. 24 is a diagram for explaining the trained models according to the Examples and Comparative Example.
 Hereinafter, the biological information analysis device, biological information analysis system, biological information analysis program, and biological information analysis method according to the embodiments will be described with reference to the drawings. The embodiments are not limited to the following description. In addition, each embodiment can be combined with other embodiments or conventional techniques as long as the processing contents do not conflict.
 In the following, a case where the present embodiment is applied to psychological counseling will be described. That is, a case will be described in which a person who provides psychological counseling (for example, a counselor or a doctor) uses the information processing device according to the present embodiment in order to grasp the mental state of a subject undergoing psychological counseling. Note that the present embodiment is not limited to psychological counseling, and can be applied in various situations for grasping the mental state of a subject.
(First Embodiment)
 FIG. 1 is a diagram showing an example of a schematic configuration of an information processing apparatus according to an embodiment. As shown in FIG. 1, the information processing apparatus 20 according to the embodiment can communicate with the sensor 11 and the mobile terminal 12, directly or indirectly, via a network such as a LAN (Local Area Network) or a WAN (Wide Area Network). The information processing apparatus 20 is an example of a biological information analysis device.
 Note that the sensor 11 and the mobile terminal 12 do not have to be always connected to the information processing device 20. For example, it is sufficient that they are connected to exchange various information when a session with the subject is performed. Further, when information is exchanged via an external recording medium or the like, the sensor 11 and the mobile terminal 12 do not have to be connected to the information processing device 20.
 The sensor 11 is a sensor device worn by the subject. For example, the sensor 11 is a multi-sensor device having a plurality of sensor functions. Specifically, the sensor 11 includes the functions of an electrocardiograph, an optical pulse wave meter, a body motion meter, and a thermistor. The electrocardiograph measures time-series electrocardiographic waveform data. The optical pulse wave meter measures time-series pulse wave data. The body motion meter measures time-series acceleration data. The thermistor measures time-series temperature data. The sensor 11 records the various measured data in a memory inside the device in association with the measurement start time. Known techniques can be appropriately selected and applied for the electrocardiograph, optical pulse wave meter, body motion meter, and thermistor.
 Here, the sensor 11 is basically worn by the subject at all times. The subject removes the sensor 11 when a session is performed and hands it to the operator of the information processing device 20 (typically, a counselor or a doctor). The operator transfers the measurement data recorded since the previous session from the sensor 11 received from the subject to the information processing device 20. Details of the process of acquiring biological information from the measurement data will be described later.
 Note that the sensor 11 is not necessarily limited to a multi-sensor device, and may at least have an electrocardiograph function. Further, the sensor 11 is not necessarily limited to being worn at all times, and may be removed within a range that does not affect the biological information analysis process described later. Further, the transfer timing of the measurement data recorded in the sensor 11 is not necessarily limited to the time of a session; the transfer may be performed, for example, periodically (for example, once a day) via the network.
 The mobile terminal 12 is, for example, a smartphone owned by the subject, on which a recording application for collecting action record information in which the subject's actions are recorded is installed. The action record information is recorded in a memory inside the mobile terminal 12, either automatically or manually by the subject. The action record information recorded in the mobile terminal 12 is transferred to the information processing device 20 at an arbitrary timing. Details of the process of acquiring the action record information will be described later.
 Note that the mobile terminal 12 is not necessarily limited to a smartphone, and may be any information processing device into which an application for collecting action record information can be installed, such as a tablet or a personal computer. However, given the nature of collecting action record information, a device that the subject can carry on a daily basis (a mobile terminal) is preferable.
 The information processing device 20 is, for example, a computer such as a personal computer or a workstation. The information processing device 20 calculates mental state information indicating the mental state of the subject based on the information acquired by the sensor 11 and the mobile terminal 12. The information processing device 20 includes an acquisition unit 201, a calculation unit 202, a generation unit 203, and an output control unit 204. Note that the functions of the information processing device 20 are not limited to the acquisition unit 201, the calculation unit 202, the generation unit 203, and the output control unit 204. The acquisition unit 201, the calculation unit 202, the generation unit 203, and the output control unit 204 will be described later.
 Here, the hardware configuration of the information processing apparatus 20 will be described with reference to FIG. 2. FIG. 2 is a diagram showing an example of the hardware configuration of the information processing apparatus 20 according to the embodiment. As shown in FIG. 2, the information processing device 20 includes a CPU (Central Processing Unit) 21, a ROM (Read Only Memory) 22, a RAM (Random Access Memory) 23, an auxiliary storage device 24, an input device 25, a display device 26, and an external I/F (Interface) 27.
 The CPU 21 is a processor (processing circuit) that comprehensively controls the operation of the information processing device 20 by executing programs and realizes the various functions of the information processing device 20. The various functions of the information processing device 20 will be described later.
 The ROM 22 is a non-volatile memory and stores various data (information written at the manufacturing stage of the information processing device 20), including a program for starting the information processing device 20. The RAM 23 is a volatile memory having a working area for the CPU 21. The auxiliary storage device 24 stores various data such as programs executed by the CPU 21. The auxiliary storage device 24 is composed of, for example, an HDD (Hard Disc Drive), an SSD (Solid State Drive), or the like.
 The input device 25 is a device for an operator using the information processing device 20 to perform various operations. The input device 25 is composed of, for example, a mouse, a keyboard, a touch panel, or hardware keys. The operator corresponds to, for example, a medical professional such as a doctor.
 The display device 26 displays various information. For example, the display device 26 displays image data, model data, a GUI (Graphical User Interface) for receiving various operations from the operator, medical images, and the like. The display device 26 is composed of, for example, a liquid crystal display, an organic EL (Electro Luminescence) display, or a cathode ray tube display. Note that the input device 25 and the display device 26 may be integrally configured, for example, in the form of a touch panel.
 The external I/F 27 is an interface for connecting to (communicating with) external devices such as the sensor 11 and the mobile terminal 12.
 The process in which the information processing device 20 calculates the mental state information of the subject (biological information analysis process) will be described with reference to FIG. 3. FIG. 3 is a flowchart showing an operation example of the information processing apparatus 20 according to the embodiment. The description of FIG. 3 also refers to FIGS. 4 to 19. Note that the biological information analysis process shown in FIG. 3 can be executed at an arbitrary timing, but is preferably executed every time a session with the subject is performed.
 As shown in FIG. 3, the acquisition unit 201 acquires the biological information of the subject (step S101). For example, the acquisition unit 201 acquires the biological information of the subject by generating it from the various measurement data collected by the sensor 11.
 Here, the biological information is information including, for example, autonomic nerve data, total power data, body movement data, body temperature data, pulse data, and the like. The autonomic nerve data is time-series information of autonomic nerve activity indices, and includes sympathetic nerve data, parasympathetic nerve data, and total power data. The sympathetic nerve data is time-series information of a sympathetic nerve activity index (Sympathetic Nervous System: SNS level). The parasympathetic nerve data is time-series information of a parasympathetic nerve activity index (ParaSympathetic Nervous System: PSNS level). The total power data is data in which the sympathetic nerve data and the parasympathetic nerve data are integrated, and corresponds to, for example, the sum of the sympathetic nerve data and the parasympathetic nerve data. The body movement data is time-series information indicating the intensity of body movement. The body temperature data is time-series information of the body temperature on the body surface (the site where the sensor 11 is attached). The pulse data is time-series information of the pulse.
 An example of the process of acquiring the biological information of the subject will be described with reference to FIGS. 4 and 5. FIGS. 4 and 5 are diagrams for explaining the processing of the acquisition unit 201. FIG. 4 shows an example of electrocardiographic waveform data (an electrocardiogram) measured by the sensor 11. FIG. 5 shows an example of autonomic nerve data generated by the acquisition unit 201.
 For example, the acquisition unit 201 acquires the autonomic nerve data by generating it from the electrocardiographic waveform data showing the time-series electrocardiographic waveform. Specifically, the acquisition unit 201 detects the positions of R waves from the electrocardiographic waveform data shown in FIG. 4, and generates a heartbeat interval variation time series (RRI time series) by plotting the intervals between the detected R waves (R-R interval: RRI) over time. Then, the acquisition unit 201 calculates a power spectrum from the RRI time series and integrates the power over predetermined frequency bands. As an example, the acquisition unit 201 calculates the integrated power in the low-frequency band (0.05 Hz to 0.15 Hz) as the SNS level (upper part of FIG. 5) and the integrated power in the high-frequency band (0.15 Hz to 0.40 Hz) as the PSNS level (lower part of FIG. 5). In general, it is known that the SNS level is dominant in a stressed state and the PSNS level is dominant in a relaxed state. In this way, the acquisition unit 201 generates the sympathetic nerve data and the parasympathetic nerve data.
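 The band-power computation described above can be sketched numerically: integrate the power spectrum of an RRI series over the low-frequency (0.05-0.15 Hz) and high-frequency (0.15-0.40 Hz) bands. This is a minimal sketch using a synthetic, evenly resampled RRI signal; the sampling rate, signal shape, amplitudes, and band-edge assignment are hypothetical choices, not taken from the patent.

```python
# Sketch of the SNS/PSNS band-power computation described above.
# The RRI series here is synthetic and evenly sampled; in practice the
# RRI sequence derived from detected R waves would be resampled first.
import numpy as np

fs = 4.0                       # hypothetical resampling rate [Hz]
t = np.arange(0, 300, 1 / fs)  # 5 minutes of samples
# Synthetic RRI (seconds): baseline + LF (0.1 Hz) + HF (0.25 Hz) components.
rri = 0.8 + 0.03 * np.sin(2 * np.pi * 0.10 * t) \
          + 0.02 * np.sin(2 * np.pi * 0.25 * t)

# Periodogram of the detrended series.
x = rri - rri.mean()
spec = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
df = freqs[1] - freqs[0]

def band_power(lo, hi):
    """Integrate the power spectrum over [lo, hi) in Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return spec[mask].sum() * df

sns_level = band_power(0.05, 0.15)    # low-frequency power ("SNS level")
psns_level = band_power(0.15, 0.40)   # high-frequency power ("PSNS level")
total_power = sns_level + psns_level  # as in the total power data
print(sns_level > psns_level > 0.0)   # True: the LF component is larger
```

 In the actual device, these levels would be computed repeatedly over time windows of the RRI series to yield the time-series SNS, PSNS, and total power data.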
 The acquisition unit 201 also acquires total power data based on the sympathetic nerve data and the parasympathetic nerve data. For example, the acquisition unit 201 acquires the total power data by adding the SNS level and the PSNS level corresponding to each time point.
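The pipeline described above (R-wave timestamps → RRI time series → band-integrated power → total power) can be sketched in Python as follows. This is an illustrative reconstruction, not the patented implementation: the resampling rate, the use of a plain periodogram, and all function names are assumptions.

```python
import numpy as np

def rri_series(r_peak_times):
    """RRI time series from R-wave timestamps in seconds.

    Returns (time of each interval, interval length in seconds)."""
    t = np.asarray(r_peak_times, dtype=float)
    return t[1:], np.diff(t)

def band_power(freqs, psd, lo, hi):
    """Integrated power of a spectrum over the band [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[mask].sum())

def autonomic_levels(rri_t, rri, fs=4.0):
    """SNS level, PSNS level, and total power from one window of an RRI series."""
    grid = np.arange(rri_t[0], rri_t[-1], 1.0 / fs)
    x = np.interp(grid, rri_t, rri)              # evenly resample the RRI series
    x = x - x.mean()
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)   # simple periodogram
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    sns = band_power(freqs, psd, 0.05, 0.15)     # low-frequency band -> SNS level
    psns = band_power(freqs, psd, 0.15, 0.40)    # high-frequency band -> PSNS level
    return sns, psns, sns + psns                 # total power = SNS + PSNS
```

In practice the acquisition unit would evaluate this over a sliding window to obtain the time series of FIG. 5; the single-window version above only shows the band split and the total power sum.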
 In this way, the acquisition unit 201 acquires the subject's biological information based on the various measurement data collected by the sensor 11. The methods described above for acquiring the sympathetic nerve data, the parasympathetic nerve data, and the total power data are merely examples, and the acquisition is not limited to them; known techniques may be selected and applied as appropriate.
 In addition to the above data, the acquisition unit 201 can acquire other data based on the various measurement data collected by the sensor 11. For example, the acquisition unit 201 can acquire time-series heart rate and respiratory rate data from the electrocardiographic waveform data, percutaneous arterial oxygen saturation (SpO2) and time-series pulse data from pulse wave data, body movement data from acceleration data, and body temperature data from temperature data. Known techniques may be selected and applied as appropriate to acquire these data.
 Subsequently, the acquisition unit 201 acquires the subject's action record information (step S102). For example, the acquisition unit 201 acquires the action record information from the subject's mobile terminal 12. Here, the action record information includes action data indicating the content of each action, time data indicating when each action was performed, place data indicating where each action was performed, medication history data indicating the history of medication, and mood history data indicating the subject's mood when each action was performed. The action data is represented by hierarchical classification information.
 An example of the process of acquiring the subject's action record information will be described with reference to FIGS. 6 and 7. FIGS. 6 and 7 are diagrams for explaining the processing of the acquisition unit 201. FIG. 6 shows an example of action record information recorded by the mobile terminal 12, and FIG. 7 shows an example of the classification of action data.
 As shown in FIG. 6, the mobile terminal 12 collects action record information including action data, time data, place data, medication history data, and mood history data. For example, in the action record information shown in FIG. 6, the second record associates the action data "breakfast (meal)", the time data "1 hour 15 minutes, from 7:00", the place data "home", the medication history data "drug name: AAA, taken at 7:40", and the mood history data "facial expression mark (smile)". That is, the second record indicates that the subject had breakfast at home for 1 hour and 15 minutes starting at 7:00 and was in a good mood at that time. The mobile terminal 12 likewise collects action record information including action data, time data, place data, medication history data, and mood history data for the other records.
 As shown in FIG. 7, the action data is preferably represented by hierarchical classification information. For example, the action data is divided into major, middle, and minor categories. As an example, the major category "work" includes the middle categories "meeting" and "desk work", and the middle category "desk work" includes the minor categories "report preparation" and "document filing". Although the classification information for the action data is defined in advance, it is preferable that the subject be able to add, delete, and edit the classification information.
 Here, the action data, the time data, and the place data are automatically entered by the mobile terminal 12 based on a schedule. For example, the mobile terminal 12 acquires the subject's schedule information from a schedule application installed on the mobile terminal 12. This schedule information includes information such as the subject's planned actions, dates and times, and places. The mobile terminal 12 therefore extracts the action data, the time data, and the place data from the schedule information and records the extracted information as action record information. The action data, time data, and place data automatically recorded by the mobile terminal 12 can be corrected manually.
 The place data may also be corrected automatically based on coordinate information acquired by a GPS (Global Positioning System) function of the mobile terminal 12. In this case, each item of place data and the GPS coordinate information of that place are associated in advance and registered in the recording application. When the place data corresponding to the coordinate information actually acquired by the GPS function differs from the place data based on the schedule information, the mobile terminal 12 discards the place data based on the schedule information. When the coordinate information indicates that the subject is moving, "moving" is entered as the place data and the action data.
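As a sketch, the discard-and-replace logic for place data could look like the following. The place names, coordinates, and matching radius are illustrative assumptions; a real implementation would match against the registered coordinates using a proper geodesic distance.

```python
# Registered places in the recording application: name -> (latitude, longitude).
# These coordinates are illustrative only.
PLACES = {"home": (35.68, 139.77), "office": (35.66, 139.70)}

def resolve_place(gps_fix, scheduled_place, moving, radius_deg=0.005):
    """Prefer the GPS-derived place over the scheduled one when they disagree."""
    if moving:
        return "moving"                     # moving coordinates -> "moving" entry
    lat, lon = gps_fix
    for name, (plat, plon) in PLACES.items():
        if abs(lat - plat) <= radius_deg and abs(lon - plon) <= radius_deg:
            # The GPS fix matches a registered place; if it differs from the
            # scheduled place, the scheduled entry is effectively discarded.
            return name
    return scheduled_place                  # no registered match: keep the schedule
```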
 Meanwhile, the medication history data and the mood history data are entered manually. For example, the subject launches the recording application each time a medicine is taken and enters the name of the medicine and the time it was taken. After each action is completed, the subject selects the mood history data corresponding to his or her mood while performing the action. The medication history data and the mood history data can also be corrected manually. The mood history data is not limited to facial expression marks and may instead be represented by text information expressing a mood, such as "sleepy" or "relaxed", together with numerical information expressing its degree.
 In this way, the mobile terminal 12 collects the subject's action record information and transfers the collected action record information to the information processing device 20. The transfer timing of the action record information can be set arbitrarily. For example, the mobile terminal 12 may transfer the action record information periodically, or may transfer it in response to a request from the subject or the operator of the information processing device 20. The acquisition unit 201 thereby acquires the subject's action record information from the mobile terminal 12.
 The subject's action record information can also be corrected based on the body movement data acquired from the sensor 11. For example, the acquisition unit 201 acquires the start and end times of the subject's sleep based on the intensity of body movement in the body movement data. If the sleep start and end times based on the body movement data differ from those in the action record information, the sleep start and end times in the action record information can be discarded and overwritten with those based on the body movement data.
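The overwrite rule for sleep times can be expressed as a small helper; the record layout and field names below are illustrative assumptions, not from the patent.

```python
def correct_sleep_record(record, motion_sleep):
    """Overwrite recorded sleep times with sensor-derived times when they differ.

    record:       dict with 'sleep_start'/'sleep_end' from the action log
    motion_sleep: (start, end) inferred from body-movement intensity
    """
    start, end = motion_sleep
    if (record["sleep_start"], record["sleep_end"]) != (start, end):
        # Discard the logged times and keep the sensor-derived ones.
        record = dict(record, sleep_start=start, sleep_end=end)
    return record
```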
 The acquisition unit 201 also acquires the subject's question-and-answer information (step S103). For example, the acquisition unit 201 acquires, as the question-and-answer information, mental state answer data concerning the subject's mental state and session evaluation answer data concerning the evaluation of a session conducted with the subject.
 Examples of the mental state answer data and the session evaluation answer data will be described with reference to FIGS. 8 and 9. FIGS. 8 and 9 are diagrams for explaining the processing of the acquisition unit 201. FIG. 8 shows an example of mental state answer data, and FIG. 9 shows an example of session evaluation answer data.
 For example, each time a session is held, the operator of the information processing device 20 distributes to the subject a questionnaire corresponding to the items of the mental state answer data (FIG. 8) and the session evaluation answer data (FIG. 9). On receiving the questionnaire completed by the subject, the operator enters the answers (question-and-answer information) into the information processing device 20. The question-and-answer information may be entered by image recognition of data scanned with a camera or the like, or may be entered manually by the operator. As a result, the acquisition unit 201 acquires question-and-answer information including the mental state answer data and the session evaluation answer data.
 The calculation unit 202 then calculates mental state information based on the biological information, the action record information, and the question-and-answer information (step S104). For example, the calculation unit 202 identifies, in the time-series autonomic nerve data, the partial data corresponding to the period in which each action was performed, and calculates a statistic of the identified partial data as the mental state information.
 An example of the mental state information will be described with reference to FIG. 10. FIG. 10 is a diagram for explaining the processing of the calculation unit 202. Each "XXX" in FIG. 10 holds a numerical value of the mental state information calculated by the calculation unit 202.
 As shown in FIG. 10, the calculation unit 202 calculates the mental state information for each action based on the various information acquired by the acquisition unit 201. Here, the mental state information is, for example, a representative value of each of the sympathetic nerve data, the parasympathetic nerve data, and the total power data for each action.
 For example, the calculation unit 202 aligns the time series of the biological information with the time series of the action record information by matching the measurement start time of the sensor 11 against the time data included in the action record information. The calculation unit 202 then identifies, in the time-series autonomic nerve data, the partial data corresponding to the period in which each action was performed. Specifically, the calculation unit 202 identifies, in the time-series sympathetic nerve data, the portion corresponding to the period "24:00 to 7:00" during which the action "sleep" took place, and calculates the average value of the sympathetic nerve data over that period as the mental state information of FIG. 10. The calculation unit 202 calculates the parasympathetic nerve data and the total power data in the same way as the sympathetic nerve data.
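A minimal sketch of extracting the partial data for one action and reducing it to a statistic follows; the data layout (timestamped value pairs on a common clock) and the names are assumptions for illustration.

```python
from statistics import mean

def action_statistic(series, action, stat=mean):
    """Representative value of a time-stamped autonomic series for one action.

    series: list of (timestamp, value) pairs, timestamps on the aligned clock
    action: dict with 'start' and 'end' timestamps on the same clock
    stat:   reducing statistic (mean by default; median, stdev, etc. also work)
    """
    window = [v for t, v in series if action["start"] <= t < action["end"]]
    return stat(window) if window else None
```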
 In this way, the calculation unit 202 calculates the mental state information for each action based on the biological information and the action record information. Although the above description uses the average value as the representative value of the sympathetic nerve data for each action, any statistic, such as the peak value, median, or standard deviation, can be calculated instead.
 The calculation unit 202 also calculates the subject's mental state score based on the mental state answer data. For example, the calculation unit 202 calculates a mental state score for the subject's mental state by feeding the score for each item included in the mental state answer data into an arbitrary function.
 The calculation unit 202 likewise calculates a session evaluation score based on the session evaluation answer data. For example, the calculation unit 202 calculates a session evaluation score for the session by feeding the score for each item included in the session evaluation answer data into an arbitrary function.
 The methods of calculating the mental state score and the session evaluation score can be changed as appropriate according to the question items included in the respective question-and-answer information.
 The generation unit 203 then generates a comment based on the mental state information (step S105). For example, the generation unit 203 generates at least one of a comment and an image evaluating the mental state during each action, based on the mental state information for that action.
 An example of the comment generation process will be described with reference to FIGS. 11, 12, and 13. FIGS. 11, 12, and 13 are diagrams for explaining the processing of the generation unit 203. FIG. 11 shows an example of the template for each comment, FIG. 12 shows an example of a comment generated by the generation unit 203, and FIG. 13 shows an example of the comprehensive judgment generated by the generation unit 203.
 As shown in FIG. 11, the auxiliary storage device 24 stores, for each of a plurality of comments, information associating the comment's template with its generation conditions. For example, the auxiliary storage device 24 stores information associating the template "○○% of the time from ○:○○ to ○:○○ was good-quality sleep" with the sympathetic nerve data condition "below threshold A", the parasympathetic nerve data condition "threshold B or above", and the action data condition "sleep". This information indicates that when there is an action for which the sympathetic nerve data is below threshold A, the parasympathetic nerve data is at or above threshold B, and the action data is "sleep", a comment based on the template "○○% of the time from ○:○○ to ○:○○ was good-quality sleep" is generated.
 For example, the generation unit 203 determines, for each action included in the action record information, whether the generation conditions of each comment are satisfied. When a comment's generation conditions are judged to be satisfied, the generation unit 203 reads that comment's template from the auxiliary storage device 24 and, based on the read template, generates a comment for the action that satisfied the conditions.
 For example, when there is an action for which the sympathetic nerve data is below threshold A, the parasympathetic nerve data is at or above threshold B, and the action data is "sleep", the generation unit 203 reads the comment template "○○% of the time from ○:○○ to ○:○○ was good-quality sleep". The generation unit 203 then calculates the start time and end time of the "sleep" that satisfied the conditions and the proportion of good-quality sleep, and inserts the calculated information into the template to generate the comment "29% of the time from 23:05 to 6:00 was good-quality sleep" (FIG. 12). The proportion of good-quality sleep is calculated, for example, as the proportion of the sleep period during which the PSNS level was at or above a predetermined value.
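A hedged sketch of one such rule check follows; the threshold values and the comment wording are placeholders standing in for the conditions and template stored in the auxiliary storage device 24.

```python
# Placeholder thresholds for the stored generation conditions (illustrative).
THRESHOLD_A, THRESHOLD_B = 1.2, 0.8

def sleep_comment(action, sns, psns, good_ratio):
    """Emit the 'good-quality sleep' comment when the rule's conditions hold.

    action:     dict with 'name', 'start', 'end'
    sns, psns:  representative SNS/PSNS levels for this action
    good_ratio: fraction of the period with PSNS at or above a set value
    """
    if action["name"] == "sleep" and sns < THRESHOLD_A and psns >= THRESHOLD_B:
        return (f"{good_ratio:.0%} of the time from {action['start']} "
                f"to {action['end']} was good-quality sleep")
    return None  # conditions not met: no comment for this action
```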
 A comment generated by the generation unit 203 is attached to the action that satisfied its generation conditions. The comment "29% of the time from 23:05 to 6:00 was good-quality sleep" is therefore attached to the sleep that ended at 6:00 on 2018/10/19.
 The generation unit 203 also generates, for a common action performed multiple times within a certain period, a comprehensive judgment of that action based on the mental state information for each occurrence. For example, when the action "sleep" occurred seven times between the previous session and the current session, the generation unit 203 makes a comprehensive judgment of sleep based on the mental state information for the seven sleeps. Specifically, the generation unit 203 feeds the sympathetic nerve data and the parasympathetic nerve data for the seven sleeps into an arbitrary function and makes a comprehensive judgment on the four-level score from "0" to "3" shown in FIG. 13. The generation unit 203 then generates the comprehensive judgment comment corresponding to the comprehensive judgment score. The comprehensive judgment comments are stored in the auxiliary storage device 24 in advance.
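Since the scoring function is left arbitrary, the sketch below uses a simple SNS/PSNS balance clamped to the 0-3 range; both the scoring rule and the comment strings are illustrative assumptions, not the stored comments of FIG. 13.

```python
from statistics import mean

# Placeholder comments for the four score levels (illustrative wording).
JUDGMENT_COMMENTS = ["Poor sleep overall", "Somewhat poor sleep",
                     "Fairly good sleep", "Good sleep overall"]

def sleep_judgment(sns_values, psns_values):
    """Map per-sleep SNS/PSNS representative values to a 0-3 score and comment.

    sns_values, psns_values: one representative value per sleep occurrence.
    """
    balance = mean(psns_values) - mean(sns_values)  # higher PSNS -> better sleep
    score = max(0, min(3, int(2 + balance * 2)))    # arbitrary mapping, clamped 0..3
    return score, JUDGMENT_COMMENTS[score]
```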
 The output control unit 204 then outputs the mental state information and the comments (step S106). For example, the output control unit 204 outputs information associating the action data indicating each action, the period during which the action was performed, and the mental state information for the action.
 Display examples of the autonomic nerve data will be described with reference to FIGS. 14, 15, and 16. FIGS. 14, 15, and 16 are diagrams for explaining the processing of the output control unit 204. In FIGS. 14, 15, and 16, the vertical axis corresponds to the SNS level, the horizontal axis corresponds to the PSNS level, and the size of each circle corresponds to the length of the period (time) during which the action was performed.
 As shown in FIG. 14, for example, the output control unit 204 displays the autonomic nerve data for each action. In the example shown in FIG. 14, the output control unit 204 plots four actions: work, moving, meals, and sleep. The operator can thereby easily grasp the differences in SNS level and PSNS level between actions; for example, the operator can see that the SNS level during work is higher than during moving, meals, or sleep.
 As shown in FIG. 15, the output control unit 204 also displays, for each action, the latest autonomic nerve data together with past autonomic nerve data. In the example shown in FIG. 15, the output control unit 204 plots the autonomic nerve data from the current work period together with those from the previous and second-to-previous work periods. The operator can thereby easily grasp the difference between the subject's current and past mental states for each action; for example, the operator can see that the SNS level during work is higher than before.
 As shown in FIG. 16, the output control unit 204 also displays standard autonomic nerve data for each action at the same time. In the example shown in FIG. 16, the output control unit 204 plots the subject's autonomic nerve data for each action together with the autonomic nerve data of a standard person of the same generation as the subject. The operator can thereby easily grasp the difference between the subject's mental state and that of a standard person; for example, the operator can see that the subject's SNS level is high in all of the work, moving, meal, and sleep actions, and that the subject's sleep time is shorter than that of a standard person.
 The output control unit 204 also displays the mental state score and the session evaluation score as mental state information.
 Display examples of the mental state score and the session evaluation score will be described with reference to FIGS. 17 and 18. FIGS. 17 and 18 are diagrams for explaining the processing of the output control unit 204. In FIGS. 17 and 18, the vertical axis corresponds to the index of each data series and the horizontal axis corresponds to the session number. The vertical axis is plotted with the index of each data series at the first session normalized to 100.
 As shown in FIG. 17, the output control unit 204 displays the mental state score together with the total power data. This total power data is a representative value of the total power data for the period delimited by each session (for example, one week). The operator can thereby view the objective mental state information detected by the sensor 11 alongside the mental state score. For example, the mental state score may contain inconsistencies or falsehoods depending on the subject's mental state at the time of answering, but because the operator can simultaneously view mental state information that objectively indicates the mental state underlying such inconsistencies or falsehoods, the display can inform the advice given at the next session.
 Likewise, as shown in FIG. 18, the output control unit 204 displays the session evaluation score together with the total power data. This total power data is again a representative value of the total power data for the period delimited by each session (for example, one week). The operator can thereby view the objective mental state information detected by the sensor 11 alongside the session evaluation score. Because the answers may likewise contain inconsistencies or falsehoods depending on the subject's mental state at the time of answering, being able to view the objective mental state information at the same time is useful when deciding the content of the next session.
 The output control unit 204 also displays the comment for each action in association with the action record information.
 A display example of the comments will be described with reference to FIG. 19. FIG. 19 is a diagram for explaining the processing of the output control unit 204. FIG. 19 illustrates information in which biological information including arrhythmia, heart rate, physical activity, sympathetic nerve, and parasympathetic nerve data is associated with action record information including an action history and a mood history. In FIG. 19, the horizontal axis corresponds to time.
 As shown in FIG. 19, the output control unit 204 displays the time series of the biological information and the time series of the action record information in association with each other by matching the measurement start time of the sensor 11 against the time data included in the action record information. The output control unit 204 then displays the comment for each action in association with the action record information.
 Here, a comment generated by the generation unit 203 is attached to the action that satisfied its generation conditions. The output control unit 204 therefore displays the comment "29% of the time from 23:05 to 6:00 was good-quality sleep" in association with the "sleep" that ended at 6:00 on 2018/10/19, and displays the comment "somewhat agitated during the meal" in association with the "dinner" that ended at 20:00 on 2018/10/18.
 In this way, the output control unit 204 outputs the mental state information and the comments. Although the case where the output control unit 204 displays them on the display device 26 has been described here, the output is not limited to this. For example, the output control unit 204 may transmit the mental state information and the comments to an external device, or may store them in a memory or on a recording medium.
 FIGS. 14 to 19 are merely examples, and there is no requirement to display all of the display forms at the same time. For example, the operator can select and display the display forms illustrated in FIGS. 14 to 19 as needed. The display items shown in FIGS. 14 to 19 can also be changed at the operator's discretion. For example, the total power data shown in FIGS. 17 and 18 may be replaced with other mental state information, or multiple items of mental state information may be displayed. As the representative value of the total power data, any statistic, such as the mean, peak value, median, or standard deviation, can be used.
 As described above, in the information processing device 20 according to the embodiment, the acquisition unit 201 acquires biological information including the autonomic nerve data of the subject and action record information in which a plurality of actions of the subject are recorded. The calculation unit 202 then calculates mental state information for each action of the subject on the basis of the biological information and the action record information. This enables the operator to accurately grasp the mental state of the subject. For example, according to the information processing device 20 of the embodiment, the mental state can be objectively quantified on the basis of the electrocardiographic waveform data collected by the sensor 11, so that the mental state of the subject can be grasped accurately.
 Further, according to the information processing device 20, since the mental state for each action is quantified, it becomes easier to identify actions that cause stress. The operator can therefore give more appropriate advice to the subject.
(Modification)
 It is known that the magnitude of autonomic nerve data varies among individuals. For this reason, it is preferable to adjust the comment generation conditions (threshold values) according to the total power data of the subject.
 For example, the generation unit 203 adjusts the generation condition of each comment stored in the auxiliary storage device 24 according to the total power data of the subject. For example, when the total power data of the subject is larger than the standard data, the generation unit 203 increases the threshold value included in the generation condition; when the total power data of the subject is smaller than the standard data, the generation unit 203 decreases the threshold value. The generation unit 203 then uses the adjusted generation condition to determine whether the generation condition of each comment is satisfied. In this way, the information processing device 20 can attach comments appropriately in accordance with the magnitude of the individual's autonomic nerve data.
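One possible form of this adjustment is sketched below. The specification only states that the threshold is increased or decreased; proportional scaling by the ratio of the subject's total power to the standard value is an assumption made here for illustration, as are the function and parameter names.

```python
def adjust_threshold(base_threshold: float,
                     subject_total_power: float,
                     standard_total_power: float) -> float:
    """Scale a comment-generation threshold by the subject's total power.

    A larger-than-standard total power raises the threshold and a smaller
    one lowers it, matching the behavior described in the modification.
    Proportional scaling is an assumed rule, not taken from the disclosure.
    """
    return base_threshold * (subject_total_power / standard_total_power)

# Hypothetical subject whose total power is 20% above the standard value
print(adjust_threshold(100.0, 1200.0, 1000.0))  # 120.0
```

Any monotonic adjustment rule would satisfy the description; the generation unit would apply the adjusted threshold before testing each comment's generation condition.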
(Second Embodiment)
 In the first embodiment, the case where the mental state information is calculated on the basis of the biological information, the action record information, and the question-and-answer information, and a comment is generated on the basis of the calculated mental state information, has been described; however, the embodiments are not limited to this. For example, it is also possible to generate a mental state determination result by inputting biological information and action record information into a trained model created by machine learning. Here, a mental state determination result is data corresponding to the mental state answer data and the session evaluation answer data, generated (determined) by the trained model.
 That is, the information processing device 50 according to the second embodiment constructs a trained model that has been trained, for a plurality of subjects having a specific mental tendency (second subjects), using biological information including the autonomic nerve data of each subject, action record information in which a plurality of actions of each subject are recorded, and question-and-answer information answered by each subject. The information processing device 50 then inputs the biological information and action record information of an arbitrary subject (first subject) into the constructed trained model, thereby generating a mental state determination result in which the mental state of that subject is determined. Note that the first subject may be included among the second subjects.
 In the second embodiment, a "second subject" is, for example, a subject having a specific mental tendency. "Specific mental tendencies" include mental disorders (mental illnesses) such as schizophrenia, mood disorders (depression, bipolar disorder), panic disorder, eating disorders, developmental disorders, dementia, epilepsy, addiction, and higher brain dysfunction. Among persons having such specific mental tendencies, when the second subjects are persons having a mood disorder (depression, bipolar disorder) or an eating disorder, the accuracy of the trained model (the accuracy of generating the mental state determination result) tends to improve, and when the second subjects are persons having depression in particular, the accuracy of the trained model tends to improve further. However, the second subjects are not limited to persons having a mental disorder, and may be persons classified by mental tendencies such as preferences. The second subjects may also be persons receiving psychological counseling.
 FIG. 20 is a diagram showing an example of the schematic configuration of the information processing device according to the second embodiment. As shown in FIG. 20, the information processing device 50 according to the embodiment can communicate with the sensor 11 and the mobile terminal 12 directly or indirectly via a network such as a LAN or WAN. The information processing device 50 is an example of a biological information analysis device. Since the sensor 11 and the mobile terminal 12 are the same as the sensor 11 and the mobile terminal 12 shown in FIG. 1, their description is omitted.
 The information processing device 50 includes an acquisition unit 501, a determination unit 502, and an output control unit 503. Since the acquisition unit 501 and the output control unit 503 are the same as the acquisition unit 201 and the output control unit 204 shown in FIG. 1, their description is omitted. The hardware configuration of the information processing device 50 is also the same as that of the information processing device 20 shown in FIG. 2, and its description is omitted. The determination unit 502 will be described later.
 The processing performed by the information processing device 50 according to the second embodiment at training time and at operation time will be described with reference to FIG. 21. FIG. 21 is a diagram showing the processing performed at training time and at operation time by the information processing device 50 according to the second embodiment. The subject X shown in FIG. 21 is an example of the first subject, and the subjects S-1 to S-N are examples of the second subjects.
 As shown in the upper part of FIG. 21, at training time, the information processing device 50 performs machine learning using, as training data, the biological information, action record information, and question-and-answer information (mental state answer data and session evaluation answer data) of a plurality of subjects S-1 to S-N (for example, a plurality of persons with depressive tendencies and a plurality of healthy persons). Of the training data, the biological information and the action record information are input data, and the question-and-answer information is correct-answer data. Through this machine learning, a trained model is constructed that outputs the mental state determination result of a person to be diagnosed when the biological information and action record information of that person are input. This trained model is stored in a storage device (for example, the ROM 22, the RAM 23, the auxiliary storage device 24, or the like).
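The training and inference flow described above can be sketched minimally as follows. The disclosure does not name a learning algorithm, so a simple nearest-centroid rule is assumed here purely for illustration; the feature vectors, labels, and function names are likewise hypothetical.

```python
from statistics import fmean

def train_model(features, labels):
    """Build a minimal trained model from input data and correct-answer data.

    `features` stand in for per-subject vectors derived from the biological
    information and action record information; `labels` stand in for the
    question-and-answer information (e.g. 1 = depressive tendency,
    0 = healthy). Nearest-centroid is an assumed algorithm.
    """
    centroids = {}
    for label in set(labels):
        rows = [f for f, lab in zip(features, labels) if lab == label]
        centroids[label] = [fmean(col) for col in zip(*rows)]
    return centroids

def predict(model, feature):
    """Output a mental state determination result for a new subject."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(feature, centroid))
    return min(model, key=lambda label: dist(model[label]))

# Toy features: [total power, sleep hours] for subjects S-1 to S-4
X = [[600.0, 5.0], [650.0, 5.5], [1100.0, 7.5], [1200.0, 8.0]]
y = [1, 1, 0, 0]  # hypothetical correct-answer data
model = train_model(X, y)
print(predict(model, [620.0, 5.2]))  # 1
```

At operation time, only `predict` is executed: the stored model receives the diagnosis target's biological and action record features and outputs the determination result.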
 As shown in the lower part of FIG. 21, at operation time, the information processing device 50 inputs the biological information and action record information of subject X, the person to be diagnosed, into the trained model constructed at training time, thereby causing the trained model to output the mental state determination result of subject X. Here, subject X is, for example, a person suspected of having a depressive tendency. The information processing device 50 then presents the mental state determination result of subject X output from the trained model to the operator (or to subject X).
 The biological information is the same information as the biological information described in the first embodiment. For example, the biological information may include all of the autonomic nerve data, body temperature data, pulse data, and body movement data, or may include only some of these data. The target period of the biological information can be set arbitrarily, for example to several hours, several days, or several weeks.
 The action record information is the same information as the action record information described in the first embodiment. For example, the action record information may include all of the action data, time data, and mood history data, or may include only some of these data. The target period of the action record information can be set arbitrarily, but is preferably the same as the target period of the biological information.
 The question-and-answer information is the same information as the question-and-answer information described in the first embodiment. For example, the question-and-answer information may include both the mental state answer data and the session evaluation answer data, or only one of them. The question-and-answer information may be acquired once for an arbitrary target period (preferably the same as the target period of the biological information) as a tendency over the entire period, or may be acquired any number of times at arbitrary timings; acquiring it once as a tendency over the entire period is preferable.
 The processing of constructing the trained model is executed by, for example, the determination unit 502. That is, for a plurality of second subjects having a specific mental tendency, the determination unit 502 acquires biological information including the autonomic nerve data of each second subject, action record information in which a plurality of actions of each second subject are recorded, and question-and-answer information answered by each second subject. Here, the biological information, action record information, and question-and-answer information of the second subjects are collected in advance and stored in a storage device (for example, the ROM 22, the RAM 23, the auxiliary storage device 24, or the like). For example, the determination unit 502 acquires this information by reading it from the storage device. The determination unit 502 then constructs the trained model by performing machine learning using the acquired biological information, action record information, and question-and-answer information. Note that the processing of constructing the trained model is not limited to the determination unit 502 and can be executed by any processing unit (processor).
 The processing in which the information processing device 50 outputs the mental state determination result of subject X at operation time (mental state determination processing) will be described with reference to FIG. 22. FIG. 22 is a flowchart showing an operation example of the information processing device 50 according to the second embodiment. The mental state determination processing shown in FIG. 22 is processing for determining whether subject X, who is suspected of having a depressive tendency or the like, actually has such a tendency. The mental state determination processing can be executed at any timing.
 As shown in FIG. 22, the acquisition unit 501 acquires the biological information of subject X (step S201). For example, the acquisition unit 501 acquires the biological information of subject X by generating it on the basis of various measurement data collected by the sensor 11.
 Subsequently, the acquisition unit 501 acquires the action record information of subject X (step S202). For example, the acquisition unit 501 acquires the action record information from the mobile terminal 12 of subject X.
 The determination unit 502 then generates the mental state determination result of subject X by inputting the biological information and action record information of subject X into the trained model. For example, the determination unit 502 reads the trained model stored in the storage device, inputs the biological information and action record information acquired by the acquisition unit 501 into the read trained model, and causes the trained model to output the mental state determination result of subject X. The determination unit 502 then sends the mental state determination result output from the trained model to the output control unit 503.
 The mental state determination result output from the trained model includes the same types of information as the question-and-answer information used to construct the trained model. That is, when the question-and-answer information used in the machine learning includes both the mental state answer data and the session evaluation answer data, the mental state determination result output from the trained model includes both; when the question-and-answer information includes only the mental state answer data, the mental state determination result includes the mental state answer data; and when the question-and-answer information includes only the session evaluation answer data, the mental state determination result includes the session evaluation answer data.
 The output control unit 503 outputs the mental state determination result of subject X. For example, the output control unit 503 displays the mental state determination result on the display device 26. The output destination of the mental state determination result is not limited to the display device 26; for example, the result may be transmitted to an external device, or may be stored in a memory or on a recording medium.
 As described above, in the information processing device 50 according to the second embodiment, the acquisition unit 501 acquires biological information including the autonomic nerve data of the subject and action record information in which a plurality of actions of the subject are recorded. The determination unit 502 generates the mental state determination result of the subject by inputting the biological information and action record information of the subject into the trained model. This enables the operator to accurately grasp the mental state of the subject.
 For example, according to the information processing device 50 of the second embodiment, a mental state determination result corresponding to the mental state answer data and/or the session evaluation answer data can be generated (determined) automatically by the trained model, without the subject filling anything in. The operator can therefore easily grasp whether the subject has a specific mental tendency by viewing the mental state determination result output from the information processing device 50.
 In the second embodiment, the case where the mental state determination processing is applied to a person suspected of having a depressive tendency or the like has been described, but the embodiment is not limited to this. For example, the mental state determination processing can also be applied to a person with no such suspicion. That is, subject X to be determined (also called the first subject or the determination target) need not be particularly limited and may be any subject. For example, the first subject may be any one of the plurality of second subjects.
 Further, in the second embodiment, the case where the trained model is constructed in the information processing device 50 has been described, but the embodiment is not limited to this. For example, the trained model may be constructed in an external device different from the information processing device 50. In this case, the information processing device 50 acquires the trained model constructed in the external device and executes the mental state determination processing described above.
(Modification 1 of the Second Embodiment)
 The information processing device 50 described in the second embodiment can not only present the mental state determination result to the operator as it is, but can also calculate and present a "match rate" with the question-and-answer information of persons with depressive tendencies or the like.
 For example, the determination unit 502 collates the mental state determination result of subject X with representative question-and-answer information of persons with depressive tendencies or the like. As the representative question-and-answer information, the question-and-answer information of any one such person may be used as it is, or the question-and-answer information of a plurality of such persons may be integrated (averaged) and used.
 Specifically, the determination unit 502 collates each item included in the mental state determination result with each item included in the question-and-answer information and counts the number of matching items. The determination unit 502 then calculates the "match rate" by dividing the number of matching items by the total number of items. Note that the calculation method of the "match rate" described here is merely an example, and any calculation method can be applied.
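The match-rate calculation described above (matching items divided by total items) can be sketched as follows. The item keys and answer values are hypothetical; the disclosure does not define a data schema.

```python
def match_rate(determination_result: dict, reference_answers: dict) -> float:
    """Divide the number of matching items by the total number of items.

    `determination_result` stands for the trained model's output and
    `reference_answers` for the representative question-and-answer
    information of persons with depressive tendencies or the like.
    """
    total = len(reference_answers)
    matches = sum(
        1 for item, answer in reference_answers.items()
        if determination_result.get(item) == answer
    )
    return matches / total

# Hypothetical item-by-item answers
result = {"q1": "yes", "q2": "no", "q3": "yes", "q4": "no"}
reference = {"q1": "yes", "q2": "yes", "q3": "yes", "q4": "no"}
print(match_rate(result, reference))  # 0.75
```

As the text notes, this is only one possible formula; a weighted or per-category rate would be an equally valid calculation method.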
 The output control unit 503 then outputs the "match rate" calculated by the determination unit 502. In this way, the information processing device 50 can support diagnosis by the operator (a medical professional such as a doctor). For example, by viewing the match rate, the operator can accurately and easily diagnose whether the subject has a specific mental tendency.
 When the match rate is output, the mental state determination result does not necessarily have to be output.
(Modification 2 of the Second Embodiment)
 The information processing device 50 described in Modification 1 of the second embodiment can not only present the match rate to the operator as it is, but can also present a suggestion of the possibility that the subject has a depressive tendency or the like (a suggestion of a possible mental tendency).
 For example, the suggestion of a possible mental tendency may be information indicating, in stages, the degree of possibility of a depressive tendency. For example, it may be information in which the possibility of a depressive tendency is classified into three levels, "high", "medium", and "low", according to the magnitude of the match rate. When classifying into three levels, two threshold values are used.
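The three-level classification with two threshold values can be sketched as follows. The specific threshold values 0.4 and 0.7 are assumptions chosen for illustration; the disclosure does not specify them.

```python
def classify_possibility(rate: float,
                         low_threshold: float = 0.4,
                         high_threshold: float = 0.7) -> str:
    """Classify the possibility of a depressive tendency into three levels.

    Two threshold values partition the match rate into "low", "medium",
    and "high". The default thresholds are hypothetical.
    """
    if rate >= high_threshold:
        return "high"
    if rate >= low_threshold:
        return "medium"
    return "low"

print(classify_possibility(0.75))  # high
print(classify_possibility(0.50))  # medium
print(classify_possibility(0.20))  # low
```

Because the thresholds are parameters, they could be tuned per population or per questionnaire without changing the classification logic.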
 When information indicating the suggestion of a possible mental tendency is output, the mental state determination result and the match rate do not necessarily have to be output.
 Note that the information suggesting a possible mental tendency presented to the operator is intended solely to support diagnosis and is not a diagnosis result itself. That is, the processing of the information processing device 50 does not constitute a medical practice.
(Modification 3 of the Second Embodiment)
 The trained model (mental state determination processing) described in the second embodiment may also be applied to the information processing device 20 shown in the first embodiment. This allows the information processing device 20 to output the mental state information and comments even if the subject does not fill in the question-and-answer information.
 Specifically, the information processing device 20 executes the processing of step S203 in FIG. 22 in place of the processing of step S103 in FIG. 3. That is, the information processing device 20 generates the mental state determination result of the subject by inputting the biological information and action record information acquired in steps S101 and S102 of FIG. 3 into the trained model.
 Then, in the processing of step S104 in FIG. 3, the information processing device 20 uses the mental state determination result in place of the question-and-answer information. That is, the information processing device 20 calculates the mental state information on the basis of the biological information, the action record information, and the mental state determination result. Since the mental state determination result is data corresponding to the mental state answer data and the session evaluation answer data, it can be used in place of the question-and-answer information.
 The information processing device 20 can then output the mental state information and comments by executing processing similar to that of steps S105 and S106 in FIG. 3.
(Other Embodiments)
 Besides the embodiments described above, various other embodiments may be implemented.
(Use of a Sensor Attachable to the Body Surface)
 In the embodiments described above, it was explained that a known technique can be selected and applied as appropriate for the sensor 11. However, in order to collect biological information over a long period (several days or several weeks), a sensor that can be attached to the body surface is preferably applied. Such a sensor is preferably a wearable sensor having, inside a flexible device, a circuit that does not break when bent or otherwise deformed, so that it can flexibly follow the movements of the body. It is also preferably a sensor that can be attached to the body surface via an adhesive, such as an acrylic resin conforming to biological safety testing, that does not cause discomfort to the body and has high medical quality characteristics.
 For example, the acquisition unit 201 acquires the biological information on the basis of data collected by a sensor 11 attached to the body surface of the subject. This allows the acquisition unit 201 to collect both sleeping and waking biological information without omission, and thus to collect continuous biological information over a long period. The biological information used in the present embodiment does not necessarily have to be continuous data, but continuous data is preferable. Such long-term continuous data is preferably continuous over 24 hours or more, more preferably 72 hours or more, particularly preferably 144 hours or more, and most preferably 168 hours or more. It is also preferable that such continuous data include both weekday data and holiday data. With such continuous data, a more accurate trained model is constructed in the second embodiment.
(Biological Information Analysis System)
 Each function of the information processing device 20 according to the embodiments described above may be provided as a system.
 FIG. 23 is a diagram showing an example of the schematic configuration of the system according to the embodiment. As shown in FIG. 23, the system 1 according to the embodiment includes a server device 30 and a display terminal 40. The server device 30 and the display terminal 40 can communicate with each other directly or indirectly via a network such as a LAN (Local Area Network) or a WAN (Wide Area Network). The system 1 is an example of a biological information analysis system.
 The sensor 11 and the mobile terminal 12 can likewise communicate with the server device 30, directly or indirectly, via the network, but need not be connected to the server device 30 at all times. Since the sensor 11 and the mobile terminal 12 have essentially the same configuration as the sensor 11 and the mobile terminal 12 shown in FIG. 1, their description is omitted.
 The server device 30 is provided, for example, in a service center that offers biological information analysis as a cloud service. The server device 30 includes an acquisition unit 301, a calculation unit 302, a generation unit 303, and an output control unit 304. Their processing is essentially the same as that of the acquisition unit 201, the calculation unit 202, the generation unit 203, and the output control unit 204 shown in FIG. 1, so the description is omitted.
 The display terminal 40 is, for example, an information processing terminal used by a user of the cloud service, such as a personal computer, workstation, smartphone, or tablet equipped with a display device. Here, a user of the cloud service corresponds to a person who provides psychological counseling (for example, a counselor or a physician).
 The output control unit 304 of the server device 30 transmits the mental state information for each action of the subject to the display terminal 40. The display control unit 401 of the display terminal 40 receives the mental state information for each action of the subject transmitted from the server device 30 and displays it. This allows the user to accurately grasp the mental state of the subject.
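The transmission from the output control unit 304 to the display control unit 401 can be pictured as serializing the per-action mental state information into a machine-readable payload. The sketch below uses JSON; the field names (`subject_id`, `action`, `period`, `mental_state`) and the numeric values are illustrative assumptions, since the patent does not specify a wire format.

```python
import json

# Hypothetical per-action mental state records as the server might send them
records = [
    {"action": "commuting", "period": "08:00-09:00",
     "mental_state": {"sympathetic": 0.72, "parasympathetic": 0.28}},
    {"action": "meeting", "period": "10:00-11:30",
     "mental_state": {"sympathetic": 0.81, "parasympathetic": 0.19}},
]

# Server side: serialize for transmission over the network
payload = json.dumps({"subject_id": "S001", "records": records})

# Terminal side: parse the payload and build display lines
received = json.loads(payload)
lines = [f'{r["action"]} ({r["period"]}): '
         f'sympathetic={r["mental_state"]["sympathetic"]:.2f}'
         for r in received["records"]]
```

Any equivalent serialization would do; the point is that each record ties one action, its period, and the mental state information together, matching the association described in the embodiment.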
 The components of each illustrated device are functional and conceptual, and need not be physically configured as illustrated. That is, the specific form of distribution and integration of each device is not limited to the illustrated one; all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like. Furthermore, each processing function performed by each device may be realized, in whole or in part, by a CPU and a program analyzed and executed by that CPU, or as wired-logic hardware.
 Of the processes described in the above embodiments, all or part of a process described as automatic may be performed manually, and all or part of a process described as manual may be performed automatically by a known method. In addition, the processing procedures, control procedures, specific names, and information including various data and parameters shown in the above description and drawings may be changed arbitrarily unless otherwise specified.
 The biological information analysis method described in the above embodiments can be realized by executing a prepared biological information analysis program on a computer such as a personal computer or a workstation. The program can be distributed via a network such as the Internet. It can also be recorded on a computer-readable recording medium such as a hard disk, flexible disk (FD), CD-ROM, MO, or DVD, and executed by being read from the recording medium by a computer.
 According to at least one of the embodiments described above, the mental state of a subject can be accurately grasped.
 While several embodiments of the present invention have been described, they are presented as examples and are not intended to limit the scope of the invention. These embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and likewise in the invention described in the claims and its equivalents.
 [Examples]
 Hereinafter, the present invention is described in more detail based on examples, but the present invention is not limited to them.
 (Examples 1 to 5, Comparative Example 1)
 By the method described in the second embodiment, trained models were created that take biological information and action record information as input and output mental state response data and session evaluation response data. By varying the types of data used for machine learning, a trained model was created for each of Examples 1 to 5 and Comparative Example 1 shown in FIG. 24. FIG. 24 is a diagram for explaining the trained models according to the examples and the comparative example.
 First, the input information (training data) used to create the trained models is described. In these examples, "biological information", "action record information", "mental state response data", and "session evaluation response data" were used as input information. Each type of input information was collected for one week (168 hours, including weekdays and holidays) from 30 subjects: 15 persons with a depressive tendency and 15 healthy persons.
 As the biological information, autonomic nerve data, body temperature data, pulse data, and body movement data were used. The autonomic nerve data included sympathetic nerve data, parasympathetic nerve data, and total power data, each generated from electrocardiographic waveform data. The body temperature data (temperature data) is time-series information of the body temperature at the subject's body surface. The pulse data is time-series information of the pulse. The body movement data indicates the amount of physical activity and was generated from acceleration data. The electrocardiographic waveform data, body temperature data, pulse wave data, and acceleration data were collected over time by a sensor attachable to the subject's body surface (for example, a wearable device of the type described in JP 2017-510390 A).
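Sympathetic and parasympathetic indices and total power are conventionally estimated from heart rate variability by integrating the power spectrum of the RR-interval series in the LF band (about 0.04 to 0.15 Hz, mixed sympathetic activity) and the HF band (about 0.15 to 0.4 Hz, parasympathetic activity). The patent does not disclose its exact derivation from the electrocardiographic waveform, so the following NumPy sketch is only one plausible way to compute such band powers; the resampling rate and band edges are conventional assumptions.

```python
import numpy as np

def hrv_band_powers(rr_ms, fs=4.0):
    """Estimate LF/HF/total power from RR intervals (milliseconds).

    The irregularly spaced RR series is resampled to an even grid
    (fs Hz), the mean is removed, and FFT power is integrated per band.
    """
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                 # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)    # even sampling grid
    x = np.interp(grid, t, rr)
    x = x - x.mean()                           # remove DC component
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)

    def band(lo, hi):
        return psd[(freqs >= lo) & (freqs < hi)].sum()

    lf, hf = band(0.04, 0.15), band(0.15, 0.4)
    return {"lf": lf, "hf": hf, "total": lf + hf}

# Synthetic RR series with a 0.25 Hz (HF-band, respiratory) oscillation
t_beats = np.arange(0, 120, 0.8)              # ~75 bpm for 2 minutes
rr = 800 + 40 * np.sin(2 * np.pi * 0.25 * t_beats)
powers = hrv_band_powers(rr)
```

For the synthetic series above, the 0.25 Hz modulation falls in the HF band, so the HF power dominates the LF power, consistent with a parasympathetic-dominated signal.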
 As the action record information, the action data, time data, and mood history data shown in FIG. 6 were used. The action record information was collected by having each subject record it over the target period (the seven days described above).
 For the mental state response data and the session evaluation response data, the same data as shown in FIGS. 8 and 9, respectively, were used. The mental state response data and the session evaluation response data were collected by having each subject record them once, at the end of the target period, as the tendency over the period of each example and comparative example.
 Next, the creation of the trained models according to the examples and the comparative example is described. Each trained model was created using the types of data marked with a circle in FIG. 24 as training data. Each trained model was created by machine learning with the biological information and the action record information as input data and the mental state response data and the session evaluation response data as correct-answer data; random forest was used as the machine learning method in these examples.
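As a concrete illustration of this training setup, the sketch below fits a random forest on synthetic feature vectors standing in for the biological information and action record information, with binary labels standing in for one item of the mental state response data. The feature shapes, the dependence of the label on the first feature, and the use of scikit-learn's `RandomForestClassifier` are all assumptions for illustration; the patent names random forest but does not specify an implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "input data": per-subject feature vectors, e.g. per-action
# autonomic-nerve statistics concatenated with encoded action records.
n_subjects, n_features = 30, 24
X = rng.normal(size=(n_subjects, n_features))

# Synthetic "correct-answer data": one binary mental-state response item,
# made weakly dependent on the first feature so there is signal to learn.
y = (X[:, 0] + 0.1 * rng.normal(size=n_subjects) > 0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)
pred = model.predict(X)
train_accuracy = (pred == y).mean()
```

In the actual examples, separate models would be fitted per data subset (weekday versus holiday, 24 versus 72 versus 168 hours), which is what FIG. 24 enumerates.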
 The trained model of Example 1 was created using 24 hours (one day) of training data (input data and correct-answer data) corresponding to weekdays (Monday to Friday). In Example 1, autonomic nerve data was used as the biological information, and machine learning was performed without using body temperature data, pulse data, or body movement data.
 The trained model of Example 2 was created using 24 hours (one day) of training data corresponding to holidays (Saturday and Sunday). In Example 2, autonomic nerve data was used as the biological information, and machine learning was performed without using body temperature data, pulse data, or body movement data.
 The trained model of Example 3 was created using 24 hours (one day) of training data corresponding to weekdays.
 The trained model of Example 4 was created using 72 hours (three days) of training data corresponding to weekdays. The three days of training data are, for example, an integration of the training data for Monday, Wednesday, and Friday.
 The trained model of Example 5 was created using 168 hours (one week) of training data corresponding to weekdays and holidays.
 The trained model of the comparative example was created using 24 hours (one day) of training data corresponding to weekdays. In the comparative example, pulse data was used as the biological information, and machine learning was performed without using autonomic nerve data, body temperature data, or body movement data.
 Next, the method of evaluating the trained models according to the examples and the comparative example is described. For one subject randomly selected from the subjects from whom the training data was collected, each type of input information was collected for one week (168 hours, including weekdays and holidays) and used as input data for each trained model. The collected information was cut out to the period corresponding to the "acquisition period" of the trained model of each example and the comparative example before being used as input data.
 Mental state response data and session evaluation response data for evaluation were also collected by having the subject record them once at the end of the target period. They were collected once per target period because it was considered that this captures the tendency over the whole period. When the Monday, Wednesday, and Friday training data were integrated, the mental state response data and session evaluation response data collected at the end of the Friday data collection were used.
 The collected mental state response data and session evaluation response data for evaluation were then compared with the output data (mental state response data and session evaluation response data) of each trained model, and the match rate was determined. Matching was judged from the proportion of matching items among the items included in the mental state response data and the session evaluation response data; specifically, a trial was judged a "match" when 70% or more of the items matched. The match rate was then evaluated from the number of trials, out of 10, judged to be a match. Specifically, "AAA" indicates a match rate of 90% or higher; "AA", 85% or higher and less than 90%; "A", 80% or higher and less than 85%; "B", 75% or higher and less than 80%; and "C", less than 75%.
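The two-stage scoring described here (a per-trial match judgment at a 70% item threshold, followed by a letter grade from the overall match rate) can be summarized as two small functions. This is a restatement of the thresholds in the text, not code from the patent; the function and parameter names are illustrative.

```python
def trial_matches(item_results, threshold=0.70):
    """A trial is a 'match' when >= 70% of its answer items agree.

    item_results: list of (predicted, actual) answer-item pairs.
    """
    agree = sum(1 for predicted, actual in item_results
                if predicted == actual)
    return agree / len(item_results) >= threshold

def grade(match_rate):
    """Letter grade from the overall match rate (fraction in [0, 1])."""
    if match_rate >= 0.90:
        return "AAA"
    if match_rate >= 0.85:
        return "AA"
    if match_rate >= 0.80:
        return "A"
    if match_rate >= 0.75:
        return "B"
    return "C"

# Example: 8 of 10 items agree in one trial, so it counts as a match
one_trial = [(1, 1)] * 8 + [(1, 0)] * 2
```

Under this scheme, a model judged a match in 9 of 10 trials has a match rate of 0.9 and receives the grade "AAA".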
 As shown in FIG. 24, the match grade of the trained model of Example 1 was "A"; that of Example 2, "B"; that of Example 3, "AA"; that of Example 4, "AA"; that of Example 5, "AAA"; and that of the comparative example, "C".
 The match grade of the trained model of Example 5 was higher than that of any other trained model. This result suggests that the input data for training and operation preferably spans at least seven days (168 hours). That is, it is preferable that the acquisition unit 201 acquires at least seven days of biological information and action record information, and that the calculation unit 202 calculates the mental state information based on at least seven days of biological information and action record information.
 This result also suggests that the input data for training and operation preferably includes both weekdays and holidays. That is, it is preferable that the acquisition unit 201 acquires several days or more of biological information and action record information including weekdays and holidays, and that the calculation unit 202 calculates the mental state information based on several days or more of biological information and action record information including weekdays and holidays.
 The match grades of the trained models of Examples 3 and 4 were higher than those of Examples 1 and 2. This result suggests that the input data for training and operation preferably includes body temperature data, pulse data, and body movement data as biological information.
 The match grade of the trained model of Example 1 was higher than that of Example 2. This result suggests that, for the input data for training and operation, weekday data is preferable to holiday data.
 The comparative example uses only pulse data, which can be collected by a relatively large number of wearable devices. The match grade of its trained model was lower than that of any of the trained models of Examples 1 to 5. This result suggests that the input data for training and operation preferably includes autonomic nerve data as biological information.

Claims (20)

  1.  A biological information analysis device comprising:
     an acquisition unit that acquires biological information including autonomic nerve data of a subject, and action record information in which a plurality of actions of the subject are recorded; and
     a calculation unit that calculates mental state information for each action of the subject based on the biological information and the action record information.
  2.  The biological information analysis device according to claim 1, wherein
     the calculation unit specifies, from the time-series autonomic nerve data, partial data corresponding to the period in which each action was performed, and calculates a statistical value of the specified partial data as the mental state information.
  3.  The biological information analysis device according to claim 1 or 2, wherein
     the acquisition unit further acquires, as the biological information, body movement data indicating the intensity of body movement of the subject, and
     corrects the action record information based on the acquired body movement data.
  4.  The biological information analysis device according to any one of claims 1 to 3, wherein
     the acquisition unit acquires mental state response data concerning the mental state of the subject, and
     the calculation unit calculates a mental state score of the subject based on the mental state response data.
  5.  The biological information analysis device according to any one of claims 1 to 4, wherein
     the acquisition unit acquires session evaluation response data concerning an evaluation of a session conducted with the subject, and
     the calculation unit calculates a session evaluation score of the session based on the session evaluation response data.
  6.  The biological information analysis device according to any one of claims 1 to 5, wherein
     the autonomic nerve data includes sympathetic nerve data and parasympathetic nerve data.
  7.  The biological information analysis device according to claim 6, wherein
     the autonomic nerve data further includes total power data in which the sympathetic nerve data and the parasympathetic nerve data are integrated.
  8.  The biological information analysis device according to any one of claims 1 to 7, wherein
     the action record information includes action data indicating each action and time data indicating the time at which each action was performed.
  9.  The biological information analysis device according to any one of claims 1 to 8, further comprising
     an output control unit that outputs information associating action data indicating each action, the period in which each action was performed, and the mental state information for each action.
  10.  The biological information analysis device according to any one of claims 1 to 9, further comprising
     a generation unit that generates, based on the mental state information for each action, at least one of a comment and an image evaluating the mental state in each action.
  11.  The biological information analysis device according to any one of claims 1 to 10, wherein
     the subject is a first subject, and
     the device further comprises a determination unit that generates a mental state determination result in which the mental state of the first subject is determined, by inputting the biological information and the action record information of the first subject into a trained model trained, for a plurality of second subjects having a specific mental tendency, using biological information including autonomic nerve data of the second subjects, action record information in which a plurality of actions of the second subjects are recorded, and question response information answered by the second subjects.
  12.  The biological information analysis device according to any one of claims 1 to 11, wherein
     the acquisition unit acquires the biological information based on data collected by a wearable sensor attachable to the body surface of the subject.
  13.  The biological information analysis device according to any one of claims 1 to 12, wherein
     the acquisition unit acquires at least 144 hours of the biological information and the action record information, and
     the calculation unit calculates the mental state information based on the at least 144 hours of the biological information and the action record information.
  14.  A biological information analysis system comprising a server device and a display terminal, wherein
     the server device comprises:
     an acquisition unit that acquires biological information including autonomic nerve data of a subject, and action record information in which a plurality of actions of the subject are recorded;
     a calculation unit that calculates mental state information for each action of the subject based on the biological information and the action record information; and
     an output control unit that transmits the mental state information for each action of the subject to the display terminal, and
     the display terminal comprises:
     a display control unit that receives the mental state information for each action of the subject transmitted from the server device, and causes the received mental state information for each action of the subject to be displayed.
  15.  A biological information analysis program that causes a computer to execute processes of:
     acquiring biological information including autonomic nerve data of a subject, and action record information in which a plurality of actions of the subject are recorded; and
     calculating mental state information for each action of the subject based on the biological information and the action record information.
  16.  A biological information analysis method comprising:
     acquiring biological information including autonomic nerve data of a subject, and action record information in which a plurality of actions of the subject are recorded; and
     calculating mental state information for each action of the subject based on the biological information and the action record information.
  17.  The biological information analysis method according to claim 16, wherein
     the subject is a first subject, and
     the method further comprises constructing a trained model by performing machine learning using, for a period of at least 144 hours for a plurality of second subjects having a specific mental tendency, biological information including autonomic nerve data of the second subjects, action record information in which a plurality of actions of the second subjects are recorded, and question response information answered by the second subjects.
  18.  The biological information analysis method according to claim 16, wherein
     the autonomic nerve data further includes sympathetic nerve data, parasympathetic nerve data, and total power data in which the sympathetic nerve data and the parasympathetic nerve data are integrated,
     the action record information includes action data indicating each action and time data indicating the time at which each action was performed, and
     the acquiring acquires the autonomic nerve data by a wearable sensor attachable to the body surface of the subject.
  19.  The biological information analysis method according to claim 17, wherein
     the constructing of the trained model performs the machine learning using, as the autonomic nerve data of the second subjects, data including sympathetic nerve data, parasympathetic nerve data, and total power data in which the sympathetic nerve data and the parasympathetic nerve data are integrated.
  20.  A biological information analysis device comprising:
     an acquisition unit that acquires biological information including autonomic nerve data of a first subject, and action record information in which a plurality of actions of the first subject are recorded; and
     a determination unit that generates a mental state determination result in which the mental state of the first subject is determined, by inputting the biological information and the action record information of the first subject into a trained model trained, for a plurality of second subjects having a specific mental tendency, using biological information including autonomic nerve data of the second subjects, action record information in which a plurality of actions of the second subjects are recorded, and question response information answered by the second subjects.
PCT/JP2020/020391 (WO2020255630A1), filed 2020-05-22, priority date 2019-06-17: Biological information analysis device, biological information analysis system, biological information analysis program, and biological information analysis method

Priority Applications (1)
- JP2021527494A (JPWO2020255630A1), priority date 2019-06-17, filed 2020-05-22

Applications Claiming Priority (2)
- JP2019111884, filed 2019-06-17
- JP2019-111884, filed 2019-06-17

Publications (1)
- WO2020255630A1, published 2020-12-24

Family ID: 74037097

Family Applications (1)
- PCT/JP2020/020391 (WO2020255630A1), priority date 2019-06-17, filed 2020-05-22: Biological information analysis device, biological information analysis system, biological information analysis program, and biological information analysis method

Country Status (2)
- JP: JPWO2020255630A1
- WO: WO2020255630A1

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6988034B1 (en) * 2021-04-13 2022-01-05 国立大学法人東海国立大学機構 Pregnant woman depressive symptom estimation system and estimation method, estimation model generator
WO2022153538A1 (en) * 2021-01-18 2022-07-21 日本電気株式会社 Stress level estimation method, teacher data generation method, information processing device, stress level estimation program, and teacher data generation program
WO2023176885A1 (en) * 2022-03-18 2023-09-21 株式会社MentalBase Information processing device, information processing method, and program

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001344352A (en) * 2000-05-31 2001-12-14 Toshiba Corp Life assisting device, life assisting method and advertisement information providing method
JP2012249797A (en) * 2011-06-02 2012-12-20 Konica Minolta Holdings Inc System, program and method for stress analysis
JP2016115057A (en) * 2014-12-12 2016-06-23 セイコーエプソン株式会社 Biological information processing system, server system, biological information processing apparatus, biological information processing method, and program
JP2017213278A (en) * 2016-06-01 2017-12-07 コニカミノルタ株式会社 Mental health evaluation apparatus, method, and program
JP2018139087A (en) * 2017-02-24 2018-09-06 沖電気工業株式会社 Emotion estimating server, emotion estimating method, presenting device and emotion estimating system
JP2019030389A (en) * 2017-08-04 2019-02-28 パナソニックIpマネジメント株式会社 Autonomic state evaluation device, autonomic state evaluation system, autonomic state evaluation method and program
JP2019030557A (en) * 2017-08-09 2019-02-28 沖電気工業株式会社 Presentation device, presentation method, emotion estimation server, emotion estimation method, and emotion estimation system
JP2019175108A (en) * 2018-03-28 2019-10-10 沖電気工業株式会社 Emotional information management server device, emotional information management method, program, terminal device, and information communication system

Also Published As

Publication number Publication date
JPWO2020255630A1 (en) 2020-12-24

Similar Documents

Publication Publication Date Title
US11553870B2 (en) Methods for modeling neurological development and diagnosing a neurological impairment of a patient
WO2020255630A1 (en) Biological information analysis device, biological information analysis system, biological information analysis program, and biological information analysis method
Dillenseger et al. Digital biomarkers in multiple sclerosis
Levin et al. Tracking workload in the emergency department
Bizzozero et al. Upper and lower face apraxia: role of the right hemisphere
Umansky et al. Workload in nursing
US20190239791A1 (en) System and method to evaluate and predict mental condition
CN106256316B (en) Method and apparatus for assessing physiological aging level
US20040210159A1 (en) Determining a psychological state of a subject
Arnardottir et al. The future of sleep measurements: a review and perspective
JP2007287144A (en) Case based outcome prediction in real-time monitoring system
Pandian Sleep pattern analysis and improvement using artificial intelligence and music therapy
Palmius et al. A multi-sensor monitoring system for objective mental health management in resource constrained environments
JP7229491B1 (en) Learning device and estimation system
CN115428092A (en) System and method for assisting individuals in a behavioral modification program
Booth et al. Toward robust stress prediction in the age of wearables: Modeling perceived stress in a longitudinal study with information workers
Contrada et al. Personality and self-regulation in health and disease: Toward an integrative perspective
US20160092647A1 (en) Method for recording medical information of a user and for sharing user experience with symptoms and medical intervention
CN110459311B (en) Medicine taking reminding system and method for remotely monitoring medicine taking of patient
Parade et al. Infant sleep moderates the effect of infant temperament on maternal depressive symptoms, maternal sensitivity, and family functioning
JP2008183048A (en) Work fatigue degree measuring system
Xue et al. Understanding how group workers reflect on organizational stress with a shared, anonymous heart rate variability data visualization
Lewy Wearable devices-from healthy lifestyle to active ageing
Bufano et al. Weakened Sustained Attention and Increased Cognitive Effort after Total Sleep Deprivation: A Virtual Reality Ecological Study
US20230329631A1 (en) Systems and methods involving sleep management

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20827566

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021527494

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20827566

Country of ref document: EP

Kind code of ref document: A1