WO2020161901A1 - Biological information processing apparatus, method, and computer-readable recording medium - Google Patents

Biological information processing apparatus, method, and computer-readable recording medium Download PDF

Info

Publication number
WO2020161901A1
Authority
WO
WIPO (PCT)
Prior art keywords
identification
sensor data
identification model
condition
model
Prior art date
Application number
PCT/JP2019/004660
Other languages
English (en)
Japanese (ja)
Inventor
友嗣 大野
久保 雅洋
利憲 細井
Original Assignee
日本電気株式会社
Priority date
Filing date
Publication date
Application filed by 日本電気株式会社
Priority to PCT/JP2019/004660 (WO2020161901A1)
Priority to JP2020570326A (JP7238910B2)
Priority to US17/428,102 (US20220022819A1)
Publication of WO2020161901A1

Links

Images

Classifications

    • A: HUMAN NECESSITIES; A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B: DIAGNOSIS; SURGERY; IDENTIFICATION; A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/746: Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • A61B 5/7282: Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • A61B 5/02055: Simultaneously evaluating both cardiovascular condition and temperature
    • A61B 5/1118: Determining activity level
    • A61B 5/168: Evaluating attention deficit, hyperactivity
    • A61B 5/4806: Sleep evaluation
    • G: PHYSICS; G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS; G16H: HEALTHCARE INFORMATICS
    • G16H 40/63: ICT specially adapted for the management or operation of medical equipment or devices, for local operation
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/30: ICT specially adapted for calculating health indices; for individual health risk assessment

Definitions

  • The present disclosure relates to a biological information processing apparatus, method, and computer-readable recording medium, and more specifically to a biological information processing apparatus, method, and computer-readable recording medium that process biological information acquired from a patient or the like.
  • Hospitalized patients include those at risk of problem behaviors such as falling out of bed, removing intubation tubes, shouting, or becoming violent. Patients who exhibit such problem behaviors are often in a condition called "restlessness" or "delirium". Medical staff such as nurses and caregivers spend roughly 20 to 30% of their time dealing with inpatients at risk of problem behaviors, which reduces the time available for the work they should be focusing on.
  • Patent Document 1 discloses a biological information monitoring system that monitors biological information of a subject on a bed.
  • The biological information monitoring system described in Patent Document 1 includes a physical condition determination unit.
  • The physical condition determination unit determines the physical condition of the subject using various kinds of biological information such as weight, body movement, respiration, and heartbeat.
  • For example, the physical condition determination unit applies the subject's biological information to a function (model) that has been learned using labeled teacher data and that indicates whether the subject is asleep, and thereby determines whether or not the subject is in a sleeping state.
  • The physical condition determination unit also determines whether the subject is in a state of delirium based on the subject's body movement information and/or respiration rate.
  • In Patent Document 1, for example, a function representing sleep or wakefulness is created using a large amount of biological data (labeled teacher data).
  • However, Patent Document 1 does not describe modification of the learned function.
  • The relationship between biological information data and sleep or wakefulness may change when the composition of the inpatient population receiving care changes. The relationship may also change with the seasons. In such cases, if the function created once continues to be used, the accuracy of the physical condition determination results deteriorates.
  • The present disclosure therefore aims to provide a biological information processing apparatus, a method, and a computer-readable recording medium capable of suppressing a decrease in the accuracy of physical condition determination results.
  • The present disclosure provides a biological information processing device including: internal state identification means for acquiring sensor data of a person to be monitored from a sensor group including one or more sensors, and identifying the internal state of the person to be monitored based on the acquired sensor data and an identification model for identifying the internal state that was generated using sensor data acquired in the past; determination means for determining whether or not a condition for generating another identification model different from the existing identification model is satisfied; and model generation means for generating, when the determination means determines that the condition is satisfied, an identification model different from the identification model used by the internal state identification means, using the sensor data of the person to be monitored acquired from the sensor group.
  • The present disclosure also provides a biological information processing method including: acquiring sensor data of a person to be monitored from a sensor group including one or more sensors; identifying the internal state of the person to be monitored based on the acquired sensor data and an identification model for identifying the internal state that was generated using sensor data acquired in the past; determining whether or not a condition for generating another identification model different from the existing identification model is satisfied; and, when it is determined that the condition is satisfied, generating an identification model different from the identification model used to identify the internal state, using the sensor data of the person to be monitored acquired from the sensor group.
  • The present disclosure further provides a computer-readable recording medium storing a program that causes a computer to execute processing for: acquiring sensor data of a person to be monitored from a sensor group including one or more sensors; identifying the internal state of the person to be monitored based on the acquired sensor data and an identification model for identifying the internal state that was generated using sensor data acquired in the past; determining whether or not a condition for generating another identification model different from the existing identification model is satisfied; and, when it is determined that the condition is satisfied, generating an identification model different from the identification model used to identify the internal state, using the sensor data of the person to be monitored acquired from the sensor group.
  • The biological information processing device, method, and computer-readable recording medium according to the present disclosure can suppress a decrease in the accuracy of physical condition determination results.
  • FIG. 1 is a block diagram schematically showing a biological information processing device according to the present disclosure.
  • FIG. 2 is a block diagram showing a system including the biological information processing device according to the first embodiment of the present disclosure.
  • FIG. 3 is a graph showing a specific example of the restlessness score.
  • FIG. 4 is a flowchart showing the operation procedure in the first embodiment.
  • FIG. 5 is a flowchart showing the operation procedure in the second embodiment.
  • FIG. 6 is a block diagram showing a configuration example of an information processing device that can be used as the biological information processing device.
  • FIG. 1 schematically illustrates the biological information processing device of the present disclosure.
  • The biological information processing device 10 includes determination means 11, model generation means 12, and internal state identification means 13.
  • The sensor group 20 includes one or more sensors.
  • The internal state identification means 13 acquires sensor data of a person to be monitored, such as a patient, from the sensor group 20.
  • The internal state identification means 13 identifies the internal state of the person to be monitored based on the acquired sensor data and the identification model 40.
  • The internal state of the person to be monitored refers to, for example, a state that another person cannot directly judge from the outside, and includes, for example, a mental state.
  • The identification model 40 is a model for identifying the internal state of the person to be monitored, and is generated using sensor data acquired in the past.
  • Here, sensor data acquired in the past means data acquired before the internal state of the person to be monitored is identified.
  • The data acquired in the past may include data of the person being monitored, for example data acquired when that person stayed at the facility in the past.
  • Alternatively, the data acquired in the past may be data acquired not from the person being monitored but from a different person.
  • The determination means 11 determines whether or not a condition for generating another identification model different from the existing identification model is satisfied.
  • When the determination means 11 determines that the condition is satisfied, the model generation means 12 generates an identification model 50 different from the identification model 40 used by the internal state identification means 13, using the sensor data of the person to be monitored acquired from the sensor group 20.
  • In other words, the model generation means 12 generates the identification model 50 using the sensor data of the person to be monitored acquired from the sensor group 20 when the condition for generating another identification model is satisfied.
  • The internal state identification means 13 can then identify the internal state using the generated identification model 50. Since the identification model 50 is generated using sensor data acquired from the person actually being monitored, the accuracy of identification results obtained with the identification model 50 may be higher than that obtained with the identification model 40. In the present disclosure, because the identification model 50 is generated when the above condition is satisfied, a decrease in the accuracy of internal state identification can be suppressed in situations where that condition holds.
  • FIG. 2 shows a biological information processing system including the biological information processing device according to the first embodiment of the present disclosure.
  • The biological information processing system 100 includes a biological information processing device (restlessness identification device) 110, a sensor group 120, a storage device 140, and a notification unit 150.
  • The restlessness identification device 110 is configured as a computer device including, for example, a memory and a processor.
  • The storage device 140 is configured as a storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • The restlessness identification device 110 corresponds to the biological information processing device 10 in FIG. 1.
  • It is known that, in many cases, a patient becomes excessively active and restless (a restless state) before a problem behavior actually occurs.
  • The "restless state" may include not only excessive activity and restlessness but also a state in which the patient is not calm and a state in which the mind is not under normal control. Since a restless state arises from at least one of physical distress and delirium, the term "restless state" is intended in this specification to include delirium.
  • The restlessness identification device 110 identifies an internal state, including a mental state, of a person to be monitored such as a patient, for example a restless state of that person.
  • The storage device 140 stores past data 141, attribute information 142, and an identification model 143.
  • The identification model 143 is a model (identification parameters) for generating information indicating the level of the restless state from the sensor data obtained from the sensor group 120.
  • The level of the restless state includes, for example, restless, normal, and unknown, which is neither of the two.
  • The restless state may also be represented as multiple level values.
  • For example, the restless state may be represented by three levels (strong, moderate, and weak). In this case, the stronger the level, the higher the probability that a problem behavior will occur or the more serious the problem it may cause. Unknown is a state in which it is difficult to determine whether the state is restless or normal.
  • The identification model 143 is generated, for example, by learning the relationship between past sensor data and the past restless or non-restless states.
  • The identification model 143 corresponds to the identification model 40 or 50 in FIG. 1.
  • The past data 141 includes learning data used for machine learning of the identification model 143.
  • The past sensor data used to generate the identification model 143 is labeled to indicate whether the patient was normal or restless when each piece of sensor data was obtained.
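  • As an illustration only, and not part of the original disclosure, the following sketch shows how an identification model of this kind could be trained from labeled past sensor data. The feature choices, the example values, and the use of scikit-learn's LogisticRegression are assumptions.

```python
# Minimal sketch (not from the patent): training an identification model from past
# sensor data labeled "normal" (0) or "restless" (1). Features and values are assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical past data 141: one row per time window,
# columns = [mean heart rate, respiration rate, body-movement count].
past_features = np.array([
    [62.0, 14.0, 1.0],   # labeled normal
    [65.0, 15.0, 2.0],   # labeled normal
    [98.0, 22.0, 9.0],   # labeled restless
    [91.0, 20.0, 7.0],   # labeled restless
])
past_labels = np.array([0, 0, 1, 1])  # 0 = normal, 1 = restless

# The fitted classifier plays the role of identification model 143.
identification_model = LogisticRegression().fit(past_features, past_labels)

# predict_proba()[:, 1] behaves like a restlessness score in [0, 1].
new_window = np.array([[88.0, 19.0, 6.0]])
print(identification_model.predict_proba(new_window)[0, 1])
```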
  • The past data 141 includes past sensor data of the person to be monitored that was acquired from the sensor group 120.
  • The past data 141 may also include sensor data acquired from patients other than the person to be monitored.
  • The attribute information 142 is attribute information of the group to which the patients belong from whom the sensor data used to generate the identification model 143 was obtained.
  • The attribute information includes, for example, information about the facility where the patient is hospitalized, information about the surroundings of the facility, and information about time.
  • The information about the facility includes, for example, information indicating the department to which the patient belongs, such as neurosurgery, cardiac surgery, respiratory surgery, oncology, psychiatry, or palliative care.
  • The information about the facility may also include the type of facility, such as an acute care hospital, a rehabilitation hospital, or a nursing home.
  • The information about the surroundings of the facility includes the location, the region, the presence of other hospitals nearby, temperature, humidity, the average age of local residents, and information about food and drink in the region.
  • The information about time includes, for example, the season, the month, or the time of day, such as day or night.
  • The sensor group 120 includes one or more sensors that acquire biological information (sensor data) of the person to be monitored, such as a patient.
  • The sensor data includes information selected from the group consisting of heart rate, respiration, blood pressure, body temperature, level of consciousness, skin temperature, skin conductance response, electrocardiographic waveform, and electroencephalographic waveform.
  • The attribute information 130 is attribute information of the group to which the person to be monitored belongs. The sensor group 120 and the attribute information 130 correspond to the sensor group 20 and the attribute information 30 in FIG. 1.
  • The restlessness identification device 110 includes an internal state identification unit 111, a determination unit 112, and a model generation unit 113.
  • The internal state identification unit 111 acquires sensor data of the patient to be monitored from the sensor group 120.
  • The internal state identification unit 111 identifies the internal state (restless state) of the patient based on the acquired sensor data and the identification model 143 stored in the storage device 140.
  • The internal state identification unit 111 may identify the restless state by extracting feature values from the acquired sensor data and applying the extracted feature values to the identification model 143.
  • The internal state identification unit 111 outputs, for example, a score indicating the level of the restless state (a restlessness score) as the result of identifying the restless state.
  • The internal state identification unit 111 corresponds to the internal state identification means 13 in FIG. 1.
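  • The following sketch, which is not part of the original disclosure, illustrates one way such feature extraction and scoring could be wired together; the window summarization and the predict_proba-style model interface are assumptions carried over from the earlier training sketch.

```python
# Minimal sketch (assumptions: feature choice, window summarization, and a model
# exposing predict_proba as in the earlier training example).
import numpy as np

def extract_features(heart_rate: np.ndarray, respiration: np.ndarray,
                     movement: np.ndarray) -> np.ndarray:
    """Summarize one time window of raw sensor signals into a feature vector."""
    return np.array([
        heart_rate.mean(),   # average heart rate in the window
        respiration.mean(),  # average respiration rate
        movement.sum(),      # total body-movement count
    ]).reshape(1, -1)

def restlessness_score(model, heart_rate, respiration, movement) -> float:
    """Apply the identification model to the extracted features; returns a value in [0, 1]."""
    features = extract_features(heart_rate, respiration, movement)
    return float(model.predict_proba(features)[0, 1])
```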
  • The notification unit 150 outputs the restless state identification result produced by the internal state identification unit 111 to medical staff or the like.
  • The notification unit 150 may notify medical staff or the like that the patient is in a restless state, for example, when the restlessness score output by the internal state identification unit 111 is equal to or greater than a predetermined value.
  • The notification unit 150 includes, for example, at least one of a lamp, an image display device, and a speaker, and may use at least one of light, image information, and sound to notify medical staff or the like that the patient is in a restless state.
  • The notification unit 150 may display that the patient is in a restless state on the display screen of a portable information terminal, such as a smartphone or tablet, carried by a medical worker.
  • The notification unit 150 may notify medical staff by voice, through an earphone or the like worn by the staff, that the patient is in a restless state.
  • The notification unit 150 may display on a monitor installed in a nurse station or the like that the patient is in a restless state, or may announce it using a speaker installed in the nurse station or the like.
  • Because the notification unit 150 notifies medical staff of the restless state before the patient's problem behavior occurs, medical staff or the like can attend to the patient before the problem behavior appears.
  • FIG. 3 shows a specific example of the restlessness score.
  • In FIG. 3, the vertical axis represents the restlessness score and the horizontal axis represents time.
  • The graph shown in FIG. 3 represents the change over time of the restlessness score obtained by applying the identification model in the internal state identification unit 111 to the sensor data at each point in time.
  • The restlessness score takes a value from 0 to 1.
  • A restlessness score of "0" represents that restlessness is not present (normal) or is least likely.
  • A restlessness score of "1" represents that the degree of restlessness is strongest or the possibility of restlessness is highest.
  • Such an identification model is generated, for example, by learning with training data in which samples acquired in a normal state are labeled "0" and samples acquired in a restless state are labeled "1".
  • The internal state identification unit 111 applies the sensor data, which can change from moment to moment, to the identification model 143 and outputs the restlessness score as a time series. When the restlessness score is equal to or greater than a predetermined value, for example 0.7, the notification unit 150 notifies medical staff or the like that the patient is in a restless state.
  • The threshold value used as the criterion for notification may be set appropriately according to the identification model used and other conditions.
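  • As a minimal illustration, not taken from the patent, thresholding the time series of restlessness scores for notification could look like the sketch below; the 0.7 threshold follows the example in the text, while the function names and the stand-in notification callback are assumptions.

```python
# Minimal sketch: notify medical staff whenever the restlessness score reaches the
# threshold. The threshold 0.7 follows the example in the text; the rest is assumed.
NOTIFY_THRESHOLD = 0.7

def check_and_notify(score_series, notify):
    """Call notify(t, score) for every time step whose score reaches the threshold."""
    for t, score in enumerate(score_series):
        if score >= NOTIFY_THRESHOLD:
            notify(t, score)

# Example usage with a stand-in notification function.
check_and_notify(
    [0.2, 0.4, 0.75, 0.9],
    notify=lambda t, s: print(f"t={t}: restlessness score {s:.2f} -> alert staff"),
)
```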
  • A medical worker who receives the notification can go and check the condition of the patient.
  • The medical worker may use a terminal device such as a tablet placed beside the bed to input information indicating whether the patient is actually restless. The medical worker may also use such a terminal device to input information about the treatment given to the patient, such as speaking to the patient or adjusting the bed.
  • The determination unit 112 determines whether or not a condition for generating another identification model different from the existing identification model is satisfied.
  • The condition for generating another identification model can also be read as a condition for regenerating the identification model 143.
  • The determination unit 112 determines whether or not the condition for generating another identification model is satisfied based on the accuracy of the restless state identification results produced by the internal state identification unit 111. For example, when the accuracy of the identification results is lower than a predetermined threshold value, the determination unit 112 determines that the condition for generating another identification model is satisfied.
  • The accuracy of the restless state identification results can be calculated by comparing the identification results of the internal state identification unit 111 with the restless state judgments input by medical staff or the like.
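  • A minimal sketch of this accuracy-based condition is given below; it is not from the patent, and the 0/1 label encoding, the 70% threshold, and the one-to-one alignment of staff judgments with model outputs are assumptions.

```python
# Minimal sketch: trigger regeneration of the identification model when accuracy,
# measured against staff judgments, falls below a threshold (assumed here to be 0.7).
def identification_accuracy(model_labels, staff_labels) -> float:
    """Fraction of windows where the model's restless/normal label matches the staff judgment."""
    matches = sum(m == s for m, s in zip(model_labels, staff_labels))
    return matches / len(staff_labels)

def regeneration_condition_met(model_labels, staff_labels, threshold=0.7) -> bool:
    """Condition for generating another identification model: accuracy below the threshold."""
    return identification_accuracy(model_labels, staff_labels) < threshold

print(regeneration_condition_met([1, 1, 0, 0, 1], [1, 0, 0, 1, 0]))  # True (accuracy 0.4)
```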
  • The determination unit 112 may also determine whether or not the condition for generating another identification model is satisfied based on the time-series data of the restless state (restlessness scores) identified by the internal state identification unit 111. For example, the determination unit 112 determines that the condition for generating another identification model is satisfied when the restlessness scores are concentrated within a certain range.
  • Here, the restlessness scores being concentrated within a certain range means, for example, a state in which the ratio of the number of restlessness scores taking values within that range to the total number of restlessness scores (all samples) is equal to or greater than a predetermined ratio.
  • In that case, the identification model used for generating the restlessness scores may not be able to properly identify the restless state. Specifically, if most of the restlessness scores fall in a range near the middle between the value indicating restlessness and the value indicating normality, the identification model may not be able to correctly distinguish restlessness from normality.
  • The determination unit 112 may therefore determine that the condition for generating another identification model is satisfied when the ratio of restlessness scores taking values in the range near the middle is equal to or greater than a certain level.
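  • The following sketch, not part of the disclosure, shows one way this score-distribution check could be written; the band from 0.4 to 0.6 and the 50% trigger ratio are assumed values standing in for the unspecified "certain range" and "certain level".

```python
# Minimal sketch: the condition holds when too many restlessness scores sit in a
# middle band where restless and normal are hard to tell apart. Band and ratio are assumed.
def scores_concentrated_near_middle(scores, low=0.4, high=0.6, ratio=0.5) -> bool:
    """True if the fraction of restlessness scores inside [low, high] is at least `ratio`."""
    in_band = sum(low <= s <= high for s in scores)
    return in_band / len(scores) >= ratio

# Most scores near the middle -> the model separates the two states poorly,
# so the condition for generating another identification model is treated as satisfied.
print(scores_concentrated_near_middle([0.45, 0.50, 0.55, 0.90, 0.52, 0.48]))  # True
```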
  • The determination unit 112 may also determine whether or not the condition for generating another identification model is satisfied based on the attribute information 130 of the person to be monitored and the attribute information 142 stored in the storage device 140.
  • For example, the determination unit 112 compares the attribute information 130 with the attribute information 142, and if the current situation of the facility differs from the situation when the identification model was generated (that is, if the attribute information has changed), it may determine that the condition for generating another identification model is satisfied.
  • The determination unit 112 may determine that the condition for generating another identification model is satisfied when, for example, a new hospital is built near the hospital where the patient is admitted, or when there are no longer any hospitals nearby. Alternatively, it may determine that the condition is satisfied when a new clinical department is added to the hospital or other facility where the patient is admitted.
  • Whether the group of persons to be monitored (its attribute information) differs from the group (attribute information) at the time the identification model was generated, in other words whether the group has changed, can be judged, for example, using the following method.
  • Machine learning is performed using the attribute information 130 and the attribute information 142 as explanatory variables and a value indicating whether or not the group has changed (changed: 1, not changed: 0) as the target variable.
  • The teacher data used for this machine learning can be generated based on the accuracy of the identification results obtained using the identification model 143 and the sensor data acquired from the sensor group 120.
  • The accuracy of the identification results can be calculated, for example, by comparing the values input by medical staff with the identification results. If the accuracy is lower than a preset threshold value, for example 70%, the group is considered to have changed (value "1"); if it is equal to or higher than the threshold value, the group is considered not to have changed (value "0").
  • The attribute information 130 and the attribute information 142 can then be applied to the model obtained by this machine learning to obtain a value indicating whether or not the group has changed. In this case, the accuracy is calculated in the learning phase to determine whether the group has changed, but in the identification phase the accuracy is not calculated; whether the group has changed is determined from the attribute information 130 and the attribute information 142 alone.
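  • As an illustration only, a group-change classifier of this kind could be sketched as follows; the numeric encoding of the attribute information, the example rows, and the use of scikit-learn's DecisionTreeClassifier are assumptions rather than anything specified in the disclosure.

```python
# Minimal sketch (not the patent's implementation): learn whether the group has
# changed from pairs of attribute vectors. Attribute encoding and model type are assumed.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row concatenates the attribute information at model-generation time (142)
# with the current attribute information (130):
# [dept_142, region_142, season_142, dept_130, region_130, season_130]
X = np.array([
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0, 2],
    [1, 1, 0, 0, 2, 3],
])
# Target: 1 = group changed (identification accuracy fell below 70% in the learning
# phase), 0 = group unchanged.
y = np.array([0, 0, 1, 1])

group_change_model = DecisionTreeClassifier().fit(X, y)

# Identification phase: no accuracy is computed; the decision comes from the
# attribute information 130 and 142 alone.
current_pair = np.array([[0, 0, 0, 1, 0, 1]])
print(bool(group_change_model.predict(current_pair)[0]))
```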
  • The determination unit 112 may also determine the season from the current date and time and determine that the condition for generating another identification model is satisfied when the season changes. Alternatively, the determination unit 112 may determine once a month that the condition for generating another identification model is satisfied. The determination unit 112 may also determine that the condition for generating another identification model is satisfied at the time when operation of the biological information processing system 100 starts. The determination unit 112 corresponds to the determination means 11 in FIG. 1.
  • When the determination unit 112 determines that the condition for generating another identification model is satisfied, the model generation unit 113 generates a new identification model, separate from the existing identification model 143, using the past sensor data of the person to be monitored acquired from the sensor group 120. The generated new identification model is used by the internal state identification unit 111 to identify the restless state. The model generation unit 113 does not generate a new identification model unless it is determined that the condition for generating another identification model is satisfied. If it is not determined that the condition is satisfied, the data acquired from the sensor group 120 may instead be added to the past data 141 used to generate the existing identification model 143.
  • The model generation unit 113 corresponds to the model generation means 12 in FIG. 1.
  • FIG. 4 shows the operation procedure in the first embodiment.
  • The internal state identification unit 111 acquires sensor data from the sensor group 120 (step A1).
  • The determination unit 112 determines whether or not the condition for generating another identification model is satisfied (step A2).
  • When the condition is determined to be satisfied, the model generation unit 113 generates a new identification model, independent of the previous model, using the sensor data of the person to be monitored (step A3).
  • When the condition is determined not to be satisfied, the model generation unit 113 does not generate a new identification model.
  • In the present embodiment, when the condition for generating another identification model is satisfied, a new identification model is generated independently of the previous model. For example, when the accuracy of the restless state identification results is lower than a predetermined threshold value, the identification model 143 currently in use is considered unsuitable for identifying the restless state of the person to be monitored. In such a case, it is determined that the condition for generating another identification model is satisfied, a new identification model is generated independently of the previous model, and the restless state is identified using that model. This suppresses a decrease in the accuracy of the restless state identification results.
  • Similarly, when the restlessness scores are concentrated within a certain range, the identification model currently in use may not be able to correctly identify the restlessness of the person to be monitored. In such a case as well, it is determined that the condition for generating another identification model is satisfied, a new identification model is generated independently of the previous model, and the restless state is identified using that model, which suppresses a decrease in the accuracy of the identification results.
  • When the situation of the facility where the person to be monitored is hospitalized changes, the attribute information of the group to which that person belongs changes, and the identification model 143 currently in use may no longer be suitable for identifying the restless state of that person. Further, since sensor data is affected by the temperature and humidity of the external environment, the identification model 143 may be unsuitable for identifying the restless state depending on the season or time of year. In such cases, it is determined that the condition for generating another identification model is satisfied, a new identification model is generated independently of the previous model, and the restless state is identified using that model, which suppresses a decrease in the accuracy of the identification results. In particular, when another identification model is generated according to a change of season or time, the restless state can be identified using an identification model periodically matched to the season or time.
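  • The operation procedure of FIG. 4 can be summarized by the following sketch, which is not part of the original disclosure; the object names and method signatures are placeholders standing in for the units described above.

```python
# Minimal sketch of the FIG. 4 flow (steps A1-A3). The helper objects and method
# names are placeholders for the behaviour described in the text, not patent APIs.
def operation_step(sensor_group, determination_unit, model_generation_unit, past_data):
    sensor_data = sensor_group.acquire()                      # step A1: acquire sensor data
    if determination_unit.condition_satisfied(sensor_data):   # step A2: check the condition
        # Step A3: generate a new identification model, independent of the previous one,
        # from the monitored person's own sensor data.
        return model_generation_unit.generate(sensor_data)
    # Condition not satisfied: no new model; the data may simply be added to past data 141.
    past_data.append(sensor_data)
    return None
```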
  • Suppose, for example, that an identification model 143 is generated using sensor data acquired from patients at one hospital as learning data.
  • When the identification model 143 is applied to sensor data acquired from patients admitted to another hospital, the accuracy of the restless state identification results obtained with the identification model 143 is considered to be high if the attribute information of the groups to which the patients providing both sets of sensor data belong is similar. However, when, for example, the region, the time period, or the clinical department differs, the accuracy of the restless state identification results obtained with the identification model 143 is not necessarily high.
  • Therefore, the determination unit 112 determines that the condition for generating another identification model is satisfied when, for example, the region, the time, or the clinical department differs between when the identification model was generated and when it is applied.
  • In the second embodiment, the determination unit 112 determines whether or not the amount of past sensor data of the person to be monitored included in the past data 141 is equal to or greater than a threshold value. When the amount of past sensor data of the person to be monitored is equal to or greater than the threshold value, the determination unit 112 determines that a sufficient amount of sensor data of that person exists. When it determines that a sufficient amount of sensor data exists, the determination unit 112 then determines whether or not the condition for generating another identification model is satisfied.
  • When the determination unit 112 determines that a sufficient amount of sensor data of the person to be monitored does not exist, it adds the acquired sensor data to the past data 141.
  • In that case, the determination unit 112 also causes the model generation unit 113 to regenerate the identification model. Other points may be similar to those of the first embodiment.
  • The threshold value for the amount of sensor data can be determined, for example, based on whether the identification accuracy hardly changes before and after adding more data.
  • FIG. 5 shows the operation procedure in the second embodiment.
  • The internal state identification unit 111 acquires sensor data from the sensor group 120 (step B1).
  • Step B1 may be similar to step A1 in FIG. 4.
  • The determination unit 112 determines whether or not the past sensor data of the person to be monitored included in the past data 141 is a sufficient amount for generating an identification model (step B2).
  • When the data is not sufficient, the determination unit 112 adds the sensor data acquired in step B1 to the past data 141 and causes the model generation unit 113 to regenerate the identification model (step B5).
  • In step B5, the model generation unit 113 regenerates (updates) the identification model 143 used by the internal state identification unit 111, using the past data 141 to which the sensor data has been added.
  • When the data is sufficient, the determination unit 112 determines in step B3 whether or not the condition for generating another identification model is satisfied.
  • Step B3 may be similar to step A2 in FIG. 4.
  • When the condition is determined to be satisfied, the model generation unit 113 generates a new identification model, independent of the previous model, using the past sensor data of the person to be monitored (step B4).
  • Step B4 may be similar to step A3 in FIG. 4.
  • When it is determined in step B3 that the condition for generating another identification model is not satisfied, the model generation unit 113 does not generate a new identification model.
  • In the present embodiment, the determination unit 112 determines whether or not a sufficient amount of sensor data for generating an identification model exists, and only when a sufficient amount exists does it determine whether the condition for generating another identification model is satisfied. If an identification model were generated when there is not enough sensor data, the accuracy of the restless state identification results obtained with that model would not be expected to be high. When the determination unit 112 determines that sufficient sensor data does not exist, the model generation unit 113 does not generate a new identification model independent of the previous one. This prevents the generation of an identification model whose identification results would have low accuracy, and prevents the restless state from being identified with such a model.
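  • The FIG. 5 flow, including the data-sufficiency gate, can be sketched as follows; this is an illustration only, and the object interfaces and the min_samples threshold are assumptions.

```python
# Minimal sketch of the FIG. 5 flow (steps B1-B5). Interfaces and the sufficiency
# threshold are assumed; they stand in for the units and threshold described above.
def operation_step_second_embodiment(sensor_group, determination_unit,
                                     model_generation_unit, past_data,
                                     min_samples=1000):
    sensor_data = sensor_group.acquire()                      # step B1: acquire sensor data
    if len(past_data) < min_samples:                          # step B2: data not sufficient
        past_data.append(sensor_data)                         # step B5: add the data and
        return model_generation_unit.regenerate(past_data)    #          update model 143
    if determination_unit.condition_satisfied(sensor_data):   # step B3: check the condition
        return model_generation_unit.generate(sensor_data)    # step B4: new, independent model
    return None                                               # condition not met: keep current model
```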
  • The storage device 140 can store a plurality of identification models 143, including the identification model generated in step A3 (see FIG. 4) or step B4 (see FIG. 5). In that case, the storage device 140 may store, for each identification model, the attribute information 142 of the group to which the acquisition source of the sensor data used to generate that model belongs.
  • When a plurality of identification models are stored, the determination unit 112 may calculate the accuracy of the identification results obtained when each of the plurality of identification models is applied to the acquired sensor data. The determination unit 112 may then select the identification model to be used by the internal state identification unit 111 based on the accuracy of the identification results. For example, the determination unit 112 may select, from among the plurality of identification models, the one whose identification results have the highest accuracy as the identification model used for identifying the restless state.
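  • A minimal sketch of such accuracy-based model selection is shown below; it is not from the patent, and it assumes each stored model exposes a predict() method returning 0/1 labels and that staff judgments are available for the same data windows.

```python
# Minimal sketch: pick, from several stored identification models, the one whose
# labels best match the judgments entered by medical staff. Interfaces are assumed.
def select_best_model(models, features, staff_labels):
    """Return the stored identification model with the highest agreement with staff judgments."""
    def accuracy(model):
        predicted = model.predict(features)
        return sum(p == s for p, s in zip(predicted, staff_labels)) / len(staff_labels)
    return max(models, key=accuracy)
```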
  • Each unit in the biological information processing system 100 may be realized using hardware or software, or by a combination of hardware and software.
  • FIG. 6 shows a configuration example of an information processing device (computer device) that can be used as the restlessness identification device 110.
  • The information processing device 500 includes a control unit (CPU: Central Processing Unit) 510, a storage unit 520, a ROM (Read Only Memory) 530, a RAM (Random Access Memory) 540, a communication interface (IF) 550, and a user interface 560.
  • The communication interface 550 is an interface for connecting the information processing device 500 to a communication network via wired or wireless communication means.
  • The user interface 560 includes a display unit such as a display.
  • The user interface 560 also includes an input unit such as a keyboard, a mouse, or a touch panel.
  • The storage unit 520 is an auxiliary storage device that can hold various data.
  • The storage unit 520 does not necessarily have to be part of the information processing device 500, and may be an external storage device or cloud storage connected to the information processing device 500 via a network.
  • The storage unit 520 corresponds to the storage device 140 in FIG. 2.
  • The ROM 530 is a non-volatile storage device.
  • For the ROM 530, a semiconductor memory device with a relatively small capacity, such as a flash memory, is used.
  • Programs executed by the CPU 510 can be stored in the storage unit 520 or the ROM 530.
  • Non-transitory computer readable media include various types of tangible storage media.
  • Examples of non-transitory computer-readable media include magnetic recording media such as flexible disks, magnetic tapes, and hard disks; magneto-optical recording media such as magneto-optical disks; optical disk media such as CDs (compact discs) and DVDs (digital versatile discs); and semiconductor memories such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, and RAM.
  • The program may also be supplied to the computer using various types of transitory computer-readable media.
  • Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
  • A transitory computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire or an optical fiber, or via a wireless communication path.
  • The RAM 540 is a volatile storage device.
  • For the RAM 540, various semiconductor memory devices such as DRAM (Dynamic Random Access Memory) or SRAM (Static Random Access Memory) are used.
  • The RAM 540 can be used as an internal buffer that temporarily stores data and the like.
  • The CPU 510 loads a program stored in the storage unit 520 or the ROM 530 into the RAM 540 and executes it. When the CPU 510 executes the program, the functions of the internal state identification unit 111, the determination unit 112, and the model generation unit 113 in the restlessness identification device 110 shown in FIG. 2 are realized.
  • The CPU 510 may also have an internal buffer that can temporarily store data and the like.
  • [Appendix 1] A biological information processing device comprising: internal state identification means for acquiring sensor data of a person to be monitored from a sensor group including one or more sensors, and identifying the internal state of the person to be monitored based on the acquired sensor data and an identification model, generated using sensor data acquired in the past, for identifying the internal state of the person to be monitored; determination means for determining whether or not a condition for generating another identification model different from the existing identification model is satisfied; and model generation means for generating, when the determination means determines that the condition is satisfied, an identification model different from the identification model used by the internal state identification means, using the sensor data of the person to be monitored acquired from the sensor group.
  • [Appendix 2] The biological information processing device according to Appendix 1, wherein the internal state includes whether or not the person to be monitored is in a restless state, and the internal state identification means outputs the level of the restless state as the result of identifying the internal state.
  • [Appendix 3] The biological information processing device according to Appendix 1 or 2, wherein the determination means determines whether or not the condition is satisfied based on the accuracy of the internal state identification results produced by the internal state identification means.
  • [Appendix 4] The biological information processing device according to Appendix 3, wherein the determination means determines that the condition is satisfied when the accuracy of the identification results is lower than a threshold value.
  • [Appendix 5] The biological information processing device according to Appendix 2, wherein the determination means determines whether or not the condition is satisfied based on the level of the restless state identified by the internal state identification means.
  • [Appendix 6] The biological information processing device according to Appendix 5, wherein the determination means determines that the condition is satisfied when the levels of the restless state are distributed within a predetermined range.
  • [Appendix 7] The biological information processing device in which the determination means determines whether or not the condition is satisfied based on attribute information of the group to which the person to be monitored belongs and attribute information of the group to which the acquisition source of the sensor data acquired in the past belongs.
  • [Appendix 8] The biological information processing device according to Appendix 7, wherein the attribute information includes information about the facility where a patient is hospitalized, information about the surroundings of the facility where the patient is hospitalized, and information about time.
  • [Appendix 9] The biological information processing device in which the determination means determines that the condition is satisfied when the attribute information of the group to which the person to be monitored belongs and the attribute information of the group to which the acquisition source of the sensor data acquired in the past belongs are different.
  • [Appendix 10] The biological information processing device according to Appendix 9, wherein the determination means determines whether or not the attribute information is different by using a model generated by performing machine learning with the attribute information of the group to which the person to be monitored belongs and the attribute information of the group to which the acquisition source of the sensor data acquired in the past belongs as explanatory variables, and with a value indicating whether or not the attribute information is different as the target variable.
  • [Appendix 11] The biological information processing device according to any one of Appendices 1 to 10, wherein, when there are a plurality of identification models usable by the internal state identification means, the determination means selects the identification model to be used by the internal state identification means based on the accuracy of the internal state identification results obtained by the internal state identification means using each of the plurality of identification models.
  • [Appendix 12] The biological information processing device according to any one of Appendices 1 to 11, wherein the determination means determines whether or not the amount of sensor data of the person to be monitored acquired from the sensor group is equal to or greater than a threshold value, and determines whether or not the condition is satisfied when the amount of sensor data is determined to be equal to or greater than the threshold value.
  • [Appendix 14] A computer-readable recording medium storing a program for causing a computer to execute processing for: acquiring sensor data of a person to be monitored from a sensor group including one or more sensors; identifying the internal state of the person to be monitored based on the acquired sensor data and an identification model, generated using sensor data acquired in the past, for identifying the internal state of the person to be monitored; determining whether or not a condition for generating another identification model different from the existing identification model is satisfied; and, when it is determined that the condition is satisfied, generating an identification model different from the identification model used to identify the internal state, using the sensor data of the person to be monitored acquired from the sensor group.
  • 10: biological information processing device, 11: determination means, 12: model generation means, 13: internal state identification means, 20: sensor group, 30: attribute information, 40, 50: identification model, 100: biological information processing system, 110: restlessness identification device, 111: internal state identification unit, 112: determination unit, 113: model generation unit, 120: sensor group, 130: attribute information, 140: storage device, 141: past data, 142: attribute information, 143: identification model, 150: notification unit

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The present disclosure suppresses a decrease in the accuracy of physical condition determination results. An internal state identification means (13) acquires sensor data of a person to be monitored, such as a patient, from a sensor group (20) comprising one or more sensors. An identification model (40) is a model for identifying the internal state of the person to be monitored. The identification model (40) is generated using previously acquired sensor data. The internal state identification means (13) identifies the internal state of the person to be monitored on the basis of the acquired sensor data and the identification model (40). A determination means (11) determines whether or not conditions for generating another identification model are satisfied. When it is determined that the conditions for generating another identification model are satisfied, a model generation means (12) generates another identification model (50), different from the identification model (40), using the sensor data of the person to be monitored, the sensor data being acquired from the sensor group (20).
PCT/JP2019/004660 2019-02-08 2019-02-08 Dispositif et procédé de traitement d'informations biologiques, et support d'enregistrement lisible par ordinateur WO2020161901A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2019/004660 WO2020161901A1 (fr) 2019-02-08 2019-02-08 Dispositif et procédé de traitement d'informations biologiques, et support d'enregistrement lisible par ordinateur
JP2020570326A JP7238910B2 (ja) 2019-02-08 2019-02-08 生体情報処理装置、方法、及びプログラム
US17/428,102 US20220022819A1 (en) 2019-02-08 2019-02-08 Biological information processing apparatus, method, and computer readable recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/004660 WO2020161901A1 (fr) 2019-02-08 2019-02-08 Dispositif et procédé de traitement d'informations biologiques, et support d'enregistrement lisible par ordinateur

Publications (1)

Publication Number Publication Date
WO2020161901A1 true WO2020161901A1 (fr) 2020-08-13

Family

ID=71948188

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/004660 WO2020161901A1 (fr) 2019-02-08 2019-02-08 Dispositif et procédé de traitement d'informations biologiques, et support d'enregistrement lisible par ordinateur

Country Status (3)

Country Link
US (1) US20220022819A1 (fr)
JP (1) JP7238910B2 (fr)
WO (1) WO2020161901A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003308319A (ja) * 2002-04-16 2003-10-31 Communication Research Laboratory 訳語選択装置、翻訳装置、訳語選択プログラム、及び翻訳プログラム
WO2014155690A1 (fr) * 2013-03-29 2014-10-02 富士通株式会社 Procédé, dispositif et programme de mise à jour de modèle
JP2014528314A (ja) * 2011-10-07 2014-10-27 コーニンクレッカ フィリップス エヌ ヴェ 患者を監視し、患者のせん妄を検出する監視システム
WO2018190152A1 (fr) * 2017-04-14 2018-10-18 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP2019016235A (ja) * 2017-07-07 2019-01-31 株式会社エヌ・ティ・ティ・データ 疾病発症予測装置、疾病発症予測方法およびプログラム
JP2019017499A (ja) * 2017-07-12 2019-02-07 パラマウントベッド株式会社 療養支援システム

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2007124523A (ru) * 2004-12-30 2009-02-10 ПРОВЕНТИС, Инк., (US) Способы, системы и компьютерные программные продукты для разработки и использования прогнозных моделей для прогнозирования большинства медицинских случаев, оценки стратегий вмешательства и для одновременной оценки нерегулярности биологических маркеров
KR101347509B1 (ko) * 2011-06-20 2014-01-06 가톨릭대학교 산학협력단 섬망 고위험군 예측모형 시스템 및 그 예측모형 방법, 및 그것을 이용한 섬망 고위험군 예측 시스템
JP6531821B2 (ja) * 2015-03-23 2019-06-19 日本電気株式会社 予測モデル更新システム、予測モデル更新方法および予測モデル更新プログラム
US11508465B2 (en) * 2018-06-28 2022-11-22 Clover Health Systems and methods for determining event probability
GB201817708D0 (en) * 2018-10-30 2018-12-19 Univ Oxford Innovation Ltd Method and apparatus for monitoring a patient


Also Published As

Publication number Publication date
JP7238910B2 (ja) 2023-03-14
US20220022819A1 (en) 2022-01-27
JPWO2020161901A1 (ja) 2021-11-25

Similar Documents

Publication Publication Date Title
JP7108267B2 (ja) 生体情報処理システム、生体情報処理方法、及びコンピュータプログラム
JP6114470B2 (ja) ヘルスケア意思決定支援システム、患者ケアシステム及びヘルスケア意思決定方法
JP2021513377A (ja) 生理学的イベント検知特徴部を備えたベッド
JP7057592B2 (ja) 生体情報処理システム、生体情報処理方法、および生体情報処理プログラム
US20150371522A1 (en) Multi-Station System for Pressure Ulcer Monitoring and Analysis
Sannino et al. An automatic rules extraction approach to support osa events detection in an mhealth system
JP2019527864A (ja) 安心で独立した生活を促進するためのバーチャル健康アシスタント
KR20180046354A (ko) 저전력 모션 센서를 이용한 코골이 검출 방법
US20230248283A9 (en) System and Method for Patient Monitoring
KR102321197B1 (ko) 딥러닝을 이용한 치매 위험인자 결정 방법 및 장치
WO2020170290A1 (fr) Dispositif de détermination d'anomalie, procédé et support lisible par ordinateur
US11717209B2 (en) Abnormality determination apparatus and non-transitory computer readable medium storing program used for the same
US20160232323A1 (en) Patient health state compound score distribution and/or representative compound score based thereon
WO2020161901A1 (fr) Dispositif et procédé de traitement d'informations biologiques, et support d'enregistrement lisible par ordinateur
US20200330021A1 (en) Incontinence detection systems and methods
US11213233B2 (en) Assessing delirium in a subject
Zhang et al. A feasibility study on smart mattresses to improve sleep quality
Otte et al. A behavioral approach to annotating sleep in infants: Building on the classic framework
CN111613330B (zh) 基于谵妄意识模糊快速评估法的智能评估系统
Tsolakou et al. Hypnos: a sleep monitoring and recommendation system to improve sleep hygiene in intelligent homes
WO2021106319A1 (fr) Système de gestion d'environnement, procédé de gestion d'environnement et support non transitoire lisible par ordinateur stockant un programme
JP7172483B2 (ja) 状態検出装置、状態検出方法、及びプログラム
Reichenbach et al. research is calling
Dow et al. Fuzzy set training for sleep apnea classification
Zhang et al. Multimodal wearable EEG, EMG and accelerometry measurements improve the accuracy of tonic-clonic seizure detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19914351

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020570326

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19914351

Country of ref document: EP

Kind code of ref document: A1