WO2018235628A1 - Monitoring support system, control method therefor, and program - Google Patents

Monitoring support system, control method therefor, and program

Info

Publication number
WO2018235628A1
Authority
WO
WIPO (PCT)
Prior art keywords
state
bed
image
subject
person
Prior art date
Application number
PCT/JP2018/021983
Other languages
English (en)
Japanese (ja)
Inventor
純平 松永
田中 清明
信二 高橋
達哉 村上
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2018235628A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons
    • G08B25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01: Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • G08B25/04: Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium using a single signalling line, e.g. in a closed loop

Definitions

  • The present invention relates to a technique for supporting the watching-over of a subject on a bed.
  • Systems that support the watching-over of patients in hospitals, nursing facilities, and the like are known, their purpose being to prevent accidents such as falls from a bed. For example, Patent Document 1 proposes a system in which the head of a patient is detected from an image captured by a camera installed obliquely above the bed, the patient is determined to have got up when the head crosses a boundary set with respect to the bed, and a nurse or the like is notified.
  • Such conventional systems generally detect the patient's head from an image and estimate the patient's behavior (such as getting up or leaving the bed) from the position and movement of the head.
  • However, the conventional method cannot correctly estimate the patient's behavior when the head cannot be detected from the image or is detected erroneously. For example, when the detection accuracy fluctuates with disturbances or the environment, such as when the patient is covered with a futon, when a person other than the patient or a confusing object is present, or when the lighting environment changes, dangerous behavior may be missed and false alarms (unnecessary notifications) may occur, which can significantly reduce the reliability of the system.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide a technique for detecting the state or behavior of a subject on a bed with high accuracy and high reliability.
  • To achieve this object, the state of the watching target is estimated from the monitoring image using a regressor that has machine-learned patterns of beds and human states, and the behavior of the target is detected from the estimated state.
  • A first aspect of the present invention is a watching support system for supporting the watching-over of a subject on a bed, comprising: a regressor machine-learned so that, when an image showing a bed and a person is input, it outputs a score indicating the state of the person with respect to the bed; an image acquisition unit that acquires images from an imaging device installed so as to capture a monitoring area including the bed of the subject; a determination unit that determines the behavior of the subject based on the score obtained by inputting an image of the monitoring area acquired by the image acquisition unit to the regressor; and an output unit that provides notification based on the determination result of the determination unit.
  • In this configuration, the regressor is used to estimate the state of the subject in the input image, so highly accurate state estimation can be performed even on unknown input images. Further, since the score output by the regressor is a continuous value, a reasonable estimation result can be obtained even when the state of the subject cannot be clearly classified. Furthermore, a reasonable estimation result can be obtained even when the subject is covered with a futon, when persons or objects other than the subject are present nearby, or when the lighting environment differs from usual, that is, even for images in which head detection is difficult.
  • The regressor may be a neural network. By using a neural network, highly accurate and highly robust state estimation can be performed.
  • The state of the person with respect to the bed may be classified in advance into a plurality of types, with a different score assigned to each type, and the regressor may be configured so that, when the state of the person is judged to lie between two types, it outputs a value between the scores of those two types.
  • Such a design makes it possible to express various human states as a one-dimensional score, so that the "state of a person" becomes easy to handle mathematically or in a program; for example, subsequent processing (such as the determination logic of the determination unit) can be constructed extremely easily.
  • The plurality of types may include state 0, in which the person is sleeping in the bed, state 1, in which the person has risen on the bed, and state 2, in which the person is away from the bed. If at least these three states can be distinguished, it is possible to detect "getting up" and "leaving the bed", for which the need for watching-over is high.
  • The determination unit may determine the behavior of the subject based on the temporal change of the state of the subject estimated from the score.
  • Determining the behavior of the subject from the temporal change (state transition) of the state, rather than from a momentary state, improves the accuracy of the behavior determination.
  • For example, the determination unit may determine that the subject has got up when the state of the subject transitions from state 0 to state 1 and state 1 continues for a predetermined time.
  • Likewise, the determination unit may determine that the subject has left the bed when the state of the subject transitions from state 0 to state 1 and then from state 1 to state 2 and state 2 continues for a predetermined time, or when the state of the subject transitions from state 0 to state 2 and state 2 continues for a predetermined time.
  • The present invention can be understood as a watching support system having at least a part of the above configuration or function.
  • The present invention can also be understood as a watching support method or a control method of a watching support system including at least a part of the above processing, as a program for causing a computer to execute these methods, or as a computer-readable recording medium on which such a program is non-transitorily recorded.
  • According to the present invention, the state or behavior of a subject on a bed can be detected with high accuracy and high reliability.
  • FIG. 1 is a block diagram schematically showing a hardware configuration and a functional configuration of the watching support system of the first embodiment.
  • FIG. 2 is a view showing an installation example of the imaging device.
  • FIG. 3A and FIG. 3B are examples of the monitoring area set for the image.
  • FIG. 4 shows types of human states and examples of images.
  • FIG. 5 is a diagram schematically showing machine learning of the regressor.
  • FIG. 6 is a diagram schematically showing the ability of the regressor.
  • FIG. 7 is a flowchart of the state monitoring process.
  • FIG. 8 is a flowchart of action determination.
  • FIG. 9 is an example of the determination logic.
  • The present invention relates to a technique for supporting the watching-over of a subject on a bed.
  • This technique can be applied to a system that automatically detects the getting-up and bed-leaving behavior of patients and care recipients in hospitals, nursing facilities, and the like, and provides necessary notification when a dangerous state occurs.
  • This system can be preferably used, for example, for watching over elderly people, patients with dementia, and children.
  • FIG. 1 is a block diagram schematically showing the hardware configuration and functional configuration of the watching support system 1, and FIG. 2 is a diagram showing an installation example of the imaging device.
  • The watching support system 1 includes an imaging device 10 and an information processing device 11 as its main hardware configuration.
  • The imaging device 10 and the information processing device 11 are connected by wire or wirelessly. Although only one imaging device 10 is shown in FIG. 1, a plurality of imaging devices 10 may be connected to the information processing device 11.
  • The imaging device 10 is a device for imaging the subject on the bed and capturing image data.
  • As the imaging device 10, a monochrome or color visible-light camera, an infrared camera, a three-dimensional camera, or the like can be used.
  • In the present embodiment, an imaging device 10 composed of an infrared LED illumination 100 and a near-infrared camera 101 is adopted so that the subject can be watched over even at night (even when the room is dark).
  • The imaging device 10 is installed so as to look over the entire bed 20 from the head side of the bed 20 toward the foot side.
  • The imaging device 10 captures images at a predetermined time interval (for example, 30 fps), and the image data are sequentially taken into the information processing device 11.
  • The information processing apparatus 11 has a function of analyzing, in real time, the image data taken in from the imaging apparatus 10, automatically detecting the getting-up and bed-leaving behavior of the subject 21 on the bed 20, and providing notification when necessary.
  • The information processing apparatus 11 includes, as its functional modules, an image acquisition unit 110, an area setting unit 111, a preprocessing unit 112, a regressor 113, a score stabilization unit 114, a determination unit 115, an output unit 116, and a storage unit 117.
  • The information processing apparatus 11 can be configured as a general-purpose computer including a CPU (processor), memory, storage (HDD, SSD, etc.), input devices (keyboard, mouse, touch panel, etc.), output devices (display, speaker, etc.), a communication interface, and the like.
  • Each of the above modules of the information processing apparatus 11 is realized by the CPU executing a program stored in the storage or the memory.
  • The configuration of the information processing apparatus 11 is not limited to this example. For example, distributed computing may be performed by a plurality of computers, some of the modules may be executed by a cloud server, or some of the modules may be implemented by circuits such as an ASIC or an FPGA.
  • The image acquisition unit 110 is a module for acquiring images captured by the imaging device 10.
  • The image data input from the image acquisition unit 110 are temporarily stored in memory or storage and used for the area setting process and the state monitoring process described later.
  • The area setting unit 111 is a module for setting a monitoring area on the image captured by the imaging device 10.
  • The monitoring area is the range within the field of view of the imaging device 10 that is subjected to the state monitoring process (in other words, the image range used as input to the regressor 113). Details of the area setting process will be described later.
  • The preprocessing unit 112 is a module for performing necessary preprocessing, in the state monitoring process, on the image input from the image acquisition unit 110 (hereinafter referred to as the "original image"). For example, the preprocessing unit 112 clips the image within the monitoring area from the original image (the clipped image is hereinafter referred to as the "monitoring area image"). In addition, the preprocessing unit 112 may apply processing such as resizing (reduction), affine transformation, and luminance correction to the monitoring area image. Resizing (reduction) has the effect of shortening the computation time of the regressor 113.
  • The affine transformation normalizes the input image to the regressor 113 by performing necessary distortion correction, for example deforming a bed that appears trapezoidal in the image into a rectangular shape, and can thus be expected to improve estimation accuracy.
  • The luminance correction can be expected to improve estimation accuracy by, for example, reducing the influence of the lighting environment.
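  • As a concrete illustration of the preprocessing described above, the following is a minimal Python/OpenCV sketch. The crop rectangle, the four corner points used for the trapezoid-to-rectangle correction, and the output size are hypothetical values for illustration, not parameters disclosed in this application; the distortion correction is written here as a perspective warp.

```python
import cv2
import numpy as np

def preprocess(original, area, corners, out_size=(128, 128)):
    """Clip the monitoring area, correct distortion, resize, and
    normalize luminance (sketch; all parameters are illustrative)."""
    x, y, w, h = area
    roi = original[y:y + h, x:x + w]            # clip monitoring-area image

    # Distortion correction: map the trapezoidal bed region to a
    # rectangle (corners are 4 points inside the ROI, top-left first).
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(np.float32(corners), dst)
    roi = cv2.warpPerspective(roi, M, (w, h))

    roi = cv2.resize(roi, out_size)             # shortens regressor compute time
    roi = cv2.equalizeHist(roi)                 # reduces influence of lighting
    return roi

# Usage with a hypothetical near-infrared (grayscale) frame:
frame = np.zeros((480, 640), dtype=np.uint8)
img = preprocess(frame, area=(100, 40, 440, 400),
                 corners=[(60, 0), (380, 0), (440, 400), (0, 400)])
```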
  • The regressor 113 is a module that, when given a monitoring area image, outputs a score indicating the state of the subject 21 shown in that image (for example, the sleeping state, the getting-up state, or the bed-leaving state).
  • The regressor 113 is a model that has machine-learned the relationship between features of the input image and the state of the person, so that it receives an image showing a bed and a person and outputs a score indicating the state of the person with respect to the bed. The training of the regressor 113 is performed in advance (before shipment or operation of the system) by the learning device 12 using a large number of training images.
  • As the regressor 113, any model such as a neural network, a random forest, or a support vector machine may be used.
  • In the present embodiment, a convolutional neural network (CNN), which is well suited to image recognition, is used.
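  • The application does not disclose a concrete network architecture, so the following PyTorch sketch is only one plausible instance of such a CNN regressor: a small convolutional network that maps a 128x128 single-channel (near-infrared) monitoring area image to a single continuous score. All layer sizes and the input resolution are assumptions.

```python
import torch
import torch.nn as nn

class BedStateRegressor(nn.Module):
    """CNN that maps a monitoring-area image to one continuous state
    score (sketch; the architecture is an illustrative assumption)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 128 -> 64
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 64 -> 32
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),            # single real-valued score
        )

    def forward(self, x):
        return self.head(self.features(x)).squeeze(1)

# One 1-channel 128x128 image in, one continuous score out:
score = BedStateRegressor()(torch.randn(1, 1, 128, 128))
```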
  • The score stabilization unit 114 is a module for suppressing rapid changes and flutter in the score output from the regressor 113.
  • In the present embodiment, the score stabilization unit 114 calculates the average of the current score, obtained from the image of the current frame, and the past scores, obtained from the images of the immediately preceding two frames, and outputs this average as a stabilized score. The process is equivalent to applying a temporal low-pass filter to the time-series data of the score.
  • The score stabilization unit 114 may be omitted.
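  • A minimal sketch of the stabilization step follows; the three-frame window (current frame plus the immediately preceding two frames) is taken from the description above, while the class and variable names are illustrative.

```python
from collections import deque

class ScoreStabilizer:
    """Temporal low-pass filter: mean of the current score and the
    scores from the immediately preceding two frames."""
    def __init__(self, window=3):
        self.history = deque(maxlen=window)

    def update(self, score: float) -> float:
        self.history.append(score)
        return sum(self.history) / len(self.history)

stab = ScoreStabilizer()
for raw in [0.0, 0.1, 1.4, 0.2]:       # 1.4 is a momentary outlier
    print(round(stab.update(raw), 2))  # 0.0, 0.05, 0.5, 0.57
```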
  • The determination unit 115 is a module for determining the behavior of the subject based on the score obtained by the regressor 113. Specifically, the determination unit 115 estimates what behavior the subject took (for example, getting-up behavior or bed-leaving behavior) based on the temporal change of the score (that is, the transition of the subject's state indicated by the score). Details of the processing of the determination unit 115 will be described later.
  • The output unit 116 is a module that provides the necessary notification based on the determination result of the determination unit 115. For example, the output unit 116 can switch whether notification is given (for example, notifying only in a dangerous state), the content of the notification (for example, the content of a message), the notification means (for example, voice, mail, buzzer, or warning light), the notification destination (for example, a nurse, doctor, or caregiver), the frequency of notification, and the like.
  • The storage unit 117 is a module for storing the various data used by the watching support system 1 for processing.
  • The storage unit 117 stores, for example, the setting information of the monitoring area, the parameters used in preprocessing, the parameters used in the score stabilization process, the time-series data of scores, and the parameters used in the determination process.
  • The setting of the monitoring area may be performed manually or automatically.
  • For manual setting, the area setting unit 111 may provide a user interface for allowing the user to input the area of the bed 20 in the image, or the monitoring area itself.
  • For automatic setting, the area setting unit 111 may detect the bed 20 from the image by object recognition processing and set the monitoring area so as to include the detected area of the bed 20.
  • The area setting process is performed, for example, when the monitoring area has not been set (such as at system installation), or when the monitoring area needs to be updated along with movement of the bed 20 or the imaging device 10.
  • FIG. 3A shows an example of the monitoring area set for the original image.
  • The monitoring area 30 is set by adding a margin of a predetermined width to the left side, the right side, and the upper side (foot side) of the area of the bed 20.
  • The width of each margin is set so that the whole body of a person rising on the bed 20 (see FIG. 3B) falls within the monitoring area 30.
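  • The following sketch illustrates how such a monitoring area could be derived from the bed region; the margin ratios are hypothetical and would in practice be chosen so that the whole body of a person rising on the bed falls within the area.

```python
def monitoring_area(bed, img_w, img_h, side_margin=0.25, foot_margin=0.30):
    """Expand the bed rectangle (x, y, w, h) by margins on the left,
    right, and upper (foot) side, clipped to the image bounds.
    Margin ratios are illustrative assumptions."""
    x, y, w, h = bed
    mx = int(w * side_margin)   # left/right margin
    my = int(h * foot_margin)   # upper (foot-side) margin
    x0 = max(0, x - mx)
    y0 = max(0, y - my)
    x1 = min(img_w, x + w + mx)
    return (x0, y0, x1 - x0, min(img_h, y + h) - y0)

print(monitoring_area((160, 80, 320, 360), 640, 480))  # (80, 0, 480, 440)
```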
  • In the present embodiment, the state of a person with respect to the bed is classified in advance into three types, 0 to 2.
  • Type 0 is a state in which the person is sleeping in the bed (referred to as the "sleeping state" or "state 0").
  • Type 1 is a state in which the person has risen on the bed (referred to as the "getting-up state" or "state 1").
  • Type 2 is a state in which the person is away from the bed, that is, has left it (referred to as the "bed-leaving state" or "state 2").
  • FIG. 4 shows an example of the correspondence between the three types and a time series of images representing a series of actions in which a sleeping person gets up and then leaves the bed.
  • FIG. 5 schematically shows the machine learning of the regressor 113.
  • First, images obtained by photographing actual patient rooms and the like are collected, and each image is classified into type 0 to type 2.
  • Next, the portion corresponding to the monitoring area is clipped from each image, and the type number (0, 1, or 2) is assigned as a label to generate a set of training images.
  • It is preferable to prepare a sufficient number of images and, for each type, images with various variations.
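  • To make the learning step concrete, the following is a self-contained PyTorch sketch of training a score regressor with a mean-squared-error loss against the type numbers 0, 1, and 2. The backbone (a torchvision ResNet-18 reduced to one output), the dummy data, and all hyperparameters are illustrative assumptions, not details from this application.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

# Regressor with a single continuous output (illustrative backbone choice).
model = resnet18(num_classes=1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Dummy stand-in for the training set: monitoring-area images labeled
# with their type number (0 = sleeping, 1 = got up, 2 = left the bed).
images = torch.randn(32, 3, 128, 128)
labels = torch.randint(0, 3, (32,)).float()

model.train()
for epoch in range(5):
    optimizer.zero_grad()
    scores = model(images).squeeze(1)   # continuous scores
    loss = loss_fn(scores, labels)      # regress toward 0 / 1 / 2
    loss.backward()
    optimizer.step()
```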
  • The specific layer structure of the neural network, the filters, the activation functions, the specifications of the input image, and the like may be designed appropriately according to the implementation and the required accuracy.
  • FIG. 6 schematically shows the ability of the regressor 113.
  • The regressor 113 models the correspondence between the "feature amount" of an image and the "score" indicating the state of the person.
  • Given an input image, the regressor 113 extracts its feature amount according to this relationship model, then calculates and outputs the score corresponding to that feature amount.
  • Although FIG. 6 shows the relationship model as a two-dimensional linear model for convenience of description, the actual feature space is multidimensional and the relationship model is nonlinear.
  • The score output from the regressor 113 is a real value (continuous value) in the range of 0 to 2.
  • When an input image clearly belongs to type 1, the output score is 1 or a value very close to 1.
  • Among input images there are also images whose type is ambiguous, such as a state in which the upper body is about to rise from the sleeping posture, or a state in which the person is about to stand up from the bed.
  • For such an image, the extracted feature amount lies between the feature amounts of two types, so an intermediate score between the scores of those two types is output.
  • For example, for a state in which the upper body is about to rise, a score larger than 0 and smaller than 1 is obtained, because it is an intermediate state between type 0 and type 1.
  • As described above, the regressor 113 is used to estimate the state of the person in the input image, so highly accurate state estimation can be performed even on unknown input images.
  • Compared with the conventional method based on head detection, the following advantages can be mentioned: a reasonable estimation result can be obtained even when an image of an intermediate state is input; and a reasonable estimation result can be obtained even when the subject is covered with a futon, when persons or objects other than the subject are present nearby, or when the lighting environment differs from usual, that is, even for images in which head detection is difficult.
  • In step S70, the image acquisition unit 110 captures an image of one frame from the imaging device 10.
  • The acquired original image is temporarily stored in the storage unit 117.
  • Next, the preprocessing unit 112 clips the monitoring area image from the original image and executes resizing, affine transformation, luminance correction, and the like as necessary (step S71).
  • The monitoring area image is then input to the regressor 113, which outputs the corresponding score (step S72).
  • The score stabilization unit 114 performs the stabilization process on the score obtained in step S72 (step S73) and delivers the resulting score to the determination unit 115.
  • Next, the determination unit 115 classifies the current state of the subject 21 into one of type 0 (state 0), type 1 (state 1), and type 2 (state 2).
  • The classification method is not limited, but in the present embodiment the subject is classified as state 0 when score < threshold th1 (steps S74 and S75), as state 1 when threshold th1 ≤ score < threshold th2 (steps S76 and S77), and as state 2 when threshold th2 ≤ score (step S78).
  • The detection sensitivity can be adjusted by changing the thresholds th1 and th2.
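  • Transcribed directly into Python, the classification rule above might look as follows; the concrete threshold values th1 = 0.5 and th2 = 1.5 are hypothetical, placed midway between the type scores purely for illustration.

```python
def classify_state(score: float, th1: float = 0.5, th2: float = 1.5) -> int:
    """Map a continuous score in [0, 2] to a discrete state
    (thresholds are illustrative; tuning them adjusts sensitivity)."""
    if score < th1:
        return 0   # sleeping state
    elif score < th2:
        return 1   # getting-up state
    else:
        return 2   # bed-leaving state

assert classify_state(0.1) == 0
assert classify_state(0.8) == 1
assert classify_state(1.9) == 2
```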
  • Then, the determination unit 115 determines the behavior of the subject 21 based on the temporal change of the state of the subject 21 (step S79). If it is determined in step S79 that the subject 21 has "got up" or "left the bed", the output unit 116 provides the necessary notification (step S80). The above steps S70 to S80 are executed for each frame until the system is terminated (step S81).
  • FIG. 8 is a detailed flow of the action determination process in step S79, and FIG. 9 is an example of the determination logic, showing the correspondence between state transitions, determination results, and notification settings.
  • The wake-up preparation flag is a flag for confirming a transition from state 0, and the bed-leaving preparation flag is a flag for confirming a transition from state 1. Both flags are initialized to "OFF" when the state monitoring process is started.
  • If the current state is "0" (step S790: YES), the determination unit 115 sets the wake-up preparation flag and the bed-leaving preparation flag to "ON" (step S791).
  • If the current state is "1" (step S792: YES), the determination unit 115 checks whether the wake-up preparation flag is "ON" (step S793). If it is "ON", it is determined that the subject 21 has got up (step S794), the wake-up preparation flag is set to "OFF", and the bed-leaving preparation flag is set to "ON" (step S795).
  • If the current state is "2" (step S796: YES), the determination unit 115 checks whether the bed-leaving preparation flag is "ON" (step S797). If it is "ON", it is determined that the subject 21 has left the bed (step S798), and the wake-up preparation flag and the bed-leaving preparation flag are set to "OFF" (step S799).
  • That is, the determination unit 115 determines "getting up" when the state transitions from state 0 to state 1, and determines "leaving the bed" when the state transitions from state 0 or state 1 to state 2. Otherwise, the determination unit 115 outputs no determination result, or outputs a result of "no abnormality".
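  • The flag logic of FIG. 8 can be expressed compactly as the following state machine, reconstructed from steps S790 to S799 above; the class and method names are chosen for readability and are not taken from the application.

```python
class ActionDeterminer:
    """Determination logic of FIG. 8: getting up is reported on a
    0 -> 1 transition, leaving bed on a 0/1 -> 2 transition."""
    def __init__(self):
        self.wakeup_ready = False       # set while sleeping (state 0)
        self.leave_ready = False        # set while sleeping or after wake-up

    def update(self, state: int) -> str | None:
        if state == 0:                                  # S790/S791
            self.wakeup_ready = self.leave_ready = True
        elif state == 1 and self.wakeup_ready:          # S792-S795
            self.wakeup_ready, self.leave_ready = False, True
            return "got up"
        elif state == 2 and self.leave_ready:           # S796-S799
            self.wakeup_ready = self.leave_ready = False
            return "left bed"
        return None                                     # no abnormality

det = ActionDeterminer()
print([det.update(s) for s in [0, 0, 1, 1, 2, 2]])
# [None, None, 'got up', None, 'left bed', None]
```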
  • As described above, according to the present embodiment, the state of the subject 21 is estimated by the regressor 113, so the state and behavior of the subject 21 can be determined with higher accuracy than with the conventional method.
  • In addition, since the getting-up and bed-leaving behavior of the subject 21 is determined based on the temporal change (state transition) of the subject's state rather than on a momentary state, the accuracy of the behavior determination can be improved.
  • Furthermore, once a getting-up or bed-leaving determination has been made, the wake-up preparation flag and the bed-leaving preparation flag are set to "OFF", so the getting-up determination and the bed-leaving determination do not operate again unless the subject 21 returns to the sleeping state. False notifications of getting up and of leaving the bed can therefore be reduced, and the reliability of the system improved.
  • In the above embodiment, getting up is determined at the moment the state transitions from state 0 to state 1; instead, getting up may be determined when state 1 continues for a predetermined time (for example, 1 second) after the transition from state 0 to state 1. This further enhances the reliability of the getting-up determination.
  • Similarly, leaving the bed may be determined when state 2 continues for a predetermined time (for example, 1 second) after the transition from state 0 or state 1 to state 2. This further enhances the reliability of the bed-leaving determination.
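  • One way to add this duration check to the flag logic sketched above is to count consecutive frames in the candidate state before confirming the determination, as below; the 30-frame hold time (about 1 second at the 30 fps mentioned earlier) and all names are illustrative assumptions.

```python
class DebouncedDeterminer:
    """Confirm a candidate action only after the new state has persisted
    for `hold_frames` consecutive frames (about 1 s at 30 fps)."""
    def __init__(self, hold_frames: int = 30):
        self.hold_frames = hold_frames
        self.candidate: str | None = None
        self.count = 0

    def update(self, state: int) -> str | None:
        if state == 0:                       # back in bed: reset
            self.candidate, self.count = None, 0
        elif state == 1 and self.candidate is None:
            self.candidate, self.count = "got up", 0
        elif state == 2 and self.candidate != "left bed":
            self.candidate, self.count = "left bed", 0

        if self.candidate is not None:
            self.count += 1
            if self.count == self.hold_frames:
                return self.candidate        # confirmed after hold time
        return None

det = DebouncedDeterminer(hold_frames=3)
print([det.update(s) for s in [1, 1, 1, 1, 2, 2, 2]])
# [None, None, 'got up', None, None, None, 'left bed']
```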
  • In the embodiment described above, the sleeping state, getting-up state, and bed-leaving state are estimated from the image, and the getting-up behavior and bed-leaving behavior of the subject are detected.
  • However, the states to be estimated and the behaviors to be detected are not limited to these. Various "human states" and "behaviors" can be handled as long as they show different features in the image; for example, it is also possible to detect behaviors such as eating and reading.
  • 1: watching support system, 10: imaging device, 11: information processing device, 12: learning device, 100: illumination, 101: near-infrared camera, 110: image acquisition unit, 111: area setting unit, 112: preprocessing unit, 113: regressor, 114: score stabilization unit, 115: determination unit, 116: output unit, 117: storage unit, 20: bed, 21: subject, 30: monitoring area

Landscapes

  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Emergency Management (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Alarm Systems (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Emergency Alarm Devices (AREA)
  • Invalid Beds And Related Equipment (AREA)

Abstract

A watching support system for supporting the watching-over of a subject on a bed, the system comprising: a regressor into which an image showing a bed and a person is input and which is machine-learned so as to output a score indicating the state of the person with respect to the bed; an image acquisition unit for acquiring an image from an imaging device installed so as to capture an image of a monitoring area including the bed of the subject; a determination unit for determining a behavior of the subject based on a score obtained by inputting an image of the monitoring area acquired by the image acquisition unit into the regressor; and an output unit for providing notification based on the determination result of the determination unit.
PCT/JP2018/021983 2017-06-23 2018-06-08 Monitoring support system, control method therefor, and program WO2018235628A1 (fr)

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
JP2017123009A (JP6822326B2) | 2017-06-23 | 2017-06-23 | 見守り支援システム及びその制御方法 (Monitoring support system and control method therefor)
JP2017-123009 | 2017-06-23

Publications (1)

Publication Number | Publication Date
WO2018235628A1 | 2018-12-27

Family

ID=64735938

Family Applications (1)

Application Number | Priority Date | Filing Date | Title
PCT/JP2018/021983 (WO2018235628A1) | 2017-06-23 | 2018-06-08 | Monitoring support system, control method therefor, and program

Country Status (2)

Country | Document
JP | JP6822326B2
WO | WO2018235628A1


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7457625B2 (ja) 2020-10-07 2024-03-28 パラマウントベッド株式会社 ベッドシステム (Bed system)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007213528A * 2006-02-13 2007-08-23 Sanyo Electric Co Ltd 行動認識システム (Behavior recognition system)
JP2007241477A * 2006-03-06 2007-09-20 Fuji Xerox Co Ltd 画像処理装置 (Image processing apparatus)
JP2015138460A * 2014-01-23 2015-07-30 富士通株式会社 状態認識方法及び状態認識装置 (State recognition method and state recognition apparatus)
JP2016157219A * 2015-02-24 2016-09-01 株式会社日立製作所 画像処理方法、画像処理装置 (Image processing method and image processing apparatus)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2584644A (en) * 2019-06-05 2020-12-16 Peak Medtek Ltd A fall prevention device
US11341832B2 (en) 2019-06-05 2022-05-24 Peak Medtek Limited Fall prevention device
GB2584644B (en) * 2019-06-05 2024-01-31 Peak Medtek Ltd A fall prevention device

Also Published As

Publication number Publication date
JP2019008515A (ja) 2019-01-17
JP6822326B2 (ja) 2021-01-27

Similar Documents

Publication Publication Date Title
JP6137425B2 (ja) 画像処理システム、画像処理装置、画像処理方法、および画像処理プログラム (Image processing system, image processing apparatus, image processing method, and image processing program)
JP6167563B2 (ja) 情報処理装置、情報処理方法、及び、プログラム (Information processing apparatus, information processing method, and program)
US10786183B2 Monitoring assistance system, control method thereof, and program
JP6822328B2 (ja) 見守り支援システム及びその制御方法 (Monitoring support system and control method therefor)
JP6915421B2 (ja) 見守り支援システム及びその制御方法 (Monitoring support system and control method therefor)
WO2015118953A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme (Information processing device, information processing method, and program)
US11116423B2 Patient monitoring system and method
JP6780641B2 (ja) 画像解析装置、画像解析方法、及び、画像解析プログラム (Image analysis apparatus, image analysis method, and image analysis program)
JP6952299B2 (ja) 睡眠深度判定システム、睡眠深度判定装置及び睡眠深度判定方法 (Sleep depth determination system, sleep depth determination apparatus, and sleep depth determination method)
WO2018235628A1 (fr) Système d'aide à la surveillance, procédé de commande associé et programme (Monitoring support system, control method therefor, and program)
JP6729510B2 (ja) 見守り支援システム及びその制御方法 (Monitoring support system and control method therefor)
JP6870514B2 (ja) 見守り支援システム及びその制御方法 (Monitoring support system and control method therefor)
US10762761B2 Monitoring assistance system, control method thereof, and program
WO2016185738A1 (fr) Dispositif d'analyse d'image, procédé d'analyse d'image, et programme d'analyse d'image (Image analysis device, image analysis method, and image analysis program)
JP6729512B2 (ja) 見守り支援システム及びその制御方法 (Monitoring support system and control method therefor)
US20220054046A1 Assessing patient out-of-bed and out-of-chair activities using embedded infrared thermal cameras
JP6635074B2 (ja) 見守り支援システム及びその制御方法 (Monitoring support system and control method therefor)
TWI697869B 姿態判斷方法、電子系統以及非暫態電腦可讀取記錄媒體 (Posture determination method, electronic system, and non-transitory computer-readable recording medium)
JP6292012B2 (ja) 見守り装置、見守り方法及びプログラム (Watching apparatus, watching method, and program)
JP2023051147A (ja) ナースコールシステム、および状態判断システム (Nurse call system and state determination system)
Salmi Improving safety for home care patients with a low cost computer vision solution
EP2939595A1 Method and system for determining a movement of an object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18821409; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18821409; Country of ref document: EP; Kind code of ref document: A1)