CN116997942A - Nursing system - Google Patents

Nursing system

Info

Publication number
CN116997942A
Authority
CN
China
Prior art keywords
person
bone model
unit
bone
contact
Prior art date
Legal status
Pending
Application number
CN202280020914.7A
Other languages
Chinese (zh)
Inventor
伊藤雅宽
石井宏二
Current Assignee
Yazaki Corp
Original Assignee
Yazaki Corp
Priority date
Filing date
Publication date
Application filed by Yazaki Corp
Publication of CN116997942A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium
    • G08B25/04 Alarm systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Emergency Alarm Devices (AREA)
  • Image Analysis (AREA)

Abstract

The nursing system (1) comprises: an imaging unit (11) that captures an image of a space to be monitored; a bone model generation unit (23b) that generates a bone model representing a person included in the image captured by the imaging unit (11); a position detector (12) capable of detecting the position of a person corresponding to the bone model in the imaging depth direction of the imaging unit (11); and a determination unit (23c) that determines contact between a first person and a second person based on whether or not a first bone model representing the first person and a second bone model representing the second person overlap, the position of the first person in the imaging depth direction detected by the position detector (12), and the position of the second person in the imaging depth direction detected by the position detector (12).

Description

Nursing system
Technical Field
The present invention relates to a nursing system.
Background
For example, Patent Document 1 discloses a posture detection device for a person to be monitored, which includes an imaging device, a bone information extraction unit, a bone information sample storage unit, a posture detection unit, and a posture determination unit. The imaging device acquires image data of an imaging area in which the person to be monitored is watched over. The bone information extraction unit extracts bone information of the person from the image data captured by the imaging device. The bone information sample storage unit stores posture information samples composed of bone information. The posture detection unit detects the posture of the monitored person based on the bone information of the person extracted by the bone information extraction unit and the bone information samples stored in the bone information sample storage unit. The posture determination unit determines whether the person has fallen over or fallen down based on the posture detected by the posture detection unit.
Prior art literature
Patent literature
Patent Document 1: Japanese Patent Laid-Open No. 2020-34960
Disclosure of Invention
Technical problem to be solved by the invention
However, the above-described posture detection device has room for further improvement, for example, in grasping whether or not persons are in contact with each other.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a nursing system capable of appropriately grasping a contact state of a person.
Means for solving the problems
In order to achieve the above object, a nursing system of the present invention includes: an imaging unit that captures an image of a space to be monitored; a bone model generation unit that generates a bone model representing a person included in the image captured by the imaging unit; a position detector configured to detect a position of a person corresponding to the bone model in an imaging depth direction of the imaging unit; and a determination unit configured to determine contact between a first person and a second person based on whether or not a first bone model representing the first person and a second bone model representing the second person overlap each other, a position of the first person in the imaging depth direction detected by the position detector, and a position of the second person in the imaging depth direction detected by the position detector.
Effects of the invention
The nursing system according to the present invention has an effect of being able to properly grasp the contact state of a person.
Drawings
Fig. 1 is a block diagram showing a schematic configuration of a nursing system according to an embodiment.
Fig. 2 is a schematic diagram showing an example of mounting of the nursing system according to the embodiment.
Fig. 3 is a schematic diagram illustrating an example of a state determination based on a bone model in the nursing system according to the embodiment.
Fig. 4 is a schematic diagram illustrating an example of determination in the nursing system according to the embodiment.
Fig. 5 is a schematic diagram illustrating an example of determination in the nursing system according to the embodiment.
Fig. 6 is a schematic diagram illustrating an example of determination in the nursing system according to the embodiment.
Fig. 7 is a schematic diagram illustrating an example of determination in the nursing system according to the embodiment.
Fig. 8 is a flowchart illustrating an example of processing in the nursing system according to the embodiment.
Description of the reference numerals
1 Nursing system
1A Detection system
10 Setting device
11 Imaging unit
12 Position detector
20 Detection device
21, 31 Interface unit
22, 32 Storage unit
23, 33 Processing unit
23a Information processing unit
23b Bone model generation unit
23c Determination unit
23d Operation processing unit
30 Terminal device
BB Bounding box
I Image
MDL, MDL1, MDL2 Bone model
N Network
P, P1, P2 Person
SP Monitored space
X Imaging depth direction
Y Imaging width direction
Z Imaging up-down direction
Detailed Description
Embodiments according to the present invention will be described in detail below with reference to the drawings. The present invention is not limited to this embodiment. The constituent elements in the following embodiments include elements that can be easily replaced by those skilled in the art, or substantially the same elements.
Embodiment(s)
The nursing system 1 of the present embodiment shown in fig. 1 and 2 is a system for monitoring and watching over the state of persons P present in the monitored space SP. The nursing system 1 of the present embodiment is applied to, for example, care facilities such as day-care (day service) centers and welfare facilities such as facilities for the elderly. The monitored space SP is, for example, a living room space, a corridor space, or the like of the facility.
The nursing system 1 of the present embodiment includes a setting device 10, a detecting device 20, and a terminal device 30, which constitute a cooperative system that mutually transmits and receives information and cooperates. Here, the setting apparatus 10 and the detection apparatus 20 constitute a detection system 1A, and the detection system 1A detects contact (interference) of the persons P with each other in the monitored object space SP, and records the situation at the time of contact. The nursing system 1 of the present embodiment is configured to determine the state of the person P existing in the space to be monitored SP based on the bone model MDL (see fig. 3) representing the person P in the detection system 1A, and to appropriately grasp the situation when the persons P come into contact with each other. Hereinafter, each configuration of the nursing system 1 will be described in detail with reference to the drawings.
In the nursing system 1 illustrated in fig. 1, the connection between the respective components for transmitting and receiving power supply, control signals, various information, and the like may be either wired connection or wireless connection unless otherwise specified. The wire-based connection is, for example, a connection via a wiring material such as an electric wire or an optical fiber. The wireless-based connection is, for example, a connection based on wireless communication, contactless power supply, or the like.
< basic Structure of setting device >
The setting device 10 is a device that is installed in the monitored space SP and images the monitored space SP. The setting device 10 includes an imaging unit 11, a position detector 12, a display 13, a speaker 14, and a microphone 15. The setting device 10 is unitized, for example, by assembling these components in a housing or the like, and is installed on the ceiling or the like of the monitored space SP to constitute an indoor monitoring module in which various functions are integrated. Alternatively, these constituent elements of the setting device 10 may be installed separately in the monitored space SP. A plurality of setting devices 10 are provided, for example, in a facility to which the nursing system 1 is applied.
The imaging unit 11 captures an image I of the monitored space SP (see fig. 3, for example). The imaging unit 11 may be, for example, a monocular camera capable of capturing two-dimensional images, or a stereo camera capable of capturing three-dimensional images. The imaging unit 11 may also be a so-called TOF (Time of Flight) camera or the like. The imaging unit 11 is typically provided at a position from which all the persons P present in the monitored space SP can be imaged. The imaging unit 11 is disposed above the monitored space SP, in this case on the ceiling, with its angle of view set so that the imaging range covers the entire monitored space SP. When one imaging unit 11 cannot cover the entire monitored space SP, a plurality of imaging units 11 may be provided so that together they cover the entire monitored space SP.
In the following description, as shown in fig. 3, 4, 5, and 6, which will be described later, the imaging depth direction of the imaging unit 11 may be referred to as an "imaging depth direction X", a direction intersecting the imaging depth direction X and extending in the horizontal direction may be referred to as an "imaging width direction Y", and a direction intersecting the imaging depth direction X and extending in the vertical direction may be referred to as an "imaging vertical direction Z". The imaging depth direction X typically corresponds to a direction along the optical axis direction of the imaging unit 11.
The position detector 12 is a detector capable of detecting the position of the person P in the imaging depth direction X of the imaging unit 11. The position detector 12 detects the position, in the imaging depth direction X, of the person P imaged by the imaging unit 11. This position typically corresponds to the position in the imaging depth direction X of the person P represented by the bone model MDL generated by the detection device 20 as described later. The position detector 12 can be, for example, any of various kinds of radar, sonar, LiDAR (Light Detection and Ranging), or the like that measure distance using laser light, infrared rays, millimeter waves, ultrasonic waves, or the like. The position detector 12 can detect the position of the person P in the imaging depth direction X by, for example, measuring the distance between the person P and the position detector 12 along the imaging depth direction X.
The display 13 displays (outputs) image information (visual information) toward the monitored space SP. The display 13 is constituted by, for example, a thin liquid crystal display, a plasma display, an organic EL display, or the like. The display 13 displays image information at a position visible to the persons P in the monitored space SP. The display 13 provides various guidance (announcements) by outputting image information, for example.
The speaker 14 outputs sound information (auditory information) to the monitored space SP. The speaker 14 makes various announcements (broadcasts) by outputting sound information, for example.
The microphone 15 is a sound receiving device that converts sound generated in the monitored space SP into an electric signal. The microphone 15 can be used for exchanging sounds with a person (for example, a staff member of a facility or the like) outside the space SP to be monitored, for example.
< basic Structure of detection device >
The detection device 20 is a device that detects the state of the person P present in the monitored space SP based on the bone model MDL generated from the image I captured by the setting device 10. The detection device 20 includes an interface unit 21, a storage unit 22, and a processing unit 23, which are communicably connected to each other. The detection device 20 may be configured as a so-called cloud-service type device (cloud server) installed on a network, or as a so-called stand-alone device separated from the network. The detection device 20 can be configured by, for example, installing applications for realizing various processes on any of various computer devices such as a personal computer, a workstation, or a tablet terminal. The detection device 20 is provided in, for example, a management center of the facility to which the nursing system 1 is applied, but is not limited thereto.
The interface unit 21 is an interface for transmitting and receiving various information to and from devices other than the detection device 20. The interface unit 21 has, for example, a function of performing wired communication of information with other units via electric wires or the like, and a function of performing wireless communication of information with other units via a wireless communication means or the like. As devices other than the detection device 20, the interface unit 21 transmits and receives information to and from the plurality of setting devices 10 and the plurality of terminal devices 30. Here, the interface unit 21 is illustrated as being directly communicably connected to the plurality of setting devices 10 and communicably connected to the plurality of terminal devices 30 via the communication unit 21a and the network N, but is not limited thereto; the plurality of setting devices 10 may also be connected to the interface unit 21 via the communication unit 21a and the network N. The communication unit 21a is a communication module (data communication module) communicably connected to the network N. The network N may be wired or wireless, and any communication network can be used.
The storage unit 22 is a storage circuit that stores various information. The storage unit 22 is, for example, a relatively large-capacity storage device such as a hard disk, an SSD (Solid State Drive), or an optical disc, or a rewritable semiconductor memory such as a RAM, a flash memory, or an NVSRAM (Non-Volatile Static Random Access Memory). The storage unit 22 stores, for example, programs for causing the detection device 20 to realize various functions. The programs stored in the detection device 20 include a program for causing the interface unit 21 to function, a program for causing the communication unit 21a to function, a program for causing the processing unit 23 to function, and the like. The storage unit 22 also stores, for example, learned mathematical models and the like used for determining the state of the person P in the monitored space SP. The storage unit 22 stores various data required for the processing of the processing unit 23, and these data are read out by the processing unit 23 or the like as necessary. The storage unit 22 may also be realized by a cloud server or the like connected to the detection device 20 via the network N.
The processing unit 23 is a processing circuit that realizes various processing functions in the detection device 20. The processing unit 23 is realized by, for example, a processor. The processor is, for example, a circuit such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array). The processing unit 23 realizes each processing function by, for example, executing a program read from the storage unit 22. For example, the processing unit 23 can perform processing of inputting image data or the like representing the image I captured by the imaging unit 11 of the setting device 10 into the detection device 20 via the interface unit 21.
< basic Structure of terminal device >
The terminal device 30 is a device communicably connected to the detection device 20. The terminal device 30 includes an interface unit 31, a storage unit 32, a processing unit 33, a display 34, a speaker 35, and a microphone 36, which are communicably connected to each other. The terminal device 30 can be configured by, for example, installing applications for realizing various processes on various computer devices such as a personal computer, a workstation, and a tablet terminal. The terminal device 30 may be configured as a portable terminal device that can be carried by, for example, a staff member of the facility to which the nursing system 1 is applied, or may be configured as a fixed management terminal device.
The interface unit 31, the storage unit 32, the processing unit 33, the display 34, the speaker 35, and the microphone 36 have substantially the same configurations as the interface unit 21, the storage unit 22, the processing unit 23, the display 13, the speaker 14, and the microphone 15 described above. The interface unit 31 is an interface for transmitting and receiving various information to and from devices other than the terminal device 30. The interface unit 31 is communicably connected to the detection device 20 via the communication unit 31a and the network N. The communication unit 31a is a communication module similar to the communication unit 21a. The storage unit 32 stores, for example, programs for causing the terminal device 30 to realize various functions. The processing unit 33 is a processing circuit that realizes various processing functions in the terminal device 30. The processing unit 33 realizes each processing function by, for example, executing a program read from the storage unit 32. The display 34 displays image information. The speaker 35 outputs sound information. The microphone 36 is a sound receiving device that converts sound into an electric signal.
The outline of the overall configuration of the nursing system 1 according to the present embodiment has been described above.
< processing function of processing section of detection device >
With such a configuration, as shown in fig. 3 to 7, the processing unit 23 according to the present embodiment has a function of performing various processes, such as determining the state of the person P present in the monitored space SP based on the bone model MDL representing the person P and appropriately grasping and recording the situation at the time of contact.
Specifically, in order to realize the various processing functions described above, the processing unit 23 of the present embodiment is functionally and conceptually configured to include: an information processing unit 23a, a bone model generating unit 23b, a determining unit 23c, and an operation processing unit 23d. The processing unit 23 realizes the processing functions of the information processing unit 23a, the bone model generating unit 23b, the determining unit 23c, and the operation processing unit 23d by, for example, executing a program read from the storage unit 22.
The information processing unit 23a is a unit having a function capable of executing processing related to various information used in the care system 1. The information processing unit 23a can perform processing for transmitting and receiving various information to and from the setting device 10 and the terminal device 30. The care system 1 can exchange information (e.g., audio information, image information, etc.) with the setting device 10 and the terminal device 30 by the processing of the information processing unit 23 a. Here, the information processing unit 23a can execute processing of acquiring image data representing the image I of the space SP to be monitored captured by the image capturing unit 11 from the setting apparatus 10 and temporarily storing the image data in the storage unit 22.
The bone model generating unit 23b is a unit having a function capable of executing a process of generating a bone model MDL (see fig. 3) indicating the person P included in the image I of the space SP to be monitored captured by the imaging unit 11. The bone model MDL is a human body model representing human bones including the head, eyes, nose, mouth, shoulders, waist, legs, knees, elbows, hands, joints, and the like of the person P in three dimensions.
The bone model generating unit 23b can generate the bone model MDL representing the person P included in the image I by, for example, top-down skeleton estimation, in which the person P is detected first and the bones of that person P are then estimated. In this case, the bone model generating unit 23b recognizes the person P in the image I using any of various known object recognition techniques, and executes a process of surrounding the area in which the recognized person P exists in the image I with a bounding box BB. Here, the bounding box BB is a rectangular box enclosing, with a desired size, the person P recognized in the image I. Then, the bone model generating unit 23b detects the three-dimensional position coordinates of each bone part (human body part) such as the head, eyes, nose, mouth, shoulders, waist, legs, knees, elbows, hands, and joints within the bounding box BB, and combines them to generate the bone model MDL of the person P. The bone model MDL illustrated in fig. 3 is generated by symbolically representing each bone part of the human body, such as the head, eyes, nose, mouth, shoulders, waist, legs, knees, elbows, hands, and joints of the person P, with "points" and connecting them with "lines". When a plurality of persons P are included in the image I, the bone model generating unit 23b generates a plurality of bone models MDL according to the number of persons P. The bone model generating unit 23b stores the generated bone models MDL in the storage unit 22.
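As a concrete illustration of the top-down flow just described, the following Python sketch shows how a bone model MDL could be assembled from an image I. It is only an illustrative outline under assumptions: the person detector and the per-box keypoint estimator are passed in as placeholder callables (detect_persons, estimate_keypoints), since the patent does not tie the bone model generating unit 23b to any particular recognition library.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

Point3D = Tuple[float, float, float]  # (X: depth, Y: width, Z: up-down) coordinates

@dataclass
class BoneModel:
    """A bone model MDL: each bone part (head, shoulder, elbow, ...) mapped to 3-D coordinates."""
    person_id: str                 # identification information of the recognized person P
    keypoints: Dict[str, Point3D]  # bone part name -> three-dimensional position

def generate_bone_models_top_down(
    image,
    detect_persons: Callable,      # placeholder: returns [(person_id, bounding_box), ...]
    estimate_keypoints: Callable,  # placeholder: returns {bone_part: (x, y, z)} inside one box
) -> List[BoneModel]:
    """Top-down estimation: recognize each person P first, surround it with a bounding
    box BB, then detect the bone parts inside that box and combine them into one MDL."""
    models = []
    for person_id, box in detect_persons(image):
        models.append(BoneModel(person_id, estimate_keypoints(image, box)))
    return models
```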
Alternatively, the bone model generating unit 23b may generate the bone model representing the person P included in the image I by bottom-up skeleton estimation, in which all bone parts of the persons P in the image I are detected first, without using the bounding box BB or the like. In this case, the bone model generating unit 23b first detects the three-dimensional position coordinates of all bone parts in the image I, such as the head, eyes, nose, mouth, shoulders, waist, legs, knees, elbows, hands, and joints, using any of various known object recognition techniques. The bone model generating unit 23b then matches and splices the detected bone parts for each person P to generate a bone model MDL for each person P.
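The bottom-up variant can be sketched the same way. The grouping rule below (attach each detected bone part to the nearest partially built skeleton) is a deliberately crude stand-in for the matching/splicing step; real bottom-up estimators use far more elaborate association schemes, and the linking radius is an assumed value, not taken from the patent.

```python
import math
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]

def group_keypoints_bottom_up(
    detections: List[Tuple[str, Point3D]],   # all detected (bone part, position) pairs in image I
    max_link_distance: float = 0.8,          # assumed grouping radius in metres
) -> List[Dict[str, Point3D]]:
    """Stitch individually detected bone parts into one skeleton per person P."""
    skeletons: List[Dict[str, Point3D]] = []
    for part, xyz in detections:
        best, best_dist = None, max_link_distance
        for skeleton in skeletons:
            if part in skeleton:
                continue                      # a skeleton holds at most one of each bone part
            dist = min(math.dist(xyz, other) for other in skeleton.values())
            if dist < best_dist:
                best, best_dist = skeleton, dist
        if best is None:
            skeletons.append({part: xyz})     # nothing close enough: start a new skeleton
        else:
            best[part] = xyz
    return skeletons
```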
As an object recognition technique for recognizing the person P in the image I, the nursing system 1 can use various machine-learning-based object recognition techniques. In this case, the nursing system 1 learns the person P in advance by various kinds of machine learning using, for example, images I including the person P as learning data. The nursing system 1 can also register the users of the facility in advance and perform learning in advance so that each user can be identified as an individual in the image I. For example, the nursing system 1 prepares a training data set using previously collected data on "images I including a person P who is a user of the facility" as explanatory variables and data on "identification information (for example, a user ID) of the person P corresponding to each image" as target variables, and performs machine learning using this training data set. As the machine learning, for example, logistic regression, support vector machines, neural networks, random forests, and the like can be used, and various types of algorithms can be applied in the present embodiment. The nursing system 1 stores the learned mathematical model for object recognition (person recognition) obtained by this machine learning in the storage unit 22 in advance.
Then, by classification/regression based on the learned mathematical model for object recognition (person recognition) stored in the storage unit 22 as described above, the bone model generating unit 23b identifies the individual person P from the identification information and generates the bone model MDL representing that person P in the image I. More specifically, the bone model generating unit 23b inputs the image I captured by the imaging unit 11 into the mathematical model for object recognition. As a result, the bone model generating unit 23b recognizes the person P in the image I, acquires the identification information specifying that person P, and generates the bone model MDL representing that person P. In this way, the bone model generating unit 23b can generate the bone model MDL of a person P whose identity is specified by the identification information. The bone model generating unit 23b stores the generated bone model MDL in the storage unit 22 together with the identification information of the identified person P.
After the bone model MDL is generated, the bone model generating unit 23b may perform processing of deleting the image I used for generating the bone model MDL, so that the image data representing the image I is not retained in the storage unit 22, the storage unit 32, the temporary storage of the setting device 10, or the like. In this case, after the bone model MDL is generated, the nursing system 1 performs the subsequent processes using the bone model MDL generated from the image, without using the image I itself (see fig. 4, fig. 5, and the like). Thus, the nursing system 1 can perform various kinds of monitoring while ensuring the privacy of the facility users, without using images showing the faces of the facility users or the like.
The determination unit 23c is a part having a function capable of executing a process of determining the state of the person P corresponding to the bone model MDL based on the bone model MDL generated by the bone model generation unit 23 b. The determination unit 23c determines a standing state, a sitting state, a falling state, or the like as the state of the person P corresponding to the bone model MDL generated by the bone model generation unit 23 b. When the person P is included in the image I of the space SP to be monitored captured by the imaging unit 11, the determination unit 23c determines the state of the person P.
The nursing system 1 learns the states of the person P in advance by various kinds of machine learning using, for example, the relative positional relationships and relative distances of the bone parts in the bone model MDL, the size of the bounding box BB, and the like as parameters. For example, the nursing system 1 performs machine learning using previously collected data on the "relative positional relationships, relative distances, and bounding box BB size" of the bone parts in the bone model MDL as explanatory variables and data on the "state of the person P" as target variables. As the machine learning, for example, logistic regression, support vector machines, neural networks, random forests, and the like can be used, as described above, and various types of algorithms can be applied in the present embodiment. The nursing system 1 stores the learned mathematical model for state determination obtained by this machine learning in the storage unit 22 in advance.
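A minimal training sketch for such a state-determination model is shown below, assuming scikit-learn. The synthetic feature rows and the three state labels are stand-ins for the parameters named above (relative positional relationships, relative distances, bounding-box size) and for real annotated data; the random-forest choice is just one of the algorithms this paragraph lists.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Stand-in feature rows: each row would hold relative offsets/distances between bone parts
# plus the bounding box BB size; stand-in labels: the state of the person P.
X = rng.normal(size=(300, 20))
y = rng.choice(["standing", "sitting", "fallen"], size=300)

state_model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```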
Then, the determination unit 23c determines the state of the person corresponding to the bone model MDL by classification/regression based on the learned state determination mathematical model or the like stored in the storage unit 22 as described above. More specifically, the determination unit 23c inputs the relative positional relationship, the relative distance, and the size of the bounding box BB of each bone site obtained from the bone model MDL of the person P included in the actually captured image I into the mathematical model for state determination. Thus, the determination unit 23c determines the state of the person P (standing state, sitting state, falling state, etc.) corresponding to the bone model MDL.
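The inference side could then look like the following sketch: the bone model is flattened into the same kind of feature vector (relative positions, relative distances, bounding-box size) and fed to the learned classifier. The feature layout is an assumption made for illustration and must match whatever layout the model was trained on.

```python
import numpy as np

def bone_model_features(keypoints, bbox_width, bbox_height):
    """Flatten a bone model MDL into state-determination parameters:
    pairwise relative positions and distances of bone parts, plus bounding box BB size."""
    parts = sorted(keypoints)
    feats = []
    for i, a in enumerate(parts):
        for b in parts[i + 1:]:
            dx, dy, dz = (keypoints[b][k] - keypoints[a][k] for k in range(3))
            feats += [dx, dy, dz, (dx * dx + dy * dy + dz * dz) ** 0.5]
    feats += [bbox_width, bbox_height]
    return np.asarray(feats)

# state = state_model.predict(bone_model_features(mdl.keypoints, w, h).reshape(1, -1))[0]
# (assumes a classifier trained on feature vectors with exactly this layout)
```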
As described above, the determination unit 23c of the present embodiment determines the state of the person P corresponding to the bone model MDL based on the bone model MDL generated by the bone model generation unit 23b without using the image I for generating the bone model MDL.
The determination unit 23c of the present embodiment can also execute the following processing: the contact of the plurality of persons P is determined based on the presence or absence of overlapping of the bone models MDL indicating the plurality of persons P and the positions of the persons P in the imaging depth direction X detected by the position detector 12.
Specifically, as shown in fig. 4, 5, 6, 7, etc., the determination unit 23c of the present embodiment determines the contact between the first person P1 and the second person P2 based on whether or not the first bone model MDL1 representing the first person P1 and the second bone model MDL2 representing the second person P2 overlap, the position of the first person P1 in the imaging depth direction X detected by the position detector 12, and the position of the second person P2 in the imaging depth direction X detected by the position detector 12.
For example, as illustrated in fig. 4, when the bone model MDL1 representing the person P1 and the bone model MDL2 representing the person P2 do not overlap, the determination unit 23c determines that the first person P1 and the second person P2 are not in contact.
On the other hand, for example, as illustrated in fig. 5, when the bone model MDL1 representing the person P1 and the bone model MDL2 representing the person P2 overlap, the determination unit 23c determines whether the person P1 and the person P2 actually overlap, in other words, whether the person P1 and the person P2 are in contact (interfere) with each other, based on the positions of the persons P1 and P2 in the imaging depth direction X detected by the position detector 12.
That is, in this case, for example, as illustrated in figs. 5 and 6, the determination unit 23c determines that the first person P1 and the second person P2 are not in contact when the bone model MDL1 representing the person P1 and the bone model MDL2 representing the person P2 overlap but the position of the person P1 in the imaging depth direction X and the position of the person P2 in the imaging depth direction X are outside a predetermined contact range. Here, the contact range is a position range set in advance for determining contact (interference) between persons P; it indicates that persons P are in contact with each other when the position of one person P in the imaging depth direction X falls within the contact range referenced to the position of the other person P in the imaging depth direction X. In other words, when the position of the person P1 in the imaging depth direction X and the position of the person P2 in the imaging depth direction X are outside the contact range, the determination unit 23c can determine that the person P1 and the person P2 are not in contact.
On the other hand, for example, as illustrated in fig. 5 and 7, when the bone model MDL1 representing the person P1 and the bone model MDL2 representing the person P2 overlap each other and the position of the person P1 in the imaging depth direction X and the position of the person P2 in the imaging depth direction X are within the contact range, the determination unit 23c determines that the first person P1 is in contact with the second person P2.
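Putting the three cases of figs. 4 to 7 together, the determination could be sketched as below. The overlap test (intersection of the bone models' extents in the image plane) and the numeric contact range are assumptions chosen for illustration; the patent only requires that the overlap of the bone models and the depth positions X from the position detector 12 both be taken into account.

```python
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]   # (X: depth, Y: width, Z: up-down)

def bone_models_overlap(mdl1: Dict[str, Point3D], mdl2: Dict[str, Point3D]) -> bool:
    """Assumed overlap criterion: do the Y-Z extents of the two bone models intersect?"""
    def extent(mdl):
        ys = [p[1] for p in mdl.values()]
        zs = [p[2] for p in mdl.values()]
        return min(ys), max(ys), min(zs), max(zs)
    y1lo, y1hi, z1lo, z1hi = extent(mdl1)
    y2lo, y2hi, z2lo, z2hi = extent(mdl2)
    return y1lo <= y2hi and y2lo <= y1hi and z1lo <= z2hi and z2lo <= z1hi

def determine_contact(mdl1, mdl2, depth1: float, depth2: float,
                      contact_range: float = 0.5) -> bool:
    """Decision logic of the determination unit 23c:
    - no overlap of MDL1 and MDL2              -> no contact (fig. 4)
    - overlap but depth positions X far apart  -> no contact (figs. 5 and 6)
    - overlap and depths within contact range  -> contact    (figs. 5 and 7)
    contact_range (metres) is an assumed preset value."""
    if not bone_models_overlap(mdl1, mdl2):
        return False
    return abs(depth1 - depth2) <= contact_range
```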
When the determination unit 23c determines that persons P corresponding to bone models MDL are in contact with each other, the information processing unit 23a of the present embodiment may store status data indicating the situation at the time of contact in the storage unit 22 and keep it as a record. The status data indicating the situation at the time of contact typically includes data indicating the motion of the bone models MDL up to the contact between the first person P1 and the second person P2. Here, the bone models MDL up to the contact between the first person P1 and the second person P2 include the bone model MDL1 of the first person P1 and the bone model MDL2 of the second person P2. The information processing unit 23a may also store, as part of the status data indicating the situation at the time of contact, identification information indicating the persons P1 and P2 involved in the contact in the storage unit 22 in association with that data.
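A possible shape for such a record is sketched below; the field names are illustrative, and only bone-model data plus identification information is kept, never the image I itself.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class ContactRecord:
    """Status data stored when contact is determined: who was involved and how their
    bone models MDL1 / MDL2 moved up to the moment of contact."""
    person_ids: Tuple[str, str]                                   # identification information
    timestamps: List[float] = field(default_factory=list)
    mdl1_history: List[Dict[str, Point3D]] = field(default_factory=list)
    mdl2_history: List[Dict[str, Point3D]] = field(default_factory=list)
```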
In this case, as described above, the information processing unit 23a may not store the image I for generating the bone model MDL in the storage unit 22 as a record. In this way, the care system 1 can store the operations and the like of the bone model MDL until the persons P come into contact with each other as records in the storage unit 22 while ensuring the privacy of the persons P.
The operation processing unit 23d is a unit having a function capable of executing processing of controlling the operations of the respective units based on the determination result of the detection device 20. The operation processing unit 23d of the present embodiment can execute processing of transferring the status data to devices other than the detection device 20 based on the determination result of the determination unit 23c. For example, based on the determination result of the determination unit 23c, the operation processing unit 23d controls the communication unit 21a, transfers the status data corresponding to the determination result to the terminal device 30, and notifies a staff member of the facility or the like, via the terminal device 30, that persons P have come into contact with each other. The terminal device 30 may store the received status data at the time of contact in the storage unit 32, for example, and keep it as a record. The terminal device 30 may also display, via the display 34, the motion of the bone models MDL up to the moment the persons P came into contact, allowing a staff member of the facility or the like to confirm the situation at the time of contact. The processing function of transferring the status data to other devices may also be realized by the information processing unit 23a instead of the operation processing unit 23d.
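The transfer itself could be as simple as the sketch below, which serializes the record and pushes it to a terminal device over HTTP. The transport and the URL are assumptions made for illustration; the patent only states that the data is transferred via the communication unit 21a and the network N.

```python
import json
import urllib.request

def notify_terminal(record: dict, terminal_url: str = "http://terminal.example/contact-alert") -> None:
    """Send status data at the time of contact to a terminal device 30 (hypothetical endpoint)."""
    request = urllib.request.Request(
        terminal_url,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:   # raises on network/HTTP errors
        response.read()
```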
< example of control flow >
Next, an example of control of the nursing system 1 will be described with reference to the flowchart of fig. 8.
First, when the person P enters the monitoring target space SP, the information processing section 23a of the detection apparatus 20 controls the image capturing section 11 to capture an image I of the monitoring target space SP, and stores the captured image information in the storage section 22 (step S1).
Next, the bone model generating unit 23b of the detecting device 20 detects the positions of the bone parts of the person P by object recognition and bone estimation based on the image I of the space SP to be monitored stored in the storage unit 22 (step S2), and generates the bone model MDL of the person P (step S3).
Next, the bone model generating unit 23b deletes the image used for generating the bone model MDL (step S4), so that the image data representing the image is not retained. In the subsequent processes, the processing unit 23 of the detection device 20 performs various processes using the bone model MDL generated from the image I, without using the image I itself.
Next, the determination section 23c of the detection device 20 determines whether or not the bone models MDL representing the plurality of persons P overlap with each other based on the bone models MDL generated in the process of step S3 (step S5).
When it is determined that the bone models MDL indicating the plurality of persons P do not overlap with each other (step S5: no), the determination unit 23c determines that the plurality of persons P do not contact each other (step S6), ends the current control cycle, and shifts to the next control cycle.
When it is determined in step S5 that the bone models MDL representing the plurality of persons P overlap each other (yes in step S5), the determination unit 23c determines whether or not the positions of the plurality of persons P in the imaging depth direction X are within the contact range based on the detection result of the position detector 12 (step S7).
When it is determined that the positions of the plurality of persons P in the imaging depth direction X are out of the contact range (step S7: no), the determination unit 23c proceeds to step S6 described above.
When determining that the positions of the plurality of persons P in the imaging depth direction X are within the contact range (yes in step S7), the determination unit 23c determines that the plurality of persons P are in contact with each other (step S8). At this time, the information processing unit 23a stores, in the storage unit 22, data indicating the motion of the bone models MDL up to the contact between the persons P and identification information indicating the persons P involved in the contact, in association with each other, as status data indicating the situation at the time of contact, and keeps the status data as a record.
Thereafter, the operation processing unit 23d of the detection device 20 controls the communication unit 21a, transmits the status data at the time of contact to the terminal device 30, and notifies a staff member of the facility or the like, via the terminal device 30, that there has been contact between persons P (step S9); it then ends the current control cycle and shifts to the next control cycle. At this time, the terminal device 30 stores the received status data at the time of contact in the storage unit 32 and keeps it as a record.
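For orientation, the whole flow of fig. 8 can be collapsed into one control cycle such as the sketch below, which reuses the illustrative helpers from the earlier sketches (generate_bone_models_top_down, determine_contact, ContactRecord, notify_terminal); none of these names come from the patent itself.

```python
def control_cycle(image, depth_by_person, detect_persons, estimate_keypoints):
    """One control cycle of the detection device 20 following steps S1-S9.
    `image` is the captured frame (S1); `depth_by_person` maps person id -> position in the
    imaging depth direction X reported by the position detector 12."""
    bone_models = generate_bone_models_top_down(image, detect_persons, estimate_keypoints)  # S2-S3
    image = None                                   # S4: drop the reference; stored image data is deleted
    records = []
    for i in range(len(bone_models)):
        for j in range(i + 1, len(bone_models)):
            a, b = bone_models[i], bone_models[j]
            if determine_contact(a.keypoints, b.keypoints,
                                 depth_by_person[a.person_id],
                                 depth_by_person[b.person_id]):        # S5, S7
                record = ContactRecord(person_ids=(a.person_id, b.person_id))
                records.append(record)                                 # S8: keep as a record
                notify_terminal({"persons": record.person_ids})        # S9: notify facility staff
    return records                                 # an empty list corresponds to S6 (no contact)
```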
The nursing system 1 described above captures an image I of the monitored space SP with the imaging unit 11 and generates, with the bone model generating unit 23b, the bone models MDL representing the persons P included in the image I. The determination unit 23c then determines whether the persons P corresponding to the bone models MDL are in contact, based on the bone models MDL generated by the bone model generating unit 23b and the positions of the persons P detected by the position detector 12. As a result, the nursing system 1 can properly grasp the contact state of persons P. For example, when a contact or interference accident between users occurs in a welfare facility, the nursing system 1 can use the determination result of the determination unit 23c as material for determining the cause.
Further, since the nursing system 1 grasps the state of the person P based on the bone model MDL generated from the image I, not on the image I itself captured by the imaging unit 11, it can appropriately grasp the situation at the time of contact with a smaller data amount and calculation load.
In this case, since the nursing system 1 grasps the situation at the time of contact based on the bone model MDL generated from the image I, not on the image I itself captured by the imaging unit 11, the privacy of the persons P can be ensured while the situation at the time of contact is appropriately grasped as described above. The nursing system 1 can thereby reduce psychological resistance to the installation of the setting device 10, for example, making it a system for which consent to installation is easily obtained.
Here, the nursing system 1 described above includes the storage units 22 and 32, and the storage units 22 and 32 store the motions of the bone models MDL1 and MDL2 until the first person P1 and the second person P2 come into contact. Thus, the nursing system 1 can keep the motions of the bone models MDL1 and MDL2 before the person P1 and the person P2 come into contact as records while ensuring the privacy of the persons P.
Specifically, the nursing system 1 described above determines, with the determination unit 23c, that a plurality of persons P are not in contact when the bone models MDL representing the plurality of persons P do not overlap. Thus, the nursing system 1 can easily determine that the persons P are not in contact from the absence of overlap between the plurality of bone models MDL. When the bone models MDL representing the plurality of persons P overlap but the positions of the plurality of persons P in the imaging depth direction X are outside the contact range, the nursing system 1 determines, with the determination unit 23c, that the plurality of persons P are not in contact. On the other hand, when the bone models MDL representing the plurality of persons P overlap and the positions of the plurality of persons P in the imaging depth direction X are within the contact range, the nursing system 1 determines that the plurality of persons P are in contact. As a result, the nursing system 1 can accurately determine contact and interference between persons P, and can appropriately grasp the contact state of persons P as described above.
The nursing system according to the embodiment of the present invention is not limited to the embodiment described above, and various modifications can be made within the scope described in the claims.
In the above description, the storage units 22 and 32 may, for example, also store sound data collected by the microphone 15 as part of the status data, thereby supplementing the situation before and after the contact with sound.
The processing units 23 and 33 described above are described as each having a single processor to realize each processing function, but the present invention is not limited thereto. The processing units 23 and 33 may be configured to combine a plurality of independent processors to execute a program by each processor, thereby realizing each processing function. The processing functions of the processing units 23 and 33 may be appropriately distributed or integrated in a single or a plurality of processing circuits. The processing functions of the processing units 23 and 33 may be realized by a program, or may be realized as hardware based on wired logic or the like.
The nursing system according to the present embodiment may be configured by appropriately combining the components of the embodiments and modifications described above.

Claims (3)

1. A nursing system, characterized by comprising:
an imaging unit that captures an image of a space to be monitored;
a bone model generation unit that generates a bone model representing a person included in the image captured by the imaging unit;
a position detector configured to detect a position of a person corresponding to the bone model in an imaging depth direction of the imaging unit; and
a determination unit configured to determine contact between a first person and a second person based on whether or not a first bone model representing the first person and a second bone model representing the second person overlap each other, a position of the first person in the imaging depth direction detected by the position detector, and a position of the second person in the imaging depth direction detected by the position detector.
2. The nursing system as recited in claim 1, wherein,
the nursing system is provided with a storage unit which stores motions of the bone model until the first person comes into contact with the second person.
3. The nursing system as recited in claim 1 or 2, wherein,
in the case where the first bone model and the second bone model do not overlap, the determination unit determines that the first person and the second person are not in contact,
when the first bone model and the second bone model overlap each other and the position of the first person in the imaging depth direction and the position of the second person in the imaging depth direction are out of a predetermined contact range, the determination unit determines that the first person and the second person are not in contact,
the determination unit determines that the first person is in contact with the second person when the first bone model and the second bone model overlap and the position of the first person in the imaging depth direction and the position of the second person in the imaging depth direction are within the contact range.
CN202280020914.7A (priority date 2021-03-18, filing date 2022-02-14) Nursing system, pending, published as CN116997942A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2021044212A JP7326363B2 (en) 2021-03-18 2021-03-18 Monitoring system
JP2021-044212 2021-03-18
PCT/JP2022/005693 WO2022196213A1 (en) 2021-03-18 2022-02-14 Watch-over system

Publications (1)

Publication Number Publication Date
CN116997942A (en) 2023-11-03

Family

ID=83322242

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280020914.7A Pending CN116997942A (en) 2021-03-18 2022-02-14 Nursing system

Country Status (3)

Country Link
JP (1) JP7326363B2 (en)
CN (1) CN116997942A (en)
WO (1) WO2022196213A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6148480B2 (en) 2012-04-06 2017-06-14 キヤノン株式会社 Image processing apparatus and image processing method
JP2018151693A (en) 2017-03-09 2018-09-27 株式会社デンソーテン Drive supporting device and drive supporting method

Also Published As

Publication number Publication date
WO2022196213A1 (en) 2022-09-22
JP7326363B2 (en) 2023-08-15
JP2022143604A (en) 2022-10-03

Similar Documents

Publication Publication Date Title
JP6592183B2 (en) monitoring
JP7138931B2 (en) Posture analysis device, posture analysis method, and program
CN204480228U (en) motion sensing and imaging device
EP3037917B1 (en) Monitoring
WO2015186436A1 (en) Image processing device, image processing method, and image processing program
CN110212451A (en) A kind of electric power AR intelligent patrol detection device
JP6666488B2 (en) Image extraction device
US20170206664A1 (en) Method for identifying, tracking persons and objects of interest
KR102413893B1 (en) Non-face-to-face non-contact fall detection system based on skeleton vector and method therefor
Humenberger et al. Embedded fall detection with a neural network and bio-inspired stereo vision
JP2020194493A (en) Monitoring system for nursing-care apparatus or hospital and monitoring method
Gomez-Donoso et al. Enhancing the ambient assisted living capabilities with a mobile robot
CN107111363B (en) Method, device and system for monitoring
JPWO2016088368A1 (en) Direction control device, direction control method, and direction control program
KR102156279B1 (en) Method and automated camera-based system for detecting and suppressing harmful behavior of pet
WO2022196214A1 (en) Monitoring system
JP2020160765A (en) Information processor, equipment determination method, computer program and learned model generation method
CN116997942A (en) Nursing system
CN110799090A (en) Sleeping abnormality notification system, sleeping abnormality notification method, and program
CN116940962A (en) Nursing system
JP7122543B1 (en) Information processing device, information processing system, and estimation method
US10891755B2 (en) Apparatus, system, and method for controlling an imaging device
WO2020241034A1 (en) Monitoring system and monitoring method
Ismail et al. Multimodal indoor tracking of a single elder in an AAL environment
CN113963780A (en) Automated method, system and apparatus for medical environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination