WO2017029841A1 - Image analysis device, image analysis method, and image analysis program - Google Patents

Image analysis device, image analysis method, and image analysis program

Info

Publication number
WO2017029841A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
watched
space
watching
entrances
Prior art date
Application number
PCT/JP2016/063626
Other languages
English (en)
Japanese (ja)
Inventor
安川 徹
Original Assignee
ノーリツプレシジョン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ノーリツプレシジョン株式会社
Priority to JP2017535257A (patent JP6645503B2)
Publication of WO2017029841A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/18 Status alarms
    • G08B 21/22 Status alarms responsive to presence or absence of persons
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • G08B 25/04 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium using a single signalling line, e.g. in a closed loop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to an image analysis apparatus, an image analysis method, and an image analysis program.
  • Patent Document 1 proposes a patient recognition device that detects a patient getting out of bed by photographing the patient on the bed with a camera and processing the obtained image. Specifically, this patient recognition device binarizes the color image captured by the camera into a target color, such as black, and other colors, and labels the connected components extracted in the binarized image as areas.
  • The patient recognition device then recognizes the presence of a human body by determining, from the size and shape of each labeled area, whether the area corresponds to a part of the human body. When the state in which no human body is recognized continues for a predetermined time or longer, the device can thereby detect that the patient has left the bed.
  • Patent Document 2 proposes a method of extracting a foreground region from a captured image that includes the depth of each pixel and estimating the behavior of the person being watched over based on the positional relationship between the extracted foreground region and the bed or the like.
  • In this method, each estimation condition is set on the assumption that the extracted foreground region relates to the behavior of the person being watched over. Therefore, the behavior of the person being watched over can be detected by determining whether the positional relationship between the extracted foreground region and the bed satisfies each estimation condition.
  • For example, when the foreground region is extracted around the side frame of the bed, it is possible to detect that the person being watched over has left the bed.
  • However, the bed-leaving time that should be regarded as abnormal may differ depending on the destination of the person being watched over.
  • In conventional watching systems, the person being watched over is treated uniformly regardless of the destination, and an abnormality is detected based on a fixed bed-leaving time. The present inventors therefore found that conventional watching systems have the problem that an abnormality determination cannot be made according to the destination of the person being watched over.
  • The present invention has been made in view of these points, and an object thereof is to provide a technique that enables an abnormality determination according to the destination of the person being watched over.
  • The present invention adopts the following configuration in order to solve the above-described problems.
  • That is, an image analysis apparatus according to one aspect of the present invention includes: an image acquisition unit that continuously acquires captured images from a photographing device installed so as to be able to photograph a plurality of entrances and exits existing in a watching space in which a person to be watched over is watched, an abnormality determination time for determining an abnormal state being set for each of the entrances and exits; a going-out analysis unit that analyzes, in the acquired captured images, whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits shown in the captured images; a return analysis unit that analyzes, in the captured images acquired after it is analyzed that the person being watched over has gone out of the watching space through one of the entrances and exits, whether the person being watched over has returned to the watching space through the entrance or exit used when going out; and an abnormality determination unit that determines the abnormal state of the person being watched over based on whether, before it is analyzed that the person has returned to the watching space, the elapsed time since it was analyzed that the person went out has exceeded the abnormality determination time set for the entrance or exit used when going out.
  • According to this configuration, the photographing device is installed so as to be able to photograph the plurality of entrances and exits existing in the watching space, and the obtained captured images are analyzed, so that the person being watched over is monitored as to whether he or she enters or leaves the watching space through each entrance and exit.
  • In addition, an abnormality determination time for determining an abnormal state is set for each entrance and exit. The entrance or exit used by the person being watched over is identified, and whether the person is in an abnormal state is determined based on the abnormality determination time set for that entrance or exit. Therefore, according to this configuration, the abnormality determination time serving as the reference for determining an abnormal state can be set according to the destination to which each entrance and exit leads, so that an abnormality determination according to the destination of the person being watched over can be made.
  • Note that the person being watched over is a person whose behavior is to be watched, such as an inpatient, a resident of a facility, or a care recipient.
  • The watching space is a place where the person being watched over is watched, for example, the room of the person being watched over.
  • Each entrance and exit may be a real one, such as a door existing in the watching space, or a virtual one set appropriately on the captured image.
  • In the image analysis apparatus according to the above aspect, a plurality of abnormality determination times may be set according to time zones for at least one of the plurality of entrances and exits.
  • The bed-leaving time to be detected as abnormal may differ not only with the destination but also with the time of day.
  • According to this configuration, a plurality of abnormality determination times are set according to time zones for at least one of the plurality of entrances and exits shown in the captured image. Therefore, for an entrance or exit for which a plurality of abnormality determination times are set according to time zones, an abnormality determination can be made according to the time of day in addition to the destination of the person being watched over.
  • The image analysis apparatus according to the above aspect may further include an abnormality detection notification unit that, when it is determined that the person being watched over is in an abnormal state because the elapsed time has exceeded the abnormality determination time set for the entrance or exit used when the person went out of the watching space, performs a notification for reporting the abnormal state of the person being watched over.
  • The notification destination and the notification method for reporting the abnormal state of the person being watched over can be selected appropriately according to the embodiment.
  • For example, the notification may be made to a watcher who watches over the behavior of the person being watched over.
  • The watcher is, for example, a nurse, a member of the facility staff, or a caregiver.
  • The notification may also be performed in cooperation with equipment installed in the facility, such as a nurse call system.
  • In the image analysis apparatus according to the above aspect, the photographing device may photograph the watching space from above. The going-out analysis unit may then analyze that the person being watched over has gone out of the watching space through one of the entrances and exits when, in the captured image, the person region in which a person appears has moved to the far side of that entrance or exit.
  • After it is analyzed that the person being watched over has gone out of the watching space through one of the entrances and exits, the return analysis unit may analyze that the person being watched over has returned to the watching space when, in the captured image, the person region has moved to the near side of the entrance or exit used when going out.
  • In the image analysis apparatus according to the above aspect, the image acquisition unit may acquire captured images including depth data indicating the depth of each pixel.
  • The going-out analysis unit may extract a person region in which a person appears based on three-dimensional pattern matching using the depth of each pixel indicated by the depth data, regard the person in the extracted person region as the person being watched over, and analyze that the person being watched over has gone out of the watching space through one of the entrances and exits when the person region moves to the far side of that entrance or exit.
  • After it is analyzed that the person being watched over has gone out of the watching space through one of the entrances and exits, the return analysis unit may analyze that the person being watched over has returned to the watching space when a person region is extracted based on the three-dimensional pattern matching on the near side of the entrance or exit used when going out.
  • According to this configuration, the acquired captured image includes depth data indicating the depth of each pixel.
  • The depth of each pixel expresses the depth of the subject shown in that pixel, so the position of the subject in real space can be specified from the depth data. Therefore, the positions in real space of objects existing in the watching space, such as the person being watched over and the doors serving as entrances and exits, can be specified regardless of the installation location and the viewing direction of the photographing device. Consequently, an abnormality of the person being watched over can be detected without being restricted by the installation conditions of the photographing device.
  • In the image analysis apparatus according to the above aspect, the going-out analysis unit may regard the person region as having moved to the far side of one of the entrance/exit regions when the person region disappears in the vicinity of that entrance/exit region, and analyze that the person being watched over has gone out of the watching space through that entrance or exit. Likewise, the return analysis unit may regard the person region as having moved to the near side of the entrance or exit used when going out when a person region appears in the vicinity of that entrance/exit region, and analyze that the person being watched over has returned to the watching space.
  • When the entrance or exit is a door or the like that actually exists in the watching space, the region in which the person being watched over appears disappears in the vicinity of the entrance or exit when the person goes out through it, and a region in which the person being watched over appears reappears near the entrance or exit when the person returns to the watching space through it. According to this configuration, whether the person being watched over enters or leaves the watching space can therefore be analyzed easily based on the disappearance and appearance of the region in which the person appears, and watching over the person being watched over can be realized by simple processing.
  • As other aspects of the image analysis apparatus according to each of the above configurations, the present invention may be an information processing system, an information processing method, or a program that realizes each of the above configurations, or a storage medium readable by a computer or other device or machine in which such a program is recorded.
  • Here, a computer-readable recording medium is a medium that stores information such as a program by an electrical, magnetic, optical, mechanical, or chemical action.
  • The information processing system may be realized by one or a plurality of information processing devices.
  • For example, an image analysis method according to one aspect of the present invention is an information processing method in which a computer executes: a step of continuously acquiring captured images from a photographing device installed so as to be able to photograph a plurality of entrances and exits existing in a watching space in which a person to be watched over is watched, an abnormality determination time for determining an abnormal state being set for each of the entrances and exits; a step of analyzing, in the acquired captured images, whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits shown in the captured images; a step of analyzing, after it is analyzed that the person being watched over has gone out of the watching space through one of the entrances and exits, whether the person being watched over has returned to the watching space through the entrance or exit used when going out; and a step of determining the abnormal state of the person being watched over based on whether, before it is analyzed that the person has returned, the elapsed time since the person went out has exceeded the abnormality determination time set for the entrance or exit used when going out.
  • Also, for example, an image analysis program according to one aspect of the present invention is a program for causing a computer to execute: a step of continuously acquiring captured images from a photographing device installed so as to be able to photograph a plurality of entrances and exits existing in a watching space in which a person to be watched over is watched, an abnormality determination time for determining an abnormal state being set for each of the entrances and exits; a step of analyzing, in the acquired captured images, whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits shown in the captured images; a step of analyzing, after it is analyzed that the person being watched over has gone out of the watching space through one of the entrances and exits, whether the person being watched over has returned to the watching space through the entrance or exit used when going out; and a step of determining the abnormal state of the person being watched over based on whether, before it is analyzed that the person has returned, the elapsed time since the person went out has exceeded the abnormality determination time set for the entrance or exit used when going out.
  • FIG. 1 schematically illustrates an example of a scene to which the present invention is applied.
  • FIG. 2 schematically illustrates an example of a captured image according to the embodiment.
  • FIG. 3 illustrates a hardware configuration of the image analysis apparatus according to the embodiment.
  • FIG. 4 illustrates a functional configuration of the image analysis apparatus according to the embodiment.
  • FIG. 5 illustrates a data configuration of the entrance / exit setting information according to the embodiment.
  • FIG. 6 exemplifies a processing procedure relating to watching over the person being watched over by the image analysis apparatus according to the embodiment.
  • FIG. 7 schematically illustrates an example of a photographed image obtained by photographing a scene where the watching target person has gone out.
  • FIG. 8 schematically illustrates an example of a photographed image obtained by photographing a scene where the watching target person returns.
  • FIG. 9 illustrates a hardware configuration of an image analysis apparatus according to another embodiment.
  • FIG. 10 illustrates the relationship between the depth acquired by the camera according to another embodiment and the subject.
  • FIG. 11 illustrates a coordinate relationship in a captured image according to another embodiment.
  • FIG. 12 illustrates the positional relationship in real space between an arbitrary point (pixel) of a captured image and a camera according to another embodiment.
  • Hereinafter, the present embodiment will be described with reference to the drawings.
  • However, the embodiment described below is merely an illustration of the present invention in every respect. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention. That is, in implementing the present invention, a specific configuration according to the embodiment may be adopted as appropriate.
  • Although data appearing in the present embodiment is described in natural language, more specifically it is specified by a pseudo-language, commands, parameters, machine language, or the like that can be recognized by a computer.
  • FIG. 1 schematically illustrates a scene where the image analysis apparatus 1 according to the present embodiment is used.
  • FIG. 2 schematically illustrates an example of a captured image 3 acquired by the image analysis apparatus 1 using the camera 2.
  • The image analysis apparatus 1 according to the present embodiment photographs, with the camera 2, the watching space in which the person being watched over is watched, and performs image analysis on the captured image 3 obtained thereby to detect an abnormal state (unnatural absence) of the person being watched over. Therefore, the image analysis apparatus 1 according to the present embodiment can be used widely in scenes where a person is watched over.
  • Specifically, the camera 2 is installed so as to be able to photograph the plurality of entrances and exits 31 to 33 existing in the watching space in which the person being watched over is watched, and the image analysis apparatus 1 photographs the watching space with the camera 2.
  • The camera 2 corresponds to the "photographing device" of the present invention.
  • The person being watched over is a person whose behavior is to be watched, for example, an inpatient, a resident of a facility, or a care recipient.
  • The watching space is a place where the person being watched over is watched, for example, the room of the person being watched over.
  • The image analysis apparatus 1 continuously acquires captured images 3 in which the plurality of entrances and exits 31 to 33 appear, as illustrated in FIG. 2. The image analysis apparatus 1 then analyzes, in each acquired captured image 3, whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits 31 to 33 shown in the captured image 3.
  • After it is analyzed that the person being watched over has gone out of the watching space through one of the entrances and exits, the image analysis apparatus 1 analyzes, in the acquired captured images 3, whether the person being watched over has returned to the watching space through the entrance or exit used when going out. In this way, the image analysis apparatus 1 monitors whether the person being watched over enters or leaves the watching space through each entrance and exit.
  • In the present embodiment, an abnormality determination time for determining an abnormal state is set for each of the entrances and exits 31 to 33.
  • The image analysis apparatus 1 measures the elapsed time after it is analyzed that the person being watched over has gone out of the watching space through one of the plurality of entrances and exits 31 to 33 shown in the captured image 3.
  • Based on whether this elapsed time exceeds the abnormality determination time set for the entrance or exit used when going out, the image analysis apparatus 1 determines whether the person being watched over is in an abnormal state.
  • In other words, the entrance or exit used by the person being watched over is identified, and whether the person is in an abnormal state is determined based on the abnormality determination time set for that entrance or exit. Therefore, according to the present embodiment, the abnormality determination time serving as the reference for determining an abnormal state can be set according to the destination to which each entrance and exit leads, so that an abnormality determination according to the destination of the person being watched over can be made.
  • Note that the entrances and exits 31 to 33 through which the person being watched over enters and leaves the watching space may be real doors or the like existing in the watching space, or may be virtual ones set appropriately on the captured image 3.
  • In the present embodiment, the captured image 3 includes three entrances and exits 31 to 33; however, the number of entrances and exits shown in the captured image 3 is not limited to three and may be selected appropriately according to the embodiment.
  • In the present embodiment, the camera 2 is installed near the ceiling of the living room serving as the watching space so that the watching space is photographed from above.
  • However, the installation position of the camera 2 is not limited to this example, and the camera 2 may be arranged in any place from which the plurality of entrances and exits existing in the watching space can be photographed.
  • The location of the image analysis apparatus 1 can also be determined as appropriate according to the embodiment, as long as the captured image 3 can be acquired from the camera 2.
  • For example, the image analysis apparatus 1 may be arranged close to the camera 2, as illustrated in FIG. 1.
  • Alternatively, the image analysis apparatus 1 may be connected to the camera 2 via a network and arranged in a place entirely different from that of the camera 2.
  • FIG. 3 illustrates a hardware configuration of the image analysis apparatus 1 according to the present embodiment.
  • As illustrated in FIG. 3, the image analysis apparatus 1 is a computer in which the following are electrically connected: a control unit 11 including a CPU, a RAM (Random Access Memory), a ROM (Read Only Memory), and the like; a storage unit 12 that stores the program 5 executed by the control unit 11 and other data; a touch panel display 13 for displaying and inputting images; a speaker 14 for outputting sound; an external interface 15 for connecting to external devices; a communication interface 16 for communicating via a network; and a drive 17 for reading a program stored in a storage medium 6.
  • In FIG. 3, the communication interface and the external interface are denoted as "communication I/F" and "external I/F", respectively.
  • Regarding the specific hardware configuration, components can be omitted, replaced, or added as appropriate according to the embodiment.
  • For example, the control unit 11 may include a plurality of processors.
  • The touch panel display 13 may be replaced by an input device and a display device that are connected separately and independently.
  • The speaker 14 may be omitted.
  • The speaker 14 may also be connected to the image analysis apparatus 1 as an external device rather than as an internal device of the image analysis apparatus 1.
  • The image analysis apparatus 1 may incorporate the camera 2.
  • The drive 17 may be connected via the external interface 15, or may be omitted.
  • The image analysis apparatus 1 may include a plurality of external interfaces 15 and may be connected to a plurality of external devices.
  • In the present embodiment, the camera 2 is connected to the image analysis apparatus 1 via the external interface 15 and is arranged so as to be able to photograph the plurality of entrances and exits (entrances and exits 31 to 33 in FIGS. 1 and 2) existing in the watching space.
  • As described above, the camera 2 is installed near the ceiling of the living room serving as the watching space so that the watching space is photographed from above.
  • The storage unit 12 stores the entrance/exit setting information 121, which sets the watching conditions for each of the plurality of entrances and exits 31 to 33 shown in the captured image 3, and the program 5.
  • The entrance/exit setting information 121 will be described later.
  • The program 5 is a program for causing the image analysis apparatus 1 to execute the processing procedure, described later, relating to watching over the person being watched over, and corresponds to the "image analysis program" of the present invention.
  • The program 5 may be recorded on the storage medium 6.
  • The storage medium 6 is a medium that accumulates information such as a program by an electrical, magnetic, optical, mechanical, or chemical action so that the recorded information can be read by a computer or other device or machine.
  • The storage medium 6 corresponds to the "storage medium" of the present invention.
  • FIG. 3 illustrates a disk-type storage medium, such as a CD (Compact Disk) or a DVD (Digital Versatile Disk), as an example of the storage medium 6.
  • However, the type of the storage medium 6 is not limited to the disk type and may be other than the disk type. Examples of storage media other than the disk type include semiconductor memories such as flash memories.
  • The image analysis apparatus 1 may be, for example, an apparatus designed exclusively for the service to be provided, or a general-purpose apparatus such as a PC (Personal Computer) or a tablet terminal. Furthermore, the image analysis apparatus 1 may be implemented by one or more computers.
  • FIG. 4 illustrates a functional configuration of the image analysis apparatus 1 according to the present embodiment.
  • FIG. 5 illustrates a data configuration of the entrance / exit setting information 121 according to the present embodiment.
  • When watching over the person being watched over, the control unit 11 of the image analysis apparatus 1 loads the program 5 stored in the storage unit 12 into the RAM. The control unit 11 then interprets and executes the program 5 loaded into the RAM, thereby controlling each component.
  • In this way, the image analysis apparatus 1 functions as a computer including the image acquisition unit 111, the going-out analysis unit 112, the return analysis unit 113, the abnormality determination unit 114, and the abnormality detection notification unit 115.
  • The image acquisition unit 111 continuously acquires the captured image 3 from the camera 2 installed so as to be able to photograph the plurality of entrances and exits existing in the watching space (entrances and exits 31 to 33 in the examples of FIGS. 1 and 2).
  • An abnormality determination time for determining an abnormal state is set for each of the plurality of entrances and exits 31 to 33 shown in the captured image 3. In the present embodiment, this setting is held in the storage unit 12 as the entrance/exit setting information 121 illustrated in FIG. 5.
  • As illustrated in FIG. 5, the data indicating the entrance/exit setting information 121 includes an entrance/exit ID field, a position field, and an abnormality determination time field.
  • One row of data corresponds to one entrance/exit.
  • In the entrance/exit ID field, an identifier for identifying the target entrance/exit is stored.
  • The entrance/exit ID may be set arbitrarily as long as each of the entrances and exits 31 to 33 can be identified.
  • In the position field, position information for specifying the position of the region where the target entrance/exit appears in the captured image 3 is stored.
  • The position of the target entrance/exit may be specified by coordinates on the captured image 3 or by coordinates of the space that appears in the captured image 3.
  • The method for specifying the position of the target entrance/exit may be selected appropriately according to the embodiment.
  • In the abnormality determination time field, the abnormality determination time designated for the target entrance/exit is stored.
  • The abnormality determination time is the time serving as the reference for determining an abnormal state when the person being watched over who has gone out through the target entrance/exit does not return to the watching space. That is, the abnormality determination time corresponds to the time for monitoring the going-out through the target entrance/exit. Therefore, the abnormality determination time is set appropriately so as to exceed the time normally required for the person being watched over to return after going out through the target entrance/exit.
  • For example, for an entrance or exit leading to a destination from which the person being watched over is expected to take a long time to return, the abnormality determination time may be specified to be relatively long.
  • Conversely, for an entrance or exit from which a quick return is expected, the abnormality determination time may be specified to be relatively short.
  • The abnormality determination time may be set as a fixed value for each entrance/exit, or may be set so that it can be changed as appropriate according to a user instruction.
  • In the present embodiment, the settings related to the plurality of entrances and exits shown in the captured image 3 are configured as independent entrance/exit setting information 121.
  • However, the settings related to the plurality of entrances and exits shown in the captured image 3 may be incorporated as part of the program 5 (for example, as branch conditions in the program 5).
  • Also, in the present embodiment, the data indicating the entrance/exit setting information 121 is expressed in a table format.
  • However, the data format of the entrance/exit setting information 121 is not limited to this example and may be selected appropriately according to the embodiment.
  • For example, the data format of the entrance/exit setting information 121 may be a format other than the table format.
  • Further, in the present embodiment, the entrance/exit setting information 121 is stored in the storage unit 12 of the apparatus itself.
  • However, the entrance/exit setting information 121 may be stored in the storage device of another information processing apparatus.
  • In that case, the image analysis apparatus 1 may acquire the entrance/exit setting information 121 from the other information processing apparatus via a network, for example.
  • The record structure of the data indicating the entrance/exit setting information 121 is not limited to the example of FIG. 5 and may be set appropriately according to the embodiment.
  • The values stored in each record in FIG. 5 are described for convenience in order to explain the operation of the image analysis apparatus 1 according to the present embodiment, and the invention is not limited to this example. The values stored in each record may be determined appropriately according to the embodiment.
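As a concrete illustration of the table described above, the entrance/exit setting information 121 could be held in memory as a set of records keyed by entrance/exit ID. The following Python sketch is illustrative only; the field names and the example regions and times are hypothetical and do not appear in the specification.

```python
from dataclasses import dataclass

@dataclass
class DoorwaySetting:
    """One record of the entrance/exit setting information 121 (illustrative)."""
    doorway_id: str            # identifier of the entrance/exit (e.g. "31")
    region: tuple              # (x, y, width, height) of the doorway area in the captured image
    abnormality_time_sec: int  # abnormality determination time for this doorway, in seconds

# Hypothetical settings for the three doorways 31-33 shown in the captured image 3.
DOORWAY_SETTINGS = {
    "31": DoorwaySetting("31", (40, 20, 60, 120), 60 * 60),   # e.g. leads to a shared space
    "32": DoorwaySetting("32", (180, 20, 60, 120), 10 * 60),  # e.g. leads to a toilet
    "33": DoorwaySetting("33", (320, 20, 60, 120), 30 * 60),  # e.g. leads to a corridor
}
```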
  • The going-out analysis unit 112 analyzes, in the acquired captured image 3, whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits 31 to 33 shown in the captured image 3.
  • After it is analyzed that the person being watched over has gone out of the watching space through one of the entrances and exits, the return analysis unit 113 analyzes, in the acquired captured images 3, whether the person being watched over has returned to the watching space through the entrance or exit used when going out.
  • The abnormality determination unit 114 determines the abnormal state of the person being watched over based on whether, before it is analyzed that the person has returned to the watching space, the elapsed time since it was analyzed that the person went out through one of the entrances and exits has exceeded the abnormality determination time set for the entrance or exit used when going out.
  • When it is determined that the person being watched over is in an abnormal state because the elapsed time has exceeded the abnormality determination time set for the entrance or exit used when going out, the abnormality detection notification unit 115 performs a notification for reporting the abnormal state of the person being watched over.
  • FIG. 6 illustrates a processing procedure relating to watching of the person being watched by the image analysis apparatus 1.
  • The processing procedure relating to watching over the person being watched over described below corresponds to the "image analysis method" of the present invention.
  • However, the processing procedure described below is merely an example, and each process may be changed to the extent possible. In the processing procedure described below, steps can be omitted, replaced, or added as appropriate according to the embodiment.
  • As illustrated in FIG. 6, the control unit 11 first executes, by the processing of steps S101 to S103, a determination process for detecting that the person being watched over has gone out. The processing of steps S101 to S103 is repeated until the going-out of the person being watched over is detected. When the going-out of the person being watched over is detected, the control unit 11 next executes, by the processing of steps S104 to S107, a determination process for detecting the return of the person being watched over and a process for determining the elapsed time since the person went out. If the return of the person being watched over is detected before the elapsed time since going out exceeds the abnormality determination time set for the target entrance/exit, the control unit 11 ends the processing according to this operation example; otherwise, the control unit 11 performs step S108 and then ends the processing according to this operation example.
  • (Step S101) In step S101, the control unit 11 functions as the image acquisition unit 111 and acquires the captured image 3 captured by the camera 2. When the captured image 3 is acquired from the camera 2, the control unit 11 advances the processing to the next step S102.
  • The captured image 3 may be a two-dimensional image such as a general RGB image. The captured image 3 may also be a moving image, or one or more still images. The control unit 11 may acquire the captured image 3 in synchronization with the video signal of the camera 2, and may immediately execute the processing of steps S102 and S103, described later, on the captured image 3 acquired in synchronization with the camera 2.
  • By continuously and repeatedly executing such an operation, the image analysis apparatus 1 can detect in real time whether the person being watched over is present within the photographing range of the camera 2. In the following, an example of detecting the going-out of the person being watched over from the captured images 3 continuously acquired in this way is described.
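A minimal sketch of the acquisition loop of step S101, assuming the camera 2 is reachable through OpenCV's VideoCapture interface (the specification does not name any particular library or API):

```python
import cv2

def acquire_images(device_index=0):
    """Continuously yield captured images in sync with the camera's video signal."""
    cap = cv2.VideoCapture(device_index)
    try:
        while True:
            ok, frame = cap.read()   # one captured image 3
            if not ok:
                break
            yield frame
    finally:
        cap.release()
```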
  • (Step S102) In the next step S102, the control unit 11 functions as the going-out analysis unit 112 and analyzes, in the captured image 3 acquired in step S101, whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits 31 to 33 shown in the captured image 3. That is, the control unit 11 executes the determination process for detecting the going-out of the person being watched over.
  • The method for determining whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits 31 to 33 shown in the captured image 3 may be set appropriately according to the embodiment.
  • For example, the control unit 11 may determine as follows whether the person being watched over has gone out of the watching space through any of the plurality of entrances and exits 31 to 33 shown in the captured image 3.
  • FIG. 7 schematically illustrates the captured image 3 obtained when the person being watched over goes out of the watching space using the entrance/exit 31.
  • As illustrated in FIG. 7, the control unit 11 may analyze that the person being watched over has gone out of the watching space through one of the entrances and exits when, in the captured image 3, the person region in which a person appears has moved to the far side of that entrance or exit.
  • Here, the method for extracting the person region may be selected appropriately according to the embodiment.
  • For example, the control unit 11 may extract the person region by pattern matching. That is, the control unit 11 may hold a template of a human shape in the storage unit 12 or the like and extract the person region by searching the captured image 3 for a pattern that matches the template.
  • The control unit 11 may also extract the person region by the background difference method. That is, the control unit 11 extracts the foreground region of the captured image 3 by calculating the difference between the captured image 3 acquired in step S101 and a background image set as the background of the captured image 3.
  • This foreground region is a region in which a change from the background image has occurred. Therefore, when the person being watched over moves within the captured image 3, the region in which the person appears is extracted as the foreground region, and the control unit 11 may treat this foreground region as the person region.
  • The background image can be set appropriately according to the embodiment.
  • Further, the control unit 11 may determine by image processing such as pattern matching whether the shape of the foreground region corresponds to a person. As a result of the pattern matching, the control unit 11 may exclude foreground regions that do not correspond to the shape of a person from the processing target of step S102 and treat only foreground regions corresponding to the shape of a person as the processing target of step S102.
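The background difference approach described above can be sketched with a standard background subtractor. The following Python example is an assumption-laden illustration (OpenCV's MOG2 subtractor, a median filter, and a minimum contour area are implementation choices not stated in the specification):

```python
import cv2

bg_subtractor = cv2.createBackgroundSubtractorMOG2()  # background difference method

def extract_person_regions(frame, min_area=1500):
    """Extract candidate person regions (bounding boxes) as foreground that changed from the background."""
    mask = bg_subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)                      # suppress isolated noise pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    regions = []
    for c in contours:
        if cv2.contourArea(c) < min_area:               # too small to be a person
            continue
        regions.append(cv2.boundingRect(c))             # (x, y, w, h)
    return regions
```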
  • The control unit 11 captures the person region extracted in this way and sets it as a tracking target. The control unit 11 can thereby track the person region in which the same target (the person being watched over) appears in the group of captured images 3 repeatedly acquired in subsequent iterations of step S101.
  • The control unit 11 then regards the person appearing in the person region as the person being watched over and determines whether the tracked person region has moved to the far side of any of the plurality of entrances and exits 31 to 33.
  • The positions of the entrances and exits 31 to 33 can be specified by referring to the entrance/exit setting information 121.
  • The positional relationship between the person region and each of the entrances and exits 31 to 33 can be determined by an arbitrary method.
  • For example, when the person region disappears in the vicinity of the region of one of the entrances and exits, the control unit 11 may determine that the person being watched over has moved to the far side of that entrance or exit and detect the going-out of the person being watched over.
  • Alternatively, the control unit 11 may determine whether the person being watched over has moved to the far side of one of the entrances and exits based on the positional relationship between the regions of the entrances and exits 31 to 33 and the person region.
  • In the captured image 3, the front-rear relationship of the subjects can be estimated from the positions of their lower ends. That is, when the lower end of the person region moves above the lower end of one of the entrances and exits on the captured image 3, the control unit 11 may determine that the person being watched over has moved to the far side of that entrance or exit and detect the going-out of the person being watched over.
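The lower-end comparison used for going-out detection can be written as a simple geometric test. In the sketch below, both the person region and the doorway region are assumed to be axis-aligned rectangles (x, y, w, h) in image coordinates with the y axis pointing downward; the function and parameter names are hypothetical.

```python
def moved_to_far_side(person_box, doorway_box):
    """Step S102 lower-end comparison: with the image y axis pointing downward, the person
    region is judged to be on the far side of the doorway when its lower end has risen
    above (has a smaller y value than) the lower end of the doorway region."""
    px, py, pw, ph = person_box
    dx, dy, dw, dh = doorway_box
    near_horizontally = px < dx + dw and dx < px + pw   # the two regions overlap left-right
    return near_horizontally and (py + ph) < (dy + dh)
```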
  • Note that the control unit 11 may determine as appropriate whether the person appearing in the person region is the person being watched over. This determination can be performed by an arbitrary method.
  • When the person appearing in the person region is not the person being watched over, the control unit 11 may stop the monitoring of each entrance and exit by omitting the processing after step S102.
  • When it is determined in step S102 that the person being watched over has gone out of the watching space through one of the plurality of entrances and exits 31 to 33 shown in the captured image 3,
  • the control unit 11 detects the going-out of the person being watched over and advances the processing to the next step S103.
  • On the other hand, when such a determination cannot be made, the control unit 11 advances the processing to the next step S103 without detecting the going-out of the person being watched over.
  • (Step S103) In the next step S103, the control unit 11 determines whether the going-out of the person being watched over was detected in step S102.
  • If the going-out of the person being watched over was not detected, the control unit 11 returns the processing to step S101. That is, the control unit 11 repeats the determination process for detecting the going-out of the person being watched over (steps S101 to S103) until the going-out of the person being watched over is detected.
  • On the other hand, if the going-out of the person being watched over was detected, the control unit 11 advances the processing to the next step S104. That is, when the going-out of the person being watched over is detected, the control unit 11 starts executing the determination process for detecting the return of the person being watched over and the process for determining whether the elapsed time since the person went out (hereinafter also referred to as the "outing elapsed time") has exceeded the abnormality determination time set for the target entrance/exit. At this time, in order to specify the outing elapsed time in step S107 described later, the control unit 11 may start a timer and measure the outing elapsed time with the timer. Alternatively, the control unit 11 may hold the time at which the going-out of the person being watched over was detected.
  • (Step S104) In the next step S104, the control unit 11 functions as the image acquisition unit 111 and acquires the captured image 3 from the camera 2 in the same manner as in step S101. When the captured image 3 is acquired from the camera 2, the control unit 11 advances the processing to the next step S105.
  • The processing of steps S104 to S107 corresponds to the determination process for detecting the return of the person being watched over and the process for determining whether the outing elapsed time has exceeded the abnormality determination time set for the target entrance/exit.
  • The processing of steps S104 to S107 is executed until the return of the person being watched over is detected or until the outing elapsed time exceeds the abnormality determination time set for the target entrance/exit. Accordingly, as with step S101, the processing of step S104 is executed repeatedly, and captured images 3 continue to be acquired, until the return of the person being watched over is detected or until the outing elapsed time exceeds the abnormality determination time set for the target entrance/exit.
  • (Step S105) In the next step S105, the control unit 11 functions as the return analysis unit 113 and analyzes, in the captured image 3 acquired in step S104 after it was analyzed in step S102 that the person being watched over had gone out of the watching space through one of the entrances and exits, whether the person being watched over has returned to the watching space through the entrance or exit used when going out. That is, the control unit 11 executes the determination process for detecting the return of the person being watched over.
  • The method for determining whether the person being watched over has returned to the watching space through the entrance or exit used when going out may be set appropriately according to the embodiment.
  • For example, the control unit 11 may determine as follows whether the person being watched over has returned to the watching space through the entrance or exit used when going out.
  • FIG. 8 schematically illustrates the captured image 3 obtained when the person being watched over returns to the watching space using the entrance/exit 31 after the scene of FIG. 7.
  • In this case, the control unit 11 can extract the person region by the same method as in step S102. That is, the control unit 11 can extract the person region by image processing such as pattern matching or the background difference method. The control unit 11 can also determine by an arbitrary method whether the person region has moved to the near side of the entrance or exit used when going out.
  • For example, when each of the entrances and exits 31 to 33 is provided with a shield such as a door,
  • the person being watched over appears in the captured image 3 from behind the shield when returning. Therefore, in this case, when the shield is no longer shown in front of the person region, or when a person region appears in the vicinity of the region of the entrance or exit used when going out, the control unit 11 may determine that the person being watched over has moved to the near side of that entrance or exit and detect the return of the person being watched over.
  • Alternatively, the control unit 11 may determine whether the person being watched over has moved to the near side of the entrance or exit used when going out based on the positional relationship between the regions of the entrances and exits 31 to 33 and the person region.
  • As described above, in the captured image 3, the front-rear relationship of the subjects can be estimated from the positions of their lower ends. That is, when the lower end of the person region moves below the lower end of the entrance or exit used when going out on the captured image 3, the control unit 11 may determine that the person being watched over has moved to the near side of that entrance or exit and detect the return of the person being watched over. In addition, when it is determined that a plurality of persons have returned to the watching space, the control unit 11 may limit the person to be monitored as the person being watched over; for example, the control unit 11 may monitor, as the person being watched over, the person first recognized as having returned.
  • When such a determination can be made, the control unit 11 detects the return of the person being watched over and advances the processing to the next step S106. On the other hand, when such a determination cannot be made, the control unit 11 advances the processing to the next step S106 without detecting the return of the person being watched over.
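The corresponding return check of step S105 inverts the lower-end comparison. The same rectangle convention as in the going-out sketch is assumed, and the names are again hypothetical.

```python
def moved_to_near_side(person_box, used_doorway_box):
    """Step S105 lower-end comparison: the person region is judged to have come back to the
    near side when its lower end lies below the lower end of the doorway used for going out."""
    px, py, pw, ph = person_box
    dx, dy, dw, dh = used_doorway_box
    near_horizontally = px < dx + dw and dx < px + pw
    return near_horizontally and (py + ph) > (dy + dh)
```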
  • (Step S106) Returning to FIG. 6, in the next step S106, the control unit 11 determines whether the return of the person being watched over was detected in step S105. If the return of the person being watched over was not detected in step S105, the control unit 11 advances the processing to the next step S107.
  • On the other hand, if the return of the person being watched over was detected in step S105, the control unit 11 ends the processing according to this operation example.
  • At this time, the control unit 11 may specify the outing time from the detection of the going-out of the person being watched over to the detection of the return, and record the outing time in the storage unit 12 or the like.
  • The method for specifying the outing time can be selected appropriately according to the embodiment. For example, when the outing elapsed time is measured with a timer, the control unit 11 can stop the timer at this point and specify the outing time by referring to the timer. When the time at which the person being watched over went out is held, the control unit 11 can specify the outing time by calculating the difference between that time and the time at which this processing is performed.
  • After the processing according to this operation example ends, the control unit 11 may start the processing again from step S101. That is, the control unit 11 may continue watching over the person being watched over by starting again from the determination process for detecting the going-out of the person being watched over.
  • (Step S107) In the next step S107, the control unit 11 functions as the abnormality determination unit 114 and determines whether the elapsed time since the going-out of the person being watched over was detected in step S102 has exceeded the abnormality determination time set for the entrance or exit used when going out (entrance/exit 31 in FIGS. 7 and 8).
  • The control unit 11 can specify the abnormality determination time set for the entrance or exit used by the person being watched over by referring to the entrance/exit setting information 121. If, as a result of this determination, the outing elapsed time has not exceeded the abnormality determination time set for the target entrance/exit, the control unit 11 determines that an abnormal state of the person being watched over has not occurred and returns the processing to step S104.
  • That is, until the return of the person being watched over is detected or the outing elapsed time exceeds the abnormality determination time set for the target entrance/exit, the control unit 11 repeats the determination process for detecting the return of the person being watched over and the process for determining whether the outing elapsed time has exceeded the abnormality determination time set for the target entrance/exit.
  • On the other hand, if the outing elapsed time has exceeded the abnormality determination time set for the target entrance/exit, the control unit 11 advances the processing to the next step S108. That is, the control unit 11 regards an abnormal state of the person being watched over as having occurred and performs the abnormality detection notification in the next step S108.
  • The outing elapsed time at this point can be specified as appropriate.
  • For example, when the outing elapsed time is measured with a timer, the control unit 11 can specify the outing elapsed time at this point by referring to the timer.
  • When the time at which the person being watched over went out is held, the control unit 11 can specify the outing elapsed time at this point by calculating the difference between that time and the time at which this processing is performed.
  • Note that the processing of step S107 may be executed at an arbitrary timing.
  • For example, the processing of step S107 may be executed separately from the processing of steps S104 to S106.
  • In that case as well, if the outing elapsed time has not exceeded the abnormality determination time set for the target entrance/exit, the control unit 11 returns the processing to step S104.
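The elapsed-time bookkeeping of steps S103 and S107 can be sketched as follows. The class and method names are hypothetical; the specification only requires that the outing elapsed time be measured (by a timer or by holding the going-out time) and compared with the abnormality determination time of the doorway used for going out.

```python
import time

class OutingMonitor:
    """Illustrative bookkeeping for the outing elapsed time."""

    def __init__(self):
        self.went_out_at = None

    def on_going_out(self):
        # Step S103: start measuring the outing elapsed time.
        self.went_out_at = time.monotonic()

    def on_return(self):
        # Step S106: specify the outing time and stop measuring.
        outing_time = time.monotonic() - self.went_out_at
        self.went_out_at = None
        return outing_time          # may be recorded in the storage unit

    def exceeded(self, abnormality_time_sec):
        # Step S107: compare the outing elapsed time with the abnormality
        # determination time set for the doorway used for going out.
        if self.went_out_at is None:
            return False
        return (time.monotonic() - self.went_out_at) > abnormality_time_sec
```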
  • (Step S108) In the next step S108, the control unit 11 functions as the abnormality detection notification unit 115 and performs a notification for reporting the abnormal state of the person being watched over. That is, when it is determined in step S107 that the person being watched over is in an abnormal state because the outing elapsed time has exceeded the abnormality determination time set for the entrance or exit used by the person being watched over, the control unit 11 issues an alarm indicating that the person being watched over may be in an abnormal state.
  • For example, the control unit 11 may make this notification to a third party other than the person being watched over, in particular to a watcher who watches over the behavior of the person being watched over.
  • The watcher is, for example, a nurse, a member of the facility staff, or a caregiver.
  • The control unit 11 may also make the notification to the person being watched over himself or herself.
  • When the image analysis apparatus 1 is used in a facility such as a hospital, the image analysis apparatus 1 can be connected to equipment such as a nurse call system via the external interface 15.
  • In that case, the control unit 11 may perform the notification for reporting the abnormal state of the person being watched over in cooperation with equipment such as the nurse call system. That is, the control unit 11 may control the nurse call system via the external interface 15 and perform a call by the nurse call system as the notification for reporting the abnormal state of the person being watched over. This makes it possible to appropriately inform a nurse or other watcher that the person being watched over is unnaturally absent.
  • The control unit 11 may also perform the notification for reporting the abnormal state of the person being watched over by displaying a screen on the touch panel display 13, or by outputting a predetermined sound from the speaker 14 connected to the image analysis apparatus 1.
  • Further, the control unit 11 may perform the notification for reporting the abnormal state of the person being watched over by using e-mail, a short message service, a push notification, or the like.
  • In this case, the e-mail address, telephone number, and the like of the user terminal serving as the notification destination may be registered in the storage unit 12 in advance.
  • The control unit 11 may then perform the notification for reporting the abnormal state of the person being watched over by using the e-mail address, telephone number, or the like registered in advance.
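As one possible realization of the e-mail notification mentioned above, a pre-registered address could be contacted through the Python standard library. The SMTP host and the addresses below are placeholders for illustration, not values from the specification.

```python
import smtplib
from email.message import EmailMessage

def notify_abnormality(doorway_id, elapsed_minutes,
                       smtp_host="smtp.example.com",
                       sender="watching-system@example.com",
                       recipient="nurse-station@example.com"):
    """Step S108 (illustrative): report that the person being watched over has been
    absent longer than the abnormality determination time of the doorway used."""
    msg = EmailMessage()
    msg["Subject"] = "Watching alert: possible abnormal state"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(
        f"The person being watched over left through doorway {doorway_id} and has not "
        f"returned for {elapsed_minutes} minutes, exceeding the abnormality determination time."
    )
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```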
  • The control unit 11 ends the processing according to this operation example after performing such a notification.
  • However, the control unit 11 may repeat the processing of steps S104 to S108 until the return of the person being watched over is detected.
  • In that case, since the control unit 11 newly starts measuring the outing elapsed time, it may reset the elapsed time measured in the processing so far.
  • As described above, the image analysis apparatus 1 according to the present embodiment first executes, by the processing of steps S101 to S103, a determination process for detecting that the person being watched over has gone out of the watching space through one of the plurality of entrances and exits 31 to 33 shown in the captured image 3. Next, after detecting the going-out of the person being watched over, the image analysis apparatus 1 executes, by the processing of steps S104 to S106, a determination process for detecting that the person being watched over has returned to the watching space through the entrance or exit used when going out (entrance/exit 31 in FIGS. 7 and 8).
  • The image analysis apparatus 1 also executes, by the processing of step S107, a process for determining whether the outing elapsed time has exceeded the abnormality determination time set for the entrance or exit used when going out before the return of the person being watched over is detected. If the outing elapsed time exceeds the abnormality determination time before the return of the person being watched over is detected, the image analysis apparatus 1 performs, by the processing of step S108, a notification for reporting the abnormal state of the person being watched over.
  • In the present embodiment, an abnormality determination time serving as the threshold for determining an abnormal state is set for each of the entrances and exits 31 to 33.
  • The abnormality determination time for each of the entrances and exits 31 to 33 can therefore be set according to the destination to which each entrance and exit leads, so that, according to the present embodiment, an abnormality determination according to the destination of the person being watched over can be made.
  • In addition, the detected abnormal state of the person being watched over can be reported by the processing of step S108. It is therefore possible to prevent the person being watched over from remaining unnaturally absent from the watching space for a long period of time, and thereby to prevent an accident or the like from befalling the person being watched over.
  • Furthermore, in steps S102 and S105, whether the person being watched over enters or leaves the watching space is determined based on the positional relationship between the person region and each of the entrances and exits 31 to 33, and the camera 2 is arranged so as to photograph the watching space from above. This prevents either the person being watched over or the entrances and exits 31 to 33 from being completely hidden by the other, so that, as illustrated in FIGS. 7 and 8, the camera 2 can appropriately photograph both the person being watched over and each of the entrances and exits 31 to 33. Therefore, according to the present embodiment, the person being watched over can be watched appropriately by using such captured images 3.
  • When the disappearance and appearance of the person region are used in steps S102 and S105, the image analysis apparatus 1 can easily detect the going-out and return of the person being watched over based on the disappearance and appearance of the person region. Therefore, according to the present embodiment, watching over the person being watched over can be realized by simple processing.
  • one abnormality determination time is set for each of the entrances 31 to 33.
  • the number of abnormality determination times that can be set for each of the entrances 31 to 33 is not limited to one, and a plurality of abnormality determination times are set for at least one of the plurality of entrances 31 to 33. May be.
  • each abnormality determination time may be set according to the time zone.
  • the entrance / exit 31 communicates with the shared space, and the available time zone of the shared space is 8:00 to 20:00.
  • two different abnormality determination times may be set for the entrance / exit 31 depending on the time zone from 8:00 to 20:00 when the shared space can be used and other time zones.
  • the abnormality determination time corresponding to the time zone from 8:00 to 20:00 is set to be longer than the abnormality determination time corresponding to the other time zones.
  • the control unit 11 refers to the entrance / exit setting information 121 in accordance with the time zone when comparing the elapsed time and the abnormality determination time in step S107. Acquire abnormal judgment time. For example, the control unit 11 is set for an entrance / exit used by the person being watched over for a time zone that includes the time at which the watched person has gone out in step S102 or the time at which the process of step S107 is performed. Get the abnormality judgment time.
  • The abnormality determination time used to determine the abnormal state of the person being watched over can thus be varied not only for each of the entrances 31 to 33 but also for each time period. That is, according to this modified example, for an entrance/exit for which a plurality of abnormality determination times are set according to the time period, the abnormality determination can be performed according to the time period in addition to the destination of the person being watched over.
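  • A minimal sketch of such a time-period-dependent lookup is shown below; the table stands in for the entrance/exit setting information 121, and the entrance identifiers, time periods, and threshold values are illustrative assumptions only.

```python
from datetime import time as dtime

# Hypothetical entrance/exit setting information: each entrance maps to a list
# of (start, end, abnormality determination time in seconds). More specific
# periods are listed first, followed by a fallback covering the whole day.
ENTRANCE_SETTINGS = {
    31: [(dtime(8, 0), dtime(20, 0), 4 * 60 * 60),    # shared space open: longer threshold
         (dtime(0, 0), dtime(23, 59, 59), 30 * 60)],  # other time periods
    32: [(dtime(0, 0), dtime(23, 59, 59), 2 * 60 * 60)],
    33: [(dtime(0, 0), dtime(23, 59, 59), 10 * 60)],
}

def abnormality_time_for(entrance_id, at):
    """Return the abnormality determination time whose time period contains
    the given time of day (e.g. the time the person went out)."""
    for start, end, threshold in ENTRANCE_SETTINGS[entrance_id]:
        if start <= at <= end:
            return threshold
    raise ValueError("no time period configured for this entrance")
```

  • For example, abnormality_time_for(31, dtime(21, 30)) falls outside the 8:00 to 20:00 period and therefore returns the shorter fallback threshold, while abnormality_time_for(31, dtime(10, 0)) returns the longer one.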
  • The captured image 3 may include depth data indicating the depth of each pixel in the captured image 3. A modification for this case will be described with reference to FIGS. 9 to 12.
  • FIG. 9 illustrates a hardware configuration of the image analysis apparatus 1 according to this modification.
  • the camera 2 includes a depth sensor 21 for measuring the depth of the subject.
  • the type and measurement method of the depth sensor 21 may be appropriately selected according to the embodiment.
  • The depth sensor 21 may be a sensor of the TOF (Time of Flight) method or the like.
  • the configuration of the camera 2 is not limited to such an example as long as the depth of the subject can be acquired, and can be appropriately selected according to the embodiment.
  • the camera 2 may be a stereo camera. Since the stereo camera shoots the subject within the shooting range from a plurality of different directions, the depth of the subject can be recorded.
  • the camera 2 may be replaced with the depth sensor 21 alone.
  • the depth sensor 21 may be an infrared depth sensor that measures the depth based on infrared irradiation so that the depth can be acquired without being affected by the brightness of the shooting location.
  • Relatively inexpensive imaging devices including such an infrared depth sensor include Kinect from Microsoft, Xtion from ASUS, and Structure Sensor from Occipital.
  • The captured image 3 only needs to include data indicating the depth of the subject within the photographing range (angle of view), and may be, for example, data in which the depth of the subject within the photographing range is two-dimensionally distributed (for example, a depth map).
  • the captured image 3 may include an RGB image together with the depth data. Further, the captured image 3 may be a moving image or one or a plurality of still images.
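  • As a rough illustration only, such an image carrying depth data could be represented as below; the container type and field names are hypothetical and are not part of the described apparatus.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np

@dataclass
class CapturedImage:
    """Hypothetical container for a captured image that carries depth data:
    a two-dimensional depth map giving the depth of the subject at each pixel,
    optionally accompanied by an RGB image of the same resolution."""
    depth: np.ndarray                 # shape (H, W), depth per pixel (e.g. in metres)
    rgb: Optional[np.ndarray] = None  # shape (H, W, 3), optional colour image

    def depth_at(self, x: int, y: int) -> float:
        """Return the depth recorded for pixel (x, y)."""
        return float(self.depth[y, x])
```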
  • FIG. 10 shows an example of a distance that can be handled as the depth according to the present modification.
  • the depth represents the depth of the subject.
  • The depth of the subject may be expressed, for example, as the straight-line distance A between the camera 2 and the subject, or as the distance B of the perpendicular dropped from the horizontal axis of the camera 2 to the subject.
  • the depth according to this modification may be the distance A or the distance B.
  • the distance B is treated as the depth.
  • The distance A and the distance B can be converted into each other by using, for example, the Pythagorean theorem. Therefore, the following description using the distance B can be applied to the distance A as it is.
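  • As a small sketch of this conversion, assuming the lateral offset of the subject from the camera's horizontal axis is known (the function names are hypothetical):

```python
import math

def straight_line_to_perpendicular(distance_a: float, lateral_offset: float) -> float:
    """Convert the straight-line distance A into the perpendicular distance B.
    A, B and the lateral offset form a right triangle, so B^2 + offset^2 = A^2."""
    return math.sqrt(distance_a ** 2 - lateral_offset ** 2)

def perpendicular_to_straight_line(distance_b: float, lateral_offset: float) -> float:
    """Convert the perpendicular distance B back into the straight-line distance A."""
    return math.sqrt(distance_b ** 2 + lateral_offset ** 2)
```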
  • the image analysis apparatus 1 according to the present modification can specify the position of the subject in the real space.
  • In steps S101 and S104, the control unit 11 acquires the captured image 3 in which, for example, the gray value of each pixel is determined according to the depth of that pixel.
  • The control unit 11 can specify the position of each pixel in the real space based on the depth data included in the captured image 3. That is, the control unit 11 can specify the position in the three-dimensional space (real space) of the subject captured in each pixel from the coordinates (two-dimensional information) and the depth of each pixel in the captured image 3.
  • a calculation example in which the control unit 11 specifies the position of each pixel in the real space will be described with reference to FIGS. 11 and 12.
  • FIG. 11 illustrates the coordinate relationship in the captured image 3.
  • FIG. 12 illustrates the positional relationship between an arbitrary pixel (point s) of the captured image 3 and the camera 2 in the real space.
  • The left-right direction in FIG. 11 corresponds to the direction perpendicular to the paper surface of FIG. 12. That is, the length of the captured image 3 that appears in FIG. 12 corresponds to the length in the vertical direction (H pixels) illustrated in FIG. 11. Further, the length in the horizontal direction (W pixels) illustrated in FIG. 11 corresponds to the length of the captured image 3 in the direction perpendicular to the paper surface, which does not appear in FIG. 12.
  • Assume that the coordinates of an arbitrary pixel (point s) of the captured image 3 are (xs, ys), the horizontal angle of view of the camera 2 is Vx, and the vertical angle of view is Vy. Further, assume that the number of pixels in the horizontal direction of the captured image 3 is W, the number of pixels in the vertical direction is H, and the coordinates of the center point (pixel) of the captured image 3 are (0, 0).
  • The control unit 11 can acquire information indicating the angle of view (Vx, Vy) of the camera 2 from the camera 2.
  • However, the method for acquiring the information indicating the angle of view (Vx, Vy) of the camera 2 is not limited to such an example; the control unit 11 may acquire this information based on user input, or may acquire it as a preset setting value.
  • The control unit 11 can acquire the coordinates (xs, ys) of the point s and the number of pixels (W × H) of the captured image 3 from the captured image 3.
  • the control unit 11 can acquire the depth Ds of the point s by referring to the depth data included in the captured image 3.
  • The control unit 11 can specify the position of each pixel (point s) in the real space by using these pieces of information. For example, based on the relational expressions represented by the following formulas 1 to 3, the control unit 11 can calculate the vector S (Sx, Sy, Sz, 1) from the camera 2 to the point s in the camera coordinate system illustrated in FIG. 12. Thereby, the position of the point s in the two-dimensional coordinate system of the captured image 3 and the position of the point s in the camera coordinate system can be mutually converted.
  • the vector S is a vector of a three-dimensional coordinate system centered on the camera 2.
  • The camera 2 may be tilted with respect to the horizontal direction; that is, the camera coordinate system may be tilted from the world coordinate system of the three-dimensional space (real space). Therefore, the control unit 11 may convert the vector S of the camera coordinate system into a vector of the world coordinate system by applying a projective transformation using the roll angle, the pitch angle (α in FIG. 12), and the yaw angle of the camera 2 to the vector S, and may thereby calculate the position of the point s in the world coordinate system.
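  • Formulas 1 to 3 and the exact transformation are not reproduced in this text, so the following Python sketch only illustrates the idea under the common assumptions that the plane visible at depth Ds spans 2·Ds·tan(Vx/2) horizontally and 2·Ds·tan(Vy/2) vertically, and that roll, pitch, and yaw are combined in one conventional rotation order; it is not the patent's actual calculation.

```python
import math
import numpy as np

def pixel_to_camera_coords(xs, ys, depth, W, H, Vx, Vy):
    """Map a pixel (xs, ys), with the image centre at (0, 0), and its depth Ds
    to a camera-coordinate vector (Sx, Sy, Sz, 1), assuming the angle-of-view
    relation described in the lead-in."""
    sx = (xs / (W / 2.0)) * depth * math.tan(Vx / 2.0)
    sy = (ys / (H / 2.0)) * depth * math.tan(Vy / 2.0)
    sz = depth
    return np.array([sx, sy, sz, 1.0])

def camera_to_world(S, roll, pitch, yaw):
    """Rotate a camera-coordinate vector into the world coordinate system.
    The angles are the camera's mounting angles in radians; the rotation
    order Rz(roll) * Ry(yaw) * Rx(pitch) is an assumed convention."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch about the x axis
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw about the y axis
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll about the z axis
    return (Rz @ Ry @ Rx) @ S[:3]
```

  • With such a mapping, the positions of the person area and of each entrance can be compared in one world coordinate system regardless of how the camera is mounted.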
  • By using the depth data, the control unit 11 can specify the three-dimensional shape of the subject. Therefore, when extracting the person region in steps S102 and S105, the control unit 11 may perform three-dimensional pattern matching using the depth of each pixel indicated by the depth data. In this case, for example, the control unit 11 holds a three-dimensional template of a person in the storage unit 12 or the like and searches the captured image 3, using the depth data, for a pattern that matches the three-dimensional template, thereby extracting the person area.
  • The position of the subject in the real space can be specified by the depth data included in the captured image 3. Accordingly, the positions in the real space of the objects present in the watching space, such as the person being watched over and each of the entrances 31 to 33, can be specified without depending on the installation location and the viewing direction of the camera 2. Therefore, according to this modified example, the abnormality of the person being watched over can be detected without being restricted by the installation conditions of the camera 2.
  • the control unit 11 may extract a person area based on the foreground area.
  • the area of the extracted foreground area corresponds to the size of the person. Therefore, the control unit 11 may determine whether or not the area of the foreground region is included in the predetermined area range in Steps S102 and S105.
  • This predetermined area range is set so as to cover the values that the area of a foreground region capturing the person being watched over can take.
  • The control unit 11 may exclude a foreground region whose area is not included in the predetermined area range from the processing targets of steps S102 and S105, and may treat a foreground region whose area is included in the predetermined area range as the processing target of steps S102 and S105.
  • the area of the foreground region may be given by the number of pixels included in the foreground region.
  • However, since the depth of the subject appearing in the captured image 3 is acquired with respect to the surface of the subject, the area of the surface portion of the subject corresponding to each pixel of the captured image 3 does not always match between pixels.
  • Therefore, in steps S102 and S105, the control unit 11 may calculate the area of the extracted foreground region in the real space using the depth of each pixel, in order to exclude the influence of the distance of the subject from the camera.
  • The area of the foreground region in the real space can be calculated, for example, as follows. That is, based on the following relational expressions 4 and 5, the control unit 11 first calculates the length w in the horizontal direction and/or the length h in the vertical direction, in the real space, of the arbitrary point s (one pixel) illustrated in FIGS. 11 and 12.
  • The control unit 11 calculates the area in the real space of one pixel at the depth Ds as the square of w calculated in this way, the square of h, or the product of w and h, and obtains the area of the foreground region in the real space by summing the areas of the pixels included in the foreground region (a sketch of this calculation is given after this block).
  • The control unit 11 may also use the average of the areas over several frames in order to smooth out frame-to-frame fluctuation.
  • If the area obtained in this way is not included in the predetermined area range, the control unit 11 may exclude the corresponding region from the processing target.
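  • Relational expressions 4 and 5 are not reproduced in this text, so the following Python sketch assumes the per-pixel footprint at depth Ds follows from the angle of view in the same way as above; the area bounds and function names are illustrative assumptions only.

```python
import math
import numpy as np

def pixel_footprint(depth, W, H, Vx, Vy):
    """Approximate real-space width w and height h covered by one pixel at the
    given depth, assuming the visible plane is 2*Ds*tan(Vx/2) wide and
    2*Ds*tan(Vy/2) high and is divided evenly among W x H pixels."""
    w = 2.0 * depth * math.tan(Vx / 2.0) / W
    h = 2.0 * depth * math.tan(Vy / 2.0) / H
    return w, h

def foreground_area_in_real_space(depth_map, foreground_mask, Vx, Vy):
    """Sum the real-space area (w * h) of every pixel in the foreground mask."""
    H, W = depth_map.shape
    total = 0.0
    for y, x in zip(*np.nonzero(foreground_mask)):
        w, h = pixel_footprint(float(depth_map[y, x]), W, H, Vx, Vy)
        total += w * h
    return total

# Hypothetical use in the spirit of steps S102 and S105: keep only foreground
# regions whose real-space area lies inside a person-sized range.
MIN_AREA, MAX_AREA = 0.1, 1.5  # square metres, illustrative bounds only

def is_person_sized(depth_map, foreground_mask, Vx, Vy):
    area = foreground_area_in_real_space(depth_map, foreground_mask, Vx, Vy)
    return MIN_AREA <= area <= MAX_AREA
```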

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Emergency Management (AREA)
  • Business, Economics & Management (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Signal Processing (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The present invention relates to a technology that makes it possible to perform an abnormality determination based on the destination of a person being watched over. According to one aspect of the present invention, an image analysis device obtains captured images of a plurality of entrances of a place where a person is watched over, thereby making it possible to monitor the person being watched over entering and leaving the watching place through any of these entrances. If the person being watched over goes out through one of the entrances, the image analysis device further determines whether the going out of the person being watched over represents an abnormal situation, based on the abnormality determination time set for the entrance that the person being watched over used to go out.
PCT/JP2016/063626 2015-08-18 2016-05-06 Dispositif d'analyse d'images, procédé d'analyse d'images et programme d'analyse d'images WO2017029841A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017535257A JP6645503B2 (ja) 2015-08-18 2016-05-06 画像解析装置、画像解析方法、及び、画像解析プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015161024 2015-08-18
JP2015-161024 2015-08-18

Publications (1)

Publication Number Publication Date
WO2017029841A1 true WO2017029841A1 (fr) 2017-02-23

Family

ID=58051681

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/063626 WO2017029841A1 (fr) 2015-08-18 2016-05-06 Dispositif d'analyse d'images, procédé d'analyse d'images et programme d'analyse d'images

Country Status (2)

Country Link
JP (1) JP6645503B2 (fr)
WO (1) WO2017029841A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019067129A (ja) * 2017-09-29 2019-04-25 キヤノン株式会社 画像処理装置、画像処理システム、画像処理方法、及びプログラム
JP2019074806A (ja) * 2017-10-12 2019-05-16 株式会社日立エルジーデータストレージ 生活リズム測定システム及び生活リズム測定方法
CN115810257A (zh) * 2022-11-23 2023-03-17 佳净洁环境科技有限公司 基于智慧云厕的救援报警信息生成方法及智慧云厕

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11224390A (ja) * 1998-02-05 1999-08-17 Matsushita Electric Ind Co Ltd 在室状況監視装置および在室状況監視システム
JP2002373388A (ja) * 2001-06-14 2002-12-26 Matsushita Electric Works Ltd 人体検知装置
JP2003162775A (ja) * 2001-11-27 2003-06-06 Santekku:Kk 建築物のセキュリティ監視システム
JP2014236896A (ja) * 2013-06-10 2014-12-18 Nkワークス株式会社 情報処理装置、情報処理方法、及び、プログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000123273A (ja) * 1998-08-11 2000-04-28 Nippon Signal Co Ltd:The 生活行動遠隔確認システム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11224390A (ja) * 1998-02-05 1999-08-17 Matsushita Electric Ind Co Ltd 在室状況監視装置および在室状況監視システム
JP2002373388A (ja) * 2001-06-14 2002-12-26 Matsushita Electric Works Ltd 人体検知装置
JP2003162775A (ja) * 2001-11-27 2003-06-06 Santekku:Kk 建築物のセキュリティ監視システム
JP2014236896A (ja) * 2013-06-10 2014-12-18 Nkワークス株式会社 情報処理装置、情報処理方法、及び、プログラム

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019067129A (ja) * 2017-09-29 2019-04-25 キヤノン株式会社 画像処理装置、画像処理システム、画像処理方法、及びプログラム
JP7080614B2 (ja) 2017-09-29 2022-06-06 キヤノン株式会社 画像処理装置、画像処理システム、画像処理方法、及びプログラム
JP2019074806A (ja) * 2017-10-12 2019-05-16 株式会社日立エルジーデータストレージ 生活リズム測定システム及び生活リズム測定方法
CN115810257A (zh) * 2022-11-23 2023-03-17 佳净洁环境科技有限公司 基于智慧云厕的救援报警信息生成方法及智慧云厕

Also Published As

Publication number Publication date
JP6645503B2 (ja) 2020-02-14
JPWO2017029841A1 (ja) 2018-06-07

Similar Documents

Publication Publication Date Title
JP6115335B2 (ja) 情報処理装置、情報処理方法、及び、プログラム
JP6780641B2 (ja) 画像解析装置、画像解析方法、及び、画像解析プログラム
JP6167563B2 (ja) 情報処理装置、情報処理方法、及び、プログラム
JP6500785B2 (ja) 情報処理装置、情報処理方法、及び、プログラム
JP6504156B2 (ja) 情報処理装置、情報処理方法、及び、プログラム
JP6432592B2 (ja) 情報処理装置、情報処理方法、及び、プログラム
WO2015133195A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
WO2014199786A1 (fr) Système d'imagerie
JP6489117B2 (ja) 情報処理装置、情報処理方法、及び、プログラム
JP6638723B2 (ja) 画像解析装置、画像解析方法、及び、画像解析プログラム
JP2016170701A (ja) 浴室異常検知装置、浴室異常検知方法、及び、浴室異常検知プログラム
Joshi et al. A fall detection and alert system for an elderly using computer vision and Internet of Things
WO2017029841A1 (fr) Dispositif d'analyse d'images, procédé d'analyse d'images et programme d'analyse d'images
JP6607253B2 (ja) 画像解析装置、画像解析方法、及び、画像解析プログラム
US20230412919A1 (en) Device and method for controlling a camera
JP6606912B2 (ja) 浴室異常検知装置、浴室異常検知方法、及び浴室異常検知プログラム
JP6565468B2 (ja) 呼吸検知装置、呼吸検知方法、及び呼吸検知プログラム
JP6780639B2 (ja) 画像解析装置、画像解析方法、及び、画像解析プログラム
JP2022126069A (ja) 画像処理方法
JP2024093173A (ja) 画像処理装置、画像処理システム、画像処理プログラム、および画像処理方法
KR20150060440A (ko) 2d와 3d 영상 분석을 이용한 침입 감지 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16836833

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017535257

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16836833

Country of ref document: EP

Kind code of ref document: A1