WO2016152182A1 - Abnormal state detection device, abnormal state detection method, and abnormal state detection program - Google Patents

Abnormal state detection device, abnormal state detection method, and abnormal state detection program

Info

Publication number
WO2016152182A1
Authority
WO
WIPO (PCT)
Prior art keywords
pedestrian
abnormal state
real space
captured image
depth
Prior art date
Application number
PCT/JP2016/050281
Other languages
English (en)
Japanese (ja)
Inventor
靖和 田中
安川 徹
Original Assignee
ノーリツプレシジョン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ノーリツプレシジョン株式会社
Priority to JP2017507517A (patent JP6737262B2)
Publication of WO2016152182A1

Links

Images

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb

Definitions

  • the present invention relates to an abnormal state detection device, an abnormal state detection method, and an abnormal state detection program.
  • In Patent Document 1, a system has been proposed in which a pedestrian is photographed with a stereo camera, the obtained image data is analyzed three-dimensionally to detect the pedestrian's posture and motion, and the pedestrian's living functions are measured based on the detected posture and motion. According to such a system, the state of the pedestrian can be observed without requiring the pedestrian to wear any equipment.
  • the present invention has been made in consideration of such points, and an object thereof is to provide a system capable of appropriately watching a pedestrian.
  • the present invention adopts the following configuration in order to solve the above-described problems.
  • The abnormal state detection device comprises: an image acquisition unit that acquires a captured image of a pedestrian performing a walking motion, the captured image including depth data indicating the depth of each pixel in the captured image; an extraction unit that extracts a person region in which the pedestrian is captured in the acquired captured image; a behavior measurement unit that measures the behavior in real space of a local part to be observed of the pedestrian's body by referring to the depth of each pixel included in the extracted person region and continuously specifying the position of the local part in real space; a state determination unit that determines, based on the measured behavior of the local part, whether or not the pedestrian is in an abnormal state; and a notification unit that performs an abnormality detection notification for notifying that the pedestrian is in an abnormal state when the determination result indicates that the pedestrian is in an abnormal state.
  • the captured image acquired in order to detect the abnormal state of the pedestrian includes depth data indicating the depth of each pixel.
  • the depth of each pixel indicates the depth from the photographing apparatus to the subject. More specifically, the depth of the subject is acquired with respect to the surface of the subject. That is, if the depth data is used, the position of the subject surface in the real space can be specified. Therefore, if this depth data is used, the state of the pedestrian in the real space (three-dimensional space) can be analyzed.
  • The behavior in real space of a local part to be observed of the pedestrian's body, rather than of the pedestrian's entire body, is measured. Then, based on the behavior of the local part in real space, it is determined whether or not the pedestrian is in an abnormal state.
  • The behavior measurement unit may measure, as the local part, the behavior in real space of the upper part of the pedestrian.
  • Based on the measured behavior of the upper part of the pedestrian, the state determination unit detects whether or not the upper part of the pedestrian has descended a predetermined distance or more within a predetermined time. When it is detected that the upper part of the pedestrian has descended the predetermined distance or more within the predetermined time, it may be determined that the pedestrian has fallen and is in an abnormal state.
  • When a pedestrian falls, the position of the pedestrian's body is assumed to move rapidly downward. Therefore, in this configuration, the pedestrian's fall is monitored by detecting whether or not the upper part of the pedestrian has descended a predetermined distance or more within a predetermined time. Thus, according to this configuration, when a pedestrian falls, it can be detected that the pedestrian is in an abnormal state.
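The fall check described in this configuration (a descent of the upper part by a predetermined distance or more within a predetermined time) can be sketched as follows; the class name, the threshold, and the window values are illustrative assumptions, not values taken from this publication.

```python
from collections import deque

# Hypothetical sketch: the upper part of the pedestrian is judged to have
# fallen when its height in real space drops by at least `drop_threshold`
# within `time_window` seconds of samples.

class FallDetector:
    def __init__(self, drop_threshold=0.5, time_window=1.0):
        self.drop_threshold = drop_threshold  # assumed value, in metres
        self.time_window = time_window        # assumed value, in seconds
        self.history = deque()                # (timestamp, height) samples

    def update(self, timestamp, height):
        """Record one height sample; return True when a fall is detected."""
        self.history.append((timestamp, height))
        # Discard samples older than the observation window.
        while timestamp - self.history[0][0] > self.time_window:
            self.history.popleft()
        highest = max(h for _, h in self.history)
        return highest - height >= self.drop_threshold
```

A gradual change of height never triggers the check; only a drop that completes within the window does, which matches the "within a predetermined time" condition above.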
  • The upper part of the pedestrian indicates the upper end of the pedestrian in real space, and may be a single point at the upper end of the pedestrian or an area of arbitrary size provided at the upper end of the pedestrian.
  • the upper part of the pedestrian can be set as appropriate.
  • The upper end of a pedestrian is the highest part in real space of the pedestrian's body shown in the photographed image.
  • The behavior measurement unit may measure, as the local part, the behavior in real space of the upper part of the pedestrian.
  • Based on the measured behavior of the upper part of the pedestrian, the state determination unit detects whether or not the upper part of the pedestrian has moved to a position lower in real space than a predetermined first height. When it is detected that the upper part of the pedestrian has moved to a position lower than the predetermined first height in real space, it may be determined that the pedestrian is crouching and is in an abnormal state.
  • The value of the predetermined first height for detecting the crouching state may be appropriately set according to the embodiment.
  • Based on the measured behavior of the upper part of the pedestrian, the state determination unit may detect whether or not the upper part of the pedestrian has moved to a position lower in real space than a predetermined second height that is lower than the first height. When it is detected that the upper part of the pedestrian has moved to a position lower than the predetermined second height in real space, it may be determined that the pedestrian is lying down and is in an abnormal state.
  • When the pedestrian is lying down, the entire body of the pedestrian is assumed to be at a lower height than in the above-described crouching state. Therefore, in this configuration, whether or not the pedestrian is lying down is monitored by detecting whether the upper part of the pedestrian has moved to a position lower in real space than a predetermined second height that is lower than the first height. Thus, according to this configuration, when a pedestrian lies down, it can be detected that the pedestrian is in an abnormal state.
  • the predetermined second height value for detecting the lying state may be appropriately set according to the embodiment.
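A minimal sketch of the two-threshold determination described above (crouching below the first height, lying below the lower second height). The function name and the threshold values are assumptions; the publication only states that both heights may be set as appropriate according to the embodiment.

```python
def classify_posture(upper_height, first_height=1.0, second_height=0.4):
    """Classify the pedestrian's state from the real-space height of the
    upper part of the body, using the two-threshold scheme above.

    first_height and second_height are hypothetical example values.
    """
    if upper_height < second_height:
        return "lying"       # below the second, lower height
    if upper_height < first_height:
        return "crouching"   # below the first height but above the second
    return "normal"
```

Because the second height is strictly lower than the first, the two abnormal states stay mutually exclusive: a height can match at most one branch.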
  • the notification unit may perform the abnormality detection notification when the abnormal state of the pedestrian continues for a predetermined time or more.
  • Since the abnormality detection notification is made only when the abnormal state of the pedestrian continues for a predetermined time or longer, erroneous notifications can be prevented when the condition for the abnormal state is satisfied only momentarily. Therefore, according to this configuration, false abnormality detection notifications can be prevented, and the detection of the pedestrian's abnormal state can be reported appropriately.
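The persistence condition above (notify only when the abnormal state continues for a predetermined time or more) can be sketched as follows; the class name and the hold time value are illustrative assumptions.

```python
class AbnormalityNotifier:
    """Hypothetical sketch of the persistence check: the abnormality
    detection notification fires only after the abnormal state has
    continued for `hold_time` seconds, suppressing momentary states."""

    def __init__(self, hold_time=2.0):
        self.hold_time = hold_time     # assumed value, in seconds
        self.abnormal_since = None     # start time of the current episode

    def update(self, timestamp, is_abnormal):
        """Return True when the notification should be issued."""
        if not is_abnormal:
            self.abnormal_since = None  # episode ended; reset the timer
            return False
        if self.abnormal_since is None:
            self.abnormal_since = timestamp
        return timestamp - self.abnormal_since >= self.hold_time
```

A state that flickers on and off keeps resetting the timer, so only a sustained abnormal episode reaches the notification.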
  • an information processing system that realizes each of the above configurations, an information processing method, or a program may be used.
  • it may be a storage medium that can be read by a computer, a device, a machine or the like in which such a program is recorded.
  • the computer-readable recording medium is a medium that stores information such as programs by electrical, magnetic, optical, mechanical, or chemical action.
  • the information processing system may be realized by one or a plurality of information processing devices.
  • The abnormal state detection method is a method in which a computer acquires a captured image of a pedestrian performing a walking motion, the captured image including depth data indicating the depth of each pixel in the captured image.
  • The abnormal state detection program causes a computer to execute: a step of acquiring a captured image of a pedestrian performing a walking motion, the captured image including depth data indicating the depth of each pixel in the captured image; a step of extracting a person region in which the pedestrian is captured in the acquired captured image; and a step of measuring the behavior in real space of a local part to be observed of the pedestrian's body by referring to the depth of each pixel included in the extracted person region.
  • FIG. 1 schematically illustrates a scene where the present invention is applied.
  • FIG. 2 illustrates a hardware configuration of the abnormal state detection device according to the embodiment.
  • FIG. 3 illustrates the relationship between the depth acquired by the camera according to the embodiment and the subject.
  • FIG. 4 illustrates the functional configuration of the abnormal state detection device according to the embodiment.
  • FIG. 5 illustrates a processing procedure relating to pedestrian watching by the abnormal state detection device according to the embodiment.
  • FIG. 6 illustrates a captured image acquired by the camera according to the embodiment.
  • FIG. 7 illustrates the coordinate relationship in the captured image according to the embodiment.
  • FIG. 8 illustrates the positional relationship between an arbitrary point (pixel) of the captured image and the camera in the real space according to the embodiment.
  • FIG. 9 schematically illustrates a state where a pedestrian has fallen.
  • FIG. 10 schematically illustrates a state in which a pedestrian is crouching.
  • FIG. 11 schematically illustrates a state where a pedestrian is lying.
  • Hereinafter, this embodiment will be described with reference to the drawings.
  • this embodiment described below is only an illustration of the present invention in all respects. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention. That is, in implementing the present invention, a specific configuration according to the embodiment may be adopted as appropriate.
  • Although data appearing in the present embodiment is described in natural language, more specifically it is specified by a pseudo-language, commands, parameters, machine language, or the like that can be recognized by a computer.
  • FIG. 1 shows an example of a scene in which the abnormal state detection device 1 according to the present embodiment is used.
  • The abnormal state detection device 1 according to the present embodiment is an information processing apparatus that photographs a pedestrian with the camera 2 and analyzes the captured image 3 obtained thereby, thereby monitoring the state of the pedestrian in the captured image 3 and watching over the pedestrian. Therefore, the abnormal state detection device 1 according to the present embodiment can be widely used in scenes where a target person is watched over.
  • the abnormal state detection device 1 acquires a captured image 3 obtained by capturing a pedestrian performing a walking motion from the camera 2.
  • the target person (pedestrian) is walking in the shooting range of the camera 2, and the camera 2 is installed for shooting such a target person.
  • the target person does not always have to perform a walking motion, and may remain in a specific place.
  • the camera 2 is configured to be able to acquire the depth corresponding to each pixel in the captured image 3.
  • the camera 2 includes a depth sensor (a depth sensor 21 described later) that measures the depth of the subject so that the depth of each pixel can be acquired.
  • the abnormal state detection apparatus 1 according to the present embodiment is connected to such a camera 2 and acquires a photographed image 3 obtained by photographing a pedestrian whose state is to be monitored.
  • the acquired captured image 3 includes depth data indicating the depth obtained for each pixel, as illustrated in FIG.
  • the captured image 3 only needs to include data indicating the depth of the subject within the imaging range, and the data format can be appropriately selected according to the embodiment.
  • the captured image 3 may be data (for example, a depth map) in which the depth of the subject within the imaging range is two-dimensionally distributed.
  • the captured image 3 may include an RGB image together with the depth data.
  • the captured image 3 may be configured with a moving image or one or a plurality of still images as long as the state of the pedestrian can be analyzed.
  • the abnormal state detection device 1 extracts a person area in which the pedestrian appears in the acquired captured image 3.
  • The captured image 3 includes depth data indicating the depth of each pixel. Therefore, the abnormal state detection device 1 can specify the position in real space of the subject in the captured image 3 by using this depth data. More specifically, the depth of the subject is acquired with respect to the surface of the subject. That is, the abnormal state detection device 1 can specify the position of the subject surface in real space by referring to the depth of each pixel indicated by the depth data.
  • the abnormal state detection device 1 refers to the depth of each pixel included in the extracted person region, and in the real space of the local part to be observed among the pedestrian's body shown in the captured image 3. By continuously specifying the position, the behavior of the local part in real space is measured.
  • the local region to be observed can be set as appropriate according to the embodiment.
  • the local site may be a specific site on the body such as the head, shoulder, chest, or leg.
  • the local part may be a part where the position on the body can be changed depending on the state of the pedestrian in the captured image, such as the upper part of the pedestrian, instead of such a specific part on the body.
  • it is desirable that the local part is set to a part where the state of the pedestrian is easily reflected.
  • the local part to be observed is a part indicating the position of the upper end of the pedestrian, such as the upper part of the pedestrian or the head.
  • the upper part 31 of the pedestrian is set as a local site to be observed.
  • The abnormal state detection device 1 determines whether or not the pedestrian is in an abnormal state based on the measured behavior of the local part. Furthermore, when it is determined as a result that the pedestrian is in an abnormal state, the abnormal state detection device 1 performs an abnormality detection notification for notifying that the pedestrian is in the abnormal state. That is, when the pedestrian falls into an abnormal state, the abnormal state detection device 1 issues an alarm notifying of the abnormal state. Thereby, the user of the abnormal state detection device 1 according to the present embodiment can learn of the abnormal state of a pedestrian within the shooting range of the camera 2 and can watch over the pedestrian.
  • the state of the pedestrian is analyzed based on the captured image 3 including the depth data indicating the depth of each pixel.
  • the position of the subject surface in real space can be specified by using the depth data. Therefore, if this depth data is used, the state of the pedestrian in the real space (three-dimensional space) can be analyzed regardless of the viewing direction (viewpoint) of the camera 2 with respect to the pedestrian.
  • The abnormal state detection device 1 uses this depth data to measure the behavior in real space of a local part of the pedestrian's body (for example, the upper part 31 of the pedestrian), rather than of the pedestrian's entire body. Then, the abnormal state detection device 1 determines whether the pedestrian is in an abnormal state based on the behavior of the local part in real space.
  • the location of the abnormal state detection device 1 can be determined as appropriate according to the embodiment as long as the captured image 3 can be acquired from the camera 2.
  • the abnormal state detection device 1 may be disposed so as to be close to the camera 2 as illustrated in FIG.
  • the abnormal state detection apparatus 1 may be connected to the camera 2 via a network, or may be arranged at a place completely different from the camera 2.
  • FIG. 2 illustrates a hardware configuration of the abnormal state detection device 1 according to the present embodiment.
  • The abnormal state detection apparatus 1 is a computer in which a control unit 11 including a CPU, a RAM (Random Access Memory), a ROM (Read Only Memory), and the like; a storage unit 12 that stores the program 5 executed by the control unit 11 and other data; a touch panel display 13 for displaying and inputting images; a speaker 14 for outputting sound; an external interface 15 for connecting to an external device; a communication interface 16 for communicating via a network; and a drive 17 for reading a program stored in a storage medium 6 are electrically connected.
  • In FIG. 2, the communication interface and the external interface are denoted as "communication I/F" and "external I/F", respectively.
  • the components can be omitted, replaced, and added as appropriate according to the embodiment.
  • the control unit 11 may include a plurality of processors.
  • the touch panel display 13 may be replaced with an input device and a display device that are separately connected independently.
  • the speaker 14 may be omitted.
  • the speaker 14 may be connected to the abnormal state detection device 1 as an external device instead of as an internal device of the abnormal state detection device 1.
  • the abnormal state detection device 1 may incorporate the camera 2.
  • the abnormal state detection device 1 may include a plurality of external interfaces 15 and may be connected to a plurality of external devices.
  • the camera 2 is connected to the abnormal state detection device 1 via the external interface 15 and photographs a target pedestrian whose state is to be monitored.
  • the installation location of the camera 2 may be appropriately selected according to the embodiment.
  • The camera 2 may be arranged, for example, so that the pedestrian to be watched over falls within its shooting range.
  • the camera 2 includes a depth sensor 21 for measuring the depth of the subject in order to capture the captured image 3 including depth data.
  • the type and measurement method of the depth sensor 21 may be appropriately selected according to the embodiment.
  • The depth sensor 21 may be, for example, a sensor of the TOF (Time of Flight) type or the like.
  • the configuration of the camera 2 is not limited to such an example as long as the depth can be acquired, and can be appropriately selected according to the embodiment.
  • the camera 2 may be a stereo camera so that the depth of the subject within the shooting range can be specified. Since the stereo camera shoots the subject within the shooting range from a plurality of different directions, the depth of the subject can be recorded. Further, the camera 2 may be replaced with the depth sensor 21 as long as the depth of the subject within the shooting range can be specified.
  • the depth sensor 21 may be an infrared depth sensor that measures the depth based on infrared irradiation so that the depth can be acquired without being affected by the brightness of the shooting location.
  • relatively inexpensive imaging apparatuses including such an infrared depth sensor include Kinect from Microsoft, Xtion from ASUS, and CARMINE from PrimeSense.
  • FIG. 3 shows an example of a distance that can be handled as the depth according to the present embodiment.
  • the depth represents the depth of the subject.
  • The depth of the subject may be expressed by, for example, the straight-line distance A between the camera 2 and the object, or by the perpendicular distance B from the horizontal axis of the camera 2 to the subject.
  • the depth according to the present embodiment may be the distance A or the distance B.
  • the distance B is treated as the depth.
  • The distance A and the distance B can be converted into each other based on, for example, the Pythagorean theorem. Therefore, the following description using the distance B can be applied to the distance A as it is.
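Assuming a vertical offset h between the subject point and the horizontal axis of the camera 2 (h is an assumed auxiliary quantity, not a symbol from FIG. 3), the conversion between the distances A and B by the Pythagorean theorem can be written as:

```python
import math

# A (straight-line distance from the camera to the subject) and
# B (perpendicular distance from the camera's horizontal axis) satisfy
# A^2 = B^2 + h^2, so either one can be recovered from the other.

def a_from_b(b, h):
    """Straight-line distance A from perpendicular distance B."""
    return math.sqrt(b * b + h * h)

def b_from_a(a, h):
    """Perpendicular distance B from straight-line distance A."""
    return math.sqrt(a * a - h * h)
```

This is why the description can use the distance B throughout: any depth reported as the distance A maps to a distance B (and back) with no loss of information.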
  • the abnormal state detection apparatus 1 according to the present embodiment can analyze the state of the pedestrian by using such a depth.
  • the storage unit 12 stores the program 5.
  • This program 5 is a program for causing the abnormal state detection device 1 to execute each process related to detection of an abnormal state of a pedestrian described later, and corresponds to the “abnormal state detection program” of the present invention.
  • the program 5 may be recorded on the storage medium 6.
  • The storage medium 6 is a medium that stores information such as a program by electrical, magnetic, optical, mechanical, or chemical action so that the recorded information can be read by a computer or other device or machine.
  • the storage medium 6 corresponds to the “storage medium” of the present invention.
  • FIG. 2 illustrates a disk-type storage medium such as a CD (Compact Disc) or a DVD (Digital Versatile Disc) as an example of the storage medium 6.
  • the type of the storage medium 6 is not limited to the disk type and may be other than the disk type. Examples of the storage medium other than the disk type include a semiconductor memory such as a flash memory.
  • The abnormal state detection device 1 may be, for example, a device designed exclusively for the provided service, or a general-purpose device such as a PC (Personal Computer) or a tablet terminal. Furthermore, the abnormal state detection device 1 may be implemented by one or a plurality of computers.
  • FIG. 4 illustrates a functional configuration of the abnormal state detection device 1 according to the present embodiment.
  • The control unit 11 of the abnormal state detection device 1 expands the program 5 stored in the storage unit 12 into the RAM. Then, the control unit 11 interprets and executes the program 5 expanded in the RAM. Thereby, the abnormal state detection device 1 functions as a computer including the image acquisition unit 51, the extraction unit 52, the behavior measurement unit 53, the state determination unit 54, and the notification unit 55.
  • the image acquisition unit 51 acquires the captured image 3 captured by the camera 2.
  • the acquired captured image 3 includes depth data indicating the depth of each pixel.
  • By using this depth data, the position in real space of the subject in the captured image 3, more specifically the position of the subject surface, can be specified.
  • the extraction unit 52 extracts a person area in which the pedestrian appears in the acquired photographed image 3.
  • the behavior measurement unit 53 refers to the depth of each pixel included in the extracted person region and continuously determines the position in the real space of the local part to be observed among the pedestrian's body shown in the captured image 3. By specifying, the behavior of the local part in the real space is measured.
  • the state determination unit 54 determines whether or not the pedestrian in the captured image 3 is in an abnormal state based on the measured behavior of the local part. Then, as a result of the determination, when it is determined that the pedestrian is in an abnormal state, the notification unit 55 performs an abnormality detection notification for notifying that the pedestrian is in an abnormal state.
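The cooperation of the five units above can be sketched by treating each unit as a callable; every identifier below is a hypothetical stand-in for the corresponding unit, not a name from this publication.

```python
def watch(frames, extract, measure, judge, notify):
    """Run extraction, measurement, determination, and notification per
    captured image. `frames` stands in for the image acquisition unit and
    yields (timestamp, captured_image) pairs."""
    for timestamp, frame in frames:
        region = extract(frame)          # extraction unit (52)
        if region is None:               # no pedestrian in this frame
            continue
        height = measure(frame, region)  # behavior measurement unit (53)
        if judge(timestamp, height):     # state determination unit (54)
            notify(timestamp)            # notification unit (55)
```

Wiring the units as plain callables keeps each step independently replaceable, mirroring how the functional configuration separates acquisition, extraction, measurement, determination, and notification.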
  • FIG. 5 illustrates a processing procedure related to watching of a pedestrian by the abnormal state detection device 1.
  • the processing procedure relating to watching of pedestrians described below corresponds to the “abnormal state detection method” of the present invention.
  • The processing procedure regarding the watching of a pedestrian described below is merely an example, and each process may be modified to the extent possible. Further, in the processing procedure described below, steps may be omitted, replaced, or added as appropriate according to the embodiment.
  • Step S101: In step S101, the control unit 11 functions as the image acquisition unit 51 and acquires the captured image 3 captured by the camera 2. Then, after acquiring the captured image 3, the control unit 11 advances the processing to the next step S102.
  • the camera 2 includes a depth sensor 21. Therefore, the captured image 3 acquired in step S101 includes depth data indicating the depth of each pixel measured by the depth sensor 21.
  • the control unit 11 acquires the captured image 3 illustrated in FIG. 6 as the captured image 3 including the depth data.
  • FIG. 6 shows an example of the captured image 3 including depth data.
  • the captured image 3 illustrated in FIG. 6 is an image in which the gray value of each pixel is determined according to the depth of each pixel.
  • A blacker pixel indicates that the subject is closer to the camera 2, and a whiter pixel indicates that the subject is farther from the camera 2.
  • The control unit 11 can specify the position of each pixel in real space. That is, the control unit 11 can specify the position in three-dimensional space (real space) of the subject captured in each pixel from the coordinates (two-dimensional information) and the depth of each pixel in the captured image 3.
  • A calculation example in which the control unit 11 specifies the position of each pixel in real space will be described with reference to FIGS. 7 and 8.
  • FIG. 7 schematically illustrates the coordinate relationship in the captured image 3.
  • FIG. 8 schematically illustrates the positional relationship between an arbitrary pixel (point s) of the captured image 3 and the camera 2 in real space. The lateral direction of FIG. 7 corresponds to the direction perpendicular to the paper surface of FIG. 8. That is, the length of the captured image 3 shown in FIG. 8 corresponds to the length in the vertical direction (H pixels) illustrated in FIG. 7. Further, the length in the horizontal direction (W pixels) illustrated in FIG. 7 corresponds to the length in the direction perpendicular to the paper surface of the captured image 3, which does not appear in FIG. 8.
  • Let the coordinates of an arbitrary pixel (point s) of the captured image 3 be (x_s, y_s), the horizontal angle of view of the camera 2 be V_x, and the vertical angle of view be V_y.
  • the number of pixels in the horizontal direction of the captured image 3 is W
  • the number of pixels in the vertical direction is H
  • the coordinates of the center point (pixel) of the captured image 3 are (0, 0).
  • The control unit 11 can acquire information indicating the angle of view (V_x, V_y) of the camera 2 from the camera 2. The control unit 11 may also acquire the information indicating the angle of view (V_x, V_y) of the camera 2 based on a user input, or acquire it as a preset setting value. Further, the control unit 11 can acquire the coordinates (x_s, y_s) of the point s and the number of pixels (W × H) of the captured image 3 from the captured image 3 itself. Furthermore, the control unit 11 can acquire the depth D_s of the point s by referring to the depth data included in the captured image 3.
  • The control unit 11 can specify the position of each pixel (point s) in real space by using these pieces of information. For example, the control unit 11 can calculate the vector S = (S_x, S_y, S_z, 1) from the camera 2 to the point s in the camera coordinate system illustrated in FIG. 8. Thereby, the position of the point s in the two-dimensional coordinate system of the captured image 3 and the position of the point s in the camera coordinate system can be mutually converted.
  • the vector S is a vector of a three-dimensional coordinate system centered on the camera 2.
  • The camera 2 may be inclined with respect to the horizontal plane (ground). That is, the camera coordinate system may be tilted with respect to the horizontal plane (ground) from the world coordinate system of the three-dimensional space. Therefore, the control unit 11 may convert the vector S of the camera coordinate system into a vector of the world coordinate system by applying a projective transformation to the vector S using the roll angle, pitch angle (α in FIG. 8), and yaw angle of the camera 2, and may thereby calculate the position of the point s in the world coordinate system.
  • Each of the camera coordinates and the world coordinates is a coordinate system representing a real space. In this way, the control unit 11 can specify the position of the subject in the captured image 3 in the real space by using the depth data.
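A sketch of the two-stage coordinate computation above. The publication's own relational expressions are not reproduced in this text, so the camera-coordinate step below uses a common field-of-view approximation (the pixel offset scaled by tan(angle of view / 2) at the measured depth D_s), and the world-coordinate step rotates about the pitch angle only; both formulas and all names are assumptions for illustration.

```python
import math

def pixel_to_camera(x_s, y_s, depth, w, h, v_x, v_y):
    """Point s, in pixel coordinates relative to the image centre, to
    camera coordinates (the camera looks along the positive z axis).
    w, h: image size in pixels; v_x, v_y: angles of view in radians."""
    s_x = x_s * depth * math.tan(v_x / 2) / (w / 2)
    s_y = y_s * depth * math.tan(v_y / 2) / (h / 2)
    return s_x, s_y, depth

def camera_to_world(s_x, s_y, s_z, pitch):
    """Rotate camera coordinates by the camera's pitch angle to obtain
    world coordinates; roll and yaw are taken as zero for simplicity."""
    w_y = s_y * math.cos(pitch) - s_z * math.sin(pitch)
    w_z = s_y * math.sin(pitch) + s_z * math.cos(pitch)
    return s_x, w_y, w_z
```

The centre pixel maps straight onto the optical axis, and with a pitch of zero the two coordinate systems coincide, which is a quick sanity check on the model.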
  • the control unit 11 may acquire a moving image or a still image as the captured image 3.
  • The control unit 11 may acquire, as the captured image 3, a moving image or one still image for one time point.
  • the control unit 11 may acquire a moving image or a plurality of still images for a predetermined time as the captured image 3.
  • The control unit 11 acquires, as the captured image 3, a moving image or one or more still images for one time point or for a predetermined time, executes the processing of steps S102 to S105 described later on the acquired captured image 3, and thereby analyzes the state of the pedestrian in the captured image 3.
  • The control unit 11 may acquire the captured image 3 in synchronization with the video signal of the camera 2 in order to monitor the pedestrian. Then, the control unit 11 may immediately execute the processing of steps S102 to S105 described later on the acquired captured image 3.
  • The abnormal state detection apparatus 1 can perform real-time image processing by executing such an operation continuously, and can watch over a pedestrian within the shooting range of the camera 2 in real time.
  • Step S102: Returning to FIG. 5, in the next step S102, the control unit 11 functions as the extraction unit 52 and extracts, from the captured image 3 acquired in step S101, a person region in which a pedestrian is captured, as illustrated in FIG. 6. Then, after extracting the person region from the captured image 3, the control unit 11 advances the processing to the next step S103.
  • the control unit 11 may extract a person region in the captured image 3 by performing image analysis such as pattern detection and graphic element detection based on the shape of the pedestrian.
  • the control unit 11 may extract the person region by detecting the three-dimensional shape of the pedestrian using the depth data.
  • the control unit 11 may extract a person region.
  • the control unit 11 may extract the moving area as a person area based on the background difference method.
  • the control unit 11 acquires a background image used for the background subtraction method.
  • This background image may be acquired by an arbitrary method, and is set as appropriate according to the embodiment.
  • the control unit 11 may acquire a photographed image before a pedestrian enters the photographing range of the camera 2, in other words, a photographed image without a pedestrian as a background image.
  • the control unit 11 calculates the difference between the captured image 3 acquired in step S101 and the background image, and extracts the foreground region of the captured image 3.
  • This foreground region is a region where a change has occurred from the background image, and is a region where a moving object (moving object) is captured.
  • the control unit 11 may recognize the foreground area as a person area.
  • the control unit 11 may extract a person area from the foreground area by pattern detection or the like.
  • the process for extracting the foreground area is merely a process for calculating the difference between the captured image 3 and the background image. Therefore, according to the processing, the control unit 11 (abnormal state detection device 1) can narrow the range in which the person area is detected without using advanced image processing. Therefore, according to the processing, the processing load in step S102 can be reduced.
  • the background subtraction method applicable to the present embodiment is not limited to the above example.
  • Other types of background subtraction methods include, for example, a method of separating the background and the foreground using three different images, and a method of separating the background and the foreground by applying a statistical model. With these methods, the control unit 11 may extract a person region.
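The basic background difference described above can be sketched as follows. This is a minimal illustration rather than the embodiment's actual implementation; the use of depth images, the background image, and the threshold value are assumptions made for the example.

```python
import numpy as np

def extract_foreground(depth_image, background_depth, threshold=0.05):
    """Return a boolean mask marking pixels whose depth differs from the
    pre-recorded background (an image with no pedestrian) by more than
    `threshold`, in the same unit as the depth maps. The True region is
    the foreground, i.e. the candidate person region."""
    diff = np.abs(depth_image.astype(float) - background_depth.astype(float))
    return diff > threshold

# Background recorded before any pedestrian enters the shooting range.
background = np.array([[2.0, 2.0], [2.0, 2.0]])
# Current frame: one pixel is now much closer to the camera.
frame = np.array([[2.0, 1.2], [2.0, 2.01]])
print(extract_foreground(frame, background))
```

Because this is only a per-pixel difference, it narrows the search range cheaply; pattern detection can then be applied inside the foreground mask, as the description notes.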
  • Step S103 In the next step S103, the control unit 11 functions as the behavior measurement unit 53 and, referring to the depth of each pixel included in the person region extracted in step S102, continuously specifies the position in the real space of the local part to be observed on the body of the pedestrian in the captured image 3, thereby measuring the behavior of the local part in the real space. Then, after measuring the behavior of the local part in the real space, the control unit 11 advances the processing to the next step S104.
  • the local region to be observed can be set as appropriate according to the embodiment.
  • the local site may be a specific site on the body such as the head, shoulder, chest, or leg.
  • the local part may be a part where the position on the body can be changed depending on the state of the pedestrian in the captured image, such as the upper part of the pedestrian, instead of such a specific part on the body.
  • the local region to be observed may be selected according to the type of abnormal state of the pedestrian detected in step S104 described later.
  • the local part is set to a part where the state of the pedestrian is easily reflected. For example, as will be described later, when the pedestrian is in a state of falling, crouching, lying down, etc., the entire body of the pedestrian is present at a position near the ground. Therefore, in order to detect these states, it is preferable that the local part to be observed is a part indicating the position of the upper end of the pedestrian, such as the upper part of the pedestrian or the head.
  • the upper part 31 of the pedestrian is employed as a local site to be observed.
  • the upper part 31 of the pedestrian indicates the upper end of the pedestrian in the real space, and may be a single point at the upper end of the pedestrian or a region of arbitrary area provided at the upper end of the pedestrian.
  • the upper part 31 of the pedestrian can be set as appropriate.
  • the upper end of the pedestrian is the part of the pedestrian's body shown in the captured image that is highest in the real space.
  • the control unit 11 can specify the position of the upper part 31 of the pedestrian in the real space by using the depth data.
  • the control unit 11 uses the depth of each pixel included in the person area, and specifies the position of each pixel included in the person area in the real space by the above method.
  • the control unit 11 sets, as the upper end of the pedestrian, the pixel that exists at the highest position in the real space among the pixels included in the person region, and sets this pixel, or a predetermined region containing this pixel, as the upper part 31 of the pedestrian.
  • the control unit 11 continuously specifies the position of the upper part 31 of the pedestrian in the real space; for example, by plotting the position of the upper part 31 of the pedestrian on the real space coordinates, it is possible to measure the behavior of the upper part 31 of the pedestrian in the real space. For example, when the moving image for one time point or one still image is acquired as the captured image 3 in the above step S101, the control unit 11 plots, on the real space coordinates, the position of the upper part 31 of the pedestrian appearing in that moving image or still image. Thereby, the behavior of the upper part 31 of the pedestrian at that one time point is measured.
  • when the moving image or the plurality of still images for the predetermined time are acquired as the captured image 3, the control unit 11 continuously plots, on the real space coordinates, the position of the upper part 31 of the pedestrian appearing in the moving image or the plurality of still images. Thereby, the behavior of the upper part 31 of the pedestrian within the predetermined time is measured.
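A minimal sketch of locating the pedestrian's upper end from a person region and per-pixel real-space heights might look like this. The `person_mask` and `height_map` arrays, and the use of metres, are assumptions for the example, not details fixed by this document.

```python
import numpy as np

def upper_end_height(person_mask, height_map):
    """Among the pixels of the person region (True entries of
    `person_mask`), return the greatest real-space height found in
    `height_map` — i.e. the height of the pedestrian's upper end.
    Returns None if the person region is empty."""
    heights = height_map[person_mask]
    if heights.size == 0:
        return None
    return float(heights.max())

# Heights (m above the ground) computed from depth for each pixel.
height_map = np.array([[1.6, 2.5],
                       [1.2, 0.9]])
# Person region: the 2.5 m pixel (e.g. ceiling) is excluded.
person_mask = np.array([[True, False],
                        [True, True]])
print(upper_end_height(person_mask, height_map))  # → 1.6
```

Plotting this value frame by frame on the real space coordinates gives the behavior of the upper part 31 over time, as described above.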
  • the control unit 11 can specify the position of the part in the real space by using the depth of each pixel included in the person region, and can measure the behavior of the part in the real space by plotting its position on the real space coordinates.
  • the control unit 11 specifies the region of the local part to be observed by performing pattern detection or the like within the person region.
  • the control unit 11 may perform pattern detection on the three-dimensional shape of the local part by using the depth of each pixel included in the person region, thereby specifying, within the person region, the area in which the local part is captured.
  • the control unit 11 may specify the area in which the local part is captured.
  • the control unit 11 can specify the position of the local part in the real space in the same manner as described above, and can thereby measure its behavior.
  • Step S104 In the next step S104, the control unit 11 functions as the state determination unit 54 and determines whether or not the pedestrian is in an abnormal state based on the behavior of the local part measured in step S103. As a result of the determination, if it is determined that the pedestrian is in an abnormal state, the control unit 11 advances the processing to the next step S105. On the other hand, if it is determined that the pedestrian is not in an abnormal state, the control unit 11 ends the processing according to this operation example.
  • the image analysis method for determining whether or not the pedestrian is in an abnormal state may be appropriately selected according to the embodiment.
  • the upper part 31 of the pedestrian is adopted as the local part to be observed. Therefore, the control unit 11 may detect an abnormal state of the pedestrian when the behavior of the upper part 31 of the pedestrian measured in step S103 can be evaluated as a movement satisfying a predetermined condition.
  • the control unit 11 detects the pedestrian's falling state, crouching state, and lying state.
  • an example of a method for detecting various abnormal states will be described.
  • FIG. 9 schematically illustrates a scene where a pedestrian is in a fall state. As illustrated in FIG. 9, when the pedestrian falls, the position of the pedestrian's body changes suddenly. Specifically, it is assumed that the pedestrian suddenly descends vertically downward toward the ground.
  • the control unit 11 detects, based on the behavior of the upper part 31 of the pedestrian measured in step S103, whether or not the upper part 31 of the pedestrian has descended by a predetermined distance or more within a certain time in the real space. When the control unit 11 detects that the upper part 31 of the pedestrian has descended by the predetermined distance or more within the certain time, it detects that the pedestrian is in a fall state.
  • each threshold value of time and distance for detecting a fall state may be set as appropriate according to the embodiment.
  • the method of detecting the fall state is not limited to such an example, and the control unit 11 may detect the fall state of the pedestrian by other methods.
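The fall test described above (a descent of a predetermined distance or more within a certain time) could be sketched as follows. The threshold values and the `(timestamp, height)` sample format are assumptions for illustration, since the document leaves both thresholds to the embodiment.

```python
def detect_fall(samples, drop_threshold=0.8, time_window=0.5):
    """samples: list of (timestamp_sec, upper_end_height_m) pairs in
    time order, e.g. the plotted positions of the pedestrian's upper
    part. Returns True if the height drops by `drop_threshold` or more
    within any span of `time_window` seconds."""
    for i, (t0, h0) in enumerate(samples):
        for t1, h1 in samples[i + 1:]:
            if t1 - t0 > time_window:
                break  # samples are time-ordered; later ones are further away
            if h0 - h1 >= drop_threshold:
                return True
    return False

# Sudden vertical descent toward the ground → fall detected.
print(detect_fall([(0.0, 1.6), (0.2, 0.5)]))   # → True
# Slow, gradual lowering over a second → not a fall.
print(detect_fall([(0.0, 1.6), (1.0, 1.0)]))   # → False
```

The quadratic scan is fine for the short windows involved; a real-time monitor would keep a sliding buffer instead of a full history.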
  • FIG. 10 schematically illustrates a scene in which the pedestrian is in a crouching state. As illustrated in FIG. 10, when the pedestrian is crouching, it is assumed that the entire body of the pedestrian exists below a predetermined height in the real space.
  • based on the behavior of the upper part 31 of the pedestrian measured in step S103, the control unit 11 detects whether or not the upper part 31 of the pedestrian has moved to a position lower than the predetermined first height H1 in the real space. When the control unit 11 detects that the upper part 31 of the pedestrian has moved to a position lower than the predetermined first height H1, it detects that the pedestrian is in a crouching state.
  • the control unit 11 compares the height h of the upper part 31 of the pedestrian in the real space with the predetermined first height H1. As a result of the comparison, when it is determined that the height h of the upper part 31 is lower than the predetermined first height H1, the control unit 11 detects that the upper part 31 of the pedestrian has moved to a position lower than the predetermined first height H1, and thereby detects that the pedestrian is in a crouching state.
  • the value of the predetermined first height H1 may be appropriately set according to the embodiment.
  • the method of detecting the crouching state is not limited to such an example, and the control unit 11 may detect the crouching state of the pedestrian by other methods.
  • the height h of the upper part 31 of the pedestrian and the predetermined first height H1 are expressed with the ground as a reference.
  • the position (height) of the ground in the real space can be given by an arbitrary method.
  • the control unit 11 can calculate the position (height) of the ground in the real space by the above method, using the depth of each pixel included in the area where the ground appears in the captured image 3. Therefore, the control unit 11 can express the height h of the upper part 31 of the pedestrian as the distance from the ground.
  • the expression form of the height h of the upper part 31 of the pedestrian and the predetermined first height H1 is not limited to such an example, and may be appropriately selected according to the embodiment.
  • the height h of the upper part 31 of the pedestrian and the predetermined first height H1 may be expressed with the camera 2 as a reference.
  • FIG. 11 schematically illustrates a scene where a pedestrian is lying down. As illustrated in FIG. 11, when the pedestrian is lying down, it is assumed that the entire body of the pedestrian is present at a lower height in real space than in the case of the crouched state.
  • based on the behavior of the upper part 31 of the pedestrian measured in step S103, the control unit 11 detects whether or not the upper part 31 of the pedestrian has moved, in the real space, to a position lower than a predetermined second height H2 that is lower than the predetermined first height H1. When the control unit 11 detects that the upper part 31 of the pedestrian has moved to a position lower than the predetermined second height H2, it detects that the pedestrian is in a lying state.
  • the control unit 11 compares the height h of the upper part 31 of the pedestrian in the real space with the predetermined second height H2. As a result of the comparison, when it is determined that the height h of the upper part 31 is lower than the predetermined second height H2, the control unit 11 detects that the upper part 31 of the pedestrian has moved to a position lower than the predetermined second height H2, and thereby detects that the pedestrian is in a lying state.
  • the value of the predetermined second height H2 may be appropriately set according to the embodiment so as to be lower than the predetermined first height H1.
  • the method for detecting the lying state is not limited to such an example, and the control unit 11 may detect the lying state of the pedestrian by other methods. Further, in the example of FIG. 11, the predetermined second height H2 is expressed with the ground as a reference, as in FIG. 10.
  • the expression format of the predetermined second height H2 is not limited to such an example, and may be appropriately selected according to the embodiment.
  • the predetermined second height H2 may be expressed with the camera 2 as a reference.
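Taken together, the two thresholds H1 and H2 suggest a simple classification of the pedestrian's posture from the height h of the upper part 31. The sketch below assumes ground-referenced heights in metres; the numeric threshold values are illustrative only, as the document leaves them to the embodiment.

```python
def classify_posture(h, h1=1.0, h2=0.3):
    """Classify the pedestrian's state from the real-space height h of
    the upper part 31 (metres above the ground). h2 must be lower than
    h1, mirroring the document's H2 < H1; thresholds are illustrative."""
    assert h2 < h1, "the second height H2 must be lower than the first height H1"
    if h < h2:
        return "lying"
    if h < h1:
        return "crouching"
    return "normal"

print(classify_posture(1.5))  # → normal
print(classify_posture(0.5))  # → crouching
print(classify_posture(0.1))  # → lying
```

Combining this with the descent test of the fall state gives the three abnormal states the embodiment watches for.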
  • when the control unit 11 determines in step S104 that the pedestrian's state corresponds to any of the fall state, the crouching state, and the lying state, the process proceeds to the next step S105.
  • on the other hand, when the control unit 11 determines in step S104 that the pedestrian's state does not correspond to any of the fall state, the crouching state, and the lying state, the process according to this operation example ends.
  • the state to be detected among the various states of the pedestrian may be appropriately selected according to the embodiment. That is, at least one of the fall state, the crouching state, and the lying state may be excluded from the detection target.
  • the control unit 11 may detect pedestrian states other than the above based on conditions other than the above conditions.
  • the type of the state of the pedestrian to be detected in step S104 may be appropriately selected according to the embodiment, may be selected by the user, or may be set in advance.
  • At least one of the falling state, the crouching state, and the lying state may be set as not being an abnormal state.
  • for example, when the crouching state is set as not being an abnormal state, the control unit 11 does not determine that the pedestrian is in an abnormal state when it detects that the pedestrian is in a crouching state, but instead determines that the pedestrian is in a normal state.
  • in this case, the control unit 11 omits the process of step S105 described later and ends the processing according to this operation example.
  • the control unit 11 may recognize the pedestrian as being in a normal state when it determines that the pedestrian's state does not correspond to any of the fall state, the crouching state, and the lying state. The control unit 11 may then notify that the pedestrian is in a normal state.
  • Step S105 In the next step S105, the control unit 11 functions as the notification unit 55.
  • the control unit 11 performs an abnormality detection notification to notify that the pedestrian is in an abnormal state. Thereby, the processing according to this operation example is completed. Note that the means by which the control unit 11 performs the abnormality detection notification may be appropriately selected according to the embodiment.
  • the abnormal state detection device 1 when used in a facility such as a hospital, the abnormal state detection device 1 can be connected to equipment such as a nurse call system via the external interface 15.
  • the control unit 11 may perform the abnormality detection notification in cooperation with equipment such as the nurse call system. That is, the control unit 11 may control the nurse call system via the external interface 15, and may perform a call by the nurse call system as the abnormality detection notification. Accordingly, it is possible to appropriately notify a nurse or the like who watches over the pedestrian that the pedestrian is in an abnormal state.
  • control unit 11 may perform abnormality detection notification by outputting a predetermined sound from the speaker 14 connected to the abnormal state detection device 1. Further, for example, the control unit 11 may display a screen on the touch panel display 13 for notifying that an abnormal state of the pedestrian has been detected as an abnormality detection notification.
  • control unit 11 may perform such an abnormality detection notification using an e-mail, a short message service, a push notification, or the like.
  • the e-mail address, telephone number, and the like of the user terminal that is the notification destination may be registered in the storage unit 12 in advance.
  • the control part 11 may perform abnormality detection notification using this e-mail address, telephone number, etc. which are registered beforehand.
  • the abnormal state detection device 1 analyzes the state of a pedestrian based on the captured image 3 including depth data indicating the depth of each pixel. As described above, since the depth of each pixel is acquired with respect to the subject surface, the position of the subject surface in real space can be specified by using the depth data. Therefore, the abnormal state detection device 1 according to the present embodiment uses this depth data to measure the behavior in the real space of the upper part 31 of the pedestrian to be observed among the pedestrian's body, and the measured walking Based on the behavior of the upper part 31 of the person, it is detected whether or not the pedestrian is in an abnormal state.
  • the abnormal state of the pedestrian is detected based on the behavior of the local part of the pedestrian rather than the entire body of the pedestrian. Therefore, since the body region to be observed is limited to the upper part 31 of the pedestrian, the processing load for analyzing the state of the pedestrian can be reduced, and the state of the pedestrian can be analyzed at high speed. Moreover, since the observation target is narrowed down, the analysis content becomes simple.
  • the abnormal state detection device 1 detects the pedestrian's fall state, crouching state, and lying state based on the fluctuation and height of the pedestrian's upper part 31. Measurement of the fluctuation and height of the upper part 31 of the pedestrian is unlikely to produce errors, so the state of the pedestrian can be analyzed with high accuracy. Therefore, according to the present embodiment, the pedestrian can be watched over appropriately.
  • the upper part 31 of the pedestrian is employed as the local part to be observed; however, the local part to be observed is not limited to such an example and may be appropriately selected according to the embodiment.
  • Timing of notification processing: In the above-described embodiment, when the control unit 11 determines in step S104 that the pedestrian is in an abnormal state, it immediately performs an abnormality detection notification. However, the timing at which the abnormality detection notification is performed is not limited to such an example.
  • the control unit 11 may function as the notification unit 55 and perform abnormality detection notification when the abnormal state of the pedestrian continues for a certain time or more.
  • in step S104, the control unit 11 determines whether or not the abnormal state of the pedestrian has continued for a certain time or more. When the control unit 11 determines that the abnormal state of the pedestrian has continued for the certain time or more, it performs the abnormality detection notification in step S105.
  • otherwise, the control unit 11 omits the process of step S105 and ends the processing according to the above operation example.
  • the threshold value for determining whether the abnormal state of the pedestrian has continued for a certain time or more may be set as appropriate according to the embodiment.
  • according to this configuration, it is possible to prevent a false abnormality detection notification when the state of the pedestrian satisfies the abnormal state condition only for a moment. For example, when a pedestrian tries to pick up an object that has fallen on the ground, the pedestrian may be in a crouching state for a moment. In such a case, if an abnormality detection notification is made by the speaker 14 or the like, a state different from the actual state of the pedestrian is reported to the people around the speaker 14, and erroneous information is transmitted to those people. In contrast, according to this modified example, such a false alarm is prevented, and the detection of a pedestrian abnormality can be notified appropriately.
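The persistence check of this modified example can be sketched as a small state holder that fires a notification only after the abnormal state has lasted a minimum duration. The class name and the three-second threshold are assumptions for illustration; the document leaves the threshold to the embodiment.

```python
class AbnormalStateNotifier:
    """Suppress momentary detections: report a notification only when
    the abnormal state has persisted for at least `min_duration` seconds."""

    def __init__(self, min_duration=3.0):
        self.min_duration = min_duration
        self.abnormal_since = None  # timestamp when the abnormal state began

    def update(self, timestamp, is_abnormal):
        """Feed one observation (e.g. the result of step S104 for one
        frame); return True when the notification of step S105 should fire."""
        if not is_abnormal:
            self.abnormal_since = None  # state cleared; reset the timer
            return False
        if self.abnormal_since is None:
            self.abnormal_since = timestamp
        return timestamp - self.abnormal_since >= self.min_duration

notifier = AbnormalStateNotifier(min_duration=3.0)
print(notifier.update(0.0, True))   # → False (just started)
print(notifier.update(2.0, True))   # → False (only 2 s so far)
print(notifier.update(3.5, True))   # → True  (persisted ≥ 3 s)
print(notifier.update(4.0, False))  # → False (state cleared, timer reset)
```

A momentary crouch, such as picking something up, never reaches the duration threshold and therefore produces no false alarm.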
  • the abnormal state detection device 1 detects the pedestrian's fall state, crouching state, and lying state as the pedestrian's abnormal states.
  • the types of abnormal states of pedestrians to be detected are not limited to these, and may be appropriately selected according to the embodiment.
  • specifically, due to aging, a decrease in physical strength, and the like, the range of motion of the joints in the pedestrian's lower limbs decreases and the movement of those joints is reduced. For example, the angle of the toes with respect to the walking surface (ground) decreases, and the distance from the bottom of the foot to the walking surface (ground) during walking decreases. The abnormal state detection device 1 may detect such walking, which carries a high risk of falling, as an abnormal state.
  • in this case, the control unit 11 measures the behavior of the legs in step S103. For example, the control unit 11 specifies the range in which the legs are captured by performing pattern matching or the like in the person region extracted in step S102. Next, the control unit 11 calculates the angle of the toes with respect to the walking surface (ground) by analyzing the shape of the toes, using the depth of each pixel in the portion of that range where the toes appear. A known image analysis method may be used to calculate the angle of the toes with respect to the walking surface (ground). The control unit 11 also calculates the distance between the lowest point (bottom) of the leg and the ground, using the depth of each pixel in the range in which the leg appears.
  • the position (height) of the ground in the real space may be given by an arbitrary method, as described above. The control unit 11 then continuously plots the angle of the toes with respect to the walking surface (ground) and the distance between the lowest point (bottom) of the leg and the ground. Thereby, the control unit 11 can measure the behavior of the legs in the real space.
  • in step S104, the control unit 11 refers to the continuously plotted data and determines whether or not the maximum value of the angle of the toes with respect to the walking surface (ground) is equal to or less than a predetermined value. The control unit 11 also determines whether or not the maximum value of the distance between the lowest point (bottom) of the leg and the ground is equal to or less than a predetermined value. When the control unit 11 determines that the maximum toe angle with respect to the walking surface (ground) is equal to or less than the predetermined value and the maximum distance between the lowest point (bottom) of the leg and the ground is equal to or less than the predetermined value, it determines that the pedestrian is in an abnormal state and advances the processing to the next step S105. Otherwise, the control unit 11 ends the processing according to this operation example.
  • the abnormal state detection device 1 may detect a walk with a high risk of falling as an abnormal state.
  • the predetermined values for the angle and the distance which are threshold values for determining whether or not the state is abnormal, may be appropriately set according to the embodiment.
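The decision rule described above (both the maximum toe angle and the maximum foot clearance at or below predetermined values) might be sketched as follows. The threshold values are illustrative assumptions, as the document explicitly leaves them to the embodiment.

```python
def high_fall_risk(toe_angles_deg, foot_clearances_m,
                   max_angle_threshold=15.0, max_clearance_threshold=0.03):
    """toe_angles_deg: toe angles (degrees, relative to the walking
    surface) plotted over a walking sequence; foot_clearances_m:
    distances (metres) between the lowest point of the leg and the
    ground over the same sequence. Returns True when BOTH maxima are
    at or below their thresholds, i.e. a gait with a high risk of
    falling (an abnormal state)."""
    return (max(toe_angles_deg) <= max_angle_threshold
            and max(foot_clearances_m) <= max_clearance_threshold)

# Shuffling gait: toes barely lift, foot stays close to the ground.
print(high_fall_risk([10.0, 12.0, 11.0], [0.010, 0.015, 0.012]))  # → True
# Normal gait: both maxima comfortably exceed the thresholds.
print(high_fall_risk([20.0, 28.0, 25.0], [0.050, 0.080, 0.060]))  # → False
```

The same structure accommodates the final modification mentioned above: comparing each maximum against a pre-measured normal-state baseline instead of a fixed constant.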
  • the target leg may be a right leg, a left leg, or both legs.
  • the abnormal state detection device 1 may hold the angle and the distance measured in advance in a normal state. The abnormal state detection device 1 may then determine whether or not the pedestrian is in an abnormal state based on the amount of decrease of the measured angle and distance relative to these normal-state values.


Abstract

A system for appropriately watching over pedestrians is provided. According to one aspect of the present invention, the abnormal state detection device comprises: an image acquisition unit that acquires captured images capturing a walking pedestrian and including depth data indicating the depth of each pixel in the captured images; an extraction unit that extracts a person region, i.e., the region in which the pedestrian appears in the captured images; a behavior measurement unit that refers to the depth of the pixels included in the extracted person region and, by continuously specifying the position in real space of a local part to be observed on the body of the pedestrian appearing in the captured images, measures the behavior of the local part in real space; a state determination unit that determines whether the pedestrian is in an abnormal state on the basis of the measured behavior of the local part; and a notification unit that, when the determination results indicate that the pedestrian is in an abnormal state, performs an abnormality detection notification to convey that the pedestrian is in an abnormal state.
PCT/JP2016/050281 2015-03-23 2016-01-06 Abnormal state detection device, abnormal state detection method, and abnormal state detection program WO2016152182A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017507517A JP6737262B2 (ja) 2015-03-23 2016-01-06 異常状態検知装置、異常状態検知方法、及び、異常状態検知プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-059277 2015-03-23
JP2015059277 2015-03-23

Publications (1)

Publication Number Publication Date
WO2016152182A1 true WO2016152182A1 (fr) 2016-09-29

Family

ID=56978927

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/050281 WO2016152182A1 (fr) 2016-01-06 Abnormal state detection device, abnormal state detection method, and abnormal state detection program

Country Status (2)

Country Link
JP (1) JP6737262B2 (fr)
WO (1) WO2016152182A1 (fr)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014155693A (ja) * 2012-12-28 2014-08-28 Toshiba Corp 動作情報処理装置及びプログラム
JP2015042241A (ja) * 2013-01-18 2015-03-05 株式会社東芝 動作情報処理装置及び方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9161708B2 (en) * 2013-02-14 2015-10-20 P3 Analytics, Inc. Generation of personalized training regimens from motion capture data


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018099267A (ja) * 2016-12-20 2018-06-28 株式会社竹中工務店 運動量推定装置、運動量推定プログラム、及び運動量推定システム
CN112260402A (zh) * 2020-10-22 2021-01-22 海南电网有限责任公司电力科学研究院 基于视频监控的变电站智能巡检机器人状态的监控方法
CN112260402B (zh) * 2020-10-22 2022-05-24 海南电网有限责任公司电力科学研究院 基于视频监控的变电站智能巡检机器人状态的监控方法

Also Published As

Publication number Publication date
JPWO2016152182A1 (ja) 2018-01-18
JP6737262B2 (ja) 2020-08-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16768085

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017507517

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16768085

Country of ref document: EP

Kind code of ref document: A1